Preface
Recently, while developing a mini program, I ran into an annoying problem. The project uses many small icon images; because of their small size, they are bundled locally rather than served from a CDN. But mini programs have a package size limit, so every time a new feature is developed, the new icons have to be compressed through Tinypng by hand, which feels tedious
So I implemented a tinify-based CLI tool for batch image compression, img-squash-cli
This project is for learning purposes, mainly to cut down on repetitive image compression at work; it is not for commercial use. For commercial use, buy the official Tinypng service
Tool introduction
This section introduces the popular image compression tool Tinypng
Tinypng is a high-performance image compression tool; to the naked eye, compressed images are indistinguishable from the originals. For front-end development, in image-heavy scenarios the easiest way to optimize load performance is to compress the image resources and reduce their size
Common advantages of Tinypng:
- Compressed images keep excellent visual quality
- Provides an external API
- Easy to use, with a free monthly quota and batch compression support
Disadvantages of Tinypng:
- Only 500 free compressions per month
- Free compression has a file size limit (5 MB)
- Batch compression is limited to 20 images at a time
- Few supported formats (JPG and PNG)
Despite all these disadvantages, I still want to use it, because it covers most scenarios: most people compress fewer than 500 images a month, and since the images are mostly icons, their size is manageable too
After all, the product is free and the experience is good; what more could you ask for?
Problems encountered when using Tinypng
However, in some scenarios you may run into the following problems: when you need to batch compress the images of an entire project, you will exceed the 500-image limit, and you cannot compress all the images at once (20 images per batch).
Common solutions are as follows:
- Buy the paid service (for teams that can afford it, buying the service is the best option: a no-brainer, spending money saves a lot of trouble)
- Register multiple email accounts to get around the 500-image limit per account (register 10 accounts and you can compress 5000 images a month)
- Build a batch compression service that automatically reads a folder's resources and compresses them
Implementation
Batch compression through the official API
Official client libraries are available in several languages, including Ruby, PHP, Node.js, Python, Java, and .NET; see the official tinify API documentation
There is also tinify, the Node-based core library on npm. It is fairly simple to use, but you must apply for an API KEY to compress anything, and each API KEY is limited to 500 compressions per month
Application method:
Implementation idea: apply for multiple API KEYs and rotate through them during compression to get around the 500-compression limit
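The rotation idea can be sketched in isolation before the full script. This is a minimal sketch; the key pool and the quota signal below are illustrative placeholders, not real API keys:

```javascript
// Sketch of the rotation idea: keep a pool of API keys and switch to the
// next one when the active key's 500-per-month quota is hit (the tinify
// library signals this with an AccountError).
const keys = ['KEY_A', 'KEY_B', 'KEY_C'];
let current = 0;

// Returns the key to use; advances through the pool when the active key
// reports its quota as exceeded.
function pickKey(quotaExceeded) {
  if (quotaExceeded) {
    current = (current + 1) % keys.length;
  }
  return keys[current];
}

console.log(pickKey(false)); // KEY_A: quota fine, keep the current key
console.log(pickKey(true));  // KEY_B: quota hit, rotate to the next key
```

In the real script, `pickKey` would be called from the compression error handler, assigning the returned value to `tinify.key` before retrying.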
Code implementation:
// index.js
const fs = require('fs');
const path = require('path');
const tinify = require('tinify');
// File types to process
const imgsInclude = ['.png', '.jpg'];
// Folders to skip
const filesExclude = ['dist', 'build', 'node_modules', 'config'];
const keys = ['ZNsTlW44Lbv1v82GG7WwBWW8VVD09nh9', 'HR5gGtwfVvNYyQtwS4HL1VLww3dnndvH'];
const config = {
  // Maximum image size: 5 MB
  max: 5242880,
  // Up to 20 images per batch; 10 by default
  maxLength: 10,
};
tinify.key = keys[1];
// Recursively collect files of the target types
function readFile(filePath, filesList) {
  const files = fs.readdirSync(filePath);
  files.forEach((file) => {
    const fPath = path.join(filePath, file);
    const states = fs.statSync(fPath);
    const extname = path.extname(file);
    if (states.isFile()) {
      const obj = {
        size: states.size,
        name: file,
        path: fPath,
      };
      if (states.size > config.max) {
        console.log(`File ${file} exceeds the 5 MB compression limit`);
      }
      if (states.size < config.max && imgsInclude.includes(extname)) {
        filesList.push(obj);
      }
    } else {
      if (!filesExclude.includes(file)) {
        readFile(fPath, filesList);
      }
    }
  });
}
function getFileList(filePath) {
  const filesList = [];
  readFile(filePath, filesList);
  return filesList;
}
function complete() {
  console.log('File generated successfully');
}
function writeFile(fileName, data) {
  fs.writeFile(fileName, data, 'utf-8', complete);
}
// Sort descending by size
function sort(m, n, key = 'size') {
  if (m[key] > n[key]) {
    return -1;
  } else if (m[key] < n[key]) {
    return 1;
  } else {
    return 0;
  }
}
const filesList = getFileList('src');
filesList.sort(sort);
let str = `# Original image comparison\n### Project image information\nTotal images to compress: ${filesList.length}\n| Filename | Original size | Compressed size | Compression ratio | File path |\n| -- | -- | -- | -- | -- |\n`;
function output(list) {
  for (let i = 0; i < list.length; i++) {
    const { name, path: _path, size, miniSize } = list[i];
    const fileSize = `${size > 1024 ? (size / 1024).toFixed(2) + 'KB' : size + 'B'}`;
    const compressionSize = `${miniSize > 1024 ? (miniSize / 1024).toFixed(2) + 'KB' : miniSize + 'B'}`;
    const compressionRatio = `${(100 * (size - miniSize) / size).toFixed(2) + '%'}`;
    const desc = `| ${name} | ${fileSize} | ${compressionSize} | ${compressionRatio} | ${_path} |\n`;
    str += desc;
  }
}
const list = [];
function squash() {
  Promise.all(
    filesList.map(async (item) => {
      const io = path.resolve(item.path);
      const source = tinify.fromFile(item.path);
      try {
        // Compress and overwrite the file in place
        await source.toFile(io);
        const miniSize = fs.statSync(item.path).size;
        list.push({ ...item, miniSize });
      } catch (error) {
        if (error instanceof tinify.AccountError) {
          console.log('Error: your monthly quota has been exceeded');
        } else if (error instanceof tinify.ClientError) {
          console.log('Error: check your source image and request options');
        } else if (error instanceof tinify.ServerError) {
          console.log('Error: temporary problem with the Tinify API');
        } else if (error instanceof tinify.ConnectionError) {
          console.log('Error: network connection error');
        } else {
          console.log('Error: something else went wrong, unrelated to the Tinify API');
        }
      }
    })
  ).then(() => {
    output(list);
    writeFile('test.md', str);
  });
}
squash();
To use this script, change getFileList('src') to the folder whose contents you want to compress, then run node index.js in the project root directory to start the compression
Existing problems:
- Applying for API KEYs is a hassle, and during batch compression, if the 501st compression is reached the run may fail while quota is still consumed, so the remaining count has to be tracked in real time
- Changing the directory to compress means modifying the code, which is hard to maintain
- After images have been compressed, a second run will compress the already-compressed files again, which wastes quota
- To use it across projects, the code has to be copied into each one, which is cumbersome
So how can these problems be solved?
Is there a way to compress that doesn't require an API KEY?
Yes, Tinify also offers a way to compress images on the Web side by dragging images into the browser, uploading them, and then downloading them
Problems with this approach:
- Image size limit
- Limit on the number of images per upload
- 500-compression limit per computer
- Compressed files have to be replaced manually
The other problems can be solved as well: install the tool globally as a CLI so the compression command can be run without copying scripts into each project, pass the directory to compress as a command-line argument, and generate an md5 fingerprint for each compressed file to avoid compressing it twice
Batch compression through the Web interface
The web client has two access addresses. Alternating between them prevents failures caused by hitting a single address too frequently, and adding a dynamically generated IP to the request headers avoids being refused when a single real IP makes frequent requests
const urls = [
"tinyjpg.com",
"tinypng.com"
];
const ip = new Array(4).fill(0).map(() => parseInt(Math.random() * 255)).join(".");
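Putting the two pieces together, a minimal sketch of the randomized request options looks like this (the header set is trimmed to the essentials; the full script below adds User-Agent and other fields):

```javascript
// Sketch: alternate between the two web hosts and forge an X-Forwarded-For
// header with a randomly generated IP, so repeated requests are spread out
// rather than looking like one client hammering a single endpoint.
const urls = ["tinyjpg.com", "tinypng.com"];

function requestOptions() {
  const ip = new Array(4).fill(0).map(() => Math.floor(Math.random() * 256)).join(".");
  return {
    hostname: urls[Math.round(Math.random())], // pick one of the two hosts
    method: "POST",
    path: "/web/shrink",
    headers: { "X-Forwarded-For": ip }
  };
}

const opts = requestOptions();
console.log(opts.hostname, opts.headers["X-Forwarded-For"]);
```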
Advantages:
- Dynamic IP
- An md5 fingerprint per compressed file avoids repeated compression
- No limit on the number of compressions
- Generates a compression-ratio report file
Compression report output:
Code implementation:
#!/usr/bin/env node
const fs = require("fs");
const path = require("path");
const https = require('https')
const chalk = require('chalk');
const md5 = require('md5');
const args = require('minimist')(process.argv.slice(2))
/**
 * args parameters
 * @param {*} md
 * No md report is generated by default.
 * Pass the md parameter to generate one:
 * node index.js --md=true
 * @returns whether to generate the md report
 *
 * @param {*} folder
 * Folder to scan for images, defaults to src:
 * node index.js --folder=src
 * @returns
 */
// File types to process
const imgsInclude = ['.png', '.jpg'];
// Folders to skip
const filesExclude = ['dist', 'build', 'node_modules', 'config'];
const urls = [
  "tinyjpg.com",
  "tinypng.com"
];
const config = {
  // Maximum image size: 5 MB
  max: 5242880,
  // Up to 20 images per batch; 10 by default
  maxLength: 10,
};
const Log = console.log
const Success = chalk.green
const Error = chalk.bold.red;
const Warning = chalk.keyword('orange');
// md5 fingerprints of files compressed in earlier runs
let keys = []
// Files found under the target folder
let filesList = []
// Files compressed in this run
const squashList = []
// Build randomized request options
function header() {
  const ip = new Array(4).fill(0).map(() => parseInt(Math.random() * 255)).join(".");
  const index = Math.round(Math.random());
  return {
    headers: {
      "Cache-Control": "no-cache",
      "Content-Type": "application/x-www-form-urlencoded",
      "Postman-Token": Date.now(),
      "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.116 Safari/537.36",
      "X-Forwarded-For": ip
    },
    hostname: urls[index],
    method: "POST",
    path: "/web/shrink",
    rejectUnauthorized: false
  };
}
// Check whether a file or folder exists
function access(dir) {
  return new Promise((resolve) => {
    fs.access(dir, fs.constants.F_OK, err => {
      resolve(!err)
    })
  })
}
// Read the fingerprint file
function read(dir) {
  return new Promise((resolve, reject) => {
    fs.readFile(dir, 'utf-8', (err, data) => {
      if (!err) {
        keys = JSON.parse(data.toString()).list
        Log(Success('File fingerprints read successfully'))
        resolve(keys)
      }
    })
  })
}
// Upload a file's binary data
function upload(file) {
  const options = header()
  return new Promise((resolve, reject) => {
    const req = https.request(options, res => res.on('data', data => {
      const obj = JSON.parse(data.toString());
      obj.error ? reject(obj.message) : resolve(obj);
    }));
    req.on('error', error => {
      Log(Error('upload error'), error)
      reject(error)
    })
    req.write(file, 'binary')
    req.end()
  })
}
// Download the compressed file
function download(url) {
  const options = new URL(url);
  return new Promise((resolve, reject) => {
    const req = https.request(options, res => {
      let file = '';
      res.setEncoding('binary');
      res.on('data', chunk => {
        file += chunk;
      });
      res.on('end', () => resolve(file));
    });
    req.on('error', error => {
      Log(Error('download error'), url)
      reject(error)
    })
    req.end();
  });
}
// Recursively collect files of the target types
function readFile(filePath, filesList) {
  const files = fs.readdirSync(filePath);
  files.forEach((file) => {
    const fPath = path.join(filePath, file);
    const states = fs.statSync(fPath);
    // File extension
    const extname = path.extname(file);
    if (states.isFile()) {
      const obj = {
        size: states.size,
        name: file,
        path: fPath,
      };
      const key = md5(fPath + states.size)
      if (states.size > config.max) {
        Log(Warning(fPath))
        Log(`File ${file} exceeds the 5 MB compression limit`);
      }
      if (states.size < config.max && imgsInclude.includes(extname) && !keys.includes(key)) {
        filesList.push(obj);
      }
    } else {
      if (!filesExclude.includes(file)) {
        readFile(fPath, filesList);
      }
    }
  });
}
function getFileList(filePath) {
  const filesList = [];
  readFile(filePath, filesList);
  return filesList;
}
function writeFile(fileName, data) {
  fs.writeFile(fileName, data, 'utf-8', () => {
    Log(Success('File generated successfully'))
  });
}
function transformSize(size) {
  return size > 1024 ? (size / 1024).toFixed(2) + 'KB' : size + 'B'
}
let str = `# Project image comparison\n
## Image compression information\n
| Filename | Original size | Compressed size | Compression ratio | File path |\n| -- | -- | -- | -- | -- |\n`;
function output(list) {
  for (let i = 0; i < list.length; i++) {
    const { name, path: _path, size, miniSize } = list[i];
    const fileSize = `${transformSize(size)}`;
    const compressionSize = `${transformSize(miniSize)}`;
    const compressionRatio = `${(100 * (size - miniSize) / size).toFixed(2) + '%'}`;
    const desc = `| ${name} | ${fileSize} | ${compressionSize} | ${compressionRatio} | ${_path} |\n`;
    str += desc;
  }
  let size = 0, miniSize = 0
  list.forEach(item => {
    size += item.size
    miniSize += item.miniSize
  })
  const s = `
## Total size change\n
| Original size | Compressed size | Compression ratio |\n| -- | -- | -- |\n| ${transformSize(size)} | ${transformSize(miniSize)} | ${(100 * (size - miniSize) / size).toFixed(2) + '%'} |
`
  str = str + s
  // Report filename kept from the original; it means "image compression ratio"
  writeFile('图片压缩比.md', str);
}
// Generate file fingerprints
function fingerprint() {
  const list = []
  squashList.forEach(item => {
    const { miniSize, path } = item
    const md5s = md5(path + miniSize)
    list.push(md5s)
  })
  fs.writeFile('squash.json', JSON.stringify({ list: keys.concat(list) }, null, '\t'), err => {
    if (err) throw err
    Log(Success('File fingerprints generated successfully'))
  })
}
function squash() {
  try {
    Promise.all(
      filesList.map(async item => {
        Log(Success(item.path))
        // Read the raw image data
        const file = fs.readFileSync(item.path, 'binary')
        const { output = {} } = await upload(file)
        if (!output.url) return
        const data = await download(output.url)
        if (!data) return
        // Overwrite the original file with the compressed data
        fs.writeFileSync(item.path, data, 'binary')
        const miniSize = fs.statSync(item.path).size;
        squashList.push({ ...item, miniSize });
      })
    ).then(() => {
      if (args['md']) {
        output(squashList);
      }
      fingerprint()
      console.timeEnd('squash time')
    })
  } catch (error) {
    return Promise.reject(error)
  }
}
async function start() {
  try {
    const files = args['folder'] || 'src'
    const check = await access(files)
    if (!check) {
      Log(Error('The target file or folder does not exist; please choose another directory'))
      return
    }
    const res = await access('squash.json')
    if (res) {
      await read('squash.json')
    }
    filesList = getFileList(files)
    squash()
  } catch (error) {
    Log(error)
  }
}
console.time('squash time')
start()
img-squash-cli
The tool has been published to npm as img-squash-cli
Usage:
Install the tool globally
npm install -g img-squash-cli
Run the compression
npx squash-cli
If you need to compress a specific folder, add the folder parameter. The default is the src folder in the project directory
// Change to process contents in config folder
npx squash-cli --folder=config
If you want to generate a report with the compression ratios, add the md parameter. It is disabled by default
npx squash-cli --md=true