feat: add automated image optimization pipeline with Sharp #243
Rozerxshashank wants to merge 5 commits into AOSSIE-Org:dev from
Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review.
📝 Walkthrough

This PR adds an automated image optimization pipeline using Sharp. A new script compresses PNG, JPEG, and WebP images in the app/assets directory, generates WebP versions for non-WebP files, and replaces originals only when compression improves file size, with atomic file operations and comprehensive error handling.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 3 passed | ❌ 2 failed (2 warnings)
Actionable comments posted: 5
🤖 Fix all issues with AI agents
In `@package.json`:
- Around line 18-24: Add a prebuild npm script so image optimization runs
automatically before builds: create a "prebuild" script that invokes the
existing "optimize-images" script, ensuring the NPM lifecycle runs optimization
prior to the "build" script; update package.json scripts to include "prebuild"
(which calls "optimize-images") and verify that "build" remains as
"react-scripts build" so npm run build will first run the new prebuild hook.
- Around line 25-27: Update the sharp devDependency in package.json from
"0.33.2" to the latest stable "0.34.5": edit the "devDependencies" entry for the
"sharp" package to use version "0.34.5", then run your package manager's install
command (npm install or yarn install) and verify builds/tests to ensure no
breaking changes from the bump.
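The prebuild suggestion above could look roughly like this in `package.json` (a sketch: the `optimize-images` and `build` entries are taken from the review, any other scripts are assumptions):

```json
{
  "scripts": {
    "optimize-images": "node scripts/optimize-images.js",
    "prebuild": "npm run optimize-images",
    "build": "react-scripts build"
  }
}
```

With this in place, `npm run build` triggers the `prebuild` lifecycle hook first, so optimization always runs before the production build.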
In `@scripts/optimize-images.js`:
- Around line 68-70: The current pattern deletes the original with
fs.promises.unlink(inputPath) before renaming, risking data loss; instead remove
the unlink and perform an atomic replacement by calling
fs.promises.rename(tempPath, inputPath) directly (this lets POSIX rename
atomically replace the target) and keep error handling around the rename to
handle failures; update the code around the tempPath/inputPath replacement in
scripts/optimize-images.js (the block that currently calls unlink then rename)
to only call fs.promises.rename(tempPath, inputPath) and handle/propagate any
errors.
- Around line 128-131: The final summary prints can produce NaN when
totalOriginal is 0; update the end-of-run logging that uses totalOriginal,
totalCompressed and formatBytes to guard against division by zero: compute a
percentSaved variable that is 0 (or "0.0") when totalOriginal === 0, otherwise
compute ((totalOriginal - totalCompressed) / totalOriginal * 100). Use that
percentSaved in the Saved log message so the Saved line never prints NaN; modify
the block containing totalOriginal/totalCompressed/formatBytes to perform this
check before formatting the percent.
- Around line 12-14: The PNG encoder options currently include an invalid
"effort" property (used only by WebP); locate the .png(...) call that uses
PNG_QUALITY and remove the effort option there (keep WEBP-specific options on
the .webp(...) call). If you want lossless PNGs, remove both "quality" and
"effort" and keep only "compressionLevel"; if you want palette-quantized PNG-8,
keep "quality" (PNG_QUALITY) and remove "effort". Ensure only valid Sharp PNG
options are passed to the .png(...) invocation.
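The division-by-zero guard described in the summary-logging comment above can be sketched as follows (the function name and one-decimal formatting are assumptions, not the PR's actual code):

```javascript
// Sketch: guard the end-of-run "Saved" percentage against division by zero,
// so an empty run (totalOriginal === 0) prints "0.0" instead of NaN.
function percentSaved(totalOriginal, totalCompressed) {
  if (totalOriginal === 0) return '0.0';
  return (((totalOriginal - totalCompressed) / totalOriginal) * 100).toFixed(1);
}
```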
🧹 Nitpick comments (2)
scripts/optimize-images.js (2)
16-26: Consider dynamic PNG discovery instead of a hardcoded list. The hardcoded list requires manual maintenance when images are added or removed. A glob-based approach would be more maintainable:

```js
const glob = require('glob');
const IMAGES_TO_OPTIMIZE = glob.sync('*.png', { cwd: ASSETS_DIR });
```

Alternatively, use `fs.readdirSync` with a filter if you want to avoid adding another dependency. You could also add an exclusion list for files that shouldn't be optimized.
28-35: Silent error handling may mask filesystem issues. Returning `0` on error could produce misleading optimization statistics (e.g., "100% savings" when the file couldn't be read). Consider logging a warning:

```diff
 } catch (error) {
+  console.warn(`Warning: Could not stat ${filePath}: ${error.message}`);
   return 0;
 }
```
package-lock.json is automatically generated by npm, which is why it shows ~18k changed lines.
M4dhav left a comment:
Please change the base branch to dev, and remove unrelated changes.
For example, many images are logged as changed despite nothing changing in them.
Force-pushed from f1a162e to dfd7e6b
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@scripts/optimize-images.js`:
- Around line 69-96: Compare the compressed temp file size to the original
before renaming: after writing tempPath and computing compressedSize (via
getFileSize) check if compressedSize < originalSize and only then await
fs.promises.rename(tempPath, inputPath); if compressedSize >= originalSize,
remove the temp file (fs.promises.unlink(tempPath)), keep the original inputPath
intact, set compressedSize = originalSize and savings = 0, and then generate the
WebP from the preserved inputPath (webpPath) as you currently do; reference
tempPath, inputPath, webpPath, getFileSize, compressedSize, originalSize and
preserve the atomic rename behavior when replacing.
Hey @M4dhav , I removed the unrelated changes. But I am not allowed to change the base branch to dev. I think only the maintainers have that access. |
Hey @Rozerxshashank , does this script only work on PNGs? |
Currently, it's focused on PNGs as they were the largest assets in the project, but I can quickly update it to support JPG/JPEG as well if that would be helpful. |
Yes, sorry for the delayed reply. Please extend support to all image types so we don't have to refactor later. |
Done! |
@coderabbitai review |
✅ Actions performed: Review triggered.
Actionable comments posted: 2
🧹 Nitpick comments (1)
scripts/optimize-images.js (1)
25-43: Consider recursive discovery so nested assets are not skipped. At Line 27, `readdir(ASSETS_DIR)` is non-recursive, so images in subfolders under `app/assets` won't be optimized.

♻️ Suggested refactor

```diff
 async function findImages() {
-  try {
-    const files = await fs.promises.readdir(ASSETS_DIR);
-    const images = [];
-
-    for (const file of files) {
-      const ext = path.extname(file).toLowerCase();
-      if (SUPPORTED_EXTENSIONS.includes(ext)) {
-        const filePath = path.join(ASSETS_DIR, file);
-        const stats = await fs.promises.stat(filePath);
-
-        // Only include files above minimum size threshold
-        if (stats.size >= MIN_FILE_SIZE) {
-          images.push(file);
-        }
-      }
-    }
-
-    return images;
+  async function walk(dir, base = '') {
+    const entries = await fs.promises.readdir(dir, { withFileTypes: true });
+    const out = [];
+    for (const entry of entries) {
+      const rel = path.join(base, entry.name);
+      const abs = path.join(dir, entry.name);
+      if (entry.isDirectory()) {
+        out.push(...await walk(abs, rel));
+        continue;
+      }
+      const ext = path.extname(entry.name).toLowerCase();
+      if (!SUPPORTED_EXTENSIONS.includes(ext)) continue;
+      const stats = await fs.promises.stat(abs);
+      if (stats.size >= MIN_FILE_SIZE) out.push(rel);
+    }
+    return out;
+  }
+
+  try {
+    return await walk(ASSETS_DIR);
   } catch (error) {
     console.error('Error reading assets directory:', error.message);
     return [];
   }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/optimize-images.js` around lines 25 - 43, The findImages function currently uses fs.promises.readdir(ASSETS_DIR) non-recursively so files in subdirectories are skipped; change findImages to perform a recursive directory walk starting from ASSETS_DIR (use fs.promises.readdir with { withFileTypes: true } or equivalent) so it descends into subfolders, builds full paths via path.join, checks stats.size against MIN_FILE_SIZE and filters by SUPPORTED_EXTENSIONS before pushing results (preserve existing behavior but return relative or absolute file paths consistently).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@scripts/optimize-images.js`:
- Around line 149-154: The catch block in optimizeImage currently calls await
fs.promises.unlink(tempPath) directly which can throw and break per-file fault
isolation; wrap the cleanup of tempPath in its own safe guard (e.g., check
fileExists(tempPath) then attempt unlink inside a nested try/catch) so any error
during fs.promises.unlink(tempPath) is swallowed or logged but does not rethrow,
ensuring optimizeImage continues to resolve/reject based on the original
optimization error only.
- Around line 131-134: The code writes WebP output directly to webpPath (using
sharp(inputPath)...toFile(webpPath)) which can overwrite an existing WebP even
when larger and bypasses the atomic/temp replace pattern used for originals;
change the logic to write to a temp path (e.g., tempWebpPath), call
sharp(...).toFile(tempWebpPath), get its size with getFileSize(tempWebpPath),
compare sizes against existing webpPath, and only atomically replace/rename the
existing webpPath with tempWebpPath if the new file is smaller (or otherwise
desirable), cleaning up the temp file on failure — references: inputPath,
webpPath, WEBP_QUALITY, getFileSize.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 48647914-02d4-407c-983f-3b3227e3c895
⛔ Files ignored due to path filters (1)
`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (1)
scripts/optimize-images.js
```js
await sharp(inputPath)
  .webp({ quality: WEBP_QUALITY })
  .toFile(webpPath);
webpSize = await getFileSize(webpPath);
```
WebP output should use temp + guarded replace, not direct overwrite.
At Line 131, writing directly to webpPath can overwrite an existing WebP even when the new output is larger, and it isn’t using the same atomic replacement pattern you used for originals.
🛡️ Suggested fix
```diff
-    if (ext !== '.webp') {
-      await sharp(inputPath)
-        .webp({ quality: WEBP_QUALITY })
-        .toFile(webpPath);
-      webpSize = await getFileSize(webpPath);
+    if (ext !== '.webp') {
+      const webpTempPath = path.join(
+        path.dirname(webpPath),
+        `temp_${path.basename(webpPath)}`
+      );
+      await sharp(inputPath)
+        .webp({ quality: WEBP_QUALITY })
+        .toFile(webpTempPath);
+
+      const newWebpSize = await getFileSize(webpTempPath);
+      const existingWebpSize = await getFileSize(webpPath);
+
+      if (existingWebpSize === 0 || (newWebpSize > 0 && newWebpSize < existingWebpSize)) {
+        await fs.promises.rename(webpTempPath, webpPath);
+        webpSize = newWebpSize;
+      } else {
+        await fs.promises.unlink(webpTempPath);
+        webpSize = existingWebpSize;
+      }
     } else {
       webpSize = compressedSize;
     }
```
```js
} catch (error) {
  console.error(`Error optimizing $(unknown):`, error.message);
  // Clean up temp file if it exists
  if (await fileExists(tempPath)) {
    await fs.promises.unlink(tempPath);
  }
```
Guard cleanup unlink inside catch to preserve per-file fault isolation.
At Line 153, if unlink(tempPath) fails inside the catch block, optimizeImage() can still reject and interrupt the batch.
🧯 Suggested hardening
```diff
   } catch (error) {
     console.error(`Error optimizing $(unknown):`, error.message);
     // Clean up temp file if it exists
     if (await fileExists(tempPath)) {
-      await fs.promises.unlink(tempPath);
+      try {
+        await fs.promises.unlink(tempPath);
+      } catch (cleanupError) {
+        console.error(`Failed to clean temp file for $(unknown):`, cleanupError.message);
+      }
     }
     return null;
   }
```
Description
Adds a build-time image optimization pipeline using Sharp to compress PNG images and generate WebP versions. This reduces total image assets by 77.9% (4.52 MB → 1.02 MB), significantly improving page load performance.
This approach is fully compatible with static site generation as Sharp runs at build-time, not runtime.
Fixes #236
Type of change
How Has This Been Tested?
- `npm install` to install the Sharp dependency
- `npm run optimize-images` - images compressed successfully (77.9% reduction)
- `npm run build` - production build completes without errors

Results:
Checklist:
Maintainer Checklist