How to Compress JPEG Files: A Guide to Optimal Quality

JPEG compression is a balancing act. Compress too little and you're serving 3MB photos to mobile users. Compress too hard and your product images look like watercolors. Get it right and you have images that look sharp, load fast, and don't bloat your site.

This guide covers how JPEG compression actually works, what quality settings mean in practice, and which tool to use depending on your situation.

How JPEG Compression Works

JPEG compression uses a technique called the Discrete Cosine Transform (DCT). When you save a JPEG, the encoder:

  1. Splits the image into 8x8 pixel blocks
  2. Transforms each block into frequency data (low frequencies = broad color areas, high frequencies = sharp edges and fine detail)
  3. Applies a quantization table that rounds down the high-frequency data based on your quality setting
  4. Encodes the result with Huffman compression

The quality parameter controls how aggressively step 3 discards detail. High quality = fine-grained quantization = large file. Low quality = coarse quantization = small file with visible artifacts.
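
The scaling in step 3 can be sketched in a few lines of JavaScript. The base row below is the first row of the standard luminance quantization table from the JPEG spec (ITU-T T.81, Annex K), and the quality-to-scale formula is the one libjpeg uses; the coefficient value fed in is an illustrative stand-in, not data from a real image:

```javascript
// First row of the standard JPEG luminance quantization table (T.81 Annex K).
// Index 0 is the low-frequency end; index 7 is the highest frequency.
const baseRow = [16, 11, 10, 16, 24, 40, 51, 61];

// libjpeg's mapping from the 0-100 quality knob to a table scale factor.
function scaleQuant(base, quality) {
  const scale = quality < 50 ? 5000 / quality : 200 - 2 * quality;
  return base.map(q =>
    Math.min(255, Math.max(1, Math.floor((q * scale + 50) / 100)))
  );
}

// Step 3 itself: divide each DCT coefficient by its table entry and round.
const quantize = (coeff, q) => Math.round(coeff / q);
const dequantize = (level, q) => level * q;

const q80 = scaleQuant(baseRow, 80); // finer steps, more detail survives
const q30 = scaleQuant(baseRow, 30); // coarser steps, more detail discarded

// A modest high-frequency coefficient survives at quality 80
// but rounds to zero at quality 30 (and zeros compress very well):
quantize(50, q80[7]); // non-zero
quantize(50, q30[7]); // 0
```

This is also why very low quality settings produce blocky artifacts: whole bands of high-frequency coefficients collapse to zero, and the 8x8 block boundaries start to show.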

JPEG is lossy by design — once you throw away that frequency data, you cannot recover it. That is fine. The human visual system is much more sensitive to brightness changes than color changes, and much more tolerant of subtle, uniform distortion than sharp edges. JPEG's quantization tables are tuned to exploit exactly that.

JPEG Quality Settings Explained

Most tools expose quality as a 0-100 scale. Here is what that actually means:

| Quality | File Size Reduction | Visual Difference | Best Use Case |
|---------|---------------------|-------------------|---------------|
| 90-100 | 10-30% | Imperceptible | Print, archival, source files |
| 80-89 | 40-60% | Imperceptible on screen | High-quality web images, product photos |
| 75-79 | 55-70% | Imperceptible to most viewers | General web delivery — the sweet spot |
| 65-74 | 65-80% | Slight softness at close inspection | Thumbnails, background images |
| 50-64 | 75-90% | Noticeable degradation | Previews, very low-bandwidth contexts |
| Below 50 | 90%+ | Heavy blocking artifacts | Rarely useful |

The 75-85 sweet spot: For web delivery, quality 75-85 hits the Pareto point. You get 55-70% file size reduction with zero visible degradation at normal viewing distances. Quality 85+ gives diminishing returns — the extra bytes do not produce visible improvement. Quality below 70 risks artifacts on high-contrast edges like text-on-image.

When in doubt, start at 80 and compare against the original at 100% zoom. If you cannot spot the difference, you are done.

Method 1: Compress JPEG Online

The fastest path is Pixotter's compression tool. It runs entirely in your browser — no upload, no server round-trip, no waiting.

  1. Drop your JPEG (or up to 20 JPEGs for batch mode)
  2. Adjust the quality slider — the live preview updates in real time
  3. Check the file size reduction counter
  4. Download

The in-browser approach has a practical advantage beyond speed: your images never leave your machine. For client photos, medical images, or anything sensitive, that matters.

Batch mode handles up to 20 images at once with a single quality setting applied to all. If you need per-image control, process them individually.

Method 2: Command Line Tools

mozjpeg (v4.1.5) — The Gold Standard

mozjpeg is Mozilla's optimized JPEG encoder. At the same quality setting, it produces files 5-15% smaller than standard libjpeg because it uses better quantization tables and trellis quantization.

# Install (macOS)
brew install mozjpeg

# Install (Linux — build from source)
git clone --branch v4.1.5 https://github.com/mozilla/mozjpeg.git
cd mozjpeg && mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/usr/local .. && make && sudo make install

# Compress at quality 80
cjpeg -quality 80 -outfile output.jpg input.jpg

# Batch compress all JPEGs in a directory
for f in *.jpg; do
  cjpeg -quality 80 -outfile "compressed_${f}" "$f"
done

jpegtran — Lossless Optimization

jpegtran (included with libjpeg-turbo v3.0.3) performs lossless JPEG optimization. It does not re-encode — it strips metadata, optimizes Huffman tables, and optionally converts to progressive encoding. Typical savings: 2-10%.

# Strip metadata and optimize
jpegtran -optimize -copy none -outfile output.jpg input.jpg

# Convert to progressive JPEG
jpegtran -optimize -progressive -copy none -outfile output.jpg input.jpg

Use jpegtran when you need zero quality loss — on files that have already been compressed to their target quality, or when you cannot afford any additional lossy encoding.

ImageMagick (v7.1.1) — Batch Control

ImageMagick handles batch operations and format conversions across large image sets:

# Compress a single file (IMv7 syntax — "magick convert" is the legacy form)
magick input.jpg -quality 80 output.jpg

# Batch convert a folder (the -path directory must already exist)
magick mogrify -path ./compressed -quality 80 *.jpg

# With chroma subsampling for additional savings on photos
magick input.jpg -quality 80 -sampling-factor 4:2:0 output.jpg

Method 3: Build Pipeline Integration

For automated workflows, Sharp handles JPEG compression in Node.js with optional mozjpeg support.

Sharp (v0.33.5)

npm install sharp@0.33.5

Basic compression:

import sharp from 'sharp';

await sharp('input.jpg')
  .jpeg({ quality: 80, mozjpeg: true })
  .toFile('output.jpg');

The mozjpeg: true flag enables mozjpeg's quantization tables inside Sharp — you get mozjpeg quality without a separate binary dependency.

Batch processing a directory:

import sharp from 'sharp';
import { mkdir, readdir } from 'fs/promises';
import path from 'path';

const inputDir = './images';
const outputDir = './compressed';

// Create the output directory first; toFile() errors if it is missing
await mkdir(outputDir, { recursive: true });

const files = await readdir(inputDir);

await Promise.all(
  files
    .filter(f => f.match(/\.(jpg|jpeg)$/i))
    .map(file =>
      sharp(path.join(inputDir, file))
        .jpeg({ quality: 80, mozjpeg: true })
        .toFile(path.join(outputDir, file))
    )
);

For a broader look at image size reduction strategies, including format conversion and dimension changes, see how to reduce image size effectively.

Progressive JPEG vs Baseline JPEG

A baseline JPEG loads top-to-bottom. A progressive JPEG loads in multiple passes — first a blurry full-image preview, then progressively sharper passes until the final image is complete.
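
A quick way to tell which kind a given file is: baseline and progressive JPEGs use different start-of-frame markers (SOF0, 0xFFC0, versus SOF2, 0xFFC2). A minimal sketch that assumes a well-formed file and ignores the rarer SOF variants:

```javascript
// Scan for the start-of-frame marker. SOF always precedes the entropy-coded
// scan data, and 0xFF bytes inside that data are byte-stuffed, so a simple
// forward scan is safe on well-formed files.
function jpegScanType(bytes) {
  for (let i = 2; i + 1 < bytes.length; i++) {
    if (bytes[i] !== 0xff) continue;
    if (bytes[i + 1] === 0xc0) return 'baseline';    // SOF0
    if (bytes[i + 1] === 0xc2) return 'progressive'; // SOF2
  }
  return 'unknown'; // e.g. SOF1 (extended sequential) or a truncated file
}

// Usage: jpegScanType(await fs.promises.readFile('photo.jpg'))
```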

Progressive loading matters most for large hero images, slow or high-latency connections, and image-heavy pages where perceived load speed counts.

Progressive JPEGs are often slightly smaller than their baseline equivalents (1-5%) because the multi-scan encoding achieves marginally better compression. The file size benefit is a bonus, not the reason to choose it.

Convert to progressive with jpegtran:

jpegtran -progressive -optimize -copy none -outfile output.jpg input.jpg

Or with Sharp:

await sharp('input.jpg')
  .jpeg({ quality: 80, progressive: true })
  .toFile('output.jpg');

For web delivery, progressive is the better default. The downside — slightly more CPU to decode — is negligible on any device made in the last decade.

Compress JPEG for Specific Uses

Different contexts have different requirements. Here is a reference for a 1920x1080 photographic image:

| Use Case | Recommended Quality | Typical File Size | Notes |
|----------|---------------------|-------------------|-------|
| Email attachment | 70-75 | 150-250 KB | Many clients block images over 1 MB |
| Web page (body image) | 75-82 | 200-350 KB | Use progressive encoding |
| Social media | 80-85 | 300-500 KB | Platforms re-compress on upload; start higher |
| E-commerce product | 80-85 | 300-500 KB | Users zoom in; preserve edge detail |
| Blog thumbnail | 65-75 | 80-150 KB | Small display size hides compression artifacts |
| Print | 92-95 | 2-5 MB | Print requires full detail; do not compress hard |

Social media note: Instagram, Twitter/X, and Facebook re-compress images on upload. If you pre-compress to 70, the platform's second compression pass will compound the artifacts. Start at 80-85 for social uploads.

JPEG vs WebP: Should You Convert Instead?

Sometimes the right move is not compressing harder — it is switching formats.

WebP produces files 25-35% smaller than JPEG at the same perceived quality. If your users are on modern browsers (which, as of 2026, is essentially everyone — WebP support is at 97%+), converting to WebP beats squeezing more out of JPEG.

Switch to WebP when:

  - You serve web pages to modern browsers and control the delivery pipeline
  - File size is the priority and a 25-35% saving is worth a conversion step

Stick with JPEG when:

  - You need universal compatibility: email clients and older software still expect .jpg
  - Your files pass through tools or workflows that only handle JPEG

For a full format comparison — including AVIF and PNG — see Best Image Format for Web: JPEG, PNG, WebP, or AVIF?.

If you are trying to hit a specific file size target, how to compress an image to 100KB walks through the iterative approach.

Frequently Asked Questions

What quality should I use for web images?

Start at 80. That is the sweet spot for photographic content — imperceptible quality loss, 50-65% smaller than the uncompressed source. If file size is critical (mobile-first, large images), drop to 75. If you are showing product images that users zoom into, bump to 85.

Does saving a JPEG multiple times reduce quality?

Yes. Every time you open a JPEG and re-save it as a JPEG, the quantization step runs again on already-quantized data. The degradation compounds. Two saves at quality 80 is not the same as one save at quality 64 — the pattern of artifacts is different — but the quality does degrade.

Keep your source files as lossless formats (PNG or TIFF) and generate the JPEG deliverable from the source in a single step. Never edit-and-resave JPEGs iteratively.

How do I compress JPEG without any quality loss?

Use jpegtran with -optimize and -copy none. It strips metadata and optimizes Huffman tables without touching the pixel data. Typical savings are 2-10%. That is the ceiling for lossless JPEG optimization — if you need more compression, you have to accept some quality trade-off.

What is mozjpeg and why is it better?

mozjpeg is Mozilla's fork of libjpeg, the reference JPEG implementation. It uses improved quantization tables (tuned from perceptual quality research), trellis quantization, and better Huffman optimization. At quality 80, mozjpeg typically produces files 5-15% smaller than standard libjpeg while looking identical. The downside: it encodes 3-5x slower than libjpeg, which matters for real-time encoding but is irrelevant for batch processing.

Can I compress JPEG to a specific file size?

Not directly — JPEG quality settings produce variable output sizes depending on image content. A photo of a clear blue sky compresses to a much smaller file than a photo of a dense forest at the same quality setting.

The practical approach is iterative: start at quality 80, check the output size, and adjust. Some tools (including Pixotter) show the output size in real time as you adjust the quality slider. For a step-by-step process, see compress image to 100KB.
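
The iterate-and-check loop can be automated as a binary search over the quality range. This is a sketch with the encoder abstracted into a callback; in practice the callback might wrap Sharp's .jpeg({ quality }).toBuffer() and return the buffer length (that wiring is an assumption, not shown here):

```javascript
// Find the highest quality whose output fits under maxBytes.
// `encode` is any async (quality) => sizeInBytes function; output size
// must grow (roughly monotonically) with quality for the search to work.
async function qualityForBudget(encode, maxBytes, lo = 40, hi = 95) {
  let best = null;
  while (lo <= hi) {
    const q = (lo + hi) >> 1;
    const size = await encode(q);
    if (size <= maxBytes) {
      best = q;   // fits: remember it and try a higher quality
      lo = q + 1;
    } else {
      hi = q - 1; // too big: search lower qualities
    }
  }
  return best; // null if even the lowest quality overshoots the budget
}
```

Each probe is a full encode, but the search needs at most six of them to cover the 40-95 range, which is usually cheaper than tuning by hand.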