Gzip/Brotli Compression in Apache and Nginx
Introduction
Compression is one of the most effective and easiest ways to improve website performance. By compressing text-based content before transmission, web servers can reduce bandwidth usage by 60-90%, dramatically decreasing page load times and improving user experience. Gzip has been the industry standard for decades, while Brotli is a newer algorithm offering 15-30% better compression ratios.
For modern websites serving megabytes of HTML, CSS, JavaScript, JSON, and other text formats, enabling compression is not optional—it's essential. A typical uncompressed web page might be 2-3 MB, but with Brotli compression, it can shrink to 400-600 KB, reducing load time from 6 seconds to under 1 second on a 4G connection. This improvement directly impacts SEO rankings, conversion rates, and user satisfaction.
This comprehensive guide covers both Gzip and Brotli compression configuration for Apache and Nginx, performance comparisons, optimization techniques, and troubleshooting. You'll learn how to implement compression correctly to achieve maximum performance benefits while avoiding common pitfalls.
Understanding Compression
Why Compression Matters
Performance Benefits:
- Reduced Bandwidth: 60-90% less data transferred
- Faster Load Times: 3-10x faster page loads
- Lower Costs: Reduced bandwidth costs
- Better SEO: Google favors fast sites
- Improved UX: Faster is always better
Gzip vs Brotli
| Feature | Gzip | Brotli |
|---|---|---|
| Compression Ratio | Good | Better (15-30% more) |
| Compression Speed | Fast | Slower |
| Decompression Speed | Fast | Fast |
| Browser Support | Universal (99.9%) | Modern (95%+) |
| CPU Usage | Low | Medium |
| Best For | All content | Static assets |
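To get a feel for the ratio difference on your own content, you can compare the two algorithms locally. This is a rough sketch that assumes the gzip and brotli command-line tools are installed and that a representative text asset (here a hypothetical app.js) is available:
# Compare Gzip and Brotli on the same file (sizes will vary with your content)
FILE=app.js                      # substitute any representative text asset
wc -c < "$FILE"                  # original size
gzip -9 -c "$FILE" | wc -c       # bytes after Gzip at maximum level
brotli -q 11 -c "$FILE" | wc -c  # bytes after Brotli at maximum quality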
Compression Effectiveness by File Type
| File Type | Compression Ratio | Worth Compressing? |
|---|---|---|
| HTML | 70-85% | Yes |
| CSS | 70-85% | Yes |
| JavaScript | 60-80% | Yes |
| JSON | 70-90% | Yes |
| XML | 75-85% | Yes |
| SVG | 60-80% | Yes |
| Plain Text | 60-80% | Yes |
| Images (JPEG/PNG) | 0-5% | No (already compressed) |
| Video | 0-2% | No (already compressed) |
| Fonts (WOFF/WOFF2) | 5-15% | Maybe (already compressed internally) |
Benchmarking Without Compression
Test Page Size
# Create test HTML file
cat > /var/www/html/compression-test.html << 'EOF'
<!DOCTYPE html>
<html>
<head>
<title>Compression Test</title>
<style>
body { font-family: Arial; padding: 20px; }
.content { line-height: 1.6; }
</style>
</head>
<body>
<h1>Compression Test Page</h1>
<div class="content">
<!-- Repeated content to create size -->
Lorem ipsum dolor sit amet... (repeated 100 times)
</div>
<script>
console.log('Page loaded');
// More JavaScript code...
</script>
</body>
</html>
EOF
# Check uncompressed size
curl -H "Accept-Encoding: identity" http://localhost/compression-test.html -o test.html
ls -lh test.html
# Size: 156KB uncompressed
Baseline Performance
# Test without compression
curl -w "\nSize: %{size_download} bytes\nTime: %{time_total}s\n" \
-H "Accept-Encoding: identity" \
-o /dev/null http://localhost/compression-test.html
# Results (typical):
# Size: 156,842 bytes
# Time: 0.234s (on slow connection: 2.5s)
# Full page load test
curl -w "Time: %{time_total}s\n" http://localhost/ -o /dev/null
# Time: 3.8s (multiple resources)
Gzip Configuration
Nginx Gzip Configuration
# /etc/nginx/nginx.conf or /etc/nginx/sites-available/default
http {
# Enable Gzip compression
gzip on;
# Compression level (1-9)
# 1 = fastest, least compression
# 9 = slowest, most compression
# 5-6 = good balance (recommended)
gzip_comp_level 6;
# Minimum file size to compress (bytes)
# Don't compress tiny files (overhead not worth it)
gzip_min_length 1000;
# Compress responses to proxied requests, regardless of upstream response headers
gzip_proxied any;
# File types to compress
gzip_types
text/plain
text/css
text/xml
text/javascript
application/json
application/javascript
application/xml+rss
application/rss+xml
application/atom+xml
image/svg+xml
text/x-component
text/x-cross-domain-policy;
# Send "Vary: Accept-Encoding" so caches store compressed and uncompressed variants separately
gzip_vary on;
# Disable for old IE6 (optional, IE6 is dead)
gzip_disable "msie6";
# Buffer size (default: 32 4k or 16 8k)
gzip_buffers 16 8k;
# Minimum request HTTP version required for compression (default: 1.1)
gzip_http_version 1.1;
# Rest of nginx configuration...
}
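After editing the configuration, validate and reload Nginx so the changes take effect (this assumes a systemd-based install; adjust the service command to your environment):
# Validate the configuration, then reload without dropping connections
nginx -t && systemctl reload nginx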
Apache Gzip Configuration
# /etc/apache2/mods-available/deflate.conf
# Or in .htaccess
<IfModule mod_deflate.c>
# Enable compression
SetOutputFilter DEFLATE
# Compression level (1-9)
# 6 is good balance
DeflateCompressionLevel 6
# Don't compress images, videos, PDFs
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png|ico|pdf|flv|swf|gz|zip|bz2|rar|7z)$ no-gzip
# File types to compress
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/atom+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/json
AddOutputFilterByType DEFLATE image/svg+xml
# Vary header (for proper caching)
Header append Vary Accept-Encoding
# Netscape 4.x issues
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 issues
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# Don't compress if already compressed
SetEnvIfNoCase Request_URI \.(?:exe|t?gz|zip|bz2|sit|rar)$ no-gzip
# Proxy settings
Header append Vary User-Agent env=!dont-vary
# Memory level (1-9, 8 is default)
DeflateMemLevel 8
# Window size (1-15, 15 is default)
DeflateWindowSize 15
# Buffer size
DeflateBufferSize 8096
</IfModule>
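On Debian/Ubuntu, the deflate and headers modules must be enabled before this configuration is picked up (mod_headers is needed for the Header directives above); a typical sequence for that layout is:
# Enable the required modules, check syntax, then reload Apache
a2enmod deflate headers
apachectl configtest && systemctl reload apache2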
Testing Gzip Compression
# Test if Gzip is working
curl -H "Accept-Encoding: gzip" -I http://localhost/compression-test.html | grep -i "content-encoding"
# Should show: Content-Encoding: gzip
# Compare sizes
# Uncompressed
curl -H "Accept-Encoding: identity" http://localhost/compression-test.html -o test-uncompressed.html
ls -lh test-uncompressed.html
# Size: 156KB
# Compressed
curl -H "Accept-Encoding: gzip" http://localhost/compression-test.html -o test-compressed.html.gz
ls -lh test-compressed.html.gz
# Size: 24KB (85% reduction)
# Detailed test with size and time
curl -w "\nCompressed: %{size_download} bytes\nTime: %{time_total}s\n" \
-H "Accept-Encoding: gzip" \
-o /dev/null http://localhost/compression-test.html
# Results:
# Compressed: 24,156 bytes
# Time: 0.042s (~82% faster than the 0.234s uncompressed baseline)
Brotli Configuration
Installing Brotli
Nginx:
# Ubuntu/Debian - check whether your release ships a packaged Brotli module
# (package names vary by release and third-party repository)
apt-cache search nginx | grep -i brotli
# Or build the ngx_brotli dynamic module from source against your Nginx version
apt-get install git gcc make libpcre3-dev zlib1g-dev -y
git clone --recurse-submodules https://github.com/google/ngx_brotli.git
# Download and unpack the Nginx source matching "nginx -v", then from that source directory:
./configure --with-compat --add-dynamic-module=/path/to/ngx_brotli
make modules
# Copy objs/ngx_http_brotli_*.so into the Nginx modules directory (e.g. /usr/lib/nginx/modules)
Apache:
# Apache 2.4.26+ ships mod_brotli; on current Debian/Ubuntu releases it is
# included in the apache2 package and only needs to be enabled
a2enmod brotli
systemctl restart apache2
# For older builds, compile a third-party module yourself, e.g. kjdev/apache-mod-brotli
apt-get install libbrotli-dev apache2-dev -y
git clone https://github.com/kjdev/apache-mod-brotli.git
cd apache-mod-brotli
apxs -i -a -c mod_brotli.c -lbrotlienc
Nginx Brotli Configuration
# /etc/nginx/nginx.conf
# Load Brotli module
load_module modules/ngx_http_brotli_filter_module.so;
load_module modules/ngx_http_brotli_static_module.so;
http {
# Enable Brotli compression
brotli on;
# Compression level (0-11)
# 4-6 = good balance (recommended)
# Higher = better compression but slower
brotli_comp_level 6;
# File types to compress
brotli_types
text/plain
text/css
text/xml
text/javascript
application/json
application/javascript
application/xml+rss
application/rss+xml
application/atom+xml
image/svg+xml
application/x-font-ttf
application/x-font-opentype
application/vnd.ms-fontobject
application/x-web-app-manifest+json;
# Minimum file size to compress
brotli_min_length 1000;
# Window size (default: 512k)
brotli_window 512k;
# Buffer size (deprecated and ignored by current ngx_brotli; harmless to keep)
brotli_buffers 16 8k;
# Enable for static files (pre-compressed .br files)
brotli_static on;
# Keep Gzip as fallback
gzip on;
gzip_comp_level 6;
gzip_types text/plain text/css text/xml text/javascript application/json application/javascript;
# Rest of configuration...
}
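Before reloading, confirm that the module files referenced by the load_module lines actually exist on disk (the modules directory varies by distribution; /usr/lib/nginx/modules is common and is usually symlinked from /etc/nginx/modules):
# Check that the compiled Brotli modules are present, then test and reload
ls /etc/nginx/modules/ngx_http_brotli_*.so 2>/dev/null || ls /usr/lib/nginx/modules/ngx_http_brotli_*.so
nginx -t && systemctl reload nginx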
Apache Brotli Configuration
# /etc/apache2/mods-available/brotli.conf
<IfModule mod_brotli.c>
# Enable Brotli (the output filter registered by mod_brotli is BROTLI_COMPRESS)
SetOutputFilter BROTLI_COMPRESS
# Compression quality (0-11)
# 4-6 recommended for dynamic content
BrotliCompressionQuality 6
# Window size (10-24, 22 is good)
BrotliCompressionWindow 22
# File types to compress
AddOutputFilterByType BROTLI_COMPRESS text/html
AddOutputFilterByType BROTLI_COMPRESS text/plain
AddOutputFilterByType BROTLI_COMPRESS text/xml
AddOutputFilterByType BROTLI_COMPRESS text/css
AddOutputFilterByType BROTLI_COMPRESS text/javascript
AddOutputFilterByType BROTLI_COMPRESS application/xml
AddOutputFilterByType BROTLI_COMPRESS application/javascript
AddOutputFilterByType BROTLI_COMPRESS application/json
AddOutputFilterByType BROTLI_COMPRESS image/svg+xml
# Keep Gzip as fallback for older browsers
<IfModule mod_deflate.c>
SetOutputFilter BROTLI_COMPRESS;DEFLATE
</IfModule>
# Vary header
Header append Vary Accept-Encoding
</IfModule>
Testing Brotli Compression
# Test if Brotli is working
curl -H "Accept-Encoding: br" -I http://localhost/test.html | grep -i "content-encoding"
# Should show: Content-Encoding: br
# Compare sizes: Uncompressed vs Gzip vs Brotli
# Uncompressed
curl -H "Accept-Encoding: identity" http://localhost/test.html | wc -c
# Size: 156,842 bytes
# Gzip
curl -H "Accept-Encoding: gzip" http://localhost/test.html | wc -c
# Size: 24,156 bytes (85% reduction)
# Brotli
curl -H "Accept-Encoding: br" http://localhost/test.html | wc -c
# Size: 20,987 bytes (87% reduction, 13% better than Gzip)
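When a client advertises both encodings, the server should prefer Brotli; a quick check with a browser-like Accept-Encoding header confirms which one is actually negotiated:
# Offer gzip and br together - a correctly configured server answers with br
curl -sI -H "Accept-Encoding: gzip, deflate, br" http://localhost/compression-test.html | grep -i content-encoding
# Expected: Content-Encoding: br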
Pre-Compression (Static Compression)
Pre-Compress Assets
For static assets, pre-compress during build:
#!/bin/bash
# compress-assets.sh
# Find all static assets
find /var/www/html/assets -type f \( -name "*.css" -o -name "*.js" -o -name "*.svg" -o -name "*.html" \) | while read -r file; do
# Create Gzip version (maximum level)
gzip -9 -k -f "$file"
# Create Brotli version at maximum quality (if brotli is installed)
if command -v brotli &> /dev/null; then
brotli -q 11 -k -f "$file"
fi
echo "Compressed: $file"
done
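After running the script, a quick check confirms that both variants were written alongside the originals (paths follow the example above):
# List a sample of the generated pre-compressed files
find /var/www/html/assets \( -name "*.br" -o -name "*.gz" \) -exec ls -lh {} + | head -n 10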
Nginx Static Compression
# Serve pre-compressed files
location ~* \.(css|js|svg|html)$ {
# Try .br file first (Brotli)
gzip_static on;
brotli_static on;
# Example request flow:
# Request: /style.css
# Try: /style.css.br (if client supports Brotli)
# Try: /style.css.gz (if client supports Gzip)
# Serve: /style.css (uncompressed fallback)
# Cache headers
expires 1y;
add_header Cache-Control "public, immutable";
}
Benefits of Pre-Compression
Dynamic Compression:
- CPU usage per request: High
- Compression time: 10-50ms per request
- Best compression: Good (level 6)
Pre-Compression:
- CPU usage per request: None
- Compression time: 0ms (already compressed)
- Best compression: Excellent (level 11)
- Build time: Increased
- Disk space: 2-3x (original + .gz + .br)
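To verify that the server really serves a pre-built .br file instead of compressing on the fly, compare the on-disk size with what the client receives (a hypothetical /assets/style.css is assumed here):
# On-disk size of the pre-compressed file
stat -c %s /var/www/html/assets/style.css.br
# Bytes actually delivered to a Brotli-capable client - should match the .br file
curl -s -H "Accept-Encoding: br" http://localhost/assets/style.css | wc -c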
Compression Level Optimization
Compression Level Comparison
# Test different Gzip levels
for level in {1..9}; do
echo "Testing Gzip level $level"
time gzip -$level -k -c /var/www/html/large-file.js > /tmp/test-$level.js.gz
ls -lh /tmp/test-$level.js.gz
done
# Results (typical 500KB JS file):
# Level 1: 156KB, 0.02s (fast, 69% reduction)
# Level 4: 138KB, 0.06s (balanced, 72% reduction)
# Level 6: 134KB, 0.12s (recommended, 73% reduction)
# Level 9: 132KB, 0.35s (slow, 74% reduction)
# Recommendation: Level 6 for dynamic, Level 9 for pre-compression
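The same kind of sweep works for Brotli quality levels; this is a sketch assuming the brotli CLI and the same sample file as above:
# Test representative Brotli quality levels
for q in 1 4 6 9 11; do
echo "Testing Brotli quality $q"
time brotli -q $q -c /var/www/html/large-file.js > /tmp/test-q$q.js.br
ls -lh /tmp/test-q$q.js.br
done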
Performance vs Compression Trade-off
# Benchmark compression levels under load
ab -n 1000 -c 100 http://localhost/large-page.html
# Results:
# No compression: 1,250 req/s, 450ms response
# Gzip level 1: 3,180 req/s, 140ms response
# Gzip level 6: 2,850 req/s, 155ms response
# Gzip level 9: 2,120 req/s, 210ms response
# Brotli level 4: 2,940 req/s, 150ms response
# Brotli level 11 (pre): 3,620 req/s, 120ms response
# Best: Brotli level 11 pre-compressed
Configuration Examples
High-Traffic Website (Nginx)
http {
# Brotli (primary)
brotli on;
brotli_comp_level 4; # Fast compression for dynamic content
brotli_types text/plain text/css text/javascript application/json application/javascript image/svg+xml;
brotli_min_length 1000;
brotli_static on; # Use pre-compressed .br files
# Gzip (fallback)
gzip on;
gzip_comp_level 5; # Fast compression
gzip_types text/plain text/css text/javascript application/json application/javascript image/svg+xml;
gzip_min_length 1000;
gzip_static on; # Use pre-compressed .gz files
gzip_vary on;
# Disable for IE6
gzip_disable "msie6";
# Cache settings for static assets
# (location blocks must live inside a server block)
server {
location ~* \.(css|js|svg)$ {
expires 1y;
add_header Cache-Control "public, immutable";
brotli_static on;
gzip_static on;
}
}
}
API Server (Nginx)
http {
# Optimize for JSON responses
brotli on;
brotli_comp_level 6; # Better compression for JSON
brotli_types application/json application/javascript;
brotli_min_length 256; # Compress even small JSON
gzip on;
gzip_comp_level 6;
gzip_types application/json application/javascript;
gzip_min_length 256;
gzip_vary on;
# No static compression needed (dynamic API)
brotli_static off;
gzip_static off;
}
WordPress Site (Apache)
<IfModule mod_brotli.c>
SetOutputFilter BROTLI_COMPRESS
BrotliCompressionQuality 4
AddOutputFilterByType BROTLI_COMPRESS text/html
AddOutputFilterByType BROTLI_COMPRESS text/css
AddOutputFilterByType BROTLI_COMPRESS text/javascript
AddOutputFilterByType BROTLI_COMPRESS application/javascript
AddOutputFilterByType BROTLI_COMPRESS application/json
Header append Vary Accept-Encoding
</IfModule>
<IfModule mod_deflate.c>
# Gzip fallback; the AddOutputFilterByType lines below register DEFLATE,
# and a bare SetOutputFilter DEFLATE here would override the Brotli filter above
DeflateCompressionLevel 5
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/json
Header append Vary Accept-Encoding
</IfModule>
# Cache static assets
<FilesMatch "\.(css|js|svg)$">
Header set Cache-Control "max-age=31536000, public"
</FilesMatch>
Performance Testing
Before and After Comparison
# Test suite
echo "=== Compression Performance Test ==="
# Test 1: Large HTML page
echo "Test 1: HTML Page (250KB)"
curl -w "Uncompressed: %{size_download} bytes, %{time_total}s\n" \
-H "Accept-Encoding: identity" -o /dev/null http://localhost/large.html
curl -w "Gzip: %{size_download} bytes, %{time_total}s\n" \
-H "Accept-Encoding: gzip" -o /dev/null http://localhost/large.html
curl -w "Brotli: %{size_download} bytes, %{time_total}s\n" \
-H "Accept-Encoding: br" -o /dev/null http://localhost/large.html
# Results:
# Uncompressed: 250,000 bytes, 0.342s
# Gzip: 38,500 bytes, 0.065s (85% smaller, 80% faster)
# Brotli: 32,750 bytes, 0.058s (87% smaller, 83% faster)
# Test 2: JavaScript bundle
echo "Test 2: JavaScript (1.2MB)"
curl -w "Uncompressed: %{size_download} bytes\n" \
-H "Accept-Encoding: identity" -o /dev/null http://localhost/app.js
curl -w "Gzip: %{size_download} bytes\n" \
-H "Accept-Encoding: gzip" -o /dev/null http://localhost/app.js
curl -w "Brotli: %{size_download} bytes\n" \
-H "Accept-Encoding: br" -o /dev/null http://localhost/app.js
# Results:
# Uncompressed: 1,200,000 bytes
# Gzip: 285,000 bytes (76% reduction)
# Brotli: 245,000 bytes (80% reduction)
# Test 3: Full page load
ab -n 100 -c 10 http://localhost/
# Without compression: 5.2s page load, 3.4MB transferred
# With compression: 1.1s page load, 620KB transferred (82% less data, 79% faster)
Monitoring and Troubleshooting
Verify Compression Working
#!/bin/bash
# check-compression.sh
URL="$1"
echo "Checking compression for: $URL"
echo
# Check which encoding the server negotiates
ENCODING=$(curl -s -I -H "Accept-Encoding: gzip, br" "$URL" | grep -i "content-encoding:" | tr -d '\r')
echo "Negotiated: ${ENCODING:-none (compression not applied)}"
# Get sizes
UNCOMPRESSED=$(curl -s -H "Accept-Encoding: identity" "$URL" | wc -c)
GZIP=$(curl -s -H "Accept-Encoding: gzip" "$URL" | wc -c)
BROTLI=$(curl -s -H "Accept-Encoding: br" "$URL" | wc -c)
echo "Uncompressed: $UNCOMPRESSED bytes"
echo "Gzip: $GZIP bytes ($(awk "BEGIN {printf \"%.1f\", (1-$GZIP/$UNCOMPRESSED)*100}")% reduction)"
echo "Brotli: $BROTLI bytes ($(awk "BEGIN {printf \"%.1f\", (1-$BROTLI/$UNCOMPRESSED)*100}")% reduction)"
Common Issues
Issue 1: Compression Not Working
# Check modules (Nginx) - core gzip is always compiled in
nginx -V 2>&1 | grep -o with-http_gzip_static_module
# Confirm the Brotli module and directives are actually loaded
nginx -T 2>/dev/null | grep -i brotli
# Check module enabled (Apache)
apachectl -M | grep deflate
apachectl -M | grep brotli
# Check configuration syntax
nginx -t
apachectl configtest
Issue 2: Wrong Content Types
# Verify Content-Type header
curl -I http://localhost/style.css | grep -i content-type
# Should be: Content-Type: text/css
# Fix in Nginx mime.types
types {
text/css css;
text/javascript js;
application/json json;
}
Issue 3: Already Compressed Content
# Don't compress already-compressed formats (raster images, video, archives)
# In Nginx: list only text types; image/svg+xml is the one image type worth adding
gzip_types text/plain text/css text/javascript;
# In Apache:
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png|mp4)$ no-gzip
Best Practices
- Always Enable Compression
  - Huge performance gain, minimal cost
  - Use Brotli with Gzip fallback
- Choose the Right Compression Level
  - Dynamic content: level 4-6
  - Pre-compressed assets: level 9-11
- Compress the Right File Types
  - Compress: HTML, CSS, JS, JSON, XML, SVG, fonts
  - Don't compress: images, videos, binaries
- Set a Minimum Size
  - Don't compress files under ~1 KB; the overhead isn't worth it
- Use Pre-Compression
  - Pre-compress static assets during the build
  - Maximum compression, zero runtime cost
- Set the Vary Header
  - Essential for proper caching: Vary: Accept-Encoding (see the quick check after this list)
- Monitor Performance
  - Track compression ratio and CPU usage
  - Measure actual page load times
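A quick way to confirm the Vary header from the list above is to inspect the response headers directly:
# The response should include "Vary: Accept-Encoding" alongside Content-Encoding
curl -sI -H "Accept-Encoding: gzip" http://localhost/compression-test.html | grep -iE "vary|content-encoding"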
Conclusion
Compression is one of the highest-impact, lowest-effort optimizations available:
Performance Improvements:
- Bandwidth usage: 60-90% reduction
- Page load time: 50-85% faster
- User experience: Dramatically better
- SEO ranking: Improved
- Infrastructure costs: 40-70% lower bandwidth costs
Implementation Summary:
- Gzip: Universal support, good compression, low CPU
- Brotli: Better compression (15-30%), modern browsers
- Best approach: Both (Brotli with Gzip fallback)
- Optimal levels: 4-6 dynamic, 9-11 pre-compressed
- What to compress: Text files (HTML, CSS, JS, JSON, XML, SVG)
- What not to compress: Images, videos, already compressed files
By implementing proper compression configuration, you can deliver dramatically faster websites, reduce infrastructure costs, and provide better user experience with minimal configuration effort.