Image Proxy Drama: When External Images Break Your Perfect Layout
It started with a single broken book cover image. One tiny 40x60 pixel placeholder that somehow managed to destroy my entire live feed layout. What followed was a deep dive into the world of external image handling, CORS policies, and the realization that trusting third-party images is like trusting a cat to guard your fish tank.
Here's how a simple image problem spiraled into building a full image proxy service, and the performance optimizations that came along for the ride.
The Innocent Beginning
My live feed was supposed to show book covers from Goodreads alongside my reading progress. Simple enough—just grab the image URL from their RSS feed and drop it into an <img> tag:

```typescript
// My naive first attempt
const BookCard = ({ book }: { book: Book }) => {
  return (
    <div>
      <img
        src={book.coverImg}
        alt={`Cover of ${book.title}`}
        className={styles.bookCover}
      />
      <span>{book.title}</span>
      <span>{book.author}</span>
    </div>
  );
};
```
This worked perfectly in development with my test data. Then I deployed to production and watched my beautiful grid layout collapse like a house of cards.
The Problem Reveals Itself
The issues started piling up immediately:
- Broken image links: About 30% of Goodreads cover URLs returned 404s
- CORS errors: Some images blocked cross-origin requests
- Mixed content warnings: HTTP images on my HTTPS site
- Performance disasters: Some cover images were 2MB+ files
- Layout shifts: Images loading at different rates caused jarring layout changes
My browser console looked like a war zone:
```
Failed to load resource: the server responded with a status of 404 (Not Found)
Access to image at 'http://images.gr-assets.com/books/...' from origin 'https://zachliibbe.com' has been blocked by CORS policy
Mixed Content: The page at 'https://zachliibbe.com/live-feed' was loaded over HTTPS, but requested an insecure image 'http://images.gr-assets.com/books/...'. This request has been blocked
```
My perfect grid became a chaotic mess of broken images, missing covers, and content jumping around as images (maybe) loaded.
The Lightbulb Moment: Image Proxy
The solution hit me during my third cup of coffee: I needed to proxy all external images through my own server. This would let me:

- Handle broken links gracefully
- Solve CORS issues by serving from my domain
- Convert HTTP to HTTPS
- Optimize image sizes and formats
- Cache frequently requested images
I built a simple Next.js API route to proxy images:
```typescript
// /api/utils/image-proxy/route.ts
import { NextRequest, NextResponse } from 'next/server';

export async function GET(request: NextRequest) {
  try {
    const { searchParams } = new URL(request.url);
    const imageUrl = searchParams.get('url');

    if (!imageUrl) {
      return new NextResponse('Missing image URL', { status: 400 });
    }

    const response = await fetch(imageUrl, {
      headers: {
        // Add a user agent to avoid being blocked
        'User-Agent':
          'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
      },
    });

    if (!response.ok) {
      console.error(
        `Failed to fetch image: ${response.status} ${response.statusText}`
      );
      return new NextResponse(`Failed to fetch image: ${response.status}`, {
        status: response.status,
      });
    }

    // Get the image data and content type
    const imageData = await response.arrayBuffer();
    const contentType = response.headers.get('content-type') || 'image/jpeg';

    // Return the image with the correct content type and caching
    return new NextResponse(imageData, {
      headers: {
        'Content-Type': contentType,
        'Cache-Control': 'public, max-age=86400', // Cache for 24 hours
      },
    });
  } catch (error) {
    console.error('Image proxy error:', error);
    return new NextResponse(
      `Error proxying image: ${error instanceof Error ? error.message : String(error)}`,
      { status: 500 }
    );
  }
}
```
Now I could route all external images through my proxy:
```typescript
const BookCard = ({ book }: { book: Book }) => {
  const proxyUrl = `/api/utils/image-proxy?url=${encodeURIComponent(book.coverImg)}`;

  return (
    <div>
      <img
        src={proxyUrl}
        alt={`Cover of ${book.title}`}
        className={styles.bookCover}
      />
      <span>{book.title}</span>
      <span>{book.author}</span>
    </div>
  );
};
```
The CORS Conundrum
CORS (Cross-Origin Resource Sharing) became my nemesis. Different image hosts had wildly different policies:

- Goodreads: Sometimes allowed cross-origin, sometimes didn't
- Amazon S3: Strict CORS policies
- Random CDNs: Complete lottery
The image proxy solved this by making all requests server-side, where CORS policies don't apply. From the browser's perspective, all images were coming from my domain.
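To make that rewrite concrete, here's a tiny helper along the lines of what the components above do (the helper name is mine, not from the original code):

```typescript
// Hypothetical helper: every external cover URL becomes a same-origin request,
// so the browser never performs a cross-origin image fetch at all.
const toProxiedUrl = (externalUrl: string): string =>
  `/api/utils/image-proxy?url=${encodeURIComponent(externalUrl)}`;

// Example:
// toProxiedUrl('http://images.gr-assets.com/books/123.jpg')
//   -> '/api/utils/image-proxy?url=http%3A%2F%2Fimages.gr-assets.com%2Fbooks%2F123.jpg'
```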
Performance Optimization: The Caching Strategy
The proxy worked, but it was slow. Every image request had to:

- Hit my API route
- Fetch from the external URL
- Stream back to the client
For frequently accessed book covers, this was inefficient. I needed caching.
My first attempt used simple HTTP caching headers:
```typescript
return new NextResponse(imageData, {
  headers: {
    'Content-Type': contentType,
    'Cache-Control': 'public, max-age=86400', // 24 hours
    ETag: generateETag(imageData),
  },
});
```
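The generateETag helper isn't shown here; a minimal sketch of one way to implement it, assuming Node's built-in crypto module is available in the route:

```typescript
import { createHash } from 'crypto';

// Hash the raw bytes so identical images always get identical ETags
function generateETag(imageData: ArrayBuffer): string {
  const hash = createHash('sha1').update(Buffer.from(imageData)).digest('hex');
  return `"${hash}"`;
}
```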
This helped with browser caching, but the proxy still had to fetch images from external sources repeatedly.
Server-Side Caching with Vercel KV
I added a server-side cache using Vercel KV to store frequently requested images:
```typescript
import { kv } from '@vercel/kv';
import { NextRequest, NextResponse } from 'next/server';

export async function GET(request: NextRequest) {
  try {
    const { searchParams } = new URL(request.url);
    const imageUrl = searchParams.get('url');

    if (!imageUrl) {
      return new NextResponse('Missing image URL', { status: 400 });
    }

    // Check cache first
    const cacheKey = `image:${Buffer.from(imageUrl).toString('base64')}`;
    const cachedImage = await kv.get(cacheKey);

    if (cachedImage) {
      const { data, contentType } = cachedImage as any;
      return new NextResponse(Buffer.from(data, 'base64'), {
        headers: {
          'Content-Type': contentType,
          'Cache-Control': 'public, max-age=86400',
          'X-Cache': 'HIT',
        },
      });
    }

    // Fetch from external source
    const response = await fetch(imageUrl, {
      headers: {
        'User-Agent':
          'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      },
    });

    if (!response.ok) {
      return new NextResponse(`Failed to fetch image: ${response.status}`, {
        status: response.status,
      });
    }

    const imageData = await response.arrayBuffer();
    const contentType = response.headers.get('content-type') || 'image/jpeg';

    // Cache the image (with size limit)
    if (imageData.byteLength < 1024 * 1024) {
      // Only cache images under 1MB
      await kv.set(
        cacheKey,
        {
          data: Buffer.from(imageData).toString('base64'),
          contentType,
        },
        { ex: 86400 } // 24 hours
      );
    }

    return new NextResponse(imageData, {
      headers: {
        'Content-Type': contentType,
        'Cache-Control': 'public, max-age=86400',
        'X-Cache': 'MISS',
      },
    });
  } catch (error) {
    console.error('Image proxy error:', error);
    return new NextResponse('Error proxying image', { status: 500 });
  }
}
```
The Fallback Image Strategy
Broken images still happened, even with the proxy. I needed a graceful fallback system:
```typescript
const BookCard = ({ book }: { book: Book }) => {
  const [imageError, setImageError] = useState(false);
  const [imageLoading, setImageLoading] = useState(true);

  const proxyUrl = book.coverImg
    ? `/api/utils/image-proxy?url=${encodeURIComponent(book.coverImg)}`
    : null;

  return (
    <div>
      <div className={styles.imageContainer}>
        {imageLoading && <div className={styles.imagePlaceholder} />}
        {proxyUrl && !imageError ? (
          <img
            src={proxyUrl}
            alt={`Cover of ${book.title}`}
            className={styles.bookCover}
            onLoad={() => setImageLoading(false)}
            onError={() => {
              setImageError(true);
              setImageLoading(false);
            }}
          />
        ) : (
          <div className={styles.fallbackCover}>
            <span>{book.title}</span>
          </div>
        )}
      </div>
      <span>{book.title}</span>
      <span>{book.author}</span>
    </div>
  );
};
```
The fallback design actually looked pretty good:
```css
.fallbackCover {
  display: flex;
  flex-direction: column;
  align-items: center;
  justify-content: center;
  width: 100%;
  height: 200px;
  background: linear-gradient(135deg, var(--gradient-one), var(--gradient-two));
  color: white;
  text-align: center;
  padding: 1rem;
  border-radius: 8px;
}

.fallbackCover span {
  font-size: 0.875rem;
  font-weight: 600;
  margin-top: 0.5rem;
  line-height: 1.2;
}
```
Layout Shift Prevention
The most jarring user experience issue was layout shifts as images loaded. I needed consistent sizing:
```css
.imageContainer {
  position: relative;
  width: 100%;
  aspect-ratio: 2 / 3; /* Standard book cover ratio */
  background: var(--background-secondary);
  border-radius: 8px;
  overflow: hidden;
}

.bookCover {
  width: 100%;
  height: 100%;
  object-fit: cover;
  object-position: center;
}

.imagePlaceholder {
  position: absolute;
  inset: 0;
  display: flex;
  align-items: center;
  justify-content: center;
  background: var(--background-secondary);
}
```
Using aspect-ratio ensured the container maintained consistent dimensions whether the image loaded or not.
Lazy Loading for Performance
With the proxy handling reliability, I could focus on performance. Lazy loading was crucial for pages with many images:
```typescript
const BookCard = ({ book }: { book: Book }) => {
  const [isInView, setIsInView] = useState(false);
  const imgRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          setIsInView(true);
          observer.disconnect();
        }
      },
      { threshold: 0.1 }
    );

    if (imgRef.current) {
      observer.observe(imgRef.current);
    }

    return () => observer.disconnect();
  }, []);

  const proxyUrl =
    book.coverImg && isInView
      ? `/api/utils/image-proxy?url=${encodeURIComponent(book.coverImg)}`
      : null;

  return (
    <div ref={imgRef} className={styles.imageContainer}>
      {!isInView || !proxyUrl ? (
        <div className={styles.imagePlaceholder} />
      ) : (
        <img
          src={proxyUrl}
          alt={`Cover of ${book.title}`}
          className={styles.bookCover}
          loading="lazy"
        />
      )}
    </div>
  );
};
```
Image Optimization: WebP and Resizing
The proxy also gave me a chance to optimize images on-the-fly. Many book covers were unnecessarily large:
```typescript
import sharp from 'sharp';
import { NextRequest, NextResponse } from 'next/server';

export async function GET(request: NextRequest) {
  try {
    const { searchParams } = new URL(request.url);
    const imageUrl = searchParams.get('url');
    const width = parseInt(searchParams.get('width') || '300');
    const quality = parseInt(searchParams.get('quality') || '80');

    // ... fetch image logic (sets imageData and contentType) ...

    let processedImage = imageData;

    // Optimize if requested
    if (searchParams.get('optimize') === 'true') {
      processedImage = await sharp(Buffer.from(imageData))
        .resize({ width, withoutEnlargement: true })
        .webp({ quality })
        .toBuffer();
      contentType = 'image/webp';
    }

    return new NextResponse(processedImage, {
      headers: {
        'Content-Type': contentType,
        'Cache-Control': 'public, max-age=86400',
      },
    });
  } catch (error) {
    // ... error handling ...
  }
}
```
Usage:
```typescript
const optimizedUrl = `/api/utils/image-proxy?url=${encodeURIComponent(book.coverImg)}&optimize=true&width=300&quality=80`;
```
Monitoring and Analytics
I added simple analytics to understand image proxy usage:
```typescript
export async function GET(request: NextRequest) {
  const startTime = Date.now();

  try {
    // ... proxy logic ...

    const duration = Date.now() - startTime;
    console.log(
      `Image proxy: ${duration}ms, cache: ${cachedImage ? 'HIT' : 'MISS'}, size: ${imageData.byteLength} bytes`
    );

    return response;
  } catch (error) {
    const duration = Date.now() - startTime;
    console.error(`Image proxy error after ${duration}ms:`, error);
    throw error;
  }
}
```
This helped me understand:

- Cache hit rates (85%+ after a few days)
- Average response times (cached: 50ms, uncached: 800ms)
- Common failure patterns
- Bandwidth usage
Real-World Results
After implementing the image proxy system:

- Broken image errors: 30% → 0%
- Layout shifts: Eliminated with consistent aspect ratios
- Load times: 40% faster with caching and optimization
- Bandwidth usage: 60% reduction with WebP optimization
- User experience: Smooth, predictable image loading
Edge Cases and Gotchas
Several edge cases emerged in production:
1. Infinite redirects: Some image hosts redirected to themselves
```typescript
const response = await fetch(imageUrl, {
  redirect: 'follow',
  // Limit redirects to prevent infinite loops
  headers: { 'User-Agent': '...' },
});
```
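The standard fetch options don't expose a redirect cap directly, so if you want an explicit limit, one option is to follow redirects by hand. A minimal sketch (the helper name and cap are my own, not the production code):

```typescript
// Follow redirects manually so we can enforce a hard cap
async function fetchWithRedirectLimit(url: string, maxRedirects = 5): Promise<Response> {
  let currentUrl = url;
  for (let hop = 0; hop <= maxRedirects; hop++) {
    // 'manual' returns the 3xx response instead of following it (server-side fetch)
    const response = await fetch(currentUrl, { redirect: 'manual' });
    const location = response.headers.get('location');
    if (response.status >= 300 && response.status < 400 && location) {
      currentUrl = new URL(location, currentUrl).toString();
      continue;
    }
    return response;
  }
  throw new Error(`Too many redirects for ${url}`);
}
```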
2. Massive images: Some covers were 10MB+ files
```typescript
const MAX_IMAGE_SIZE = 5 * 1024 * 1024; // 5MB limit

if (imageData.byteLength > MAX_IMAGE_SIZE) {
  return new NextResponse('Image too large', { status: 413 });
}
```
3. Rate limiting: Image hosts occasionally rate-limited my proxy
```typescript
if (response.status === 429) {
  // Return a stale cached copy if we have one, rather than failing outright
  const staleCache = await kv.get(`${cacheKey}:stale`);
  if (staleCache) {
    const { data, contentType } = staleCache as any;
    return new NextResponse(Buffer.from(data, 'base64'), {
      headers: { 'Content-Type': contentType, 'X-Cache': 'STALE' },
    });
  }
}
```
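For that stale copy to exist, the cache write has to store it alongside the fresh entry. A small sketch, assuming the same KV entry shape used earlier (the key naming is mine):

```typescript
// Write both a fresh (24-hour) and a longer-lived stale (7-day) copy of the entry
const entry = {
  data: Buffer.from(imageData).toString('base64'),
  contentType,
};
await kv.set(cacheKey, entry, { ex: 86400 });
await kv.set(`${cacheKey}:stale`, entry, { ex: 86400 * 7 });
```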
Security Considerations
Proxying arbitrary URLs opened potential security risks:
1. URL validation:
```typescript
const allowedHosts = [
  'images.gr-assets.com',
  'i.gr-assets.com',
  's.gr-assets.com',
  'images-na.ssl-images-amazon.com',
];

const url = new URL(imageUrl);
if (!allowedHosts.includes(url.hostname)) {
  return new NextResponse('Host not allowed', { status: 403 });
}
```
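An allow-list on hostnames alone still trusts the scheme, so it's worth rejecting anything that isn't plain HTTP(S) too. A small addition (not in the original code) that reuses the url parsed above:

```typescript
// Reject non-HTTP(S) schemes (file:, data:, internal protocols, etc.)
if (!['http:', 'https:'].includes(url.protocol)) {
  return new NextResponse('Protocol not allowed', { status: 403 });
}
```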
2. Content-Type validation:
```typescript
const allowedTypes = ['image/jpeg', 'image/png', 'image/webp', 'image/gif'];
if (!allowedTypes.includes(contentType)) {
  return new NextResponse('Invalid content type', { status: 400 });
}
```
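One wrinkle worth noting: content-type headers can carry parameters (for example image/jpeg; charset=binary), which an exact match would reject. A hedged variant that compares only the media type:

```typescript
// Strip any parameters after ';' before checking against the allow-list
const mediaType = contentType.split(';')[0].trim().toLowerCase();
if (!allowedTypes.includes(mediaType)) {
  return new NextResponse('Invalid content type', { status: 400 });
}
```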
Lessons Learned

- Never trust external images: Always have fallbacks and error handling
- CORS is everywhere: Proxying eliminates a whole class of problems
- Caching multiplies benefits: Performance and reliability improvements
- Consistent sizing prevents layout shifts: Use aspect-ratio and object-fit
- Optimization should be automatic: Users shouldn't have to think about it
What I'd Do Differently
Looking back, I would:

- Implement background image processing to pre-optimize popular covers
- Add retry logic with exponential backoff for failed requests
- Use a dedicated image CDN for even better performance
- Implement progressive loading with low-quality placeholders
The Bigger Picture
Building an image proxy taught me that third-party content is inherently unreliable. Whether it's images, APIs, or any external dependency, you need to design for failure and inconsistency.
The proxy became more than just a solution to broken images—it became a reliability layer that gave me control over a critical part of my user experience.
The complete image proxy implementation is available in my GitHub repository, and you can see it in action throughout my live feed page.
_Handling external content gracefully is about building defensive systems that work despite the chaos of the internet. Want to see more stories about building reliable web experiences? Follow my journey as I share the real challenges of optimizing for performance and reliability._