
Compression Proxy

A personal web proxy server that looks for compressed versions of files
 
(+2, -2)

Local proxies are gaining favour at the moment, with the publicity of Google's Web Accelerator as well as long-standing products like my favourite, Proxomitron.

I was thinking that someone could write a small personal proxy that tests for, for example, http://asite.com/photo.jpg.zip whenever it gets a request for http://asite.com/photo.jpg. If it finds photo.jpg.zip, it fetches that instead, unpacks photo.jpg, and hands that up to the browser (or to the proxy up the chain).
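A minimal sketch of that probe-and-unpack step, assuming a Python proxy core; fetch_with_zip_probe is a hypothetical helper and the timeout is arbitrary:

    import io
    import urllib.error
    import urllib.request
    import zipfile

    def fetch_with_zip_probe(url: str) -> bytes:
        """Try url + '.zip' first; fall back to the file as requested."""
        try:
            with urllib.request.urlopen(url + ".zip", timeout=5) as resp:
                archive = zipfile.ZipFile(io.BytesIO(resp.read()))
                # Unpack the single member (e.g. photo.jpg) and hand it up,
                # so the browser still gets the file type it asked for.
                return archive.read(archive.namelist()[0])
        except (urllib.error.URLError, zipfile.BadZipFile):
            # No compressed version on the server: fetch the original.
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()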

Not a big improvement on JPEGs when you're using zip, but there are other packages that do better. Or webmasters could offer photo.jpg.jpeg2000 and the proxy could convert it to a JPEG.
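The jpeg2000 case would be a transcode rather than an unpack. A sketch, assuming Pillow built with JPEG 2000 support (the quality setting is a guess):

    from io import BytesIO
    from PIL import Image  # assumes Pillow with JPEG 2000 support

    def jp2_to_jpeg(data: bytes, quality: int = 90) -> bytes:
        # Re-encode fetched JPEG 2000 bytes as the plain JPEG
        # the browser is expecting.
        out = BytesIO()
        Image.open(BytesIO(data)).convert("RGB").save(out, "JPEG", quality=quality)
        return out.getvalue()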

Bonus points for the following features:

* Pass through to another proxy (fairly necessary these days)

* Record hits and misses per website (after 10 misses for a given website, it could check only once every 10 requests, just so it doesn't fill logs with probe requests; see the sketch after this list)

* Option to allow only localhost requests, or requests from a specified subnet.

* Use an enhanced robots.txt file to find out what optimised formats are offered on a given website.
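The hit/miss back-off from the second bullet might look like this sketch; the class name is hypothetical and the numbers are just the ones above:

    from collections import defaultdict

    class ProbeThrottle:
        # After 10 straight misses for a host, probe only 1 request in 10.
        MISS_LIMIT = 10
        PROBE_EVERY = 10

        def __init__(self):
            self.misses = defaultdict(int)   # consecutive .zip misses per host
            self.skipped = defaultdict(int)  # requests since back-off started

        def should_probe(self, host):
            if self.misses[host] < self.MISS_LIMIT:
                return True
            self.skipped[host] += 1
            return self.skipped[host] % self.PROBE_EVERY == 0

        def record(self, host, hit):
            # A hit resets the count; the site evidently offers packed files.
            self.misses[host] = 0 if hit else self.misses[host] + 1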

This would mean that as a web developer I could create a set of packed files using much better compression than current web standards allow, and potentially reduce my traffic by quite a bit if I can convince my regulars to install the proxy. The advantage of implementing this as a proxy rather than a browser plug-in is that it would support all browsers. Also, the implementation described wouldn't need websites to replace img tags with embed tags, since the browser still gets the file type it's expecting.

Repacking libraries could be modular. This proxy could form the basis of a number of web enhancements to do with file compression. You wouldn't have to wait for your favourite browser to be updated to support new file formats. ISPs could implement this to reduce their upstream traffic. Big sites with huge traffic bills could see a big saving as more people adopt better compression.
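One way the modularity might work, as a sketch: a registry mapping probe suffixes to unpackers, so a new format is just a new module (all names hypothetical):

    import io
    import zipfile
    from typing import Callable, Dict

    # Hypothetical registry: probe suffix -> function that restores
    # the original bytes. A new format registers itself here.
    REPACKERS: Dict[str, Callable[[bytes], bytes]] = {}

    def repacker(suffix):
        def register(fn):
            REPACKERS[suffix] = fn
            return fn
        return register

    @repacker(".zip")
    def unzip(data: bytes) -> bytes:
        archive = zipfile.ZipFile(io.BytesIO(data))
        return archive.read(archive.namelist()[0])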

Krisjohn, May 25 2005

       Good one [+]. But you're squeezing blood from a stone. Most common web formats for media (JPG, mpeg, mp3, gif, pdf, swf) are already compressed to some degree, so further compression isn't going to help much.   

       (...unless it's lossy compression that introduces more loss... which you could only do on image files)
sophocles, Jun 14 2005
  

       Ah, but there's a new technique on the rise that repacks older formats with better compression. I used Zip in the example because I didn't want to turn the original idea into an ad for StuffIt, but their latest version can losslessly compress a JPEG by around 20%.
Krisjohn, Jun 15 2005
  

       Baked. HTTP compression (in HTTP 1.1) uses Gzip already.   

       Compression for JPEG isn't really worthwhile, and I think the Google Web Accelerator further compresses JPEGs anyway.

       [marked-for-deletion]
zapped, Feb 08 2006
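For comparison with the idea: the HTTP 1.1 mechanism is per-transfer header negotiation, not precompressed files on disk. A minimal Python illustration (example.com is a placeholder):

    import gzip
    import urllib.request

    # The client advertises gzip; the server compresses on the fly.
    req = urllib.request.Request(
        "http://example.com/",
        headers={"Accept-Encoding": "gzip"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)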
  


 
