I may well have missed a trick here, so apologies to everyone whose understanding of image compression stretches beyond 10 minutes of Google.

My idea is that we could make a whole folder of images compress to less than the sum of their individual file sizes, in exactly the same way as individual images are compressed (using colour/area sampling etc.). That is, the application or OS would patchwork all of the images in a folder together, in whatever order made most sense for the file format, and then apply the same compression used for the individual files to the new whole.

Perhaps the link below might be elucidative.
Link: four become one (a page explaining my idea better): http://www.neilphillips.com/image.html [neilp, Oct 04 2004, last modified Oct 05 2004]
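As a minimal sketch of the patchwork step, assuming Pillow, equally sized images, and a simple side-by-side layout (the layout choice, file types, and function name are illustrative, not part of the original idea):

```python
# Patchwork every image in a folder into one strip and JPEG the whole;
# assumes all images share the same dimensions.
import os
from PIL import Image

def patchwork_compress(folder, out_path="patchwork.jpg", quality=75):
    paths = sorted(
        os.path.join(folder, name) for name in os.listdir(folder)
        if name.lower().endswith((".bmp", ".png"))
    )
    images = [Image.open(p).convert("RGB") for p in paths]
    w, h = images[0].size
    sheet = Image.new("RGB", (w * len(images), h))
    for i, img in enumerate(images):
        sheet.paste(img, (i * w, 0))  # lay images out side by side
    sheet.save(out_path, "JPEG", quality=quality)
    return os.path.getsize(out_path)
```

Unpacking would be the reverse: crop the strip back into w-wide tiles. Whether the single JPEG actually beats the individual files is exactly what the annotations below debate.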
Annotation:
On your linked page you state "I've cut up whole.jpg", so this is not a valid test.

A valid test would be: start with an uncompressed image (e.g. whole.bmp, taken from a high-quality source, not some JPEG converted into a BMP). Then split that into four, and compress into five JPEGs (1, 2, 3, 4, whole).

I think you'll find that the file size of whole.jpg is pretty close to the total of the sizes of 1.jpg through 4.jpg when using the same quality settings (it might be helpful to use quite low settings, in order to visually confirm that the image quality is consistent). A sketch of this test follows below.

Additional problems with this idea include the difficulty of adding, removing, and changing individual images in a folder.
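A minimal sketch of the test described above, assuming Pillow is installed and a lossless whole.bmp exists; the filenames and quality setting are illustrative:

```python
# Split an uncompressed image into quadrants, JPEG each piece and the
# whole at identical settings, then compare the total file sizes.
import os
from PIL import Image

whole = Image.open("whole.bmp").convert("RGB")
w, h = whole.size
quality = 40  # deliberately low, so quality differences are visible

quadrants = [
    whole.crop((0, 0, w // 2, h // 2)),
    whole.crop((w // 2, 0, w, h // 2)),
    whole.crop((0, h // 2, w // 2, h)),
    whole.crop((w // 2, h // 2, w, h)),
]
for i, quad in enumerate(quadrants, start=1):
    quad.save(f"{i}.jpg", "JPEG", quality=quality)

whole.save("whole.jpg", "JPEG", quality=quality)
parts_total = sum(os.path.getsize(f"{i}.jpg") for i in range(1, 5))
print("four parts:", parts_total, "bytes")
print("whole:    ", os.path.getsize("whole.jpg"), "bytes")
```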
A couple of points: wouldn't applying lossy compression (e.g. JPEG) affect the image boundaries as well? I.e. I'd expect to see some artifacts emerge after decompression. That would be OK at first, but after repeated additions to and deletions from the folder, the repeated compression, decompression, rearrangement, etc. would degrade the images over time. But I guess that depends on the compression type used.

If the algorithm were to arrange similar images next to each other, that would reduce this problem. So I'd expect the approach to work best on really large image libraries, where there are enough images to do this effectively without creating distinct boundaries. One cheap ordering heuristic is sketched below.
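A rough sketch of such an ordering heuristic, assuming Pillow; sorting by mean colour is my own illustrative choice, and any perceptual similarity metric could stand in for it:

```python
# Order images so visually similar ones end up adjacent before
# stitching; mean colour of a small thumbnail is a crude but cheap key.
from PIL import Image, ImageStat

def mean_colour(path):
    img = Image.open(path).convert("RGB")
    img.thumbnail((32, 32))  # shrink first, so the average is cheap
    return tuple(ImageStat.Stat(img).mean)  # per-band (R, G, B) means

paths = ["beach1.png", "beach2.png", "canyon.png"]  # illustrative names
ordered = sorted(paths, key=mean_colour)
```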
Depending on the compression technology, the algorithm may also take advantage of regional similarities. For that same reason, the compression ratios are better in your example, since the images are all similar in nature.

Try it with a very different kind of image (like white noise, or a flat GIF-style graphic), and it'll probably fall apart.

Between these points, I'd expect the benefits to average out.
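A quick way to see the point about noise, assuming Pillow and NumPy: a smooth gradient compresses far better than white noise at the same JPEG quality.

```python
# Compare JPEG output sizes for white noise versus a smooth gradient.
import io
import numpy as np
from PIL import Image

def jpeg_size(img, quality=75):
    buf = io.BytesIO()
    img.save(buf, "JPEG", quality=quality)
    return buf.tell()

noise = Image.fromarray(
    np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8))
gradient = Image.fromarray(
    np.tile(np.arange(256, dtype=np.uint8), (256, 1))).convert("RGB")

print("noise:   ", jpeg_size(noise), "bytes")
print("gradient:", jpeg_size(gradient), "bytes")
```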
I like this idea. If you're taking scenic pictures of the beach or of the Grand Canyon, I'm sure certain hues would show up repeatedly, and some information may be repeated among the pictures. If you're taking stereographic pictures (with slightly different viewpoints), I'm sure a LOT of information is repeated. The usual Huffman compression sucks, though; a much better technique would have to be employed to find such patterns.
Sounds like a really good idea. Consider a professional photographer taking hundreds of photographs in quick succession. There will be not only a finite set of hues but also whole sections of the image that are almost identical, perhaps even identical sections of background in the same position across multiple images if a good tripod is used. Of course, multi-frame compression has been used for video for many years, e.g. HDV, but I haven't heard of it being used for a folder of stills.
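A hedged sketch of what video-style inter-frame compression might look like for a burst of stills, assuming NumPy, Pillow, and zlib; the filenames and the choice of zlib are illustrative, not a claim about how HDV works:

```python
# Store the first frame whole, then only per-pixel deltas for the rest.
# Near-identical backgrounds become long runs of zeros, which a generic
# compressor like zlib squeezes very well.
import zlib
import numpy as np
from PIL import Image

frames = [
    np.asarray(Image.open(f"shot{i}.png").convert("RGB"), dtype=np.int16)
    for i in range(1, 4)
]

chunks = [zlib.compress(frames[0].astype(np.uint8).tobytes())]  # key frame
for prev, cur in zip(frames, frames[1:]):
    delta = (cur - prev).astype(np.int16)  # int16 avoids wraparound
    chunks.append(zlib.compress(delta.tobytes()))

print("total compressed:", sum(len(c) for c in chunks), "bytes")
```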
I sometimes do something like this when I send a series of photos by email, but as a PNG.