CD-Zip
Increasing capacity of the humble CD-R
A typical CD-R can carry 650MB; there are also CD-Rs around that hold 700MB to 800MB. Either way, the capacity is limited by the physical size of the disc.
To increase the amount of useful data I can fit onto any one CD, I tend to zip my files before burning. However, when running complete system backups or large file backups, the spare room needed for the zip file is considerable. For example, when backing up a system containing user documents, which usually zip to about 30% of their original size, I can fit about 2.3GB onto a single CD. This means, however, that I need a spare 700MB on my system to temporarily store the zip file.
I propose that CD burning software be developed, usable in conjunction with any CD burner, which compresses the files on the fly as it burns them. Support for this would, however, need to be built into the OS so that the data can later be read off the CD without special reader software.
Compressing even simple repetitions or repeated patterns would help conserve space. A more advanced version could also find duplicate files (like precompiled headers in users' accounts, or when many students download the same lecture slides) and link them all in the backup to a single stored copy.
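As a rough illustration only, here is a minimal Python sketch of the two pieces described above: chunked per-file compression, so no full-size temporary archive is ever needed, plus duplicate-file linking by content hash. The function name compress_tree, the ".z" blob names, and index.json are invented for the sketch and are not any real burning software's format; the resulting directory would simply be handed to the burner.

```python
import hashlib
import json
import os
import zlib

def compress_tree(src_dir, out_dir, chunk=64 * 1024):
    """Compress each file chunk by chunk into out_dir, so no full-size
    temporary archive is needed; files with identical content (same
    SHA-256) are stored once, and index.json maps every path to its blob."""
    os.makedirs(out_dir, exist_ok=True)
    index = {}    # relative path -> stored blob name
    stored = {}   # content hash -> stored blob name
    for root, _, names in os.walk(src_dir):
        for name in names:
            path = os.path.join(root, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk), b""):
                    h.update(block)
            key = h.hexdigest()
            if key not in stored:           # first copy: compress and keep it
                stored[key] = key + ".z"
                comp = zlib.compressobj(9)
                with open(path, "rb") as f, \
                     open(os.path.join(out_dir, stored[key]), "wb") as out:
                    for block in iter(lambda: f.read(chunk), b""):
                        out.write(comp.compress(block))
                    out.write(comp.flush())
            index[os.path.relpath(path, src_dir)] = stored[key]
    with open(os.path.join(out_dir, "index.json"), "w") as f:
        json.dump(index, f)
```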
isocompr
http://www.pps.juss...eSoftware/isocompr/ [egnor, Oct 04 2004, last modified Oct 05 2004]
zisofs
http://freshmeat.ne...jects/zisofs-tools/ [egnor, Oct 04 2004, last modified Oct 05 2004]
Baked. Set the CD-R to be writable as a normal device, then use a multi-disk archival program like PKZIP to write the ZIP directly to the CD-R. If the zip is too big, you switch disks and span the archive across them all.
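Roughly, that spanning step just cuts the archive into disc-sized volumes. A toy Python sketch, assuming a 650MB volume size; the "backup.zNN" part names are placeholders, not PKZIP's actual spanned-archive format:

```python
import os

def split_archive(archive_path, out_dir, volume_bytes=650 * 1024 * 1024):
    """Cut an existing archive into fixed-size volumes, one per disc,
    in the spirit of PKZIP's multi-disk spanning; concatenating the
    parts in order reproduces the original archive."""
    os.makedirs(out_dir, exist_ok=True)
    part = 0
    with open(archive_path, "rb") as src:
        while True:
            data = src.read(volume_bytes)
            if not data:
                break
            part += 1
            with open(os.path.join(out_dir, "backup.z%02d" % part), "wb") as out:
                out.write(data)
    return part
```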
Hey dude, how do you get to be site moderator? I'd like to mark others' ideas for deletion too. I guess I should have read the 'help' section first!
PS: Come out from behind your firewall and fight like a man! tom.jovanov@latrobe.edu.au
Hi Tom, you can put your email address on your profile page.
//hey dude, how do you get to be site moderator//
An answer here would probably trigger a theological debate.
//I'd like to mark others' ideas for deletion too//
You can mark anyone's ideas for deletion, but only moderators can actually do it.
//I guess I should have read the 'help' section first//
//This means, however, that I need a spare 700MB on my system to temporarily store the zip file ... compresses the files on the fly as it burns them ... compressing even simple repetitions or repeated patterns would help conserve space//
Yes, decent backup software would do this, would not require a full CD's worth of cache, and would use an algorithm more tailored to the job than good ol' PKZIP. As for duplicate files, perhaps so, but to be really sure you'd have to do a bitwise comparison, and that's a waste of processor time.
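On the bitwise-comparison point, a backup tool would only do the byte-for-byte check after sizes and hashes already agree, so the expensive pass almost never runs on non-duplicates. A small Python sketch of that filter; same_file is a hypothetical helper, not from any particular product:

```python
import filecmp
import hashlib
import os

def same_file(a, b, chunk=64 * 1024):
    """Duplicate check that stays cheap: compare sizes, then SHA-256
    hashes, and only fall back to a full byte-for-byte comparison when
    both of those already match."""
    if os.path.getsize(a) != os.path.getsize(b):
        return False
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.digest()
    if digest(a) != digest(b):
        return False
    return filecmp.cmp(a, b, shallow=False)   # final bitwise confirmation
```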
The trouble with obviously useful ideas is that they're usually baked, unbakeable, or redundant. This idea is unfortunately at least one of the above, so sorry, no croissant from me.
Hard drives are big and cheap these days. You can get 160GB drives for under US$200. Surely, keeping a spare 700 megabytes free is not a difficult thing.
Yeah, what is that ... about 75¢ worth?