Has anyone (besides me) wondered how they could archive data that could be retrieved even if all their hardware died? CD/DVD crapped out, USB stick ate a static discharge, etc.?
I got to thinking, "If only the data could be read as easily as a book." How about saving the data as a series of glyphs/bits printed on a sheet of acid-free paper that can be scanned back in and recognized.
If necessary, they could be printed at low-res and keyed in manually!
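For concreteness, here's a rough sketch in Python of what the bit-printing step might look like, using Pillow to pack bytes into a black-and-white grid that could be printed and later scanned back; the grid width and cell size below are arbitrary illustration values, not part of the idea itself:

```python
# Illustrative sketch only: pack a byte string into a printable black/white
# bit grid. Pillow is assumed to be available; bits_per_row and cell_px are
# arbitrary choices for the example.
from PIL import Image

def bytes_to_bitgrid(data: bytes, bits_per_row: int = 750, cell_px: int = 3) -> Image.Image:
    # Unpack each byte into 8 bits, most significant bit first.
    bits = []
    for byte in data:
        for i in range(7, -1, -1):
            bits.append((byte >> i) & 1)
    rows = -(-len(bits) // bits_per_row)                                # ceiling division
    img = Image.new("1", (bits_per_row * cell_px, rows * cell_px), 1)   # white page
    for idx, bit in enumerate(bits):
        if bit:                                                         # black cell for every 1 bit
            x0 = (idx % bits_per_row) * cell_px
            y0 = (idx // bits_per_row) * cell_px
            for dx in range(cell_px):
                for dy in range(cell_px):
                    img.putpixel((x0 + dx, y0 + dy), 0)
    return img

# e.g. bytes_to_bitgrid(b"hello, paper archive").save("page1.png")
```

Reading it back is the same thing in reverse: threshold the scanned image, sample the centre of each cell, and reassemble the bytes.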
Most printers and scanners have 300 dpi as their low-res setting. If we set our data density at 1/3 of that (100 bits per inch in each direction), a sheet of paper with 1/2-inch borders (a 7 1/2 × 10 inch printable area) gives about 750 kilobits, roughly 90 KB, per page. Nothing to write home about.
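A quick back-of-the-envelope check of that figure, assuming 100 bits per inch in both directions over the 7 1/2 × 10 inch printable area:

```python
bits_per_inch = 100
width_in, height_in = 7.5, 10.0

bits_per_page = (width_in * bits_per_inch) * (height_in * bits_per_inch)
print(int(bits_per_page))        # 750000 bits per page
print(bits_per_page / 8 / 1024)  # ~91.6 KiB per page
```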
But if we use a basic compression method of datagram/dictionary we can actually improve the data density drastically. We could implement the algorithm in just about any language or OS, or even use an old-fashioned book as the compression database.
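As a toy illustration of the "old-fashioned book as the compression database" approach, here's a minimal Python sketch; the word-level tokens, the made-up reference text, and the absence of any error correction are all simplifications for the example, not part of the proposal:

```python
# Both the writer and the reader are assumed to own the same reference text;
# the archive then stores small word indices instead of the words themselves.
def build_dictionary(reference_text):
    words, index = [], {}
    for word in reference_text.split():
        if word not in index:
            index[word] = len(words)
            words.append(word)
    return {"words": words, "index": index}

def compress(text, dico):
    # Words found in the reference text become integers; anything else is
    # kept verbatim as a literal string.
    return [dico["index"].get(w, w) for w in text.split()]

def decompress(tokens, dico):
    return " ".join(dico["words"][t] if isinstance(t, int) else t for t in tokens)

book = "call me ishmael some years ago never mind how long precisely"
dico = build_dictionary(book)
packed = compress("never mind how long ago", dico)
print(packed)                    # [6, 7, 8, 9, 5]
print(decompress(packed, dico))  # "never mind how long ago"
```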
Get your fishbones ready... go!
e-card
http://www.caitsith...ne/images/strip.gif well baked. e-cards being the latest of many implementations. [gtoal, Feb 27 2006]
2-d barcodes
http://www.records-...tml?fact_or_fiction 1000-yr storage [gtoal, Feb 27 2006]
text compression for 2-d barcodes
http://csdl2.comput...761/00/77610457.pdf So baked. [gtoal, Feb 27 2006]
Terabit+ per square inch 'punchcard'
http://news.zdnet.c...366,39191254,00.htm [Shz, Feb 27 2006]
If you're going to get serious about this problem
Rosetta_20Cemetery [theircompetitor, Feb 28 2006]
Good idea, except for the data density issue. Much of the flabby data on my hard drives is already compressed (jpeg images, mp3 audio) so compression might not work.
Optical tape drives using digital paper from ICI have been around for more than a decade.
You're just reinventing microfiche.
Yes, I thought of this in 1995, and worked out that about 1 MB per A4 page was feasible with scanners and laser printers available at the time. More if you used colour. Back then, when you used to back up your hard disk onto a stack of 1.44 MB floppies, this seemed respectable. And that's still a decent-sized novel on one page.
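That figure roughly checks out. Assuming 300 dpi, one bit per dot, and the full A4 area with no allowance for margins or error correction:

```python
dpi = 300
a4_width_in, a4_height_in = 8.27, 11.69

dots = (a4_width_in * dpi) * (a4_height_in * dpi)
print(dots / 8 / 1024 / 1024)  # ~1.04 MB per page, so ~1 MB is plausible
```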
If you want to get all post-apocalyptic, I agree that complex compression should be avoided - keep it as close to human-readable as possible, in case there are no computers left. I don't want to be decoding a gzipped tarball on an abacus. And make sure some decent magnifying glasses survive the apocalypse.
//reinventing microfiche// Not quite. This is about storing digital data on paper. Microfiche is analog data on film.
Sounds to me more like re-inventing punch cards than microfiche.
According to [Shz]'s link, IBM have just done exactly that.
This idea won't really work, but it's built on a reasonable premise and contains a not insignificant amount of whimsy. [+]
//This idea won't really work// I'm not sure why you'd say that.
Anyway, [+] for the complete impracticality of it. Perhaps we could enlist some monks to do the transcription... That would be better.
After a while, people will find better paper for higher & higher densities & quicker reading/writing. Before you know it, we're back to silicon with magnetic dipoles.