halfbakery: Where life imitates science.
The computer security design imperative behind modern encryption is the diminution of the role of obscurity. This applies principally to the encryption algorithm itself. However, even vanguards of this approach rely crucially on obscurity in other forms: requiring the user to hide the encryption passphrase, preventing physical access or line of sight, and preventing power-analysis or acoustic inference. This means that even encryption so strong it would take all the computers on earth today the entire history of the universe to crack can still be defeated, and usually incredibly easily.

The answer to this is elegant: actually follow the security design imperative by banning all obscurity. Of course, this would mean no secret keys, since passphrases would be published. Your computer would have no password, and you would allow anyone to use it. You would also freely type your private messages in plain, unambiguous language on any public computer you wanted. The more the merrier. No more passwords to remember. No more updates. Perfect security.

No one can really know what you mean by anything you are saying, and furthermore, no one could even be sure who was writing from your email address (or anyone's, for that matter), since you expressly have no security or password.
Plausible deniability works until someone puts a gun to your head. You might be better off with private channels of communication, steganography, and/or strong encryption.
This rant ignores the reason "security by obscurity" is a Bad Thing, and the meaning of the phrase when used by people who understand information security. [-]
Then how will online banking work?
The security rule is Kerckhoffs' principle: a cryptosystem should be secure even if everything about the system, except the key, is public knowledge. "Security through obscurity" is just a pejorative term for systems which don't follow it. Clearly, your derivation is incorrect.
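Kerckhoffs' principle can be illustrated in a few lines. This is a minimal sketch using only the Python standard library; the key value and function names are placeholders, not anything from the post. The point is that the algorithm (HMAC-SHA256) is entirely public, yet the scheme remains secure as long as only the key stays secret.

```python
import hmac
import hashlib

# Hypothetical secret key; everything else about the scheme is public.
secret_key = b"the-only-secret"

def tag(message: bytes, key: bytes) -> str:
    """Compute a MAC using a fully public, well-documented algorithm."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

msg = b"transfer $100 to Alice"
t = tag(msg, secret_key)

# A verifier holding the same key accepts the genuine tag...
assert hmac.compare_digest(t, tag(msg, secret_key))

# ...while an attacker who knows the algorithm but not the key
# cannot produce a matching tag.
assert not hmac.compare_digest(t, tag(msg, b"wrong-key"))
```

Publishing the key, as the idea proposes, is precisely the one disclosure the principle forbids.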
However, I certainly can imagine that there is a place for systems where effectively everything is publicly editable. In the online domain, this makes all users effectively anonymous.
/Then how will online banking work?/

No need. Property will be held in common. 8th, this setup works well for the Borg, no?
What you need is a bait security system that leads the attacker into a honeypot, and a much more obscure, nonstandard one that is the real front door.
This will not give people security; it will give people plausible deniability, a feature most desirable to criminals.
This gets me thinking: if heaven or hell exists, there are probably no passwords in either.
With this title, the idea should be as follows:
A particular server has a display (possibly of the type favored by bad '80s hacking movies) that shows where all data in the system is stored, and when someone is accessing it. This display would be echoed (in small) everywhere that someone with legitimate access to the system was.
What the actual data is, and what manipulation is being done to it, is not apparent; but when it is being accessed, and by whom (in terms of an anonymous but persistent user ID), is. Therefore, an unusual user access would be very conspicuous.