Very soon, it seems, we will no longer be able to trust our eyes when it comes to video of speeches by, well, anyone important. Deepfake videos of every major candidate are most probably being developed at this very moment, showing them saying or doing something slightly off from their usual, in order to tilt the next election cycle and drive the news.

Fortunately, we have also developed some methods of transaction verification. Bitcoin records every transaction of a mined coin as it progresses through its economic life. Bear with me; computer cryptography, programming, and hashing aren't really my thing. Computers are barely my thing.

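For the unfamiliar, the tamper-evidence being gestured at here comes from hash-chaining: each record carries the hash of the record before it, so altering any past entry invalidates every later hash. A minimal sketch in Python (the record fields are invented for illustration):

```python
import hashlib
import json

def chain(records):
    """Link records so each entry carries the hash of its predecessor."""
    prev_hash = "0" * 64  # genesis entry has no predecessor
    ledger = []
    for rec in records:
        entry = {"record": rec, "prev_hash": prev_hash}
        # Hash the canonical JSON form of the entry itself.
        prev_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = prev_hash
        ledger.append(entry)
    return ledger

# Hypothetical coin history: tampering with the first entry would
# change its hash and break the link stored in the second.
ledger = chain([{"coin": 7, "to": "alice"}, {"coin": 7, "to": "bob"}])
```
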
Video viewership can be seen as a type of transaction. An authenticated video of a real person at a real event could be recorded along with a hash of the event's location, date, etc., which could be processed by a datacenter and turned into a key of some type. News orgs wishing to verify that said video was real could register with that datacenter and retrieve a handshake of some form that agrees with the key. Or the video itself could be hashed into some kind of strong encryption which is then verified against a key somehow.

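One plausible reading of this scheme: hash the video bytes together with the event metadata, have the datacenter sign that hash with a private key, and let news orgs verify the signature against the corresponding public key. A rough sketch using the third-party `cryptography` package; the registrar role and the metadata fields are assumptions for illustration, not a real service:

```python
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- at recording time (the hypothetical registrar/datacenter) ---
registrar_key = Ed25519PrivateKey.generate()

video_bytes = b"...raw video data..."
metadata = {"location": "51.5N, 0.1W", "date": "2019-03-16"}

# Hash the video bytes together with the event metadata.
digest = hashlib.sha256(
    video_bytes + json.dumps(metadata, sort_keys=True).encode()
).digest()
signature = registrar_key.sign(digest)

# --- at verification time (a news org holding the public key) ---
public_key = registrar_key.public_key()
check = hashlib.sha256(
    video_bytes + json.dumps(metadata, sort_keys=True).encode()
).digest()
public_key.verify(signature, check)  # raises InvalidSignature if anything changed
```

Any edit to the video or its metadata changes the hash, so the old signature no longer verifies. What this can't do, as noted below, is stop a fake from spreading faster than the verification.
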
The primary weakness of this type of verification method would be time: the true video would have a hard time staying ahead of the deepfakes shared on social media, or wherever these types of things are disseminated.

Annotation:

Before the Singularity, this would have made some sense. But now, when everything is just an artefact of [Ian Tindale]'s deranged imagination, it seems a trifle pointless.

Perhaps the singularity will rule us this way. For some reason this seems strangely comforting.

New Scientist, 16 March 2019, pp. 22-23, "Don't believe your eyes: smartphones equipped with artificially intelligent cameras are changing how we see reality". P. 23, column 2: "If a whole generation grows up not trusting photos or videos, what do we have left?" says Raskar. Digital watermarking and cloud-based encryption are possibilities, he says, as is using the blockchain technology behind bitcoin.

So I'm on the right track. I'll take that as confirmation.

Isn't the bigger problem not trusting reality?

You cannot fix fakes with encryption, unless we imagine all recording equipment having encryption built in, in a way that is not modifiable or "turn-offable".

Computer analysis would be able to distinguish fakes for a long time to come, in the same way that you can still tell CGI even when the movie costs $250 million to make. But it will get harder.

I suspect that, given where drones and video generally are going, people who care will likely have alibi drones. You could come up with a system where their video would not be accessible without witness cooperation (to avoid self-incrimination), but multiple sources could then verify where/what you were doing when the deepfake allegedly occurred.

This is why I have for a while wanted a way to cryptographically sign things I say or write before they leave my brain, using a private key that never leaves my brain. Unfortunately, I have as yet no idea how to begin to conceive of a mechanism that could accomplish that. A probably minor difficulty that occurs to me right now is that it's impossible with current algorithms (that I know about) to sign a text before the wording is finalized, while things that one speaks and writes hardly ever have the final wording before they come out.

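For what it's worth, the "finalized wording" difficulty falls out of how ordinary digital signatures work: the signature is computed over the exact bytes of the message, so changing even one character invalidates it. A minimal Ed25519 demonstration (hypothetical key, purely to show the property):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()   # stand-in for the key that never leaves the brain
pub = key.public_key()

sig = key.sign(b"I never said that.")

pub.verify(sig, b"I never said that.")      # passes: exact bytes match
try:
    pub.verify(sig, b"I never said that!")  # one character changed
except InvalidSignature:
    print("signature does not cover this wording")
```
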
On the other hand, deepfakes give a lot more plausible deniability, which can be a good thing too (for the individual accused, at least). For example, if you want to get away with having said racist things in the past, you can just claim someone faked the video. But if you signed the words when they came out of your mouth, you can't.

Didn't I read something about attempts at using quantum mechanics for information security a while back, where you could always detect if a transmission was compromised? Couldn't something like that be utilized?

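This presumably refers to quantum key distribution (BB84), where measuring a photon in the wrong basis disturbs it, so an eavesdropper shows up as an elevated error rate when the two ends compare a sample of their bits. A toy classical simulation of that detection logic, not real quantum mechanics, just the bookkeeping:

```python
import random

def bb84_sifted_error_rate(n=4000, eve=True):
    """Fraction of mismatched bits after Alice and Bob keep only
    the positions where they happened to use the same basis."""
    errors = kept = 0
    for _ in range(n):
        bit_a = random.randint(0, 1)        # Alice's bit
        basis_a = random.randint(0, 1)      # Alice's encoding basis
        bit, basis = bit_a, basis_a         # state of the photon in flight
        if eve:
            basis_e = random.randint(0, 1)  # Eve must guess a basis
            if basis_e != basis:
                bit = random.randint(0, 1)  # wrong-basis measurement randomizes
            basis = basis_e                 # photon re-emitted in Eve's basis
        basis_b = random.randint(0, 1)      # Bob's measurement basis
        bit_b = bit if basis_b == basis else random.randint(0, 1)
        if basis_b == basis_a:              # bases compared over a public channel
            kept += 1
            errors += bit_b != bit_a
    return errors / kept

print(bb84_sifted_error_rate(eve=False))  # ~0.00: channel looks clean
print(bb84_sifted_error_rate(eve=True))   # ~0.25: eavesdropper detected
```
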
Errr, how do we know you posted this?

Come to that, how do you know anything?

<Wonders if teaching [n_m_rm] phenomenology will cause him to self-destruct/>

<Decides it's worth a try, cleans blackboard, shuffles lecture notes/>