The human mind is very flexible and capable of re-wiring itself pretty much as needed, as has been proven by various brain implant experiments. Presently, the communication mechanism for each such experiment tends to be developed just for that experiment. For example, optic implants connect only to the optic nerve, while generic implants connect directly to the gray matter and the user is expected to do the heavy lifting of communicating with them. While every brain is different, the major centers of communication, concept storage, association, and so forth are in about the same places.

I propose a standardized brain-computer communication protocol, generalized to most people's brain types, requiring specific implant locations and involving a series of interaction steps that will almost always result in effective communication.

For example, the computer could send signals to arbitrary locations in the user's visual cortex, and the user would report back what the computer appears to be trying to say. Many such prearranged conversations would let the computer learn how best to communicate, and because most people store visual information in the same general places, mutual understanding would be established much faster.
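
A minimal sketch of that calibration conversation, in Python, assuming a hypothetical implant that can only stimulate; the class name, site numbering, and the ask_user channel are all illustrative, not a real device API:

import random

class VisualCortexImplant:
    """Stand-in for an implant that can stimulate arbitrary cortical sites."""

    def stimulate(self, site, pattern):
        # Deliver a known test pattern to one electrode site (stubbed out here).
        pass

def calibrate(implant, sites, patterns, ask_user):
    """Run the prearranged 'conversation': probe each site with a known
    pattern and record what the user reports perceiving."""
    perception_map = {}
    for site in sites:
        pattern = random.choice(patterns)
        implant.stimulate(site, pattern)
        reported = ask_user("What did you perceive at probe %s?" % site)
        perception_map[site] = (pattern, reported)
    return perception_map

# Example use; the user answers through any channel they already have:
# calibrate(VisualCortexImplant(), sites=range(8),
#           patterns=["flash", "grid", "edge"], ask_user=input)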

The idea is that, given a certain set of guidelines the computer follows, the typical user can establish effective communication faster using these protocols.

The computer-brain interface can then be used to send and receive arbitrary kinds of signals, and a computer can have a driver that sees the interface as a session-layer protocol.
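
A sketch of what that driver-level view might look like once calibration has produced a perception map; everything here (BrainSession, the read() call, the encoder/decoder callables) is an assumption about how such a session layer could be organized, not an existing API:

class BrainSession:
    """Hypothetical session-layer driver: higher layers exchange messages,
    and only the driver knows about electrode sites and patterns."""

    def __init__(self, implant, perception_map, encoder, decoder):
        self.implant = implant
        self.perception_map = perception_map
        self.encoder = encoder  # message -> sequence of (site, pattern) pairs
        self.decoder = decoder  # recorded neural activity -> message

    def send(self, message):
        # Translate the message into stimulation using the calibrated map.
        for site, pattern in self.encoder(message, self.perception_map):
            self.implant.stimulate(site, pattern)

    def receive(self):
        # Assumes the implant can also record; read() is illustrative only.
        activity = self.implant.read()
        return self.decoder(activity, self.perception_map)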

People make so much of machine-human interfaces, but it's been going on, gradually, for centuries, starting with writing, the place-value system, movable type, telephone, television, computers, internet... What's so special about surgically implanted devices? Our sensory and motor systems are still, in 2011, more sophisticated interfaces than anything implantable, and, other than the cool factor, what's so special about direct cortical stimulation?

A quick Wikipedia scan shows a reference to the word Cyborg in 1960. The idea of using a "protocol" for communication must predate Morse, to use a "recent" example.

While the earliest appearances of direct machine interfaces did not necessarily detail a protocol, it's difficult to characterize this as a new idea in any meaningful sense.

Sorry, I don't get it. So the computer sends an output to the user's visual cortex. Fine, that's already been done through a biometric implant - no biggie there. But then what? The user "respond(s) to the computer with what it's trying to say" - how? Using a non-brain interface like a keyboard and screen? Directly pressing a button? How does the user get to communicate back to the computer in this set-up? And shouldn't the whole communication loop be 'in-brain'? And if so, you need to connect the computer up to specific bits of the brain to monitor incoming traffic - something that is going to be tricky to do without invasive and 'interactive' surgery, i.e. your hardware connections have to be made at specific points in the human's CNS before the human gets the chance to 'negotiate' where those points are via your protocol.