Given that laptops generally have a camera sitting idle, why not use it for tracking eye movement? Press a designated key, and the mouse pointer jumps to wherever you're looking (hopefully not the keyboard). Doesn't need to be accurate - just accurate enough to get the pointer roughly where you want it.
Tobii
https://en.wikipedi...ki/Tobii_Technology [Inyuki, Oct 12 2015]
Golan Levine: Art that looks back at you. TED 2009
http://www.ted.com/...golan_levin_ted2009 [pashute, Oct 13 2015]
Yeah, exactly, [MaxwellBuchanan]... How can we use the camera to achieve the same functionality and quality as with the Tobii (link)?
Department store display windows?
Use a camera to see where people are looking, and then do something so that the sidewalk is full of people and the police have to be called to do crowd control.
"Yes, sir it is very cute manikin, but you can't block pedestrian traffic." |
Phones have the camera on the wrong side for this "eye tracking mouse pointer", so you will need a snap-on 180° mirror so it can see the phone user's eyeballs. Get the laptop version working first, but there are a lot of phones out there.
Or you could track the mouth: smile - up, frown - down, tongue visible - right, tongue hidden - left, cough - enter.
Such users would appear crazy and be fun to watch.
[+] although laptop cameras are generally not fast enough for eye-invoked mouse tracking. The Bar Ilan brain lab I managed had special hardware for that.
Then again, there was Golan Levine, who did something with regular laptop cameras back in 2009 (link to the TED talk in this Halfbakery idea's links section).
//laptop cameras are generally not fast enough for eye-invoked mouse tracking.//

They don't have to be. We are not asking the mouse pointer to follow your gaze. I am saying that, when you hit a given key combination, the computer captures an image and works out where you're looking. A 100ms delay would not be a disastrophe.
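As a rough sketch of that snapshot-on-hotkey flow, here is what it could look like in Python, assuming OpenCV for the camera, pyautogui for the pointer and the keyboard package for the hotkey. The gaze_to_screen() mapping below is only a crude stand-in (mean eye position in the frame), nothing Tobii-grade; a real version would need per-user calibration against known screen points.

```python
# Sketch of "press a hotkey, snap a frame, jump the pointer to the gaze point".
# Assumes OpenCV (cv2), pyautogui and the 'keyboard' package are installed;
# gaze_to_screen() is a crude placeholder, not a real gaze estimator.
import cv2
import pyautogui
import keyboard

cam = cv2.VideoCapture(0)
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")
SCREEN_W, SCREEN_H = pyautogui.size()

def gaze_to_screen(frame):
    """Placeholder: use the mean detected eye position in the camera frame
    as a proxy for where the user is looking."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, 1.3, 5)
    if len(eyes) == 0:
        return None
    cx = sum(x + w / 2 for x, y, w, h in eyes) / len(eyes)
    cy = sum(y + h / 2 for x, y, w, h in eyes) / len(eyes)
    fh, fw = gray.shape
    # Mirror horizontally so looking right moves the pointer right.
    return SCREEN_W * (1 - cx / fw), SCREEN_H * (cy / fh)

def snap_pointer():
    ok, frame = cam.read()          # one snapshot, no continuous tracking
    if not ok:
        return
    target = gaze_to_screen(frame)
    if target:
        pyautogui.moveTo(*target)   # pointer jumps roughly where you look

keyboard.add_hotkey("ctrl+alt+g", snap_pointer)
keyboard.wait("esc")
```

Since everything happens on a single keypress, the camera's frame rate barely matters; only the one-off capture and estimation latency does.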
//laptop cameras are generally not fast enough for eye-invoked mouse tracking//

But they could be at almost nil cost. Build it and they will come, as it were - if the software appeared and was popular enough, the next generation of pads, phones and laptops would have a camera specced to optimise its use.
I am still pretty sure that current cameras are good enough. As I mentioned, we don't need realtime tracking - just the ability to take a snapshot when a keyboard key is pressed, and figure out the gaze direction within, say, 100ms. I would be dumbfounded if that's not possible with existing hardware.
My memory of this is that there are lots of open source projects to do pupil and/or random-point-on-face tracking, but not to track the difference between the reflection from the front and back of your eye, which is the really accurate but proprietary kind.
I also wonder if they program physics into this, so that it is not just a one-to-one correspondence between your face or eye and the mouse pointer, but there is some kind of intelligent or other-body-part (say, foot) based capture and release, and the equivalent of friction or gravity.
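To make that "physics" idea concrete, here is a toy version: the gaze estimate pulls the pointer like a spring, friction damps it so it settles instead of jittering, and a `captured` flag stands in for the foot-pedal style capture/release. All the constants are invented for illustration, not from anything in the thread.

```python
# Toy "physics" pointer: the gaze estimate pulls the pointer, friction damps
# it, and `captured` is a stand-in for a foot-operated capture/release.
def step(pointer, velocity, gaze, captured=True,
         dt=0.016, pull=30.0, friction=6.0):
    """Advance one frame. pointer, velocity and gaze are (x, y) tuples."""
    if not captured or gaze is None:
        gaze, pull = pointer, 0.0          # released: pointer just coasts to a stop
    ax = pull * (gaze[0] - pointer[0]) - friction * velocity[0]
    ay = pull * (gaze[1] - pointer[1]) - friction * velocity[1]
    vx, vy = velocity[0] + ax * dt, velocity[1] + ay * dt
    return (pointer[0] + vx * dt, pointer[1] + vy * dt), (vx, vy)

# Example: pointer starts at (0, 0), gaze sits at (100, 50); repeated steps
# pull the pointer over and let it settle there.
p, v = (0.0, 0.0), (0.0, 0.0)
for _ in range(200):
    p, v = step(p, v, gaze=(100.0, 50.0))
print(p)   # close to (100, 50)
```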
Our eyes focus on a 3-degree cone. The rest is periphery.

But that means you really can't get better than 3 degrees (~5mm on the nearby screen) of accuracy just from looking at my eyes.
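For a sense of scale, the on-screen footprint of a viewing cone depends on the viewing distance: diameter ≈ 2·d·tan(θ/2). A quick check, assuming a typical ~50 cm laptop distance (my assumption, not a figure from the thread):

```python
import math

# On-screen diameter of a viewing cone: 2 * distance * tan(angle / 2).
# The 500 mm viewing distance is an assumed typical laptop distance.
def cone_diameter_mm(angle_deg, distance_mm):
    return 2 * distance_mm * math.tan(math.radians(angle_deg) / 2)

print(round(cone_diameter_mm(3, 500), 1))    # ~26.2 mm for a 3 degree cone at 50 cm
print(round(cone_diameter_mm(0.6, 500), 1))  # ~5.2 mm corresponds to roughly 0.6 degrees
```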
Perhaps this could be combined with minor head cues. So, you focus on the screen and a 5mm translucent pointer shows where it thinks you're looking, and then you make very minor head tilts (0.05 degree in x-y) and the 5mm pointer moves that way & gets smaller.
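A sketch of how that two-stage scheme could work: the gaze snapshot gives a coarse target, then small head tilts nudge the pointer while its radius shrinks. The gain (millimetres of pointer movement per degree of tilt) and the shrink factor are invented for illustration.

```python
# Two-stage refinement: coarse gaze fix, then tiny head tilts fine-tune it.
# mm_per_deg and shrink are invented numbers, not from the thread.
def refine(coarse_mm, tilt_deg, steps_done,
           mm_per_deg=100.0, shrink=0.8, start_radius_mm=5.0):
    """coarse_mm: (x, y) from the gaze snapshot; tilt_deg: (x, y) head tilt.
    Returns the refined target and the current pointer radius."""
    x = coarse_mm[0] + tilt_deg[0] * mm_per_deg
    y = coarse_mm[1] + tilt_deg[1] * mm_per_deg
    radius = start_radius_mm * shrink ** steps_done   # pointer shrinks as you refine
    return (x, y), radius

# A 0.05 degree tilt at a 100 mm/deg gain nudges the target by 5 mm.
print(refine((200.0, 150.0), (0.05, 0.0), steps_done=1))
```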