An omnidirectional live video stream probably isn't a novel concept (in fact I just googled it and it's not), but what I'm thinking of here is to arrange cameras on a sphere with enough overlap that it's possible to synthesise virtual cameras within that sphere so as to give a stereoscopic view.
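The camera-arrangement step can be sketched numerically. This is a minimal sketch under my own assumptions, not a design: it uses a Fibonacci lattice for roughly even spherical coverage (any even distribution would do) and simply checks that neighbouring cameras sit close enough in angle that wide-angle lenses would see mostly the same field.

```python
import math

def fibonacci_sphere(n):
    """Place n points roughly evenly on a unit sphere (Fibonacci lattice)."""
    pts = []
    golden = math.pi * (3.0 - math.sqrt(5.0))  # golden angle in radians
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n          # y runs from +1 down to -1
        r = math.sqrt(1.0 - y * y)             # radius of the circle at height y
        theta = golden * i
        pts.append((r * math.cos(theta), y, r * math.sin(theta)))
    return pts

def min_neighbour_angle(pts):
    """Smallest angular gap between any two cameras, in degrees."""
    best = 180.0
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            dot = sum(a * b for a, b in zip(pts[i], pts[j]))
            best = min(best, math.degrees(math.acos(max(-1.0, min(1.0, dot)))))
    return best

cams = fibonacci_sphere(20)
# With 20 cameras, nearest neighbours end up around 40 degrees apart, so
# 120-degree fisheye lenses would give generous overlap between neighbours.
print(min_neighbour_angle(cams))
```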
Then to put that into a hamster ball, with a movable ballast, and to chase the cat around the house while watching from the safety of your VR headset.
The whole VR latency problem is kept within the system of virtual camera synthesis, which would be done locally to the user (and independently for each eye, and for multiple viewers), so ping time to The Ball isn't going to be the most nauseating factor. The nauseating bit should be whatever the limits are of real-time virtual camera synthesis.
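The latency split can be made concrete with some illustrative, entirely made-up numbers: only the local synthesis path contributes to motion-to-photon latency (the figure that drives VR nausea), while the remote path merely makes the scene content stale.

```python
# Illustrative latency budget (made-up numbers, not measurements).
local_ms = {"head tracking": 2, "view synthesis": 8, "display scanout": 8}
remote_ms = {"camera exposure": 10, "encode": 15, "network round trip": 40, "decode": 10}

# Only the local path counts toward motion-to-photon latency; the remote
# path just delays the scene content itself, which is far less nauseating.
motion_to_photon = sum(local_ms.values())
scene_age = motion_to_photon + sum(remote_ms.values())

print(motion_to_photon, scene_age)  # → 18 93
```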
Preheated by xkcd
http://www.explainx...ex.php/413:_New_Pet [notexactly, Apr 18 2016]
Further preheated by xkcd readers
https://blog.xkcd.c...ay-robots/#comments [notexactly, Apr 18 2016]
BB-8 App-Enabled Droid || Built by Sphero
http://www.sphero.com/starwars [tatterdemalion, Apr 18 2016]
Virtual viewpoint video
http://research.mic...ond/groups/ivm/VVV/ Microsoft demo of viewpoint interpolation technology [b153b, Apr 18 2016, last modified Apr 19 2016]
Jump
https://www.google....get/cardboard/jump/ Stereo video capture on a ring rather than a sphere. [b153b, Apr 19 2016]
Put the camera on top of the ball, in the manner of the BB-8 Star Wars droid toys. With some extra widget you could get the camera to move with the headset. (This is pretty much the XKCD thing now that I think of it.)
So to make a stereo image, you can't simply transmit two omnidirectional videos, because you can't rotate the two cameras in perfect synchronisation with the user's head (most particularly if you have more than one viewer).
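To make the geometry concrete, here is a sketch of my own (assuming a 64 mm interpupillary distance and yaw-only head rotation) of how the per-eye viewpoints follow the head: the stereo baseline rotates rigidly with the gaze, which is exactly what a fixed physical camera pair cannot do, let alone for several viewers at once.

```python
import math

def eye_positions(head_pos, yaw_deg, ipd=0.064):
    """Left and right eye centres for a head at head_pos, rotated yaw_deg
    about the vertical axis. Gaze is along +z at yaw 0; the 64 mm ipd is
    an assumed average interpupillary distance."""
    yaw = math.radians(yaw_deg)
    right = (math.cos(yaw), 0.0, -math.sin(yaw))   # the head's "right" vector
    half = ipd / 2.0
    left_eye = tuple(p - half * r for p, r in zip(head_pos, right))
    right_eye = tuple(p + half * r for p, r in zip(head_pos, right))
    return left_eye, right_eye

# Turning the head 90 degrees swings the whole baseline onto another axis:
# something two fixed cameras can't follow, but synthesised ones can.
print(eye_positions((0.0, 1.6, 0.0), 0.0))
print(eye_positions((0.0, 1.6, 0.0), 90.0))
```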
This is why I was thinking of using viewpoint interpolation. With several cameras covering every visible point you can, in theory, use the combined polyscopic information to synthesise a virtually-positioned omnidirectional camera for each eyeball. If you do this viewpoint synthesis locally to the user then you can synthesise new eyeballs as needed as they move their head, and you don't have to rely on the robot being able to swivel its cameras at the same rate as a human can.
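By way of illustration, here is a toy one-dimensional version of viewpoint interpolation: given one source image and per-pixel disparity, synthesise a view partway along the baseline by shifting each pixel by a fraction of its disparity. This is my own simplification of the general technique, not the Microsoft algorithm linked above.

```python
def interpolate_view(image, disparity, t):
    """Forward-warp a 1-D image toward a viewpoint at fraction t of the
    baseline. Closer pixels (larger disparity) shift further."""
    out = [None] * len(image)
    for x, (value, d) in enumerate(zip(image, disparity)):
        nx = x - round(t * d)      # shift by a fraction of the disparity
        if 0 <= nx < len(out):
            out[nx] = value        # last write wins; real systems resolve by depth
    return out

image = [10, 20, 30, 40, 50]
disparity = [0, 0, 2, 2, 0]        # the middle pixels are closer to the camera
print(interpolate_view(image, disparity, 0.5))  # → [10, 30, 40, None, 50]
```

The `None` hole in the output is a disocclusion: a patch visible from the new viewpoint that no source pixel landed on. Filling such holes convincingly, per frame and per eye, is where the artefacts come from.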
Strictly, this doesn't have to be done inside a hamster ball... but I like the idea of projecting the human into a much smaller space in a domestic setting. Like a six-inch hamster ball, confronted by an enormous cat.
Helicopters and drones have been done already, and I assume that James Cameron is fixing omnidirectional video stream cameras to a deep-sea exploration robot as we speak (or has done so already).
Clearly the hamsterballverse is the next great unexplored space.
What I'm thinking is that your virtual viewpoints are constrained to being within the sphere (or more specifically, within the polyhedron defined by the cameras on its surface). Even for human-scale parallax you need only a three-inch separation between cameras that can see most of the same field.
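A crude sketch of that constraint (the ball size, margin, and units are my assumptions, and the inscribed polyhedron is approximated by simply shrinking the sphere radius): a virtual eye is only synthesisable if it lies inside the camera shell, and a full 64 mm human baseline still fits comfortably inside a six-inch ball.

```python
import math

BALL_RADIUS = 0.075   # hypothetical six-inch (0.15 m) hamster ball, in metres
MARGIN = 0.9          # crude stand-in for the polyhedron inscribed in the sphere

def eye_is_synthesisable(eye):
    """A virtual eye position is usable if it sits inside the camera shell."""
    return math.sqrt(sum(c * c for c in eye)) < BALL_RADIUS * MARGIN

print(eye_is_synthesisable((0.0, 0.032, 0.0)))   # half a human IPD off-centre
print(eye_is_synthesisable((0.0, 0.080, 0.0)))   # outside the ball entirely
```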
But it all depends on viewpoint interpolation technology (like the Microsoft one I linked) working well enough to not produce nauseating artefacts.
Because I originally read about Microsoft's research in this area more than ten years ago, I assumed that the processing required would be trivial by now. Given that I haven't heard much about it since, it probably doesn't work. Or maybe nobody knew what to do with it?
But that's just for stereo. Maybe mono is sufficient.