http://www.youtube.com/watch?feature=player_embedded&v=CR_LBcZg_84 This talk is about research done to allow a monkey to control a robotic arm, as well as a virtual arm, including receiving sensory feedback from the virtual arm. Potential path to a direct neural interface for VR?
Yeah, we are certainly making progress. I'm always telling people that our technology is more advanced than people realize; it's just not always apparent or on the consumer market yet. I occasionally have to catch myself with that too- I remember being shocked when I realized how much processing power a smartphone had.
We can already do vision (thank you, Rift) and hearing (headphones). Those are easily two of our highest-bandwidth senses. We also have limited control over balance/acceleration (GVS).
The last sense we need to spoof for full immersion is proprioception (being able to tell how our limbs are oriented without looking). From there, it will be a simple matter of putting the patient into a "sleep paralysis"-like state and measuring the neuronal impulses heading toward the muscles. It still won't be cheap with all those electrodes, but at least it will be possible.
Any sense after that (smell, taste, more haptics, etc.) will be icing on the cake. Studies have shown that resistance to movement, sensed through proprioception, is a major part of the haptic sense. The other part is merely sensation on the surface of the skin.
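Reading motor commands headed for the muscles is conceptually similar to how surface-EMG intent detection works today: rectify the signal, smooth it into an envelope, and threshold. A minimal sketch of that idea (all names, thresholds, and sample values here are illustrative assumptions, not a real BCI pipeline):

```python
# Hypothetical sketch: detecting intended movement from an EMG-like signal
# by rectification plus a moving-average envelope. Thresholds are made up.

def movement_envelope(samples, window=5):
    """Rectify the signal and smooth it with a trailing moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        env.append(sum(rectified[lo:i + 1]) / (i - lo + 1))
    return env

def intent_detected(samples, threshold=0.5, window=5):
    """True if the smoothed envelope ever crosses the threshold."""
    return any(e > threshold for e in movement_envelope(samples, window))

# Quiet baseline vs. a burst of muscle-bound activity:
quiet = [0.01, -0.02, 0.01, 0.0, -0.01, 0.02]
burst = [0.01, -0.02, 0.9, -1.1, 1.0, -0.8, 0.02]
```

The same envelope-and-threshold structure is what lets a system ignore noise while still catching a genuine movement command.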
Proprioception with phantom limbs is actually something that can happen naturally during sleep paralysis. Personally, I'm able to move my dream limbs during sleep paralysis and sometimes transition into an "out of body experience" by pulling an imaginary rope. Not saying I'm actually leaving my body, just that I perceive myself as doing so. And that's all that matters, really, since it will light up the same brain areas for movement which could then be read by a BCI.
Of course if you could induce OOBs with a drug, why even bother with video games?
With a snug-fitting exoskeleton haptic feedback suit, you should be able to apply pressure in any direction to simulate gravity or weightlessness. Like floating in neutral buoyancy, perhaps...
@Exoskeleton Any pressure force from above would be met with an equal pressure force from underneath?
Regarding the haptic exoskeleton, I was thinking that it could apply a force that you resist, like standing against gravity. Because the tug of gravity on your limbs is supposed to be how you sense gravity, perhaps the exosuit could apply force to all your limb joints just as gravity does, essentially simulating gravity. If the simulated gravity were stronger than real gravity, you should feel the change in perceived direction.
Just a personal thought experiment. I do not know if it would actually work. I would like to try such a gravity simulation using a haptic exosuit though.
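The gravity-tug idea can be made a bit more concrete: for a simple limb model, the suit would need to reproduce the torque gravity exerts at each joint. A back-of-the-envelope sketch, assuming each limb segment is a uniform rod (the masses and lengths below are illustrative, not anatomical data):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gravity_torque(m, L, theta_rad):
    """Torque (N*m) gravity exerts about a joint for a uniform limb segment
    of mass m (kg) and length L (m), held at angle theta from horizontal.
    Center of mass of a uniform rod sits at L/2."""
    return m * G * (L / 2) * math.cos(theta_rad)

# A 1.5 kg forearm, 0.3 m long, held horizontally vs. hanging straight down:
horizontal = gravity_torque(1.5, 0.3, 0.0)        # maximum torque
hanging = gravity_torque(1.5, 0.3, math.pi / 2)   # essentially zero torque
```

An exosuit simulating gravity in a different direction would need to drive each joint with this kind of angle-dependent torque profile, which is why the posture-tracking requirements are nontrivial.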
Or perhaps we can use electrical stimulation of other parts of the body, similar to GVS (galvanic vestibular stimulation). If you apply enough current to enough electrodes attached to enough locations on your body, who knows WHAT you can accomplish, eh?
I'm pretty sure that one of the major ways we sense gravity is through our inner ear, the same way we detect rotation, movement, etc. This can only be truly simulated through acceleration (or GVS, a neural interface, etc.).
No, I think he's right that tug-on-limbs is a core component of the gravity sensation. More than that, I think it's also how we perceive the positions of our internal organs; Wikipedia describes this as an "internal sense". When we're lying down, we can feel our guts tug downward inside our body, so I think that will be really hard to fake without some hardcore neural engineering.
geekmaster wrote:
If you apply enough current to enough electrodes attached to enough locations on your body, who knows WHAT you can accomplish, eh?
I'm pretty sure that one of the major ways we sense gravity is through our inner ear. The same as how we detect rotation, movement, etc. This can only be truly simulated through acceleration (or GVS, neural interface, etc.)
No, the inner ear only senses rotation when fluid moves through the inner ear canals. It cannot directly sense a nonrotational force such as gravity, which is sensed by the force we feel pulling on our body, trying to rotate our joints to minimize the load on our weaker muscles.
EDIT: I have since learned that the inner ear also contains weighted structures (the otolith organs) whose hair cells deflect under linear acceleration, allowing us to sense acceleration direction. This means that in addition to the forces on the limbs and organs, the inner ear also contributes to the sense of gravity.
A better way to look at the inner ear is as a set of three roughly circular tubes filled with fluid, one aligned to each rotational axis. When the body rotates, the tubes rotate around the fluid (mostly), causing the fluid to bend little hairs attached to nerves that sense which way and how far the hairs bend. That measures the rotational velocity on each axis. The hairs cause some friction, so the fluid gradually accelerates with the rotating canals. If you rotate long enough to get significant fluid motion, stopping the rotation of the body causes the fluid to bend the hairs in the other direction, causing dizziness.
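That lag can be sketched with a first-order model: the endolymph velocity chases head velocity with a time constant, so after a long spin stops abruptly, the sensed rotation briefly reverses sign. This is a toy model with made-up constants, not a physiological simulation:

```python
# Toy first-order model of semicircular-canal fluid lag. The hair cells sense
# the difference between head velocity and fluid velocity; friction drags the
# fluid toward the head's velocity over time. Time constant is illustrative.

def sensed_rotation(head_velocity, tau=4.0, dt=0.01):
    """Per-step 'sensed' signal: head velocity minus fluid velocity."""
    fluid = 0.0
    sensed = []
    for w in head_velocity:
        sensed.append(w - fluid)
        fluid += (w - fluid) * (dt / tau)  # fluid accelerates toward head velocity
    return sensed

# Spin at constant velocity for 20 s, then stop abruptly:
profile = [1.0] * 2000 + [0.0] * 500
signal = sensed_rotation(profile)
# At onset we sense the full spin; after stopping, the still-moving fluid
# makes the signal go negative -- the post-rotation dizziness described above.
```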
The body really does not like to feel motion when the eyes say you are not moving (such as when you are intoxicated and the room spins around you). This is a problem in VR too.
Gravity is sensed mostly by force pulling on your limbs (and internal organs, to some degree). A simulated force stronger than the natural one may override it, changing the perceived direction of gravity (or acceleration). Of course, there are complications with trying to simulate gravity. The easiest way to CHANGE the direction is with a motion simulator (such as a Stewart platform). To change the magnitude, you need a centrifuge to increase it, a parabolic (ballistic) flight path to reduce it, or some (unknown) form of electrical stimulation to simulate it.
Transitory dynamic forces (such as vibration) are much easier to simulate.
Brain-to-brain interface allows transmission of tactile and motor information between rats: researchers have electronically linked the brains of pairs of rats for the first time, enabling them to communicate directly to solve simple behavioral puzzles. A further test of this work successfully linked the brains of two animals thousands of miles apart, one in Durham, N.C., and one in Natal, Brazil.
When quoting posts for another thread, I sometimes click the WRONG submit button! Oh, well, let's put something thread-relevant here in place of the previous redundant quote:
Pulling images from your "mind's eye" off your retina was mentioned in that video. That coincides with something I read a long time ago about how 90 percent of the traffic in the optic nerve is OUTPUT data and only 10 percent input. The speculation at that time was that the brain is asking the retina "do you see this, do you see that?" and the retina is just answering "yes" or "no", as a form of "dictionary-based" data compression for the limited bandwidth of the optic nerve fibers. It would be VERY interesting to see a modern interpretation of this, based on whatever Mary Lou Jepsen was referring to in that video.
I find that a very interesting "coincidence", and I would love to see what has developed in that area in recent years.
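The "do you see this?" idea is essentially predictive coding: transmit one bit saying whether a prediction matched, and send raw data only on a miss. A toy sketch of that bandwidth saving (purely illustrative; the codebook and patterns are made up, and this is not a model of the retina):

```python
# Toy "dictionary-based" compression in the spirit of the query/answer idea:
# the receiver holds a codebook of expected patterns; the sender answers each
# query with one bit (match / no match) plus the raw pattern only on a miss.

def encode(patterns, codebook):
    """Return (bits, raw_misses): one yes/no bit per pattern, raw data on miss."""
    bits, misses = [], []
    for p in patterns:
        if p in codebook:
            bits.append(1)
        else:
            bits.append(0)
            misses.append(p)
    return bits, misses

codebook = {"edge_h", "edge_v", "blob"}
seen = ["edge_h", "edge_h", "blob", "corner", "edge_v"]
bits, misses = encode(seen, codebook)
# Five one-bit answers plus a single raw pattern, instead of five raw patterns.
```

The better the shared dictionary predicts the input, the closer the channel cost gets to one bit per query, which is the appeal of the scheme for a bandwidth-limited nerve.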
Thanks for the recommendations. Neuroscience is attracting me more and more, especially in the context of VR. I'll keep the recommendations in mind. First I have to read more about anatomy, physiology, and neurons. I guess that's what David Courtnay Marr did. EDIT: I wonder if it will one day be possible to fully embody a body other than a humanoid one, something like a bird's body. From what I've seen, control of additional limbs should be possible, but another skin and the feeling of muscles?
By the way, has anyone heard of a vitrectomy? http://en.wikipedia.org/wiki/Vitrectomy Replacing the vitreous humour. LOL. Shouldn't it be possible, then, to put a µm-thick spherically curved OLED microdisplay and a microcamera into an eye? Until today there has been no need for such a surgery, because microdisplays with that high a resolution aren't there yet, but in the future it could be considered. In fact, we are not that far away from:
... i guess thats what David Courtnay Marr did. ...
Hmm... I remember his name as "David L. Marr", but I see that Google says you are right. Some of those brain cells must have gotten recycled (or gone missing) after all these years... But yeah, he wrote some really good stuff... And I removed the errant "L." from my post.
... Shouldn't it be possible, then, to put a µm-thick spherically curved OLED microdisplay and a microcamera into an eye? ... In fact, we are not that far away from:
Perhaps closer than we think:
And guess who has "patents" on this obvious biotech: