Virtual Reality
Prof. Steve Lavalle
Department of Multidisciplinary
Indian Institute of Technology, Madras

Lecture - 2-1
Bird’s-eye view (hardware)
(Refer Slide Time: 00:15)

So, again, this is part of the bird's-eye view, and I want to talk about hardware. One thing to talk about is displays. For the visual sense, the examples are screens: maybe going back to cathode ray tube monitors (I think you are old enough to remember those, at least), LCDs, plasmas, organic LEDs, and so forth. And you might want to involve lenses in some kind of optical system if you are making a head-mounted display out of these. You can get a little more exotic and do a virtual retinal display. There is a lot of work on this going on in industry, much of it, let us say, behind closed doors.
So there is a lot of research and a lot of prototypes being developed with virtual retinal displays: essentially, you take a tiny projector and project images directly onto the retina. Of course, the light is still going through the lens of the eye, so it is not exactly what you might think; but at least you are not simply looking at some pixels and dealing with whatever the relationship is between them and how they fall onto the retina. You are directly considering what information should be presented to the retina. So that is one possibility. There is also an interesting and exotic family of displays called light field displays; you can research these on your own, as I will not cover them.
In this case, rather than presenting the information as a 2-dimensional projection, they try to capture the entire field of light within a 3-dimensional region. This is very helpful: when you rotate your head and your eyes translate through space, or you move your head around inside of, let us say, a cubic volume, the information actually presented to your retina can be adjusted according to your position and orientation in that space. So you capture the entire light field, using something like a plenoptic camera array or some other kind of device (which is also very difficult to do), and then present that information in some way so that, in real time, you can take these small motions into account and present the entire light field. So that is another possibility. And I guess somewhere in here I should have mentioned projectors; I will put them up here by screens. In fact, I probably should have put them first: we have had movie projectors for a long time, going all the way back to the beginning of film, like the train pulling into the station from 1895.
So maybe projectors should even be first there. That was visual; for audio, again just think about displays.
(Refer Slide Time: 03:49)

For audio: speakers, or earphones, which are small speakers. You can also fool your auditory sense by bone conduction; there are some bone conduction audio techniques. For touch, there are what are called haptic devices. I will not have time to cover these in the class, but a couple of general categories are worth considering: cutaneous versus kinesthetic. That is a nice way to divide them up. An example of the first one is getting some kind of vibration as feedback; a rumble pack in a game controller would be an example of that. Or maybe my motions actually get blocked and I feel a resistive force: there may be a robot on the other end pushing back at me, providing the right force so that it feels like I am touching a wall. I keep getting stopped every time I put my hand out, but maybe it is a robot hand that is touching mine, let us say.
So that would be an example of kinesthetic feedback; you can provide this kind of feedback as well. As for smell, taste, or vestibular stimulation, I will not include too many examples, but there is certainly ongoing research on those as well: providing the right chemical feedback to trick your sense of smell or taste, and electrical stimulation of the vestibular organ is possible. But I do not recommend trying it; it is very painful, from what I hear. As for other hardware: we of course have computers in use, and we have graphics processing units. Let me just kind of list these off.
(Refer Slide Time: 06:04)

So, computers and GPUs are inside of there; let us say CPUs and GPUs. I should also say that there are inputs: in this case, maybe something like a game controller or a keyboard, and we talked about data gloves, which have been around for a long time.
There is a lot of interesting research and development going on in this part, by the way; for input devices for virtual reality there is a lot of open work to be done, because once you put on a virtual reality headset it is like a blindfold, and probably the last thing you want to be doing is feeling around for the keyboard. So how can you do things? Suppose you want to sit and write software inside virtual reality; you just want that to become your desktop. How do you do that? How do you enter data fast enough and in a comfortable way, and interact with your environment? Is it going to be a keyboard and mouse, or is it going to be something else? One more bit I want to say about hardware is what we use for tracking. Tracking is going to become very important: as I talked about in the case of a head-mounted display, when you turn your head, the visual information has to update correctly.
So that it really looks like the virtual world, or this alternate world, is responding appropriately to fool your brain. So tracking becomes very important, and the more portable the device gets, or the closer it gets to the actual sense organ, the more kinds of tracking you need. If I am in a CAVE system, I can see my entire body as I walk around, right? If I put on a virtual reality headset and look down, I do not see my body. So I have to invent a body; how do I do that? Well, I can track my body and then bring that into virtual reality, but that is a lot of work, and it is very challenging. It is reasonable as a class project, but it is very hard to make it 100 percent reliable and accurate.
So these things do not work very well; they look great and make great demos, but they are very, very hard to get right. You could put on a motion capture suit, a full body suit with markers, which is a lot better; but if you are going to do that much work, why not just build a CAVE system? So it is a very interesting kind of problem. Now I want to talk about tracking, and this will finish off the hardware. Where is my chalk? Let us see; leaving it there, all right.
(Refer Slide Time: 09:02)

Tracking: you can call this tracking, or estimation, or filtering; people in electrical engineering will tend to use words more like the latter. One of the most useful components is the inertial measurement unit, which people call an IMU.
IMUs have been around for a long time; they were originally designed for navigation, and they are especially useful in aircraft and spacecraft. So if you have flown on airplanes before, almost certainly there was an IMU on board, usually a rather large mechanical device. The purpose is to provide orientation estimation. So these are used for orientation (or rotation, if you prefer) estimation; that is their main design. What an IMU measures is 3D angular velocity and 3D linear acceleration. The angular velocity measuring part is usually called a gyroscope.
(Refer Slide Time: 11:06)

Sometimes people refer to the entire inertial measurement unit as a gyroscope; I think that is not quite right. And the other part is called, appropriately, an accelerometer.
So this measures linear acceleration, but it is also measuring acceleration due to gravity, because there is no natural way to separate true linear acceleration with respect to the fixed Earth from gravity; you are always measuring the vector sum of the two. It is as if we were all on a rocket ship right now, accelerating upward at 9.81 meters per second squared, which is why we are all stuck to the floor. So that acceleration is always being measured as well. So we have the inertial measurement unit. The interesting thing that has happened in this technology is that our smartphones have IMUs inside, mainly so that you can play apps where a ball rolls around, or, I guess, so the phone can tell whether the screen needs to be reoriented. That frequently seems to fail, and it often fails because you are moving while reorienting, so the linear acceleration of the device gets confused with gravity; there are all kinds of issues like that.
But these started appearing in smartphones, became very, very cheap, and are mass produced on the order of hundreds of millions. Now you can just take those, put them inside of virtual reality headsets, and you have an IMU to use that does not cost very much money at all. Since they can be mass produced, you can also put them inside of earphones, you can stick them on robots, you can put them all over the place to estimate orientation. So this is one of the greatest enabling technologies.
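To make the gyroscope and accelerometer discussion concrete, here is a minimal sketch in Python of a complementary filter, one common way to fuse the two sensors into a tilt estimate. This is my own illustration, not something from the lecture; the function name, the axis convention (x forward), and the 0.98 blend factor are all assumptions.

    import numpy as np

    def complementary_filter_step(pitch, gyro_rate, accel, dt, alpha=0.98):
        # pitch:     current pitch estimate (radians)
        # gyro_rate: angular velocity about the pitch axis (rad/s), from the gyroscope
        # accel:     (ax, ay, az) in m/s^2 from the accelerometer; as noted in the
        #            lecture, this is the vector sum of true acceleration and gravity
        # Integrating the gyroscope is accurate over short intervals but drifts.
        pitch_gyro = pitch + gyro_rate * dt
        # When roughly stationary, the accelerometer mostly measures gravity,
        # so it gives an absolute, drift-free tilt reference.
        ax, ay, az = accel
        pitch_accel = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
        # Blend: trust the gyro at high frequency, the accelerometer at low.
        return alpha * pitch_gyro + (1 - alpha) * pitch_accel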
(Refer Slide Time: 12:53)

One thing that is very closely related to the inertial measurement unit, and is sometimes considered a part of it, is the magnetometer, which measures 3D magnetic fields. You can sort of think of it as a compass, but it is not exactly a compass: it is going to measure the vector sum of the Earth's magnetic field and whatever other magnetic fields are around, in the building, let us say, or on the circuit board that contains the magnetometer. So it is just generally measuring magnetic fields, and it is also used to provide orientation information.
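As a small illustration (my own sketch with made-up numbers, not from the lecture), here is how a level magnetometer reading might be turned into a compass heading; the axis convention (x forward, y left) is an assumption, and the result is only meaningful when the sensor is held level and local disturbances are small.

    import math

    # Hypothetical magnetometer reading in microtesla, sensor held level.
    mx, my, mz = 22.0, -4.0, 43.0

    # Heading east of magnetic north; remember this is really the vector sum of
    # the Earth's field plus whatever the building and circuit board contribute.
    heading_deg = (math.degrees(math.atan2(-my, mx)) + 360.0) % 360.0
    print(f"heading: {heading_deg:.1f} degrees east of magnetic north")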
As for other hardware, cameras have become quite useful. You will notice that on the Oculus DK2 used in the lab there is a camera; that is being used as part of the tracking system as well. When you use cameras, there are generally two different ways of doing tracking. The names are inside-out and outside-in. Honestly, I found these very confusing; it took me at least a few weeks to keep them straight, but these are the terms that are used. In the inside-out case (this is a top-down view), you are wearing a headset and there is a camera on it, here, let us say. Then there are some markers in the physical world; I will draw the features or markers as pluses. The camera sees these markers.
Based on how these markers appear to the camera that is fixed to the headset, you can figure out where you are. These markers could be engineered, or you could use natural features from the environment. If you engineer them yourself, you get very accurate, very reliable performance. If you just extract them automatically, it makes a fun project and perhaps an interesting demo, but it is very hard to get very high accuracy and reliability that way. Either way, this is inside-out. The other way is outside-in, which is what happens in the Oculus Rift DK2 case. Picture looking straight down at the head: you have the headset on, and there is a camera out here, looking. The features (I will draw them in another colour) are on the headset itself. In the case of the Oculus Rift DK2 that you will use in the lab, these are infrared LEDs embedded inside the headset, and the plastic is transparent to infrared.
But in the visible spectrum it looks black, so you cannot see the LEDs, even though they are in there. So in this case the headset carries the markers: outside-in means the camera is outside looking inward, whereas inside-out means the camera is on the headset looking outward. These are ways to use cameras, and we can talk about the technology of doing this tracking later in the course; I will spend some time on that. Of course, you can add depth information to the cameras: using current technology like the Kinect, you get what are called RGBD sensors, giving you colour and distance information. So there are cameras for that, and those are going to be very important in the coming few years: as the cost goes down, the reliability goes up, and the accuracy improves, more and more technologies will use depth cameras for various kinds of tracking. There are also methods that use electromagnetic or magnetic fields generated by a base station, and you can do tracking that way.
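As a concrete illustration of how camera-based tracking can recover pose from engineered markers, here is a small sketch using OpenCV's solvePnP. This is my own example, not the method used by the DK2; all of the marker coordinates, pixel detections, and camera intrinsics below are made-up values.

    import numpy as np
    import cv2

    # Known 3D positions of four engineered markers, in metres (hypothetical).
    object_points = np.array([[0.0, 0.0, 0.0],
                              [0.2, 0.0, 0.0],
                              [0.2, 0.2, 0.0],
                              [0.0, 0.2, 0.0]], dtype=np.float64)

    # Where those markers were detected in the camera image, in pixels (hypothetical).
    image_points = np.array([[320.0, 240.0],
                             [420.0, 238.0],
                             [424.0, 140.0],
                             [318.0, 142.0]], dtype=np.float64)

    # Camera intrinsics from a prior calibration: focal length 600 px,
    # principal point at the centre of an assumed 640 x 480 image.
    K = np.array([[600.0,   0.0, 320.0],
                  [  0.0, 600.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    # Solve the perspective-n-point problem for the camera's pose
    # relative to the markers.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    print("rotation:\n", R, "\ntranslation:", tvec.ravel())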
One thing to think about throughout these tracking technologies is what you need to track. Track your head? Track your eyes? Do you want to track your entire body? Do you want to track an entire room full of people? There are all kinds of things you might think you want to do, and you do not necessarily have to do them. For example, I am giving a lecture today, and I can only look at one of you at a time, right? If we were all in virtual reality doing this lecture, I could be looking at all of you at the same time: each one of you would see me looking directly at you. So you do not need to have my gaze tracked perfectly. I need to have my head tracked accurately, I suppose, to render the display correctly; but in terms of what you see, I do not need to track exactly where my eyes are looking. I could just make it so that I am always looking at you, so that you are always paying attention to me, and each of you individually gets the feeling that you are being looked at; but really, I cannot possibly be doing that.
So there are cases where you do not want to provide the exact information from the real world. You really have to think about these things carefully, and there is always a cost involved, along with accuracy issues and comfort issues. Again, if you are tracking the body and it looks like my arm keeps breaking and bending in a backward direction, that is not comfortable; it is torture of some kind, right?
So if your system is not working reliably, it is probably not worth doing, or you have to assess whether it is really necessary. If it is, then throw all the resources you have into it and make it work; but it might not be necessary, even though it might look cool. So that finishes the hardware part; I want to talk about the bird's-eye view of the software. Are there any questions about hardware? Again, this is a very high-level overview, and I am going to go into more details of these things. In fact, I am covering the fundamentals, so by giving these examples you will be able to see more of why I am jumping into various fundamentals.

Virtual Reality
Prof. Steve Lavalle
Department of Multidisciplinary
Indian Institute of Technology, Madras

Lecture - 2-2
Bird’s-eye view (software)
(Refer Slide Time: 00:17)

So, last time we had this alternate world generator; maybe I can just abbreviate it this time as AWG. We have an input side and an output side (let me get this out of the way here), and on the input side we get information from, let us say, a tracker.
Let us say maybe just a head tracker; I can make that go into this AWG. Maybe a game controller; it does not matter what in particular I am putting in here, I just want to give you a kind of high-level view of the system. A keyboard, maybe; I can put in a hand tracker, or whatever else.
(Refer Slide Time: 01:46)

So there is information going into this, and the alternate world generator has to use this information to make some kind of updates to the renderers. For output there could be, let us say, first a visual renderer, which eventually goes out to a visual display; we talked about different examples of that, and it does not really matter which sense.
So we can have another one that is an audio renderer, which goes to an audio display, and I want this to continue onward for other senses. If you want to have a haptic renderer, you can have a haptic display that gives you force feedback, if you like. So the job of the alternate world generator is to maintain this kind of fictitious or remote or recorded world, whatever kind of world it is. It has to maintain a lot of the geometry and physics of that world in such a way that it can output to these renderers, which then output to the displays, which then output to your sense organs, which then fool your brain. That is the idea. So we have to figure out what goes into this alternate world generator; that is the main part of the software challenge here.
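To sketch this architecture in code, here is a minimal Python skeleton of the input-AWG-renderer pipeline just described. It is my own illustration, not from the lecture; all class and method names (read, render, update_world) are assumptions, and a real system would run each renderer at its own rate.

    import time

    class AlternateWorldGenerator:
        def __init__(self, trackers, renderers):
            self.trackers = trackers    # input side: head tracker, controller, ...
            self.renderers = renderers  # output side: visual, audio, haptic, ...
            self.world = {}             # geometry and physics state of the alternate world

        def step(self, dt):
            # Input side: poll every tracker and input device.
            inputs = {name: t.read() for name, t in self.trackers.items()}
            # Maintain the alternate world's geometry and physics.
            self.update_world(inputs, dt)
            # Output side: each renderer drives its display, which
            # stimulates a sense organ, which fools the brain.
            for r in self.renderers:
                r.render(self.world, inputs)

        def update_world(self, inputs, dt):
            pass  # collision detection, physics, and world logic would go here

        def run(self, rate_hz=90):
            dt = 1.0 / rate_hz  # headsets commonly refresh at 90 Hz or more
            while True:
                start = time.time()
                self.step(dt)
                time.sleep(max(0.0, dt - (time.time() - start)))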
(Refer Slide Time: 03:36)

So here is another way to look at it, with a little bit of geometry. Imagine we are superimposing two worlds: the alternate world, and... I am not really sure what to call the world that you are actually in.
So, the alternate world and the local world; maybe I will say the local, real world, kind of emphasizing local, the non-alternate world. Think about superimposing them. Somehow there is an alternate world being generated by the alternate world generator; I want this to really be geometry here. So there is an alternate world, and in this world there are, you know, maybe pieces of furniture around, obstacles of some kind.
So there is stuff in this alternate world, and this kind of story is being maintained that fools your brain into believing you have been placed into this world. Somewhere you appear in this world; maybe you have your own kind of region where you are able to move around. Your head has been transformed into this world and embedded into this alternate world.
So your head is here, and your eyes are looking out in some direction, kind of like virtual cameras. As you move your head around, there should be a kind of perfect correspondence. You are moving around inside of here, and this yellow zone is where there is a kind of superposition or overlay between the local real world and this alternate world, the kind of virtual world you may be constructing; it could be virtual, it could be real, but it is the alternate world. So there should be a kind of overlay here between the two, and then interesting things happen: for example, you are moving your head, and all of a sudden maybe this is one of the obstacles here (this would be a top-down view).
So all of a sudden there is an obstacle in the world, some big piece of furniture or a wall, and you decide to move your head into it. What should happen? If you have haptic feedback, it could smack you on the head, right? If you do not, what is supposed to happen? It is a difficult problem, so you have to figure these things out. Or maybe you just want to grab the controller and move your character, your virtual self, around. If you do that, what happens when you come to these obstacles? You need collision detection, something that will tell you that you cannot go that way, or maybe you want to move right through it. That is the physics of this alternate world: you have to decide what the physics of this world is going to be, based on your task and based on things like comfort.
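As a tiny illustration of the collision detection just mentioned, here is an axis-aligned bounding box overlap test in Python. This is my own sketch with made-up coordinates, not the lecture's method; real engines use much richer geometry.

    def aabb_overlap(box_a, box_b):
        # Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z)).
        (a_min, a_max), (b_min, b_max) = box_a, box_b
        return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

    # Test the user's head region against a wall in the alternate world.
    head = ((0.0, 0.0, 1.5), (0.2, 0.2, 1.8))   # hypothetical head volume, metres
    wall = ((0.1, -1.0, 0.0), (0.3, 1.0, 2.5))  # hypothetical wall volume
    if aabb_overlap(head, wall):
        print("head would enter the wall; the world's physics must decide what happens")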
It might not be very comfortable to start passing through walls all of a sudden, or it may not be comfortable to have your virtual self abruptly stop in the virtual world just because there is a wall. This is a difficult problem, and I hope you will understand why as we go along. In fact, I started talking about motion. Imagine this yellow rectangle is your zone in the virtual reality system: maybe you are sitting down in a chair, you have the headset on, and you can move around like this.
If you are on a rotating chair, you could rotate around. So you are moving around inside the virtual world, but in the physical world you are also moving around, within this yellow zone, and maybe you cannot leave the yellow zone in the physical world. If you can, then you have a larger superposition problem: maybe you are in an enormous region in the physical world, and you can walk around in there with a headset on while being tracked. That is fine, but you again have to deal with this whole problem of what happens if you can really walk around.
If this yellow region is larger, then you could interact with all kinds of obstacles, and it ends up being an interesting challenge: maybe a better experience overall, but with more challenges. So I want to talk about different kinds of self motion, and then we will get to the sensation and perception part, which is the final part of the bird's-eye view. Self motion, also called ego motion, is a very important concept in virtual reality.
(Refer Slide Time: 08:01)

So, there are two kinds of self motion. In the first, the user moves, and the motion is matched in the alternate world.
So if I turn my head in the real world, then in this alternate world I am also turning my virtual head, and the two should match perfectly.
(Refer Slide Time: 08:30)

If you do that with zero latency, so there are no delays, everything is very accurate, the optics are done correctly, and the audio is done correctly; if all of these things are done perfectly, it should be very comfortable and very convincing to your brain. The other kind of motion is through an interface; an example would be a game controller, with which the user locomotes, or moves, its virtual self.
So in this case you are not moving in the physical world; you are just grabbing a controller and maybe moving a joystick. A little bit of motion, but then you are moving your virtual body through space, perhaps very quickly. You can see this case is quite different: you are doing it all artificially, so there is not a perfect correspondence. In the picture, this would correspond to moving the rectangle around using a controller, while you have convinced your brain
that you are actually there. That means you are convincing your brain that you are actually accelerating and moving along at different velocities, and that is where the discomfort comes from; that is where simulator sickness starts to get really bad. So, to distinguish between the two: if you do faithful reproduction of the motions in the local real world and match that perfectly to the alternate world, you get case number one, which is generally considered very comfortable; if you do the other case, where you grab a controller and move, it can be quite uncomfortable. There is a lot of interesting research on how to make it comfortable, because you would like to be able to sit in a chair and travel over long distances in virtual reality rather than having to walk there.
So there is a lot of motivation, but it does cause sickness in many cases if you do not do it well, and it is very difficult, almost impossible in many cases, to make it completely comfortable; that is one of the things people are struggling with. To summarise: case number one is where the head is moving inside of this box, and case number two is where you are moving the entire box inside of this alternate world.
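To make the two cases concrete, here is a small sketch of how a final view transform might compose the tracked head pose (case one) with a controller-driven transform that moves the whole zone (case two). This is my own illustration; the 4x4 homogeneous-matrix convention and all the numbers are assumptions.

    import numpy as np

    def translation(x, y, z):
        # 4x4 homogeneous translation matrix
        T = np.eye(4)
        T[:3, 3] = [x, y, z]
        return T

    # Case one: head pose inside the tracked zone, reported by the head tracker.
    head_in_zone = translation(0.1, 0.0, 1.7)    # e.g. head 1.7 m above the floor

    # Case two: where controller input has carried the whole zone (the box)
    # within the alternate world; updated from the joystick each frame.
    zone_in_world = translation(25.0, 0.0, 0.0)  # virtual self moved 25 m away

    # The virtual head pose composes the two motions; renderers use its inverse
    # as the view matrix that maps world coordinates to eye coordinates.
    head_in_world = zone_in_world @ head_in_zone
    view = np.linalg.inv(head_in_world)
    print(head_in_world[:3, 3])                  # -> [25.1  0.   1.7]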
Questions about that? Some examples of alternate world generators: game engines are the number one example that people are using. We are going to use one game engine in the lab, Unity 3D; Unreal Engine is another one. So game engines are great, but there are other possibilities: you can make your own game engine, or write your own simulator; you can make your own alternate world generator as just a kind of simple simulation environment.
You may want to hack it up in Python, for example; you do not have to build something as complete as a game engine. Also keep in mind that if you choose to use a game engine for virtual reality, you are abusing the game engine: game engines were designed to make games on screens. There are virtual reality integrations of them, but fundamentally it took many years of trial and error to develop game engines, and now they are being adapted to VR; they were not optimized for VR. On a screen you can do this number two kind of motion very easily and nobody complains too much, but in virtual reality people complain a lot.
So it was not designed for this; the entire experience needs to be redesigned. What we would like to have is not a game engine but a VR engine, a virtual reality engine. I do not think anybody knows what that is going to look like yet, but that is what is really needed. If we had a virtual reality engine that supported the kinds of things we want to do over and over again in virtual reality experiences, we would be in very good shape. For the time being, we are just leveraging the existence of game engines to make the software easy.
So the game engine plays the role of this AWG, the alternate world generator, that I talked about. If you have a robot with a camera and you want to do telepresence, then you do not have exactly this situation, and you do not need a game engine; what you need is some way to maintain a panoramic representation while you are out moving around. That is different; it is not going to be a game engine, and you probably want to hack something special for it.
So there are a lot of different possibilities, and you should really think about when you need to start from scratch and when it is best to use a game engine. Since this is a two-week course, we will just use a game engine and make it easy; but if you want to use something much lower level for your project and optimize it all yourself, then I would say go for it and get the experience.