Speaker 1: Hello visitor.
 
Today we will be delving into the Informatorium 56 files to compile an episode on extended reality.
 
If you don’t know what that is, don’t worry, we will of course explain it.
 
But for now, just imagine a world where you can see anything on the internet, but instead of being on a monitor, it appears right in your field of vision at any time.
 
Instead of counting sheep to go to sleep, you could see tiny adorable digital sheep jumping a fence in front of you in your normal field of vision in your actual room.
 
Or you’re getting dressed for work in the morning and wonder what’s the weather like? Just ask and a display pops up in your vision with today’s weather while you can still see yourself pulling up a pant leg over your right leg.
 
Maybe you are walking through a new city on vacation and want a cup of coffee.
 
You just say, “Where’s the nearest coffee shop? Show me directions.” Suddenly there are arrows on the sidewalk in front of you pointing to your destination to get that cup of coffee.
 
These are all things augmented reality glasses can provide.
 
Could we even be so bold as to ask for contact lenses that do all this?
 
Or how about having the ability to step into a completely immersive digital environment, like stepping into a video game or even just a recreation of your favorite museum and having a walk around? Both things that virtual reality software and a head-mounted display can provide.
 
Or finally, what about playing a nice game of ping pong on a digital ping pong table that appears right in your living room when you put on mixed reality glasses that allow you to still see everything else that’s going on in your living room as though the table was in the real world?
 
Sounds like the future, doesn’t it? But is it? Today we are going to start a series on extended reality and learn about its three component parts: augmented, virtual, and mixed reality.
 
In the first episode, we will explore the language and learn the basics of the technology.
 
In the next episode, we will go on and cover a timeline of the history of the technology and inventions relevant to extended reality leading up to just before the present era.
 
And in a third episode, we will dive into a look at some of the huge players in extended reality in recent history like Facebook, Microsoft, and Google.
 
Then of course we will wrap it all up and look at the current status of extended reality and its impact on modern culture.
 
There’s going to be a lot of interesting stuff, so grab a comfy chair and enjoy the show.
 
Welcome and thank you for visiting the Informatorium 56 podcast studio.
 
This location is dedicated to general education and information and features this podcast.
 
I am Greg Bell and my partner Julia Korony is here with me.
 
How are you doing today, Julia?
 
Speaker 2: I’m doing good.
 
How are you?
 
Speaker 1: I’m great.
 
Julia, what topic do you want to talk about today?
 
Speaker 2: Um… hmm…
 
Speaker 1: I can do anything, just pick a topic.
 
Whatever you want, just pick a topic.
 
Speaker 2: How about computers?
 
Speaker 1: Um… no.
 
You know what, I’m actually not ready for that topic.
 
Let’s start again.
 
Julia, what topic do you want to talk about today? I can do anything.
 
Speaker 2: This is… you’re putting me on the spot and you know…
 
Speaker 1: Just pick something, whatever you think of, that’s fine.
 
Speaker 2: Nail polish.
 
Speaker 1: Okay… that’s probably not going to work either.
 
You know, this time, this time just say, just say extended.
 
Speaker 2: Okay.
 
Speaker 1: Julia, what topic do you want to talk about today?
 
Speaker 2: You know what I’m really interested in right now? Um, extended reality.
 
Speaker 1: Wow, that is awesome.
 
You know what, I have spent like the last I think about four months reading about that, so that is really… yeah, no, that’s a convenient… that’s a good one.
 
Um, have you ever heard of extended reality before? The phrase, I mean.
 
Speaker 2: No.
 
Speaker 1: No? Yeah, I gotta be honest, like I didn’t really either.
 
Do you… what do you think it is? Do you have any idea? Like what would you… what would you make up as a meaning for extended reality?
 
Speaker 2: Well, what comes to mind is, um, virtual reality.
 
Speaker 1: Yeah, exactly.
 
And it’s kind of in the ballpark, basically.
 
To me, I like to keep up with like the new technology and stuff like that because you start to get older and it starts to go away.
 
You know, you start to forget how things work or not learn them in the first place.
 
And you know, you hit a certain age and you’re going to be more like a tourist, you know, than a resident.
 
You start to just want to try to keep up with what’s happening, but it takes… it takes more effort, right? Like you just… you’re just not ingrained in these things.
 
You don’t grow up with them.
 
I actually had a professor who actually in grad school would always say, you know, you… you could just lose track of everything and you should read the newspaper every day to keep up with what’s going on in the world.
 
And like that’s how long ago it was I went to grad school, he was talking about reading a newspaper.
 
Speaker 2: What do you find in a news… what is a newspaper?
 
Speaker 1: I used to go to Starbucks and like read the New York Times all the time.
 
Obviously, you can’t do that anymore, but the idea… the idea is still there, right? And like I interpret it as just, you know, living the experiences of the day.
 
And you know, even though I got out of that field a long time ago, I always thought it was an interesting perspective and a good way to live your life, right?
 
So as a result, you know, every time like a PlayStation comes out, I get one just to see what’s going on, like is there some new awesome development? And you know, as you know, sometimes I get sucked in a little bit more than others.
 
Or, you know, I get to build a computer, see what’s going on with the hardware and stuff like that.
 
The hardware side’s a little easier for me.
 
The games…
 
I’m not super into social media, so it, you know, I’m way behind.
 
Yeah, it’s just too much for my generation or at least me personally, my way of thinking to really get absorbed into that.
 
Speaker 2: I just don’t think it’s good for your mental health.
 
Speaker 1: Yeah, I dip my toe in and then I’m just like, I can’t, this is… this is annoying and something happens and I…
 
I bow out for a long time.
 
So, but I thought in keeping with that, it would be interesting to try to get caught up at least on something going on in technology.
 
And in this case, that’s going to be extended reality.
 
To try to learn what we need to do to get everyone plugged into the Matrix.
 
And to do that, we’re going to learn about virtual, augmented, and mixed reality.
 
Honestly, the Matrix is really more of a virtual reality and we’re going to cover all three, but the Matrix always sounds cool and it’s a good callback for people in my age group.
 
So I’m going to use it as a signal to mean a super effective extended reality, whether it be augmented, mixed, or virtual.
 
So let’s hit the rundown and see what we’re going to cover on today’s show.
 
First, definitions.
 
What is virtual… virtual reality? Really hard to say, first of all.
 
What is augmented reality?
 
Speaker 2: Tongue twister.
 
Speaker 1: Yeah, and what is mixed reality? And how do they go together in this landscape that, as I said, I recently learned is called extended reality.
 
You know, then we’re going to take a brief look at the hardware that’s required to get yourself into these extended realities, like the head-mounted display, which is, you know, the goggles you think of when you see somebody doing virtual reality, the controllers, the motion capture, all that kind of stuff.
 
Just, you know, a little bare-bones thing on, you know, what it takes to do that.
 
Then we’re going to take stock at the end.
 
And then just as a bit of a preview, like I said, in the next episode, we’ll do a timeline of all the cool technology and things in history, even some literature that led up to the world of extended reality, pretty close to our modern era.
 
And then finally, we’ll look at some of the big companies and the things that they’ve done to try to get us all into the Matrix or, you know, into at least an augmented reality-friendly world.
 
And then of course at the end, we’ll wrap it all up by seeing what impact these things all have on today’s modern culture.
 
So let’s jump in and start out with that terminology and get started on the extended reality landscape.
 
So, the types of extended reality.
 
There’s actually a design school that has an article on extended realities and it defines extended reality like this: Extended Reality (XR), as they call it, drop the ‘E’, go for the ‘X’, it’s just cooler, you know, that’s the way… that’s the way we do things.
 
But it’s an umbrella term that encompasses any sort of technology that alters reality by adding digital elements to the physical or real-world environments by any extent, blurring the line between the physical and the digital world.
 
So this is kind of broad for them, okay? That’s really kind of like encompassing anything that just has like an effect on your vision digitally, right?
 
Speaker 2: Right.
 
Speaker 1: And before we get too lost in the weeds, let me just say, we’re just talking about digital stuff in your vision, okay? So you can play with it, you can get information from it, you know, like Tony Stark in Iron Man or Tom Cruise in Minority Report, right? You know, you can see these different things in your vision.
 
It’s part of a digital reality that you can interact with or at least see and get information from.
 
Speaker 2: Okay.
 
So it’s not like a Photoshop picture that was altered with a filter?
 
Speaker 1: No, we’re talking about live, yeah.
 
We’re not talking about like, you know, like well, I guess with the movie that’s how they did it.
 
But we’re talking about the stuff that actually makes this happen for you, okay? So like your PlayStation VR headset, your Meta Quest, like basically these types of things.
 
Like how can we go into these other worlds of digital reality, basically, and to what extent can we make them a part of our lives?
 
Okay, so we got our three categories.
 
Extended reality.
 
We’re going to start out with virtual reality.
 
I believe this is the most common thing that people are aware of, virtual reality, right?
 
Speaker 2: Right.
 
Speaker 1: So right off the bat, let’s point out there’s a lot of overlap, right? Some people stick with two: virtual and augmented.
 
We’re going to do all three.
 
And in 1994, Paul Milgram and Fumio Kishino actually created what’s called the virtuality continuum, which makes it even simpler.
 
Basically, they just made this continuum of reality to try to explain why all these different technologies fit in that continuum.
 
And I kind of like it because instead of pigeonholing things into different categories, because what happens is one company makes this thing, another company makes this thing, and they all kind of do different things, you know what I mean? So it’s kind of hard to just put them into these hard categories.
 
So this is kind of a neat thing where they put it into… it’s basically just the spectrum from completely virtual reality all the way over to just regular reality where you’re just walking around with nothing enhanced, you know, just seeing the world as it is.
 
Speaker 2: Okay.
 
Speaker 1: So they kind of came up with that idea, which is kind of how we’re going to treat it moving forward.
 
But to get to that point, you kind of have to understand the differences.
 
So, but to me, no matter how you look at it, there’s three key elements that you can use to categorize virtual, augmented, and mixed realities.
 
And I realize in abstraction this doesn’t make a ton of sense, but we will get there, so just hang on.
 
So, but one is how much does your vision get covered by the digital overlays? Okay, so that’s the first thing.
 
So just imagine you’re putting this pair of glasses on.
 
Like how much can you still see of the real world, how much is fake digital world? Okay?
 
Speaker 2: Mhm.
 
Speaker 1: Second, can you interact with those digital overlays? So you’re putting these glasses on, now there’s digital stuff in front of your face.
 
Let’s say, I don’t know, a little digital dog.
 
Speaker 2: Okay.
 
Speaker 1: Can you pet the dog or is he just standing there? So that’s part two.
 
And then part three, can you still interact with reality? So you got the digital dog there, can I still see past the digital dog and open the door in my living room?
 
Speaker 2: Can I see my actual dog?
 
Speaker 1: Yeah, so that’s the three things, okay? So how much is covered, can you interact with the real stuff, can you interact with the fake stuff? And I kind of made this up but it kind of really explains the whole thing.
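Those three questions can be sketched as a tiny decision function (a toy model made up for the show’s framework; the function name and its boolean inputs are illustrative, not any real SDK’s API):

```python
def classify_xr(full_overlay: bool, interact_digital: bool, see_reality: bool) -> str:
    """Rough bucket for an XR device from the three questions:
    how much of your vision is covered, can you touch the digital stuff,
    and can you still deal with the real world."""
    if full_overlay and not see_reality:
        return "virtual reality"    # vision fully replaced, no reality left
    if interact_digital:
        return "mixed reality"      # partial overlay you can manipulate
    return "augmented reality"      # partial overlay, look but don't touch
```

So a pair of glasses showing static arrows you can see past would come out as `classify_xr(False, False, True)`, augmented reality.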
 
So with that in mind, let’s dive into the world and let’s start with augmented reality.
 
So augmented reality, what is it? It’s a digital overlay in your vision that covers part of your viewing area.
 
So you can see digital stuff, you can still see the real world.
 
You cannot interact with the digital overlay, okay?
 
Speaker 2: Okay.
 
Speaker 1: But you can interact with reality.
 
This is your pair of glasses that you put on and say, “Hey, glasses thingy, tell me where McDonald’s is.” Okay?
 
Speaker 2: Right.
 
Speaker 1: Why are we going to McDonald’s? I really don’t know.
 
I was honestly, I was going to make this Starbucks.
 
And the other day I went to Starbucks and I was like, this is the worst.
 
Like Starbucks is just the worst.
 
I don’t think any company’s fallen from grace harder than… yeah, but you know what McDonald’s is.
 
I don’t go there either, but it’s just something people are thinking of that a lot of people go to, right? And like…
 
Speaker 2: Target.
 
Let’s go to Target.
 
Speaker 1: No, I feel offended by Starbucks.
 
I think that’s the problem, that’s why I had to… you know what I mean? Because I used to go there a lot.
 
Speaker 2: It’s so expensive.
 
Speaker 1: You’re paying $8 for a cup of coffee that’s made by a machine.
 
Like what am I paying for? I feel like I used to be paying for like the person to stand there and make the coffee and, you know, it’s like, oh thank you, taking your time to do that.
 
Now they’re just like “errr”, $9.
 
Like what are we doing? So anyway, you put your glasses on, take me to McDonald’s.
 
And because they’re AR glasses, directions pop up in your vision.
 
Right?
 
Speaker 2: Right.
 
Speaker 1: Now you can’t touch the directions and you can’t move them, okay? But you can still see where you’re going obviously because you have to walk.
 
So this is that type of reality we’re talking about.
 
Trying to think of like what movie this would be, like I don’t know, maybe that is Minority Report where he just puts them on, he’s like “Hey, you know, where am I going?” this type of thing.
 
You know, you put the glasses on, you see through them, they’re just like regular clear glasses like any other glasses you’re wearing, except these digital things pop up in your vision that look like there’s like directions on the road saying turn here.
 
Speaker 2: Okay.
 
Speaker 1: Now again, you can’t pick the arrows up and move them, you can’t interact with them, you can’t build your motor like Tony Stark does in the… you know what I mean? That’s not what we’re talking about.
 
This is just augmented reality.
 
So it’s augmented, it’s not totally taken over, it’s just augmented.
 
I mean you could put the glasses on, get something to eat and get some food before you got to McDonald’s.
 
Speaker 2: So then why go to McDonald’s?
 
Speaker 1: Well you have to pee somewhere, you know what I mean? You go to McDonald’s because they have bathrooms.
 
That’s the only reason I go to McDonald’s.
 
Speaker 2: You’re going for the ice cream.
 
Speaker 1: That’s the only reason I go to McDonald’s.
 
But part of your screen is overlaid, you can’t interact with that part, but reality you can.
 
Got it?
 
Speaker 2: AR.
 
Speaker 1: That’s AR.
 
Then we have mixed reality.
 
So this is digital overlay in your vision that covers part of your viewing area.
 
Sounds the same, right? Hope we don’t end up at McDonald’s again.
 
Second, you can interact with the digital overlay.
 
So this is different.
 
You have that digital overlay, but you can interact with it.
 
Speaker 2: Okay.
 
Speaker 1: So it’s not just that static arrow on the ground, you could pick that arrow up and throw it if you wanted to, theoretically.
 
And then the third factor would be reality.
 
You can still interact with reality because you can still see it.
 
These glasses are not covering your whole vision up.
 
This is not, you know, the Matrix where everything’s blocked out and you can’t see what’s going on.
 
So you have these digital elements in your vision with the real elements and then you can interact with both of them.
 
So that’s the difference here.
 
So the example I got, it was on this website called Interaction Design and they basically use the puppy example.
 
Because they were like imagine you have this digital puppy, instead of him just sitting there like augmented reality, in mixed reality you can actually pick the puppy up and you could actually set that puppy on a desk in your room.
 
Because he’s going to interact with that because you’re seeing reality interact with the digital world.
 
Speaker 2: Okay.
 
Speaker 1: So this is why this is mixed.
 
You’re interacting with both the fake world and the real world and your fake stuff can interact with the real stuff.
 
So like basically whatever system you’re using here is mapping out what’s going on in the world, so you just like pick up this puppy dog, you play with him, you’re like “Here, sit on my desk” and he’ll actually float there in space and look like he’s on top of your desk.
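A toy model of that room mapping might look like this (deliberately simplified and hypothetical; real headsets detect full 3D planes from their cameras, not just a list of heights):

```python
def place_on_surface(obj_pos, surface_heights):
    """Snap a virtual object (like the digital puppy) onto the nearest detected
    horizontal surface below it. `surface_heights` stands in for the headset's
    room scan: the floor, a desk top, and so on."""
    x, y, z = obj_pos
    below = [h for h in surface_heights if h <= y]
    if not below:
        return obj_pos             # nothing underneath; leave it where it is
    return (x, max(below), z)      # rest it on the highest surface under it
```

Set the puppy down at chest height over a 0.75 m desk and it lands on the desk instead of floating: `place_on_surface((1.0, 1.5, 2.0), [0.0, 0.75])` gives `(1.0, 0.75, 2.0)`.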
 
Speaker 2: So mixed reality is what I thought extended reality would be.
 
Speaker 1: Right.
 
So it’s pretty intense, right? So this is pretty awesome stuff and there’s a couple companies that make stuff like this.
 
This is not the most common thing in the world.
 
But in this case, you know what I mean, unlike augmented reality where it’s just the puppy is just a pointer and leads you to McDonald’s, this puppy can actually walk around, you pick him up and play with him.
 
Speaker 2: Right.
 
Speaker 1: Okay.
 
So now that’s your digital overlay, the cute puppy.
 
Can I interact with them? Yes.
 
So we’ve got mixed reality.
 
Still interact with the real world? Mixed reality.
 
You can still see the real world.
 
So that’s the difference.
 
Now that brings us to virtual reality.
 
So looking at our three parts again, we have our digital overlay.
 
In virtual reality, the digital overlay is all of your vision.
 
Speaker 2: Okay, yeah.
 
Speaker 1: Right? So can you interact with the virtual world? Of course, that’s all you’ve got.
 
That’s the whole point of doing this, otherwise you’d just be looking at a TV show.
 
Can you interact with reality? No, there is no reality left.
 
Okay, you can’t see it anymore.
 
Obviously, if you move your body it’s going to be…
 
Speaker 2: Right, if you run into the chair…
 
Speaker 1: Well no, that’s not what we’re talking about in your vision though.
 
That’s… you’re not going to see that.
 
You’re just going to hurt your foot.
 
Okay, so you’re not going to see that.
 
Speaker 2: It’s still interaction with the environment, you’re just not seeing… you’re not seeing the actual interaction.
 
Speaker 1: No, but I’m saying we’re talking about in your vision.
 
Gotcha.
 
So this is easiest to give an example of because this is what everyone thinks of, right, when they think of these realities.
 
You know, you put on VR goggles and suddenly you’re in a spaceship flying around, you know, sent back in time with dinosaurs, whatever it is.
 
I mean this is like the closest thing to the Matrix we got, right?
 
Speaker 2: Right.
 
Speaker 1: So…
 
Speaker 1: I’ve looked at all these definitions, like I said, and I came up with this one because I think it accounts for all the aspects in a comparative way.
 
And you know, you’re sitting there thinking like, should this guy with this podcast that nobody listens to be coming up with like an entire new set of definitions for these things? I mean, it’s probably…
 
Speaker 2: I mean, you can.
 
Speaker 1: Yeah, it’s probably not right, no.
 
But here’s the thing, of all the mistakes I’m going to make on the show, and like all the things I’m going to get wrong because I don’t really understand a lot of stuff and I have to go back and figure it out and read it like twelve times, this is the one you can take to the bank.
 
Like trust me, this will make complete sense in like four years.
 
Somebody’s going to listen to this and they’re going to be teaching this at MIT.
 
Speaker 2: Okay.
 
Speaker 1: So basically, how much of my vision is digital, how much is real, what can I interact with? Thank me later.
 
Let’s move on.
 
So let’s move on to the next part of the show, which is going to be the hardware.
 
This is going to be super basic, okay? Again, this is what this show is for.
 
This is not for techies to go learn about the difference between this and that.
 
This is for normal people like me to try to, you know, just hear about how these things work.
 
So now that we have the basic idea, we’re going to look briefly at what you need to participate in one of these extended realities.
 
And like I said, they’re a little bit different, but they’re not that much different.
 
It’s kind of the same stuff across all of them, right?
Speaker 2: Right.
 
You need something over your eyes.
 
Speaker 1: For sure, yeah.
 
So first of all, you need some sort of display.
 
It can come in a lot of different forms.
 
But you need some kind of display because no matter what extended reality we’re talking about, it’s being made with some kind of digital image, right? So that’s like the real part of the tech here.
 
It’s like how do we get this image going?
 
And starting with the VR, I mean, this is going to be a headset that blocks out your vision and replaces it with your new immersive reality.
 
I mean, again, that’s probably the most common thing, so it’s kind of, you know, easy to understand.
 
But you know, those weird goggles you put on your head, okay, that’s a head-mounted display.
 
That’s VR.
 
Augmented reality can be a lot of things because like a hodgepodge of things are considered AR.
 
You can have your display be a pair of glasses, you know what I mean? Like if you ever did see the Google Glass, it’s just a pair of clear glasses you put on your head, gives you that digital overlay with some information.
 
Or it can actually be your cell phone.
 
Probably never thought of this.
 
You know when you go onto Amazon to buy something and they’re like, “Hey, do you want to see this in your room?”
Speaker 2: Right.
 
That’s augmented reality.
 
Speaker 1: Okay, it’s a digital overlay, right? You can’t interact with it directly.
 
You can’t move their fake couch that they’re showing in your room.
 
Speaker 2: I did that with my… when I got my glasses online, I uploaded my picture and then I tried on all the different pairs that I wanted to look at.
 
Speaker 1: Right, so this is the thing.
 
It’s overlaying it on your face, letting you see it.
 
So there’s different ways to make augmented reality happen.
 
Now, mixed reality, a few different options here.
 
One famous one is the HoloLens by Microsoft, which we’re going to talk about in the next episode.
 
But just briefly, it’s a pair of transparent glasses with special lenses that display 3D holograms in your vision that you can actually interact with.
 
So it’s pretty incredible.
 
Like you would see these 3D things, build them in your vision, you could set them on a table in your room, you could do all these things.
 
Speaker 2: Right.
 
Speaker 1: Now Meta also made something like this too.
 
They made the Quest.
 
They did it a little bit different because whereas the HoloLens was actually clear glasses so you could see through to the rest of the world, Meta actually made theirs with a camera that records reality and then feeds it back into your vision.
 
So it’s like recording reality and then mixing it back together with the digital reality.
 
So everything you’re seeing is technically a video.
 
Speaker 2: Oh, it’s like it’s edited.
 
Speaker 1: Yeah, it’s not actually… it’s like real-time, you know, photo editing is a good way to put it.
 
Right, right.
 
So it’s just splicing it together.
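A per-pixel sketch of that “splicing” idea (simple alpha blending as an illustration; the actual Quest passthrough pipeline is far more involved than this):

```python
def composite(camera_px, overlay_px):
    """Blend one digital overlay pixel (r, g, b, alpha) over the matching
    camera-feed pixel (r, g, b). Alpha 1.0 means fully digital; 0.0 means
    the camera's view of the real room shows through untouched."""
    dr, dg, db, a = overlay_px
    cr, cg, cb = camera_px
    return (dr * a + cr * (1 - a),
            dg * a + cg * (1 - a),
            db * a + cb * (1 - a))
```

Where a digital object is drawn, alpha is 1 and you see the overlay; everywhere else alpha is 0 and you just see the video of your room.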
 
So those would be the mixed reality.
 
You can do a lot of things with that.
 
That’s the type of thing that almost sounds like the most important thing you can do, even though I don’t think it ever really works, which we’ll talk about.
 
But that’s the type of thing where you could have like doctors learning to do like technical skills with real and fake information at the same time so they’re not like having to worry about hurting people and stuff like that.
 
Like you can have like data interacting to what they’re doing so you can kind of teach them in a real-time basis, you know, where they’re seeing certain things that are fake and certain things that are real.
 
And then when they interact with the fake things, you get like an actual interaction with it.
 
So that’s one of the things like that Meta has set up.
 
They have training like that.
 
There’s a lot of things like that where it gets kind of intense, which kind of seems like the most important thing I said you could do with these things, but I don’t think anybody ever really has gotten too far off the ground with it.
 
But we’ll talk about some of that more later.
 
Now, another thing you can do with Meta’s mixed reality on their Quest headset, a little more practical or at least a little more common, is you can play ping pong.
 
So you put your headset on, whatever room you’re in, suddenly there’s a ping pong table in front of you.
 
You can still see your room because it has that camera set up that’s feeding the view of your room back into your goggles.
 
But now you have a ping pong table.
 
And you can interact with the digital ping pong table and you can still see your room, so you can interact with that.
 
So another little bit more practical version of something you can do with mixed reality.
 
Now you’re probably asking, “Wait a minute, I got some goggles on my head and suddenly, you know, what, I have like a ping pong table in my room?” Yeah, I mean, that’s pretty awesome.
 
But like how does that happen? Why do I want this? You know what I mean? How does this overlay just know what I’m doing? You know, is there sound involved with this? I’m sure you probably didn’t worry about the sound, but that’s… we’re going to talk about the sound, that’s what I do.
 
Speaker 2: Well, now I am.
 
Speaker 1: Yeah, no, it’s… but yeah, so like one of the things you do, you say you just throw a ping pong table up and now you’re playing ping pong, right? That’s one of the games they have on Meta.
 
You’re just playing ping pong back and forth.
 
But that’s not just going to happen by itself.
 
You know, you have to be able to control the ping pong paddle, right? You have to be able to do these things in the real world.
 
So that’s where you get into the hardware and things like that.
 
So let’s take a look at some of these concepts that make this happen.
 
And I say concepts because, you know, we’re going to cover as usual not the circuitry here, just the basic groundwork.
 
So let’s start with how did they get the 3D image, which is the foundation of all these digital overlays that look like they’re three-dimensional, right? Like that’s one of the things you don’t think about.
 
It does have to look three-dimensional.
 
If this stuff was all flat just for everything, you wouldn’t even be interested, right? Like even just having… if it’s just directions and you want to have an arrow on the ground, it’s still got to be three-dimensional or it’s not going to be laying on the ground, it’s going to be right in front of your face.
 
And the way they do that is something called stereoscopy.
 
There’s actually another way to do it, but this is the number one way they do it.
 
So what is stereoscopy? Well, it’s something that I wish I would have spent more time talking about in the Eastman podcast because quite frankly this goes all the way back to then and it would have been an awesome callback, but I don’t really think I spent that much time on it.
 
But anyway, stereoscopy is a thing that has been around for a while.
 
Basically it’s a way to manipulate your vision to see a 3D world that isn’t there.
 
So let’s have Britannica explain it for us.
 
Stereoscopy is a science and technology dealing with two-dimensional drawings or photographs that when viewed by both eyes appear to exist in a three-dimensional space.
 
In popular terms, it’s 3D.
 
When you say 3D, that’s what you mean.
 
Stereoscopic pictures are produced in pairs, the members of a pair showing the same scene or object from slightly different angles that correspond to the angles of vision of the two eyes of a person looking at the object.
 
Makes sense?
Speaker 2: Yes.
 
Speaker 1: So it’s only possible because we have binocular vision, which requires that the left eye and the right eye view of an object be perceived from different angles because that’s where our eyes are.
 
And then the brain puts this together and creates depth.
 
That’s how our vision works.
 
And stereoscopy basically just takes advantage of this, which you can do with just pictures, putting them in your vision and making you see these things as though they’re real in the same way, giving them that same depth.
 
Now, it’s possible to do this thing called monoscopic VR, but apparently nobody does that, so just pointing out like that is a thing, but…
 
So basically it’s just manipulating our natural vision, which is based on our two eyeballs looking at things at slightly different angles and turning them into a three-dimensional vision.
 
So a VR headset is basically just two screens, one in front of each eyeball, showing you pictures at slightly different angles, so then you turn these digital things into three-dimensional things in front of your three-dimensional world.
 
Makes sense?
Speaker 2: Yes.
 
Speaker 1: Okay.
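That two-screens-at-two-angles idea can be sketched in a few lines (the numbers and names here are illustrative; ~63 mm is a typical interpupillary distance, not a spec from any particular headset):

```python
import math

def eye_positions(head_pos, right_vec, ipd=0.063):
    """Each eye's virtual camera sits half the interpupillary distance from
    the head center along the head's 'right' axis; rendering the scene once
    per camera gives the two slightly different angles stereoscopy needs."""
    hx, hy, hz = head_pos
    rx, ry, rz = right_vec
    n = math.sqrt(rx * rx + ry * ry + rz * rz)
    rx, ry, rz = rx / n, ry / n, rz / n          # normalize the right axis
    half = ipd / 2
    left = (hx - rx * half, hy - ry * half, hz - rz * half)
    right = (hx + rx * half, hy + ry * half, hz + rz * half)
    return left, right
```

The two returned positions are exactly one interpupillary distance apart, so the per-eye images disagree just enough for the brain to build depth out of them.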
 
So you know what else helps to immerse you into a digital overlay? Sound.
 
We’re back to the sound.
 
So spatial audio.
 
I threw this in here because there’s this website called Pico VR, which listed it as an element.
 
And you know, you think about it, it is actually kind of important.
 
Sound sounds kind of important.
 
It’s not necessary for, you know, what we’re talking about, but it does help the immersion, right? And it, you know, it comes up.
 
All the systems I’ve worked with all have their audio.
 
So I looked up how sound is immersive because it seems like it makes sense that it would just be like, “Yeah, of course sound tells you things.” But there’s actually a weird way that we do it.
 
So it turns out that because sound hits our ears at slightly different times with slightly different intensities, our brain processes these differences and determines the location in space, right?
Speaker 2: Okay.
 
Speaker 1: But actually, we rely on timing differences for low-frequency sounds.
 
So if it’s like a hum, like a low frequency, the time it takes to hit, your brain uses that to figure out what direction it’s coming from.
 
Whereas for high frequencies, it goes by level.

So based on how loud it is in each ear, your brain can tell the difference and figure out where it’s coming from.
 
So anyway, little side note, but I thought that was kind of crazy.
 
That’s how you actually learn…
 
Speaker 2: That’s kind of neat.
 
Speaker 1: Yeah, that’s how you, you know, where something’s coming from with your hearing.
 
Anyway, if you want an immersive experience, you’re going to want to account for this.
 
Primarily this is obviously going to be done with programming and software and stuff like that.
 
But I’m going to call scope on that one.
 
We’re not going to go out into that too much deeper.
 
Now, I haven’t used all of the XR, VR, and AR devices out there, but PlayStation has these little earbuds.
 
Basically it’s just like you have a pair of headphones, you know, you just put the earbuds in your ear while you’re working with the goggles.
 
Speaker 2: Yeah.
 
Speaker 1: And Quest just has a little speaker.
 
So that type of thing.
 
It doesn’t really matter, but they’re, you know, they’re writing the software so the sound sounds like it’s moving with whatever you’re looking at.
 
Next, that’s going to bring us to head tracking.
 
So you have position and head tracking, okay? So this is basically what’s going to sync you and your body up into your virtual world.
 
Or if you’re just talking augmented, it’s going to sync up your movement with where these arrows are supposed to be.
 
Say we’re going back to the map example where you have arrows on the ground: when you look around, tracking keeps them in the right spot.
 
So we’re going to start off with VR again.
 
There’s a website called Vive that sells VR equipment.
 
And they have a great explanation.
 
It says, “How does VR tracking work?” Well, it says, “By pinpointing exactly where the head-mounted display, controllers, and other accessories are located and how each is oriented, a VR headset can translate your body’s movement in the real world to corresponding movements in the virtual world.” So for example, if you were gazing down a river inside a rainforest in VR and then turned your head upwards, tracking is what would cue the display to shift the image from the water in front to the tops of the trees above.
 
So basically as you’re moving, it’s telling the video to change.
 
Does that make sense? So it’s like, “No, this guy was looking down, now he’s looking up.
 
We gotta show him…”
Speaker 2: Right.
 
You can’t still see the river if he’s looking up, he’s gotta see the sky.
 
Speaker 1: So that’s how you get immersed.
 
If you then raised up the controller in your hand, tracking would allow you to see your hand in VR reaching for the clouds.
 
So it’s the same thing, you know, you can put tags on your body or, you know, hold things that translate what you’re doing into that visual field.
 
There are many types of tracking in VR and different ways to achieve them.
 
Some tracking systems only read the movement and positions of your head, while others may track your hands, eyes, and even facial expressions.
 
These could make navigating VR more intuitive and enrich your overall experience.
 
So that’s basically their explanation of it.
 
So basically, you know, you have a headset and you have various other devices like controllers and gloves.
 
And they just communicate digitally to what you’re seeing, what you’re doing.
 
So whatever you’re doing gets translated to digital and then it gets represented there so you know what’s going on.
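As a rough sketch of that river-to-treetops idea, here is how a renderer might turn a tracked head orientation into a view direction. The axis convention (x right, y up, z forward) is an assumption for the example, not Vive’s or anyone else’s actual convention.

```python
import math

# Sketch: tracking turns head orientation into a view direction,
# which cues the renderer to show a different part of the scene.
# Yaw 0, pitch 0 looks straight ahead; positive pitch looks up.

def view_direction(yaw_deg, pitch_deg):
    """Unit gaze vector for a head pose (x=right, y=up, z=forward)."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (
        math.cos(pitch) * math.sin(yaw),  # x component
        math.sin(pitch),                  # y component
        math.cos(pitch) * math.cos(yaw),  # z component
    )

looking_down = view_direction(0, -45)  # toward the river below
looking_up = view_direction(0, 45)     # toward the treetops
```

Every frame, the headset reports a fresh pose, the renderer recomputes this direction, and the image shifts accordingly, which is what sells the immersion.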
 
And like I said, these are the same things that are used in augmented reality.
 
It’s a little bit different, but even an augmented device has to know what you’re doing.
 
So it’s going to have like the same types of things.
 
Like your phone, like I said, when you’re doing that thing with Amazon and the furniture app, your phone has an onboard accelerometer and gyroscope that let it know which way you’re turning it, what it’s pointed at, and how fast you’re moving it, and that’s the same kind of hardware that goes into the gloves or the head-mounted display, so the system knows how fast you’re moving and in what direction.
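The standard trick for combining those two sensors is a complementary filter: the gyroscope is fast but drifts over time, the accelerometer is noisy but drift-free, so you blend them. This is a minimal sketch of the idea; the 0.98/0.02 split is a typical illustrative choice, not any vendor’s actual tuning.

```python
# Sketch: complementary filter fusing a gyroscope with an
# accelerometer to estimate tilt, one common way phones and
# headsets track orientation.

def fuse_pitch(prev_pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend integrated gyro motion with the accelerometer's reading.

    prev_pitch:  last estimate (degrees)
    gyro_rate:   angular velocity from the gyro (degrees/second)
    accel_pitch: pitch implied by the gravity direction (degrees)
    dt:          time step (seconds)
    alpha:       trust in the gyro vs. the accelerometer
    """
    gyro_estimate = prev_pitch + gyro_rate * dt
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch

# If the device sits still tilted at 10 degrees, the estimate
# settles toward 10 even though it started at 0: the accelerometer
# slowly pulls the drifting gyro integral onto the true angle.
pitch = 0.0
for _ in range(500):
    pitch = fuse_pitch(pitch, gyro_rate=0.0, accel_pitch=10.0, dt=0.01)
```

Real devices do this in three dimensions with quaternions and fancier filters, but the blend-two-imperfect-sensors idea is the same.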
 
And that brings us to eye tracking, which some devices have.
 
This is kind of a little bit of a next-level experience.
 
It’s a little bit more complicated and hard to achieve.
 
It sounds like not everything has this.
 
It can provide things like more detail in the area of what you’re specifically looking at.
 
So like in a VR, in PlayStation’s system, they actually have eye tracking.
 
So it’s watching where your eyes are specifically looking, so then it makes that area look better.
 
Okay.
 
So it’s spending the processing power to make that little spot look better, because you’re not looking anywhere else, so you won’t actually see the rest in detail anyway.
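This technique is usually called foveated rendering, and the core of it is tiny: render at full resolution near the tracked gaze point and cheaper farther out. The cutoff angles and scale factors below are illustrative assumptions, not taken from PlayStation’s or anyone else’s actual system.

```python
# Sketch: foveated rendering spends detail where the eye is pointed.
# Pixels near the gaze point get full resolution; the periphery is
# rendered at a reduced scale to save GPU time.

def shading_scale(angle_from_gaze_deg):
    """Resolution multiplier by angular distance from the gaze point."""
    if angle_from_gaze_deg < 5:     # fovea: full detail
        return 1.0
    elif angle_from_gaze_deg < 20:  # mid-periphery: half resolution
        return 0.5
    else:                           # far periphery: quarter resolution
        return 0.25

# A pixel 2 degrees from where you're looking renders at full
# quality; one 40 degrees out in the periphery renders much cheaper.
fovea = shading_scale(2)
edge = shading_scale(40)
```

That is why eye tracking is worth the extra hardware: without it, the whole frame has to be rendered as though you might be looking anywhere.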
 
And like I said, it’s generally considered like a bigger deal, I think.
 
Like Meta actually, it looks like they didn’t do it with the Quest because of the expense it takes to make that happen.
 
They were like, “We’re going to focus on other things.” So that’s kind of cool.
 
And then we got controllers.
 
So the last thing we’re going to look on here are the controllers.
 
You know, we kind of hit on this a little bit before.
 
But like I said, you have additional devices tracking your movements and this increases the accuracy of syncing up what you’re doing with your digital world, right? So you can have hand controllers.
 
So you can put a glove on, or I should say a glove that gets tracked into your digital world, with little tags on it that tell the system where your hand is moving.
 
You can also just use a joystick to move yourself around in a digital world just like a regular video game.
 
Speaker 2: So you don’t need to have a huge room to…
 
Speaker 1: No, yeah, like if you just want to turn, you hit the button to the left, it’ll turn you to the left.
 
And it depends what device you’re using, but like, you know, PlayStation and Meta use those types of things.
 
That’s usually for like gaming.
 
But you can put a whole suit on.
 
There are ways to do that where your body is literally covered by the suit.
 
If you think about how they make motion-capture scenes in movies, that’s kind of what they’re doing: they put that suit on, it interprets everything into movements, and then they can put the digital overlay on top of it.
 
So you can do that same thing.
 
Basically anything you can do in real life that can be interpreted as an action can then be programmed as a response in your XR world, okay? So you move your head, what you’re seeing moves.
 
You move your body, position moves.
 
You move your joystick, your virtual body moves in the virtual world.
 
So that’s basically how you make all this stuff happen.
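The action-to-response loop described above boils down to mapping tracked events to updates in the virtual scene. The event names and handlers here are made up purely for illustration:

```python
# Sketch: an XR input loop dispatches tracked actions to handlers
# that update the virtual world's state.

def on_head_move(state, delta_yaw):
    # You move your head: what you're seeing turns.
    state["camera_yaw"] += delta_yaw

def on_joystick(state, dx, dz):
    # You move your joystick: your virtual body moves.
    state["player_x"] += dx
    state["player_z"] += dz

HANDLERS = {"head": on_head_move, "joystick": on_joystick}

def dispatch(state, event, *args):
    HANDLERS[event](state, *args)

state = {"camera_yaw": 0.0, "player_x": 0.0, "player_z": 0.0}
dispatch(state, "head", 15.0)           # turn your head 15 degrees
dispatch(state, "joystick", 1.0, 0.0)   # push the stick to the right
```

Real engines route dozens of such event streams per frame, but every one of them is this same pattern: real-world action in, programmed response out.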
 
So those are the basics you’re going to encounter as far as hardware concepts to give you that extended reality experience.
 
And that’s as deep as we’re going to go as far as hardware.
 
And I kind of wanted to do that stuff up front because I wanted to do a history of the hardware and ideas that got us to this point that we’re at in extended reality.
 
And what I don’t want to do is keep categorizing things along the way or try to fit them into these buckets, you know, like this is AR, that’s VR, etc.
 
Speaker 2: I gotcha, yeah.
 
Speaker 1: It’s just not fun, right? And quite frankly, if you read the histories, you realize it just wouldn’t work.
 
And there’s like two kind of big reasons for that.
 
First, the names didn’t even exist.
 
I mean, someone comes up with the name virtual reality in 1987.
 
And the story goes way back before that, you know, we’re going back to the 1800s, so that really wouldn’t make any sense.
 
And the second problem is XR, virtual reality, all these things, they’re basically consumer products when you think about it.
 
Like it’s people trying to make money off of these products.
 
Speaker 2: Right.
 
Speaker 1: So it’s really just companies making their own thing.
 
So it’s not like some philosophical idea that began in the 1600s and, you know, we’re all working off of that and we all have this like joined experience.
 
It’s really they’re just trying to make stuff that they can sell.
 
So when it comes down to it, they’re just making up their own ideas, they’re using their own terminology.
 
They don’t always interact and go over things together.
 
And basically you just… you can’t pick and choose which thing you want to call things because everybody’s calling the same thing something a little bit different.
 
Like I said, that’s how you get into the world where some people don’t even acknowledge that there’s a difference between, you know, mixed reality and virtual reality and augmented reality and that type of thing.
 
And that’s going to be the end of our hardware review.
 
So with that, we will wrap it up, which means it’s time for the summation and Julia’s big takeaway.
 
First, don’t forget that the sources used in this entry are all located on the website informatorium56.com.
 
And if you want to reach us at the show, you can email us at informatorium56@gmail.com.
 
So Julia, what is your favorite part of the show? What’s your big takeaway today?
Speaker 2: I guess it’s that stereoscopy came up again and just how manipulating your eyes is such a huge part of the whole extended reality world.
 
But you know, it makes sense because by making those images in such a way, they can make digital things seem like a part of the real world.
 
Speaker 1: Yeah, that is kind of interesting, because we see such a small spectrum of light, and we talked about our inadequacies in vision and stuff like that in previous episodes.
 
But I mean, it is your primary sense, I guess, for most people obviously, it’s how you place yourself in the space around you.
 
So what you can see is, you know, what you think is happening.
 
And so you said, you know, you can make these fake things seem like they’re part of your world.
 
I guess, you know, vision is the quickest way to provide information, right? Because there are actually apps you can get for some AR lenses and stuff that provide translation, like real-time translation of somebody speaking a different language, and it just comes up in your glasses.
 
But as for the summation, I think this time I’ll just use this as a little bit of a transition because today I wanted to cover the basic terms and hardware so hopefully anyone who was completely new to the world of XR could just get a foundation for how it all works.
 
That way going into the next episode, you can just enjoy the stories.
 
Because like I said, you know, we’re not going to sit here parsing terms and categories in the next episode, you know, this was it.
 
Rather we’re just going to look at the fun stuff in history that led to our digital reality of today.
 
And in that history, we’ll cover things like the first flight simulator, something called the Sensorama, which I love, and some interesting soothsayer-like authors who predicted what a future of XR might look like from their perspective in the past.
 
So that’s going to do it.
 
Thank you so much for visiting us here at the Informatorium.
 
We wish you a happy, healthy, and beautiful journey until we see you again.
 
Look on the bright side and good luck.
 
Speaker 2: Bye.