Apple Vision Pro - VR

The way I feel _


10 years down the road, because we saw what happened with the iPhone and the Mac and the iPad and all sorts of other first-generation products. And on top of all of that, as far as I know, Apple has never released any other first-generation product with the word Pro already in the name, which comes with a whole other set of implications. So is the world ready for all of this? Let's get into it.

So I might be one of the 20 people outside of Apple who has been using the Vision Pro the most over the past two weeks. Like, I've spent hours in this thing with both bands, with multiple Macs, in different setups, different rooms, indoors and outdoors, in light and in darkness. There are parts of this thing that are absolutely amazing, unparalleled, the best I've ever seen. But the reason it's so interesting is that it's actually new, and there are downfalls and flaws and trade-offs that come alongside all of this stuff.

I have used a bunch of different VR headsets now, and the Vision Pro has the sharpest, best-looking micro-OLED displays of all of them. Each individual pixel on these displays is seven and a half microns across, which means you could fit 64 of them in the area of a single iPhone screen pixel. You can't see individual pixels; there's no screen door effect; it's awesome. The native refresh rate is 90 hertz, and it will crank up to 96 hertz when 24 FPS content is playing, so the frame rate divides evenly into the refresh rate. And Apple says they calibrate every single one of these Vision Pro displays at the factory for maximum color accuracy. They're really good, and this is a big reason why this headset is so expensive.
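If you want to sanity-check those numbers, here's a quick back-of-the-envelope sketch. The iPhone pixel pitch here is my own assumption (roughly 460 ppi on recent Super Retina displays); Apple's "64 pixels" figure and the 7.5 micron pitch are from their own materials.

```python
# Back-of-the-envelope check of the "64 Vision Pro pixels per iPhone pixel" claim.
# Assumption (not from Apple): an iPhone Super Retina display is roughly 460 ppi.
IPHONE_PPI = 460
MICRONS_PER_INCH = 25_400

iphone_pixel_pitch = MICRONS_PER_INCH / IPHONE_PPI  # ~55.2 microns per pixel
vision_pro_pixel_pitch = 7.5                        # microns, per Apple

# Area ratio: how many Vision Pro pixels tile the area of one iPhone pixel.
ratio = (iphone_pixel_pitch / vision_pro_pixel_pitch) ** 2
print(round(ratio))  # ~54, in the same ballpark as Apple's "64"

# The 96 Hz mode: 24 fps film maps onto whole refresh cycles (4 per frame),
# which is why 96 is chosen over the native 90.
print(96 // 24)  # 4 refreshes per film frame, no judder
```

The gap between ~54 and 64 just reflects my assumed iPhone ppi; the point is the order of magnitude, not the exact count.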

But then, and this is gonna be a recurring theme here, the Vision Pro runs up against the technology of today not being quite advanced enough to accomplish what Apple was probably hoping for as the ideal. So in the case of these screens, they're amazing, there are so many pixels, but because there are so many pixels, the computer inside cannot actually render everything in high resolution all the time at 90 hertz. So instead, it does something clever. It combines the insanely fast eye tracking with what's called foveated rendering, meaning it's only actually rendering in high resolution exactly what you're looking at, when you're looking at it. The rest is soft and fuzzy. That actually works really well, because that's exactly how our eyes work.

It's really clever; you don't think about it, but the thing you're looking at in the moment is sharp, while the rest of your peripheral vision is soft and fuzzy, and that's fine. So really, all of the computing work now goes into tracking your eyes as fast as possible, so that there's no lag between when you look at something and when it becomes sharp. Fun fact: you can actually see this in screen recordings from the Vision Pro. You can see that the piece of the screen I'm looking at is sharp, and then everything else around it, even parts of the same window, is fuzzy on purpose. But to my eye, that looks totally natural, because I'm focusing on one thing at a time.
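To make the foveated rendering idea concrete, here's a toy sketch. This is absolutely not Apple's actual pipeline (which works on the GPU with variable rasterization rates); it just illustrates the core trick: full detail only in a small region around the gaze point, cheap low detail everywhere else.

```python
# Toy illustration of foveated rendering (not Apple's implementation):
# only pixels near the gaze point get rendered at full quality.

def render_foveated(width, height, gaze_x, gaze_y, radius):
    """Return a 2D grid marking each pixel's render quality level."""
    frame = []
    for y in range(height):
        row = []
        for x in range(width):
            # Inside the foveal radius: full quality; outside: coarse.
            if (x - gaze_x) ** 2 + (y - gaze_y) ** 2 <= radius ** 2:
                row.append("hi")
            else:
                row.append("lo")
        frame.append(row)
    return frame

frame = render_foveated(8, 8, gaze_x=4, gaze_y=4, radius=2)
hi_pixels = sum(row.count("hi") for row in frame)
print(hi_pixels, "of", 8 * 8, "pixels rendered at full quality")
```

Even in this tiny grid, only a small fraction of pixels need the expensive path, which is the whole point: the savings scale with resolution, and the eye tracker just has to move the "hi" region before you notice.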

I found that you can also screen record with developer mode in Xcode, which makes the clips 4K and renders everything in high quality all at once. But every time I did that, it would be choppy, and scrolling would be slow and jittery, and I'm thinking that's just because the computer isn't really built to render everything in high quality all the time. So it looks like a higher quality recording, but the second I did any scrolling, it didn't look as good, so I just didn't use those recordings as often.

So the screens are great, the positional tracking of objects in space is great, the eye tracking is incredibly good. The one ding against immersion on the Vision Pro, though, and not a lot of people are talking about this, is the field of view. See, the first few times you use this headset, you don't even really think about it that much. You're so distracted by all the fun and the newness and how cool it is that your eyes are controlling the thing. But eventually, you start to poke around the edges, and it turns out, you know how people are saying it kind of looks like ski goggles from the outside? Well, it also kind of looks like ski goggles from the inside a little bit too. Again, the middle is super sharp and incredibly impressive, but if I can do my best here through a YouTube video, the edges of the headset are a little bit further in than the edges of your vision. And so there's a little bit of a cone effect going on, and there's some chromatic aberration around the outside. So you kind of have this slight feeling of looking at everything through a large tunnel. There are actually no field of view numbers published by Apple anywhere about the Vision Pro, as far as I can tell, and I kind of think that's on purpose, because I have noticed from using them both that the Quest 3 has a wider field of view just looking inside the headset. So if I could change one thing about the Vision Pro to make it more immersive, it would be a wider field of view, no question.

The Vision Pro has the best passthrough of any headset I've ever used; that much is super clear to me, and weirdly enough, this doesn't actually surprise me either. Maybe because this is one of the products that makes it so obvious that they're thinking a lot about the future. Like, Apple talks a lot about AR and how they want things to just be clear, just overlaying things onto your real world. But with today's technology, again, that's not quite possible yet. So instead, they have a VR headset, but they're using the highest quality camera feeds possible and the highest quality displays on the inside to let you almost feel like you're looking through it at the real world.

So you put this headset on, and the first thing you see is passthrough; I mean, you might as well call it transparency mode. And the sharpness and the colors and the very low latency are all so good that I really don't experience any eye fatigue, no matter how long I'm in this passthrough mode, despite my eyes being inches from these screens. I can interact with the real world around me, pick things up and look at them; I can walk around between rooms and not trip on things. I tried having people throw things at me, and I could just catch them. I played table tennis successfully with the headset on, which is crazy if you think about what's actually happening here. The total latency, Apple says, is 12 milliseconds from the outside light hitting the outside sensors to the inside image being updated and hitting your eyeballs. That's incredibly fast, and that includes the exposure time of the cameras. That's the specially designed R1 chip at work. But, as Nilay from The Verge has put it, it's still cameras and screens; the technology of today isn't magic. So you still have to expose a camera sensor and set ISO and shutter speed, et cetera, and you can kind of play around with this a bit just by looking around at bright objects or high dynamic range environments.
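To get a feel for why 12 milliseconds is good enough for table tennis, here's a rough calculation. The ball speed is my own assumption for a casual rally; only the 12 ms figure comes from Apple.

```python
# Rough feel for what 12 ms of photon-to-photon passthrough latency means.
# Assumption (not from Apple): a casual table-tennis ball moves around 10 m/s.
latency_s = 0.012          # Apple's quoted sensor-to-display latency
ball_speed_m_per_s = 10.0  # assumed casual rally speed

drift = ball_speed_m_per_s * latency_s  # how far the ball moves in one latency window
print(f"The ball moves about {drift * 100:.0f} cm before your view updates")
```

A ~12 cm lag on a fast-moving ball is small enough for your brain to compensate for, which lines up with the catching and table-tennis experiments above; at the higher speeds of competitive play, the same math would start to hurt.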

And you know what, for the variety of situations I've thrown at this thing, it's handled them very impressively the whole time, mostly prioritizing smoothness and high shutter speeds at the expense of cranking up the ISO and getting way more noise, especially in darker environments. But you can still see stuff like the hand occlusion break sometimes, or look really janky when you put your hand in front of something. You can still see objects start to float a little bit more in X, Y, and Z space when you're in much lower lighting, as opposed to their usual perfect position. Again, it's the best I've ever seen with today's tech, but it definitely still has a long way to go.

So I don't know if the eyes on the outside of this headset are actually accomplishing what Apple probably wanted them to accomplish. I think this, again, comes down to what we expect the Vision Pro to be in the future: see-through glasses, way down the line. But that's obviously not possible with today's tech, so the closest we can get is a lenticular display that shows your eyes to the outside world, so that people can see them and feel like you can kind of see them.

I don't know; at this point, I've tried it with my own eyes, but everyone else here at the studio has also scanned themselves in and tried EyeSight, and it's just not very visible. Like, the smooth glass of the headset is so incredibly reflective that there's almost always some light bouncing off it in a way that makes the eyes hard to see. And even if you can see them, they're pretty low resolution, thanks to the lenticular display, and the bottom line is it doesn't actually feel like eye contact most of the time, which is very different from what you might have seen in the ads, especially for darker skin tones.

And I was actually thinking about this: it's so far from what I expected that I was wondering, does Apple double down on EyeSight in the next-generation Vision Pro, or do they get rid of it? And I think they have to keep it; it's such an iconic part of the headset that everyone's expecting it to stay, so it's gotta be there. But also, this is the Vision Pro, and I think that implies we're gonna get an Apple Vision at some point down the road, a less expensive version of this. And with that, do you take out EyeSight, or does that still appear in the lower-end version? I don't know; only time will tell.