WWDC 2023 Recap – A New Vision for Apple


Website screenshot. Reads: "WWDC23 Introducing Apple Vision Pro and the era of spatial computing. The new 15-inch MacBook Air with M2, Mac Studio with M2 Max and M2 Ultra, and Mac Pro with M2 Ultra. And previews of iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10."

Yeah… that was a terrible pun. By now, you may have already heard: Apple’s new virtual reality headset has a name: Apple Vision Pro. Where’s the Apple Vision? Not here. We’re starting with the “Pro.” But let’s back up. The Vision Pro was Apple’s “One More Thing.” It may have been the most exciting thing Apple announced during WWDC, but it wasn’t the only thing. Apple also revealed updates to their software for the iPhone, iPad, Mac, and even the Apple TV and Apple Watch. And with the new Mac Pro, Apple finally completed their transition to their own M-series Apple silicon chips.

So, like Apple, let’s quickly go over the new MacBook Air, Mac Studio, Mac Pro, and the rest, and then jump into what you’re really here for: the first virtual reality headset that actually looks good.

The Mac

MacBook Air 15″

MacBook Air 15-inch from the side

Users will no longer have to decide between the more expensive MacBook Pro and the MacBook Air just to get a larger screen. Apple has brought one of their most popular laptop sizes, 15″, to the MacBook Air. The new 15″ MacBook Air only comes with the M2 processor, but it starts at just $1,299. That’s less than a similarly spec’d iPad Pro with Magic Keyboard. In comparison, the 13″ MacBook Air comes with an M1 processor for $999. For just $300 more, you get a larger and faster computer. Though the 13.6″ M2 MacBook Air, with its slightly larger display for $1,099, may be your speed as well.

The new 15″ MacBook Air comes in four colors: Starlight, Midnight, Space Gray, and Silver. It supports up to 24GB of unified memory, more than enough for the average user, and up to a respectable 2TB of storage. The additional screen size doesn’t change battery life, which still sits at a whopping 18 hours.

The base model is really only good for light browsing, email, and web apps. If you want to use this for everyday tasks reliably, you’ll want at least 16GB of unified memory and at least 512GB of storage. That would run you $1,699. The full configuration, with 24GB of memory and 2TB of storage, is $2,499. That’s still a good deal; a similarly specified MacBook Pro would run you about $1,000 more. For that you get the more advanced M2 Pro processor and 32GB of memory, more than most users will need and not worth the upgrade for those looking at the MacBook Air.

Mac Studio

The Mac Studio is Apple’s “in between” pro desktop. It sits between the Mac Mini and Mac Pro, offering pro-level performance without the size and expandable PCIe slots of the Mac Pro. It’s a compact desktop powerhouse, and Apple made it even more powerful. It now features Apple’s new M2 Ultra chip, which is basically just two M2 Max chips pressed together. Actually, that’s exactly what it is: these chips were designed to be combined, nearly doubling processing power with minimal latency.

Screenshot from Apple's website comparing the M2 Max and M2 Ultra. M2 Max: 12‑core CPU, up to 38‑core GPU, up to 96GB unified memory, 400GB/s memory bandwidth. M2 Ultra: 24‑core CPU, up to 76‑core GPU, up to 192GB unified memory, 800GB/s memory bandwidth.

The rest of the Mac Studio is, unfortunately, unchanged. It could use another HDMI port, but with Thunderbolt expansion it can easily drive more displays (up to 5). It also still has that awkward front that just doesn’t quite look complete. Still, it’s a great way to put power on your desktop.

Mac Pro

And what if you want to put power under your desk? The Mac Pro has been Apple’s last holdout for Intel Macs. Because of its expansion options, with PCIe-based GPUs and Afterburner cards, Apple’s own silicon just didn’t match up. Now, however, Apple’s finally ready to add PCIe expansion to their Macs once again.

The new Mac Pro features Apple’s M2 Ultra chip, up to 192GB of unified (graphics and system) memory, and support for up to eight displays. Could you still beat this performance with an Intel-based Mac? It depends on what you’re doing. The fact that the new Apple silicon Mac doesn’t seem to support third-party GPUs means it’s frozen in time. You won’t be able to drop in a new GPU in a few years to dramatically increase its power. You also won’t be able to run Windows on it for games or other productivity software. Instead, you’ll have to buy an all-new Mac Pro. That’s far more expensive and, frankly, incredibly wasteful. Apple’s dedication to the environment ends when they can sell you another computer. They dropped support for eGPUs some time ago, meaning they’ve always planned on locking users into Apple-only hardware.

Still, the additional expansion slots could come in handy. Apple could also be just a firmware update away from enabling eGPU and PCIe-based GPU support, but doesn’t seem to be working towards that.

The new Mac Pro starts at a whopping $6,999 for basically the same performance as the Mac Studio. You can upgrade the processor to the version with a more powerful GPU for $1,000, go up to 192GB of unified memory for $1,600, and add up to 8TB of internal SSD storage for $2,200. Wheels are still a ridiculous $400. Without software, you can easily spend $12,199 on your new Mac Pro. It’s actually a lot of performance for the money and should last you years, but, due to those expandability limits, it may not last as long as it otherwise could.

Apple really needs to bring back third-party GPU support.

macOS Sonoma

Compilation of macOS Sonoma screenshots showing widgets and presentation capabilities

Apple’s macOS updates often don’t feature too much new, but this year it’s actually more exciting than the iOS 17 update. macOS Sonoma brings a feature we’ve wanted since the earliest years of Mac OS X: desktop widgets. Now you can drag widgets straight from your Notification Center to the desktop. They’ll even dim when you have apps in the foreground, so you can still read the information without being distracted. Another fantastic addition is support for iOS widgets. You’ll be able to display an iOS app’s widgets on your Mac without installing the app. Using Continuity, when your iPhone is in range of your Mac, your Mac can display your iPhone’s widgets, streamed straight from the phone.
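
If you’re curious what it takes to offer the kind of widget that can now land on the Sonoma desktop, here’s a minimal WidgetKit sketch in Swift. The widget kind, entry fields, and reminder count are made up for illustration; the WidgetKit types themselves (TimelineProvider, TimelineEntry, StaticConfiguration) are the real framework pieces a widget is built on.

```swift
import WidgetKit
import SwiftUI

// What the widget displays at a given moment.
struct ReminderEntry: TimelineEntry {
    let date: Date
    let remaining: Int
}

// Supplies entries to the system; here it just returns a single placeholder value.
struct ReminderProvider: TimelineProvider {
    func placeholder(in context: Context) -> ReminderEntry {
        ReminderEntry(date: .now, remaining: 3)
    }

    func getSnapshot(in context: Context, completion: @escaping (ReminderEntry) -> Void) {
        completion(ReminderEntry(date: .now, remaining: 3))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ReminderEntry>) -> Void) {
        let timeline = Timeline(entries: [ReminderEntry(date: .now, remaining: 3)], policy: .atEnd)
        completion(timeline)
    }
}

// The SwiftUI view rendered on the desktop (or in Notification Center).
struct ReminderWidgetView: View {
    let entry: ReminderEntry

    var body: some View {
        VStack(alignment: .leading) {
            Text("Reminders").font(.headline)
            Text("\(entry.remaining) left today")
        }
        // Widgets on macOS Sonoma and iOS 17 declare a container background.
        .containerBackground(.background, for: .widget)
    }
}

struct ReminderWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "com.example.reminder-widget", provider: ReminderProvider()) { entry in
            ReminderWidgetView(entry: entry)
        }
        .configurationDisplayName("Reminders")
        .description("Shows how many reminders are left.")
    }
}
```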

I’m a kind of “jack of all trades” computer gal. Sure, I’m a software engineer, but I also enjoy designing graphics and the occasional video editing. I frequently make videos for presentations, and I’ve gone to great lengths to make picture-in-picture videos, which allow presenters to stand in front of their presentations and demos. With macOS Sonoma, that can be done in real time. Apple silicon Macs can use their cameras to separate you from your background, allowing you to walk in front of your content or appear as a small bubble over it, without your background in view. Even Intel Mac users will be able to use Continuity Camera on their iPhone to let the view follow them as they move around the screen.

There are improvements to Safari for privacy and multiple profiles, new Messages features to make catching up on old group texts easier, and more. Apple even added a Game Mode that prioritizes games and reduces input lag from controllers. Though, without support for gaming GPUs, either external or internal, gaming on the Mac will never be a focus for developers. Still, you’ll be able to load up Stray and Death Stranding soon enough.

iOS 17

iOS 17 screenshots

iOS 17 feels more like a “nice to have” update. There are some neat new features, like a contact poster you can create to display when you call someone. You can also leave FaceTime video messages if you just want to capture a moment or say hi to someone who isn’t available. And you can see a live transcript of a voicemail as it’s being left, so you can decide to pick up mid-message. Nice, but certainly not in my top “must have” features… like third-party browser engine support.

Another disappointing year for those wanting to finally use something other than Safari.

There are new sticker saving features, and iMessage enhancements for group chats. You can share your location with someone more easily, and it’ll tell them when you’ve made it home safely. I know my parents will like that, and I’ll like it too. I often worry about my friends who leave a hangout late at night and need to walk home alone.

There’s a new “StandBy” mode that shows widgets and other items on your lock screen while your phone sits in landscape and charges. It’s neat, I guess? Widgets are also more interactive, with the ability to check off items on reminders lists and perform other small actions. Also a nice little update.
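
For the curious, here’s roughly what powers those interactive widgets: buttons inside a widget can now run an App Intent without opening the app. Everything below other than the WidgetKit/AppIntents types themselves (the intent name, the ReminderStore, the reminder IDs) is a made-up sketch, not Apple’s or any real app’s code.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical in-memory store standing in for a real app's reminder data.
final class ReminderStore {
    static let shared = ReminderStore()
    private(set) var completed = Set<String>()
    func markDone(id: String) { completed.insert(id) }
}

// The action the system runs when the widget's button is tapped.
struct CompleteReminderIntent: AppIntent {
    static var title: LocalizedStringResource = "Complete Reminder"

    @Parameter(title: "Reminder ID")
    var reminderID: String

    init() {}
    init(reminderID: String) {
        self.reminderID = reminderID
    }

    func perform() async throws -> some IntentResult {
        ReminderStore.shared.markDone(id: reminderID)
        return .result()
    }
}

// A row inside the widget: tapping the circle runs the intent in place.
struct ReminderRow: View {
    let id: String
    let title: String

    var body: some View {
        HStack {
            Button(intent: CompleteReminderIntent(reminderID: id)) {
                Image(systemName: "circle")
            }
            .buttonStyle(.plain)
            Text(title)
        }
    }
}
```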

The biggest update, and the one most likely to change your habits, is the new set of AirDrop features. You can just bring your phone near another person’s iPhone, like you’re using Apple Pay. However, it’s not for payment; it’s for transferring contact information with a tap. Long ago, such a feature existed on iPhones. There was an app called “Bump” that let you transfer contact information by holding your phone in your hand and fist bumping someone doing the same. This, however, will be far more reliable and private.

Another interesting new API is for journaling. You’ll be able to create journal entries from your daily activity: places you visited, exercise you’ve done, photos you’ve taken, and even activity in other apps.

Oh, and autocorrect now learns from what you type, so it’ll make fewer “ducking” mistakes.

iPadOS 17

iPadOS widgets

iPadOS got a lot of “catch-up” updates as well. The lock screen customization of iOS 16, along with the new widgets, finally finds its way to iPadOS. The new widget layouts and lock screen images are made for the iPad’s larger screen, offering data at a glance. This also includes Live Activities, like food delivery trackers and sports scores, as well as multiple timers. Apple is finally bringing the Health app to the iPad as well.

iPadOS 17 will bring many of the features of the iPhone to the iPad. Just… not a calculator.

Other Updates

watchOS 10

I’ve said it before: the Apple Watch has gotten stale. watchOS 10 won’t change that very much. There are some new views for phone calls and maps, some Snoopy watch faces, and support for more Bluetooth accessories for exercise. There’s not much here, but there hasn’t been for some time. It’s part of the reason I moved back to analog watches. They’re more fun, with more variety, and they don’t nudge me about something ridiculous 200 times a day.

AirPlay Everywhere

Have you ever been to a hotel, turned on the TV, and been disappointed by what passes for cable television these days? There’s nothing on. With the latest updates to Apple’s ecosystem, you’ll be able to simply scan a QR code or tap your device to begin streaming from your phone to the TV. Obviously this won’t be every TV in every hotel, but starting this year, your night in before a conference is going to get a bit more interesting.

AirDrop and Go

One of the issues with AirDrop is that some transfers, like large videos or lots of photos, can take a long time. What if you have to go? With Apple’s next updates to iOS, iPadOS, and macOS, you’ll be able to just start the AirDrop and leave. Your phone will continue uploading the files to iCloud, encrypted and secured, and the other device will download them from there. What started as a peer-to-peer transfer now has an iCloud fallback.

Vision Guidance

I remember as a kid being told that if I sat too close to the TV, I’d have vision problems. I was scared of this, so I’d sit on a couch far away from the TV, even though the TV was small. Now I’m not quite so good at this. I’ll hold screens at arm’s length and sit just a little too close to my computer monitor. This can still damage vision in adults over time, but it’s especially risky for kids, who can develop nearsightedness. As an adult, you can practice giving your eyes a break a few times every hour, looking at something off in the distance for a few seconds. Kids might need a nudge. With the iOS, iPadOS, and watchOS updates, they’ll get one. Kids wearing an Apple Watch will get nudged into spending more time outside in the sunlight, thanks to the ambient light sensor in the watch. Wear sunscreen and sunglasses too, of course! On the iPhone and iPad, they’ll get a warning if they hold the screen too close to their face for too long.

It’s a bit ironic that Apple made these tools to protect people’s vision by keeping screens far away from their eyes, then announced a product that slaps two screens an inch from their eyes.

One More Thing: Apple Vision Pro

Apple Vision Pro with headband and battery

See what I did there? I made you scroll all the way to the bottom. It builds up suspense, makes the final item the thing you leave with, and, let’s face it, it improves search engine optimization. Supposedly. Who knows anymore? Maybe some Google bot just ripped whatever I wrote without permission or compensation and shared it in some search result. Troubling times, friends. But Apple has a much less troubling use of AI that I think you’re going to love… if you can afford it.

The Apple Vision Pro will cost $3,499 when it’s released early next year. That’s a hefty price; other VR headsets don’t come close to it. But they also don’t come anywhere near the features of the Apple Vision Pro. It would be laughable to compare something from Meta, Sony, or HTC to what Apple introduced. Those are largely just screens for your face with some controllers and a bit of motion control, made for niche audiences. The Apple Vision Pro is a full computer you strap to your face. It’s a new interface system that Apple sees as a new way to interact with technology. What the Mac did for the personal computer, the Vision Pro will do for virtual and augmented reality.

The Next Step in User Interfaces

Simulated view of someone using the Apple Vision Pro

One of the biggest questions in the VR/AR space is how you’ll control what you’re seeing. Since most headsets are gaming-focused, the solution has primarily been video game controllers. That’s not what Apple’s aiming for, though. The Vision Pro is certainly something you can play games on, but more in the way you can play games on your phone or your computer; it’s only a small part of the purpose. So Apple implemented three main interaction points: your hands, your eyes, and your voice. While you can use a Bluetooth controller for games, or a keyboard and mouse for work, most usage will center around you.

To tap, you’ll just pinch your fingers together, like you’re tapping your index finger on your thumb. The same pinch can be used to resize or move windows around, and you can flick with it to scroll. As for what these gestures interact with, the Vision Pro has a myriad of sensors inside the headset that track your eyes using infrared. As you look around, the UI responds to your gaze. There’s no pointer; whatever you’re looking at is what you’re interacting with. With voice, you’ll have your typical Siri commands, but you can also activate a text box simply by looking at it and then speaking to fill it in. There will also be an on-screen keyboard you can use.
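
To give a sense of how little a developer has to do to support this, here’s a minimal SwiftUI sketch of the kind of view you might put in a visionOS window. The view and its text are invented for illustration; the point is that a standard Button already responds to the look-and-pinch gesture, and a hover effect gives feedback while your eyes rest on it.

```swift
import SwiftUI

// A plain SwiftUI view; on the Vision Pro, the system drives it with eyes, hands, and voice.
struct GreetingView: View {
    @State private var tapped = false

    var body: some View {
        VStack(spacing: 24) {
            Text(tapped ? "Hello, spatial computing." : "Look at the button, then pinch.")
                .font(.title)

            // A standard Button: looking at it highlights it, pinching activates it.
            Button("Say hello") {
                tapped.toggle()
            }
            .hoverEffect() // subtle feedback while the user's gaze rests on the control
        }
        .padding(40)
    }
}
```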

More Mac

One of the many reasons I don’t want to return to office work is because my home setup is so much better. I’ve got an ultrawide monitor, tons of desk space, great lighting, the list goes on. But… what if I could bring my giant monitor to work?

Great, I just made a bunch of idiots trying to justify office real estate very happy, didn’t I?

With the Apple Vision Pro, you just have to look at your Mac and you can drag its screen up and away. It’ll project the Mac’s display into your space like any other window in the Vision Pro. Drag it around, resize it, and continue to work with a keyboard and mouse if you prefer. Now your Mac is as big as you want it to be.

Still don’t want to go into the office though.

FaceTime Anywhere

One fun use of a heads-up display like this is talking to other people. Being able to socialize over great distances and make your friends and family feel like they’re in the room with you certainly sounds welcome after the pandemic. But how will they see you? You’re wearing your device; how could they see your face?

Turns out, Apple has a simple answer for that too.

When you set up your Vision Pro, you’ll be able to take a 3D scan of your face, which the Vision Pro will store. Then it’ll combine facial movements, eye movements, and hand gestures to create a 3D version of yourself that sits just on the comfortable side of the uncanny valley. It reportedly isn’t perfect, but it’s good enough to converse with someone. The technology will likely improve over time, but the 3D avatars it can make now are apparently lifelike enough to form a connection over FaceTime.

We’re so close to being able to make an avatar that is having a perfect hair and makeup day and only using that for work calls. So close!

Pass-through VR and AR

Someone looking "through" the front glass, a display that shows their face

This was one of the most impressive parts of Apple’s entire presentation. The internet is full of videos of people playing with a VR headset and accidentally hitting a bystander they couldn’t see. There’s a sort of fear as well: blinding yourself to the outside world, maybe adding headphones to the mix, and becoming immersed in something that isn’t actually there. The Vision Pro doesn’t have that problem. When a person comes into your view, it recognizes that someone is in your space and will show them, even over your content if they come close enough, so you can quickly give them your attention. Then the headset will show your eyes on the outside screen, so they can still have a conversation with you, making eye contact, without you ever taking the headset off.

That is seriously cool.

You can “dial in” the world around you using the Digital Crown, like the one on the AirPods Max and Apple Watch. Give yourself a nice view in the background of your windows, or immerse yourself in it entirely. Have your content in your area, or make your area a state park. With a simple twist of the dial, you go from immersed to present, and everyone around you will notice the difference too: the screen on the outside shows your eyes when you can see them and a colorful display when you can’t.

The Hardware

View of many sensors on the front of the Apple Vision Pro

So what makes all this magic possible? At its heart is Apple’s M2 system on a chip, supported by a new “R1” chip. The M2 you already know from Apple’s Macs. Yes, this headset has the power of a Mac in it. The R1 handles real-time processing: it tracks your eye movements and gestures so you don’t notice any lag between looking at something, gesturing, and that item responding. The R1 streams new images to the displays within 12 milliseconds, which Apple says is 8 times faster than the blink of an eye (a typical blink takes roughly 100 milliseconds). That should dramatically cut down on the input-to-display delay that can cause motion sickness in VR devices.

The first thing I noticed was the sheer number of cameras on this thing. Apple appears to have packed in everything from wide-angle cameras to other sensors, and potentially even a few telephoto lenses. The cameras and sensors, of which there appear to be at least 13 outward-facing ones, constantly scan your surroundings, both to gather data and to produce the view you see inside. That’s used for head tracking, facial tracking, hand tracking, and mapping your surroundings in 3D. You can also use the cameras to capture 3D videos and re-watch them later.

As for the displays you’re looking at? Each eye gets more pixels than a 4K screen, with about 64 display pixels fitting in the space of a single iPhone pixel (picture an 8-by-8 grid where one iPhone pixel would sit). These are extremely high resolution screens that help create the illusion that your content is real, not just some pixels you’re viewing. Apple, frustratingly, hasn’t announced the refresh rate yet, and a high refresh rate is necessary to avoid motion sickness. However, Marques Brownlee got an advance look at it and said it feels like 120Hz. That might be enough to help even those who get motion sickness in VR avoid the issue, though it won’t be enough for everyone.

Audio Pods

The speakers in the headband use directional audio, like sunglasses with built-in sound. These “audio pods” work a lot like HomePods, with some more advanced features too. They use “audio ray tracing” to map out your space. Combined with head tracking and spatial audio, they can make sound feel like it’s coming from within your room, completely immersing you. If you want to shut out the outside world, you can also use AirPods with noise cancellation to fully immerse yourself in your content.
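
Apple hasn’t published the details of its “audio ray tracing,” but the general idea of placing a sound at a point in space is something you can already play with on Apple platforms. Here’s a minimal sketch using AVAudioEngine’s environment node; the file name and positions are placeholders, and this is generic spatial audio, not the Vision Pro’s room-mapping system.

```swift
import AVFoundation

// Minimal spatial-audio sketch: place a mono sound source at a point relative to the listener.
func playSpatialSound() throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let environment = AVAudioEnvironmentNode()

    engine.attach(player)
    engine.attach(environment)

    // Spatialization needs a mono source; the environment node mixes it into 3D space.
    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
    engine.connect(player, to: environment, format: monoFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Listener at the origin; the sound source one meter to the right and two meters ahead.
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)
    player.renderingAlgorithm = .HRTFHQ // head-related transfer function for headphone listening

    // "chime.caf" is a placeholder file bundled with the (hypothetical) app.
    let url = Bundle.main.url(forResource: "chime", withExtension: "caf")!
    let file = try AVAudioFile(forReading: url)

    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
}
```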

Battery Life

Now, this is likely the most disappointing news about the headset. It uses an external battery pack that you’ll have to put in a pocket or clip to your belt, and that battery pack only offers two hours of battery life. That means you can’t even watch a full movie, since so many movies run longer than two hours these days. You can plug the battery pack into the wall to keep watching, but you’ll be tied to that outlet.

The battery pack is completely detachable, so third parties may release larger battery packs. Perhaps you’ll have your “walking around” battery pack and your “I’m not going anywhere” battery pack that’s too heavy to wear in your pants all the time? Perhaps third parties will even make neck-mounted battery packs, like the ones for some headphones. I could even see necklaces becoming a thing. And, as anyone who grew up in the 90s can tell you, big front pockets on hoodies are invaluable. Battery performance is definitely going to be an important factor for Apple moving forward.

Adjustments

Internal view of the lenses

I’ve tried VR before. It can be cool and immersive, but after about 5 minutes, I was out. It made sense that my demo was an underwater view, because it felt like I was swimming. Visible pixels and lag just made the whole experience neat, but ultimately unpleasant.

Apple’s working to make sure the Vision Pro headset adjusts to you. Along with the better-than-4K screens that supposedly feel like they refresh at a mostly comfortable 120Hz, the entire device is adjustable as well. The internal lenses will shift slightly to sit exactly where your pupils are, matching your pupillary distance. This is one of the biggest changes that’ll make these feel less nauseating. There’s a reason you need precise pupillary measurements for eyeglasses, and it’s why I could never use the “reader” style of computer glasses. A mismatched pupillary distance can lead to eye strain, headaches, and nausea.

Besides that, Apple has also partnered with Zeiss, the legendary optical lens manufacturer, to make magnetic lenses that are drop-in replacements for your glasses. That does mean that someone with glasses could, potentially, replace their glasses with the Vision Pro, if only for two hours at a time.

Outside of the visual adjustments, the device itself is customizable. Users will be able to get the perfect light shield for their face, as well as a strap custom fit for their head, dialed in to the right size. There’s also a strap that can go over the top of your head, if you need more support. And, Apple’s making all of this available to third parties, so they can make their own straps if you’d prefer something else. Everything about Apple’s Vision Pro is made to customize to your face, your head, your eyes, and even your ears.

And More

View of a content screen on the Vision Pro

For security, Apple’s opting for “Optic ID,” a new identification system that uses a detailed scan of your irises to identify you. They haven’t compared it to Face ID or Touch ID as far as security is concerned, but a detailed iris scan can be very secure.
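
Apple hasn’t detailed the developer story yet, but it seems reasonable to assume Optic ID will surface through the same LocalAuthentication flow apps already use for Face ID and Touch ID. Here’s a minimal sketch of that existing flow; the reason string is made up, and whether Optic ID plugs in exactly like this is an assumption on my part.

```swift
import LocalAuthentication

// Standard biometric check; on the Vision Pro, the system biometric would presumably be Optic ID.
func unlockSensitiveData(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        // No biometrics available (or none enrolled); fall back to a passcode or deny.
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved documents") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```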

Marques Brownlee also pointed out a unique blurring feature in his first impressions video. Items you’re not focusing on are rendered in less detail and appear slightly blurred. This not only helps battery life by concentrating processing power where you’re looking, it also makes the headset feel less busy and distracting.

The Vision Pro is Apple’s vision for the future. It looks promising, though it doesn’t have many unique or practical uses just yet. Third-party app developers will be vital to making the Vision Pro a “must-have” accessory. We’ll be waiting to find out more about the Apple Vision Pro in the lead-up to its release in early 2024.