WWDC was absolutely packed, to absolutely no one’s surprise. Sure, the Memoji audience was cheesy, but it was kind of fun to see and wasn’t overdone in my opinion. But let’s cut to the chase, because we have a lot of ground to cover. Here’s everything you need to know about WWDC 2021.
iOS 15

WWDC started out with iOS, which is kind of interesting because they usually end with that. Apple said the team had four goals with iOS 15: staying connected, finding focus, using intelligence, and exploring the world. Everything in the presentation was organized around those.
Starting with Staying Connected, they talked about FaceTime and Messages. FaceTime is getting some huge updates. While a number of them bring it in line with other chat apps like Zoom and Facebook Messenger, some really stand out, particularly for Apple. The biggest news is that FaceTime will now work in modern web browsers! That means you can join a FaceTime call from someone else’s Apple device through Safari, Chrome, etc., and users on Android and Windows will also be able to participate in FaceTime calls via their web browsers without needing to install an app. This not only makes FaceTime more cross-platform, reducing a barrier to entry for the service, but lets Apple get around building dedicated apps for those platforms. It could also pave the way for something like iMessage in the browser in a future release of iOS and Mac OS.
This is not the only new feature FaceTime is getting. FaceTime users can also share FaceTime links, which operate similarly to Zoom or Messenger links: they can be shared via messaging apps or put in calendars, and clicking one opens FaceTime and starts the call from there. Keeping in line with this, FaceTime can now share screens on Mac, iPhone, and iPad, share music from Apple Music, and share video from a variety of services that integrate the SharePlay API. Some of these services include Disney+, Hulu, PlutoTV, TikTok, Twitch, and more. Said videos can also be AirPlayed to another screen for watch parties while the call continues on the phone or iPad. Hopefully these videos will work better and run smoother using this API than the existing watch-together options in other services. Finally, FaceTime can now shoot “Portrait Video”, which is a fancy way of saying it will blur the background and only focus on your face and body.
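For developers, SharePlay is exposed through the new GroupActivities framework. Here’s a minimal sketch of what adopting it might look like; the activity name and its properties are my own illustrative placeholders, not Apple’s sample code, so treat this as a rough outline rather than the definitive integration.

```swift
import GroupActivities

// A hypothetical SharePlay activity for watching a movie together.
// GroupActivity types are Codable, so simple stored properties like a title work out of the box.
struct MovieNightActivity: GroupActivity {
    let movieTitle: String

    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = movieTitle
        metadata.type = .watchTogether // tells the system this is shared video playback
        return metadata
    }
}

// Called when the user taps a hypothetical "Watch Together" button.
func startSharedPlayback() async {
    let activity = MovieNightActivity(movieTitle: "Some Movie")
    do {
        // If the user is on a FaceTime call, this offers to start the activity for everyone on it.
        _ = try await activity.activate()
    } catch {
        print("Could not start SharePlay: \(error)")
    }
}
```

The heavy lifting of actually keeping playback in sync across participants happens elsewhere (through the media playback coordination APIs), so the snippet above only covers kicking an activity off.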
Lastly, FaceTime is getting some audio improvements too. You can have the audio from your device’s mic come through either in Wide Spectrum mode or in Voice Isolation mode. In the former, FaceTime will try to pick up all the sounds around you to give the people on the other end of the line the full experience of what it’s like on your end. Voice Isolation, alternatively, has the system try to block out all noise except the speaker’s voice. When shown in the video, the presenter sounded a little blown out by the machine learning applying the focus, though isolating a voice with software alone is still a difficult task. This likely won’t be a full replacement for having a good quiet place to speak, but it could help. We’ll have to wait for the feature to be released before we can say for certain. Apple is also pushing more Spatial Audio by adding it to FaceTime, so it sounds like the audio is coming from the part of the screen where the speaker is. Oh, and there’s a new grid view for FaceTime group calls.
Messages did not have nearly as much, but it did get a few new things as well. If you share multiple photos in Messages at the same time, they will appear in a stack that you can quickly swipe through. There is also a new “Shared with You” section in the Photos, Podcasts, Music, News, Safari, and TV apps, so that any images, links, etc. shared with you in Messages will appear in those sections with the person who sent them tied to that article, pic, etc. Apple did note that in the case of pictures, things like GIFs and memes will tend to be ignored. You can also pin messages with things shared with you, so you can more easily find them later.
Next was the Finding Focus section, which covered changes to Notifications and the new Focus mode.
Notifications will now show contact photos for messages when possible, and the icons displayed will be larger to help show the app or person that created the notification. But more than that, iPhones will now be able to organize your notifications into a Notification Summary delivered at later points in the day, with the things most important to you at the top and less important ones toward the bottom. Messages from individuals won’t be included in these summaries and will continue to come to you normally. That said, if you put your phone in Do Not Disturb, now called Focus, the people messaging you will be notified that you aren’t receiving notifications at the moment and will get to them later (similar to DND While Driving, though with a slightly different prompt). But like DND While Driving, senders can push a notification through anyway in an urgent situation (and presumably this will extend to VIPs/Favorites in your Messages).
The new Focus mode, which is replacing Do Not Disturb, is a much more powerful version of it. Like DND before it, notifications will be silenced either for a period of time or indefinitely. However, you can now set up multiple Focus modes, both pre-made and custom ones. When you enable a Focus, you can assign apps and Home screens to it, so only the apps and widgets you allow in that Focus are shown. This means not only will you not get notifications from errant apps, but you won’t even see apps that are outside that Focus. Likewise, Focus is coming to iPad and Mac, and once enabled on one device it will be active on all your other Apple devices at the same time.
Next was the Intelligence section, where Apple focused on improvements to Memories in Photos, Spotlight, and the new Live Text feature.
We’ll start with Live Text. This new feature will be available in both the Camera app and in any photos you have or receive on your iPhone. If there is any text that the phone can detect, the device can now read and act on that text. In the Camera app it’ll be accessible via an icon in the bottom right corner of the screen. You can then copy and paste text that’s legible in the photo, call phone numbers listed there, and even search for an address in Maps.
Live Text also extends into the descriptions of photos. While you’ve been able to have your iPhone describe pictures to you aloud, Live Text can now bring that to you in written form. Likewise, you can then search for photos via their text and contents in Spotlight. Apple said that Spotlight can look up photos based on locations, art, animals, and some other criteria. Admittedly a lot of this is similar to Google Lens, but it’s good to see Apple taking this kind of feature, making it work on device, and adding some refinements (for example, calling a phone number found in a picture seems a bit quicker).
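Incidentally, Live Text doesn’t appear to be something third-party apps call directly; it’s baked into the system. But if you’re curious what this kind of on-device text recognition looks like in code, the Vision framework (which has been around since iOS 13) offers a rough approximation. This is just a sketch of the general idea, not how Apple implements Live Text.

```swift
import Vision
import UIKit

// A rough approximation of on-device text recognition using the Vision framework.
// This illustrates the general idea behind Live Text; it is not Apple's implementation.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate string for each detected region of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate // slower but more precise, still entirely on device

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```

From there an app could hand recognized phone numbers or addresses off to the usual data detectors, which is roughly the experience Live Text gives you system-wide.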
Since we mentioned Spotlight, these photo searches aren’t the only new thing there. Other info will now get better highlights too. If you look up actors, movies, or TV shows in Spotlight, you’ll now be presented with bigger photos of the actor’s face or the poster for the media, as well as better information cards below that.
This also extends to your contacts. If you search for a contact, you’ll now be presented with their contact card writ large and quick action buttons for calling them, messaging them, or navigating to their address.
Lastly, going back to Photos, the Memories and For You sections have seen some tweaks as well. Now when it creates memory slideshows, Photos can automatically select music it feels fits the moment, with transitions tied to beats in the music. Special filters can be applied as well. And if you don’t like any of them, you can set your own tunes and filters.
The last section was Exploring the World, which came down to Wallet, Weather, and Maps.
Wallet is continuing to grow into the one app for all your cards, keys, and IDs (hey, that rhymed). Apple announced BMW and other car manufacturers would be integrating Ultra Wideband into their cars, so you can store digital car keys to unlock and manage your car from your phone, much like they began to do a while back. They also showed off unlocking Hyatt hotel rooms, corporate badges, and even your own home with digital keys stored in Apple Wallet. The real surprise was that Apple is rolling out support for IDs and driver’s licenses in participating U.S. states that can be used at TSA security checkpoints, meaning that one day you might be able to keep the keys to your car, your driver’s license, your car’s insurance card, and your fuel card all on your iPhone and all in one app. That’s likely a ways off, but it’s kind of neat to think about.
Next, the Weather app. While the new additions are many, I’d argue they’re long overdue. The app is getting improvements to the weather backgrounds to better reflect the weather and time of day, and some new information tiles like barometric pressure, wind direction, and more. But the most noticeable change is that the built-in Weather app is finally getting maps! Specifically, you’ll be able to see radar for rain and snow, temperature, and air quality inside the Weather app, the lack of which has been the biggest thing keeping people from using it seriously. While I still enjoy Dark Sky, it’s good to see Apple actually using some of its technology beyond just making the Weather app more accurate.
Maps is last but definitely not least. The enhanced Maps experience, with 3D layouts for cities, is coming initially to Spain and Portugal, with Italy and Australia following later this year. It also comes with some further improvements. For one, these views will have more color options to help delineate areas, a night-time mode that better shows how the city might look in the moonlight, and topography for different points in the city. Districts like commercial zones and marinas can also be highlighted and called out.
Roads are getting a lot more detail too, with Apple showing not just landmarks and buildings but even trees and other stable items alongside the road while driving. Cities will now have better detailing for things like bike and bus lanes. On highways and interstates, overlapping junctions will be better labeled and shown on the map in a 3D layered style to better show where you are and where you need to get on or off.
If you prefer public transit, don’t worry. Transit users can now pin favorite stations, stops, and routes to the top of their lists for quicker access. While riding, Maps will better follow your route and notify you when you need to prepare to get off. And when you do get off, Maps will let you scan the area with your camera so it can tell you what direction to head after getting off the train or bus. Apple made it seem like this was more focused on city landmarks, so subway systems might not benefit as much, unless the new Live Text feature can take advantage of subway signs. Those users of subway systems might want to stay with apps like CityMapper for the time being.
Safari on iOS and iPadOS is getting a few things as well, though for the bigger changes you’ll want to check out the Mac section. First, Safari is adopting the WebExtensions API, meaning that developers who produce extensions for Safari on Mac can also easily make iOS versions that integrate just as well into the mobile browser. This will make Safari feel even more powerful, especially on the iPad, which already has the desktop experience enabled out of the box.
On the iPhone side, the URL bar is now at the bottom of the screen, and you can swipe across it to switch between tabs, similar to how you swipe through apps on modern iPhones.
AirPods
AirPods were next up on the list. Apple talked about bringing better audio to the AirPods in the form of Spatial Audio when they’re used with M1 Macs and Apple TVs, as well as improved audio functionality with Apple Music.
A bigger feature was what Apple called “Conversation Boost”. When enabled, AirPods users will be able to focus on the voice of the person directly in front of them, using the internal sensors to zero in on that angle while minimizing the ambient noise in the room. This seems to mostly be for the AirPods Pro though, so those of us with regular AirPods likely won’t get to use it. And while this seems like a hearing aid feature, Apple noted it was only for people with mild hearing difficulties. Still, it’s there if you’d rather look cool wearing AirPods instead of hearing aids while, in fact, using them as hearing aids.
There are some other features coming to AirPods. They’ll have better Find My integration: when lost, they can send out a Bluetooth ping that other Apple devices can pick up and use to help you locate them. Then once you get close, Find My will go into Proximity View, much like with AirTags, to help direct you toward your lost Pods. And since they lack Ultra Wideband, they can still beep to help you zero in on them.
Lastly, AirPods can now better read out lists, like grocery lists, as well as incoming notifications as they arrive on your phone. You can disable this entirely or choose only certain apps you want read out.
iPadOS
iPadOS was a hefty update all on its own, with some big changes to the interface, multitasking, and how it can be used for development.
Let’s start off with some interface changes that will be familiar to iPhone users. First, the iPad is getting access to widgets anywhere on screen, rather than only in the panel off to one side of the screen as it is today. With that, there is also a new extra large widget size option for developers that will be exclusive to iPads. Alongside these widgets comes the App Library, essentially the master list of apps on your device, which lets you take apps you don’t want off your Home screen. And lastly, you’ll be able to delete and reorganize whole pages of apps.
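On the developer side, that extra large size presumably shows up in WidgetKit as just another widget family alongside the existing small, medium, and large ones. Here’s a bare-bones sketch of a widget opting into it; the widget itself is a made-up example, and the main point is the extra large entry in the supported families list.

```swift
import WidgetKit
import SwiftUI

struct SimpleEntry: TimelineEntry {
    let date: Date
}

struct Provider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: Date())
    }
    func getSnapshot(in context: Context, completion: @escaping (SimpleEntry) -> Void) {
        completion(SimpleEntry(date: Date()))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        // Refresh roughly once an hour; a real widget would build a richer timeline.
        let entry = SimpleEntry(date: Date())
        completion(Timeline(entries: [entry], policy: .after(Date().addingTimeInterval(3600))))
    }
}

@main
struct ClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "com.example.clock", provider: Provider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
                .padding()
        }
        .configurationDisplayName("Clock")
        .description("Shows the current time.")
        // The extra large family is the new iPad-only size; the others already exist.
        .supportedFamilies([.systemMedium, .systemLarge, .systemExtraLarge])
    }
}
```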
Another tool getting a revamp is the Notes app, the biggest change being the Quick Note function. Swiping up from the bottom right corner of the screen will open a small notepad that hovers on top of the app you’re currently using. Here you can jot down notes by typing, writing, or drawing. The notes are context aware, too; the example given was that if you brought up Quick Note while in Safari, a link to the webpage you’re currently viewing would be part of the note. These links can even be made to take you to specific sections of a webpage or app. Once you’re done, you can close out of the Quick Note and find it later in your Notes app. You can only create Quick Notes on iPadOS 15 and on Macs running Mac OS 12 “Monterey” (more on that later), but you can read them on iPhone via the Notes app.
Across all of these platforms, sharing notes improves as well. Shared notes will now let you mention someone to get their attention and show an activity timeline of changes. You can also use hashtags anywhere in a note to make it more searchable.
The Translate app is also coming to the iPad and getting integrated system-wide. One of the new features is a continuous conversation mode, meaning the app can detect the two languages being spoken and translate each into the other on the fly; there’s no need to press a button, wait for the translation to finish, and then go the other way. The app will also let you scribble with the Pencil inside the app to write out the word or phrase you want to translate. And the Translate function will even work within other apps and within the Live Text function in Photos.
The last two features are perhaps the biggest changes. Going back to the interface for a moment, one of the biggest sticking points on the iPad has been its multitasking limitations, which Apple is remedying somewhat. In version 15, you can tap the top of any app to bring up a multitasking menu where you can make the app full screen, split the screen in half with another app, or give one app three quarters of the screen and another the remaining quarter. If you’re using a multi-pane app like Mail, you can bring up certain features, such as composing a new email, in a pane that floats above both apps as a separate layer.
Now all of this is fine, but the next change to multitasking is a little more interesting and may take some getting used to. The iPad is getting a new feature called the Shelf. Here you can minimize and store open app windows (split-screen or otherwise). These split views can also be accessed in the multitasking view, and all of this can be controlled via keyboard shortcuts. Admittedly, going between the Shelf and the multitasking view seems a little confusing at first, but perhaps once we get hands-on with it, it will make a little more sense.
Swift Playgrounds is the last app with a major change. For starters, Swift Playgrounds will have better code completion tools and new guides to help you build your first app. The code you write in Playgrounds with SwiftUI will also be compatible with Xcode. But here is where we really see the shift toward the iPad becoming more on par with the Mac: using Swift Playgrounds and SwiftUI, you will be able to compile and run your own apps on the same iPad you code them on, and then submit them to the App Store from your iPad. No Mac needed. While many serious developers will likely still be using Xcode for a long time, for many people starting to code in Swift, this is a major step toward getting them started and getting their apps into the App Store. Apple still views Xcode and the Mac as the truck, the serious workhorse (more on that later), but this really is a major shift toward the iPad becoming more of a creative tool rather than just a consumption tool.
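To put that in perspective, a complete SwiftUI app really can be just a few dozen lines, which is why building and shipping from Playgrounds on an iPad is plausible. The example below is a generic counter app of my own, not Apple’s actual Playgrounds template, but it’s the kind of thing you could write, run, and iterate on entirely on the device.

```swift
import SwiftUI

// A tiny but complete SwiftUI app: the whole project is this one file.
@main
struct CounterApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var taps = 0

    var body: some View {
        VStack(spacing: 16) {
            Text("Taps: \(taps)")
                .font(.largeTitle)
            Button("Tap me") {
                taps += 1
            }
            .buttonStyle(.borderedProminent) // one of the new button styles in this year's SwiftUI
        }
        .padding()
    }
}
```

Because it’s plain SwiftUI, the same file opens in Xcode on a Mac without changes, which is exactly the round-tripping Apple described.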
Privacy
Privacy got its own section at this WWDC. At first the changes don’t seem as big as last year’s, but I’d argue they fill in some of Apple’s most glaring privacy gaps.
First, let’s talk about Siri. One of Siri’s greatest benefits and curses is that it’s more private than Google Assistant or Alexa: great if you’d like to use these voice assistant features, but also hampered by the lack of data it can safely collect. And one of the most glaring holes has been that all Siri commands have had to go to Apple’s servers, even commands that have nothing to do with the Internet, like opening an app, placing a phone call, or turning on the flashlight. This hole is finally getting fixed in iOS 15. Speech processing is now fully on device, meaning your speech and requests will be analyzed and processed on your phone first, and only commands that need the Internet for information (such as looking up movie info, weather, traffic, etc.) will be sent out. That’s less data going to Apple and faster responses for many commands. To be clear, if you allow your device to send anonymized data about Siri usage back to Apple, some of these commands will still go to Apple for training and analysis, but that is optional and ultimately a smaller amount of data going back to Apple. And it brings Apple closer to parity with Google and Amazon.
Apple’s built-in Mail app is finally getting the ability to block tracking pixels. These are pixels typically hidden in images inside HTML emails, like those that go out in ad fliers and newsletters, that quietly let companies know whether the email was opened, when it was opened, and your IP address when it was opened. The only way to prevent this in Apple’s Mail, without switching apps of course, has been to disable loading HTML and remote images by default (which isn’t necessarily a bad thing from a security perspective), but that makes many emails unreadable. Now Apple is finally giving us the option to load these images while blocking the tracking elements.
Speaking of keeping your IP address from trackers, Safari is getting an improvement to help block your IP address from being sent to these trackers. Given it’s trying to block tracking elements anyway, I’m interested to see how this works and how it impacts browsing going forward. It’s something that will just need to be played with for a bit to see what it’s doing.
Last year Safari got a privacy reporting feature to let you know what trackers had been detected and blocked. With iOS 15, apps on your iPhone will get the same treatment. You’ll now get an App Privacy Report covering the last 7 days, where you can see how often apps used some of the privacy-sensitive functions on your device (things like microphone access, camera, location, etc.) and what third-party services each app is reaching out to. To be clear, this only covers third-party services, so data going back to the app maker or service provider isn’t covered, but it can help give you an idea of what’s going on on your phone.
Lastly, let’s talk about iCloud. All iCloud users will now be able to set Account Recovery contacts and Legacy Contacts. Say your account is compromised or you forget your password: you’ll now have the option to have a code sent to another Apple device you own OR to a person you trust. That person receives the code, which you can then get from them to unlock your account and regain access. The idea is that you set a contact you trust who is essentially vouching for your identity and your need for access. In reality, I imagine this will be used by a lot of people who provide tech support for their family.
Legacy Contacts will operate in a similar way. You can set Legacy Contacts who can receive your iCloud data, like mail, photos, iCloud Drive files, etc., when you pass. These contacts will send a request to Apple, which will check to confirm you’ve passed and then grant them access (likely Apple will send notices to your email and devices when access is requested and ask you to respond if you’re alive, to prevent someone from getting that access prematurely against your wishes). This will be available to all users for free.
iCloud+, however, is a change to the paid service, replacing the existing iCloud storage plans. If you pay for iCloud storage, you’ll still get the extra storage space, but you’ll also be getting three new features. The good news is, the iCloud storage tiers and prices are staying exactly the same.
First will be Private Relay, which sounds very much like a VPN with some extra features, though it’s technically not a VPN. When active, all the traffic leaving your device will be encrypted and sent across two separate relays so that neither Apple nor anyone else can see the whole picture of what you’re doing. One relay goes through Apple, which knows who you are but not what your traffic is doing, while a third-party vendor directs your traffic without knowing who you are. Apple has not said who this third party is yet, but said it will be revealed after the feature launches. It’s worth noting too that Private Relay will not be available in all countries, with countries such as China, Belarus, and a few others being excluded from the list.
The second feature will be “Hide My Email”, which will make it easier to create disposable addresses that forward to your actual iCloud email address. While you could already do this using “Sign in with Apple”, you had only a limited way to do it for other services, and that topped out at 5 email addresses. Now you can do this when signing up for newsletters or giving your email to other people, handing them a new address that still forwards to your true iCloud email account; you can easily close that address and stop getting emails from them should it get abused.
The last new feature of iCloud+ will be expanded HomeKit Secure Video if you’re using any compatible cameras. The short of it is, none of the video access and storage will count against the iCloud storage you’re paying for.
And for everyone, when transferring data to a new iPhone or other Apple device, Apple will give you extra temporary storage to help you move everything over.
Health
Next on the list was a brief talk about some new Health features, many of them tied specifically to the Apple Watch and iPhone. Apple started by highlighting an interview about its collaboration on the Corrie Health app for heart attack patients, which helps reduce paperwork and improve patients’ understanding of their health and their work with doctors. According to the Corrie team, this app and collaboration reduced re-hospitalizations for heart attack patients by 52%. It’s certainly a puff piece, but anything that reduces hospital visits and helps get patients in better shape is worth talking about.
Getting into the features, your iPhone and Apple Watch will soon have a new Walking Steadiness metric. While they’ve been able to measure your steps and gait, they will now be able to monitor those and determine how stable you are while moving around, based on step length, speed, motion, etc., and whether your overall likelihood of falling has increased or decreased. If your steadiness is declining, the Health app can recommend exercises to improve your stability and strength and reduce the chances of falling.
To go along with this, Apple is introducing Trends in the Health app. Your iPhone will be better able to analyze trends and changes in things like your resting heart rate, steps taken, overall activity, etc. There are some other metrics it can monitor for trends that are more tied to the new Apple Watch features, which we’ll talk about later.
The Health app’s changes also improve not just your own health data, but the data you receive from other sources and how you share it. When you get lab data, either submitted yourself or received in conjunction with your physician, the Health app can provide better descriptions of your lab reports, such as what LDL or HDL cholesterol are and whether your numbers look good or are improving based on factors like age, sex, etc. That should make it easier to understand what the lab reports mean, whether your health is improving, and what steps to take to keep getting better after you’ve consulted with your physician.
Lastly, Apple wanted to talk about sharing your data, both with your doctor and with others. Apple Health is expanding the ability to share select data with your physician through their health record system, both on a regular basis and for specific measurements you may have questions about. This seems helpful, but I also imagine some doctors potentially getting frustrated by very nervous or hypochondriac patients.
Apple is also allowing you to share select health data with family members, highlighting use cases for both kids and elderly parents. The sharee won’t have access to all of the sharer’s health data, only things like steps, exercise, resting heart rate, and some other data, as well as trends for said data. Likewise, the sharer can adjust what they’re sharing at any given time. I could see this being particularly helpful for those monitoring family members with particular health conditions, those helping elderly or injured parents, or those trying to work with their family members to get in shape.
Apple Watch
Coming right on the heels of Health is everyone’s favorite health accessory (maybe), the Apple Watch. Many of the Apple Watch’s newest features are tied to the new health functions, though there are also some much-needed refinements and additions to the Watch separate from those.
Starting with the Breathe app: it will soon be called the Mindfulness app. While you can still do relaxing, meditative breathing exercises as before, with new animations to boot, you’ll also have the new “Reflect” function. In this mode, the Watch will prompt you to think of something specific while going through the breathing exercises, such as someone you love or something that brings you joy.
Speaking of breathing, the Sleep monitor function will now detect your breathing rate while sleeping, which Apple says can be used not only to help monitor your sleep quality, but also to detect other health problems since your nighttime breathing rate tends to be more consistent than your daytime breathing.
On the more active fitness side of things, Apple is adding Tai Chi and Pilates workout types to the Apple Watch and bringing a new coach, Jeanette Jenkins, to Apple Fitness+. And while you’re working out, there’s a new Artist Spotlight series, where workout routines are set to the songs of a specific artist. Apple showed off Lady Gaga and a few others, with more to come later.
Apple mentioned that the most popular Watch face is the photo face, and it’s being enhanced with a new Portrait watch face. It takes Portrait Mode photos and turns them into multi-layered faces, dynamically adjusting the elements of the photo as you use the Digital Crown. The Photos app on the Watch is also getting redesigned with a new mosaic layout, and it will be easier to share photos to Messages and Mail from it.
Lastly, one of the biggest personal gripes for me has been the response interface when composing messages on the watch. You’ve only been able to scribble OR dictate OR send emojis, not mix them together. Now you’ll be able to dictate, scribble, and add emojis all in the same message before sending it.
There are a few other additions too, such as the ability to set multiple timers at once on the Watch, a redesigned Music app, and more.
Home, HomePod, and Apple TV
Home, Apple TV, and HomePod got mixed together in a larger block.
First, for Home devices, Apple showed that Siri can now work through third-party devices, such as the Ecobee thermostat. To be clear, the device passes the Siri request to your HomePod, Apple TV, or other Home hub and doesn’t get any of your voice data. It just lets you trigger commands without needing a HomePod within earshot across the house.
HomeKit will continue to roll out support for the new “Matter” standard for cross-compatibility across smart assistants and services, to help more devices work with multiple assistants. Smart doorbells and cameras will get package detection. And your Apple TV will be able to show multiple camera streams at the same time and let you control the accessories around those cameras while viewing them.
Speaking of the TV side of things, you will now be able to ask Siri to launch shows on Apple TV from other devices. So you could ask your HomePod to start playing the next episode of your favorite show and have it come up on the TV. Presumably this will only be for services that integrate with the Apple TV app, so things like Hulu, Prime, and Tubi will work, but not YouTube or Netflix. Also, much like Shared with You in Messages, people can share shows and movies with you, and they will appear in a Shared section of the Apple TV app. And if you share your Apple TV with multiple people, meaning you all likely have different tastes in what you watch, you can check out the new “For All of You” row, which recommends something Apple thinks everyone will like.
Lastly, let’s talk briefly about the HomePod Mini. It will soon receive the Lossless streaming feature from Apple Music and will be able to act as a speaker for your Apple TV. The HomePod Mini will also be available in Austria, Ireland, Italy, and New Zealand later this year, including personalized experiences for users based on their voice.
Mac OS 12
The new Mac OS this year is version 12, code-named “Monterey”. Many of the new iOS features, like FaceTime links, SharePlay, and more will be in Monterey, but Monterey will have some of its own special features as well.
First, the Continuity feature is getting a big new upgrade with Universal Control, letting you use the mouse and keyboard on your Mac to control other Macs and iPads. I would highly recommend watching Apple’s demo to see the full effect, but Apple showed setting an iPad next to a MacBook and dragging the cursor to the edge of the Mac’s screen toward the iPad. Once it got past the edge of the Mac screen, a line appeared on the edge of the iPad screen, then the cursor popped over to the iPad, and the iPad could be controlled with the Mac’s keyboard and trackpad. Even more impressive, you can drag and drop files this way from your iPad to your Mac and vice versa. Apple even showed this working across an iMac, to a MacBook, to an iPad, and back. That is seriously impressive.
On a smaller note, AirPlay is coming to the Mac, meaning you’ll be able to cast video and audio from another device to a Mac. While for many people this might not mean much, it will be convenient in places like conference rooms.
Going back to big changes, Apple Shortcuts is coming to the Mac. The Mac has already had tools like Automator for this; Apple said it is transitioning from Automator to Shortcuts over the next few years, and existing Automator workflows can be imported into Shortcuts. While Automator hasn’t had a big community in recent years, that community has been vocal, and Automator has been a great tool for Mac power users. Hopefully Shortcuts will be just as powerful, if not more so. The Shortcuts app will also have Mac-specific functionality, like opening apps in Split View, and can be accessed from Spotlight, the Dock, the menu bar, and Siri.
The last major change: Safari is getting a big overhaul. First, the toolbar is getting compressed almost entirely into the URL bar, including extensions, reading mode, etc. Tabs will sit to the left or right of the URL bar of the open tab, with the color of Safari’s chrome shifting based on the primary color of the site. This is a pretty big and unique visual style for Safari, given just about every other browser puts its tabs above the URL bar and keeps all the toolbar elements the same beneath them. It looks cool, but it might come as a bit of a shock to some users.
Safari will also have Tab Groups that save and sync across iCloud, meaning if you’re researching something you can add a tab to the group, then close or reopen that group later as you wish. You can even see tabs being added to or removed from the group in real time. The feature will be included on iPad as well.
Lastly, your Safari start page will now sync across devices. Favorites on the page, backgrounds, etc. will now be the same across all your stuff.
Other
There are a few other things that are more in the developer category but still interesting to talk about. There are several new APIs and kits, such as RealityKit and ShazamKit, as well as Object Capture, Apple’s way for apps to do 3D capture of objects both with and without an iPhone. While some have pointed out this version of Object Capture won’t be as good as more professional and dedicated tools, for a lot of smaller developers it may help when they don’t have the tech to do it themselves.
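On the Mac, that 3D capture piece surfaces as RealityKit’s new PhotogrammetrySession: you point it at a folder of photos and ask it for a USDZ model. The sketch below is based on my reading of the API as announced, with placeholder paths and a placeholder detail level, so consider it an outline rather than production code.

```swift
import RealityKit

// A rough sketch of Object Capture on the Mac via RealityKit's PhotogrammetrySession.
// Paths and the detail level are placeholders.
func buildModel() throws {
    let imagesFolder = URL(fileURLWithPath: "/Users/me/Captures/Sneaker")
    let outputModel = URL(fileURLWithPath: "/Users/me/Desktop/sneaker.usdz")

    let session = try PhotogrammetrySession(input: imagesFolder,
                                            configuration: PhotogrammetrySession.Configuration())

    // Watch the session's output stream for progress, completion, and errors.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url)")
            case .requestError(_, let error):
                print("Capture failed: \(error)")
            default:
                break
            }
        }
    }

    // Ask for a medium-detail USDZ model built from the photos in the folder.
    try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])
}
```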
App Store developers can now have customized product pages, showing different versions of an app’s page to different users. There will also be better options for advertising in-app events in games and other apps, highlighted front and center on the App Store homepage and in search.
Lastly, the biggest developer news is the new Xcode Cloud service. Developers can now share their code in a dedicated cloud environment. When ready, they can compile their code in the cloud, freeing their Mac to work on other things. The cloud can then simulate running that code across multiple devices to warn of potential issues on target devices. The app can then be distributed to the other members of the team you’re working with. For privacy’s sake, Apple only stores the finished product and not the raw code. Developers will also be able to use TestFlight on the Mac for beta testing apps, similar to the way they can on iOS.
Apple did say there will be a cost associated with Xcode Cloud, but that it will be free during the testing period. Apple seems to be testing the waters with this first, though a number of developers seem excited about it from the rumblings I’ve heard. It will be in a limited beta for now and available to all developers next year.
Apple said the public betas for all the new OSes will come later this month, and all the final versions will be released in the “Fall”, probably the September time frame if Apple sticks to its normal routine.
Final Thoughts
This WWDC was packed, as seen by how long it took me to get this article out. Here are a few things I feel are major standouts.
- Apple is filling a lot of privacy gaps that have been present for a while, like on-device Siri, tracking pixel blocking, and the App Privacy Report. These were legitimate holes in Apple’s privacy position, so seeing them addressed is welcome. I do worry the App Privacy Report might scare some people with too much info, but maybe it’ll act as an impetus for people to take privacy seriously, or perhaps even spur further government legislation.
- iPadOS is definitely making multitasking more accessible and easier to handle, though I’m not sure myself how easy the Shelf will be to work with. This will be a wait-and-see sort of thing.
- WebExtensions on iOS’s and iPadOS’s versions of Safari are also welcome changes.
- Universal Control may be a niche feature, but it will likely Sherlock more than a few apps.
- FaceTime is still not at full parity with a lot of other communications apps, but it’s a lot more useful now that users who aren’t on Apple devices can at least join calls. I’d like to see whether this works with devices running older versions of Mac OS and iOS; otherwise that could be a pain point, but I imagine Apple has a way around this.
What were your favorite portions of WWDC? Did it blow you out of the water, or were you disappointed in something getting overlooked? Let me know in the comments below.