For The Talk Show Live at WWDC this year, John Gruber was joined by Apple executives Craig Federighi and Greg Joswiak. The trio talked about Apple’s announcements yesterday, including the Mac Pro, iOS 13, iPadOS, and much more.
Mac Pro + Pro Display XDR
Regarding the new Mac Pro, Gruber first pressed Joswiak and Federighi on the price of the wheels – an optional accessory that Apple showcased alongside the cheese grater tower. Naturally, Federighi wouldn’t offer a firm price, instead jokingly telling Gruber that it would depend on how many wheels he wanted and that installment plans would be available for the wheels.
Gruber also said that he’d heard from a source that the new Mac Pro “will ship with the most insane packaging” that Apple has done in a while. Joswiak’s response? “You can imagine it won’t be shitty packaging.”
As for the sheer power of the Mac, Joswiak explained that Apple knew it had to create something “really, really great” for its pro users:
We knew we had to create something special. If you ever go walking the halls at CES, you can buy a case from a Taiwanese company with a chip inside. We wanted something really, really special and designed for our pro customers, including the display.
Joswiak also offered some detail on the nano-texture design of the new Pro Display XDR, noting that Apple has dramatically undercut the price of the “gold standard” reference display:
The normal matte process causes a sparkle. This is a process unlike anybody’s done before. It handles the glare in a way that no one has done before; it’s an incredible process that we’ve invented. The standard [Pro Display XDR] has an industry-leading anti-reflective coating.
The gold standard costs $43,000. What’s amazing about it, is that reference display can only keep the brightness for a very short period of time. Within seconds, it has to go amber and say it no longer trusts that image. We’ve blown that away at just over a tenth the price.
After talking about the new hardware introduced at WWDC, Craig Federighi dove into Apple’s new Project Catalyst, which allows developers to easily bring their iPad apps to the Mac. Federighi noted that if a developer simply ticks the “Mac” checkbox in Xcode, they’ll get some degree of “Mac-ification” right off the bat, but that they can then fine-tune the result for a true Mac experience.
It’s a fully native framework, and we have an appropriate set of controls so you can build a really distinctive experience. If you just push the Mac button, you’ll get some degree of Mac-ification. You don’t have to rewrite all of the code to do that. You can have one code base and one team. It’s a parallel native framework set for the Mac.
In defense of the first set of Catalyst apps that were released by Apple last year, such as Apple News and Home on the Mac, Federighi said that some of the complaints people voiced were actually about interface design decisions the development teams made, not about the Catalyst framework itself. Over time, Federighi says, Apple has learned how to strike a balance in the design of these types of media-oriented applications on the Mac:
When we released the first set of apps using Catalyst, some of the concerns that were voiced placed a certain amount of focus on the technology, but those were really design decisions we made. Those were pure design decisions: different design teams pushing the bounds of what the future of media-oriented design looks like. I think we’re finding our balance there, pulling back in some areas. And the underlying technology has improved.
Joswiak also offered some new information about the development of the new Voice Control feature in iOS 13 and macOS Catalina. He called the work on this Accessibility feature “some of the most touching things we do.” Joswiak also noted that many of the people on the Accessibility team have a wide range of abilities, saying that “we’re living this in house and these technologies really impact the people at Apple working on the projects.”
One of the most popular announcements yesterday among developers was Apple’s new SwiftUI framework. On The Talk Show today, Federighi called SwiftUI a “generational kind of development” for Apple because of how it treats the programming interface itself as a design exercise:
I view our APIs and syntax as our language to be every bit the design exercise of a user interface. The way you express yourself in code should be as thought-out as something you interact with.
Apple fit a lot of information in its WWDC keynote yesterday, and Joswiak shared some color with Gruber on how Apple goes about building its keynotes:
We could spend longer on everything. When we first put these keynotes together, they’re three hours. We try to keep things down to 2:15. We have to figure out how many slides to create. We have mapped every presenter to see how many slides they can do per minute. The fastest one by far is Craig. In his prime, Craig would do 9 slides per minute. He’s slowed down to about 7 as he’s gotten older.
As for iOS 13, Federighi said that one of the important goals with the new Photos app is to help users better surface their major life events amid a growing sea of photos:
You just never have these experiences in a sea of photos. The team has gotten so much more advanced over the years: what’s the arc of a meaningful event for you? What were the big events that were important to you?
Federighi also noted that Apple continues to focus on its on-device machine learning strategy, something that the “other guys” are finally starting to catch on to. He noted, however, that on-device machine learning is easier when you have a consistent hardware base, something that companies like Google lack:
In fact, if you watch recent events from the other guys, you’d be surprised to see they’ve started to say on-device machine learning. They’re actually seeing the light on that topic. I think they’re disadvantaged because part of what makes this possible is building this great hardware and the integration of hardware and software. Pulling this off between a random fleet of devices, it’s really just impossible.
Federighi said that “having your phone know you is cool, having some cloud person know you is creepy.” That’s, of course, a clear shot at other companies that send data to the cloud for AI and machine learning.
Speaking on the decision to launch iPadOS 13 as the new operating system for the iPad, Federighi explained that while it’s a marketing thing on the surface, it’s something the engineering teams also felt strongly about:
Even though it’s a marketing thing, engineering felt very strongly on this one. We’ve been on this trajectory for the iPad from the outset, co-developing with this incredible hardware. What do you want to do with a device that has these kinds of characteristics? Things like Split View and Slide Over and Apple Pencil, and then you start to see Apple and developers tailor the experience.
When asked why it took so long to bring USB drive support to iOS, Federighi explained that it really boiled down to security:
From a security architecture point of view, we did not want to have file system drivers running in the kernel communicating with external media that could have been tampered with. Getting all of our file systems to be isolated from the kernel, it was a real hardcore engineering effort.
He also touched on the new text controls and gestures in iPadOS 13, explaining that this was an area where it still felt easier to do things on the Mac:
It is one of those areas in the past where we felt like this is harder than doing it on a Mac. I think when we first introduced text selection, copy and paste, and undo on iPhone, it was extremely ‘let’s teach you how to do this.’ There was an instructional interface. Getting to the right solution here, it’s something we have taken runs at for multiple years and came back and felt like we didn’t have it. We wanted to get it right and it took a lot of care to craft it.
The iPad experience has to be one that everyone can understand, but there can be depth that you can discover, so you can become a pro and really accelerate your work.
Regarding Apple’s requirement that apps implement Sign in with Apple if they also support other social sign-ins, Joswiak explained that Apple felt responsible to offer customers a more private log-in option. He repeatedly emphasized that it comes down to transparency and control for the user:
You want transparency and control. In the situation of these buttons, there’s no transparency. People had no idea what information was flowing through that tap. We wanted to provide that transparency and control. With normal authentication, there’s no transparency. In the event that an app is providing those means to log in, we should offer customers this more private log-in. We want to give those users transparency and control over authentication.
Lastly, Joswiak reiterated that privacy is something that Apple has been focusing on for a long time, “before it was popular.”
We’ve been doing privacy since before it was popular. If you look at some of these other companies, it’s not at the core of who they are. We’re building stuff that we would want for ourselves, our families, our children. I don’t want to be tracked. I don’t want my family to be tracked.
The full video and audio from The Talk Show with Craig Federighi and Greg Joswiak will be available tomorrow.