Shortcuts for Mac

June 10 2019

One of the most interesting rumors before WWDC was that Shortcuts might be coming to the Mac, which of course raises all kinds of questions re the future of automation on macOS.

WWDC came and went, and while Shortcuts got some huuge upgrades in iOS 13 (like a new conversational editor, and third-party actions with inputs and outputs), Shortcuts did not come to the Mac.

…or did it?

Turns out, Catalyst on macOS Catalina includes all the Shortcuts frameworks, including all the ones necessary to bring up almost its entire UI. So I built a dummy app that does just that, which you can find on GitHub.
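A rough sketch of what that kind of experiment might look like (the framework path and class name below are purely illustrative guesses, none of this is public API, and it could break at any moment):

// Speculative sketch: load a private Shortcuts framework shipped with Catalyst on Catalina
// and bring up one of its view controllers by name. Path and class name are assumptions.
NSBundle *shortcutsUI = [NSBundle bundleWithPath:
    @"/System/iOSSupport/System/Library/PrivateFrameworks/WorkflowUI.framework"]; // hypothetical path
if ([shortcutsUI load]) {
    UIViewController *editor = [[NSClassFromString(@"WFWorkflowViewController") alloc] init]; // hypothetical class
    // …present `editor` from the Catalyst app's root view controller.
}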

This test harness isn't really useful for anything except to explore the idea of Shortcuts on the Mac, how it would fit into the system, and who would want such a product. But it does make for a fun demo 😄

Shortcuts is a consumer product, a brand unto itself; I think many of us who are building libraries of Shortcuts on iOS would love a way to bring them to the Mac, and the many iOS apps coming to the Mac via Catalyst that will likely never adopt AppleScript (even though that's indeed possible for Catalyst apps) could benefit from it too.

Let's hope that the Shortcuts team finds the time in their very busy taking-over-the-world schedule to bring Shortcuts to macOS — submit feedback and make sure they know it's important to you, too!


Beyond the Checkbox with Catalyst and AppKit

June 07 2019

WWDC 2019 is here, and with it, a new way to write apps on the Mac: Catalyst. Catalyst is the developer-facing name for the technologies — formerly known as Marzipan — bringing iPad apps to the Mac by introducing UIKit and dozens more frameworks from iOS to the platform and unifying the substrate layer between iOS and macOS.

Catalyst apps are built from the iOS 13 SDK, with relatively few changes to the API set; simply click the 'Mac' checkbox for your project in Xcode to get started.

Click the checkbox

I previously documented just how far one could go with Mojave's version of UIKit if you jumped through enough hoops, including ways to integrate AppleScript and Services. There are many reasons why it will benefit your Catalyst-based Mac app to deeply integrate with system features.

So now that Catalyst is here and the SDK in our hands, how do we go about integrating with the Mac and AppKit to build the best possible Mac app with UIKit? Let's take a look…

Not available on UIKit for macOS

Catalyst apps are granted access to NSToolbar (part of AppKit) in the SDK, so that apps can create rich window toolbars, and NSTouchBar to integrate with the Touch Bar, but what if you wanted to spawn an AppKit window and AppKit view hierarchy to augment your app? Try to use any other AppKit class in your Catalyst app, and you'll rapidly run up against API_UNAVAILABLE_BEGIN(ios).

'NSWindow' is unavailable: not available on UIKit for macOS

Most AppKit classes are marked as unavailable to a UIKit app; at first glance, this seems pretty clear cut: the iOS SDK doesn't let you touch AppKit in your app.

However, the new unified process model in macOS Catalina means that your UIKit app is also an AppKit app. So how, then, could we use it?

Loading an AppKit bundle

Since Apple is making a point of treating Catalyst apps as 'true Mac apps', your app can do what any other Mac app can do: embed an AppKit-based bundle or framework, and load it at runtime as a plugin.

Simply add a new macOS Bundle target to your project.

Add Target: macOS Bundle


Embed your new macOS bundle in your app and make its inclusion conditional to macOS only.

Embed bundle

It's as simple as that! Now you have a loadable bundle that is built with the Mac SDK, and thus has access to all of AppKit. All you have to do is load it manually when you detect you're running on macOS.

NSString *pluginPath = [[[NSBundle mainBundle] builtInPlugInsPath] stringByAppendingPathComponent:@"AppKitGlue.bundle"];
[[NSBundle bundleWithPath:pluginPath] load];

Create a principal class for your bundle (remember to add it to your Info.plist too), then treat it like your entrypoint into AppKit land. Instantiate it from your UIKit code and you're good to go!
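For example, here's a minimal sketch of the UIKit side, assuming the bundle's principal class is a hypothetical AppKitGlue object declared in its Info.plist:

// pluginPath is the bundle path computed above.
NSBundle *appKitBundle = [NSBundle bundleWithPath:pluginPath];
[appKitBundle load];
id appKitGlue = [[appKitBundle.principalClass alloc] init]; // resolved via NSPrincipalClass in Info.plist
// Message appKitGlue through a shared protocol or plain selectors from here on.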


Using AppKit

With the unified process model in macOS Catalina, your UIKit app has an [NSApplication sharedApplication], just as you would expect in AppKit, and its windows array lists your app's windows just like it would for any NSWindow in an AppKit app. This gives you a bridge to modify your app window in ways that Catalyst otherwise makes impossible, like changing the window style mask, setting its position & frame onscreen, and setting a minimum/maximum window size.
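As a sketch of what that bridge can look like from the AppKit bundle (the frame and sizes here are arbitrary examples):

// Grab the host app's window and adjust it in ways Catalyst alone doesn't allow.
NSWindow *hostWindow = [NSApplication sharedApplication].windows.firstObject;
hostWindow.styleMask |= NSWindowStyleMaskResizable;
[hostWindow setFrame:NSMakeRect(200, 200, 1100, 700) display:YES];
hostWindow.minSize = NSMakeSize(800, 600);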

You are free to spawn new windows as you like, and load them up with AppKit NIBs or storyboards or whatever suits you best.

As everything is running in the same app, you're free to call selectors between UIKit and AppKit, but it might be best to use an intermediary class to pass messages in between the two, since you won't be able to import UIKit from AppKit code.

One thing you can't do is mix UIKit and AppKit layers in the same view hierarchy. It may be technically possible, but it's the kind of thing I would expect to break at any moment.

Another thing you cannot quite do is spawn a new NSWindow with a UIKit view hierarchy. However, your UIKit code has the ability to spawn a new window scene, and your AppKit code has the ability to take the resulting NSWindow it's presented in and hijack it to do whatever you want with it, so in that sense you could spawn UIKit windows for auxiliary palettes and all kinds of other features. This is clearly not an intended use of Catalyst, but we're here to push the boundaries and make the best software we can — like we expect from our fellow Mac developers!


So what else does using AppKit in your UIKit app enable you to do…? Let's go through Martin Pilkington's fantastic list of things to appreciate in AppKit and see where your Catalyst app can improve.


Auxiliary Panels & Sheets

You can spawn NSPanel instances as floating palettes for the controls in your app, like a tool palette. You can also present complex AppKit windows as sheets from your UIKit window.
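A minimal sketch of spawning such a palette from the AppKit bundle (the size, title, and style mask are just examples):

NSPanel *palette = [[NSPanel alloc] initWithContentRect:NSMakeRect(0, 0, 240, 320)
                                              styleMask:(NSWindowStyleMaskTitled |
                                                         NSWindowStyleMaskClosable |
                                                         NSWindowStyleMaskUtilityWindow |
                                                         NSWindowStyleMaskNonactivatingPanel)
                                                backing:NSBackingStoreBuffered
                                                  defer:NO];
palette.floatingPanel = YES; // float above the app's regular windows
palette.title = @"Tools";
[palette orderFront:nil];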

If you want to be a little more adventurous, you can listen for new window spawns and use that to hijack UIKit-spawned windows for your own purposes — perhaps to make them a non-activating floating panel, or to present them as a window sheet.
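One way to do that, sketched below, is to observe window notifications and adjust the windows you recognize as yours; how you identify them (by title, size, or otherwise) is up to you:

// Watch for windows becoming key and turn UIKit-spawned ones into floaters.
[[NSNotificationCenter defaultCenter] addObserverForName:NSWindowDidBecomeKeyNotification
                                                  object:nil
                                                   queue:NSOperationQueue.mainQueue
                                              usingBlock:^(NSNotification *note) {
    NSWindow *window = note.object;
    if ([window.title isEqualToString:@"Palette"]) { // hypothetical marker title
        window.level = NSFloatingWindowLevel;
        window.hidesOnDeactivate = YES;
    }
}];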

Proxy Icons

Set representedURL or representedFilename on your primary window to add a proxy icon to your currently open document. This will add an icon to your window titlebar to represent the document you point to, and a user can drag and drop from this proxy icon to another app or somewhere in Finder. You can also command-click the proxy icon to show the document's full path.
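A one-line sketch, with a placeholder document path:

[NSApplication sharedApplication].mainWindow.representedURL =
    [NSURL fileURLWithPath:@"/Users/me/Documents/Drawing.sketch"]; // placeholder path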

Color Pickers

You now have a bridge to use NSColorPanel in your UIKit app, so that you get a rich, native color picker and use it to drive your UIKit UI.
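A sketch of hooking it up; colorChanged: is a hypothetical method on an intermediary object that relays the panel's color back to your UIKit code:

NSColorPanel *panel = [NSColorPanel sharedColorPanel];
panel.continuous = YES;               // send the action on every color change
[panel setTarget:self];
[panel setAction:@selector(colorChanged:)];
[panel orderFront:nil];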

The color picker will also load third-party color picker plugins, of which many great examples exist on macOS.

Cursor

You can change the mouse cursor with NSCursor, as you might expect, to any of the system cursors, or a custom image. Many Mac apps change the cursor based on the tool or content you're interacting with, and now you have a way to do this in your Catalyst app.

This also gives you a way to show or hide the mouse cursor if your app is a game, or to capture it if you wish.
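Both are one-liners from the AppKit side, for example:

[[NSCursor crosshairCursor] set]; // switch to a system cursor
[NSCursor hide];                  // hide it entirely; balance with [NSCursor unhide]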

Dock

You can update the Dock icon for your app at runtime with custom or generated icons (like a progress bar for a long export operation), or add a badge with a string. You can also customize your Dock menu with options at runtime.
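A small sketch of both ('ExportProgress' is a hypothetical image asset):

NSApplication *app = [NSApplication sharedApplication];
app.dockTile.badgeLabel = @"42%";                               // badge the Dock tile
app.applicationIconImage = [NSImage imageNamed:@"ExportProgress"]; // swap in a generated icon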

Menu Bar Status Items

You can implement NSStatusItem from your AppKit bundle to provide a menu bar icon, like you may be used to with apps like Dropbox. A status item can have a dropdown menu, but many apps spawn a borderless window to simulate a mini-app UI invoked by a status item.
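A minimal sketch, assuming a statusItem property on your glue object to keep a strong reference (or the item will vanish); the selector is hypothetical:

self.statusItem = [[NSStatusBar systemStatusBar] statusItemWithLength:NSSquareStatusItemLength];
self.statusItem.button.title = @"★";
NSMenu *menu = [[NSMenu alloc] init];
[menu addItemWithTitle:@"Show Window" action:@selector(showMainWindow:) keyEquivalent:@""];
self.statusItem.menu = menu;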

Theoretically, you might be able to present your UIKit app window from a status item in a little iPhone-sized region if it's not quite appropriate to be used as a full Mac or iPad app.

NSWorkspace

You can use NSWorkspace to move items to the trash, reveal a file in Finder, or load the icon for a file from disk, among many other things.
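A few of those, sketched with a placeholder file URL:

NSURL *fileURL = [NSURL fileURLWithPath:@"/Users/me/Desktop/Export.png"]; // placeholder
NSWorkspace *workspace = [NSWorkspace sharedWorkspace];
[workspace activateFileViewerSelectingURLs:@[fileURL]];    // reveal in Finder
NSImage *icon = [workspace iconForFile:fileURL.path];      // load the file's icon
[workspace recycleURLs:@[fileURL] completionHandler:nil];  // move it to the trash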

NSTask & system()

NSTask allows you to run shell commands or command-line apps, and pipe things between processes. With an AppKit bundle, your UIKit app can support these interactions too.
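For example, a sketch that runs ls from the AppKit bundle and captures its output:

NSTask *task = [[NSTask alloc] init];
task.launchPath = @"/bin/ls";
task.arguments = @[@"-la", NSHomeDirectory()];
NSPipe *pipe = [NSPipe pipe];
task.standardOutput = pipe;
[task launch];
[task waitUntilExit];
NSString *output = [[NSString alloc] initWithData:[pipe.fileHandleForReading readDataToEndOfFile]
                                         encoding:NSUTF8StringEncoding];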

AppleScript

AppleScript & Services are now easier than ever to use. You can follow the previous instructions under the AppleScript section of my earlier post to add Apple Events support to your app, as only one tiny thing has changed since Mojave's implementation.

You no longer need the private aeInstallRunLoopDispatcher(); call in -setupAppleEvents, as Apple Events are already set up as you might expect for AppKit.

Other than that, you should be good to go; you can decide how best to relay Apple Events to your UIKit code, and what actions are appropriate for your app.
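As a small example, registering for the 'get URL' event from the AppKit bundle looks something like this; handleGetURLEvent:withReplyEvent: is a hypothetical method that relays the URL on to your UIKit code:

[[NSAppleEventManager sharedAppleEventManager] setEventHandler:self
                                                   andSelector:@selector(handleGetURLEvent:withReplyEvent:)
                                                 forEventClass:kInternetEventClass
                                                    andEventID:kAEGetURL];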

Services

Services let you vend actions to the system for various types of content, like files, images, text, etc. The Services menu shows up in the context menu, and if your app offers a service for the currently selected type of content, its service will show in the menu.

Simply follow Apple's legacy tutorial to implement Services in your AppKit bundle. Everything just works 😃.
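For reference, a Services provider method in the AppKit bundle looks something like the sketch below; the NSServices entry in Info.plist and the method/service names here are assumptions, and you register the provider at startup with [NSApp setServicesProvider:…]:

// Hypothetical service: uppercase the selected text via the pasteboard.
- (void)uppercaseText:(NSPasteboard *)pboard userData:(NSString *)userData error:(NSString **)error
{
    NSString *text = [pboard stringForType:NSPasteboardTypeString];
    [pboard clearContents];
    [pboard setString:text.uppercaseString forType:NSPasteboardTypeString];
}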


As you can see, a bridge to AppKit like this can make a world of difference to how Mac-like your app feels to its users.

There are many ways a hybrid UIKit/AppKit app can give you the best of both worlds: compatibility with your existing iOS codebase as well as being able to do as much as possible to give your Mac users the things they expect from any good Mac app.

Apple is starting to use little bits of Catalyst in its own Mac apps, like the fullscreen iMessage Effects in macOS Catalina, and I'm sure that will only accelerate in the future, even with something as exciting and new as SwiftUI in the cards: Apple has a lot of things that are built on UIKit, as do third-party developers, and the Mac deserves to be able to use all of it.

If UIKit is the way you need or want to build things for Mac, the future is very bright for Catalyst apps.


Why Apple needs iPad apps on the Mac

June 01 2019

Dieter Bohn makes his case for why Apple should go all-in on Marzipan.

I want Apple to force itself to be like, okay, the iPad version of Mail is the only version of Mail that Mac users inside Apple get to use, and if they don't like it, they have to fix it.

I think they should do the same for as many apps as they can stomach, because if they don't, everyone's just gonna use the regular Mac apps that they've had before.

I mean, ask Microsoft how keeping the Windows classic version of Office around, when they're trying to change apps over to Windows 8 and Windows 10, went. It didn't go well.

I have been very vocal about why I think UIKit coming to the Mac is something to be excited about. There is so much potential in unifying the software ecosystem across Apple's platforms, but to do it right you can't stay on the fence like Microsoft did. For this to work, you need to own it, and you need to make it so good that it's hard to imagine wanting to use or write any other kind of software. That is how iOS makes me feel, and that is how the Mac should make iOS users feel.

I really hope that Apple finally deciding to do Marzipan, now, after a decade of iOS being Apple's dominant platform, means that Apple is no longer on the fence about the Mac vs iOS divide, and has made the tough decisions about how to chart its course for the future of the Mac and the desktop.

If what Apple provides next week at WWDC to bring UIKit apps to the Mac isn't good enough, we need to let them know so they can fix it. This year is only step two in a multi-year transition that will inevitably leave us running Universal iOS/Mac apps on ARM Macs, and there is plenty of time to fix things that aren't up to the standard we expect from Mac apps, and the Mac as a platform. Prepare your Radars! This must be an all-hands-on-deck moment for Apple.

I really don't think there will be a viable future for the Mac if Marzipan falls flat on its face. Apple's dominant ecosystem is iOS — that ship has sailed. No new UI framework or declarative layer on top is going to change the arithmetic; any new app framework for the Mac will by definition have to be shared across iOS and Mac, or we'll be right back where we started. By the time we've got to that point, there may not be any native desktop apps left, and iOS will still be accelerating into the future with new form-factors, augmented reality or whatever comes next. Even native app development titans like Adobe have a version of Photoshop in development for WebAssembly, and it's hard to not see the appeal for developers. The web is amazing; WebGL and WebAssembly will enable all kinds of powerful new platforms.

Steve Jobs building analogy

However, I truly believe that Apple provides the best native development frameworks in the world, which is why its platforms have many of the highest quality consumer apps in the world: when Steve Jobs would explain the NeXT, later Cocoa, frameworks in a presentation, he used a building analogy that I love so much — when developers write apps on a platform, they build upon the foundation laid beneath them, and NeXT's frameworks were so powerful that it was like starting on the twentieth floor of a building, and building upwards.

If you as a developer were only ever going to build three floors' worth of a great app, starting from that twentieth floor you'd end up with a twenty-three-floor building; on other OSes, where you have to reinvent the wheel every time for new apps and start at a lower floor (the fifth floor was where the classic Mac OS started you, in Steve's analogy), your three-storey app wouldn't even reach as high as the starting point for a NeXT app.

That NeXT competitive advantage became Apple's competitive advantage, and, later, iPhone's competitive advantage. This is the competitive advantage a native platform from Apple has over the web; it would be such a shame to half-ass this transition to Marzipan and concede defeat to web apps on the desktop instead of letting native apps reach the heights they deserve. And still, dividing Apple's attention between not one but two native app frameworks, each tens of floors tall, will always be a major constraint; I want to see what Apple can really do.

Will this transition be painful for Apple? Yes. But I think it'll be worth it.


WWDC, A Wish List (2019 Edition)

June 01 2019

This piece first appeared on MacStories on May 27th, 2019.


Way back in 2016, in the era of iOS 9, I laid out the tentpole features I wanted to see come to iOS and the Mac. Now, three years later, so many things from that wishlist have become a reality that it's probably a good time to revisit the topics that haven't yet come to pass, and plan a new wishlist for the years to come. I originally planned this list to have a Developer/User split, but it became clear that the two go hand-in-hand; if you're doing complex things on iOS today, using the various automation apps, you are but steps away from needing the same things that developers do.

Xcode for iOS

Much has changed since 2016 for development on the iPad, but much still stays the same. Apple introduced Playgrounds that year, and provided their very own Swift IDE for iPad. Playgrounds is fantastic, but you still cannot build and install an app using it, and you cannot mix and match C and Objective-C code with your Swift. It has no project structure, so all of your code has to take place in one file, which is fine for teaching material but not for anybody wanting to create something more complex. In 2017, Apple changed the App Store rules to finally enable programming apps to live on iOS without fear of being removed, albeit with unfortunate restrictions like not being able to display the output of an app in more than 80% of the screen. However, there is still no way for a third-party programming app to run code out-of-process, so any user mistakes can crash the app completely, and of course there's no way to build and install an app locally using one of these IDEs.

Pythonista for iOS

There is still an incredible need for something to bridge this gap — a programming environment on iOS that lets you design, write, build, sign, and install an application without having to resort to using a Mac. No third party can build this, because Apple has App Store rules and platform restrictions that prevent anybody else from being able to. Thus, Apple has to be the one to build this 'Xcode for iOS' and make it as powerful as a developer might need, whilst also building the mechanisms into the OS to make it as safe and secure as possible. In 2016, many of the fundamental pillars necessary to build an Xcode like this didn't exist in iOS, like a user-accessible file system, drag and drop, multiple windows, and floating panels – but they do now (or will shortly, if rumors are to be believed).

Terminal Environment for iOS

Much like the file system, for a certain class of user the need for a command-line environment of some kind hasn't gone away as I'm sure Apple had hoped it would. Now, with Apple's own Shortcuts app, more users than ever are automating tasks on iOS — it makes perfect sense to provide something more for the power users that need it, especially if Xcode for iOS becomes a reality. After 2016, I went and built the sandboxed Terminal app I described in my post and populated it with the core BSD utilities from Apple's Darwin as a demonstration. We've seen Mosh and OpenTerm, which do much the same. Now, there's iSH, which goes as far as emulating an x86 Linux environment just to try and provide a working shell on iOS.

A beta version of iSH for iPad.

All of these apps are sandboxed and using public APIs, but only Apple is capable of building the real thing. I shouldn't have to install an X86 emulator on my iPad just to be able to curl a URL and untar it, or to run ffmpeg to convert a local video file. A sandboxed Terminal doesn't need to let you mess with the system or other apps, or provide ways to execute unsigned code or kill processes; it should be able to live in its own jail and let you do whatever you want inside it in much the same way as a GUI user gets to do whatever they want in Shortcuts while maintaining the safety and security of the OS.

System-Level Drawing/Markup Views

Four years post-Apple Pencil, and still Apple provides no developer APIs for drawing, sketching, and markup. Every app has to reinvent the wheel if they want to have Pencil-based drawing, and while that may suit the couple of developers who have invested a lot of time and effort into their own drawing engines, it excludes every other developer who happens to have a good idea for a way to integrate sketching into their own app.

Sketching in Notes

Apple has provided sample code in the past for this, sadly OpenGLES-based, but if you want anything more appropriate you're left to scour GitHub. Building a drawing engine that looks good and feels responsive at ProMotion's 120Hz is incredibly difficult, yet Apple has their own great drawing framework in the OS which they use in the Notes app that would be perfect to provide to developers.

There are several apps sitting on my shelf that would benefit greatly from a built-in API for a drawing view; I hope Apple gets around to this sooner rather than later.

Custom View Controller and Non-UI Extension Providers

Extensions have powered so many new APIs in iOS and macOS, completely obviating the plugin model with a robust out-of-process signed and sandboxed mechanism. However, developers cannot define their own extension points, and thus cannot use them to empower their own apps. A programming environment for iOS should be able to run its code in an extension – a separate process that can be securely scoped to just the task it's supposed to do, and if it crashes it won't bring down the host app with it. This is, of course, exactly how Apple's Playgrounds app works, but it isn't something third parties are allowed access to.

Similarly, you should be able to define your own UI-based extension point such that other apps can implement an extension that would show up inside your app. If you've ever used the Audio Unit extension point, this is exactly what it lets you do – presenting the custom UIs from various instrument or audio processing apps you have installed, in a region inside your own app's UI – but only if you're an audio app using CoreAudio.

There are many ways developers could innovate by providing their own extension points; in fact, so much of the custom URL ecosystem on iOS sprang up because there's no standardized way for apps to talk to each other or use functions from each other. Apple's own Shortcuts app, née Workflow, was born out of the custom URL ecosystem, and Apple realized its potential was so great that it acquired Workflow and built it into the OS. Your own extension point could be exposed as an action to the Shortcuts app, to let it call it transparently as part of a workflow, with or without UI, instead of the dosado between apps that custom URLs involve today.

Key Up/Key Down Events

It seems crazy that in 2019 you are still unable to track raw keyboard events on iOS – there is no way, barring using a private API, for an app to allow you to hold down physical keys as input (like WASD keys for a game), or as modifier keys (like holding Shift while resizing something in an app like Photoshop to maintain aspect ratio). A developer only knows when a key has been pressed or a keyboard shortcut has been invoked, not when a key is released. This restriction seems so pointless today, and incredibly restrictive, affecting everything from professional creative apps to games. iPad needs robust hardware keyboard support, and shouldn't be chained to a restriction formulated a decade ago for a very different world.

Mouse Support and API

In much the same way, it’s time for robust mouse and trackpad support on iOS. With UIKit coming to the Mac, the framework has had to add a bunch of interactions like right click, hover, and scroll bars; why not bring this to iPad too so that users can benefit from it if they choose, or if their workflows demand it?

While controlling the UI with absolute coordinates is an important function of the mouse, let’s not forget too that mouse-capturing and relative movement is essential for games, remote desktops, and emulators. UIKit thus needs to let you capture the mouse cursor for your own needs, and not just to click things onscreen. Combine this with robust keyboard support and you would be able to play games like first-person shooters on iPad just like you can on a Mac or PC. Quake 3, anyone?

Android has supported mice for nearly a decade and it hasn’t done anything to lessen the touch experience, so there’s no need to worry about it doing so on iOS. For users or workflows that truly want or need a mouse, iPad will always be a non-starter until it supports one. Time for that barrier to go away.

Larger iPads and External Touch Screen Support

There are times, however, when the computer I want on my desk is a 30” iOS drafting table. iPad is essentially a blank canvas – truer now that it has no front-facing buttons – and a bigger canvas begets entirely new experiences. I am dying to see iOS scale to desktop-sized workflows, with several apps onscreen at once. If not a desktop iOS device itself, why not an Apple-quality large external touch screen?

I would love a 15” iPad, too: I haven’t used a Mac laptop since 2013 – iPad has completely obviated that form-factor as it grows ever more powerful – and I can’t imagine ever going back, but for people like me Apple needs to offer an even bigger model iPad than 12.9”. The 12.9” iPad Pro already gets custom UIs with expansive layouts and three-column views, and UIKit on the Mac will behoove developers to create apps and layouts that can scale to 27” screens anyway. I'm deeply envious of Microsoft’s Surface Book (that is, of course, until you turn it on), and something along those lines running iOS would suit my needs incredibly well.

Expanded USB Device Support for iOS

MFi might be gone with the USB-C iPad Pros, but developers need public APIs to write user mode drivers for anything you wish to plug in to your iPad. I want to be able to plug in my various EyeTV tuners and have the EyeTV app happily init them like it does on the Mac. I want my Game Capture HD60 to work on iOS, so I can record footage from my gaming PC and actually be able to edit and render it in the fastest computer in my house (the 2018 iPad Pro). I want to plug in my Raspberry Pi’s FTDI cable and view its serial output on my iPad without buying crazy MFi-based serial adapters. If I, for whatever workflow I might have, need to burn a CD or DVD, I should be able to plug in a disc drive and do so using any app designed for the task. This should "just work" in a way iPads simply can’t do today.

Read/Write External Drives through the Files App

By now, this is on everybody’s wishlist. Need I say more? It is so long overdue that I can’t imagine Apple holding off much longer. But I’d like to go further...

Format/Partition External Volumes and Read/Write Disk Images

I don’t just want to read my drives: I need to be able to manage them, too. Erase, partition volumes. Understand multiple file systems. Image them to a file, or apply a disk image to them.

The Mac’s pervasive disk image support is genuinely one of its crowning achievements. As a result, I have a ton of disk images from two decades on macOS. I use disk images every day; I create CD and floppy images to pass files to/from VMWare and Qemu. If user file system access becomes a core part of using iPad, we need the rest too. If I choose to never use a Mac again, Apple is telling me that all my old data is lost.

Scripting

"AppleScript for iOS" was one of the items on my 2016 list, but the Apple automation landscape has shifted dramatically since then. Apple acquired Workflow, now Shortcuts, and the perception is that AppleScript may not be long for this world, slowly pushed out in favor of a sandboxed, secure, and modern extensions mechanism employed by Shortcuts. Scripting is still incredibly important, as evidenced by the hundreds of Shortcuts workflows used by anybody who takes iOS seriously for work these days. So if not AppleScript, then what?

Scriptable for iOS.

What I really want to see is a textual interface to Shortcuts that lets you do all the same things without having to navigate and fiddle with a UI filled with actions, so that the class of advanced user who prefers writing scripts can do the things they need. Scripts need a way to run "silently" without presenting a UI onscreen or jumping between apps. And scripting should be extended to the UI layer, letting developers build richly scriptable apps like they can today on the Mac with AppleScript. It is easy to envision Shortcuts scripting using JavaScript or Swift, and Shortcuts already has a scripting action that lets you run JavaScript on a webpage.

Virtual Machines

It's hard to see Apple supporting virtualization on iOS, as virtual machines require things like Just-In-Time compilation to execute arbitrary code in memory which violates one of the core foundations of the iOS security model, and burns through battery life like few other tasks. However, Apple provides WebKit on iOS, which also requires those special security exceptions to execute code in memory, and on macOS provides Hypervisor.framework, a lightweight virtualization system that lets developers build virtual machines easily.

Running mini vMac on iPad Pro.

I'm the kind of user who would love to see Hypervisor for iOS; let companies like VMWare and Parallels bring their expertise to the iPad, and offer approved ways to run ARM-based Linux or Windows (or, in a couple years, macOS perhaps). X86 emulation may be out of the question for now, but perhaps that won't always be the case; I'm sure I'm not the only developer with a library of VMs for everything, from DOS to NEXTSTEP to older versions of macOS and the iPhone SDKs. I can, of course, use these VMs on iOS today, with open-source apps like Bochs or mini vMac sideloaded onto my device, but because they don't have the ability to JIT, they have to run entirely using CPU emulation which is significantly slower and burns more battery.

Entitlements

Finally, that brings me to entitlements. The entitlement system in iOS is what allows Apple to have fine-grained control over which developers can access which features; if you wish to use iCloud in your app, your app must be signed with an iCloud entitlement, with similar requirements for Health, Home, Apple Pay, and many other parts of iOS. CarPlay, for example, needs a special entitlement that isn't given freely to developers – in fact, you have to apply to Apple and get your app idea approved before they'll even let you test the feature in your app. If an app or developer is ever found to be abusing their privilege, their entitlements can be revoked by Apple, and the app remotely disabled. Thus, entitlements are a great way for Apple to entrust developer partners with special access to features that other developers can never use or misuse.

With that in mind, Apple could entrust e.g. Google, Microsoft, or Mozilla with the entitlements they need to use their real browser engines on iOS instead of WebKit – real Chrome, real Firefox. VMWare and Parallels could be entrusted to build virtual machines or emulators, without leaving this open as an attack vector for malicious third-party apps. Disk utilities could be permitted to partition disks, IDEs could be permitted to run background processes, install apps, or attach a debugger to running apps. So many of these things, given freely to developers, would arguably make iOS a much less safe place (read: just as powerful as a desktop computer), but with the entitlement mechanism in place Apple could still keep the control they want and not let it get out of hand. Seeing past the inter-company politics, iOS is going to need methods to do all of these things eventually, especially if the iOS app ecosystem is to supplant the Mac app ecosystem in due course. A Mac without the ability to build and install apps, or attach a debugger, would be unimaginably crippled.


We've come a long way from the fear that enabling third-party apps on iPhone would bring down the cell networks; trying to actively build the future on iOS today is like having your hands tied behind your back. iOS has for too long relied on the fact that the Mac exists as a fallback to perform all the tasks that Apple isn't ready to rethink for its modern platforms, but that doesn't mean these problems aren't relevant or worth solving. This has left us in a situation where iOS moves forward with new ideas, but the Mac stands still, needing to keep compatibility with the iOS ecosystem whilst tiptoeing the line between keeping things as they are and losing the freedom and power of old systems for the active development and enthusiasm of the new. The correct path forward is not to simply revert to the mechanisms available on the desktop, with all of the baggage that comes with that, but to rethink all of these things to fit in a modern, secure world.

iOS is exponentially better with a working Files app, with drag and drop, with automation, background tasks, and split-screen multitasking. iPad too, with a stylus and hardware keyboard.

What iPad does so well is completely hide its complexity from the user who doesn't need to know about the mechanics of the system beyond tapping an app to open it, and swiping the Home indicator to close it. All of these things, added to iOS, haven't made the OS harder to use. They're so transparent that I'm sure most users don't even know they exist, but for the users who do need them they have become essential tentpoles of the iOS experience.

With UIKit on the desktop, it's time to revisit just what an iOS app can and can't do; after all, they're no longer "iOS apps": they're just "apps" now.


(Don't Fear) The Reaper

May 22 2019

With the rapidly-upcoming introduction of UIKit apps to the Mac, I've been reminiscing on Twitter quite a bit about Apple's transition to Mac OS X, and just how NEXTSTEP and Mac OS found common ground despite being two very different OSes both architecturally and from a design standpoint. There are a couple of different vectors I will follow here, if you permit me, that will converge at the same place:

One of the artifacts of the final days of NeXT was a prerelease version 4.0 of NEXTSTEP. Despite appearing before OPENSTEP (the 4.0 operating system that succeeded NEXTSTEP), this NEXTSTEP 4.0 build, which I will abbreviate to NS4 herein, has a vastly different UI and set of apps. It had new colored window chrome, background images, and a tabbed shelf at the bottom of the screen — quite the departure from the grayscale NEXTSTEP we knew.

However, it was the built-in apps that piqued my interest. NS4 has redesigned and rebuilt variants of many of the apps that shipped with NEXTSTEP, but none of these changes later made it to OPENSTEP. In fact, they don't resurface until years later, after Apple acquires NeXT and revamps OPENSTEP with the Mac look and feel in a project named Rhapsody, which later became Mac OS X Server 1.0.

It's fun to track the progression of the apps through this period. Let's take a look at the NS4 version of Mail, from 1995, and follow it through all the way to OS X.

Mail on prerelease NEXTSTEP 4.0 (1995)

NeXT had recently, with Sun, finalized 'OpenStep' (not to be confused with OPENSTEP the OS), a cross-platform refactor and sweeping revamp of all the old NEXTSTEP frameworks. NS4, then, seemed to be the across-the-board revamp of the system apps to take advantage of OpenStep. These apps, thanks to OpenStep's portability, would run on Solaris and Windows, as well as NeXT's own OS. However, when NeXT actually shipped the next version of the OS, now renamed OPENSTEP (from NEXTSTEP), it didn't include them. Instead, OPENSTEP had an older, ironically pre-OpenStep set of apps, despite shipping a year after this NS4 beta. You can tell immediately by looking at Mail that this is a step backward.

Mail on OPENSTEP (1996)

It seems obvious with hindsight that all of the fancy upgrades NeXT was working on were put on the shelf in 1995 when they realized the company had no future without an acquisition. Perhaps this NS4 build was intended to woo potential buyers like Apple? Sure enough, Rhapsody includes these revamped apps from the get-go.

Mail on Rhapsody (1997)


Rhapsody was Apple's first attempt at merging the worlds of Macintosh and NeXT, and architecturally it's very much a NEXTSTEP 5.0. The first build of Rhapsody was released to developers in August of 1997, about seven months after the acquisition closed. The design goals for this developer beta were relatively simple: port OPENSTEP to PowerPC, adopt the Mac HIG, design language and iconography, and provide a 'Finder' and desktop. Here, you can see the progression from left (OPENSTEP), to Rhapsody Developer Release 1, to right (Mac OS X Server/Rhapsody 1.0).

NeXT's UI wasn't Mac-like at all; its machines were designed as UNIX workstations for the professional and academic markets, and you can almost feel that from its UI design language. Just because OPENSTEP and Mac OS were both mouse & keyboard OSes didn't mean transitioning was straightforward. Multitasking was very different. Menuing was very different. Famously, NeXT's OS never had a desktop nor menu bar across the top of the screen. Even something simple like scrollbars, controls we take for granted today, were on the left on NEXTSTEP. Apple had to teach NeXT's AppKit what being 'Mac-like' meant, and it required a lot of work over many years.

Mac OS X Server (1999)


All of the Mac-like-ness of Mac OS X Server was still controlled by a preferences key in AppKit, and in fact you can switch it off completely and revert the system back to looking like OPENSTEP, which is a wild experience by itself.


Mac OS X Server in 1999 looked and acted very much like a 'Mac OS Pro'; it had the same look and feel as the Mac, but retained the power of UNIX and all of NeXT's frameworks. It even built upon some NeXT features like menus you could tear off and float onscreen. However, Mac OS X Server had a fatal flaw that had become apparent to Apple in the years since the NeXT acquisition: Mac OS X Server didn't run legacy Mac apps, and Mac developers were not on board with rewriting everything completely in Objective-C and AppKit.

For Mac OS X to actually work as a successor to Mac OS for Apple's customers, they were going to have to gut it and start over. The solution? Apple was going to port the legacy Mac OS toolbox into a new compatibility library, called Carbon, and make it the linchpin of the consumer Mac OS X experience.


Even in 1997, Apple knew they wanted consumer Rhapsody to be something special.

However, once it became clear that Apple needed to rethink its strategy completely, Mac OS X became a very different thing to the Mac OS X Server that preceded it. It had a few tentpole changes: Apple would replace the kernel and driver model, replace Display PostScript rendering with the PDF-based Quartz, deeply integrate Carbon and make legacy Mac apps first-class citizens, and adopt a brand new UI called Aqua.

To do this, a lot of NeXT's pure foundation and frameworks were rearchitected atop a common foundation with Carbon, and the classic Mac OS' Appearance Manager became the basis for Aqua's UI. All of AppKit's own menus and widget/control rendering were replaced with this Mac OS foundation, which of course meant that Carbon apps and Cocoa (AppKit) apps would look the same despite their vastly different architectures.

Apple needed to show developers that Carbon was going to be a real and valid way forward, not just a temporary stopgap, so they committed to using Carbon for the Mac OS X Finder. The Carbon version of Finder was introduced in Mac OS X Developer Preview 2, before Aqua was revealed; it acted a bit more like NeXT's, in that it had a single root window (File Viewer) that had a toolbar and the column view, but secondary windows did not. At this stage, Apple didn't quite know what to do with the systemwide toolbars it had inherited from NEXTSTEP.

Mac OS X Developer Preview 3 was the first time we saw Aqua in a build, and with it came transformational change to the Mac OS X UI. We were into year three of Apple's transition to NeXT technology, but Apple was now on a path to finding what 'Mac-like' should be, rather than trying to match what it was in classic Mac OS. Apple had a better understanding of what it wanted to do with navigation in Finder, but elements of the UI were still comically oversized.

Finder in DP3


While Finder evidences the transition from the Carbon side, we can revisit Mail to see how AppKit/Cocoa was faring in this same OS release.

Mail on DP3

Mail preferences window

Mail, the same Mail as we saw earlier in this piece, now looks very different indeed, but still betrays its NeXT roots with the iconography in its preferences window. But clearly, it looks very similar to the Carbon-based Finder; as you can see, both environments were starting to come together into something new, with a shared design language.


Mac OS X shipped to consumers in 2001, and was refined massively from this first implementation of Aqua.

Mail on Mac OS X (AppKit, 2001)

Finder on Mac OS X (Carbon, 2001)

It had taken Apple four years to find the new 'Mac-like', and this is the template Mac OS X has followed ever since. Here we are, eighteen years later, and all of the elements of the Mac OS X UI are still recognizable today. So much of what we think of the Mac experience today came from NEXTSTEP, not Mac OS at all. AppKit, toolbars, Services, tooltips, multi-column table views, font & color pickers, the idea of the Dock, application bundles, installer packages, a Home folder, multiple users; you might even be hard-pressed to find a Carbon app in your Applications folder today (and Apple has announced that they won't even run in the next version of macOS).


In 2007, Apple got the chance for a complete do-over of Mac OS X with a modern architecture optimized for touchscreen devices with powerful GPUs, but with thermal, resource, and battery constraints. That do-over was, of course, iPhone OS. Instead of AppKit, which had come all the way from NEXTSTEP, been refactored and rebuilt into OpenStep, and powered the first decade of Mac OS X at Apple, Apple decided to build something fresh with everything they had learned: UIKit. As technology has progressed, UIKit and the devices it powers have become more and more complex and powerful.

With macOS 10.15, UIKit is finally coming back to the Mac to serve as a top-tier native application development framework alongside AppKit. This is the start of Apple's next transition, and just like last time, it's almost unfathomably difficult to see how these two completely different architectures will cooperate and find common ground.

Just like last time, we'll start with baby steps: UIKit adopting the metrics and styles of macOS, AppKit adopting some of the interactions of iOS, and the two sides growing closer over time by sharing more and more DNA.

But just like last time, the road ahead is much longer than we can see from this perspective; we're building towards what comes next, even if we don't yet know what that is.

A really interesting question we probably won't have an answer to for years to come is whether UIKit is the 'Carbon' or the 'Cocoa' of this transition. I think the only appropriate answer is 'yes'. UIKit is the present, and the developer ecosystem it will bring with it is incredibly important. AppKit is also the present, and it provides and powers the Mac as we know it.

I'm sure we will have great, genre-defining apps from both UIKit and AppKit on the Mac. With Carbon, we had iTunes, Photoshop, Microsoft Office and Final Cut Pro. Eighteen years on, Carbon is finally reaching its end date, and the transition of all these apps to Cocoa/AppKit is complete. If AppKit still has eighteen years left ahead of it, I think the Mac will be just fine.

Both classic Mac OS and NEXTSTEP came to an end; the Mac did not. I think everybody can agree the unified whole was much greater than the sum of its parts, but this was not clear at all in 1997. The future is still being written, and we each, 'Mac developers' and 'iOS developers' alike, will get to be there to help shape it.

Needs more cowbell.


Translating an ARM iOS App to Intel macOS Using Bitcode

May 18 2019

When Apple introduced Bitcode and made it mandatory on watchOS and tvOS, they kinda hand-waved away questions about why it existed, with nebulous claims about it being useful to tune-up binaries by utilizing the latest compiler improvements.

Since then, Bitcode has proven instrumental in the seamless overnight transition of watchOS to 64-bit, where developers didn't even need to recompile their apps on the store, as Apple did it transparently for them so they could run on the Apple Watch Series 4. You likely didn't even notice a transition had taken place.

What is Bitcode? Well, bitcode with a small b- is an architecture-specific intermediate representation used by LLVM, and capital-B Bitcode pertains to a set of features allowing you to embed this representation in your Mach-O binary and the mechanisms by which you can provide it to Apple in your App Store submissions. It's not as flexible as source code, but it's far more flexible than a built binary, with metadata and annotations for the compiler. In practice, you (or Apple) can easily take the Bitcode blobs from your app and recompile them into a fully-functioning copy of your app. Going from armv7 to armv7s, or arm64 to arm64e is a piece of cake, and saves developers having to recompile an ever-fatter binary of their own every time Apple tweaks their ARM chips. Bitcode has long-since been used by Apple in its OpenGL drivers such that the driver can optimize on the fly for the various GPU architectures Apple supports.

We have seen Microsoft use static recompilation to great effect on Xbox One, giving it access to a whole library of originally-PowerPC Xbox 360 games, all without developer intervention or access to the source code. And that's without an intermediary like Bitcode to trivialize the process.

Of course, the specter of macOS on ARM has been in the public psyche for many years now, and many have pondered whether Bitcode will make this transition more straightforward. The commonly held belief is that Bitcode is not suited to massive architectural changes like moving between Intel and ARM.

I was unconvinced, so I decided to test the theory!


Firstly, we need an Objective-C Hello World app with Bitcode; Bitcode is usually only included when building an archive for the App Store, so we need to force its inclusion in a regular build. You can use the -fembed-bitcode flag or a custom build setting:

BITCODE_GENERATION_MODE = bitcode

Build your binary for Generic iOS Device, or an attached device, like normal. Bitcode doesn't seem to be embedded in arm64e builds (i.e. if you have an A12-based device), so you might want to turn off Xcode’s ‘compile active architectures only’ setting and build for arm64 directly.

Using a tool called ebcutil, you can very easily extract all the Bitcode objects from your compiled binary.

ebcutil -a arm64 -e path/to/MyApp.app/MyApp

Then, for each Bitcode object, recompile it for Intel.

for f in *; do
    clang -arch x86_64 -c -Xclang -disable-llvm-passes -emit-llvm -x ir -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk "$f" -o "$f.o"
done

Now, you want to link your compiled blobs back into a binary.

clang -arch x86_64 -mios-version-min=12.0 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk *.o -o path/to/MyApp.app/MyApp

If this succeeds, you now have an Intel version of your originally-arm64 app! You should be able to drop it directly into an iOS Simulator window to install and verify it runs.

This is a very important proof: you can statically translate binaries between Intel and ARM if they include Bitcode. It really works!


⚠️ Gotchas for more-complex projects

ARC appears to use some inline assembly, which means you'll need to disable ARC for a project for arm64-to-x86 translation to succeed right now.

Certain kinds of Blocks, like completion handlers, also seem to trip up the compiler with instructions it refuses to accept; if you see an X87 error this is likely your issue.

Why Objective-C? Well, Swift was designed with ARC in mind, and thus I don't believe there is a way to avoid the aforementioned inline assembly, so recompilation will currently fail.


Let’s take it a step further: let’s use marzipanify to convert this Intel iOS app into a Mac app that can run using Marzipan.

Translated app on macOS

That was easy!

This means, in theory, that if Apple wanted every iOS app on the App Store to run on the Mac, today or in the future, they have a mechanism to do so transparently and without needing developers to update or recompile their apps.


So, what if the Mac switched to using ARM chips instead of Intel? Well, as you can see, Apple could use Bitcode to translate every Bitcode-enabled app on the Mac App Store, without consulting developers, so it would be ready to go on day one. This kind of power means Apple needn’t preannounce an ARM switch a year ahead of time, and also means a technology like Rosetta may be completely unnecessary this time round.

Obviously, we’re not there yet: Apple doesn't enable Bitcode for submissions to the Mac App Store today, and today’s Bitcode may not be ideal for such an architectural translation. If I were Apple, I would make sure those two things change soon, and surely mandate Bitcode for all Marzipan apps in macOS 10.15.


Turntable

March 29 2019

I picked up a simple motorized turntable and photography light tent, and it's amazing how much of a difference it makes to photographing miniatures & paint jobs. I should have done this way sooner. If you're curious, I'm recording on iPhone, using Halide for the still shots, and FiLMiC Pro for the video — manual focus and exposure controls are a must.

I am no great painter, but my most recent stuff — from 2015 — is passable. As proud of my rudimentary paint jobs as I may be when seeing them in person, I have never had a good way of photographing them and sharing them online — everything always looked pretty mediocre through the lens of a camera. I think having a piece spin 360° shows it off in a way that photos or videos never could. Everything looks so much more professional and I love it.

With the light tent, the stills I'm getting out of this are better than I could have imagined. These look almost like product shots; it is hard to believe that these were taken on a phone.

Praetorian Guard behind 3D-printed barricade

Tau Stealth Suits on building ruin

It's kinda like being able to really see these miniatures for the first time; it makes me super enthusiastic for the next time I take up the paintbrush, because now I finally have a way to share my work.

I figure the turntable will also be useful to show off the larger things I build from plaster casts too, as it really accentuates the detail on the pieces. My light tent is a little on the small side for large plaster rooms, but I'll figure something out.


Behind the scenes

A glimpse behind the scenes reveals just how basic all of this stuff is — the cheapest things I could find on Amazon to do what I wanted.

It definitely highlights the importance of investing in the right tools for the job; we already have amazing 4K60 cameras in our phones, but a good mic for recording, or a decent lighting setup for photos and videos, makes all the difference.

