Avanceé.Agency

Musings on designing experiences & (re)engineering complexity

Jul 2023

Don’t Need A Controller with That

Apple Vision Pro with a person at a desk using a Magic Keyboard and Magic Trackpad, via the Apple Newsroom website

Contentious opinion? Apple and Humane aren’t wrong… you do not want a controller or other device as a necessary input tool when using AR/VR/MR devices.

Towards the end of June, jumped back into the AR/VR thing by adding the XReal Air to the Avanceé device lab. This is a really interesting device, and not just because “Apple has entered the AR fray with the Vision Pro.” The XReal Air actually does a pretty decent job mirroring an iOS/iPadOS display, or extending a macOS display (up to three virtual desktops). The clarity is ok (am waiting for prescription inserts). And aside from a 3ft/1m cable, it makes a lot of sense for getting the best use out of limited space when one needs several screens around them in order to do some level of work.

Where the Air falls short (for me) is in the area of control. As has been mentioned a few times, am a fan of the Tap Strap 2 wireless keyboard. Such a novel concept, and one that really should be better utilized personally (waiting for a firmware update to address a technical issue discovered recently). The Tap is often used alongside the iPad Pro, but on the non-dominant hand. In the dominant hand is the Apple Pencil. Between those two, plus dictation, much about interacting with the iPad can be done quickly and easily - at least when I am not rusty in typing with the Tap. The Tap also has a neat “air mouse mode” where it can emulate a mouse by simply pointing at the screen and then doing a few gestures. Suffice to say, one can actually do a fairly decent job of breaking free of the confines of a screen’s size with this… until you cannot.

The XReal Air does make for a decent move towards breaking free from structured and immovable windows. And (am assuming) the Tap’s air mouse feature will take it a few places further - at least on macOS. Am kind of uncertain about the extended-screen play with iPadOS, though there is some experience using it in this fashion with the Apple Magic Trackpad and a portable monitor. That said, when confronted with this perspective, you quickly find there are many more senses we should be using on these canvases. This is where Apple and Humane have made the right call in letting our physical selves be the controllers, with the “windows” of the tech bending to us, versus the other way around.

Take a look again at the Humane TED Talk demo. Notice what is being used as the canvas? The hand, the ear, and speech. Take another look at the intro video for Apple’s Vision Pro. Notice how the hands and eyes are the controller. Notice how soundscapes and vision-scapes are positioned as the canvas, and as controllable. There’s something just right about being able to touch, control, smell, taste, etc. within your world. It might very well be that we are finally seeing a breaking away from an infantile slice of computing we’ve been doing since the introduction of the GUI, the mouse, and even the keyboard.
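For the curious, this isn’t hand-waving on Apple’s part. On visionOS, a look-and-pinch reaches an app as an ordinary tap; the system resolves gaze and hand pose before the developer ever sees an event. A minimal SwiftUI/RealityKit sketch of that model (the sphere and its name are just placeholders for illustration):

```swift
import SwiftUI
import RealityKit

// Minimal visionOS sketch: a sphere the wearer selects by simply
// looking at it and pinching. No controller, no pointer device.
struct LookAndPinchView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.name = "demo-sphere" // placeholder name
            // Mark the entity as a target for system-driven input
            // (gaze + pinch) and give it a collision shape to hit-test.
            sphere.components.set(InputTargetComponent())
            sphere.generateCollisionShapes(recursive: true)
            content.add(sphere)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // By the time this fires, the OS has already turned
                    // eyes + hands into a plain "tap" - the body is the
                    // controller; the event model stays familiar.
                    print("Selected \(value.entity.name)")
                }
        )
    }
}
```

Note the design choice there: the body becomes the pointer, but the event an app receives stays boringly conventional - which is likely why dropping the controller can work at all.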

XReal used to (?) offer a second device, called the Light, which does do hand-tracking and a few other nifty things. Am thinking this might be the correct route as a baseline for these “glasses-led devices.” And for a simple reason: we know how to use our hands. We know how to fix our gaze. And - barring any kind of bias towards normalizing those - this allows people of all abilities and modalities to interact with computational canvases within a world they are already familiar with.

It really is a shame that Apple was not more forthcoming with the various hidden gestures on iPadOS. There have been (and are) so many ways to use your iPad without needing to get to context menus, with many applications even extending that to do some pretty neat stuff. The kinds of things which make you wonder if Apple has been setting up the removal of controllers, or “UI chrome,” for longer than we might suspect.

You don’t need to learn how to control your hands except once in your life. You refine how and what you pick up, control, put down, etc. You learn tender touch, versus forceful. You learn how to grab big objects, and [how to embrace your other senses to make vision more than just what you see with your eyes](https://a.co/d/5epTUTn "Take Off Your Glasses And See: How to Heal Your Eyesight and Expand Your Insight").

Perhaps, you shouldn’t need a controller to leverage what a computer might offer (note how this doesn’t say “use a computer”). And it is very possible that we are finally at the place where hardware and software are ready to agree with the reality of who we have always been. At least, it seems Apple and Humane are in agreement with some of that.