It is more than likely that one Tuesday in 2022 will see the arrival of the invention that will end up killing the iPhone: Apple's mixed reality glasses. That day we will witness a radical change in user experience like the one we experienced with the iPhone in 2007. A new patent from the Cupertino company gives us a clue as to what this new technological era could look like.
Apple's glasses seek to eliminate the limited screen of the smartphone and turn your entire immediate reality into a screen. Not a giant flat screen floating in space in front of your eyes, but an infinite virtual layer, superimposed in high fidelity on the reality around you.
With these glasses you could have an unlimited work desk, a holographic design station like the ones Tony Stark uses in Iron Man, a map with directions that clearly mark the way forward on the terrain itself, or the ability to instantly materialize in the home of a friend 1,000 kilometers away.
We will see virtual objects and people as if they were real, right in front of our eyes and placed on the floor or on real furniture, because this device will be able to map your surroundings thanks to LiDAR sensors like the ones on the iPhone 12 Pro.
The productivity, social, and entertainment possibilities will go beyond anything we have experienced so far. And although the first versions will not be perfect, the change is inevitable.
But the key to all of this succeeding will lie in the user experience. Just as the iPhone's touch screen introduced a new way to manipulate digital information without physical controls, this new medium will need a new language of its own.
Where we now pinch our fingers on a screen to enlarge an image or tap to open a menu, this new intangible dimension will need another method of interaction, different but just as intuitive as the phone's, or more so.
The hand as an interface
We cannot predict what the conventions of this new medium will be, but we can assume certain parameters based on what we know.
The first is that Apple will not force us to use external controllers the way today's Oculus Rift-style mixed and virtual reality headsets do. Doing so would be a step back from the direct manipulation for which the iPhone set the standard. It is logical to expect something just as good or better.
The problem lies in how to track hands and make them interact with a reality that does not exist. There are several methods for tracking objects in real space. One, the most primitive, is to use external cameras placed around the perimeter of a room to track your movements, usually with the help of markers worn on the body. This works very well and is commonly used for creating special effects, but it will not be the solution here because it is neither practical nor useful for this purpose.
Another option is to put the cameras on the glasses themselves. This approach – used by the startup Leap Motion in its Orion project, a company Apple reportedly tried to buy at one point – would be a possibility. It is the most intuitive way: to act on synthetic reality you would only have to extend your hands and gesture, grab, move, or do anything you would do with a physical object.
Leap Motion's system can track hands and fingers within the cameras' field of view
In addition, you would not need anything more than your hands: the glasses' three-dimensional sensors and artificial intelligence engine would recognize your hands and fingers with great precision. However, this solution also has limitations. The first is that your hands could only interact with the environment while they were in the cameras' field of view. The second is that you would not be able to use more precise physical objects, such as a real pencil, to write or draw on a virtual sheet of paper.
The Lord of the rings
The third solution is the one described in this patent, with the arcane title of "Gesture input system using accessories or handheld devices with self-mixing interferometry."
This patent says that Apple would use two rings, one on the thumb and one on the index finger, equipped with short-range laser emitters. The rings would emit pulses continuously and carry sensors that receive the reflections of those beams, either off each other or off the surfaces around you.
According to Apple, the processor would be able to measure the timing and frequency differences between the emitted and reflected laser pulses. Combined with the 3D analysis of your environment – carried out by the cameras in the glasses – the CPU could determine the exact position of your hands and fingers.
Not only that: the rings could also turn physical objects into virtual surfaces on which to do things with your hands, from typing to drawing. You could grab and move objects even when your hands were out of the field of view of the glasses' cameras. According to Apple, these two rings would be able to "follow the movements of the user's fingers with reference to any surface, including, in some cases, the surface of another finger and the palm of the hand."
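To get a feel for the physics the patent alludes to, here is a toy Python sketch of the two classic laser-ranging relations involved: distance from a pulse's round-trip time, and relative speed from the Doppler shift of the reflected beam (the quantity self-mixing interferometry is sensitive to). The function names, wavelength, and numbers are illustrative assumptions, not details from Apple's patent.

```python
# Toy sketch of laser-based ranging, NOT Apple's actual implementation.
# Constants and function names are illustrative assumptions.

C = 299_792_458.0     # speed of light, m/s
WAVELENGTH = 850e-9   # assumed infrared laser wavelength, m

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflecting surface from a pulse's round-trip time."""
    return C * t_seconds / 2.0

def radial_velocity_from_doppler(freq_shift_hz: float) -> float:
    """Relative speed of the surface along the beam, from the Doppler
    frequency shift of the reflected light."""
    return freq_shift_hz * WAVELENGTH / 2.0

# A reflection returning after ~0.667 ns comes from roughly 10 cm away,
# about the scale of finger-to-finger or finger-to-desk distances.
print(round(distance_from_round_trip(0.667e-9), 3))   # prints 0.1
# A 1 MHz Doppler shift corresponds to ~0.425 m/s of finger motion.
print(round(radial_velocity_from_doppler(1e6), 3))    # prints 0.425
```

The point of the sketch: even tiny timing and frequency differences carry enough information to place a fingertip in space, which is why the patent pairs the rings' measurements with the glasses' 3D map of the room.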
Minority Report’s “thimbles” are an idea in the same vein as Apple’s rings, but they’re optical
In fact, Apple's patent also contemplates other types of hardware besides rings, such as a new kind of Apple Pencil that would use a similar mechanism, precise enough to let you draw on digital paper as naturally as you would with a real pencil and paper.
That last point will be the key: everything must feel natural. In theory, using your hands or three-dimensional physical objects is always more natural and intuitive than using a trackpad, mouse, joystick, or flat touch screen. It only remains to be seen which path Apple chooses.
Obviously, although rings seems like the most logical path, we have no idea if this patent will become a product or not. All I know is that the year we are going to witness the birth of a new era of computing, whether with the glasses of Apple, Facebook or Microsoft. For the first time in more than a decade, the future seems to be around the corner again.