
‘Apple Glass’ UI will let wearers smoothly select new AR scenes, settings


Apple AR devices, such as "Apple Glass," will work to avoid disorienting users by giving them control over where they move, plus previews of what they'll see.

Maybe you remember how slow it was moving around the island in "Myst," or perhaps you're used to "walking" in a first-person shooter game. "Apple Glass" wants more speed and more smoothness, specifically when a user moves to a new scene.

This isn’t just for games, either. After more than a century of movies, we are trained to understand when film cuts to a new location, or a scene switches to a new angle. Early moviegoers struggled with this impossibly alien idea, and Apple wants AR users to skip that struggle.

Apple wants us to unthinkingly know when large-scale movement is about to happen, just as we all unconsciously know when a movie scene is over.

“As CSR [computer simulated reality] applications become more ubiquitous, there is need for techniques for quickly and efficiently moving about CSR settings,” says Apple. “For example, a user immersed in a virtual reality setting (e.g., a house) may wish to move to a different portion of the setting or to a different virtual setting altogether (e.g., an underwater setting).”

“To enhance movement experience, the present disclosure presents techniques allowing for efficient, natural, seamless, and/or comfort-preserving movement between locations in CSR settings,” continues the patent application. “In this way, an improved CSR experience is provided to users.”

Unlike a movie, where a cut can happen at any time, and the next scene could be anywhere, in Apple AR, the user has the controls — literally. As they look at where they are, the current view, they get controls to move.

“The current view depicts a current location of the CSR setting from a first perspective corresponding to a first determined direction,” says Apple. “A user interface element is displayed.”

Apple's patent application is not especially concerned with the details of the user interface, nor with when it is displayed. It could appear when a user looks at, or walks to, a particular spot in the AR experience.

Alternatively, it could be summoned by the user at any point they wish, using Siri, or controls on “Apple Glass.”

Wherever and however it appears, the idea is that the controls will always include a view of the next place a user can go to. That can be chosen by the AR creator, but it will typically be somewhere in the environment that is not immediately obvious, such as a different room, or a different world.

“The user interface element depicts a destination location not visible from the current location,” says Apple. “In response to receiving input representing selection of the user interface element, the display of the current view is modified to display a destination view depicting the destination location.”

So if you’re in an AR experience showing you the ice planet of Hoth, you might find a control that shows you a preview image of Tatooine instead. Then when you use the control, you move to that next location, and the key thing is that you know you will.

No matter how different the new environment is, you won't be disoriented, because you chose exactly when to go, and you saw exactly what you were getting yourself into.
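The interaction the patent describes — a UI element that previews a destination not visible from the current location, and swaps views when selected — can be sketched roughly as follows. All class and field names here are hypothetical illustrations, not anything from Apple's filing.

```python
from dataclasses import dataclass


@dataclass
class Scene:
    name: str
    preview: str  # the thumbnail shown inside the UI element


class TeleportControl:
    """Hypothetical sketch of the patent's UI element: it previews a
    destination the wearer cannot currently see, and selecting it
    replaces the current view with the destination view."""

    def __init__(self, current: Scene, destination: Scene):
        self.current = current
        self.destination = destination

    def preview(self) -> str:
        # The wearer always sees where they would go before committing.
        return self.destination.preview

    def select(self) -> Scene:
        # "The display of the current view is modified to display a
        # destination view depicting the destination location."
        self.current = self.destination
        return self.current


# Usage: previewing Tatooine from Hoth, then choosing exactly when to move.
control = TeleportControl(Scene("Hoth", "ice plains"), Scene("Tatooine", "twin suns"))
print(control.preview())
print(control.select().name)
```

The point of the sketch is the ordering: the preview is available before the transition, so the jump is never a surprise.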

Depending on your motion, Apple AR may show you a magnified view of your next virtual destination


The patent application includes proposals to make the destination clearer through the size of the preview. If you're running up to the user interface element, it might show you a much bigger preview, because you're in a hurry and you've started looking from further away.

“[Modifying] the display of the current view to display the destination view [includes] determining whether the received input represents movement of an object towards the electronic device,” continues Apple. “[And] in response to determining that the received input represents movement of the object towards the electronic device, proportionately enlarging the user interface element in accordance with a magnitude of the movement.”
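That claim — enlarge the UI element in proportion to how quickly the wearer approaches it — could look something like the function below. The scale factor and cap are assumptions for illustration; the patent does not specify values.

```python
def scaled_preview_size(base_size: float, movement_magnitude: float,
                        scale_per_unit: float = 0.5,
                        max_scale: float = 3.0) -> float:
    """Hypothetical take on the claim: grow the preview element
    proportionally to the magnitude of movement toward the device,
    capped so it never overwhelms the wearer's view.

    scale_per_unit and max_scale are illustrative assumptions,
    not values from Apple's filing.
    """
    scale = 1.0 + scale_per_unit * max(movement_magnitude, 0.0)
    return base_size * min(scale, max_scale)


# Standing still: the preview stays at its base size.
print(scaled_preview_size(100.0, 0.0))  # 100.0
# Moving toward the element: the preview grows in proportion.
print(scaled_preview_size(100.0, 2.0))  # 200.0
```

The cap reflects the "comfort-preserving" goal quoted earlier: the preview should grow enough to be legible at speed without filling the field of view.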

This patent application is credited to two inventors, Luis R. Deliz Centeno, and Avi Bar-Zeev. As well as separately having worked on related projects, they have both previously collaborated on a system for Apple AR to change resolution to suit where a user is looking.


