
ARKit

iOS Foundation UniParthenope


ARKit
iOS 11 introduces ARKit, a new framework that allows you to easily create
unparalleled augmented reality experiences for iPhone and iPad.

The basic requirement for any AR experience, and the defining feature of
ARKit, is the ability to create and track a correspondence between the real-
world space the user inhabits and a virtual space where you can model visual
content.

When your app displays that content together with a live camera image, the
user experiences augmented reality: the illusion that your virtual content is part
of the real world.
What is augmented reality?
Augmented reality (AR) is a variation of virtual environments (VE), or virtual
reality.

VE technologies completely immerse a user inside a synthetic environment. While
immersed, the user cannot see the real world around him.

In contrast, AR allows the user to see the real world, with virtual objects superimposed
upon or composited with the real world.

Therefore, AR supplements reality, rather than completely replacing it. In this way,
ideally it would appear to the user that the virtual and real objects coexisted in the
same space.



What is augmented reality?
There are two forms of AR currently available to designers:

1. location-based: leverages GPS-enabled smartphones and tablets to present
digital media to users as they move through a physical area. The media (i.e.,
text, graphics, audio, video, 3D models) are triggered and oriented via GPS
and compass technologies to augment the physical environment with
narrative, navigation, and/or academic information relevant to the location.

2. vision-based (or target-based): presents digital media to users after they
point the camera on their mobile device at an object or target (e.g., QR
code, 2D target).
The current state of AR apps

PeakFinder Earth, Sky View, Pokémon Go, Ink Hunter, Gymaholic, PhotoPills


What ARKit does
With ARKit, creating an augmented reality app no longer requires building a custom
engine or finding the perfect library. Apple built the engine to work on all existing
devices with at least an A9 processor running iOS 11. That covers the iPhone 6s
or later, including the iPhone SE, all iPad Pro models, and the 2017 iPad.
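As a small, hedged illustration of that device requirement, an app can ask ARKit at runtime whether world tracking is supported before offering the AR experience:

    import ARKit

    // World tracking requires an A9 chip or newer; check before starting a
    // session so older devices can fall back to a non-AR experience.
    if ARWorldTrackingConfiguration.isSupported {
        // Safe to create and run an AR session on this device.
    } else {
        // Offer a non-AR fallback instead.
    }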

ARKit uses a process called Visual Inertial Odometry (VIO) to build the
correspondence between the real and virtual in your app. It combines the motion
sensors in the device with analysis of the scene gathered through the device’s
camera and produces a high-precision model of the device’s position and motion
within the world for you.
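As a rough sketch of what starting that process looks like in code (the property name sceneView is illustrative, not from these slides), a SceneKit-backed AR view begins world tracking by running a configuration on its session:

    import UIKit
    import ARKit
    import SceneKit

    // An ARSCNView renders SceneKit content on top of the live camera image.
    let sceneView = ARSCNView(frame: .zero)

    // ARWorldTrackingConfiguration drives VIO: camera frames and motion-sensor
    // data are fused into a high-precision model of the device's pose.
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)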



What ARKit does
To produce this result, the VIO process recognizes notable features in the scene
image, tracks differences in the positions of those features, and compares that
information with motion-sensing data as the device moves.

ARKit doesn’t only determine that you’re pointing your phone to the east. It also
analyzes and tries to understand the scene to the east.

ARKit can find real-world surfaces that correspond to points in the camera
image. It can detect flat surfaces (though not vertical ones) and provide
information on the position and size of these surfaces. Your app can place virtual
objects on and interact with these points and surfaces.
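A sketch of what enabling plane detection could look like, assuming a view controller that owns the ARSCNView and acts as its delegate (the class and property names are illustrative):

    import UIKit
    import ARKit
    import SceneKit

    class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {
        let sceneView = ARSCNView(frame: .zero)

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)
            sceneView.delegate = self

            // Ask ARKit to detect flat, horizontal surfaces while tracking.
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal
            sceneView.session.run(configuration)
        }

        // Called when ARKit adds an anchor for a newly detected surface; the
        // ARPlaneAnchor carries the plane's position (center) and size (extent).
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
            print("Plane detected at \(planeAnchor.center), extent \(planeAnchor.extent)")
        }
    }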
Visual Inertial Odometry
VIO analyzes camera data ("visual") to identify landmarks it can use to measure
("odometry") how the device is moving in space relative to the landmarks it sees.
Motion sensor ("inertial") data fills in the blanks, providing complementary
information that the device can compare with what it's seeing to better
understand how it's moving in space.

Essentially, VIO allows the system to create animated 3D graphics that it can
visualize live in "6 degrees of freedom," following the device's complex
movements along six axes: three of translation (up/down, left/right,
forward/back) and three of rotation (pitch, yaw, and roll).
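As a brief illustration (the class name below is invented for this sketch, not part of ARKit), an app can observe that 6-degree-of-freedom pose through ARKit's per-frame session callback, where the camera transform carries the translation and the Euler angles carry pitch, yaw, and roll:

    import ARKit

    // A minimal sketch of reading the pose that VIO produces for each frame.
    final class PoseLogger: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            let position = frame.camera.transform.columns.3  // x, y, z in world space
            let rotation = frame.camera.eulerAngles          // pitch, yaw, roll (radians)
            print("Device position: \(position), rotation: \(rotation)")
        }
    }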



Detecting planes
To get an idea of how well features are being recognized in different
scenarios, there are two debugging options available when we use SceneKit
through an ARSCNView: "Show feature points" and "Show world origin".

The first one will ask ARKit to render the “feature points” as they are being
identified. They will appear as yellow dots.

The second one will draw the X, Y, and Z axes at the world origin.
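A minimal sketch of turning both options on, assuming an ARSCNView property named sceneView (the name is illustrative):

    import ARKit
    import SceneKit

    let sceneView = ARSCNView(frame: .zero)

    // "Show feature points" renders recognized feature points as yellow dots;
    // "Show world origin" draws the X, Y, and Z axes at the session's origin.
    sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                              ARSCNDebugOptions.showWorldOrigin]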
What can we do with ARKit
The ability to translate distances on a screen to distances in the real world
provides a foundation for apps that need to place objects accurately onto a view.

For example, IKEA developed IKEA Place, an app that lets customers preview
furniture in a room before heading to the store. Design and home decor
companies will likely add similar abilities to their apps, making online
shopping for these items feel a little less risky.
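A minimal sketch of the placement step such apps build on, assuming a tap location in view coordinates and an already running session with plane detection enabled (the function and names are illustrative, not IKEA's actual code):

    import UIKit
    import ARKit
    import SceneKit

    // Place a simple virtual box where the user tapped, if the tap lands on a
    // detected plane; a real app would insert a furniture model instead.
    func placeObject(at tapLocation: CGPoint, in sceneView: ARSCNView) {
        let results = sceneView.hitTest(tapLocation, types: .existingPlaneUsingExtent)
        guard let hit = results.first else { return }

        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        let t = hit.worldTransform
        box.position = SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)
        sceneView.scene.rootNode.addChildNode(box)
    }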



What can we do with ARKit
Games seem a natural fit for this functionality. The ability to move around and
interact with a game as though it’s truly happening in the real world opens up a
whole new category of gaming.

For example, the idea of games that bring people into a shared virtual
environment is very interesting. Instead of looking at a representation of a
board on the screen, an augmented reality game can show the board on the desk
in front of you.

The concept would also work for card games, role-playing games, and other
shared game experiences.
What can we do with ARKit
Augmented reality has a lot of potential to revolutionize education at many levels.
Apps that take an abstract or large-scale concept and place it in front of
someone's eyes can be a powerful tool to help students understand new topics
and concepts.

Beyond the classroom, historical locations can now augment the visitor
experience with AR apps. Instead of handing visitors a simple pamphlet at a
historic site, each visitor's phone could become their key to understanding
the past. Points of interest could provide relevant visual and auditory
information when you visit a particularly important location.
What can we do with ARKit
Current navigational apps simply tell you to turn left at the next intersection. An
augmented reality app could draw the path for you, showing the turn around the
corner. A hiking application can show you the fork to take for the desired trail, or
lead you back to the trail when lost.
A business could help customers better navigate its stores with an app.

Hospitals and universities in particular often have decades' worth of remodels
and additions that make finding certain rooms or floors rather difficult.

Imagine an app that would lay out the path to a doctor's office or waiting
room right on your phone screen.
