- Motion tracking
- Environmental understanding
- Light estimation
Motion tracking
As users move their phone through the real world, the device's camera captures the scene while the phone's position and orientation keep changing. To understand where the phone is relative to the world around it, ARCore uses a process called concurrent odometry and mapping, or COM. ARCore detects visually distinct features in the captured camera image, called feature points, and uses these points to compute the device's change in location.
The objective is to track the device's position and orientation in the real world over time; this visual information is combined with inertial measurements from the device's IMU to estimate the pose (position and orientation) of the camera.
ARCore aligns the pose of a virtual camera with the pose of the device's camera, so the app can render 3D content from the correct perspective. The rendered virtual image is then overlaid on top of the image captured by the device's camera, making the virtual content appear to be part of the real world.
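As a rough sketch of how this looks in code, the camera pose and the matrices for the virtual camera can be read once per frame from the ARCore session. This assumes an already-configured `Session` inside a GL render loop; the method name and the near/far clip values are placeholders:

```java
import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;
import com.google.ar.core.exceptions.CameraNotAvailableException;

// Called once per rendered frame; `session` is an already-configured Session.
void onDrawFrame(Session session) throws CameraNotAvailableException {
    Frame frame = session.update();      // latest camera image + tracking data
    Camera camera = frame.getCamera();   // the physical camera ARCore tracks

    if (camera.getTrackingState() == TrackingState.TRACKING) {
        // Pose of the device camera in world space, estimated via COM + IMU.
        Pose cameraPose = camera.getPose();

        // View/projection matrices that drive the virtual camera, so the
        // rendered 3D content lines up with the real camera image.
        float[] viewMatrix = new float[16];
        float[] projectionMatrix = new float[16];
        camera.getViewMatrix(viewMatrix, 0);
        camera.getProjectionMatrix(projectionMatrix, 0, 0.1f, 100f);
    }
}
```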
Environmental understanding
ARCore constantly improves its understanding of the real world through feature point and surface detection. It looks for clusters of feature points that appear to lie on a common horizontal or vertical surface, such as a table, the floor, or a wall, and makes each such surface available to your app as a plane. Virtual objects can then be rendered so that they appear to rest on these planes.
Note that flat surfaces without texture, such as plain white walls and floors, are difficult to detect, and ARCore may not expose them as planes.
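A minimal sketch of listing the planes ARCore has detected so far, assuming an active `Session` (the log output is only illustrative):

```java
import com.google.ar.core.Plane;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;

// Enumerate the surfaces ARCore has detected so far.
void listDetectedPlanes(Session session) {
    for (Plane plane : session.getAllTrackables(Plane.class)) {
        // Only planes that are currently tracked, and not merged into
        // another plane, are useful render targets.
        if (plane.getTrackingState() == TrackingState.TRACKING
                && plane.getSubsumedBy() == null) {
            Plane.Type type = plane.getType(); // e.g. HORIZONTAL_UPWARD_FACING, VERTICAL
            float widthMeters = plane.getExtentX();
            float depthMeters = plane.getExtentZ();
            android.util.Log.d("ARCore", "Plane " + type + ": "
                    + widthMeters + " m x " + depthMeters + " m");
        }
    }
}
```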
Light estimation
ARCore can also estimate the lighting of the surroundings, which helps rendered objects look realistic. Based on the light data, it provides values that your app can use to adjust the color and intensity of the rendered scene to match the environment.
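A small sketch of reading the light estimate for the current frame, assuming ambient-intensity light estimation (the default mode) and a `frame` obtained from `session.update()`:

```java
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;

// Read ARCore's light estimate and use it to tint the rendered scene.
void applyLighting(Frame frame) {
    LightEstimate estimate = frame.getLightEstimate();
    if (estimate.getState() == LightEstimate.State.VALID) {
        // Average pixel intensity of the camera image (0 = dark, 1 = bright).
        float intensity = estimate.getPixelIntensity();

        // Per-channel (RGBA) color correction to match virtual object
        // colors to the ambient light; pass these values to your shader.
        float[] colorCorrection = new float[4];
        estimate.getColorCorrection(colorCorrection, 0);
    }
}
```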
User interaction
The device screen can be regarded as an (x, y) plane, so wherever the user taps, the touch yields a pair of screen coordinates. ARCore casts a ray from that point into the camera's view of the world and returns the planes and feature points that the ray intersects, along with the pose of each intersection in world space.
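A sketch of such a hit test, assuming the current `Frame` and the tap coordinates in pixels (the helper name `pickPlaneHit` is mine):

```java
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.core.Trackable;

// Cast a ray from the tapped screen coordinates into the scene and return
// the first hit that lies inside a detected plane's polygon.
HitResult pickPlaneHit(Frame frame, float tapX, float tapY) {
    for (HitResult hit : frame.hitTest(tapX, tapY)) {
        Trackable trackable = hit.getTrackable();
        if (trackable instanceof Plane
                && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
            return hit; // hit.getHitPose() is the intersection in world space
        }
    }
    return null; // the ray did not cross any detected plane
}
```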
Oriented points
Since ARCore uses clusters of feature points to detect surfaces, oriented points let us place objects on angled surfaces as well. When the user performs a hit test, ARCore looks at the feature points near the hit and uses them to estimate the angle of the surface at that spot; the returned pose is oriented to match that angle.
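Building on the hit test above, a small check for whether a hit landed on an oriented point (the helper name is again illustrative):

```java
import com.google.ar.core.HitResult;
import com.google.ar.core.Point;

// True if the hit landed on a feature point whose surface normal ARCore
// could estimate from nearby points.
boolean isOrientedPointHit(HitResult hit) {
    if (hit.getTrackable() instanceof Point) {
        Point point = (Point) hit.getTrackable();
        // ESTIMATED_SURFACE_NORMAL means hit.getHitPose() is rotated to
        // match the angled surface, so a placed object sits flush on it.
        return point.getOrientationMode()
                == Point.OrientationMode.ESTIMATED_SURFACE_NORMAL;
    }
    return false;
}
```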
Anchors and trackables
Poses can change because ARCore continuously refines its estimate of the surfaces and angles in the real world. When you want to place a virtual object, first define an anchor, then position the object relative to that anchor. Planes and points are a special type of object called a trackable; attaching the anchor to a trackable keeps the relationship between the object and the trackable stable even as the device moves or changes pose over time. For example, if you place an object on a desk, the object stays on the desk even when the device's position changes.
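A minimal sketch of anchoring, reusing the hit test from above; `placeObject` and `drawAnchoredObject` are illustrative names:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Pose;
import com.google.ar.core.TrackingState;

// Pin a virtual object to the real world: the anchor is attached to the
// trackable (plane or point) that the hit ray intersected.
Anchor placeObject(HitResult hit) {
    return hit.createAnchor();
}

// Each frame, re-read the anchor's pose; ARCore updates it as its model of
// the world improves, keeping the object glued to the surface.
void drawAnchoredObject(Anchor anchor) {
    if (anchor.getTrackingState() == TrackingState.TRACKING) {
        Pose pose = anchor.getPose();
        float[] modelMatrix = new float[16];
        pose.toMatrix(modelMatrix, 0); // feed this to the renderer
    }
    // Call anchor.detach() when the object is removed, so ARCore can stop
    // tracking the anchor.
}
```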
Augmented images
Augmented images let an app react to specific 2D images in the real world. For example, take a 2D movie poster: when the camera is pointed at the poster, its characters can pop out and enact a scene, turning the flat image into a 3D experience.
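A rough sketch of registering a reference image so ARCore can recognize it, assuming you already have the poster as a `Bitmap` (the label "poster" is arbitrary). Recognized images then show up as `AugmentedImage` trackables in subsequent frames:

```java
import android.graphics.Bitmap;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Register a reference image and enable augmented image tracking.
void enableAugmentedImages(Session session, Bitmap posterBitmap) {
    AugmentedImageDatabase database = new AugmentedImageDatabase(session);
    database.addImage("poster", posterBitmap);

    Config config = new Config(session);
    config.setAugmentedImageDatabase(database);
    session.configure(config);
}
```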
AR principles
- Designing environments
- Creating and manipulating virtual objects
- Real-world movement
- Making user interfaces
Runtime considerations
You can provide the best possible user experience by ensuring that your app:
- Provides clear feedback to users
- Encourages them to move their device
- Shows them how to interact with their device to experience AR
Encourage users to move the camera slowly
Since ARCore requires both visual and motion information, rapid device movement can cause the camera image to become blurry, which reduces ARCore's ability to track the device and detect features. During such movement, ARCore has to rely mostly on IMU (Inertial Measurement Unit) data to estimate the pose, which is less reliable on its own. So encourage users to avoid extended periods of rapid movement, which can cause ARCore to lose tracking and prevent feature detection; one way to surface this is sketched below.
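One way to give that feedback is to inspect the tracking failure reason that recent ARCore versions report on the camera and turn it into a user-facing hint (the messages here are placeholders):

```java
import com.google.ar.core.Camera;
import com.google.ar.core.TrackingState;

// Map the current tracking state to a hint shown in an on-screen message.
String getUserHint(Camera camera) {
    if (camera.getTrackingState() == TrackingState.TRACKING) {
        return null; // tracking is fine, no hint needed
    }
    switch (camera.getTrackingFailureReason()) {
        case EXCESSIVE_MOTION:
            return "Move the device more slowly.";
        case INSUFFICIENT_LIGHT:
            return "The scene is too dark. Try adding more light.";
        case INSUFFICIENT_FEATURES:
            return "Point the camera at a textured surface.";
        default:
            return "Tracking lost. Move the device around slowly.";
    }
}
```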
Enable ARCore in Android Studio
First, update the Android Studio IDE to version 3.1 or higher. I am currently on 3.6.1, and I recommend updating to the latest version available.
Second, update the SDK platform to Android 7.0 (API level 24) or higher.
We have to follow some steps to enable ARCore in our project:
- Add AR Required or AR Optional entries to the manifest
- Add build dependencies to your project
- Perform runtime checks to ensure the device is ARCore-supported, that Google Play Services for AR is installed on it, and that camera permission has been granted (see the sketch after this list)
- Make sure your app complies with ARCore's User Privacy Requirements
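A sketch of the runtime checks from the list above, assuming they run from the activity's `onResume()`; for an AR Optional app you would additionally gate the AR entry point on `ArCoreApk.getInstance().checkAvailability(...)`:

```java
import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import com.google.ar.core.ArCoreApk;
import com.google.ar.core.exceptions.UnavailableException;

// Runtime checks, typically run from the activity's onResume().
// `userRequestedInstall` should start out true and be set to false if the
// user cancels the installation flow.
boolean ensureArReady(Activity activity, boolean userRequestedInstall)
        throws UnavailableException {
    // 1. Camera permission.
    if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
                activity, new String[] {Manifest.permission.CAMERA}, 0);
        return false; // wait for onRequestPermissionsResult()
    }
    // 2. Install or update Google Play Services for AR if necessary.
    switch (ArCoreApk.getInstance().requestInstall(activity, userRequestedInstall)) {
        case INSTALL_REQUESTED:
            return false; // Play Store flow started; retry in the next onResume()
        case INSTALLED:
        default:
            return true;  // safe to create the ARCore Session now
    }
}
```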
AR Optional apps
- AR Optional apps can run on devices that don't support ARCore; the AR features are simply unavailable there.
- For AR Optional apps, the Google Play Store does not automatically install Google Play Services for AR; the app must request its installation at runtime before using any AR features.
AndroidManifest.xml
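For an AR Optional app, the manifest entries look like the following (this is the documented `com.google.ar.core` meta-data entry; switch the value to `required` for an AR Required app):

```xml
<uses-permission android:name="android.permission.CAMERA" />

<application>
    <!-- "optional" for AR Optional apps; "required" for AR Required apps,
         which the Play Store only offers to ARCore-supported devices. -->
    <meta-data android:name="com.google.ar.core" android:value="optional" />
</application>
```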