06-25-2020 09:49 PM (Last edited 06-25-2020 09:52 PM)

AR applications on Android have long had problems with proper depth sensing and with distinguishing between foreground and background in the physical world. Whenever you added an AR object, it would simply sit on top of the whole scene in front of your viewfinder, regardless of whether something should realistically block the view. After an extensive preview phase introduced last year, Google is now launching its new Depth API in ARCore to all developers using Android and Unity.
The new interface helps distinguish between real-world foreground and background, so digital objects can be properly occluded, while also improving their pathfinding and physics.
Developers can integrate the technology into their projects starting today, and you should see the change in some of Google's own products. The API uses a depth-from-motion algorithm, similar to Google Camera's bokeh Portrait Mode, to create a depth map. This is achieved by taking multiple images from different angles as you move the phone, which lets the system estimate the distance to every pixel you see on the screen. At the moment, the API relies on a single camera for this.
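The geometry behind depth-from-motion is the same triangulation idea used in stereo vision: if you know how far the camera moved between two views (the baseline) and how far a pixel shifted between the frames (its disparity), the distance to that point falls out of the classic relation depth = focal length × baseline / disparity. The sketch below is a toy Python illustration of that relation only, not Google's actual algorithm, which also handles camera rotation, matching errors, and smoothing:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic triangulation: depth = f * B / d.

    focal_px:     camera focal length, in pixels
    baseline_m:   how far the camera moved between the two views, in metres
    disparity_px: how far the pixel shifted between the two frames
    """
    if disparity_px <= 0:
        # No measurable parallax: the point is effectively at infinity.
        return float("inf")
    return focal_px * baseline_m / disparity_px

# A pixel that shifts 20 px between two views taken 5 cm apart,
# through a lens with a 500 px focal length, is about 1.25 m away.
print(depth_from_disparity(500, 0.05, 20))  # 1.25
```

Nearby points shift more than distant ones as you move the phone, which is why the system needs you to move the device at all: with zero baseline there is no disparity to measure.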
Thanks to this depth information, digital objects can be hidden or partially hidden behind real-world materials. The first Google product to be equipped with the API is the Scene Viewer, which is part of Google Search. It lets you view all kinds of animals and more right in front of your camera — just search for "cat" on your ARCore-enabled device, for example.
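Occlusion itself is conceptually a per-pixel depth comparison: a virtual pixel is drawn only if the virtual object is closer to the camera than the real-world surface at that pixel. On-device this happens on the GPU against ARCore's depth image, but the idea can be sketched in a few lines of plain Python (the flat lists standing in for depth maps are an assumption for illustration):

```python
def occlusion_mask(real_depth, virtual_depth):
    """Per-pixel occlusion test.

    real_depth:    depths of the physical scene, in metres, one per pixel
    virtual_depth: depths of the rendered object; None where the object
                   does not cover that pixel
    Returns True where the virtual object should actually be drawn.
    """
    mask = []
    for real, virt in zip(real_depth, virtual_depth):
        # Draw only if the virtual surface is in front of the real one.
        mask.append(virt is not None and virt < real)
    return mask

# Real wall 2 m away: a virtual cat at 1.5 m is visible,
# one at 3 m is hidden behind the wall.
print(occlusion_mask([2.0, 2.0, 2.0], [1.5, 3.0, None]))
# [True, False, False]
```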
The depth information can also be used for improved path finding (so digital characters stop running through your furniture), proper surface interactions (so you can paint on more complex objects than walls), and better physics (when you throw a digital ball, it will be obstructed by real-world objects). With more and more cameras sprouting on phones' backs, Google also teases that the API will rely on additional depth sensors and time-of-flight lenses in the future to improve and speed up the mapping process: "We’ve only begun to scratch the surface of what’s possible."
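The physics use case boils down to the same depth lookup: before advancing a virtual object, check whether its new position would pass through the real-world surface at that spot, and react if so. This is a minimal sketch assuming a hypothetical `depth_at(x, y)` lookup into the depth map; real engines work in full 3D world space rather than camera distance:

```python
def step_ball(depth_at, pos, vel, dt=1 / 60):
    """Advance a virtual ball one physics tick, colliding with the real world.

    depth_at: hypothetical lookup returning the real-world depth (metres)
              at screen position (x, y)
    pos, vel: (x, y, z) tuples, with z = distance from the camera
    """
    x, y, z = (p + v * dt for p, v in zip(pos, vel))
    if z >= depth_at(x, y):
        # The ball reached a real surface: clamp it there and bounce
        # back toward the camera, losing some energy.
        z = depth_at(x, y)
        vel = (vel[0], vel[1], -0.8 * vel[2])
    return (x, y, z), vel

# A ball thrown away from the camera toward a wall 2 m out:
flat_wall = lambda x, y: 2.0
pos, vel = step_ball(flat_wall, (0.0, 0.0, 1.99), (0.0, 0.0, 3.0))
print(pos, vel)  # the ball stops at the wall and its z-velocity flips sign
```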
Apart from Google Search and an ARCore Depth Lab (APK Mirror) specifically meant to highlight the new API, the first product to receive an update that takes advantage of occlusion is Houzz, an application that lets you outfit your home with furniture in AR. There's also the TeamViewer Pilot app, helping you draw in AR to remotely assist those who aren't computer-savvy. Five Nights at Freddy's is the first game to take advantage of the API, allowing some characters to hide behind real-world objects for extra jump scares. Additionally, Snapchat has updated its Dancing Hotdog and Undersea World lenses to take advantage of occlusion.
Samsung will also release a new version of its Quick Measure app to take advantage of the new depth capabilities, making it faster and more accurate.
Starting today, the API will be available through ARCore 1.18 on all devices that currently support it, which includes most recent flagships and some mid-range phones. You can get the update from the Play Store or APK Mirror. Interested developers can head to the ARCore website for more information, where they'll also find updated SDKs. The changes to these websites should go live over the next few hours.
https://play.google.com/store/apps/details?id=com.google.ar.unity.arcore_depth_lab
https://play.google.com/store/apps/details?id=com.google.ar.core