
Google gives Android depth sensing and object occlusion with ARCore 1.18



Virtual objects can appear behind real objects and collide with them.

The latest version of ARCore, Google's augmented reality developer platform for Android phones, now includes a Depth API. The API was released as a preview back in December, but now it's live for everyone in ARCore 1.18.

 

Previously, ARCore would map out walls and floors and scale AR objects accordingly, but the Depth API enables things like occlusion, letting virtual objects appear to be behind objects in the real world. The other big feature enabled by depth sensing is simulated physics: you can toss a virtual object down a real-life staircase and have it bounce around realistically.
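On the Android side, opting into depth is a session-configuration change, and the per-frame depth image is what an occlusion shader or physics system would consume. Here's a minimal sketch in Kotlin against the ARCore Java SDK, assuming a Session and Frame obtained elsewhere; the function names configureDepth and onFrame are illustrative, not from Google's samples:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Sketch: opt the session into depth (it's off by default).
fun configureDepth(session: Session) {
    val config = Config(session)
    config.depthMode = Config.DepthMode.AUTOMATIC
    session.configure(config)
}

// Sketch: each rendered frame, grab the depth image that occlusion or
// physics code would compare against virtual-object depth.
fun onFrame(frame: Frame) {
    try {
        frame.acquireDepthImage().use { depthImage ->
            // Upload to a texture for an occlusion shader, or sample it
            // to collide virtual objects against real surfaces.
        }
    } catch (e: NotYetAvailableException) {
        // Depth needs a few frames of camera motion before it's available.
    }
}
```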

3D sensing

While Apple is building more advanced hardware into its devices for augmented reality, namely the lidar sensor in the iPad Pro, ARCore has typically been designed to work with the lowest common denominator in camera hardware. In the past that has meant ARCore only uses a single camera, even though most Android phones, including cheap ~$100 models, come with multiple cameras that could help with 3D sensing. (Qualcomm deserves some of the blame here, since its SoCs have often only supported running a single camera at a time.)

 

In version 1.18, for the first time, ARCore can use some of this extra camera hardware to help with 3D sensing. While the Depth API can run in a single-camera mode that uses motion to determine depth values, it can also pull in data from a phone's time-of-flight (ToF) sensor to improve depth quality. Samsung was one of the companies called out as specifically supporting this, in the Note10+ and Galaxy S20 Ultra; note that both are the highest-end SKUs of those lines. Tons of phones have multiple cameras like wide-angle and telephoto lenses, but not many phones have ToF cameras.
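Since depth support varies by device (and ToF availability even more so), the SDK exposes a capability check, and apps are expected to gate the feature on it rather than assume it. A hedged sketch, again in Kotlin against the ARCore Java SDK:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Sketch: pick a depth mode the current phone actually supports.
// With AUTOMATIC, ARCore itself decides whether to rely on motion-based
// depth alone or to blend in a time-of-flight sensor; the app doesn't
// choose the hardware path.
fun chooseDepthMode(session: Session): Config.DepthMode =
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC))
        Config.DepthMode.AUTOMATIC
    else
        Config.DepthMode.DISABLED
```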

 

For a guess at the future of ARCore, it's worth looking across the aisle at ARKit, Apple's augmented reality platform. One big depth feature in ARKit that doesn't seem to be mentioned in Google's blog post is "people occlusion," the ability for moving objects (like a person walking through the scene) to hide virtual objects. Google's demos only show stationary objects hiding virtual objects.

 

The Depth API is available in ARCore's Android and Unity SDKs. As for users, you'll need an ARCore-compatible phone; Google maintains a list of supported devices on its developer site.
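For a sense of what the SDK hands back: the depth image is a single-plane, 16-bit image with per-pixel distances in millimeters. A small sketch of sampling it at a pixel, based on the layout described in Google's Depth API documentation (the helper name depthMillimetersAt is mine):

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Sketch: read the depth (in millimeters) at pixel (x, y) of the image
// returned by Frame.acquireDepthImage(). Assumes the single-plane,
// 16-bit little-endian layout documented for the Depth API.
fun depthMillimetersAt(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.LITTLE_ENDIAN)
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}
```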

 

 
