Google’s new AR tools work without lidar, but promise similar depth-mapping results.
At Google’s virtual I/O developer conference this year, augmented reality hasn’t exactly taken center stage. But the company’s updates to its ARCore tools on Android look like they’re adding some of the same world-scanning tricks that Apple’s lidar enables — without lidar hardware. And, in a twist, they could layer AR into prerecorded videos.
According to Google’s ARCore 1.24 announcement, the new tools let apps share depth maps captured with a phone through ARCore: in other words, the rough locations of objects in a room, represented as a “3D mesh.” Based on the partner apps Google announced in the blog post, this could be used for 3D scanning, or for AR that places objects, like furniture, realistically in a room full of things.
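ARCore’s Depth API represents a scene as a per-pixel depth image, with 16-bit values measured in millimeters. As a rough illustration of what a shareable depth map is — this is a standalone sketch, not ARCore’s actual API — here’s how such a buffer can be read and converted to meters:

```python
# Illustrative sketch only, not ARCore code. Assumes a depth map is a
# row-major list of 16-bit values, one per pixel, in millimeters
# (the convention ARCore's Depth API uses for its depth images).

def depth_at(depth_mm, width, x, y):
    """Return the depth in meters at pixel (x, y), or None if unknown."""
    value = depth_mm[y * width + x]
    if value == 0:          # 0 means "no depth estimate" at this pixel
        return None
    return value / 1000.0   # millimeters -> meters

# A tiny 2x2 "depth map": a chair at ~1.2 m, a wall at 3 m, one gap.
depth_map = [1200, 1210, 3000, 0]
print(depth_at(depth_map, 2, 0, 0))  # 1.2
print(depth_at(depth_map, 2, 1, 1))  # None (no estimate there)
```

Sharing a buffer like this between apps is what lets a second app reason about where real-world surfaces sit without doing its own scan.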
The more fascinating (and mind-bending) part involves Google’s ability to record and share depth-map data, which the company envisions as a way to place AR into things like prerecorded videos. If someone has already captured depth data for a certain room, a video of that room could use that data to place objects into the footage as effects, perhaps for social video (TikTok was shown as an example) or anywhere else.
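The core trick behind placing objects into already-shot video is a per-pixel depth test: a virtual object’s pixel is drawn only where the object is closer to the camera than the real scene was. Here’s a minimal, hypothetical sketch of that idea (not Google’s implementation), using flat per-pixel lists for a single frame:

```python
# Illustrative depth-tested compositing: the basic idea behind layering
# AR into a video whose depth map was recorded earlier. Not ARCore code.

def composite(frame, scene_depth, obj_color, obj_depth):
    """Overlay a virtual object onto a frame using per-pixel depth tests.

    All arguments are flat per-pixel lists of equal length; depths are
    in meters, with None meaning "no value" at that pixel.
    """
    out = list(frame)
    for i, d in enumerate(obj_depth):
        if d is None:
            continue  # the object doesn't cover this pixel
        scene = scene_depth[i]
        # Draw the object if the scene has no depth or the object is closer.
        if scene is None or d < scene:
            out[i] = obj_color[i]
    return out

# A 3-pixel row: a wall at 3 m, then a chair at 1 m (two pixels).
frame       = ["wall", "chair", "chair"]
scene_depth = [3.0, 1.0, 1.0]
# A virtual ball at 2 m covering the first two pixels.
obj_color   = ["ball", "ball", None]
obj_depth   = [2.0, 2.0, None]

print(composite(frame, scene_depth, obj_color, obj_depth))
# ['ball', 'chair', 'chair'] -- the closer chair occludes the ball
```

Because the depth test only needs the saved depth map, not a live camera feed, the same effect works on footage recorded earlier — which is what makes sharing depth data between apps interesting.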
Layering AR objects onto a depth-scanned chair using ARCore, without lidar.
Partner apps already experimenting with these features, according to Google, are AR Doodads (an app that layers complex Rube Goldberg machines into a room), LifeAR (a video-calling app that uses AR to project avatars), and VoxPlop (which layers 3D effects into videos after the fact).
Google first announced its depth-sensing AR features a year ago, but it’s the opened-up sharing of depth data between apps that’s new now. “These updates leverage depth from motion, making them available on hundreds of millions of Android devices without relying on specialized sensors,” Google’s post explains. “Although depth sensors like time-of-flight (ToF) sensors aren’t required, having them will further improve the quality of your experiences.”
While Apple currently leans on a physical sensor (lidar) for 3D meshes of rooms in AR, companies like Niantic are already doing similar things without it. Google now looks like it’s flexing more of those tools on Android, too.