At the opening keynote of WWDC 2022, Apple said nothing about augmented reality or virtual reality, but when it comes to ARKit and related technologies, iOS 16 actually brings plenty of improvements.
One of these is "RoomPlan," a new API for quickly creating 3D floor plans from LiDAR scans. Apple introduced LiDAR scanners with the 2020 iPad Pro and the iPhone 12 Pro, making it easier and more accurate to measure the size of objects.
LiDAR is a technology that fires a laser and measures the time it takes for the light to return to a sensor, precisely determining the distance between the device and an object. As a result, LiDAR scanners can detect the size and even the shape of objects or indoor environments.
In iOS 16, Apple combines LiDAR with the iPhone and iPad cameras in the new RoomPlan API, which lets users create 3D floor plans in seconds. According to Apple, the API's precision and ease of use make it useful for real estate, construction, and interior design applications.
Since RoomPlan is an API, developers need to create or update their applications to support this new feature. The editors at the source publication used Apple's sample code to try the RoomPlan API on an iPhone 13 Pro Max.
According to the editor, it works quite well: all you have to do is open a compatible app, point your iPhone at the walls, and scan your surroundings, and it produces a 3D floor plan of your living room in seconds. The API can even recognize some objects, although they are captured only as blocks that represent them.
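As a sketch of how such a scan is driven in code: RoomPlan exposes a `RoomCaptureView` that handles the camera feed, guidance animation, and live preview, and reports the processed result through a delegate. The view controller below is a minimal, illustrative setup in the spirit of Apple's RoomPlan sample code; the class and property names are our own.

```swift
import UIKit
import RoomPlan

// Minimal sketch of a RoomPlan scanning screen (iOS 16+, LiDAR device).
// RoomCaptureView renders the camera feed, the guidance animation, and the
// live preview; the delegate receives the final processed result.
final class RoomScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var roomCaptureView: RoomCaptureView!
    private var capturedRoom: CapturedRoom?

    override func viewDidLoad() {
        super.viewDidLoad()
        roomCaptureView = RoomCaptureView(frame: view.bounds)
        roomCaptureView.delegate = self
        view.addSubview(roomCaptureView)

        // Start scanning; the user simply points the device around the room.
        roomCaptureView.captureSession.run(
            configuration: RoomCaptureSession.Configuration())
    }

    // Return true to let RoomPlan post-process the raw scan data.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData,
                     error: Error?) -> Bool {
        return true
    }

    // Called with the processed CapturedRoom: walls, doors, windows, and
    // recognized objects (represented only as bounding boxes).
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        capturedRoom = processedResult
    }
}
```

This mirrors the flow described above: the live preview and progress animation come from `RoomCaptureView` itself, with no extra rendering code in the app.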
Beyond accuracy, the editor said that what impressed him most was the speed of scanning. The API provides a nice animation and a live preview showing the progress of the scan. 3D floor plans can be exported as USDZ files, which are compatible with popular tools such as Cinema 4D, Shapr3D, and AutoCAD.
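The USDZ export mentioned above goes through `CapturedRoom`'s `export(to:)` method. A minimal sketch, assuming a finished scan result (the helper name, file name, and error handling are our own):

```swift
import RoomPlan

// Sketch: write a finished CapturedRoom out as a USDZ file, which tools
// like Cinema 4D, Shapr3D, and AutoCAD can then open.
func exportFloorPlan(_ room: CapturedRoom) throws -> URL {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent("Room.usdz")
    try room.export(to: destination)
    return destination
}
```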
According to the source, Apple's introduction of this API on iPhone and iPad may serve purposes beyond those devices. Apple may be using it to study LiDAR-based devices that need to detect the user's surroundings in real time.
In iOS 16, Apple also added 4K HDR video support to ARKit apps for the first time, and updated the Nearby Interaction API to combine the U1 chip with AR. Since the company has reportedly been working on its own mixed reality headset, perhaps these new APIs were created with that hardware in mind.
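On the 4K HDR side, ARKit in iOS 16 exposes a recommended 4K video format and an HDR toggle on the tracking configuration. A minimal sketch of opting in, assuming the device's camera supports both:

```swift
import ARKit

// Sketch: opt an ARKit session into 4K HDR capture on iOS 16+.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Use the recommended 4K video format where the hardware offers one.
    if let fourKFormat =
        ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        configuration.videoFormat = fourKFormat
    }

    // Enable HDR only if the chosen format supports it.
    if configuration.videoFormat.isVideoHDRSupported {
        configuration.videoHDRAllowed = true
    }
    return configuration
}
```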