The iPhone 12 Pro's depth-scanning lidar sensor looks ready to open up many possibilities for 3D scanning apps on phones. A new one designed for home scanning, called Canvas, uses lidar for added accuracy and detail. The app also works on non-Pro iPhones, going back as far as the iPhone 8.
Canvas' approach shows how lidar works in iPhone 12 Pro apps right now: it can add greater accuracy and detail to processes that are already possible through other methods on phones and tablets without lidar.
Canvas, made by the Boulder-based company Occipital, was originally released for the lidar-equipped iPad Pro earlier this year. When I saw a demo of its possibilities at the time, it looked like a sign of how Apple's depth-sensing technology could be applied to home-improvement and measurement apps. The updated app promises much sharper scans.
Since lidar-equipped iPhones debuted, several apps have emerged that use the sensor for 3D scanning of objects, large-scale photo-based scanning of spaces (known as photogrammetry), and augmented reality that meshes out maps of rooms and blends in virtual objects. But a sample scan from Occipital's Canvas app on the iPhone 12 Pro, embedded below, looks crisper than what I've seen from the 3D scanning apps I've played with so far.
According to Occipital vice presidents Alex Schiff and Anton Yakubenko, Apple now gives developers access to live lidar data on the iPhone, which has allowed Occipital to build its own algorithms to get the most out of Apple's lidar depth maps. Occipital can also apply that depth-mapping data to improve future versions of the app for phones that don't have lidar.
It's possible to scan spaces in 3D without a dedicated depth-mapping lidar or time-of-flight sensor; 6d.ai (acquired by Niantic) was already doing it. But Schiff and Yakubenko say lidar offers a faster and more accurate upgrade to that technology. According to Occipital, the iPhone 12 version of Canvas produces more detailed scans than the first iPad Pro version from earlier this year, mostly because of deeper access to lidar information in iOS 14. The newest lidar-enabled version is accurate to within 1%, versus within 5% for non-lidar scans (literally making the iPhone 12 Pro a pro upgrade for those who need the boost).
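To make those percentages concrete, here's a quick back-of-the-envelope sketch. The 5-meter wall is an assumed example, not a figure from Occipital; only the 1% and 5% error bounds come from the article:

```python
# Illustrating the quoted accuracy bounds on a hypothetical 5-meter wall.
wall_length_m = 5.0                       # assumed example dimension
lidar_error_m = wall_length_m * 0.01      # lidar-enabled Canvas: within 1% -> 5 cm
non_lidar_error_m = wall_length_m * 0.05  # non-lidar scan: within 5% -> 25 cm
print(lidar_error_m, non_lidar_error_m)   # 0.05 0.25
```

On a room-sized measurement, that's the difference between an error of a few centimeters and one of a quarter meter, which matters for CAD work.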
According to Yakubenko, Apple's iPad Pro lidar provides 574 depth points per frame in scans, based on Occipital's earlier measurements, but iOS 14 gives developers access to depth maps of up to 256×192 points. Apple builds that more detailed information up through AI and camera data.
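To put those numbers in perspective, here's a quick sketch using only the figures quoted above: a 256×192 depth map carries roughly 85 times as many depth samples per frame as the 574 raw points Occipital measured.

```python
# Compare the per-frame depth samples quoted in the article.
raw_points = 574             # Occipital's earlier measurement of raw lidar points
map_w, map_h = 256, 192      # depth-map resolution exposed to developers in iOS 14
map_points = map_w * map_h   # 49,152 depth values per frame
print(map_points, map_points // raw_points)  # 49152 85
```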
Canvas room scans can be converted into workable CAD models, a process that takes about 48 hours, but Occipital is working on converting scans faster, and on using AI to add semantic data, such as recognizing doors, windows, and other room details.
As more 3D scans and 3D data become available on iPhones and iPads, it makes sense that common formats will be needed for sharing and editing files. iOS 14 uses Apple's USDZ format for 3D files; Occipital has its own format for its more detailed scans, which can be exported to .rvt, .ifc, .dwg, .skp, and .plan formats when converted to CAD models. At some point, 3D scans could become as standardized as PDFs. We're not there yet, but we may need to get there soon.