The technique would allow standard cameras to measure the distance to objects and make three-dimensional imaging widely available in smartphones. “Existing 3D smart cameras need specialized pixels, which are difficult to realize in large formats and have smaller fill factors due to the complex electronics required to capture 3D in the pixels,” Okan Atalar, a doctoral candidate in electrical engineering at Stanford University and the first author on a new paper that describes the new system, told Digital Trends in an interview.
Measuring the distance to objects with light is currently possible only with specialized and expensive lidar (short for “light detection and ranging”) systems. Lidar uses a laser that fires at objects and measures the light that bounces back. It can tell how far away an object is, how fast it’s traveling, whether it’s moving closer or farther away, and whether the paths of two moving objects will intersect.
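For a rough sense of the math, pulsed lidar ranging comes down to distance = speed of light × round-trip time ÷ 2. The snippet below is a minimal illustration of that relationship, not code from any lidar product; the function name and the example timing value are ours.

```python
# Minimal sketch of pulsed time-of-flight ranging: distance from round-trip time.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters implied by a measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a pulse that returns after about 66.7 nanoseconds came from an object ~10 m away.
print(distance_from_round_trip(66.7e-9))  # ≈ 10.0 meters
```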
The Stanford researchers’ new approach could enable megapixel-resolution lidar – a level that’s not possible today. Higher resolution would allow lidar to identify targets at a greater range.
One way to add 3D imaging to standard sensors is to pair them with a light source and a modulator that switches the light on and off millions of times every second. By measuring how the returning light varies, engineers can calculate distance. The problem is that existing modulators require impractically large amounts of power.
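The modulated-light approach works differently from single-pulse timing: it compares the phase of the outgoing and returning modulated signal and converts that phase shift into distance. The sketch below assumes a simple amplitude-modulated continuous-wave scheme; it illustrates the general principle rather than the Stanford team’s implementation, and the modulation frequency and phase value are made up for the example.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_phase(phase_shift_rad: float, modulation_hz: float) -> float:
    """Distance implied by the phase shift of returning modulated light.
    Unambiguous only up to half the modulation wavelength."""
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_hz)

# Example: light modulated at 10 MHz that returns with a pi/2 phase shift.
print(distance_from_phase(math.pi / 2, 10e6))  # ≈ 3.75 meters
```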
The Stanford team solved the modulator problem by using a phenomenon known as acoustic resonance. The researchers built an acoustic modulator using a wafer of lithium niobate — a transparent crystal that is highly desirable for its electrical, acoustic, and optical properties — coated with two transparent electrodes.
The new modulator’s design is simple and integrates into a proposed system that uses off-the-shelf cameras, like those in everyday cell phones and digital SLRs. Atalar and his advisor Amin Arbabian, an associate professor of electrical engineering and the project’s senior author, said it could become the basis for a new type of compact, low-cost, energy-efficient lidar, “standard CMOS lidar” as they call it, that could find its way into drones, extraterrestrial rovers, and other applications.
“Our approach could also work in the infrared regime,” Atalar said. “No IR image sensor can detect depth without requiring significant modifications.”
Apple includes lidar on its current iPhone 13 Pro and iPhone 13 Pro Max models. The company says the system offers better low-light focus and improves night portrait mode effects. The Stanford researchers said their lidar solution is less expensive to implement than the one used by Apple and could be installed on a wider range of phones.
Lidar scanning devices are used to determine the depth of a photo, Hans Hansen, the CEO of Brand 3D, a 3D photography company, told Digital Trends. By moving the camera around the object, the distances captured from multiple angles can be used to create a full 3D model. There’s also stereo photography, in which multiple cameras are placed a set distance apart (for example, the three-camera lens module on Apple’s iPhone Pro phones) and the combined information is used to create a spatial image of a scene or an object.
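Stereo depth boils down to triangulation: the farther apart a feature lands in the two images (its disparity), the closer it is to the cameras. The following sketch shows that textbook relationship for rectified cameras; the focal length, baseline, and disparity numbers are purely illustrative, not the specs of any particular phone.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point given the pixel disparity between two rectified cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Example: 1,400 px focal length, 1.4 cm baseline, 20 px disparity -> ~0.98 m away.
print(depth_from_disparity(1400.0, 0.014, 20.0))
```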
“We’ve seen use cases from measuring the wall distance in your home and other spatial measurements,” Sukemasa Kabayama, the CEO of Uplift Labs, which uses 3D analysis to analyze human performance using smartphones, said to Digital Trends. “Although those are not 3D cameras specifically, these smartphone cameras have the power to capture valuable data and produce 3D visualizations using video and other applications.”
Recently, researchers from MIT developed ultra-low-power radars that use conventional radar technology to detect distances to moving objects. This technology could eventually enable a new type of camera that is not sensitive to lighting issues, for example when scanning transparent objects.
“With 3D cameras, you would be able to capture scenes and objects that people remotely would be able to experience as if they were physically in the room,” Hansen said. “This would be groundbreaking for remote working, learning and for safe distances during pandemics, as well as for diagnosing, treating and repairing functions in healthcare, technology, and manufacturing sectors.”
Kabayama said the 3D imaging collected by smartphones could provide detailed analytics and enhancements across various industries. One area where 3D technology could have an impact is sports, fitness, and wellness.
“Whether you’re a CrossFit junkie, weekend golfer, or avid Peloton enthusiast, the risk of physical injury is present and for many, a constant battle,” Kabayama added. “Professional athletes have access to 3D technology that serves as a way to minimize performance-related injuries, but most of us everyday athletes do not.”
Kabayama predicted that by making 3D cameras and analysis accessible through smartphones, athletes of all skill levels could track and analyze their movement to gain detailed biomechanical analytics.
“With most injuries due to overexertion, improper form, or other poor body mechanics, 3D imaging can make pinpointing areas of improvement — whether that be form or parts of the body to strengthen — a seamless task,” he added.
Having a 3D camera on your mobile device could also make it more secure. After all, the face-based security on phones is only as good as the camera behind it, Richard Carriere, a senior vice president and general manager at CyberLink, which makes facial recognition technology, said in an interview.
Face ID, for example, which many people use to access sensitive accounts like mobile banking and work email, and to pay via smartphone, runs on Apple’s TrueDepth camera system. The more advanced 3D cameras coming to market can capture even more detailed depth readings, Carriere said.
“Improved accuracy not only reduces the number of times the technology fails to scan your face correctly, perhaps because you’re at an unusual angle, but it also critically safeguards against spoofing attempts,” Carriere added.
Source: This story was originally published by Digital Trends.