How Apple Brought $75,000 Lidar To The iPhone

During the launch of the iPhone 12 on Tuesday, Apple promoted the capabilities of its new lidar sensor. Apple says lidar will improve the iPhone's camera by enabling faster focus, particularly in low-light situations. It could also enable a new generation of cutting-edge augmented reality apps.

Apple first incorporated the technology into the new iPad in March. No one has torn down the iPhone 12 yet, but recent iPad teardowns give a good idea of what to expect.

Lidar works by emitting laser light and measuring how long it takes to bounce back. Because light travels at a constant speed, the round-trip time can be converted into a precise distance estimate. Repeat the measurement across a two-dimensional grid and the result is a three-dimensional point cloud showing the location of objects around a room, street, or other location.
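To make the arithmetic concrete, here is a minimal Swift sketch of the time-of-flight calculation. It illustrates the principle only; it is not Apple's code, and the numbers are made up for the example.

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Convert a lidar round-trip time (seconds) into a distance estimate (meters).
/// The light travels to the object and back, so the one-way distance is half
/// the total path length.
func distance(fromRoundTripTime t: Double) -> Double {
    return speedOfLight * t / 2.0
}

// An object about 5 meters away returns light in roughly 33 nanoseconds.
let roundTrip = 33.3e-9
print(distance(fromRoundTripTime: roundTrip)) // ≈ 4.99 m
```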

Apple's announcement is interesting because other companies are using the same combination of technologies, VCSEL lasers and SPAD detectors, to build more powerful lidar for the automotive market. The appeal of VCSELs and SPADs is that they can be manufactured with conventional semiconductor fabrication techniques, which means they benefit from the semiconductor industry's enormous economies of scale. As VCSEL-based sensors become more common, they are expected to get cheaper and better.

Two leading companies working on high-end VCSEL-based lidar, Ouster and Ibeo, have already racked up more sales than most companies in the crowded lidar business. Apple's decision to embrace the technology could give them an edge in the coming years.

Velodyne introduced the first three-dimensional lidar sensor more than a decade ago. The spinning unit cost around $75,000 and was much larger than a smartphone. To put one in every iPhone, Apple had to make lidar sensors dramatically cheaper and smaller, and VCSELs helped it do so.

What's a VCSEL? If you're building a laser with standard semiconductor fabrication techniques, you have two basic options. You can create a laser that emits light from the side of the wafer (known as an edge-emitting laser) or from the top (a vertical-cavity surface-emitting laser, or VCSEL).

Traditionally, edge-emitting lasers have been more powerful. VCSELs have been used for decades in everything from optical mice to optical networking gear. They were initially dismissed as unsuitable for high-end applications that require a lot of light, but they have grown more powerful as the technology has matured.

Apple, Ouster, and Ibeo are building lidar sensors with no moving parts. With many lasers on a single chip, VCSEL-based lidars can have a dedicated laser for every point in the sensor's field of view. And because all of those lasers come pre-packaged on one chip, assembly is simpler than it is for Velodyne's classic spinning design.

Recent iPhones have included another 3-D sensor, the TrueDepth camera, which powers Apple's FaceID feature. It also uses an array of VCSELs, believed to be supplied by Lumentum. TrueDepth works by projecting a grid of more than 30,000 dots onto a subject's face and then estimating the face's three-dimensional shape from the way the grid pattern is distorted.
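The idea behind that kind of structured-light sensing is often explained with a simple triangulation model: a projected dot appears to shift sideways in the camera's image as the surface it lands on gets closer, and the size of that shift encodes depth. The Swift sketch below illustrates the model with hypothetical numbers; it is not Apple's FaceID pipeline.

```swift
import Foundation

/// Simplified structured-light triangulation: a dot projected from an emitter
/// offset by `baselineMeters` from the camera shifts in the image as the
/// surface gets closer. Depth is inversely proportional to that shift.
/// Illustrative model only, not Apple's actual algorithm.
func depth(focalLengthPixels: Double, baselineMeters: Double, disparityPixels: Double) -> Double {
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Hypothetical numbers: a 600-pixel focal length, a 2.5 cm emitter-camera
// baseline, and a dot displaced by 30 pixels imply a surface about 0.5 m away.
print(depth(focalLengthPixels: 600, baselineMeters: 0.025, disparityPixels: 30)) // 0.5 m
```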

The iPad's lidar sensor emits far fewer laser dots than the TrueDepth camera. An iFixit video shot with an infrared camera showed the lidar projecting a grid of only a few hundred points. Where the TrueDepth camera infers depth from the shape of the pattern that falls on a subject's face, the iPad's lidar sensor measures distance directly, by timing how long it takes light to bounce off an object and return to the sensor. That approach likely yields both more precise depth measurements and longer range.
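Combining the two ideas above, a direct time-of-flight sensor turns a grid of round-trip times into a point cloud. The Swift sketch below shows one simplified way to do that; the grid layout and field-of-view geometry are assumptions for illustration, not a description of Apple's sensor.

```swift
import Foundation

let c = 299_792_458.0 // speed of light, m/s

/// One lidar return: the grid position of the laser dot and its round-trip time.
struct Return {
    let row: Int
    let col: Int
    let roundTripSeconds: Double
}

/// Turn a grid of round-trip times into a rough 3-D point cloud.
/// Each dot's direction is approximated from its grid position and an assumed
/// square field of view; the geometry here is illustrative only.
func pointCloud(from returns: [Return], gridSize: Int, fieldOfViewDegrees: Double) -> [(x: Double, y: Double, z: Double)] {
    let fov = fieldOfViewDegrees * .pi / 180
    return returns.map { r in
        let range = c * r.roundTripSeconds / 2
        // Angle of this dot relative to the sensor's optical axis.
        let azimuth   = (Double(r.col) / Double(gridSize - 1) - 0.5) * fov
        let elevation = (Double(r.row) / Double(gridSize - 1) - 0.5) * fov
        return (x: range * sin(azimuth),
                y: range * sin(elevation),
                z: range * cos(azimuth) * cos(elevation))
    }
}

// A single dot at the center of a 60-degree field of view, returning in ~20 ns,
// maps to a point roughly 3 meters straight ahead.
let cloud = pointCloud(from: [Return(row: 4, col: 4, roundTripSeconds: 20e-9)],
                       gridSize: 9, fieldOfViewDegrees: 60)
print(cloud[0]) // (x: 0.0, y: 0.0, z: ≈3.0)
```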
