Stanford Engineering team develops new approach to enabling standard image sensors to see in 3D

Engineers develop a way to make simple cameras see in 3D

The billions of standard image sensors already built into almost every smartphone capture color and light intensity. These cameras rely on common CMOS sensor technology, and they have grown smaller and more powerful each year; they now offer resolutions in the tens of megapixels. But they have only ever seen in two dimensions, capturing images that are flat, like a drawing.

Stanford University researchers have developed a new approach that allows standard image sensors to see light in three dimensions. Put another way, these common cameras could soon be used to measure the distance to objects.

The engineering possibilities for this technology are vast. Today, measuring distance with light requires expensive and specialized lidar systems. If you have seen a self-driving car, you can spot it by the hunchback of technology mounted on its roof: that is the lidar system, the car's most important piece of gear, which uses lasers to measure the distances to surrounding objects.
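The article doesn't spell out how lidar turns laser light into distance, but the standard principle is time-of-flight: a pulse travels to the target and back at the speed of light, so distance is half the round-trip time multiplied by that speed. A minimal sketch of the idea (the function name and example timing are illustrative, not from the article):

```python
# Time-of-flight ranging, the principle lidar is built on:
#   distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light in meters per second

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target given a laser pulse's round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A pulse returning after roughly 66.7 nanoseconds implies a target about 10 m away.
print(round(tof_distance_m(66.7e-9), 2))  # → 10.0
```

The factor of two is the easy part to forget: the measured time covers the trip out *and* back, so only half of it corresponds to the one-way distance.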