
All the data self-driving cars take in from cameras looks like this

Self-driving cars are almost too observant, taking in information from LiDAR sensors, radar equipment, microphones, and cameras. But all the information a car gleans from the outside world still has to be wrangled into a useful form.

Cruise’s fleet of self-driving cars testing in San Francisco takes in petabytes of data each month from its sensor suite, both on the road and in simulation — a setup similar to the sensor configurations other self-driving car companies run on their autonomous vehicles. A petabyte is a million gigabytes, by the way.
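To get a feel for how quickly a sensor suite reaches that scale, here is a back-of-the-envelope sketch in Python. Every figure in it — per-sensor data rates, testing hours, and fleet size — is an illustrative assumption, not a number Cruise has published; only the petabyte definition comes from the article.

```python
# Back-of-the-envelope: how a fleet's sensor suite reaches petabytes per month.
# All rates and fleet numbers below are illustrative assumptions.

GB_PER_PETABYTE = 1_000_000  # the article's definition: 1 PB = 1,000,000 GB

# Assumed combined throughput per car, in GB per hour of driving.
sensors_gb_per_hour = {
    "cameras": 50.0,  # assumed multi-camera video rate
    "lidar": 20.0,    # assumed point-cloud rate
    "radar": 1.0,     # assumed radar rate
}

hours_per_day = 8    # assumed daily testing hours per car
cars = 100           # assumed fleet size
days_per_month = 30

gb_per_car_hour = sum(sensors_gb_per_hour.values())
monthly_gb = gb_per_car_hour * hours_per_day * cars * days_per_month
monthly_pb = monthly_gb / GB_PER_PETABYTE

print(f"{monthly_pb:.2f} PB/month")  # prints "1.70 PB/month" with these assumptions
```

Even with modest assumed rates, a hundred-car fleet crosses the petabyte line in a month — which is the scale of data a tool like Webviz is meant to help engineers sift through.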

So to corral all this information, Cruise — through a hackathon event — created an open-source data visualization platform called Webviz. Other autonomous vehicle companies have open-sourced different pieces of the self-driving stack, like Baidu’s Apollo autonomous driving platform. Now Cruise is opening up its application to anyone who works with robotics.

This is what it looks like for an autonomous car picking up camera shots and turning them into useful data points.

With Webviz, engineers can make sense of autonomous vehicle data, analyze what the cars are doing out on the streets, and help decide how the cars should drive or approach different situations. Even though some aspects are specific to robo-cars, Cruise says anyone in the robotics community can use the program.

So someone who works with a delivery bot, or a humanoid robot that mimics human movement, can plug in data from their cameras and sensors, then lay it out and visualize it for further analysis and interpretation, just like autonomous vehicle teams do.

Cruise says it uses the platform to watch simulations live or to examine past rides from an older data set. Here’s a live demo to see how the data is displayed.

Interpreting all that data that comes in.

Cruise previously opened up its 2D and 3D scene rendering library, Worldview, and Uber made its Autonomous Visualization System publicly available around the same time back in February to turn self-driving data into 3D scenes.

Anyone who wants to start looking through their robotics data can now use Webviz.

