Outsight’s Augmented LiDAR Box is a lidar-agnostic plug & play edge computing device.
As lidar sensors for perception become an essential technology for developers of autonomous vehicles, dozens of companies are working to improve upon the technology, including developers of more reliable solid-state lidar sensors and of sensors that can identify objects at longer range.
However, one of the challenges for developers of autonomous vehicles that use lidar sensors for perception is pre-processing massive amounts of raw lidar sensor data so it can be used along with other perception software. There are also many different lidar sensors on the market from dozens of companies, all with different capabilities and specifications.
But French photonics company Outsight has come up with a solution. The company announced today the launch of what it calls a game-changing product named the Augmented LiDAR Box (ALB), which is a lidar-agnostic plug & play edge computing device.
The ALB overcomes the complexity of using raw 3D lidar data, so application developers and integrators don’t need to become lidar experts, according to Outsight. Its pre-processing software makes 3D lidar easier to use. With the ALB, customers can receive both the raw, high-resolution point cloud data and the pre-processed data.
Making sense of raw 3D spatial lidar data can be quite challenging, which can hinder the development of autonomous vehicles and other systems that rely on lidar data.
The ALB provides a comprehensive set of fundamental features that are commonly required in almost every autonomous vehicle application, including 3D Simultaneous Localization and Mapping (SLAM), object identification & tracking, and segmentation & classification.
The ALB uses edge processing and requires only an ARM CPU. In addition, its AI doesn’t rely on machine learning, so it’s power-efficient and requires no training or data-annotation effort.
“The hardware aspect of LiDAR is becoming a commodity with prices decreasing very quickly together with impressive performance improvements. However, this new animal in the Computer Vision landscape remains a complex technology for most customers to use,” said Raul Bravo, President and co-founder of Outsight. “We’re convinced that the key condition required for LiDAR to become mainstream is the emergence of enabling software.”
Lidar works by emitting pulses of invisible laser light that are reflected back off objects. The reflected light is used to render a 3D image of the environment known as a lidar point cloud. By measuring the time-of-flight of each pulse, lidar determines distance; tracking those measurements over time can also reveal the velocity and direction of moving objects.
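The time-of-flight principle above reduces to simple arithmetic: a pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal illustrative sketch (the function name and example timing are ours, not Outsight's):

```python
# Illustrative time-of-flight ranging, as used by lidar sensors.
C = 299_792_458  # speed of light in m/s


def range_from_tof(round_trip_seconds: float) -> float:
    """Return target distance in meters for a measured round-trip time.

    The pulse covers the sensor-to-target distance twice (out and back),
    hence the division by 2.
    """
    return C * round_trip_seconds / 2


# A return arriving after roughly 667 nanoseconds corresponds to a
# target about 100 meters away.
distance_m = range_from_tof(667e-9)
```

Velocity estimates then follow from how a tracked object's range changes between successive scans.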
Lidar is an essential sensor for developers of self-driving vehicles, acting as an extra set of eyes on the road. Lidar is often combined with camera and radar data to improve perception for autonomous vehicles. When data from multiple lidar sensors is combined using sensor fusion, it can offer complete 360-degree perception coverage around the vehicle.
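Fusing several lidars into one 360-degree view means expressing every sensor's points in a common vehicle frame, using each sensor's known mounting pose. A simplified 2D sketch under assumed mounting positions (the sensor placements and function are hypothetical, not from Outsight):

```python
import math


def to_vehicle_frame(points, yaw_rad, tx, ty):
    """Rotate and translate 2D sensor points into the vehicle frame.

    yaw_rad is the sensor's mounting orientation; (tx, ty) its position
    on the vehicle. Applies a standard rigid-body transform per point.
    """
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]


# Hypothetical setup: one forward-facing lidar mounted 2 m ahead of the
# vehicle origin, one rear-facing lidar mounted 2 m behind it.
front = to_vehicle_frame([(1.0, 0.0)], 0.0, 2.0, 0.0)
rear = to_vehicle_frame([(1.0, 0.0)], math.pi, -2.0, 0.0)

# Once both clouds share the vehicle frame, merging is concatenation.
merged = front + rear
```

Real systems do the same in 3D with calibrated extrinsics per sensor, but the principle is identical: transform into a shared frame, then combine.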
Outsight’s Augmented LiDAR Box is Agnostic
The ALB is the first real-time lidar software engine that allows developers to seamlessly use lidar data from any hardware supplier, including Velodyne, Ouster, Hesai and Robosense, all of which have entered into strategic partnership agreements and are collaborating with Outsight.
The ALB avoids the added complexity of processing 3D data in real-time, according to Outsight. Since it’s an agnostic solution that works with lidar sensors from different manufacturers, it saves the customer the hassle of assessing each individual model and choosing the most appropriate lidar sensor for each application.
Outsight’s ALB is not just for self-driving vehicles, however; other applications include robotics and security & surveillance equipment, such as smart cameras that use lidar.
Outsight said that many companies want to leverage the real-time 3D spatial intelligence that lidar technology provides for perception and safe navigation of autonomous vehicles, but don’t want to deal with the added complexity of processing the raw lidar data.
In addition, assessing, selecting and using the optimal lidar sensor from dozens of hardware suppliers is time-consuming and is not a valuable use of engineering resources, especially for startups and universities developing autonomous machines.
The launch of ALB also follows the successful deployment of Outsight’s technology at Paris Charles de Gaulle airport to provide accurate real-time monitoring of people flow while preserving private data.
Outsight was recognized with a Best of CES Innovation Award in Las Vegas in 2020 for its 3D Semantic Camera. The camera combines hardware with edge processing software for object detection, tracking and classification.
The 3D semantic camera uses lidar from Silicon Valley-based Velodyne. It merges this lidar data with RGB color data using an embedded AI processor.
Outsight is also the youngest company to win a Prism Award that honors the best new optics and photonics products on the market.
Outsight aims to accelerate the adoption of technology that uses lidar with its easy-to-use and scalable pre-processing hardware, which can help companies build transformative products.
The company is based in Paris with additional offices in Helsinki and San Francisco.