Overview
This project was developed as a proof of concept to provide an immersive visualization of vehicle sensor data in augmented reality. By placing a virtual vehicle model in the real world and displaying sensor data around it, engineers and developers can better understand how their sensors interact with the environment and debug issues more effectively.
Demo
Details
- Technologies: ARKit, Unity, ROS (Robot Operating System), C#
- Role: Lead Developer
- Year: 2019
The Problem
Vehicle sensor data is traditionally visualized with 2D interfaces or complex desktop 3D visualization tools. These approaches make it difficult, especially for non-technical stakeholders, to build an intuitive picture of how the sensors interact with the physical world.
The Solution
By creating an AR application that visualizes sensor data in the real world, we can provide a more intuitive understanding of how sensors work. The app shows a 3D model of the vehicle with real-time sensor data overlaid, making it easier to:
- Visualize sensor detection ranges (sketched below)
- Observe environmental interactions
- Detect blind spots or sensor coverage issues
- Communicate sensor concepts to non-technical stakeholders
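As a rough illustration of the first point, the sketch below shows one way a detection range could be rendered in Unity as a simple field-of-view fan attached to a sensor's mount point on the vehicle model. The class name, default values, and rendering approach are illustrative assumptions, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: draws a flat arc in front of a sensor transform to
// indicate its field of view and maximum range. Attach to an empty child
// object placed at the sensor's mount point on the virtual vehicle.
public class SensorRangeOverlay : MonoBehaviour
{
    [Tooltip("Horizontal field of view in degrees (assumed value).")]
    public float fieldOfViewDegrees = 60f;

    [Tooltip("Maximum detection range in metres (assumed value).")]
    public float maxRangeMeters = 5f;

    [Tooltip("Number of segments used to approximate the arc.")]
    public int segments = 32;

    private LineRenderer line;

    private void Awake()
    {
        // A LineRenderer gives a cheap wireframe outline that reads well in AR.
        // In practice a material/colour would be assigned per sensor type.
        line = gameObject.AddComponent<LineRenderer>();
        line.useWorldSpace = false;        // follow the sensor transform
        line.loop = true;                  // close the outline back to the origin
        line.widthMultiplier = 0.01f;
        line.positionCount = segments + 2; // origin + arc points
    }

    private void Update()
    {
        // The apex of the fan sits at the sensor itself.
        line.SetPosition(0, Vector3.zero);

        float halfFov = fieldOfViewDegrees * 0.5f;
        for (int i = 0; i <= segments; i++)
        {
            // Sweep from -halfFov to +halfFov around the sensor's up axis.
            float angle = Mathf.Lerp(-halfFov, halfFov, (float)i / segments);
            Vector3 direction = Quaternion.AngleAxis(angle, Vector3.up) * Vector3.forward;
            line.SetPosition(i + 1, direction * maxRangeMeters);
        }
    }
}
```

Because the outline is driven by the sensor's transform in local space, it stays aligned with the vehicle model as the user walks around it.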
Features
The application includes:
- Real-time visualization of camera, LIDAR, radar, and ultrasonic sensor data
- Dynamic scaling of the virtual vehicle
- Ability to toggle different sensors on/off
- Integration with ROS for real-time data streaming (see the sketch after this list)
- Replay mode for analyzing recorded sensor data
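The ROS integration is sketched below using rosbridge's WebSocket JSON protocol, which is one common way to stream ROS topics into a Unity/C# client. The endpoint, topic name, and message handling are placeholders, and the project may equally have used a dedicated client library such as ROS# instead of a hand-rolled connection.

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch: subscribe to a LIDAR topic through a rosbridge server
// and hand each JSON message to the visualization layer. The URL and topic
// below are assumptions, not values from the actual project.
public static class RosBridgeSubscriber
{
    public static async Task RunAsync(string url, string topic, Action<string> onMessage)
    {
        using (var socket = new ClientWebSocket())
        {
            await socket.ConnectAsync(new Uri(url), CancellationToken.None);

            // rosbridge protocol: a JSON "subscribe" op naming the topic and type.
            string subscribe =
                "{\"op\":\"subscribe\",\"topic\":\"" + topic + "\",\"type\":\"sensor_msgs/LaserScan\"}";
            var request = new ArraySegment<byte>(Encoding.UTF8.GetBytes(subscribe));
            await socket.SendAsync(request, WebSocketMessageType.Text, true, CancellationToken.None);

            // Each incoming frame is a JSON "publish" op carrying one LaserScan
            // message (fragmented frames are ignored here for brevity).
            var buffer = new byte[64 * 1024];
            while (socket.State == WebSocketState.Open)
            {
                var result = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
                if (result.MessageType == WebSocketMessageType.Close)
                    break;

                onMessage(Encoding.UTF8.GetString(buffer, 0, result.Count));
            }
        }
    }
}

// Example usage: print the first part of each scan message.
// await RosBridgeSubscriber.RunAsync("ws://192.168.1.42:9090", "/scan",
//     msg => Console.WriteLine(msg.Substring(0, Math.Min(120, msg.Length))));
```

In this shape, the same subscriber can feed either the live stream or, pointed at a replayed bag file on the ROS side, the replay mode described above.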