Research Area

Sensor fusion is the art of extracting accurate state information from the noisy data of a heterogeneous set of sensors. I develop advanced sensor fusion methods for real-time embedded systems. Sensor fusion is commonly tackled with advanced signal processing, estimation, machine learning, and artificial intelligence. The challenge grows when these methods must run on resource-limited embedded computing platforms such as FPGAs or microcontrollers, and some sensors, such as raw GPS signals, inertial measurement units (IMUs), vision, and radar/LiDAR, demand extremely high-rate processing. The core of my research is therefore the simultaneous processing of heterogeneous, high-rate streams of noisy sensor measurements on resource-limited embedded computing systems under hard real-time constraints. For more information, please visit my research lab website here.
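To make the estimation-based view of sensor fusion concrete, the sketch below shows a minimal one-dimensional Kalman filter that fuses a high-rate rate sensor (an IMU-style velocity input used for dead reckoning) with a low-rate absolute sensor (a GNSS-style position fix). This is an illustrative toy, not code from my research: the sensor roles, class name, and noise values are assumptions chosen for clarity.

```python
class Kalman1D:
    """Toy 1D Kalman filter fusing a rate sensor with an absolute sensor.

    Illustrative sketch only: sensor roles (IMU-like rate input,
    GNSS-like position fix) and noise values are assumed for the example.
    """

    def __init__(self, x0, p0, q, r):
        self.x = x0   # state estimate (e.g. position)
        self.p = p0   # estimate variance
        self.q = q    # process noise variance added per prediction step
        self.r = r    # absolute-sensor measurement noise variance

    def predict(self, v, dt):
        # High-rate step: propagate the state with the rate measurement
        # (dead reckoning); uncertainty grows with each step.
        self.x += v * dt
        self.p += self.q

    def update(self, z):
        # Low-rate step: fuse the absolute measurement via the Kalman
        # gain, which weights the measurement by relative uncertainty.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x


# Example: many fast predictions between occasional absolute fixes.
kf = Kalman1D(x0=0.0, p0=1.0, q=0.01, r=0.25)
kf.predict(v=1.0, dt=1.0)   # dead-reckon forward one step
kf.update(z=1.2)            # correct with an absolute fix
```

The predict/update split mirrors the scheduling problem on embedded targets: prediction must keep up with the fastest sensor stream, while corrections arrive at a lower, possibly irregular rate.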

Applications

Intelligent vehicles; automated driving, assisted driving, and car safety systems; environment perception for unmanned systems; indoor navigation; mobile robot navigation; remote sensing; machine control; mapping/surveying; intelligent transportation systems; home automation; smart buildings; and health, wellness, and medical devices. Integrated navigation, guidance, and control systems using Global Navigation Satellite Systems (GNSS), inertial measurement units (IMUs), radar/LiDAR, and vision sensors. Simultaneous localization and mapping (SLAM). Attitude and heading reference systems (AHRS). Vision/radar/LiDAR-aided navigation, indoor localization and mapping.

Useful links

YouTube Channel
Research Lab
Software & Tools