Sensor Fusion
1. List
2. Paper
Performance Limits of Track-to-Track Fusion vs. Centralized Estimation: Theory and Application
Track-to-track Fusion for Multi-target Tracking Using Asynchronous and Delayed Data: 2017, 69p
A Survey of ADAS Technologies for the Future Perspective of Sensor Fusion: 2016, general overview
VANETs Meet Autonomous Vehicles: A Multimodal 3D Environment Learning Approach
Multi-Modal Obstacle Detection in Unstructured Environments with Conditional Random Fields: 2017.06
- Unstructured Environments: a farm setting
- lidar + camera sensing using a conditional random field
- Fusing LIDAR and images for pedestrian detection using convolutional neural networks: 2016.04
Multiview random forest of local experts combining rgb and lidar data for pedestrian detection [Gonzalez, IV '15]
Pedestrian Detection Combining RGB and Dense LIDAR Data [Premebida, IROS '14] [Project] [Code]
- Radar/Lidar sensor fusion for car-following on highways: 2011
- Shruti Gangadhar, Sensor Fusion Framework and Simulation on a TurtleBot Robotic Vehicle, 2017, Master's thesis
- A comparative study of data fusion for RGB-D based visual recognition: Sanchez-Riera et al., 2016, surveys data-fusion methods for RGB-D visual recognition
C. Lundquist, “Sensor fusion for automotive applications,” Ph.D. dissertation, Linköping University, Linköping, 2011.
N.-E. E. Faouzi, H. Leung, and A. Kurian, “Data fusion in intelligent transportation systems: Progress and challenges – A survey,” Information Fusion, vol. 12, no. 1, pp. 4–10, 2011.
Object Detection and Classification by Decision-Level Fusion for Intelligent Vehicle Systems: 2017, KITTI, 21p
Fusion of LiDAR 3D Point Cloud with 2D Digital Camera Image: 2015, Master's thesis, 90p, lidar + camera -> obtaining RGB-D information, covers the transformation between the two sensors
FusionNet: A deep fully residual convolutional neural network for image segmentation in connectomics: 2016
FuseNet: Incorporating Depth into Semantic Segmentation via Fusion-Based CNN Architecture: 2016, Depth Information, Semantic Segmentation, Depth Image, CNN, RGB
Multi-Sensor Fusion of Occupancy Grids based on Integer Arithmetic: proposes a fusion framework; revisit after covering the basics
Motion-based Detection and Tracking in 3D LiDAR Scans, Dewan
Multiple Sensor Fusion and Classification for Moving Object Detection and Tracking: 2015, consulted only for the fusion-level discussion in its related work
Object Perception for Intelligent Vehicle Applications: A Multi-Sensor Fusion Approach: 2014
Sensor Modality Fusion with CNNs for UGV Autonomous Driving in Indoor Environments: 2017, scale-model car, camera + lidar fusion, proposes a DNN
Acoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios
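The occupancy-grid fusion paper above implements the fusion with integer arithmetic for embedded targets; as a minimal floating-point sketch (function names are mine, not from the paper), the underlying rule combines two grids cell-by-cell in log-odds space, assuming the sensors are independent:

```python
import numpy as np

def log_odds(p):
    """Convert occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def prob(l):
    """Convert log-odds back to occupancy probability."""
    return 1.0 / (1.0 + np.exp(-l))

def fuse_grids(grid_a, grid_b, prior=0.5):
    """Per-cell Bayesian fusion of two occupancy grids: add the log-odds
    and subtract the prior once, since both grids already contain it."""
    l = log_odds(grid_a) + log_odds(grid_b) - log_odds(prior)
    return prob(l)

# Two agreeing cells reinforce each other; disagreeing cells cancel out:
a = np.array([0.8, 0.8])
b = np.array([0.8, 0.2])
fused = fuse_grids(a, b)  # first cell rises above 0.9, second returns to 0.5
```

Working in log-odds keeps the update a simple addition per cell, which is also what makes an integer (fixed-point) implementation like the paper's practical.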
3. Article (Post, blog, etc.)
Expert Advice: Sensor Fusion for Highly Automated Driving: GNSS-related, no concrete details
Object tracking with LIDAR, Radar, sensor fusion and Extended Kalman Filter: post
Sensor Fusion Algorithms For Autonomous Driving
Udacity CarND Term 2 Sensor Fusion, Extended Kalman Filter (English)
4. Tutorial (Series)
5. Youtube
6. Material (Pdf, ppt)
Sensor Fusion and Calibration of Velodyne LiDAR and RGB Camera: ppt
Sensor Fusion for Automotive Applications: 2011, 331p
7. Implementation (Project)
- An extended Kalman Filter implementation in Python for fusing lidar and radar sensor measurements: mithi, python
- C++ version, C++ version v2
Object-Tracking-and-State-Prediction-with-Unscented-and-Extended-Kalman-Filters: a modified variant
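The EKF lidar/radar fusion implementations listed above generally follow the same recipe: a linear KF update for lidar (which measures position directly) and a linearized EKF update for radar (range, bearing, range rate). A minimal sketch with a constant-velocity model — noise values and function names are illustrative, not taken from any of the repos:

```python
import numpy as np

def predict(x, P, dt, q=0.1):
    """Constant-velocity predict step; state x = [px, py, vx, vy]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = q * np.eye(4)  # simplified process noise
    return F @ x, F @ P @ F.T + Q

def update_lidar(x, P, z, r=0.02):
    """Linear KF update with a lidar position measurement z = [px, py]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    R = r * np.eye(2)
    y = z - H @ x
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return x + K @ y, (np.eye(4) - K @ H) @ P

def update_radar(x, P, z, r=0.05):
    """EKF update with a radar measurement z = [range, bearing, range_rate]:
    linearize the nonlinear measurement function via its Jacobian."""
    px, py, vx, vy = x
    rho = np.hypot(px, py)
    h = np.array([rho, np.arctan2(py, px), (px * vx + py * vy) / rho])
    H = np.array([
        [px / rho,                        py / rho,                        0,        0],
        [-py / rho**2,                    px / rho**2,                     0,        0],
        [py * (vx * py - vy * px) / rho**3, px * (vy * px - vx * py) / rho**3, px / rho, py / rho]])
    R = r * np.eye(3)
    y = z - h
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # normalize bearing residual
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return x + K @ y, (np.eye(4) - K @ H) @ P
```

In use, measurements from both sensors are processed in timestamp order: predict to the measurement time, then apply whichever update matches the sensor type. The bearing-residual normalization matters — without it, a target crossing the ±π boundary produces a huge spurious innovation.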
8. Research Group / Conference
Data fusion development with ROS | BASELABS: must-read; sample files provided