Simultaneous Localization and Mapping (SLAM) was first applied in the field of robotics. Its goal is to build a map of the surrounding environment in real time from sensor data, without any prior knowledge, while simultaneously estimating the robot's own location within that map. Since it was first proposed in 1986, SLAM has attracted extensive attention from researchers and is now an essential capability for autonomous mobile robots.
Figure 1. Comparison between different cameras. An event camera is not one specific camera model but any camera that outputs “event information”. “Traditional cameras” work at a constant frame rate and therefore have inherent drawbacks, such as latency, motion blur, and overexposure when capturing high-speed objects. An event camera, by contrast, processes information in a neuromorphic way similar to the human eye and suffers from none of these problems.
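To make the “event information” in Figure 1 concrete: an event camera reports, per pixel and asynchronously, an event (x, y, t, polarity) whenever the log-intensity at that pixel changes by more than a contrast threshold. The sketch below simulates this standard event-generation model from ordinary frames; the function name, threshold value, and toy data are illustrative assumptions, not code from this entry.

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.2):
    """Simulate event-camera output from a stack of intensity frames.

    A pixel fires an event (x, y, t, polarity) whenever its
    log-intensity has changed by more than `threshold` (an
    illustrative value) since the last event it fired -- the
    standard event-generation model.
    """
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)  # per-pixel reference
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_now = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_now - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((x, y, t, polarity))
            log_ref[y, x] = log_now[y, x]  # reset reference after firing
    return events  # an asynchronous stream, unlike fixed-rate frames

# Toy usage: a bright square moving right over a dark background
frames = np.zeros((3, 8, 8))
for i in range(3):
    frames[i, 2:5, i:i + 3] = 1.0
events = generate_events(frames, timestamps=[0.0, 0.01, 0.02])
print(len(events), events[0])  # events fire at the square's leading and trailing edges
```

Because events are triggered only where brightness actually changes, the output is sparse, and real event-camera hardware timestamps it with microsecond-level resolution rather than at a fixed frame rate.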
Figure 2. The framework of a typical visual SLAM system.
Figure 3. Schematic diagram of the direct method.
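The direct method sketched in Figure 3 estimates camera motion by minimizing the photometric error between images directly, using raw pixel intensities rather than extracted features. The snippet below illustrates the idea on a deliberately simplified model: the “motion” is a pure 2D integer image translation recovered by brute-force search, whereas a real direct method optimizes a full SE(3) pose (using per-pixel depth) with Gauss-Newton iterations. All names and the toy data are assumptions for illustration, not the method of any particular SLAM system.

```python
import numpy as np

def photometric_error(img_ref, img_cur, shift):
    """Mean squared intensity difference after shifting the viewpoint
    by an integer translation `shift` = (dy, dx). A real direct method
    instead warps pixels with a full SE(3) camera pose and depth."""
    dy, dx = shift
    h, w = img_ref.shape
    # Compare ref[y, x] against cur[y + dy, x + dx] on the overlap region.
    ref = img_ref[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
    cur = img_cur[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
    return np.mean((ref - cur) ** 2)

def estimate_shift(img_ref, img_cur, search=3):
    """Brute-force photometric-error minimization over small shifts
    (a stand-in for the Gauss-Newton optimization used in practice)."""
    candidates = [(dy, dx)
                  for dy in range(-search, search + 1)
                  for dx in range(-search, search + 1)]
    return min(candidates,
               key=lambda s: photometric_error(img_ref, img_cur, s))

# Toy usage: the current image is the reference shifted by (1, 2).
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
cur = np.roll(ref, shift=(1, 2), axis=(0, 1))
print(estimate_shift(ref, cur))  # -> (1, 2), where the photometric error is 0
```

In practice, direct methods linearize this photometric residual using image gradients, which is why they perform best in well-textured regions and degrade where the image is flat.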