Ship Tracking Based on Satellite Videos: History

Ship detection and tracking have attracted considerable attention in remote sensing because of their great potential in military applications and port activity analysis. Compared with vehicle targets, ship targets vary widely in size, and the background along the track is usually water, which may limit the performance of tracking methods. Because the appearance of the water background changes little between adjacent frames, background analysis yields little useful motion information. Tracking algorithms such as optical flow-based trackers and offline tracking methods are therefore not well suited to ship tracking, and several novel models have been proposed to track ships in satellite videos.

  • satellite video
  • traffic target tracking
  • ship tracking

1. Introduction

Object tracking is a hot topic in computer vision and remote sensing; it typically maintains a bounding box that locks onto the region of interest (ROI) when only an initial state of the target (in a video frame) is available [1][2]. Thanks to the development of satellite imaging technology, various satellites with advanced onboard cameras have been launched to acquire very high resolution (VHR) satellite videos for military and civilian applications. Compared with traditional target tracking, satellite video target tracking is more efficient for motion analysis and object surveillance, and has shown great potential in applications that traditional target tracking cannot reach, such as military reconnaissance [3], monitoring and protecting sea ice [4], fighting wildfires [5], and monitoring city traffic [6].

Recent research has shown increasing interest in traditional video-based target tracking, and numerous algorithms have been proposed in computer vision for accurate tracking. These methods fall into two categories: those based on generative models [7][8][9][10] and those based on discriminative models [11][12][13][14][15][16][17]. Generative model-based target tracking can be viewed as a search problem: the object region in the current frame is modeled, and the most similar region is chosen as the predicted location in the next frame. In contrast, discriminative models treat object tracking as a binary classification problem and have attracted much attention because of their efficiency and robustness [18].
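
To make the distinction concrete, the following minimal Python sketch contrasts the two paradigms: a generative-style tracker searches the neighborhood of the previous position for the patch most similar to a stored object template, whereas a discriminative-style tracker lets a binary classifier score candidate patches as object versus background. The template, search radius, patch size, and classifier interface are illustrative assumptions and do not correspond to any specific method cited above.

    import numpy as np

    def generative_track(frame, template, prev_xy, search_radius=20):
        """Generative-style sketch: pick the candidate patch most similar to
        the object template (similarity = negative sum of squared differences)."""
        h, w = template.shape
        px, py = prev_xy
        best_score, best_xy = -np.inf, prev_xy
        for dy in range(-search_radius, search_radius + 1):
            for dx in range(-search_radius, search_radius + 1):
                y, x = py + dy, px + dx
                if y < 0 or x < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
                    continue
                patch = frame[y:y + h, x:x + w].astype(float)
                score = -np.sum((patch - template.astype(float)) ** 2)
                if score > best_score:
                    best_score, best_xy = score, (x, y)
        return best_xy

    def discriminative_track(frame, classifier, prev_xy, patch_size, search_radius=20):
        """Discriminative-style sketch: a binary classifier (e.g., a linear SVM
        exposing a decision_function method) scores candidate patches as
        object vs. background; the highest-scoring one becomes the new position."""
        h, w = patch_size
        px, py = prev_xy
        best_score, best_xy = -np.inf, prev_xy
        for dy in range(-search_radius, search_radius + 1, 2):
            for dx in range(-search_radius, search_radius + 1, 2):
                y, x = py + dy, px + dx
                if y < 0 or x < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
                    continue
                feat = frame[y:y + h, x:x + w].astype(float).ravel()[None, :]
                score = float(classifier.decision_function(feat)[0])
                if score > best_score:
                    best_score, best_xy = score, (x, y)
        return best_xy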

The ship tracking approaches are categorized into two classes: image-based tracking methods and multi-modality-based tracking approaches. A summary of the reviewed ship tracking publications is given in Table 1, and Figure 1 compares the algorithm structures of the two categories.
Figure 1. Comparison diagram of algorithm structures for ship tracking. (a) The framework of Ref. [19] (an example of an image-based tracking method); (b) the procedure of track-level fusion reproduced from Ref. [20] (an example of a multi-modality-based tracking method).
Table 1. Summary of the ship tracking methods.
Target | Method         | Ref. | Year | Description
Ship   | Image-based    | [19] | 2019 | Automatic detection and tracking for moving ships
       |                | [21] | 2021 | Framework consisting of ANGS, MDDCM, and JPDA
       |                | [22] | 2022 | Mutual convolution Siamese network with hierarchical double regression
       | Multi-modality | [23] | 2010 | Ship detection and tracking using AIS and SAR data
       |                | [20] | 2018 | Track-level fusion for noncooperative ship tracking
       |                | [24] | 2018 | Integration of sequential imagery with AIS data
       |                | [25] | 2021 | Integration of satellite sequential imagery with ship location information

2. Image-Based Tracking Methods

Ref. [19] developed an automatic detection and tracking model for moving ships of different sizes in satellite videos, as illustrated in Figure 1a. A dynamic multiscale saliency map was generated using motion compensation and multiscale differential saliency maps. Remote sensing images from the GO3S satellite were used to evaluate the method, demonstrating its effectiveness for ship tracking, especially for small ships. Furthermore, Ref. [21] proposed a new framework, comprising adaptive nonlinear gray stretch (ANGS), a multiscale dual-neighbor difference contrast measure (MDDCM), and joint probability data association (JPDA), to detect moving ships in GF-4 satellite images [26]. In Ref. [21], ANGS enhanced the image and highlighted small, dim ship targets; MDDCM detected the positions of candidate ship targets; and JPDA was applied for multi-frame data association and tracking. The analysis showed that bright clouds and islands are the main factors affecting ship detection in optical remote sensing images, and that higher-resolution images yield better detection scores. By designing a mutual convolution Siamese network, Ref. [22] calculated the similarity between the object template and the search area to enhance the saliency of the ship in the feature map. A hierarchical double regression module was also proposed to reduce the influence of the non-rigid motion of the water surface during the tracking phase.
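
The center-versus-surround idea behind contrast-based small-ship enhancement can be sketched in Python as follows. This is a loose illustration of a multiscale local contrast map assuming simple square mean filters; the actual ANGS/MDDCM formulation of Ref. [21] and the saliency maps of Ref. [19] are more elaborate, and the window sizes here are arbitrary assumptions.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_contrast_map(image, inner=3, outer=9):
        """Center-vs-surround contrast: mean of a small inner window minus the
        mean of the surrounding ring, so small bright targets (ships) stand out
        against a locally uniform water background."""
        img = image.astype(float)
        center_mean = uniform_filter(img, size=inner)
        outer_sum = uniform_filter(img, size=outer) * outer ** 2
        inner_sum = center_mean * inner ** 2
        surround_mean = (outer_sum - inner_sum) / (outer ** 2 - inner ** 2)
        return center_mean - surround_mean

    def multiscale_contrast(image, scales=((3, 9), (5, 15), (7, 21))):
        """Take the per-pixel maximum response over several window pairs so that
        ships of different sizes are enhanced."""
        responses = [local_contrast_map(image, i, o) for i, o in scales]
        return np.max(np.stack(responses, axis=0), axis=0)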

3. Multi-Modality Based Tracking Methods

The automatic identification system (AIS) is an automatic tracking system that uses transceivers on ships and is employed by vessel traffic services. AIS information supplements marine radar, which remains the primary means of collision avoidance in water transport. AIS has proven instrumental in accident investigation and in search-and-rescue operations.
As early as 2010, Ref. [23] studied a fused ship detection and tracking system using AIS data and satellite-borne SAR data. A 3D extension of the standard ordered-statistics constant false alarm rate (OSCFAR) algorithm was applied to the radar data for target detection. For ship tracking, an alpha-beta filter combined with a nearest-neighbor assignment strategy was proposed and operated in polar coordinates to reduce false alarm errors. A time series of 512 samples and two onboard SAR sensors were used to verify the method, with results competitive with previous work.
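
A bare-bones version of an alpha-beta filter with nearest-neighbor association, of the kind described above, might look like the following Python sketch. It works in Cartesian coordinates with hand-picked gains and a fixed gate, whereas Ref. [23] operates in polar (range/azimuth) coordinates; all parameter values here are illustrative assumptions rather than the published configuration.

    import numpy as np

    def alpha_beta_track(measurements, x0, v0, dt=1.0, alpha=0.85, beta=0.005, gate=50.0):
        """Minimal alpha-beta tracker with nearest-neighbor data association.
        `measurements` is a list (one entry per time step) of N_k x 2 arrays of
        candidate detections; `x0`, `v0` are the initial position and velocity."""
        x, v = np.asarray(x0, float), np.asarray(v0, float)
        track = [x.copy()]
        for dets in measurements:
            # Predict the next position with a constant-velocity model.
            x_pred = x + v * dt
            # Associate: nearest detection inside the gate, if any.
            z = None
            if len(dets):
                d = np.linalg.norm(np.asarray(dets, float) - x_pred, axis=1)
                j = int(np.argmin(d))
                if d[j] < gate:
                    z = np.asarray(dets[j], float)
            # Update with the residual, or coast on the prediction if no
            # detection was associated this step.
            if z is not None:
                r = z - x_pred
                x = x_pred + alpha * r
                v = v + (beta / dt) * r
            else:
                x = x_pred
            track.append(x.copy())
        return np.array(track)
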
Recently, there has been renewed interest in fusing optical images with AIS data. Ref. [20] provided a track-level fusion architecture that combines GF-4 imagery and AIS data for ship tracking, as shown in Figure 1b. A constant false alarm rate (CFAR) detector first detected ships in the GF-4 images, and a multiple hypothesis tracking (MHT) tracker was then applied, together with projected AIS data, to track the ships. A new track-to-track association algorithm based on iterative closest point (ICP) and global nearest neighbor (GNN) with multiple features was designed to improve the validity of the association. The core of the data fusion architecture was this track-to-track association, which combined multiple features to correct positioning errors. As reported, the data fusion method showed that AIS-aided satellite imagery offers a promising way to track non-cooperative targets. Similar to Ref. [20], Ref. [24] investigated an AIS-aided ship-tracking method using GF-4 satellite sequential imagery. The algorithm consists of three steps: ship detection, position correction, and ship tracking, realized by a peak signal-to-noise ratio (PSNR)-based local visual saliency map, a rational polynomial coefficient (RPC) model with AIS data, and an amplitude-assisted MHT framework, respectively. In the accuracy evaluation, the proposed method achieved precision, recall, and F1-score of 98.5%, 87.4%, and 92.6%, respectively, on GF-4 satellite sequences, indicating accurate estimation of moving ships. In 2021, Ref. [25] combined GOES-17 satellite imagery with ship location information to track the trajectories of ship-emitted aerosols based on their physical processes and an optical flow model.
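
As an illustration of the global nearest neighbor step used in track-to-track association, the Python sketch below assigns image-derived track positions to AIS-derived ones with the Hungarian algorithm, using plain Euclidean distance as the cost. Ref. [20] combines multiple features and refines the alignment with ICP, so this is only a simplified stand-in; the common coordinate frame and the gating threshold are assumptions.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def gnn_associate(image_tracks, ais_tracks, max_cost=500.0):
        """Global-nearest-neighbor association between image-derived track
        positions (M x 2) and AIS-derived track positions (N x 2), assumed to
        be expressed in a common projected coordinate frame. Returns a list of
        (image_idx, ais_idx) pairs whose assignment cost is below `max_cost`."""
        image_tracks = np.asarray(image_tracks, float)
        ais_tracks = np.asarray(ais_tracks, float)
        # Pairwise Euclidean distances as the association cost matrix.
        cost = np.linalg.norm(image_tracks[:, None, :] - ais_tracks[None, :, :], axis=2)
        # Optimal one-to-one assignment (Hungarian algorithm).
        rows, cols = linear_sum_assignment(cost)
        return [(int(r), int(c)) for r, c in zip(rows, cols) if cost[r, c] < max_cost]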

This entry is adapted from the peer-reviewed paper 10.3390/rs14153674

References

  1. Yilmaz, A.; Javed, O.; Shah, M. Object tracking: A survey. ACM Comput. Surv. (CSUR) 2006, 38, 13.
  2. Jiao, L.; Zhang, R.; Liu, F.; Yang, S.; Hou, B.; Li, L.; Tang, X. New Generation Deep Learning for Video Object Detection: A Survey. IEEE Trans. Neural Netw. Learn. Syst. 2021, 1–21.
  3. Melillos, G.; Themistocleous, K.; Papadavid, G.; Agapiou, A.; Prodromou, M.; Michaelides, S.; Hadjimitsis, D.G. Integrated use of field spectroscopy and satellite remote sensing for defence and security applications in Cyprus. In Proceedings of the Fourth International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2016), Paphos, Cyprus, 4–8 April 2016; Volume 9688, pp. 127–135.
  4. Xian, Y.; Petrou, Z.I.; Tian, Y.; Meier, W.N. Super-resolved fine-scale sea ice motion tracking. IEEE Trans. Geosci. Remote Sens. 2017, 55, 5427–5439.
  5. Bailon-Ruiz, R.; Lacroix, S. Wildfire remote sensing with UAVs: A review from the autonomy point of view. In Proceedings of the 2020 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 1–4 September 2020; pp. 412–420.
  6. Du, B.; Sun, Y.; Cai, S.; Wu, C.; Du, Q. Object tracking in satellite videos by fusing the kernel correlation filter and the three-frame-difference algorithm. IEEE Geosci. Remote Sens. Lett. 2017, 15, 168–172.
  7. Xing, X.; Yongjie, Y.; Huang, X. Real-time object tracking based on optical flow. In Proceedings of the 2021 International Conference on Computer, Control and Robotics (ICCCR), Shanghai, China, 8–10 January 2021; pp. 315–318.
  8. Panetta, K.; Kezebou, L.; Oludare, V.; Agaian, S. Comprehensive underwater object tracking benchmark dataset and underwater image enhancement with GAN. IEEE J. Ocean. Eng. 2021, 47, 59–75.
  9. Yu, H.; Li, G.; Su, L.; Zhong, B.; Yao, H.; Huang, Q. Conditional GAN based individual and global motion fusion for multiple object tracking in UAV videos. Pattern Recognit. Lett. 2020, 131, 219–226.
  10. Acharya, D.; Ramezani, M.; Khoshelham, K.; Winter, S. BIM-Tracker: A model-based visual tracking approach for indoor localisation using a 3D building model. ISPRS J. Photogramm. Remote Sens. 2019, 150, 157–171.
  11. Zhao, C.; Liu, H.; Su, N.; Wang, L.; Yan, Y. RANet: A Reliability-Guided Aggregation Network for Hyperspectral and RGB Fusion Tracking. Remote Sens. 2022, 14, 2765.
  12. Wilson, D.; Alshaabi, T.; Van Oort, C.; Zhang, X.; Nelson, J.; Wshah, S. Object Tracking and Geo-Localization from Street Images. Remote Sens. 2022, 14, 2575.
  13. Klinger, T.; Rottensteiner, F.; Heipke, C. Probabilistic multi-person localisation and tracking in image sequences. ISPRS J. Photogramm. Remote Sens. 2017, 127, 73–88.
  14. Zhang, X.; Xia, G.S.; Lu, Q.; Shen, W.; Zhang, L. Visual object tracking by correlation filters and online learning. ISPRS J. Photogramm. Remote Sens. 2018, 140, 77–89.
  15. Liu, S.; Liu, D.; Srivastava, G.; Połap, D.; Woźniak, M. Overview and methods of correlation filter algorithms in object tracking. Complex Intell. Syst. 2021, 7, 1895–1917.
  16. Du, S.; Wang, S. An overview of correlation-filter-based object tracking. IEEE Trans. Comput. Soc. Syst. 2021, 9, 18–31.
  17. Xu, T.; Feng, Z.; Wu, X.J.; Kittler, J. Adaptive channel selection for robust visual object tracking with discriminative correlation filters. Int. J. Comput. Vis. 2021, 129, 1359–1375.
  18. Lyu, Y.; Yang, M.Y.; Vosselman, G.; Xia, G.S. Video object detection with a convolutional regression tracker. ISPRS J. Photogramm. Remote Sens. 2021, 176, 139–150.
  19. Li, H.; Chen, L.; Li, F.; Huang, M. Ship detection and tracking method for satellite video based on multiscale saliency and surrounding contrast analysis. J. Appl. Remote Sens. 2019, 13, 026511.
  20. Liu, Y.; Yao, L.; Xiong, W.; Zhou, Z. GF-4 Satellite and automatic identification system data fusion for ship tracking. IEEE Geosci. Remote Sens. Lett. 2018, 16, 281–285.
  21. Yu, W.; You, H.; Lv, P.; Hu, Y.; Han, B. A Moving Ship Detection and Tracking Method Based on Optical Remote Sensing Images from the Geostationary Satellite. Sensors 2021, 21, 7547.
  22. Bai, Y.; Lv, J.; Wang, C.; Geng, Y. Ship tracking method for resisting similar shape information under satellite videos. J. Appl. Remote Sens. 2022, 16, 026517.
  23. Gurgel, K.W.; Schlick, T.; Horstmann, J.; Maresca, S. Evaluation of an HF-radar ship detection and tracking algorithm by comparison to AIS and SAR data. In Proceedings of the 2010 International WaterSide Security Conference, Carrara, Italy, 3–5 November 2010; pp. 1–6.
  24. Yao, L.; Liu, Y.; He, Y. A Novel ship-tracking method for GF-4 satellite sequential images. Sensors 2018, 18, 2007.
  25. Shand, L.; Larson, K.M.; Staid, A.; Gray, S.; Roesler, E.L.; Lyons, D. An efficient approach for tracking the aerosol-cloud interactions formed by ship emissions using GOES-R satellite imagery and AIS ship tracking information. arXiv 2021, arXiv:2108.05882.
  26. Wang, D.; He, H. Observation capability and application prospect of GF-4 satellite. In Proceedings of the 3rd International Symposium of Space Optical Instruments and Applications, Beijing, China, 26–29 June 2016; pp. 393–401.