Robot Landing Method on Overhead Power Transmission Lines

Hybrid inspection robots have attracted increasing interest in recent years and are suitable for inspecting long-distance overhead power transmission lines (OPTLs), combining the advantages of flying robots (e.g., UAVs) and climbing robots (e.g., multiple-arm robots). Due to complex working conditions (e.g., power line slopes, complex backgrounds, and wind interference), landing on an OPTL is one of the most difficult challenges faced by hybrid inspection robots.

  • FPLIR
  • autonomous landing
  • hybrid inspection robots

1. Introduction

Overhead power transmission lines (OPTLs), as a key component of state grid infrastructure, are a primary means of long-distance electric power transmission, contributing significantly to economic development and national stability. Because they pass through harsh environments (e.g., deserts, mountains, forests, and rivers), OPTLs are easily affected by material deterioration, electrical flashover, and constant mechanical tension [1][2][3]. To transmit high-voltage electric power efficiently and reliably, OPTLs need to be routinely inspected for early fault detection [4]. In the US, the average cost of a half-hour blackout for medium and large industrial customers is USD 15,707, rising to nearly USD 94,000 for an 8-hour interruption. Additionally, the growing global population and the heavy reliance on electricity supply have created great demand for more efficient transmission line inspection strategies [5][6].
The original inspection method for OPTLs was human inspection, which requires inspectors to climb along the power line to detect faults. This is laborious, inefficient, and dangerous for inspectors [7]; therefore, robots have become important tools for OPTL inspection over the past three decades [8]. Current studies focus mainly on climbing robots (e.g., multi-arm robots) and flying robots (e.g., UAVs). Climbing robots are suitable for short-distance inspections with heavier payloads, providing detailed and reliable inspection data due to being closer to the power lines. Nevertheless, bypassing large obstacles and landing on overhead power lines present great difficulties for them. Flying robots are flexible, low-cost, and capable of collecting high-quality images; however, they are limited in flight endurance and cannot accurately inspect OPTLs at close range [9][10][11].
Hybrid robots have been a focus of attention in recent decades, combining the advantages of climbing robots with those of flying robots, and they are suitable for long-distance inspections with greater flexibility. The flight mechanism can land on power lines and fly over obstacles, while the walking mechanism can walk along the OPTLs [12][13]. The existing landing methods of hybrid robots only allow the robots to approach power lines from the top [12][14][15][16] or the bottom [17][18][19]. However, these hybrid robots are unstable when walking on power lines due to their mechanical structure. In addition, power lines are flexible cable structures with slopes; when hybrid robots land on them, they may slip or lose control. As a result, autonomous landing methods for the developed flying-walking power line inspection robot (FPLIR) should be investigated to ensure safe landing on power lines. This challenge can be broken down into four main issues: (1) identifying power lines in the observable space; (2) estimating the status of the robot using the onboard sensors; (3) planning a trajectory that satisfies the dynamic constraints of the robot; and (4) tracking the trajectory under the prevailing working conditions [14].
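These four issues map onto a perception–estimation–planning–control loop. The following minimal Python sketch shows one way such a landing loop could be structured; every class and method name is hypothetical, purely for illustration, and does not correspond to any interface in the cited works.

```python
# Hypothetical skeleton of an autonomous landing loop for a hybrid robot.
# All objects and method names are illustrative assumptions.

def landing_loop(camera, imu, detector, estimator, planner, controller):
    trajectory = None
    while not controller.landed():
        image = camera.read()                       # sense the scene
        line = detector.detect_power_line(image)    # (1) identify the power line
        state = estimator.update(imu.read(), line)  # (2) estimate robot status
        if trajectory is None or planner.needs_replan(state, line):
            trajectory = planner.plan(state, line)  # (3) dynamically feasible plan
        controller.track(trajectory, state)         # (4) track despite disturbances
```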

2. Power Line Detection

The existing image-based methods for power line detection can be divided into traditional and deep-learning-based methods, as listed in Table 1. Traditional methods focus on low-level local features, such as gradient, luminance, texture, and other prior information. Power lines are assumed to be straight lines or polynomial curves that have the lowest intensity in the image and run parallel to each other. Yan et al. [20] adopted the Radon transform to extract line segments, and then connected the segments into whole lines using a grouping method and the Kalman filter. Li et al. [21] proposed a knowledge-based power line detection method using a Pulse Coupled Neural Network (PCNN) to remove background noise from the images. Yang et al. [22] combined an adaptive thresholding approach, Hough transforms, and the Fuzzy C-Means (FCM) clustering algorithm for power line detection, removing spurious lines using the known properties of power lines. Cerón et al. [23] proposed a method called Circle-Based Search (CBS) that detects power lines by searching for lines between two opposite points. Song et al. [24] proposed a sequential local-to-global power line detection method based on a graph-cut model. However, the limitations of these methods remain obvious when applied to real environments. For instance, manually tuning dozens of parameters makes it difficult to achieve the optimal result for each image during an inspection; once the parameters are fixed, the methods tend to produce more false positives and false negatives across a dataset.
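As a concrete illustration of this family of methods, below is a minimal OpenCV sketch of the edge-plus-Hough pipeline that several of the works above build on; all thresholds are illustrative assumptions and would need per-scene tuning, which is precisely the limitation noted above.

```python
import cv2
import numpy as np

def detect_line_segments(bgr_image):
    """Classic edge + probabilistic Hough pipeline for straight power-line
    candidates. Every threshold below is an illustrative assumption."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)        # suppress sensor noise
    edges = cv2.Canny(gray, 50, 150)                # low-level edge map
    # Probabilistic Hough transform: keep long, nearly unbroken segments.
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                               minLineLength=100, maxLineGap=10)
    return [] if segments is None else [s[0] for s in segments]  # [x1, y1, x2, y2]
```

A full traditional detector would then group collinear, roughly parallel segments and reject spurious lines using power line priors, as in [20] and [22].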
Table 1. Summary of the literature related to power line detection.

| Method Category | Author/Method | Advantages | Limitations |
| --- | --- | --- | --- |
| Traditional method | Yan et al. [20], Li et al. [21], Yang et al. [22], Cerón et al. [23], Song et al. [24] | Simple model, fast and automatic, low data requirements | Low noise resistance, low extraction accuracy |
| Deep-learning-based method | Holistically Nested Edge Detection [25], DeepContour [26], DeepEdge [27], Zhang et al. [28], Madaan et al. [29] | Diverse use of information, high scene applicability, high extraction accuracy | Complex model, high data requirements, low extraction efficiency |

Deep-learning-based methods have a strong ability to learn multiscale features and perceive global information, and they can produce high-level representations of objects in natural images. State-of-the-art CNN-based edge detectors, such as Holistically Nested Edge Detection [25], DeepContour [26], and DeepEdge [27], can be applied to produce very-high-quality edge maps, which can then be fed to traditional straight-line detection methods (e.g., the Hough transform). Zhang et al. [28] developed an accurate power line detection method using convolutional and structured features, improving detection accuracy. Madaan et al. [29] treated power line detection as a semantic segmentation task, adopting a dilated convolutional network to develop a power line detection framework. Semantic segmentation using CNNs is highly accurate, and CNNs are robust to changes in illumination and scenario, reducing the chances of false positives and false negatives. However, CNNs, and particularly segmentation networks (e.g., SegNet [30] or DeepLab [31]), usually have a high computational cost. This is crucial for the FPLIR due to payload limitations and the need for real-time detection. STDC-Seg [32][33][34] can be used to address this problem, as it provides real-time semantic segmentation with low computational cost and high accuracy.
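Whatever segmentation backbone is chosen, the inference-time usage pattern is largely the same. The sketch below shows a minimal PyTorch inference wrapper for a two-class (background/power line) segmentation network; the pretrained `model` and its single-tensor output are assumptions (many STDC-Seg implementations also return auxiliary heads during training), and the simple divide-by-255 normalization stands in for whatever preprocessing the chosen network expects.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def segment_power_lines(model, rgb, device="cuda"):
    """Binary power line segmentation with any 2-class network in eval mode.
    `rgb` is an H x W x 3 uint8 numpy array; `model` is assumed pretrained."""
    x = torch.from_numpy(rgb).permute(2, 0, 1).float().div_(255.0)
    x = x.unsqueeze(0).to(device)                  # shape (1, 3, H, W)
    logits = model(x)                              # assumed shape (1, 2, h, w)
    logits = F.interpolate(logits, size=x.shape[-2:], mode="bilinear",
                           align_corners=False)    # upsample to input size
    return logits.argmax(dim=1).squeeze(0).cpu().numpy()  # H x W label mask
```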

3. Robot Landing Method

With regard to methods of landing on a wall, Erginer et al. [35] proposed a method combining a PD attitude controller with vision-based tracking to allow UAVs to land autonomously on a stationary platform. Mellinger et al. [36] defined a planned landing trajectory as a series of segments, each executed by a linear controller. Thomas et al. [37] developed a method for planning trajectories that considers actuator and sensor constraints, enabling a quadrotor with a gripper to land on inclined surfaces. Mao et al. [38] proposed a vision-based wall landing method that combined AprilTags with Visual Inertial Odometry (VIO) to land on a wall without a motion capture system. The typical method for landing on walls involves colliding with the surface at terminal velocity, which is not safe for an FPLIR landing on a power line.

With regard to methods of landing on a power line or cylinder, Popek et al. [39] presented an autonomous perching concept for UAVs, integrating vision-based perception, path planning, and motion control on an aerial robot with limited processing capability, and realized the landing of a UAV with a manipulator on a cylinder. This required establishing geometric constraints between the drone and the cylinder, which is difficult to do for power lines. Mirallès et al. [14] used a cascaded P/PI controller to align the UAV with the power line and to assist the pilot in landing the UAV on the power line from above; the approach was semi-automatic. Ramon-Soria et al. [15] used position-based visual servoing (PBVS) to land a UAV with soft jaws on a pipe. Hang et al. [16] proposed a heterogeneous landing platform that allowed multi-rotor robots to land on common structures, such as streetlights and the edges of buildings, but the terminal speed on landing was not taken into account. Thomas et al. [17] proposed image-based visual servoing (IBVS), enabling a UAV to land on a cylindrical structure from below; however, this approach requires geometric models that are difficult to construct for the FPLIR and power lines.

With regard to trajectory tracking methods, Ahmed et al. [40] proposed an extended backstepping nonlinear controller that permitted multi-rotor UAVs to land on a moving platform. Wang et al. [41] used a mixed H2/H∞ technique to ensure that UAVs could track the desired trajectory under the influence of uncertainties and disturbances. Escareño et al. [42] used a hierarchical control strategy based on a combination of sliding-mode and adaptive strategies, considering adaptive trajectory tracking in the presence of parameter uncertainties and constant wind disturbances. These methods add constraints to the FPLIR state and cannot guarantee the stability and safety of the landing.

In recent years, model predictive control (MPC) has been used extensively for multi-rotor UAV control, with advances in both hardware and algorithmic efficiency [43][44][45]. System uncertainty can arise from various factors, including: (1) the effects of wind during flight; (2) uncertainty in the air drag coefficient; and (3) neglecting the deformation and vibration of the robot’s body in dynamic modeling. MPC offers two main advantages: (1) it is predictive, i.e., the control inputs at any moment are calculated to optimize system performance over a future time horizon; and (2) it can satisfy constraints on input and state variables, which is essential for guaranteeing landing safety.
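To make these two advantages concrete, the following is a minimal sketch of one linear MPC step in Python with CVXPY, using a double-integrator approximation of a single translational axis of the robot. The model, horizon, weights, and bounds are illustrative assumptions, not the FPLIR's actual dynamics or limits.

```python
import numpy as np
import cvxpy as cp

# Double integrator for one axis: state x = [position, velocity],
# input u = commanded acceleration. All numbers are illustrative.
dt, N = 0.05, 20                        # step size [s], prediction horizon
A = np.array([[1.0, dt], [0.0, 1.0]])   # discrete-time dynamics
B = np.array([[0.5 * dt**2], [dt]])

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
x0 = cp.Parameter(2)                    # measured state, refreshed each cycle
x_ref = np.array([0.0, 0.0])            # e.g., hover point above the line

cost, constraints = 0, [x[:, 0] == x0]
for k in range(N):
    # Predictive: the whole future horizon is optimized, not just the next step.
    cost += cp.sum_squares(x[:, k + 1] - x_ref) + 0.1 * cp.sum_squares(u[:, k])
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.abs(u[0, k]) <= 2.0,      # actuator (input) constraint
                    cp.abs(x[1, k + 1]) <= 1.0]  # approach-speed (state) constraint
problem = cp.Problem(cp.Minimize(cost), constraints)

x0.value = np.array([1.5, 0.0])         # 1.5 m from the setpoint, at rest
problem.solve()
u_apply = u[:, 0].value                 # apply the first input, then re-solve
```

In a receding-horizon loop, only the first optimized input is applied before the problem is re-solved from the newly measured state, which lets the controller absorb wind and model error while still honoring the speed and actuator limits.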

References

  1. Jalil, B.; Leone, G.R.; Martinelli, M.; Moroni, D.; Pascali, M.A.; Berton, A. Fault detection in power equipment via an unmanned aerial system using multi modal data. Sensors 2019, 19, 3014.
  2. Menendez, O.; Cheein, F.A.A.; Perez, M.; Kouro, S. Robotics in power systems: Enabling a more reliable and safe grid. IEEE Ind. Electron. Mag. 2017, 11, 22–34.
  3. Disyadej, T.; Promjan, J.; Muneesawang, P.; Poochinapan, K.; Grzybowski, S. Application in O&M Practices of Overhead Power Line Robotics. In Proceedings of the 2019 IEEE PES GTD Grand International Conference and Exposition Asia (GTD Asia), Bangkok, Thailand, 19–23 March 2019; pp. 347–351.
  4. Wu, G.; Cao, H.; Xu, X.; Xiao, H.; Li, S.; Xu, Q.; Liu, B.; Wang, Q.; Wang, Z.; Ma, Y. Design and application of inspection system in a self-governing mobile robot system for high voltage transmission line inspection. In Proceedings of the 2009 Asia-Pacific Power and Energy Engineering Conference, Wuhan, China, 27–31 March 2009; pp. 1–4.
  5. Wale, P.B. Maintenance of transmission line by using robot. In Proceedings of the 2016 International Conference on Automatic Control and Dynamic Optimization Techniques (ICACDOT), Pune, India, 9–10 September 2016; pp. 538–542.
  6. Xie, X.; Liu, Z.; Xu, C.; Zhang, Y. A multiple sensors platform method for power line inspection based on a large unmanned helicopter. Sensors 2017, 17, 1222.
  7. Debenest, P.; Guarnieri, M.; Takita, K.; Fukushima, E.F.; Hirose, S.; Tamura, K.; Kimura, A.; Kubokawa, H.; Iwama, N.; Shiga, F. Expliner-Robot for inspection of transmission lines. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, 19–23 May 2008; pp. 3978–3984.
  8. Elizondo, D.; Gentile, T.; Candia, H.; Bell, G. Overview of robotic applications for energized transmission line work—Technologies, field projects and future developments. In Proceedings of the 2010 1st International Conference on Applied Robotics for the Power Industry, Montreal, QC, Canada, 5–7 October 2010; pp. 1–7.
  9. Luque-Vega, L.F.; Castillo-Toledo, B.; Loukianov, A.; Gonzalez-Jimenez, L.E. Power line inspection via an unmanned aerial system based on the quadrotor helicopter. In Proceedings of the MELECON 2014–2014 17th IEEE Mediterranean Electrotechnical Conference, Beirut, Lebanon, 13–16 April 2014; pp. 393–397.
  10. Hui, X.; Bian, J.; Yu, Y.; Zhao, X.; Tan, M. A novel autonomous navigation approach for UAV power line inspection. In Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), Macau, Macao, 5–8 December 2017; pp. 634–639.
  11. Ahmed, M.F.; Mohanta, J.; Sanyal, A.; Yadav, P.S. Path Planning of Unmanned Aerial Systems for Visual Inspection of Power Transmission Lines and Towers. IETE J. Res. 2023, 1–21.
  12. Hamelin, P.; Miralles, F.; Lambert, G.; Lavoie, S.; Pouliot, N.; Montfrond, M.; Montambault, S. Discrete-time control of LineDrone: An assisted tracking and landing UAV for live power line inspection and maintenance. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019; pp. 292–298.
  13. Alhassan, A.B.; Zhang, X.; Shen, H.; Xu, H. Power transmission line inspection robots: A review, trends and challenges for future research. Int. J. Electr. Power Energy Syst. 2020, 118, 105862.
  14. Mirallès, F.; Hamelin, P.; Lambert, G.; Lavoie, S.; Pouliot, N.; Montfrond, M.; Montambault, S. LineDrone Technology: Landing an unmanned aerial vehicle on a power line. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 6545–6552.
  15. Ramon-Soria, P.; Gomez-Tamm, A.E.; Garcia-Rubiales, F.J.; Arrue, B.C.; Ollero, A. Autonomous landing on pipes using soft gripper for inspection and maintenance in outdoor environments. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 5832–5839.
  16. Hang, K.; Lyu, X.; Song, H.; Stork, J.A.; Dollar, A.M.; Kragic, D.; Zhang, F. Perching and resting—A paradigm for UAV maneuvering with modularized landing gears. Sci. Robot. 2019, 4, eaau6637.
  17. Thomas, J.; Loianno, G.; Daniilidis, K.; Kumar, V. Visual servoing of quadrotors for perching by hanging from cylindrical objects. IEEE Robot. Autom. Lett. 2015, 1, 57–64.
  18. Paneque, J.L.; Martinez-de-Dios, J.R.; Ollero, A.; Hanover, D.; Sun, S.; Romero, A.; Scaramuzza, D. Perception-Aware Perching on Powerlines with Multirotors. IEEE Robot. Autom. Lett. 2022, 7, 3077–3084.
  19. Schofield, O.B.; Iversen, N.; Ebeid, E. Autonomous power line detection and tracking system using UAVs. Microprocess. Microsyst. 2022, 94, 104609.
  20. Yan, G.; Li, C.; Zhou, G.; Zhang, W.; Li, X. Automatic extraction of power lines from aerial images. IEEE Geosci. Remote Sens. Lett. 2007, 4, 387–391.
  21. Li, Z.; Liu, Y.; Hayward, R.; Zhang, J.; Cai, J. Knowledge-based power line detection for UAV surveillance and inspection systems. In Proceedings of the 2008 23rd International Conference Image and Vision Computing New Zealand, Christchurch, New Zealand, 26–28 November 2008; pp. 1–6.
  22. Yang, T.W.; Yin, H.; Ruan, Q.Q.; Da Han, J.; Qi, J.T.; Yong, Q.; Wang, Z.T.; Sun, Z.Q. Overhead power line detection from UAV video images. In Proceedings of the 2012 19th International Conference on Mechatronics and Machine Vision in Practice (M2VIP), Auckland, New Zealand, 28–30 November 2012; pp. 74–79.
  23. Cerón, A.; Prieto, F. Power line detection using a circle based search with UAV images. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 632–639.
  24. Song, B.; Li, X. Power line detection from optical images. Neurocomputing 2014, 129, 350–361.
  25. Xie, S.; Tu, Z. Holistically-nested edge detection. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1395–1403.
  26. Shen, W.; Wang, X.; Wang, Y.; Bai, X.; Zhang, Z. Deepcontour: A deep convolutional feature learned by positive-sharing loss for contour detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3982–3991.
  27. Bertasius, G.; Shi, J.; Torresani, L. Deepedge: A multi-scale bifurcated deep network for top-down contour detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 4380–4389.
  28. Zhang, H.; Yang, W.; Yu, H.; Zhang, H.; Xia, G.-S. Detecting power lines in UAV images with convolutional features and structured constraints. Remote Sens. 2019, 11, 1342.
  29. Madaan, R.; Maturana, D.; Scherer, S. Wire detection using synthetic data and dilated convolutional networks for unmanned aerial vehicles. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 3487–3494.
  30. Badrinarayanan, V.; Kendall, A.; Cipolla, R. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 2481–2495.
  31. Chen, L.-C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 834–848.
  32. Yu, C.; Wang, J.; Peng, C.; Gao, C.; Yu, G.; Sang, N. Bisenet: Bilateral segmentation network for real-time semantic segmentation. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 325–341.
  33. Yu, C.; Gao, C.; Wang, J.; Yu, G.; Shen, C.; Sang, N. Bisenet v2: Bilateral network with guided aggregation for real-time semantic segmentation. Int. J. Comput. Vis. 2021, 129, 3051–3068.
  34. Fan, M.; Lai, S.; Huang, J.; Wei, X.; Chai, Z.; Luo, J.; Wei, X. Rethinking BiSeNet for real-time semantic segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 9716–9725.
  35. Erginer, B.; Altug, E. Modeling and PD control of a quadrotor VTOL vehicle. In Proceedings of the 2007 IEEE Intelligent Vehicles Symposium, Istanbul, Turkey, 13–15 June 2007; pp. 894–899.
  36. Mellinger, D.; Michael, N.; Kumar, V. Trajectory generation and control for precise aggressive maneuvers with quadrotors. Int. J. Robot. Res. 2012, 31, 664–674.
  37. Thomas, J.; Pope, M.; Loianno, G.; Hawkes, E.W.; Estrada, M.A.; Jiang, H.; Cutkosky, M.R.; Kumar, V. Aggressive flight with quadrotors for perching on inclined surfaces. J. Mech. Robot. 2016, 8, 051007.
  38. Mao, J.; Li, G.; Nogar, S.; Kroninger, C.; Loianno, G. Aggressive visual perching with quadrotors on inclined surfaces. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 5242–5248.
  39. Popek, K.M.; Johannes, M.S.; Wolfe, K.C.; Hegeman, R.A.; Hatch, J.M.; Moore, J.L.; Katyal, K.D.; Yeh, B.Y.; Bamberger, R.J. Autonomous grasping robotic aerial system for perching (agrasp). In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–9.
  40. Ahmed, B.; Pota, H.R. Backstepping-based landing control of a RUAV using tether incorporating flapping correction dynamics. In Proceedings of the 2008 American Control Conference, Seattle, WA, USA, 11–13 June 2008; pp. 2728–2733.
  41. Wang, R.; Zhou, Z.; Shen, Y. Flying-wing UAV landing control and simulation based on mixed H2/H∞. In Proceedings of the 2007 International Conference on Mechatronics and Automation, Harbin, China, 5–8 August 2007; pp. 1523–1528.
  42. Escareño, J.; Salazar, S.; Romero, H.; Lozano, R. Trajectory control of a quadrotor subject to 2D wind disturbances. J. Intell. Robot. Syst. 2013, 70, 51–63.
  43. Kamel, M.; Stastny, T.; Alexis, K.; Siegwart, R. Model predictive control for trajectory tracking of unmanned aerial vehicles using robot operating system. In Robot Operating System (ROS); Springer: Berlin/Heidelberg, Germany, 2017; pp. 3–39.
  44. Small, E.; Sopasakis, P.; Fresk, E.; Patrinos, P.; Nikolakopoulos, G. Aerial navigation in obstructed environments with embedded nonlinear model predictive control. In Proceedings of the 2019 18th European Control Conference (ECC), Naples, Italy, 25–28 June 2019; pp. 3556–3563.
  45. Nguyen, H.; Kamel, M.; Alexis, K.; Siegwart, R. Model predictive control for micro aerial vehicles: A survey. In Proceedings of the 2021 European Control Conference (ECC), Delft, The Netherlands, 29 June–2 July 2021; pp. 1556–1563.