As an emerging field, robotic manipulation has advanced tremendously thanks to technological developments ranging from sensing to artificial intelligence. Over the decades, robotic manipulation has grown in the versatility and flexibility of mobile robot platforms, so robots are now capable of interacting with the world around them. To interact with the real world, robots require various sensory inputs from their surroundings; the use of vision in particular is increasing rapidly, as vision is unquestionably a rich source of information for a robotic system.
Year | Addressed Problems | Contributions | Outcomes |
---|---|---|---|
2016 [64][65][66] | Manipulation [64] and grasping control strategies [66] using eye-tracking and sensory-motor fusion [65]. | Object detection, path planning, and navigation [64]; Control of an endoscopic manipulator [65]; Sensory-motor fusion-based manipulation and grasping control strategy for a robotic hand–eye system [66]. | The proposed approach improves performance in calibration, task completion, and navigation [64]; Shows better performance than endoscope manipulation by an assistant [65]; Demonstrates responsiveness and flexibility [66]. |
2017 [67][68][69][70][71][72][73] | Following a human user with a robotic blimp [70]; Deformable object manipulation [67]; Tracking and navigation for aerial vehicles [68][73]; Object detection without GPU support [69]; Automated object recognition for assistive robots [71]; Path-finding for a humanoid robot [72]. | Robotic rope manipulation using a vision-based learning model [67]; Robust vision-based tracking system for a UAV [68]; Real-time robotic object detection and recognition model [69]; Behavioral stability in humanoid robots and path-finding algorithms [72]; Robust real-time navigation [70] and long-range object tracking system [70][73]. | Robot successfully manipulates a rope [67]; System achieves robust tracking in real-time [68] and proves efficient in object detection [69]; Robotic blimp can follow humans [70]; System was able to detect and recognize objects [71]; Algorithm successfully finds a path to guide the robot [72]; System reached an operational stage under varied lighting and weather conditions [73]. |
2018 [74][75][76][77][78][79][80][81] | Real-time mobile robot controller [74]; Target detection for safe UAV landing [75]; Vision-based grasping [76], object sorting [79], and dynamic manipulation [77]; Multi-task learning [78]; Learning complex skills from raw sensory inputs [80]; Autonomous landing of a quadrotor on moving targets [81]. | Sensor-independent controller for real-time mobile robots [74]; Detection and landing system for drones [75]; GDRL-based grasping benchmark [76]; Effective robotic framework for extensible RL [77]; Complete controller for generating robot arm trajectories [78]; Successful implementation of a camera–robot system [79]; Successful framework to learn a deep dynamics model on images [80]; Autonomous NN-based landing controller for UAVs on moving targets in search and rescue applications [81]. | The mobile robot reaches its goal [74]; The system finds targets and lands safely [75]; System grasps better than other algorithms [76]; Real-world reinforcement learning can handle large datasets and models [77]; Method yields a versatile manipulator that can accurately correct errors [78]; Robot gripper successfully places objects [79]; Generalization to a wide range of tasks [80]; Successful autonomous quadrotor landing on fixed and moving platforms [81]. |
2019 [82][83][84][85][86][87][88][89] | Nonlinear approximation for mobile robots [82]; Control of cable-driven robots [83]; Leader–follower formation control [84]; Motion control for a free-floating robot [85]; Control of soft robots [86]; Approaching an object when obstacles are present [87]; Needle-based percutaneous procedures using robotic technologies [88]; Natural interaction control of surgical robots [89]. | Effective recurrent neural network-based controller for robots [82]; Robust method for analyzing the stability of cable-driven robots [83]; Effective formation control for a multi-agent system [84]; Efficient vision-based system for a free-floating robot [85]; Stable framework for soft robots [86]; Useful system to increase the autonomy of people with upper-body disabilities [87]; Accurate system to identify the needle position and orientation [88]; Smooth model to use eye movements to control a robot [89]. | System outperforms existing ones [82]; Vision-based control is a good alternative to model-based control [83]; Control protocol completes formation tasks with visibility constraints [84]; Method eliminates false targets and improves positioning precision [85]; System maintained acceptable accuracy and stability [86]; A person successfully controlled the robotic arm using the system [87]; Framework shows the proposed robotic hardware’s efficiency [88]; Eye-movement-based control was feasible and convenient [89]. |
2020 [90][91][92][93][94][95] | Grasping under occlusion [90]; Recognition and manipulation of objects [91]; Controllers for decentralized robot swarms [92]; Robot manipulation via human demonstration [93]; Robot manipulation using iris tracking [94]; Object tracking for visual servoing [95]. | Robust grasping method for a robotic system [90]; Effective stereo algorithm for manipulation of objects [91]; Successful framework to control decentralized robot swarms [92]; Generalized framework for activity recognition from human demonstrations [93]; Real-time iris tracking method for the ophthalmic robotic system [94]; Successful method based on conventional template matching [95]. | Method’s effectiveness validated through experiments [90]; R-CNN method is very stable [91]; Architecture shows promising performance for large-sized swarms [92]; Proposed approach achieves good generalized performance [93]; Tracker is suitable for the ophthalmic robotic system [94]; Control system demonstrates significant improvement in feature tracking and robot motion [95]. |
2021 [96][97][98][99][100][101][102] | Human–robot handover applications [96]; Imitation learning for robotic manipulation [97]; Reaching and grasping objects using a robotic arm [98]; Integration of libraries for real-time computer vision [99]; Mobility and key challenges for various construction applications [100]; Obtaining the spatial information of an operated target [101]; Training actor–critic methods in RL [102]. | Efficient human–robot handover control strategy [96]; Intelligent vision-guided imitation learning framework for precise robotic manipulation [97]; Robotic hand–eye coordination system to achieve robust reaching ability [98]; Upgraded version of a real-time computer vision system [99]; Mobile robotic system for object manipulation using autonomous navigation and object grasping [100]; Calibration-free monocular vision-based robot manipulation [101]; Attention-driven robot manipulation for discretization of the translation space [102]. | Control shows promising and effective results [96]; Robot can reach the goal positions smoothly and intelligently using the framework [97]; Dual neural-network-based controller leads to higher success rate and better control performance [98]; Successfully implemented and tested on the latest technologies [99]; UGV autonomously navigates toward a selected location [100]; Performance of the method has been successfully evaluated [101]; Algorithm achieves state-of-the-art performance on several difficult robotics tasks [102]. |
2022 [103][104][105][106][107][108][109] | Micro-manipulation on cells [103]; Collision-free navigation [104]; Highly nonlinear continuum manipulation [105]; Complexity of RL in a broad range of robotic manipulation tasks [106]; Uncertainty in DNN-based prediction for robotic grasping [107]; Path planning for a robotic arm in a 3D workspace [108]; Object tracking and control of a robotic arm in real time [109]. | Path planning for magnetic micro-robots [103]; Neural radiance fields (NeRFs) for navigation in a 3D environment [104]; Aerial continuum manipulation systems (ACMSs) [105]; Attention-driven robotic manipulation [106]; Robotic grasping in distorted RGB-D data [107]; Real-time path generation with lower computational cost [108]; Real-time object tracking with reduced stress load and a high rate of success [109]. | Magnetic micro-robots performed accurately in a complex environment [103]; NeRF approach outperforms the dynamically informed INeRF baseline [104]; Simulation demonstrates good results [105]; ARM was successful on a range of RLBench tasks [106]; System performs better than end-to-end networks in difficult conditions [107]; System significantly eased the limitations of prior research [108]; System effectively locates the robotic arm in the desired location with very high accuracy [109]. |
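Several of the surveyed systems build on conventional template matching for object tracking (e.g., the visual-servoing work in [95]). As a rough illustration of that baseline technique only, not of any cited system's implementation, normalized cross-correlation can locate a known template within a camera frame. A minimal NumPy sketch (the function name and the brute-force search are illustrative assumptions; production systems use optimized implementations such as OpenCV's):

```python
import numpy as np

def match_template(image, template):
    """Locate `template` in `image` via normalized cross-correlation.

    Returns the (row, col) of the best-matching top-left corner.
    Brute-force sketch of conventional template matching.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch: correlation undefined, skip it
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Embed a known pattern in a synthetic frame and recover its location.
rng = np.random.default_rng(0)
img = rng.random((40, 40))
tpl = img[10:18, 22:30].copy()
print(match_template(img, tpl))  # -> (10, 22)
```

In a tracking loop, the matched position from each frame would feed the robot controller; the exhaustive search here is O(image × template) and is what fast correlation-based trackers optimize away.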
Study | Manipulation of Object | Vision-Based Tracking | Object Detection | Path Finding/Navigation | Real-Time Remote Control | Robotic Arm/Grasping |
---|---|---|---|---|---|---|
[67] | ✓ | |||||
[68] | ✓ | |||||
[69] | ✓ | |||||
[70] | ✓ | ✓ | ||||
[71] | ✓ | |||||
[72] | ✓ | ✓ | ||||
[73] | ✓ | ✓ | ||||
[74] | ✓ | ✓ | ||||
[75] | ✓ | ✓ | ||||
[76] | ✓ | |||||
[77] | ✓ | ✓ | ✓ | |||
[78] | ✓ | ✓ | ✓ | ✓ | ||
[79] | ✓ | ✓ | ✓ | ✓ | ||
[80] | ✓ | ✓ | ||||
[81] | ✓ | ✓ | ||||
[82] | ✓ | ✓ | ||||
[83] | ✓ | ✓ | ||||
[84] | ✓ | ✓ | ||||
[85] | ✓ | ✓ | ||||
[86] | ✓ | ✓ | ✓ | |||
[90] | ✓ | ✓ | ||||
[91] | ✓ | ✓ | ||||
[92] | ✓ | ✓ | ✓ | |||
[93] | ✓ | ✓ | ||||
[96] | ✓ | ✓ | ||||
[97] | ✓ | ✓ | ||||
[64] | ✓ | ✓ | ||||
[94] | ✓ | ✓ | ||||
[65] | ✓ | ✓ | ||||
[87] | ✓ | ✓ | ✓ | ✓ | ||
[95] | ✓ | ✓ | ||||
[88] | ✓ | ✓ | ||||
[89] | ✓ | ✓ | ✓ | ✓ | ||
[66] | ✓ | ✓ | ||||
[98] | ✓ | ✓ | ✓ | |||
[99] | ✓ | |||||
[100] | ✓ | ✓ | ||||
[101] | ✓ | ✓ | ||||
[102] | ✓ | |||||
[103] | ✓ | ✓ | ||||
[104] | ✓ | |||||
[105] | ✓ | |||||
[106] | ✓ | |||||
[107] | ✓ | |||||
[108] | ✓ | |||||
[109] | ✓ | ✓ |
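Many of the tracking-plus-grasping entries above close the loop between image features and arm motion through visual servoing. As a generic textbook illustration (the classical image-based visual servoing law with the point-feature interaction matrix, not the controller of any specific cited study; the function name and gain are assumptions), one control step computes a camera velocity from the feature error:

```python
import numpy as np

def ibvs_velocity(points, desired, depths, lam=0.5):
    """One step of classical image-based visual servoing (IBVS).

    points, desired: (N, 2) current/target image features in normalized
    coordinates (x = X/Z, y = Y/Z); depths: (N,) feature depths Z.
    Returns a 6-DOF camera velocity [vx, vy, vz, wx, wy, wz] from the
    standard law v = -lam * pinv(L) @ (s - s*).
    """
    rows = []
    for (x, y), Z in zip(points, depths):
        # Interaction (image Jacobian) matrix rows for a point feature.
        rows.append([-1 / Z, 0, x / Z, x * y, -(1 + x ** 2), y])
        rows.append([0, -1 / Z, y / Z, 1 + y ** 2, -x * y, -x])
    L = np.array(rows)
    error = (points - desired).reshape(-1)
    return -lam * np.linalg.pinv(L) @ error

# At the desired configuration the commanded velocity is zero.
pts = np.array([[0.1, 0.0], [0.0, 0.1], [-0.1, 0.0], [0.0, -0.1]])
v = ibvs_velocity(pts, pts, np.ones(4))
print(np.allclose(v, 0))  # -> True
```

The pseudo-inverse handles the redundancy between the 2N feature errors and the 6 velocity components; the gain `lam` trades convergence speed against stability, which is why the surveyed controllers pair such a law with robust feature tracking.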