Sensors Used on Fabric-Handling Robots: History

Fabric-handling robots could be used by individuals as home-assistive robots. In most industries, the majority of processes are automated, and human workers have either been replaced by robots or work alongside them. Industries that handle limp materials, such as fabrics, clothes, and garments, have seen far fewer such changes than today's technological evolution might suggest. Integrating robots into these industries is a demanding and challenging task, mostly because of the natural and mechanical properties of limp materials.

  • sensors
  • limp materials
  • garments
  • fabrics
  • automation
  • grippers
  • robot

1. Introduction

Textile industries show a relatively low percentage of automation in their production lines, which constantly transport, handle, and process limp materials. In 1990, the challenges of developing a fully automated garment manufacturing process were laid out in [1]. According to that article, one of the major mistakes usually made is approaching the problem without engineering rigor, characterizing materials as soft, harsh, slippery, etc., without quantitative values. It suggests that fabrics must be carefully studied and, if needed, redesigned to fulfill the desired specifications. As for the machinery used to manipulate fabrics, the soft and sensitive surface of such materials, combined with their lack of rigidity, demands sophisticated, specially designed grippers and robots that can handle limp materials accurately, dexterously, and quickly. Robots are integrated into production to improve the quality of the final product, so technology that might harm the material surface, like the bulk machines used in other industries, is not implemented.

Picking up fabric, either from a surface or from a pile of cloth, is probably the most important and challenging task in fabric manipulation. Various methods to pick up fabric from a solid surface have been developed, such as intrusion, surface attraction, and pinching [2]. Picking up fabric from a pile presents different challenges, such as detecting the fabric and grasping it dexterously.
Fabric-handling robots could, and probably will, also be used by individuals as home-assistive robots. The modern way of working and living, combined with increased life expectancy, has resulted in a high percentage of people with limited physical abilities who cannot easily perform daily tasks such as laundry and bed making. Advanced technologies could help these people by providing home-assistive robots capable of dealing with garments. In addition to the difficulties caused by the mechanical properties of garments, tasks involving human–robot interaction must be completely safe, and the robots must be precisely controllable. For this reason, sensors are necessary in such applications to ensure the safety of the humans involved. Home-assistive applications of this kind are already being investigated, and many research projects have developed robots that could help with housework [3][4][5][6][7][8][9][10][11][12] or assist humans with mobility problems [13][14][15][16][17][18][19].
A recent approach to clothes manipulation is CloPeMa (clothes perception and manipulation), which aims to advance the state of the art in autonomous perception and manipulation of all kinds of fabrics, textiles, and garments [20]. The project ran experiments with several types of fabric and different solutions for their manipulation, integrating various sensors for haptic and visual sensing and recognition on a pair of robotic arms.

2. Sensor Analysis

2.1. Working Principle

2.1.1. Visual Sensors

Most developed applications of fabric-handling robots include visual sensors. Visual sensors can be separated into various categories based on their working principle and their goal. The most fundamental classification is binary vs. non-binary sensors. Binary sensors can produce only two values as output: logic “0” and logic “1”. These sensors are only used to detect the presence of an object. A popular application of binary visual sensors is infrared sensors [21][22][23] (Figure 1), although other types are available, such as fiber-optic [24][25][26] or optoelectronic sensors [27]. In [28], two sensors working as 1D cameras were used as edge sensors, measuring the light intensity at every pixel.
Figure 1. Depiction of a 1D infrared sensor.
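As a rough illustration of how such a 1D edge sensor might be read, the following sketch scans an array of per-pixel light intensities for the first pixel darkened by fabric; the intensity scale and threshold are hypothetical, not values from [28].

```python
import numpy as np
from typing import Optional

def find_fabric_edge(pixels: np.ndarray, threshold: float = 0.5) -> Optional[int]:
    """Return the index of the first pixel covered by fabric, if any.

    The fabric blocks the light source, so covered pixels read below
    the threshold while uncovered pixels read above it.
    """
    covered = np.flatnonzero(pixels < threshold)
    return int(covered[0]) if covered.size else None

# Example: the fabric edge starts at pixel 12 of a 20-pixel line sensor.
readings = np.array([0.9] * 12 + [0.1] * 8)
print(find_fabric_edge(readings))  # -> 12
```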
On the other hand, non-binary sensors provide more information than just the existence of an object. These sensors are usually cameras, either 2D or 3D (Figure 2). According to Kelley [29], 2D vision can usually answer questions such as the location of a part, its orientation, its suitability, etc. These are the most common issues in fabric-manipulation applications, and 2D vision can solve them effectively while avoiding the heavy data processing required by 3D vision. The use of 2D cameras is very popular in applications in which object or edge detection is required, as image processing can offer a solution that does not require interaction between the manipulator and the object. In [23], a camera was used in different schemes to detect the location of a fabric, its centroid, or an edge, while in [9], a camera mounted above the manipulation surface inspected the folding process of clothes by detecting their edges and comparing images to an ideal model.
Figure 2. Depiction of 2D and 3D camera sensors.
One major drawback of 2D visual sensors is that the quality of the information they can provide depends heavily on experimental conditions, especially lighting.
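Under controlled lighting, the 2D detection scheme described above can be sketched with standard image processing. The following OpenCV example is illustrative, not the pipeline of [23] or [9], and assumes the fabric contrasts with a uniform background.

```python
import cv2
import numpy as np

def locate_fabric(image_bgr: np.ndarray):
    """Find the largest fabric-like blob; return its centroid and contour.

    Assumes the fabric contrasts with a uniform background, as on a
    typical manipulation table; real setups need tuned thresholds.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    fabric = max(contours, key=cv2.contourArea)  # largest blob = fabric
    m = cv2.moments(fabric)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid (pixels)
    return (cx, cy), fabric
```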
The advantage of 3D vision over 2D is the sense of depth it offers. In fabric-manipulation applications, measured depth differences are usually due to wrinkles or edges. An RGB-D sensor was used in [30]: a fabric or some clothes lay folded on a surface, and the sensor identified the folded part and determined the point from which to grasp the fabric. Using the same principle, the contour and configuration of the fabric, or even its presence, as in [10], can be detected. A humanoid robot assisting with the bottom dressing of a human [15] used a 3D camera (Xtion PRO LIVE) to locate the position of the legs and the state of the pants, as well as possible failures throughout the process. Another human-assisting robot is presented in [3], where an RGB-D camera located the edges of the bed sheets in a bed-making process. A sketch depicting a typical RGB-D sensor can be seen below in Figure 3.
Figure 3. Sketch of a typical RGB-D sensor.
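As a simplified sketch of the depth-based grasp-point idea, not the actual method of [30], the following treats the folded (two-layer) region as the part of the depth image that sits at least one extra layer above the table; the layer thickness and thresholds are illustrative.

```python
import numpy as np
from typing import Optional, Tuple

def grasp_point_from_depth(depth: np.ndarray, table_depth: float,
                           layer_thickness: float = 0.004
                           ) -> Optional[Tuple[int, int]]:
    """Suggest a pixel to grasp on the folded part of a fabric.

    depth holds per-pixel camera distances in meters; table_depth is
    the distance to the empty table. A folded region is two layers
    thick, so it sits at least one extra layer closer to the camera
    than single-layer fabric.
    """
    fabric = depth < table_depth - 0.5 * layer_thickness  # anything on the table
    folded = depth < table_depth - 1.5 * layer_thickness  # two-layer (folded) region
    candidates = folded if folded.any() else fabric
    if not candidates.any():
        return None                                       # table is empty
    masked = np.where(candidates, depth, np.inf)
    row, col = np.unravel_index(np.argmin(masked), depth.shape)
    return int(row), int(col)                             # highest candidate point
```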
Besides detecting edges and formations, 3D sensors are also widely used for classification purposes. In [31], a depth sensor helped a robot pick up clothes from a pile; each garment was then classified into one of four main clothing categories, and an algorithm identified the correct point from which it should be picked up for optimal unfolding.

2.1.2. Tactile Sensors

In robots, as in humans, vision is the most useful sense, since it provides a great deal of information quickly. One disadvantage of visual sensing is that it is prone to miscalculations and mistakes when environmental conditions vary. Furthermore, many mechanical properties of objects, such as hardness, cannot be identified by vision alone. A popular alternative to vision, especially in applications with fabrics, is tactile sensing.
The basic working principle is similar for most tactile sensors. Conductive components whose electrical properties change when they deform are arranged so that, when the sensor comes into contact with an object, they deform and a change in the component's resistance is measured (Figure 4). If the mechanical properties of the deformed part are known, the measured electrical change can be translated into the applied force.
Figure 4. Working principle of a tactile sensor: deformation of a conductive element changes its measured resistance.
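A minimal sketch of this conversion, assuming a piezoresistive element wired into a voltage divider and a made-up linear conductance-to-force calibration (both the circuit values and the constant k are hypothetical):

```python
def sensor_resistance(v_out: float, v_cc: float = 5.0,
                      r_fixed: float = 10_000.0) -> float:
    """Recover the tactile element's resistance (ohms) from a reading.

    The sensing element is in series with a fixed resistor across a
    v_cc supply; v_out is measured across the sensing element.
    """
    return r_fixed * v_out / (v_cc - v_out)

def contact_force(resistance: float, k: float = 50.0) -> float:
    """Map resistance to force with an illustrative calibration.

    Many piezoresistive elements have conductance roughly proportional
    to applied force; k (N*ohm) is a made-up constant that would come
    from characterizing the real sensor.
    """
    return k / resistance

# Example: a 1.2 V reading corresponds to roughly 0.016 N of contact force.
print(contact_force(sensor_resistance(1.2)))
```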

2.1.3. Body Sensors

Tactile and visual sensing add exteroception to a robotic system by enabling it to gather information from its environment. Proprioception, on the other hand, is the ability of a robotic system to sense its own movement and position in space. To make a robot proprioceptive, sensors are usually mounted on its body, mostly at its joints or at critical points (Figure 5). In [32], the finger-joint positions of a humanoid robot were measured using Hall sensors on the output of each joint, and arm-joint positions were measured using optical sensors. The robot explored the surface of clothes, and when it encountered features such as buttons or snaps, the exact position of the finger in contact with the feature was already known and could be marked as the feature's location.
Figure 5. Sensors mounted on the joints of a robot body.
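The sketch below illustrates the underlying idea with the forward kinematics of a planar two-link arm: given joint angles from the joint sensors, the fingertip position, and hence the location of a touched feature, follows directly. The two-link simplification and link lengths are assumptions, not details of [32].

```python
import numpy as np

def fingertip_position(theta1: float, theta2: float,
                       l1: float = 0.30, l2: float = 0.25) -> np.ndarray:
    """Forward kinematics of a planar two-link arm.

    theta1 and theta2 (radians) come from joint sensors such as Hall
    or optical encoders; l1 and l2 are illustrative link lengths in
    meters. When the fingertip touches a button or snap, this position
    marks the feature's location.
    """
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return np.array([x, y])

# Example: both joints at 30 degrees.
print(fingertip_position(np.deg2rad(30), np.deg2rad(30)))
```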

2.1.4. Wearable Sensors

A different kind of sensor implementation in fabric applications, which could be included in the body-sensor category, is stretch sensors embedded in clothes or even placed on the human body. These are called wearable sensors, as they can be worn with clothes or applied like stickers. Strictly speaking, they might belong to a separate category, since they do not involve fabric-manipulating robots themselves, but this kind of sensor could play a decisive role in the future, used on both humans and robots.

2.2. Data Processing and the Type of Manipulated Object

Sensors are used in robotic applications to give robots perception. Depending on the sensor used, different types of perception can be achieved, and depending on the type of perception required, different types of sensors are appropriate. Below, the most popular tasks for sensors are presented, with references to corresponding applications that have been developed.

2.2.1. Object Detection

The first step in almost every fabric-handling application is to locate the fabric and detect the points from which it will be grasped. In production lines, where conveyor belts are common and processes are more constrained, detecting the object is easier, as its likely location is known most of the time; only its presence needs to be confirmed. In such cases, simple sensor setups, such as binary sensors, suffice. Binary sensors can be visual [21][22][23][24][25][26][27][28], tactile [6][27][33], or even placed on the body of the robot, as in [34], where sensors were placed on a robot that picks up flexible materials from a stack; when the gripper contacts the stack, a signal is sent to a microprocessor.
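The sketch below shows how such a binary contact signal might be consumed; read_sensor is a hypothetical callable standing in for the actual hardware interface, and requiring several consecutive positive samples filters out electrical noise and brief false triggers.

```python
import time

def wait_for_contact(read_sensor, samples: int = 5,
                     interval: float = 0.01) -> None:
    """Block until a binary contact sensor reads '1' consistently.

    read_sensor is any callable returning 0 or 1. The counter resets
    on every negative reading, so only a sustained contact passes.
    """
    consecutive = 0
    while consecutive < samples:
        consecutive = consecutive + 1 if read_sensor() else 0
        time.sleep(interval)
```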

2.2.2. Control

The purpose of using sensors in robots is to enable them to operate more precisely in a closed loop. Force/torque sensors give robots feedback on the exerted force so that they can make the necessary adjustments, as in [26]. In [35], force/torque sensors were incorporated to constantly measure the forces applied by the gripper to the fabric and to control processes including laying, folding, etc. In the same system, a camera was also used to track the manipulated object.
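The sketch below shows a toy version of such a loop: a proportional controller that nudges the gripper closing position based on the force error, holding a delicate fabric firmly without crushing it. The gain, target force, and sensor interface are illustrative assumptions, not details of [26] or [35].

```python
def grip_force_step(measured_force: float, target_force: float,
                    position: float, gain: float = 1e-4) -> float:
    """One iteration of a proportional force-control loop for a gripper.

    The gripper closes slightly when the measured force (from a
    force/torque sensor, in N) is below target and opens when above.
    The gain maps force error (N) to a position correction (m).
    """
    error = target_force - measured_force
    return position + gain * error

# Typical use inside the robot's control loop (names are hypothetical):
# position = grip_force_step(ft_sensor.read_z(), target_force=2.0,
#                            position=position)
```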

2.2.3. Object Classification

Fabric recognition and classification is not a simple task, and in many cases, various sensors and complex data processing are necessary. Tactile and vision sensors are mostly used in applications of this kind, similarly to how humans recognize a fabric either by looking at it or by rubbing it, and sometimes both. In [7], a 2D camera on a humanoid robot captured images of a fabric, identified its contour, and classified the fabric into a clothing category; depending on the category, an automated folding algorithm was then applied. In [36], wrinkles and other features were derived from images of fabrics, and a set of Gabor filters was applied to classify fabrics into clothing categories, with almost perfect results. In [37], fabrics were not classified into categories; instead, their material properties, like stiffness and density, were estimated.
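As an illustration of the Gabor-filter approach, the sketch below builds a small filter bank with OpenCV and pools each response into simple statistics that a downstream classifier (e.g., an SVM) could consume; the kernel parameters are assumptions, not the values used in [36].

```python
import cv2
import numpy as np

def gabor_features(gray: np.ndarray, orientations: int = 8) -> np.ndarray:
    """Wrinkle-sensitive texture features from a bank of Gabor filters.

    Filters at several orientations respond strongly to wrinkles and
    fabric texture; the mean and variance of each filtered image form
    a compact feature vector for classification.
    """
    features = []
    for i in range(orientations):
        theta = i * np.pi / orientations  # filter orientation in radians
        kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0.0)
        response = cv2.filter2D(gray.astype(np.float32), -1, kernel)
        features += [response.mean(), response.var()]
    return np.array(features)
```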

3. Conclusions

A systematic analysis of different types of sensors was carried out, focusing on those used in fabric-handling applications. The goal was to group information about sensing systems in a way that facilitates future selection of appropriate hardware for fabric-handling applications. The use of automated systems is constantly increasing, and the implementation of appropriate sensing systems is crucial when developing such applications. Fabric handling has not been automated to the same extent as other fields, such as the automotive industry, which can be explained by the challenges these applications face. It is hoped that researchers will maintain an active interest in this field and continue to drive it forward. Future work could focus on grippers, in-hand manipulation strategies, or the textile types that pose greater or lesser challenges in automated applications.

This entry is adapted from the peer-reviewed paper 10.3390/machines10020101

References

  1. Seesselberg, H.A. A Challenge to Develop Fully Automated Garment Manufacturing. In Sensory Robotics for the Handling of Limp Materials; Springer: Berlin/Heidelberg, Germany, 1990; pp. 53–67.
  2. Koustoumpardis, P.N.; Aspragathos, N.A. A Review of Gripping Devices for Fabric Handling. In Proceedings of the International Conference on Intelligent Manipulation and Grasping IMG04, Genova, Italy, 1–2 July 2004; pp. 229–234.
  3. Seita, D.; Jamali, N.; Laskey, M.; Kumar Tanwani, A.; Berenstein, R.; Baskaran, P.; Iba, S.; Canny, J.; Goldberg, K. Deep Transfer Learning of Pick Points on Fabric for Robot Bed-Making. arXiv 2018, arXiv:1809.09810.
  4. Osawa, F.; Seki, H.; Kamiya, Y. Unfolding of Massive Laundry and Classification Types by Dual Manipulator. J. Adv. Comput. Intell. Intell. Inform. 2007, 11, 457–463.
  5. Kita, Y.; Kanehiro, F.; Ueshiba, T.; Kita, N. Clothes handling based on recognition by strategic observation. In Proceedings of the 2011 11th IEEE-RAS International Conference on Humanoid Robots, Bled, Slovenia, 26–28 October 2011; pp. 53–58.
  6. Bersch, C.; Pitzer, B.; Kammel, S. Bimanual Robotic Cloth Manipulation for Laundry Folding. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 1413–1419.
  7. Miller, S.; Berg, J.V.D.; Fritz, M.; Darrell, T.; Goldberg, K.; Abbeel, P. A geometric approach to robotic laundry folding. Int. J. Robot. Res. 2011, 31, 249–267.
  8. Maitin-Shepard, J.; Cusumano-Towner, M.; Lei, J.; Abbeel, P. Cloth grasp point detection based on multiple-view geometric cues with application to robotic towel folding. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 2308–2315.
  9. Osawa, F.; Seki, H.; Kamiya, Y. Clothes Folding Task by Tool-Using Robot. J. Robot. Mechatron. 2006, 18, 618–625.
  10. Yamazaki, K.; Inaba, M. A Cloth Detection Method Based on Image Wrinkle Feature for Daily Assistive Robots. In Proceedings of the MVA2009 IAPR International Conference on Machine Vision Applications, Yokohama, Japan, 20–22 May 2009; pp. 366–369. Available online: http://www.mva-org.jp/Proceedings/2009CD/papers/11-03.pdf (accessed on 19 April 2021).
  11. Twardon, L.; Ritter, H. Interaction skills for a coat-check robot: Identifying and handling the boundary components of clothes. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 3682–3688.
  12. Li, Y.; Hu, X.; Xu, D.; Yue, Y.; Grinspun, E.; Allen, P.K. Multi-Sensor Surface Analysis for Robotic Ironing. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 5670–5676.
  13. Yu, W.; Kapusta, A.; Tan, J.; Kemp, C.C.; Turk, G.; Liu, C.K. Haptic Simulation for Robot-Assisted Dressing. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 6044–6051.
  14. Kruse, D.; Radke, R.J.; Wen, J.T. Collaborative Human-Robot Manipulation of Highly Deformable Materials. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 3782–3787.
  15. Yamazaki, K.; Oya, R.; Nagahama, K.; Okada, K.; Inaba, M. Bottom Dressing by a Life-Sized Humanoid Robot Provided Failure Detection and Recovery Functions. In Proceedings of the 2014 IEEE/SICE International Symposium on System Integration, Tokyo, Japan, 13–15 December 2014; pp. 564–570.
  16. Joshi, R.P.; Koganti, N.; Shibata, T. A framework for robotic clothing assistance by imitation learning. Adv. Robot. 2019, 33, 1156–1174.
  17. Tamei, T.; Matsubara, T.; Rai, A.; Shibata, T. Reinforcement learning of clothing assistance with a dual-arm robot. In Proceedings of the 2011 11th IEEE-RAS International Conference on Humanoid Robots, Bled, Slovenia, 26–28 October 2011; pp. 733–738.
  18. Erickson, Z.; Clever, H.M.; Turk, G.; Liu, C.K.; Kemp, C.C. Deep Haptic Model Predictive Control for Robot-Assisted Dressing. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1–8.
  19. King, C.H.; Chen, T.L.; Jain, A.; Kemp, C.C. Towards an Assistive Robot That Autonomously Performs Bed Baths for Patient Hygiene. In Proceedings of the IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 319–324.
  20. Clothes Perception and Manipulation (CloPeMa)—HOME—CloPeMa—Clothes Perception and Manipulation. Available online: http://clopemaweb.felk.cvut.cz/clothes-perception-and-manipulation-clopema-home/ (accessed on 17 May 2021).
  21. Balaguer, B.; Carpin, S. Combining Imitation and Reinforcement Learning to Fold Deformable Planar Objects. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 1405–1412.
  22. Sahari, K.S.M.; Seki, H.; Kamiya, Y.; Hikizu, M. Clothes Manipulation by Robot Grippers with Roller Fingertips. Adv. Robot. 2010, 24, 139–158.
  23. Taylor, P.M.; Taylor, G.E. Sensory robotic assembly of apparel at Hull University. J. Intell. Robot. Syst. 1992, 6, 81–94.
  24. Kolluru, R.; Valavanis, K.P.; Steward, A.; Sonnier, M.J. A flat surface robotic gripper for handling limp material. IEEE Robot. Autom. Mag. 1995, 2, 19–26.
  25. Kolluru, R.; Valavanis, K.P.; Hebert, T.M. A robotic gripper system for limp material manipulation: Modeling, analysis and performance evaluation. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Albuquerque, NM, USA, 25 April 1997; Volume 1, pp. 310–316.
  26. Hebert, T.; Valavanis, K.; Kolluru, R. A robotic gripper system for limp material manipulation: Hardware and software development and integration. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Albuquerque, NM, USA, 25 April 1997; Volume 1, pp. 15–21.
  27. Doulgeri, Z.; Fahantidis, N. Picking up flexible pieces out of a bundle. IEEE Robot. Autom. Mag. 2002, 9, 9–19.
  28. Schrimpf, J.; Wetterwald, L.E. Experiments towards automated sewing with a multi-robot system. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 5258–5263.
  29. Kelley, R.B. 2D Vision Techniques for the Handling of Limp Materials. In Sensory Robotics for the Handling of Limp Materials; Springer Science and Business Media LLC: Cham, Switzerland, 1990; pp. 141–157.
  30. Triantafyllou, D.; Koustoumpardis, P.; Aspragathos, N. Type independent hierarchical analysis for the recognition of folded garments’ configuration. Intell. Serv. Robot. 2021, 14, 427–444.
  31. Doumanoglou, A.; Kargakos, A.; Kim, T.-K.; Malassiotis, S. Autonomous active recognition and unfolding of clothes using random decision forests and probabilistic planning. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 987–993.
  32. Platt, R.; Permenter, F.; Pfeiffer, J. Using Bayesian Filtering to Localize Flexible Materials during Manipulation. IEEE Trans. Robot. 2011, 27, 586–598.
  33. Koustoumpardis, P.N.; Nastos, K.X.; Aspragathos, N.A. Underactuated 3-Finger Robotic Gripper for Grasping Fabrics. In Proceedings of the 2014 23rd International Conference on Robotics in Alpe-Adria-Danube Region (RAAD), Smolenice, Slovakia, 3–5 September 2014.
  34. Kondratas, A. Robotic Gripping Device for Garment Handling Operations and Its Adaptive Control. Fibres Text. East. Eur. 2005, 13, 84–89.
  35. Paraschidis, K.; Fahantidis, N.; Vassiliadis, V.; Petridis, V.; Doulgeri, Z.; Petrou, L.; Hasapis, G. A robotic system for handling textile materials. In Proceedings of the 1995 IEEE International Conference on Robotics and Automation, Nagoya, Japan, 21–27 May 1995; Volume 2, pp. 1769–1774.
  36. Yamazaki, K.; Inaba, M. Clothing Classification Using Image Features Derived from Clothing Fabrics, Wrinkles and Cloth Overlaps. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 2710–2717.
  37. Bouman, K.L.; Xiao, B.; Battaglia, P.; Freeman, W.T. Estimating the Material Properties of Fabric from Video. In Proceedings of the IEEE International Conference on Computer Vision, Sydney, Australia, 1–8 December 2013.