Vision-Based Vibration Monitoring

Contactless structural monitoring has seen a growing number of applications in civil engineering in recent years. Indeed, the elimination of physically installed sensors is very attractive, especially for structures that might not be easily or safely accessible yet require the experimental evaluation of their condition, for example following extreme events such as strong earthquakes, explosions, and floods. Among contactless technologies, vision-based monitoring is possibly the solution that has attracted the most interest from civil engineers, given that the advantages of contactless monitoring can potentially be obtained through simple and low-cost consumer-grade instrumentation.

Keywords: experimental modal analysis; operational modal analysis; structural monitoring; structural vibrations; vision-based monitoring

1. Brief Overview of Vision-Based Monitoring Systems

1.1. Monitoring Process

A vision-based system may consist of a set of video cameras connected to a computer running software capable of processing the acquired images in real time, or of a set of video cameras whose recordings are simply acquired during monitoring and processed afterwards. Depending on the distance between the cameras and the structure to be monitored, appropriate lenses must be selected to obtain images with adequate resolution, which is indispensable for tracking the motion of the selected targets with sufficient accuracy, e.g., [1][2][3][4]. Lighting equipment can be added to conduct measurements in poorly illuminated locations or even at night.

The monitoring process roughly consists of the following phases: (1) installation, i.e., the video cameras equipped with the selected lenses are placed on tripods in the most convenient locations, connected to the computer, and synchronized; for each video camera the targets to be tracked are defined (depending on the post-processing procedures, these could be, for example, applied markers or existing textures on the structure surface); (2) calibration, i.e., the relationship between pixel coordinates and physical coordinates is obtained, usually based on a known physical dimension on the object surface and its corresponding dimension in pixels in the image; and (3) video acquisition and processing, i.e., the videos are recorded and the motion of each target is tracked in the image sequences; as a result, the displacement time history is given as output. A schematic representation of this simple flowchart is depicted in Figure 1, together with the sources of errors and uncertainties discussed in the following paragraph.

Figure 1. Diagram of the vision-based monitoring and relations with the sources of errors and uncertainties.
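
As an illustration of phases (2) and (3), the following minimal Python/OpenCV sketch performs a scale-factor calibration from a known dimension and tracks a single target by template matching to produce a displacement time history. The video file name, target region, and calibration values are placeholders, and the sketch is not the implementation of any specific system reviewed here; in practice, sub-pixel refinement of the correlation peak and lens-distortion correction would also be applied.

```python
# Minimal sketch of phases (2)-(3): scale-factor calibration and target tracking.
# File name, target region, and calibration values are assumed placeholders.
import cv2
import numpy as np

KNOWN_DIMENSION_MM = 500.0   # physical length of a feature visible in the image (assumed)
KNOWN_DIMENSION_PX = 125.0   # the same feature measured in pixels on a still frame (assumed)
scale = KNOWN_DIMENSION_MM / KNOWN_DIMENSION_PX   # mm per pixel

cap = cv2.VideoCapture("bridge.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)

ok, frame0 = cap.read()
gray0 = cv2.cvtColor(frame0, cv2.COLOR_BGR2GRAY)
x, y, w, h = 600, 400, 60, 60                     # target region selected on the first frame (assumed)
template = gray0[y:y + h, x:x + w]

times, disp_mm = [], []
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Normalised cross-correlation between the reference template and the current frame
    res = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(res)
    dy_px = max_loc[1] - y                        # vertical target motion in pixels
    times.append(frame_idx / fps)
    disp_mm.append(dy_px * scale)                 # vertical displacement time history in mm
    frame_idx += 1

cap.release()
```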

1.2. Errors and Uncertainties

Unlike other measurement approaches, where the accuracy of the employed sensors/systems is provided by their manufacturers and generally remains stable within assigned operational conditions during a given calibration time span, the accuracy of vision-based systems cannot be related solely to the technical specifications of the video cameras. The determination of accuracy in vision-based monitoring is a rather complex problem, as it depends on a multifaceted combination and interaction of different parameters. The sources of errors and uncertainties in vision-based monitoring can be subdivided into three groups: (1) intrinsic to the monitoring hardware, e.g., optical distortions and aberrations in the lenses, limitations in the resolution, and the performance of the video camera sensor; (2) relevant to the software and calibration/synchronization process, e.g., limitations of the motion tracking algorithm, synchronization lags among cameras, and round-offs in camera calibration; and (3) environmental, e.g., the influence of the location where the camera is installed, vibrations induced in the camera-tripod system, variable ambient light, and non-uniform air refraction due to temperature differences between the installed cameras and the structure being monitored. These sources inevitably influence each other; for example, the resolution of the hardware influences the precision that can be achieved in the calibration, which is in turn influenced by the environmental conditions. The scheme depicted in Figure 1 summarizes the possible interactions between the three phases of the vision-based monitoring process and the sources of errors and uncertainties.

Investment can be made in the hardware (high-quality cameras and lenses), in up-to-date software, in efforts to access the most favorable locations for camera installation, and in careful control of the calibration and synchronization. Nevertheless, the variability of the environmental parameters might still jeopardize the quality of the results; this is a concern especially for long-term field monitoring, as required in structural health monitoring, which faces large variations in ambient light, temperature, humidity, wind, and other possible interferences inducing vibrations in the cameras. As a consequence, these sources of errors and uncertainties have a larger impact on vision-based monitoring than on conventional monitoring procedures in which sensors are in direct contact with the object being monitored.

2. Recent Field Applications of Vision-Based Vibration Monitoring in Civil Engineering

2.1. General Overview

Many published works presenting applications of vision-based monitoring in civil engineering can be found in the technical literature. Contributions (only refereed journal articles are considered here) can be organized into five areas of monitoring applications: (1) measurements of displacements and strains under static and quasi-static loadings [5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21][22][23][24][25][26][27]; (2) measurements of displacement time histories in prototypes or small-scale structures under controlled environmental conditions, typically in a laboratory [28][29][30][31][32][33][34][35][36][37][38][39][40][41][42][43][44][45][46][47][48][49][50][51][52][53][54][55][56][57][58][59]; (3) field measurements of displacement time histories in full-scale structures [60][61][62][63][64][65][66][67][68][69][70][71][72][73][74][75][76][77][78][79][80][81][82][83][84][85][86][87][88][89][90][91][92][93][94][95][96][97]; (4) development of sensors using vision-based techniques [98][99][100][101][102][103][104]; and (5) field measurements of moving components, as in the case of wind turbines, e.g., [105][106][107][108]. This subdivision is made regardless of the adopted vision-based techniques and image processing algorithms. It should be remarked that overlaps exist between these monitoring applications since, in some cases, publications illustrate preliminary laboratory validations prior to field testing. Hence, the proposed subdivision should be considered on the basis of the main contribution provided.

Attention in this review article is given to the analysis of recent results obtained in vibration (displacement time history) monitoring of civil engineering structures and infrastructures in the field, as documented in refereed journal articles published in the last four years [80][81][82][83][84][85][86][87][88][89][90][91][92][93][94][95][96][97]. The results presented are subdivided into six structural groups: steel bridges, steel footbridges, steel structures for sport stadiums, reinforced concrete structures, masonry structures, and timber footbridge. For each field study, a short description of the monitored structure is provided, together with a summary of the main information and conclusions reported in the publication. A list of the considered applications is given in Table 1; it is observed that half of them are in the U.S.A. and that bridges/footbridges are the most recurring structures.

Table 1. Recent field applications of vision-based vibration monitoring in civil engineering.

Group | Structure | Country | Authors and Reference
Steel bridges | Suspension bridge | U.S.A. | Feng and Feng [84]
Steel bridges | Truss with vertical lift | U.S.A. | Chen et al. [86]
Steel bridges | Skew girder | U.K. | Xu et al. [91]
Steel footbridges | Cable-stayed bridge | U.K. | Xu et al. [90]
Steel footbridges | Suspension bridge | Northern Ireland | Lydon et al. [93]
Steel footbridges | Suspension bridge | U.S.A. | Hoskere et al. [94]
Steel footbridges | Vertical truss frames | U.S.A. | Dong et al. [96]
Steel structures for sport stadiums | Grandstands | U.S.A. | Khuc and Catbas [81][82][95]
Steel structures for sport stadiums | Superstructure cables | U.S.A. | Feng et al. [83]
Reinforced concrete structures | Deck on arch footbridge | U.S.A. | Shariati and Schumacher [80]
Reinforced concrete structures | Five-story building | U.S.A. | Harvey and Elisha [87]
Reinforced concrete structures | Beam-slab bridge | Northern Ireland | Lydon et al. [93]
Masonry structures | Heritage ruins and arch bridge | Italy | Fioriti et al. [88]
Masonry structures | Arch bridge | U.K. | Acikgoz et al. [89]
Masonry structures | Arch bridge | Australia | Dhanasekar et al. [92]
Timber footbridge | Deck-stiffened arch | Greece | Fradelos et al. [97]

For each reference, some essential information on the adopted hardware is provided in Table 2, alongside the video processing approach (optical flow, template matching, feature matching, motion magnification, or proprietary commercial software), the loading condition during monitoring, and the comparisons made with other monitoring technologies. In this way, Table 1 and Table 2 are intended to serve as a guide to the following paragraphs, each dedicated to one of the six structural groups, presented in the same order used in the tables.

Table 2. Essential information on adopted hardware, video processing, loading conditions during monitoring, and comparison with other monitoring technologies.

Reference | Camera, Pixel Resolution, and Frame Rate (FPS) | Video Processing Algorithm | Loading Condition during Monitoring | Comparisons with Other Monitoring Technologies
[84] | Point Grey, 1280 × 1024, 10 | Template mat. | Passage of subway trains | No direct; GPS and radar
[86] | Point Grey, 800 × 600, 30 | Optical flow | Lift impact, normal traffic | Accelerom., strain gauges
[91] | GoPro, 1920 × 1080, 25; Imetrum, 2048 × 1088, 30 | Template mat., Imetrum [109] | Passage of trains | Low-cost and high-end vision-based, accelerometers
[90] | GoPro, 1920 × 1080, 30 | Template mat. | Crowd of pedestrians | Wireless accelerometers
[93] | GoPro, 1920 × 1080, 25 | Template mat. | Crowd of pedestrians | Accelerometers
[94] | DJI, 3840 × 2160, 30 | Optical flow | Walking, running, jumping | Accelerometers
[96] | Low cost, 1920 × 1080, 60 | Feature mat. | Walking, running, jumping | Accelerometers
[81][82][95] | Canon, N/A, 30 and 60 | Feature mat. | Crowd during game | Accelerom., displ. transd.
[83] | Point Grey, 1280 × 1024, 50 | Template mat. | Operational, shaken | Load cell
[80] | Canon, N/A, 60 | Motion magn. | Pedestrian jumping | No direct; vision-based
[87] | N/A, 1056 × 720, 25 | Feature mat. | Outdoor shake table | Accelerometers
[93] | GoPro, 1920 × 1080, 25 | Template mat. | Normal vehicular traffic | No direct; integr. fiber optics
[88] | N/A | Motion magn. | Tram vibrations, wind | Velocimeters
[89] | Imetrum, N/A, 50 | Imetrum [109] | Passage of trains | Fiber optics
[92] | Sony, 1936 × 1216, 50 | Dantec [110] | Passage of trains | No direct; numerical
[97] | Low cost, 1920 × 1080, 30 | Optical flow | Group of pedestrians | Accelerom., GPS, theodolite

Anticipating the detailed discussion, in all cases the comparisons showed good correlation between vision-based monitoring and the other technologies considered, the one exception being the steel footbridge (vertical truss frames) tested by Dong et al. [96], where the differences between accelerometer and vision-based measurements were not negligible. It should be remarked that, in four cases, no direct comparisons were made: Shariati and Schumacher [80], as well as Feng and Feng [84], compared the magnitudes of the measurements with those obtained in previous tests and found the comparisons favorable; in Dhanasekar et al. [92], the outcomes of the experimental monitoring compared satisfactorily with numerical simulations in terms of the magnitude of the monitored structural parameters; and in Lydon et al. [93], vision-based monitoring was part of an integrated monitoring system that included fiber optics, with the two systems intended to complement each other.

2.2. Steel Bridges

Feng and Feng [84] presented the outcomes of vision-based field monitoring of the Manhattan Bridge (New York, NY, USA) using a single camera for remote real-time displacement measurements at a single point and, simultaneously, at multiple points. The Manhattan Bridge, opened to traffic in 1909, is a suspension bridge spanning the East River in New York City, connecting Manhattan and Brooklyn; the main span is 448 m long and the deck is 36.5 m wide, carrying seven traffic lanes and four subway tracks. The camera was placed on stable stone steps around 300 m away from the bridge mid-span and the video was recorded at a frame rate of 10 FPS. The known dimension (7.2 m) of the vertical trusses was used for camera calibration. Displacement responses at a single point in the mid-span region were measured during the passage of subway trains, with an estimated scale factor of 20.5 mm/pixel. The authors commented that the dynamic displacement response was similar to that measured by GPS and interferometric radar systems in previous studies. Then, by zooming out the lens to obtain a large field of view (FOV), i.e., the area that is visible in the image, three points in the mid-span region were selected and a scale factor of about 36 mm/pixel was estimated. The authors commented that these measurements displayed more fluctuations, especially for small displacement amplitudes, as a consequence of the larger FOV, which decreased the measurement resolution compared with the single-point case. In addition, the authors studied the influence of camera vibration during the field measurements. Such a test was conducted by looking at a building in the background and tracking its apparent motion; the camera motion was estimated under the assumption that the building was not moving. The authors concluded that, compared with the bridge displacement, the camera motion was insignificant.
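
The trade-off between FOV and measurement resolution can be made concrete with some illustrative arithmetic based on the values reported above; the 0.1-pixel tracking precision assumed below is a typical figure for sub-pixel template matching and is not taken from the cited study.

```python
# Illustrative arithmetic for the scale factors reported by Feng and Feng [84];
# the 0.1-pixel tracking precision is an assumed figure, not taken from the paper.
calib_dimension_mm = 7200.0          # known truss dimension used for calibration

for scale_mm_per_px in (20.5, 36.0): # single-point setup vs. zoomed-out multi-point setup
    span_px = calib_dimension_mm / scale_mm_per_px
    resolution_mm = 0.1 * scale_mm_per_px   # displacement resolution at ~0.1 px tracking precision
    print(f"{scale_mm_per_px:5.1f} mm/px -> calibration feature spans ~{span_px:5.0f} px, "
          f"~{resolution_mm:.1f} mm displacement resolution")
```

The larger scale factor (wider FOV) maps each pixel to more millimeters, so the same tracking precision yields a coarser displacement resolution, consistent with the larger fluctuations observed at small amplitudes.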

2.3. Steel Footbridges

Xu et al. [90] illustrated the field vision-based monitoring of the Baker Bridge, a cable-stayed footbridge spanning 109 m over the A379 dual carriageway in Exeter (UK). The bridge provides cyclist and pedestrian access to the Sandy Park Stadium and experiences heavy pedestrian traffic on match days. The bridge comprises a single A-shaped tower that supports the continuous steel deck over a simple support at the pylon cross-beam and via seven pairs of stay cables. Because of the range of frequencies of its first vibration modes, the bridge is prone to noticeable vibration response under pedestrian traffic. A consumer-grade camera was mounted on a tripod at the central reservation of the A379 carriageway below, approximately 55.30 m from the bridge tower. Video was recorded at 30 FPS. Camera calibration was based on known structural dimensions taken from the as-built drawings, using a narrow FOV setting. Four triaxial wireless accelerometers were installed on the bridge deck to validate the results obtained from processing the images acquired by the video camera. The monitoring of the bridge included periods when large crowds of spectators crossed the deck. The modal frequencies of the bridge deck identified from vision-based monitoring accurately matched those obtained from the contact accelerometers. In addition, measurements of cable vibration using the vision-based system were performed and compared with the results from two triaxial wireless accelerometers installed on the cables. The authors concluded that the vision-based system is better at capturing the lower modal frequencies of the cables, whereas the accelerometers provide reliable estimates of the higher-frequency modes. However, the multipoint deformation data obtained using the vision system proved effective for tracking the cable dynamic properties at the same time as the bridge deformation, allowing the effect of varying load on cable tensions to be observed. In this way, a powerful diagnostic capability for larger cable-supported structures was achieved.
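
Comparisons such as the one above rest on identifying modal frequencies from the measured time histories. The sketch below is a minimal, generic example of peak-picking natural frequencies from a vision-based displacement record and a co-located accelerometer record using a Welch power spectral density; the file names and sampling rates are assumptions, and this is not the identification procedure used by the authors.

```python
# Minimal sketch of peak-picking modal frequencies from a vision-based displacement
# record and an accelerometer record; signals and rates are placeholders, not data
# from the Baker Bridge study.
import numpy as np
from scipy.signal import welch, find_peaks

fs_video, fs_accel = 30.0, 128.0             # assumed sampling rates (frames/s, samples/s)
disp = np.load("deck_displacement_mm.npy")   # vision-based displacement time history (assumed file)
accel = np.load("deck_acceleration_g.npy")   # co-located accelerometer record (assumed file)

def dominant_frequencies(signal, fs, n_peaks=3):
    """Return the frequencies of the largest PSD peaks (simple peak picking)."""
    f, pxx = welch(signal - np.mean(signal), fs=fs, nperseg=min(len(signal), 2048))
    peaks, _ = find_peaks(pxx)
    strongest = peaks[np.argsort(pxx[peaks])[::-1][:n_peaks]]
    return np.sort(f[strongest])

print("vision-based :", dominant_frequencies(disp, fs_video))
print("accelerometer:", dominant_frequencies(accel, fs_accel))
```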

2.4. Steel Structures for Sport Stadiums

Khuc and Catbas [81][82][95] illustrated a campaign of field vision-based monitoring of the steel superstructures of a football stadium in the USA with a seating capacity of approximately 45,000, which exhibited considerable vibration levels, especially at the sections occupied by the highly active local team supporters. The vision-based method and framework implemented by the authors were verified under different experimental conditions, including changing light conditions, different camera locations (distances and angles), and camera frame rates (30 and 60 FPS). Specifically, a beam under the grandstand was selected for monitoring predetermined measurement points. A displacement potentiometer and an accelerometer were installed for comparison. The contact sensors and the camera recorded the structural vibrations synchronously during periods of intense crowd motion throughout football games. The authors concluded that the results from the vision-based measurements were consistent with those from the contact measurements and that the first three operational modal frequencies under a human jumping load were almost the same. In addition, the authors commented that, although quite accurate results could be achieved for defined measurement ranges and conditions through a completely non-contact vision-based implementation with low-cost hardware, issues such as data storage requirements for clips and images, processing time for image data, and limitations on horizontal displacement measurement need to be addressed in future developments.
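
Table 2 lists feature matching as the processing approach for this campaign. The function below is a generic, marker-free displacement estimate obtained from matched ORB keypoints between a reference frame and the current frame; it is a simplified sketch of the general idea, not the authors' specific pipeline, and the region of interest and parameters are assumptions.

```python
# Generic marker-free displacement estimate from ORB keypoint matches between a
# reference frame and the current frame; not the specific pipeline of [81][82][95].
import cv2
import numpy as np

def region_shift_px(ref_gray, cur_gray, roi):
    """Median pixel shift of matched ORB keypoints inside a region of interest."""
    x, y, w, h = roi
    mask = np.zeros(ref_gray.shape, dtype=np.uint8)
    mask[y:y + h, x:x + w] = 255                      # restrict detection to the monitored region

    orb = cv2.ORB_create(nfeatures=500)
    kp_ref, des_ref = orb.detectAndCompute(ref_gray, mask)
    kp_cur, des_cur = orb.detectAndCompute(cur_gray, None)
    if des_ref is None or des_cur is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_cur)
    if not matches:
        return None

    shifts = np.array([np.array(kp_cur[m.trainIdx].pt) - np.array(kp_ref[m.queryIdx].pt)
                       for m in matches])
    return np.median(shifts, axis=0)   # (dx, dy) in pixels; multiply by the mm/pixel scale factor
```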

2.5. Reinforced Concrete Structures

Shariati and Schumacher [80] documented the field vision-based monitoring of the Streicker Bridge, a footbridge on the Princeton University campus (New Jersey, USA) with a straight main deck section supported by a steel truss system underneath and four curved ramps leading up to the straight section. Structurally, the main span is a deck-stiffened arch and the legs are curved continuous girders supported by steel columns. The legs are horizontally curved and the shape of the main span follows this curvature. The arch and columns are weathering steel, while the main deck and legs are made of reinforced post-tensioned concrete. A consumer-grade camera with a zoom lens was used to acquire a 60 FPS video of one of the ramps while a number of volunteers jumped up and down on it. A target mounted on the edge of the bridge slab was used to track displacement time histories. This target had been set up a few years earlier by a research team from Columbia University that investigated the same footbridge with their own vision-based monitoring system [99]. In addition, the Streicker Bridge was equipped with two fiber-optic sensing technologies, i.e., discrete long-gauge sensing based on fiber Bragg gratings, and truly distributed sensing based on Brillouin optical time-domain analysis; both sensor types were embedded in the concrete during construction. The natural frequencies obtained by the authors in their tests were found to be the same as those measured by the fiber-optic measurement system and by the other vision-based method in [99]. In addition to the frequency content, the two vision-based measurements gave comparable displacement amplitudes, showing the replicability of the obtained results.

2.6. Masonry Structures

Fioriti et al. [88] presented the monitoring of two cultural heritage constructions in Italy, i.e., the temple of Minerva Medica, a ruined nymphaeum of ancient Imperial Rome, and the Ponte delle Torri in Spoleto, an aqueduct and pedestrian bridge with multiple arches, a total length of 230 m, and piers up to 80 m high, completed in the Middle Ages and possibly built over Roman ruins. The Minerva Medica ruins are very close to a tramway producing strong vibrations, whose effects were clearly evident in the video taken using a low-cost consumer-grade camera at a distance of 9 m. Modal analysis by motion magnification of the field video recordings was performed and compared with the results obtained through conventional contact velocimeters; the differences were limited to just a few percentage points. Satisfactory results were also achieved for the Ponte delle Torri, despite the small level of structural excitation due to wind action and the low resolution of the adopted video cameras. The authors commented that such results constituted a remarkable starting point for future experimentation and improvement. Indeed, monitoring the ambient vibration of a massive multiple-arch masonry structure under normal conditions through vision-based monitoring appears to be a notably successful case study, considering the objections often raised against installing contact sensors on cultural heritage constructions.

2.7. Timber Footbridge

Fradelos et al. [97] illustrated the field vision-based monitoring of the Kanellopoulos timber arch footbridge (Patras, Greece), 30 m long and 2.9 m wide, made of glulam wood and metallic elements. The omission of X-bracing below the deck and the poor construction of the metal X-bracing at its roof made the footbridge prone to lateral oscillations. The bridge was monitored using satellite systems, robotic theodolites, and accelerometers. Videos were taken during testing using common low-cost cameras, without any initial intention of using them for vision-based monitoring. These video recordings were later examined and used to estimate the dynamic horizontal deflections of specific points of the footbridge. It was shown that the analysis of low-cost video images using a simple approximate technique permitted the reconstruction of the movements of the bridge and the computation of some of its structural characteristics. This result was possible under ideal conditions: the movement was two-dimensional, the displacements of the selected target points produced a signal exceeding the pixel resolution, the camera was in a fixed position, the video image covered stable points defining a reference system, and structural elements near the selected target points allowed the image to be scaled along the two examined axes. As a result, the first lateral natural frequency of the footbridge obtained from video processing differed by less than 2% from that estimated using accelerometers and geodetic sensors.

References

  1. Sutton, M.A.; Orteu, J.J.; Schreier, H.W. Image Correlation for Shape, Motion and Deformation Measurements: Basic Concepts, Theory and Applications, 1st ed.; Springer: New York, NY, USA, 2009; pp. 1–316.
  2. Kohut, P.; Holak, K. Vision-Based Monitoring System. In Advanced Structural Damage Detection, 1st ed.; Stepinski, T., Uhl, T., Staszewski, W., Eds.; Wiley: Chichester, UK, 2013; pp. 279–320.
  3. Feng, D.; Feng, M.Q. Computer vision for SHM of civil infrastructure: From dynamic response measurement to damage detection—A review. Eng. Struct. 2018, 156, 105–117.
  4. Dong, C.Z.; Catbas, F.N. A review of computer vision–based structural health monitoring at local and global levels. Struct. Health Monit. 2020, in press.
  5. Sutton, M.A.; Wolters, W.J.; Peters, W.H.; Ranson, W.F.; McNeill, S.R. Determination of displacements using an improved digital correlation method. Image Vision Comput. 1983, 1, 133–139.
  6. Sutton, M.A.; Cheng, M.; Peters, W.H.; Chao, Y.J.; McNeill, S.R. Application of an optimized digital correlation method to planar deformation analysis. Image Vision Comput. 1986, 4, 143–150.
  7. Lee, J.J.; Shinozuka, M. Real-time displacement measurement of a flexible bridge using digital image processing techniques. Exp. Mech. 2006, 46, 105–114.
  8. Yoneyama, S.; Kitagawa, A.; Iwata, S.; Tani, K.; Kikuta, H. Bridge deflection measurement using digital image correlation. Exp. Tech. 2007, 31, 34–40.
  9. Park, J.W.; Lee, J.J.; Jung, H.J.; Myung, H. Vision-based displacement measurement method for high-rise building structures using partitioning approach. NDT E Int. 2010, 43, 642–647.
  10. Peddle, J.; Goudreau, A.; Carlson, E.; Santini-Bell, E. Bridge displacement measurement through digital image correlation. Bridge Struct. 2011, 7, 165–173.
  11. Sładek, J.; Ostrowska, K.; Kohut, P.; Holak, K.; Gaska, A.; Uhl, T. Development of a vision based deflection measurement system and its accuracy assessment. Measurement 2013, 46, 1237–1249.
  12. Park, S.W.; Park, H.S.; Kim, J.H.; Adeli, H. 3D displacement measurement model for health monitoring of structures using a motion capture system. Measurement 2015, 59, 352–362.
  13. Quan, C.; Tay, C.J.; Sun, W.; He, X. Determination of three-dimensional displacement using two-dimensional digital image correlation. Appl. Opt. 2008, 47, 583–593.
  14. Yoneyama, S.; Ueda, H. Bridge deflection measurement using digital image correlation with camera movement correction. Mater. Trans. 2012, 53, 285–290.
  15. Hoult, N.A.; Take, W.A.; Lee, C.; Dutton, M. Experimental accuracy of two dimensional strain measurements using digital image correlation. Eng. Struct. 2013, 46, 718–726.
  16. Gencturk, B.; Hossain, K.; Kapadia, A.; Labib, E.; Mo, Y.L. Use of digital image correlation technique in full-scale testing of prestressed concrete structures. Measurement 2014, 7, 505–515.
  17. Ghorbani, R.; Matta, F.; Sutton, M.A. Full-field deformation measurement and crack mapping on confined masonry walls using digital image correlation. Exp. Mech. 2015, 55, 227–243.
  18. Almeida Santos, C.; Oliveira Costa, C.; Batista, J. A vision-based system for measuring the displacements of large structures: Simultaneous adaptive calibration and full motion estimation. Mech. Syst. Signal Process. 2016, 72–73, 678–694.
  19. Shan, B.; Wang, L.; Huo, X.; Yuan, W.; Xue, Z. A bridge deflection monitoring system based on CCD. Adv. Mater. Sci. Eng. 2016, 4857373.
  20. Pan, B.; Tian, L.; Song, X. Real-time, non-contact and targetless measurement of vertical deflection of bridges using off-axis digital image correlation. NDT E Int. 2016, 79, 73–80.
  21. Lee, J.; Lee, K.C.; Cho, S.; Sim, S.H. Computer vision-based structural displacement measurement robust to light-induced image degradation for in-service bridges. Sensors 2017, 17, 2317.
  22. Park, J.W.; Moon, D.S.; Yoon, H.; Gomez, F.; Spencer, B.F.; Kim, J.R. Visual-inertial displacement sensing using data fusion of vision-based displacement with acceleration. Struct. Control Health Monit. 2018, 25, e2122.
  23. Alipour, M.; Washlesky, S.J.; Harris, D.K. Field deployment and laboratory evaluation of 2D digital image correlation for deflection sensing in complex environments. J. Bridge Eng. 2019, 24, 04019010.
  24. Carmo, R.N.F.; Valença, J.; Bencardino, F.; Cristofaro, S.; Chiera, D. Assessment of plastic rotation and applied load in reinforced concrete, steel and timber beams using image-based analysis. Eng. Struct. 2019, 198, 109519.
  25. Halding, P.S.; Christensen, C.O.; Schmidt, J.W. Surface rotation correction and strain precision of wide-angle 2D DIC for field use. J. Bridge Eng. 2019, 24, 04019008.
  26. Lee, J.; Lee, K.C.; Jeong, S.; Lee, Y.J.; Sim, S.H. Long-term displacement measurement of full-scale bridges using camera ego-motion compensation. Mech. Syst. Signal Process. 2020, 140, 106651.
  27. Dong, C.Z.; Celik, O.; Catbas, F.N.; O’Brien, E.J.; Taylor, S. Structural displacement monitoring using deep learning-based full field optical flow methods. Struct. Infrastruct. Eng. 2020, 16, 51–71.
  28. Schmidt, T.; Tyson, J.; Galanulis, K. Full-field dynamic displacement and strain measurement using advanced 3d image correlation photogrammetry: Part 1. Exp. Tech. 2003, 27, 47–50.
  29. Chang, C.C.; Ji, Y.F. Flexible videogrammetric technique for three-dimensional structural vibration measurement. J. Eng. Mech. 2007, 133, 656–664.
  30. Jurjo, D.L.B.R.; Magluta, C.; Roitman, N.; Gonçalves, P.B. Experimental methodology for the dynamic analysis of slender structures based on digital image processing techniques. Mech. Syst. Signal Process. 2010, 24, 1369–1382.
  31. Choi, H.S.; Cheung, J.H.; Kim, S.H.; Ahn, J.H. Structural dynamic displacement vision system using digital image processing. NDT E Int. 2011, 44, 597–608.
  32. Yang, Y.S.; Huang, C.W.; Wu, C.L. A simple image-based strain measurement method for measuring the strain fields in an RC-wall experiment. Earthq. Eng. Struct. Dyn. 2012, 41, 1–17.
  33. Wang, W.; Mottershead, J.E.; Siebert, T.; Pipino, A. Frequency response functions of shape features from full-field vibration measurements using digital image correlation. Mech. Syst. Signal Process. 2012, 28, 333–347.
  34. Mas, D.; Espinosa, J.; Roig, A.B.; Ferrer, B.; Perez, J.; Illueca, C. Measurement of wide frequency range structural microvibrations with a pocket digital camera and sub-pixel techniques. Appl. Opt. 2012, 51, 2664–2671.
  35. Wu, L.J.; Casciati, F.; Casciati, S. Dynamic testing of a laboratory model via vision-based sensing. Eng. Struct. 2014, 60, 113–125.
  36. Feng, D.M.; Feng, M.Q. Vision-based multi-point displacement measurement for structural health monitoring. Struct. Control Health Monit. 2015, 23, 876–890.
  37. Chen, J.G.; Wadhwa, N.; Cha, Y.J.; Durand, F.; Freeman, W.T.; Buyukozturk, O. Modal identification of simple structures with high-speed video using motion magnification. J. Sound Vib. 2015, 345, 58–71.
  38. Oh, B.K.; Hwang, J.W.; Kim, Y.; Cho, T.; Park, H.S. Vision-based system identification technique for building structures using a motion capture system. J. Sound Vib. 2015, 356, 72–85.
  39. Lei, X.; Jin, Y.; Guo, J.; Zhu, C.A. Vibration extraction based on fast NCC algorithm and high-speed camera. Appl. Opt. 2015, 54, 8198–8206.
  40. Zheng, F.; Shao, L.; Racic, V.; Brownjohn, J. Measuring human-induced vibrations of civil engineering structures via vision-based motion tracking. Measurement 2016, 83, 44–56.
  41. McCarthy, D.M.J.; Chandler, J.H.; Palmieri, A. Monitoring 3D vibrations in structures using high-resolution blurred imagery. Photogramm. Rec. 2016, 31, 304–324.
  42. Yoon, H.; Elanwar, H.; Choi, H.; Golparvar-Fard, M.; Spencer, B.F. Target-free approach for vision-based structural system identification using consumer-grade cameras. Struct. Control Health Monit. 2016, 23, 1405–1416.
  43. Mas, D.; Ferrer, B.; Acevedo, P.; Espinosa, J. Methods and algorithms for video-based multi-point frequency measuring and mapping. Measurement 2016, 85, 164–174.
  44. Poozesh, P.; Sarrafi, A.; Mao, Z.; Avitabile, P.; Niezrecki, C. Feasibility of extracting operating shapes using phase-based motion magnification technique and stereo-photogrammetry. J. Sound Vib. 2017, 407, 350–366.
  45. Khuc, T.; Catbas, F.N. Structural identification using computer vision–based bridge health monitoring. J. Struct. Eng. 2018, 144, 04017202.
  46. Yang, Y.; Dorn, C.; Mancini, T.; Talken, Z.; Nagarajaiah, S.; Kenyon, G.; Farrar, C.; Mascareñas, D. Blind identification of full-field vibration modes of output-only structures from uniformly-sampled, possibly temporally-aliased (sub-Nyquist), video measurements. J. Sound Vib. 2017, 390, 232–256.
  47. Feng, D.; Feng, M.Q. Identification of structural stiffness and excitation forces in time domain using noncontact vision-based displacement measurement. J. Sound Vib. 2017, 406, 15–28.
  48. Cha, Y.J.; Chen, J.G.; Büyüköztürk, O. Output-only computer vision based damage detection using phase-based optical flow and unscented Kalman filters. Eng. Struct. 2017, 132, 300–313.
  49. Javh, J.; Slavič, J.; Boltežar, M. The subpixel resolution of optical-flow-based modal analysis. Mech. Syst. Signal Process. 2017, 88, 89–99.
  50. Xu, F. Accurate measurement of structural vibration based on digital image processing technology. Concurr. Comput. Pract. Exp. 2019, 31, e4767.
  51. Dong, C.Z.; Ye, X.W.; Jin, T. Identification of structural dynamic characteristics based on machine vision technology. Measurement 2018, 126, 405–416.
  52. Guo, J.; Jiao, J.; Fujita, K.; Takewaki, I. Damage identification for frame structures using vision-based measurement. Eng. Struct. 2019, 199, 109634.
  53. Hosseinzadeh, A.Z.; Harvey, P.S. Pixel-based operating modes from surveillance videos for structural vibration monitoring: A preliminary experimental study. Measurement 2019, 148, 106911.
  54. Kuddusa, M.A.; Lia, J.; Hao, H.; Lia, C.; Bi, K. Target-free vision-based technique for vibration measurements of structures subjected to out-of-plane movements. Eng. Struct. 2019, 190, 210–222.
  55. Durand-Texte, T.; Simonetto, E.; Durand, S.; Melon, M.; Moulet, M.H. Vibration measurement using a pseudo-stereo system, target tracking and vision methods. Mech. Syst. Signal Process. 2019, 118, 30–40.
  56. Civera, M.; Zanotti, F.L.; Surace, C. An experimental study of the feasibility of phase-based video magnification for damage detection and localisation in operational deflection shapes. Strain 2020, 56, e12336.
  57. Eick, B.A.; Narazaki, Y.; Smith, M.D.; Spencer, B.F. Vision-based monitoring of post-tensioned diagonals on miter lock gate. J. Struct. Eng. 2020, 146, 04020209.
  58. Lai, Z.; Alzugaray, I.; Chli, M.; Chatzi, E. Full-field structural monitoring using event cameras and physics-informed sparse identification. Mech. Syst. Signal Process. 2020, 145, 106905.
  59. Ngeljaratan, L.; Moustafa, M.A. Structural health monitoring and seismic response assessment of bridge structures using target-tracking digital image correlation. Eng. Struct. 2020, 213, 110551.
  60. Stephen, G.A.; Brownjohn, J.M.W.; Taylor, C.A. Measurements of static and dynamic displacement from visual monitoring of the Humber Bridge. Eng. Struct. 1993, 154, 197–208.
  61. Olaszek, P. Investigation of the dynamic characteristic of bridge structures using a computer vision method. Measurement 1999, 25, 227–236.
  62. Wahbeh, A.M.; Caffrey, J.P.; Masri, S.F. A vision-based approach for the direct measurement of displacements in vibrating systems. Smart Mater Struct 2003, 12, 785–794.
  63. Lee, J.J.; Shinozuka, M. A vision-based system for remote sensing of bridge displacement. NDT E Int. 2006, 39, 425–431.
  64. Ji, Y.F.; Chang, C.C. Nontarget image-based technique for small cable vibration measurement. J. Bridge Eng. 2008, 13, 34–42.
  65. Chang, C.C.; Xiao, X.H. An integrated visual-inertial technique for structural displacement and velocity measurement. Smart Struct. Syst. 2010, 6, 1025–1039.
  66. Fukuda, Y.; Feng, M.Q.; Shinozuka, M. Cost-effective vision-based system for monitoring dynamic response of civil engineering structures. Struct. Control Health Monit. 2010, 17, 918–936.
  67. Caetano, E.; Silva, S.; Bateira, J. A vision system for vibration monitoring of civil engineering structures. Exp. Tech. 2011, 4, 74–82.
  68. Mazzoleni, P.; Zappa, E. Vision-based estimation of vertical dynamic loading induced by jumping and bobbing crowds on civil structures. Mech. Syst. Signal Process. 2012, 33, 1–12.
  69. Ye, X.W.; Ni, Y.Q.; Wai, T.T.; Wong, K.Y.; Zhang, X.M.; Xu, F. A vision-based system for dynamic displacement measurement of long-span bridges: Algorithm and verification. Smart Struct. Syst. 2013, 12, 363–379.
  70. Kim, S.W.; Kim, N.S. Dynamic characteristics of suspension bridge hanger cables using digital image processing. NDT E Int. 2013, 59, 25–33.
  71. Kohut, P.; Holak, K.; Uhl, T.; Ortyl, Ł.; Owerko, T.; Kuras, P.; Kocierz, R. Monitoring of a civil structure’s state based on noncontact measurements. Struct. Health Monit. 2013, 12, 411–429.
  72. Ribeiro, D.; Calcada, R.; Ferreira, J.; Martins, T. Non-contact measurement of the dynamic displacement of railway bridges using an advanced video-based system. Eng. Struct. 2014, 75, 164–180.
  73. Busca, G.; Cigada, A.; Mazzoleni, P.; Zappa, E. Vibration monitoring of multiple bridge points by means of a unique vision-based measuring system. Exp. Mech. 2014, 54, 255–271.
  74. Feng, M.Q.; Fukuda, Y.; Feng, D.; Mizuta, M. Nontarget vision sensor for remote measurement of bridge dynamic response. J. Bridge Eng. 2015, 20, 04015023.
  75. Feng, D.; Feng, M. Model updating of railway bridge using in situ dynamic displacement measurement under trainloads. J. Bridge Eng. 2015, 20, 04015019.
  76. Bartilson, D.T.; Wieghaus, K.T.; Hurlebaus, S. Target-less computer vision for traffic signal structure vibration studies. Mech Syst Signal Process 2015, 60–61, 571–582.
  77. Ferrer, B.; Mas, D.; García-Santos, J.I.; Luzi, G. Parametric study of the errors obtained from the measurement of the oscillating movement of a bridge using image processing. J. Nondestruct. Eval. 2016, 35, 53.
  78. Guo, J.; Zhu, C. Dynamic displacement measurement of large-scale structures based on the Lucas–Kanade template tracking algorithm. Mech. Syst. Signal Process 2016, 66-67, 425–436.
  79. Ye, X.W.; Dong, C.Z.; Liu, T. Image-based structural dynamic displacement measurement using different multi-object tracking algorithms. Smart Struct. Syst. 2016, 17, 935–956.
  80. Shariati, A.; Schumacher, T. Eulerian-based virtual visual sensors to measure dynamic displacements of structures. Struct. Control. Health Monit. 2017, 24, e1977.
  81. Khuc, T.; Catbas, F.N. Completely contactless structural health monitoring of real-life structures using cameras and computer vision. Struct. Control Health Monit. 2017, 24, e1852.
  82. Khuc, T.; Catbas, F.N. Computer vision-based displacement and vibration monitoring without using physical target on structures. Struct. Infrastruct. Eng. 2017, 13, 505–516.
  83. Feng, D.; Scarangello, T.; Feng, M.Q.; Ye, Q. Cable tension force estimate using novel noncontact vision-based sensor. Measurement 2017, 99, 44–52.
  84. Feng, D.; Feng, M.Q. Experimental validation of cost-effective vision-based structural health monitoring. Mech. Syst. Signal Process. 2017, 88, 199–211.
  85. Chen, J.G.; Davis, A.; Wadhwa, N.; Durand, F.; Freeman, W.T.; Büyüköztürk, O. Video camera–based vibration measurement for civil infrastructure applications. J. Infrastruct. Syst. 2017, 23, B4016013-1.
  86. Chen, J.G.; Adams, T.M.; Sun, H.; Bell, E.S.; Büyüköztürk, O. Camera-based vibration measurement of the World War I Memorial Bridge in Portsmouth, New Hampshire. J. Struct. Eng. 2018, 144, 04018207.
  87. Harvey, P.S.; Elisha, G. Vision-based vibration monitoring using existing cameras installed within a building. Struct. Control Health Monit. 2018, 25, e2235.
  88. Fioriti, V.; Roselli, I.; Tatì, A.; Romano, R.; De Canio, G. Motion magnification analysis for structural monitoring of ancient constructions. Measurement 2018, 129, 375–380.
  89. Acikgoz, S.; DeJong, M.J.; Kechavarzi, C.; Soga, K. Dynamic response of a damaged masonry rail viaduct: Measurement and interpretation. Eng. Struct. 2018, 168, 544–558.
  90. Xu, Y.; Brownjohn, J.; Kong, D. A non-contact vision-based system for multipoint displacement monitoring in a cable-stayed footbridge. Struct Control Health Monit. 2018, 25, e2155.
  91. Xu, Y.; Brownjohn, J.M.W.; Huseynov, F. Accurate deformation monitoring on bridge structures using a cost-effective sensing system combined with a camera and accelerometers: Case study. J. Bridge Eng. 2019, 24, 05018014.
  92. Dhanasekar, M.; Prasad, P.; Dorji, J.; Zahra, T. Serviceability assessment of masonry arch bridges using digital image correlation. J. Bridge Eng. 2019, 24, 04018120.
  93. Lydon, D.; Lydon, M.; Taylor, S.; Martinez Del Rincon, J.; Hester, D.; Brownjohn, J. Development and field testing of a vision-based displacement system using a low cost wireless action camera. Mech. Syst. Signal Process. 2019, 121, 343–358.
  94. Hoskere, V.; Park, J.W.; Yoon, H.; Spencer, B.F. Vision-based modal survey of civil infrastructure using unmanned aerial vehicles. J. Struct. Eng. 2019, 145, 04019062.
  95. Dong, C.Z.; Celik, O.; Catbas, F.N. Marker-free monitoring of the grandstand structures and modal identification using computer vision methods. Struct. Health Monit. 2019, 18, 1491–1509.
  96. Dong, C.Z.; Bas, S.; Catbas, F.N. Investigation of vibration serviceability of a footbridge using computer vision-based methods. Eng. Struct. 2020, 224, 111224.
  97. Fradelos, Y.; Thalla, O.; Biliani, I.; Stiros, S. Study of lateral displacements and the natural frequency of a pedestrian bridge using low-cost cameras. Sensors 2020, 20, 3217.
  98. Fukuda, Y.; Feng, M.Q.; Narita, Y.; Kaneko, S.; Tanaka, T. Vision-based displacement sensor for monitoring dynamic response using robust object search algorithm. IEEE Sens. J. 2013, 13, 4725–4732.
  99. Feng, D.; Feng, M.Q.; Ozer, E.; Fukuda, Y. A vision-based sensor for noncontact structural displacement measurement. Sensors 2015, 15, 16557–16575.
  100. Zhang, D.; Guo, J.; Lei, X.; Zhu, C. A high-speed vision-based sensor for dynamic vibration analysis using fast motion extraction algorithms. Sensors 2016, 16, 572.
  101. Choi, I.; Kim, J.H.; Kim, D. A target-less vision-based displacement sensor based on image convex hull optimization for measuring the dynamic response of building structures. Sensors 2016, 16, 2085.
  102. Hu, Q.; He, S.; Wang, S.; Liu, Y.; Zhang, Z.; He, L.; Wang, F.; Cai, Q.; Shi, R.; Yang, Y. A high-speed target-free vision-based sensor for bus rapid transit viaduct vibration measurements using CMT and ORB algorithms. Sensors 2017, 17, 1305.
  103. Luo, L.; Feng, M.Q.; Wu, Z.Y. Robust vision sensor for multi-point displacement monitoring of bridges in the field. Eng. Struct. 2018, 163, 255–266.
  104. Erdogan, Y.S.; Ada, M. A computer-vision based vibration transducer scheme for structural health monitoring applications. Smart Mater. Struct. 2020, 29, 085007.
  105. Park, J.H.; Huynh, T.C.; Choi, S.H.; Kim, J.T. Vision-based technique for bolt-loosening detection in wind turbine tower. Wind. Struct. 2015, 21, 709–726.
  106. Poozesh, P.; Baqersad, J.; Niezrecki, C.; Avitabile, P.; Harvey, E.; Yarala, R. Large-area photogrammetry based testing of wind turbine blades. Mech. Syst. Signal Process. 2017, 86, 98–115.
  107. Sarrafi, A.; Mao, Z.; Niezrecki, C.; Poozesh, P. Vibration-based damage detection in wind turbine blades using phase-based motion estimation and motion magnification. J. Sound Vib. 2018, 421, 300–318.
  108. Poozesh, P.; Sabato, A.; Sarrafi, A.; Niezrecki, C.; Avitabile, P.; Yarala, R. Multicamera measurement system to evaluate the dynamic response of utility-scale wind turbine blades. Wind Energy 2020, 23, 1619–1639.
  109. IMETRUM Non-Contact Precision Measurement. Available online: https://www.imetrum.com/ (accessed on 29 October 2020).
  110. Dantec Dynamics, Laser Optical Measurements Systems and Sensors. Available online: https://www.dantecdynamics.com/ (accessed on 29 October 2020).