Virtual Environments for Visualizing Structural Health Monitoring Sensor Networks, Data, and Metadata
Abstract
1. Introduction
2. Current Methodologies for Documentation and Communication
3. Case Study: Streicker Bridge
4. Tools and Software Used for VT/IM
- Metadata (structure)
  - Technical images showing sensor locations in cross-sectional, aerial, and side views;
  - Diagram showing the post-tensioning profile of the South-East Leg.
- Metadata (SHM system)
  - Information box detailing the resolution, repeatability, typical gauge length, dynamic range, and maximum measured frequency of the strain sensors (see the data-structure sketch after this list);
  - Legend showing the various types of sensors;
  - Color-coding scheme that identifies whether each sensor is functioning, malfunctioning, or disconnected.
- Data (raw)
  - Databases connected to the strain sensors showing the raw strain data over time;
  - Databases connected to the sensing sheet showing the raw strain data for each strain sensor over time;
  - Databases connected to the displacement sensors showing the raw displacement data over time.
- Data (analyzed)
  - Graphs connected to the temperature sensors showing the relationship between temperature data and the time of day;
  - Diagrams showing the curvature and displacement graphs for the South-East Leg;
  - Diagram showing the pre-stressing force in the South-East Leg.
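To make the categories above concrete, the following is a minimal sketch, assuming a plain Python record per sensor, of how the metadata, status color-coding, and linked raw data for one strain sensor could be organized. It is illustrative only, not the authors' implementation; all field names, the status-to-color mapping, and the numeric values are assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple


class SensorStatus(Enum):
    """Assumed color mapping; the legend only distinguishes functioning,
    malfunctioning, and disconnected sensors."""
    FUNCTIONING = "green"
    MALFUNCTIONING = "yellow"
    DISCONNECTED = "red"


@dataclass
class StrainSensor:
    """Metadata shown in an information hotspot (illustrative field names)."""
    name: str
    location: str                       # e.g., "South-East Leg, mid-span cross section"
    resolution_ue: float                # microstrain
    repeatability_ue: float             # microstrain
    gauge_length_m: float
    dynamic_range_ue: float
    max_measured_frequency_hz: float
    status: SensorStatus = SensorStatus.FUNCTIONING
    readings: List[Tuple[str, float]] = field(default_factory=list)  # (timestamp, microstrain)


# Hypothetical record attached to one sensor hotspot in the virtual tour
sensor = StrainSensor(
    name="SE-strain-01",                # placeholder label, not a real sensor name
    location="South-East Leg",
    resolution_ue=1.0, repeatability_ue=2.0, gauge_length_m=0.6,
    dynamic_range_ue=5000.0, max_measured_frequency_hz=250.0,
)
sensor.readings.append(("2018-01-15T12:00:00", -41.3))
print(sensor.name, sensor.status.value)  # status color rendered on the sensor icon
```

Clicking a sensor hotspot in the tour could then surface either a plot generated from `readings` or a link to the underlying data file.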
5. Results and Discussion
5.1. Navigating the Interactive Interface
- A user can interact with a built-in map, driven by Google Maps; here a user can see the different viewpoints available, select one, and be transported virtually to that location on the bridge (see Figure 5);
- A user can use built-in “scene-connectors” to virtually “walk” from one view of the bridge to another; if a user is on one part of the bridge deck, they can move to an adjacent position along the deck by clicking on the appropriate “hotspot” in the virtual environment (see Figure 5);
- Lastly, a user can select a destination from a drop-down menu, which allows navigation to a specific location without having to know where it is on the map. Figure 5 illustrates these three means of navigating the VT/IM environment; a schematic sketch of how scenes and navigation hotspots could be linked is given after this list.
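The three navigation modes described above could be backed by a simple scene graph in which each 360-degree viewpoint stores its map position and the scene-connector hotspots leading to adjacent viewpoints. The sketch below is a generic illustration under that assumption; it is not the Panotour project format, and all names and coordinates are placeholders.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Hotspot:
    """A clickable scene-connector placed in a panorama."""
    target_scene: str   # scene reached when the hotspot is clicked
    yaw_deg: float      # horizontal placement within the 360-degree image
    pitch_deg: float    # vertical placement


@dataclass
class Scene:
    """One 360-degree viewpoint in the virtual tour."""
    title: str
    latitude: float     # position shown on the built-in map (placeholder values)
    longitude: float
    hotspots: List[Hotspot] = field(default_factory=list)


# Minimal tour graph: a deck viewpoint connected to a viewpoint under the bridge
tour: Dict[str, Scene] = {
    "deck_mid": Scene("Deck, mid-span", 40.344, -74.653,
                      [Hotspot("under_se_leg", yaw_deg=120.0, pitch_deg=-25.0)]),
    "under_se_leg": Scene("Under the South-East Leg", 40.344, -74.652,
                          [Hotspot("deck_mid", yaw_deg=300.0, pitch_deg=20.0)]),
}


def go_to(scene_key: str) -> Scene:
    """Drop-down navigation: jump straight to a named scene without using the map."""
    return tour[scene_key]


print(go_to("under_se_leg").title)
```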
5.2. Access to and Visualization of SHM Data and Metadata
5.3. Evaluation of VT/IM Performance
- The first scene (see Figure 8A) familiarizes the user with the interface and the virtual tour environment. A video shows a user panning around the top of the bridge and viewing the different hotspots they can interact with (the black circles). Additionally, a user can see how the geographic map updates as the user navigates the space. The blue markers on the left-hand side show where navigation hotspots are located, and the white fan around the active pin shows the current field of view to orient the user. In the first scene, a user interacts with two of the hotspots to navigate underneath the bridge.
- The second scene (see Figure 8B) is set under the bridge and shows a user where sensors are located on the bridge and what types of sensors are there. Here, three strain sensors can be seen together with three accompanying temperature sensors. For each sensor, there is an information hotspot detailing the resolution, repeatability, typical gauge length, dynamic range, and maximum measured frequency. Additionally, since this information was available, a picture of the cross section is included, showing where in the cross section the sensor is located. The colors of the sensors in the scene correspond to their current status, as indicated by the gauge legend in the lower right-hand corner: functioning, malfunctioning, or disconnected. The legend describes to a user what the different sensors in the tour are (temperature sensor, fiber Bragg grating discrete long-gauge sensor, displacement sensor, and sensing sheet), what their icons look like, and what their colors indicate. Hovering over a sensor reveals its name; clicking on it lets a user interact with the data as an embedded image or a linked data file.
- The third scene is under the South-East Leg of the bridge and displays a sensing sheet and a displacement sensor (see Figure 8C). Again, a user can access the information from these sensors as either an embedded image or a linked data file.
- The fourth scene is on the side of the bridge and provides a side view. Global, analyzed data such as the curvature and displacement diagrams of the South-East Leg, the post-tensioning tendon profile, and the prestressing force in the South-East Leg are integrated into this scene (see Figure 8D). A brief sketch of how curvature is commonly derived from paired strain sensors follows this list.
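For context on the analyzed data in the fourth scene, curvature along a beam element is commonly obtained from pairs of parallel long-gauge strain sensors placed at two known levels of the cross section (plane-sections assumption); displacement can then be estimated by double integration of curvature along the span. The sketch below illustrates only the curvature step; the variable names and numbers are illustrative, not the authors' code or data.

```python
def curvature_from_strains(eps_bottom: float, eps_top: float, d: float) -> float:
    """
    Curvature [1/m] of a cross section from a parallel pair of long-gauge
    strain sensors.

    eps_bottom, eps_top : average strains [m/m] at the two sensor levels
    d                   : vertical distance between the sensor levels [m]
    """
    return (eps_bottom - eps_top) / d


# Illustrative values only: +50 and -30 microstrain, sensors 0.5 m apart
kappa = curvature_from_strains(50e-6, -30e-6, 0.5)
print(f"curvature = {kappa:.2e} 1/m")   # curvature = 1.60e-04 1/m
```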
- How easy was it to understand what the video shows?
- Does the video help to understand the SHM system and sensor network installed on the bridge?
- Does the video help assess the behavior/functionality of the sensors on the bridge?
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
| Method | Time Spent (h) | Data File Size (MB) |
|---|---|---|
| VT/IM | 1 | 87.9 |
| 3D modeling | 12 | 4000 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).