About This Item

 

Full Description

Most existing approaches to characterizing the thermal properties of buildings and the heat emissions from their elements rely on manual inspection and as such are slow and labor-intensive. This is often a daunting task that requires several days of on-site inspection. In this paper, we propose a fully automatic approach to constructing a 3D thermal point cloud of a building interior that reflects its geometry, including walls, floors, and ceilings, as well as structures such as furniture, lights, windows, and plug loads. Our approach is based on a wearable ambulatory backpack comprising multiple sensors, such as Light Detection And Ranging (LiDAR) scanners and infrared and optical cameras. As the operator wearing the backpack walks through the building, the LiDAR scans are collected and processed to compute the 3D geometry of the building. Furthermore, the infrared cameras are calibrated intrinsically and extrinsically so that the captured images are registered to the captured geometry. Thus, the temperature data in the infrared images is associated with the geometry, resulting in a "thermal 3D point cloud". The same process can be repeated with optical imagery, resulting in a "visible 3D point cloud". By visualizing the two point clouds simultaneously in interactive rendering tools, we can virtually walk through the thermal and optical 3D point clouds, toggle between them, identify and annotate "hot" regions, objects, plug loads, and thermal and moisture leaks, and document their locations with fine spatial granularity in the 3D point clouds.
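
To illustrate the registration step described above (associating per-pixel temperatures from a calibrated infrared image with the captured 3D geometry), the following is a minimal sketch in Python/NumPy. It is not the authors' implementation: the function name colorize_points_with_ir, the simple pinhole projection model (no lens distortion), the single-frame input, and the nearest-neighbor pixel lookup are illustrative assumptions, and a real pipeline would also need distortion correction, occlusion handling, and fusion of temperatures across many frames.

    import numpy as np

    def colorize_points_with_ir(points_world, K, R, t, ir_image):
        """Assign a temperature to each 3D point by projecting it into a calibrated IR image.

        points_world : (N, 3) 3D points in world coordinates (e.g. from the LiDAR geometry)
        K            : (3, 3) intrinsic matrix of the infrared camera
        R, t         : extrinsic rotation (3, 3) and translation (3,) mapping world -> camera frame
        ir_image     : (H, W) per-pixel temperature map recovered from the infrared camera

        Returns an (N,) array of temperatures; points behind the camera or projecting
        outside the image are left as NaN.
        """
        n = points_world.shape[0]
        temps = np.full(n, np.nan)

        # World -> camera frame using the extrinsic calibration: X_cam = R @ X_world + t
        pts_cam = points_world @ R.T + t

        # Only points in front of the camera (positive depth) can be imaged
        in_front = pts_cam[:, 2] > 1e-6
        pc = pts_cam[in_front]

        # Pinhole projection with the intrinsic calibration: u = fx*X/Z + cx, v = fy*Y/Z + cy
        u = K[0, 0] * pc[:, 0] / pc[:, 2] + K[0, 2]
        v = K[1, 1] * pc[:, 1] / pc[:, 2] + K[1, 2]

        # Nearest-neighbor lookup of the temperature at each projected pixel
        h, w = ir_image.shape
        ui = np.round(u).astype(int)
        vi = np.round(v).astype(int)
        inside = (ui >= 0) & (ui < w) & (vi >= 0) & (vi < h)

        idx = np.flatnonzero(in_front)[inside]
        temps[idx] = ir_image[vi[inside], ui[inside]]
        return temps

    # Synthetic usage example: random points in front of a camera with identity pose
    K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
    pts = np.random.uniform(-1.0, 1.0, size=(1000, 3)) + np.array([0.0, 0.0, 3.0])
    ir_frame = np.random.uniform(15.0, 35.0, size=(480, 640))  # placeholder temperatures in deg C
    temps = colorize_points_with_ir(pts, K, np.eye(3), np.zeros(3), ir_frame)

Repeating the same projection with the optical cameras' calibration and RGB frames yields the "visible 3D point cloud" counterpart described in the abstract.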