Reality Capture
∙ Tomislav Zigo
∙
SP17

Reality capture of subjects both natural and constructed, via laser/LiDAR imaging and photogrammetry, yields two-dimensional images and three-dimensional space. While the accuracy and efficiency of these methods of measurement far exceed analog measurement and representation, there is still room for improvement. The proposed testing site, located adjacent to the Cotton Belt building along the near North Riverfront, will utilize a myriad of natural and synthetic objects and systems to fool, break, and thereby iteratively perfect computer vision via machine learning.
OVERVIEW
Laser/LiDAR imaging has its basis in the simple fundamentals of sending a beam of light out and measuring its reflection back. A machine then computes the distance the beam has traveled. The machines used for the course were more complex but followed the same principle. The seminar took advantage of 3D laser scanners (FARO) to determine, within a set tolerance, form, shape, and an associated color. The 3D model generated takes the form of a point cloud, onto which BIM objects and families can be mapped for use in the architectural and engineering fields. The focus of this seminar was not on that particular aspect but rather on the ability to trick the laser scanner into creating a space that did not match reality. This was not accomplished by intentionally corrupting the scan or its settings, but rather by testing the limitations of the scanner by moving or changing the material surface.
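As a rough illustration of that principle, the short sketch below (written in Python with hypothetical function names, not anything from the FARO toolchain) converts a single return time and beam direction into one point; a scanner repeats this operation millions of times to assemble a point cloud.

import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def point_from_return(round_trip_time_s, azimuth_deg, elevation_deg):
    """Place one laser return as an (x, y, z) point relative to the scanner."""
    distance = SPEED_OF_LIGHT * round_trip_time_s / 2.0  # the beam travels out and back
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return x, y, z

# A return arriving after roughly 66.7 nanoseconds places a surface about 10 m away.
print(point_from_return(66.7e-9, azimuth_deg=45, elevation_deg=10))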
Photogrammetry is a different approach for generating a similar end condition: a 3D space. However, photogrammetry utilizes geo-referenced images in order to construct space. This can be accomplished on a small scale by taking passes of images at 0, 45, and 90 degrees while orbiting the object. These photos are then compiled and mapped into a mesh using Autodesk ReCap or ReMake. From there, the object meshes can be further manipulated in Meshmixer or other similar programs. These meshes are highly detailed and precise due to the amount of overlap and total coverage relative to the size of the object. Limitations occur on a larger scale, when a drone flying a similar series of photos lacks the specifications of a DSLR camera and is restricted in its access to variable-angle passes. Therefore overhangs, cantilevers, and areas in shadow tend to be error-filled.
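That orbital capture pattern can be sketched as follows (illustrative Python only, not part of ReCap or ReMake); the generous overlap between neighboring shots is what allows the photos to be stitched into a mesh.

import math

def orbit_camera_positions(center, radius, elevations_deg=(0, 45, 90), shots_per_pass=24):
    """Generate illustrative camera positions for orbital photogrammetry passes."""
    cx, cy, cz = center
    positions = []
    for elev in elevations_deg:
        el = math.radians(elev)
        for i in range(shots_per_pass):
            az = 2 * math.pi * i / shots_per_pass
            positions.append((
                cx + radius * math.cos(el) * math.cos(az),
                cy + radius * math.cos(el) * math.sin(az),
                cz + radius * math.sin(el),
            ))
    return positions

# Three passes of 24 shots around a subject 5 m away; note that the 90-degree
# pass collapses to a single top-down viewpoint repeated 24 times.
print(len(orbit_camera_positions(center=(0.0, 0.0, 0.0), radius=5.0)))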
This is the premise for the testing site: pushing the envelope of existing limitations and discovering new ones for these technologies.
SPECULATIVE PROJECTS



PRODUCTIVE ORGANICS
RORY THIBAULT
Alterations of scanned subject matter can occur immediately as well as over prolonged spans of space and time. These alterations result from both natural and calculated actions, yielding respectively nonlinear and speculative results. This system of scanning, validation, and comparison generates a feedback loop, improving the machine's ability over time. The actions and consistency of the scanned subject matter can trick both laser/LiDAR imaging and photogrammetry through changes to its shape and volume, material, lighting conditions and emittance, and location over time.
Physical erroneousness was present when analyzing organic material, such as trees and grass, in the class context. Scanned trees yielded an RGB value associated with the atmospheric conditions of that point in time. Since the trees are not static, in many instances the value assigned to a voxel was that of the sky behind the subject rather than of the subject itself. Interpolated and interstitial space is generated to stitch together a conceivable solid object, even though this may not depict reality. Instead, it yields a referenced visual overlay, similar to the digital overlaying of like objects as layers in Photoshop, creating an averaged depiction of reality. In comparison, the relatively static turf grass surface of the Givens lawn yielded a result more accurate in both form and color. Since the proposed site lacks a large number of trees and consists mainly of sedges and low shrubs, trees should be planted in order to iteratively test the speed of the laser scanner and the fine detail attainable via drone photogrammetry. Tests should be conducted as the plants mature, at various distances from the trunk, thereby testing the limits of altitude/azimuth accuracy and slight growth perturbations from the trees' apical meristems. The site could also be regraded, testing a natural version of the scan flatness/levelness compliance test demonstrated on a concrete floor in class.
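A minimal sketch of what such a flatness/levelness check might compute, using synthetic points and assumed dimensions rather than any standard demonstrated in class: fit a best-fit plane to the scanned surface and report how far the points deviate from it.

import numpy as np

def flatness_deviation(points):
    """Per-point deviation (in meters) from the least-squares plane through the points."""
    centered = points - points.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[-1]

# A synthetic 1 m x 1 m patch of floor (or regraded ground) with ~2 mm of surface noise.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(500, 2))
z = 0.002 * rng.standard_normal(500)
deviation = flatness_deviation(np.column_stack([xy, z]))
print(f"max deviation from best-fit plane: {abs(deviation).max() * 1000:.1f} mm")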
SKEWED PERSPECTIVES
ARMAAN SHAH
Reflection has the power to manipulate laser scans in an extremely confusing way. Purely mechanical, artificially created mirrors recreate regions that read as real space to computer vision. Because depth and color are not visibly skewed in any way, mirrors are a perfect candidate for inhibiting spatial depth in machine learning and computer vision processes.
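A small geometric sketch of that phantom-space effect, using hypothetical coordinates: the scanner places a return along its outgoing beam at the full traveled distance, which lands at the reflection of the true surface point across the mirror plane.

import numpy as np

def apparent_point(true_point, mirror_point, mirror_normal):
    """Reflect a real surface point across a plane mirror (a point on the plane plus its normal)."""
    p = np.asarray(true_point, dtype=float)
    m = np.asarray(mirror_point, dtype=float)
    n = np.asarray(mirror_normal, dtype=float)
    n /= np.linalg.norm(n)
    return p - 2.0 * np.dot(p - m, n) * n

# A scanner at the origin fires along +x into a 45-degree mirror at x = 5 m.
# The beam bounces toward +y and strikes a real wall at (5, 3, 0) ...
print(apparent_point(true_point=(5, 3, 0), mirror_point=(5, 0, 0), mirror_normal=(-1, 1, 0)))
# ... but the scan records a phantom point at (8, 0, 0): 8 m straight ahead,
# behind the mirror, where no surface exists.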
A more refined factor emerges when we begin to integrate regions defined not by natural reflection but by a genuinely skewed perception driven by changes in material refractive index. This can result in extremely warped scenarios, whether as regularly set, uncontrolled surface systems like windows or glass dividers, or as regularly modulated regions of polycarbonate slats with interruptions in transparency and material at regular intervals. This begins to move further into the realm of even greater disruptions to laser/LiDAR and photogrammetry, namely the integration of refraction and reflection into systems of change.
The most volatile inhibitor of computer vision is water. An extremely unpredictable fluid, water becomes a means by which we see reflection, refraction, and movement simultaneously. This quality of disruption challenges every reality capture system; both photogrammetry and laser/LiDAR would be tested to the greatest extent. The matter then becomes one of pure façade and site integration.
Projects such as Cloud Gate depict a reframing and skewing of perceived reality through space-making and material choices. Namely, the warped form of reflective stainless steel becomes a means through which to view the city and sky with a different lens. The impacts on laser/LiDAR and photogrammetry are especially pronounced.
With their movement, people in the piece become moments of change that are very difficult for these computer mechanisms to capture. Effectively, an architectural intervention that explores the warped, hidden, or dark through organic reflective/refractive form-making, along with the dynamism of water movement, would be a strong direction for a testing facility for reality capture technology. The result is an architecture that is not actively dynamic but passively dynamic, through a reframing or direct reflection of the dynamism of the external stimuli with which it interacts.





REFRACTIVE KINETICS
WILL JAMES
Downtown Clayton consists almost entirely of uninspiring office towers, but one structure incorporates an interesting, environmentally responsive feature. Somewhat ironically, it is a parking garage, but by delicately hanging thousands of aluminum cards in an immense field across the façade, it transforms into one of the most compelling and street-activating built works in the area.
This kind of geo-activated kinetic feature is not only interesting from an architectural perspective, but has implications for the field of remote sensing and reality capture. Both photogrammetry and laser scanning rely on unmoving, unchanging, solid surfaces to provide reliable approximations of the physical reality of an object. Introducing an element of motion is confusing to these technologies, as each tries to provide a static interpretation. While something like a laser scanner can process enormous amounts of data at very high precision, it cannot interpret motion, particularly unstructured, organic motion, to anywhere near the degree that the human mind can.
There are limits, though, to how much a kinetic windscreen like the one in Clayton could do to influence the outcomes of remote sensing testing. Scans taken at different times would be difficult to register against each other, but the discrepancies would be small in scale. To further complicate the situation from the scanner's perspective, we are proposing the introduction of refractivity. The combination of complex organic motion caused by site effects and refractive materials creates a dynamic façade condition that is ideal for pushing the boundaries of remote sensing technology. When paired with interventions in the project landscape and building form, it could provide limitless opportunities for the improvement of these technologies.
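A minimal numeric sketch of that registration problem, under assumed dimensions (a 0.3 m pane nudged five degrees by the wind between two scans), shows the order of magnitude of the discrepancy a registration step would have to reconcile:

import numpy as np

def pane_points(angle_deg, n=50):
    """Sample points across a 0.3 m-wide pane rotated about its vertical cable."""
    a = np.radians(angle_deg)
    s = np.linspace(-0.15, 0.15, n)  # positions across the pane width, in meters
    return np.stack([s * np.cos(a), s * np.sin(a), np.zeros(n)], axis=1)

scan_a = pane_points(angle_deg=0.0)  # first scan: pane at rest
scan_b = pane_points(angle_deg=5.0)  # second scan: pane nudged 5 degrees by wind

# Nearest-neighbor distance from each point in scan A to scan B: the residual a
# registration step would see even though nothing on site has "moved".
dists = np.linalg.norm(scan_a[:, None, :] - scan_b[None, :, :], axis=2).min(axis=1)
print(f"max discrepancy between scans: {dists.max() * 1000:.1f} mm")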
This intervention is best suited to a building façade so as to provide the opportunity for testing both from the inside out and the outside in, which allows for greater variability in the light condition combinations possible. We imagine this façade looking like a combination of other wind-activated facades and wind chimes. Cables strung taut between floor and roof carry refractive panes (likely silica glass, but in theory could be strontium titanate, silicon carbide, or another compound with an extremely high refractive index). The cables run through these panes, which are spaced apart slightly by low friction bearings to best facilitate their rotation in response to wind.
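The choice of pane material can be sketched with Snell's law. The indices below are approximate published values (roughly 1.46 for fused silica glass, 2.41 for strontium titanate, and about 2.65 for silicon carbide); the calculation simply shows how much more sharply the higher-index materials bend an incoming beam, and therefore how much more they displace whatever the scanner or camera reads through them.

import math

# Snell's law: n1 * sin(theta1) = n2 * sin(theta2). The higher the pane's refractive
# index, the more a beam entering it bends, and the more the scene behind it appears
# displaced from where it actually sits.
REFRACTIVE_INDEX = {
    "fused silica": 1.46,
    "strontium titanate": 2.41,
    "silicon carbide": 2.65,
}

def refracted_angle_deg(incidence_deg, n_pane, n_air=1.0):
    """Angle of the beam inside the pane for a given angle of incidence in air."""
    return math.degrees(math.asin(n_air * math.sin(math.radians(incidence_deg)) / n_pane))

for name, n in REFRACTIVE_INDEX.items():
    print(f"{name:>18}: a 45-degree beam bends to {refracted_angle_deg(45.0, n):.1f} degrees inside the pane")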