Subvert

This project is an exercise in understanding how the physical environment is being fundamentally reshaped by large-scale applications of Light Detection and Ranging (LiDAR). This technology is being used at the street level (self-driving vehicles), in the airspace (military drones), and at the macro scale (satellite imaging) to produce a digital copy of our world for various uses.

Using Light Detection and Ranging (LiDAR), these systems constantly feed their readings of the city into a collective consciousness. By coupling this technology with satellites and drones, the most logical routes can be calculated and updated to suit any condition. The method understands the urban landscape by emitting infrared beams, which are then measured by a sensor chip and mapped out into digital space.
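The ranging principle described above can be sketched in a few lines. This is an illustrative model, not the project's actual pipeline: the sensor times an infrared pulse's round trip, converts that time to a distance, and places a point in digital space along the beam's direction. All names and values here are assumptions for demonstration.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Time-of-flight ranging: the pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def to_point(distance: float, azimuth_rad: float, elevation_rad: float):
    """Map a range reading and a beam direction to an (x, y, z) point
    in the sensor's digital copy of the scene."""
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# A pulse returning after roughly 66.7 nanoseconds places the
# reflecting surface about 10 metres away.
d = distance_from_echo(66.7e-9)
```

Sweeping the beam direction and repeating this calculation millions of times per second is what produces the point cloud, the "digital copy" the project responds to.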

Light Detection and Ranging Systems

The machine vision instrument chosen for this project is LiDAR, a process that emits infrared light beams to capture the physical objects within a room.

This project asks: how does this new digital realm reconfigure our understanding of the physical urban network? Although the AI system attempts to create an exact mapping of urban cities, it cannot do so the way an individual would. Formal studies have shown that certain gestures, angles, and materials disrupt the infrared readings of the landscape, scattering points to new areas. These points create a new urban map and change the dynamic that pedestrians have within these systems.

The objects in the room are documented in the system's algorithm to determine the depth, color, and composition of the physical landscape. By locating where these values pass through the sensor chip, a tool can be generated to change them. Moving the particular physical attributes the infrared beams rely on away from their original location forces the camera to reorganize its data based on this new information. This tweak produces a new interpretation of the room's color composition, and an alternate visual output when the values are reorganized.
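One way to picture the scattering the formal studies exploit: the sensor assumes every pulse travels in a straight line, so a surface that deflects the beam (a mirrored or sharply angled face) lengthens the light's path while the sensor still registers the point along the original bearing. The sketch below is hypothetical geometry for illustration only; the names and distances are not from the project.

```python
import math

def registered_point(true_distance: float, azimuth_rad: float,
                     extra_path: float = 0.0):
    """The sensor places the point along the beam at the measured range.
    Any extra path length (say, a bounce off a mirrored facet) pushes
    the registered point past the real surface, scattering it into
    territory where no object actually stands."""
    measured = true_distance + extra_path
    return (measured * math.cos(azimuth_rad),
            measured * math.sin(azimuth_rad))

honest = registered_point(5.0, 0.0)          # surface read where it is
scattered = registered_point(5.0, 0.0, 3.0)  # deflection adds 3 m of path
```

The displaced points are exactly the kind of "new urban map" the project describes: the physical gesture stays put while its digital echo lands somewhere else.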

Deliberate gestures and carved angles reveal an alternative digital understanding of our physical realm.

Subversion Tower

The second part of this project was to design a tower based on this newly imagined landscape, where the digital and physical realms overlap in unprecedented ways. Using the techniques of the previous models, the tower was placed in an area of heavy traffic flow to exploit the weaknesses of the self-driving vehicle.

Site Location

Plans, Sections

Visitor Lounge

Car Exhibition

Physical Model

How would these formal adjustments impact the city at the macro scale once self-driving vehicle systems are the norm? If these scattered points become a condition that influences how self-driving vehicles interact with the city, where would these systems re-establish themselves, and what could this mean for other pockets of the city? If the physical realm is causing unexpected readings in a digital consciousness (some overarching network), how would that be transferred back into the physical world?