Date: 
05.05.11 to 09.04.13

F'11 Studio: Ryan Luke Johns

505a Graduate Design Studio: Alejandro Zaera Polo

Localising Networks: Physical Terminals for Web 2.0 Engines

While the concept of the studio involves creating physical manifestations of Web 2.0 engines, this project seeks to turn the idea on its head:  experimenting with the potential of uploading localized architectural design into the expanding realm of mobile apps and social networking sites.  Mario Carpo illustrates in “Architecture in the Age of Printing” that while architectural styles are clearly tied to the development of design and construction technologies (e.g. trabeation for the ancient Greeks, the arch for the Romans, stereotomy for the Gothic, reinforced concrete in modernism), they can also be influenced by technologies for the representation and dissemination of media (e.g. the printing press for the Renaissance).  With the recent explosion of social media engines, we must ask:  what would it mean for architecture to go viral?  How might we “share” space?

The first experiment in this project is a mobile phone application, developed using Processing for Android, which uses the phone’s accelerometer and compass data to orient a sphere mapped with an equirectangular image—creating a simple fixed-position, orientation-based panoramic viewer.  Existing applications (for most smartphone platforms) allow users to easily capture and stitch equirectangular images; these images can then be emailed to other users, who can open and explore the environments in the panoramic viewer.
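The original viewer was written in Processing for Android, but the core of any orientation-based panoramic viewer is the same: fusing the accelerometer (gravity) and compass (geomagnetic) readings into a device-to-world rotation that drives the virtual camera.  A minimal sketch of that math in Python, assuming the device is at rest (the function name `rotation_from_sensors` and the exact conventions are hypothetical, not from the original app):

```python
import math

def rotation_from_sensors(accel, magnet):
    """Build a device-to-world rotation from raw sensor readings.

    accel  -- 3-tuple accelerometer reading (m/s^2), device roughly at rest
    magnet -- 3-tuple geomagnetic field reading (uT)
    Returns three rows [east, north, up]: the world axes expressed in
    device coordinates, which can orient a panoramic camera directly.
    """
    def norm(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    up = norm(accel)                # at rest the accelerometer reads +g "up"
    east = norm(cross(magnet, up))  # horizontal axis perpendicular to magnetic north
    north = cross(up, east)         # completes the right-handed frame
    return [east, north, up]
```

With the phone lying flat and its y-axis pointing magnetic north (gravity straight up, field pointing north and downward as in the northern hemisphere), the function returns the identity frame, as expected.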

Evolving from this simple VR panoramic viewer, the final prototypical application uses the Kinect to track the user’s physical position within virtual space, with the Android device serving as the basis of a low-cost augmented reality headset.  The user’s head and hand positions are tracked in 3D by the Kinect, and these coordinates are streamed in real time, wirelessly over the internet, to the headset, which positions the virtual camera at the XYZ coordinate of the user’s head and orients the camera frustum using the smartphone’s accelerometer and geomagnetic sensors.  The user can generate gestural forms by initializing functions via spoken command:  “Loft” generates a surface lofted between the paths of the right and left hands, while “Brick” initializes a brick wall that follows the path of the right hand in plan and is built to the height of the hand in elevation.  Multiple functions can be run simultaneously, forms can be added to or erased, and multiple objects can be generated within the same scene.  The user can walk around and explore the scene in first-person augmented reality before speaking the command “Rhino” to open the exported geometry in the 3D modeling software on a nearby PC for further exploration or fabrication.
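The geometric heart of the “Loft” command is straightforward: the two recorded hand trajectories are treated as polylines and joined into a quad-strip surface.  A minimal Python sketch under that assumption (the helper name `loft` and the tuple-based mesh representation are illustrative, not the project’s actual data structures):

```python
def loft(path_a, path_b):
    """Join two equal-length 3D polylines into a quad-strip surface.

    Each consecutive pair of samples on the two paths becomes one quad,
    in the spirit of the spoken "Loft" command surfacing the recorded
    right- and left-hand trajectories.
    """
    if len(path_a) != len(path_b):
        raise ValueError("paths must have the same number of samples")
    quads = []
    for i in range(len(path_a) - 1):
        # Wind each quad consistently: a[i] -> b[i] -> b[i+1] -> a[i+1]
        quads.append((path_a[i], path_b[i], path_b[i + 1], path_a[i + 1]))
    return quads
```

Two parallel three-point paths, for example, yield a surface of two quads—enough structure to export to a modeler such as Rhino for detailing or fabrication.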

The project uses widely available and relatively inexpensive technologies to give individuals an intuitive means of roughing out architectural forms to scale, and equips them with easy techniques for exploring, editing, detailing and fabricating those forms.  By implementing this prototypical framework, we enable a more productive conversation about the potential of AR-based architectural design and how it might be influenced by social media, revisiting concerns surrounding mass customization, design scale, on-site design and the modular.