Monday, April 18, 2011

4.18 progress update

In playing with my new IR proximity sensor, I found that I will only be able to track motion in a straight line from the sensor, not in a larger area as I had originally anticipated.  Therefore, I have decided to hang the digital Venus flytraps from a stairwell in Alumni; this will ensure that people walk toward the sensor in a straight line.  I was able to write a script that uses proximity data from the IR sensor to control the red, green, and blue values of an RGB LED.  This allows me to control color gesturally, just like the color sliders in Photoshop, by moving my hands. I ordered 9 more RGB LEDs; I plan to have 9 flytraps in total. I finally found a way onto the laser cutter schedule, so I will need to finalize my files this week.
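The proximity-to-color script itself lives in Firefly/Grasshopper, but the underlying mapping can be sketched in plain C++. Everything here is an assumption for illustration: a 10-bit analog reading (0-1023) and a hand-picked working range for the sensor, not a measured calibration.

```cpp
// Map a 10-bit analog proximity reading (0-1023) onto one 8-bit color
// channel (0-255). Readings at or below `lo` give 0, readings at or
// above `hi` give full brightness, and values in between scale linearly.
// The `lo`/`hi` range is a hypothetical calibration, not a real one.
int proximityToChannel(int reading, int lo, int hi) {
    if (reading <= lo) return 0;
    if (reading >= hi) return 255;
    return (reading - lo) * 255 / (hi - lo);
}
```

Running one copy of this mapping per channel, each with its own range, is what would let different hand distances mix different colors.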

Tuesday, April 12, 2011

Midterm Update

Although some of this is posted elsewhere on my blog, I will be redundant here for the sake of organization.

Independent Project:
The Personification of an Impartial Digital Agenda

ABSTRACT

Although energy efficiency is the economic and ethical justification for the advancement of digital (inte)reaction in architecture, I am interested in applying an identifiable personality to a digital product.

If I can master the means and methods needed to create and program a digital work with personality, I certainly will have developed the technical skills to create something environmentally reactive as well.

In fabricating and programming a set of unique “digital” plants that react with the environment and interact with human passersby, I will become more aware of the embedded digital possibilities and environmental strategies of a project proposal.  


INTRODUCTION

In class we debated the difference between interaction and reaction.  Architecture is a decidedly reactive entity; it exists only out of necessity for human shelter.  Architecture, however, does not need to be limited to the reactive realm. When augmented with digitally controlled systems, the built environment can become interactive, personifying space.

Reactive digital systems, centered upon automating modern conveniences such as temperature control systems, fire and crime detection, and automatic teller machines, have been around for some time.  Their usefulness and functionality are not debatable; obviously these systems have found their niche in contemporary culture.  According to Usman Haque, these single-loop systems “provide us with a situation where a person is at the mercy of the machine and its inherent logical constructs. [We may get unexpected results (for example the machine tells us that it is out of cash), but the fact that the machine itself was selecting from a predetermined set of responses precludes any constructive interaction].”  These single-loop, narrow-“minded” machines have no personality; their existence is justified strictly by their ability to serve a purpose. 

Largely beginning in the 1960s with the cyberneticians Gordon Pask and Cedric Price, interactive architecture has come a very long way.  While the technology available today was not at the disposal of these pioneers, the foresight of Pask and Price is still very much relevant.  Gordon Pask spoke of how “rather than an environment that strictly interprets people’s desires, an environment should allow users to take a bottom-up role in configuring their environment in a malleable way without specific goals.”  Cedric Price, whose work inspired the reconfigurable Pompidou Center, was very much an advocate of his so-called “anticipatory architecture.”
Price's Fun Palace. 1961

As technology continued to develop over the years, a drastic acceleration in the application of technological advancements occurred in the nineties.  With the creation of such projects as Michael Mozer’s adaptive house, MIT’s Intelligent Room, and Bill Gates’s house, programming architecture to respond to the needs of its human inhabitants became reality. 



MIT: Intelligent Room

Adaptive House



However, digitally interactive architecture today seems no more a reality than it was when originally conceived in the ’60s.  We still don’t talk to our homes, nor do they speak to us.  Contemporary paradigms of interactive architecture exist mostly in installations and expositions, and have not yet become mainstream applications in the built environment.
the temporary Blur Building: DS + R




installation: Digital Water Pavilion




Technology has recently shown great promise in user tangibility and interactivity, especially with the advent of tablets and open-source applications.  Programming is becoming much simpler, and the learning curve of new software is much less steep.  Computer-controlled circuit boards such as Arduino can be readily found online and easily programmed to automate any number of tasks; perhaps an open-source home-automation library is not so far off in the future?

While sustainability and improved energy efficiency are certainly the future of digitally interactive and reactive architecture, I am interested in creating something digitally interactive through a programmed ‘personality’.

What if a digital product were freed of catering to human needs, in a single-loop, reactive sort of way? What if the digital product was allowed to develop its own personal agenda, and subsequently, its own personality?




PROJECT DESCRIPTION

In the spirit of creating a digitally interactive piece that is freed of functional assistance to its owner, I have decided that, through scripting and digital fabrication, I will try to capture the movements, reactions, and interactions of a plant: the Venus flytrap.  The Venus flytrap is not only reactive to its environment, with its sensitivity to sunlight and soil moisture content, but also highly kinetically interactive with potential prey.  I wish to fabricate a number of these digital Venus flytraps and program each one with a different “personality.”  Some will be shy, moving timidly and snapping shut at the first detection of movement.  Others will be bolder, requiring a high degree of detected movement to close.  All the digital flytraps will open to the sun and attempt to optimize the amount of light hitting their petals. I haven’t yet figured this part out, but each flytrap will have a corresponding coloration that can fluctuate based on the under-lighting of LEDs.
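As a sketch of how “personality” might reduce to a single parameter, here is a minimal C++ model in which shyness is simply a motion threshold: a shy trap closes at the first twitch, a bold one waits for more. The struct, its field names, and the threshold values are all hypothetical illustrations, not part of the actual Firefly definition.

```cpp
// One digital flytrap. Its "personality" is the amount of accumulated
// motion it tolerates before snapping shut: a low threshold reads as
// shy, a high threshold as bold. Values here are illustrative only.
struct Flytrap {
    int threshold;    // motion units tolerated before closing
    int accumulated;  // motion sensed so far
    bool closed;      // whether the trap has snapped shut

    // Feed in one sensor reading's worth of detected motion.
    void sense(int motion) {
        if (closed) return;
        accumulated += motion;
        if (accumulated >= threshold) closed = true;
    }
};
```

Nine traps with nine different thresholds would then react differently to the same passerby, which is the effect the project is after.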

I will be using Firefly, a plugin for Grasshopper, an Arduino board, an IR motion sensor, a photoresistor, RGB LEDs, and servo motors to realize my digital Venus flytrap garden.  



WORKING BIBLIOGRAPHY

Design Museum. "Cedric Price." http://designmuseum.org/design/cedric-price

Haque, Usman. "Architecture, Interaction, Systems." www.haque.co.uk, 2006.

Haque, Usman. "The Architectural Relevance of Gordon Pask." 4dsocial: Interactive Design Environments. Wiley & Sons, 2007.

Kulkarni, Ajay. "Design Principles of a Reactive Behavioral System for the Intelligent Room." Artificial Intelligence, 2002.

Fox, Michael, and Miles Kemp. Interactive Architecture. New York: Princeton Architectural Press, 2009.


Red, green, and blue LEDs all illuminated with full light hitting the sensor.

As the light to the sensor begins to be blocked, the green light is the first to turn off.

With all light blocked to the sensor, the red light is the only one continuing to shine.

With the red, green, and blue light diffused into a semi-transparent object, I found that I could create any color in the RGB spectrum by incrementally turning on or off any combination of the LEDs.
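The behavior in the photos above can be sketched as a pair of light thresholds, with green dropping out first as the sensor darkens and red persisting to the end. A minimal C++ version follows; the threshold values (200 and 500 on an assumed 10-bit reading) are placeholders, not calibrated measurements.

```cpp
// On/off state of the three LED channels.
struct Rgb { bool r, g, b; };

// Decide which channels are lit for a given light reading (0-1023).
// Matching the photo sequence: full light -> all three on; partial
// blockage -> green drops first; fully blocked -> red alone remains.
// The two thresholds are assumptions standing in for calibration.
Rgb channelsForLight(int light) {
    Rgb c{true, false, false};  // red is always on
    c.b = light > 200;          // blue survives moderate darkness
    c.g = light > 500;          // green needs the most light
    return c;
}
```

Swapping the booleans for graded 0-255 levels would give the continuous RGB mixing described in the last caption.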




Monday, April 11, 2011

progress updates. week of 4.10

4.10

I've been searching for some "academic" articles on interactive architecture other than the ones we were assigned in class.  I must admit that I am more interested in the applications of interactive technology than in rhetorical arguments regarding its definition, so I haven't yet found much more than case studies.

4.11

I started the digital files for my pieces and parts to control with Arduino; I've decided laser cut plexi is the way to go for the body of my 'creature.'  I will upload some images of my Arduino set-up tomorrow.  My search for articles continues.  

Wednesday, April 6, 2011

progress update

 4.06.
http://www.acroname.com/robotics/parts/R48-IR12.html

I look forward to adding this to my Arduino collection: I want my project to be able to react to proximity of passers-by.


4.07
If I am going to make something interactive that can respond to more than one form of stimulus, the Venus flytrap is an excellent case study.  It responds not only to environmental conditions- humidity, sunlight, rainfall, and temperature- but also to movement and touch. 




4.08

RGB LED changing (rainbow) colors, Arduino from Meinaart van Straalen on Vimeo.

While waiting for my proximity sensor to arrive, I'm trying to come up with a few variables to be controlled by distance data.  My Arduino board came with a few different colors of LEDs; color-coding someone's distance from the sensor with light could be fun.


4.09

I really appreciate the look of the digital fabrication group's laser-cut plexi... I purchased some clear acrylic today to create my creature.  Under-lit plexi from the Arduino's LEDs should look vibrant. 

Pranav Mistry Talk



While it seems this technology is still years away, it certainly seems promising.  Who wouldn't rather work on their "computer" wherever they please as they go through the day?  No one wants to be a slave to their computer monitor.  This would also be fantastic for eliminating the waste in the production of computer screens, which inevitably become outdated every five years or so.

Monday, April 4, 2011

individual project. progress update


Now that I have proven to myself that I can control Arduino, I need to find an application for it.  While originally I was looking to create a digital flower that responds to light, I want to find something that is less predictable and responsive to more criteria.  The script above allows a servo motor to respond to the fluctuating light levels in a room, but I want to branch out and try some sensors that respond to proximity, touch, noise, and temperature. In class we discussed the possibilities of interacting with something digital, and I now want to create something that, unless interrupted, will do its own thing- as if it weren't simply created to cater to the user.
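The light-to-servo script mentioned above is a Grasshopper/Firefly definition, but its arithmetic amounts to a single linear map. Here it is sketched in C++, under the assumption of a 10-bit photoresistor reading and a standard 0-180 degree servo; the function name and ranges are illustrative.

```cpp
// Map a photoresistor reading (assumed 0-1023) linearly onto a servo
// angle (0-180 degrees), clamping out-of-range input. This mirrors the
// fluctuating-light-to-servo behavior, not the actual Firefly patch.
int lightToAngle(int reading) {
    if (reading < 0) reading = 0;
    if (reading > 1023) reading = 1023;
    return reading * 180 / 1023;
}
```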

Here are some great examples of work with Arduino and proximity sensors:
http://natebu.wordpress.com/

Sunday, April 3, 2011

case studies

CNC:




Laser cutter:






3D Printer: RepRap, the self-replicating machine




Water Jet Cutter:


Robotic Bricklayer: