Betascape 2012: Art, technology, and magic in Baltimore

27 Sep

This past weekend, I helped facilitate a relational art workshop in Betascape’s physical computing lab with Kawandeep Virdee.  Betascape is an annual weekend-long Art & Technology Exploratorium in Baltimore, MD.  This year, Betascape was held at the Maryland Institute College of Art (MICA), a nice deviation from the technology-oriented venues that often host collaborative hands-on explorations in Boston.

Betascape’s keynote speakers included Nathalie Miebach, Hod Lipson, Marco Perry, and Nervous System; talks covered Nathalie’s incredibly complex data-based sculptures, a fascinating look into the future of 3D printing, the DIWire bender for rapid prototyping, and Nervous System’s beautiful nature-inspired, computer-generated, digitally fabricated designs.  The labs hosted live 3D printing and scanning demonstrations, workshops in data visualization, and many opportunities for participants to build, program, and make art.

Check out my Betascape Flickr set for photos.

What is relational art?

Relational art is an emerging movement identified by Nicolas Bourriaud, a French philosopher who recognized that a growing number of contemporary artists use performative and interactive techniques that rely on the responses of others: pedestrians, shoppers, browsers, the casual observer-turned-participant.

Kawan and I discussed physical computing, which explores how humans and environments engage with computers via sound, heat, motion, light, and other inputs, as a playground for building highly interactive and collectively experienced relational art.  In collaborating with Kawan, I realized that the goals of relational art align closely with the goals of exhibit design in museums; both relational art pieces and museum exhibits seek to provoke dialogue, questions, and ways to collectively interact with ideas.  At a science museum, cultivating skepticism and discussion about research is an important component of the scientific perspective.  It is easy to fall into a one-visitor-per-kiosk model when you want to deliver a clear content message, but a more social experience can deliver the same content while sparking curiosity.

Read more at relational art and complex systems.

ofxTessellationBuilder, a relational art sketch for Betascape

For the workshop, I shared an open source relational art piece called ofxTessellationBuilder that allows up to seven people to interact through touch.  The physical interface was a MaKey MaKey attached to seven rings, each held by one person.  One person holds the special earth ring to literally serve as ground in the circuit, and the other participants (up to six) can touch this grounded person to add a specifically colored ring to a dynamic tessellation called the flower of life.

ofxTessellationBuilder screenshot showing colored rings forming a dynamic tessellation

The MaKey MaKey detects touch but works like a keyboard encoder under the hood, so you can test the software using the keys ‘a’, ‘s’, ‘d’, left arrow, down arrow, and right arrow (each mapped to a different color).

Things get more interesting when you start to count how many participants are grounded at once; what happens if two or three people connect to ground at the same time (either directly or through one another as a proxy)?  I programmed special behaviors for these situations, from two to six people grounding at once.  Two people cause colors to merge into one ring, three cause the tessellation to break apart and reform, four triggers a more aggressive explosion, five shows participants through a webcam view, and so on.  The additional behaviors encourage participants to physically connect in different ways.  ofxTessellationBuilder came together rather quickly thanks to some simple magic that helped make it more interactive.
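The combination behaviors boil down to counting how many of the six keys are held at once. Here is a minimal sketch of that dispatch in plain C++ (the function and behavior names are my own hypothetical labels, not the actual ofxTessellationBuilder code):

```cpp
#include <cassert>
#include <set>
#include <string>

// Hypothetical sketch: map the number of simultaneously held keys
// (i.e., grounded participants) to a behavior name.  In the real
// piece, each key arrives via the MaKey MaKey as a keyboard event.
std::string behaviorFor(const std::set<char>& keysDown) {
    switch (keysDown.size()) {
        case 0:  return "idle";
        case 1:  return "add-ring";         // one person grounded
        case 2:  return "merge-colors";     // two rings merge into one
        case 3:  return "break-and-reform"; // tessellation explodes and reforms
        case 4:  return "explode";          // a more aggressive explosion
        case 5:  return "webcam-view";      // show the participants themselves
        default: return "finale";
    }
}
```

Tracking the set of currently held keys (adding on keyPressed, removing on keyReleased) is all the state this dispatch needs.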

Simple magic for relational art

  1. Try doubling your sensors:  Detecting heart rate?  How does the experience change when two people can compare heart rates directly?
  2. Leverage emergence through simulation: ofxTessellationBuilder uses a 2D physics engine to simulate forces, collisions, and complex behavior.  I was delighted to see one ring orbiting a larger mass of rings, as I had not planned for anything like this.
  3. Use multiple inputs in combination: Having the software respond to different combinations of the six keystrokes allows for very complex interactions.
  4. Use bright colors: this came up throughout the Betascape conference; bright colors are magical, and people enjoy interacting with colorful experiences.
  5. Simplify technology: the MaKey MaKey’s keystroke behavior made it very fast to test software and handle input (I used keyPressed events rather than parsing Arduino serial data).  Museums routinely use keyboard encoders in kiosks and interactives because they are easy to maintain, troubleshoot, and simulate with a regular computer keyboard for software testing.

The Arduino orchestra

The first day of our workshop was exploratory, with demos, discussions, and brainstorming around relational art ideas.  We tinkered with sensors, discussed inputs and outputs, and envisioned ways to make experiences more collective.

For the second day, we wanted to focus and build something collaboratively based on our explorations.  The resulting vision was to create an Arduino theremin orchestra using range finders and piezo speakers.  Here’s a time-lapse video of day two showing the physical computing lab jamming: soldering speakers, programming Arduinos, and playing custom microcontroller-based musical instruments.

Kawan posted the rangefinder code for the Arduino theremin, which was tweaked for our workshop by Amy Lee (thanks!). Check out our collaborative notes for the workshop here: http://typewith.me/p/betascape-2012
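At its core, an Arduino theremin reads the rangefinder each loop, converts the distance to a frequency, and hands that frequency to tone() on the piezo pin. The distance-to-pitch mapping might look like the sketch below; the ranges (5–50 cm, 120–1500 Hz) are my own illustrative assumptions, not the values from the workshop code:

```cpp
#include <cassert>

// Hypothetical sketch of a theremin's distance-to-pitch mapping:
// linearly map a rangefinder reading in centimeters onto a
// frequency band for the piezo speaker.  Closer hand -> higher pitch.
long distanceToPitchHz(long cm) {
    const long minCm = 5,   maxCm = 50;    // usable sensing range (assumed)
    const long minHz = 120, maxHz = 1500;  // audible band for a piezo (assumed)
    if (cm < minCm) cm = minCm;            // clamp out-of-range readings
    if (cm > maxCm) cm = maxCm;
    return maxHz - (cm - minCm) * (maxHz - minHz) / (maxCm - minCm);
}
```

On the Arduino itself, the loop body would simply be something like `tone(speakerPin, distanceToPitchHz(readDistanceCm()));`.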

Arduino-based theremin with range finder by @not_rollergirl

Thanks a million to everyone who made Betascape possible and totally magical (especially Matt Forr and Heather Sarkissian)!

Creating Museum Media for Everyone; developing DIY Toolkits for UD and UDL

23 Jun

In May 2012, the Museum of Science in Boston hosted the Creating Museum Media for Everyone workshop to develop tools and strategies for universal design (UD) and universal design for learning (UDL).  The first two days featured talks and demonstrations from experts on creating inclusive museum environments and building engaging interactive experiences for visitors with a wide range of disabilities.  For the rest of the week, researchers, accessibility experts, designers, and creative programmers worked together in four teams to prototype a dynamic haptic interface for exploring graphs, gesture-based descriptive audio layers for multitouch surfaces, a physical interface for exploring graphs via data sonification, and strategies for personalizing exhibit and museum experiences to increase accessibility.

Here are some of the outcomes from the workshop that show the power of collaboration in rapid prototyping.  The following demo shows how a user might explore a graph via touch and sound through a physical sonification interface.  Two axes move above a tactile graph, which produces sound to represent data:

A similar haptic interface was created with vibration instead of sound to represent data points on a two dimensional graph:

haptic graph exploration tool

Touch tables offer interesting accessibility challenges in a shared interactive space like a museum.  This prototype uses an audio puck to allow users to explore a multitouch surface with touch, sound, and vibration:

Touch table with haptic fiducial markers

The workshop has created some powerful momentum within the accessibility and museum communities.  I had the pleasure of working with Sina Bahram, a doctoral student from North Carolina State University, on the sonification prototype.  Sina has since developed a mobile device platform for exploring astronomy data through touch and sound, as seen in this demo created for the 2012 transit of Venus across the sun:

Some of the ideas represented in the CMME workshop have already made their way into new exhibit designs at the Museum of Science.  More links as tools/resources are shared publicly!

Etching Glass Using Vinyl Stencils

29 Apr

I’ve been using my vinyl cutter recently to create custom glass etchings.  Vinyl resists the acids in glass etching cream, making it perfect for high-resolution stenciling.  The cream I use can be found here (I picked some up at a Michaels craft store).  You’ll want to be careful and read the full Material Safety Data Sheet for Armour Etch.  You can etch most glass but not Pyrex (borosilicate).  The results are pretty neat:

Coaster with an etched crane

To start the process, you’ll need to create a vinyl cut and apply it to your glass:

Applying the vinyl negative

Now, put on your gloves (seriously) and paint on some etching cream.  I also use an exhaust fan in a nearby window for added safety:

Paint on your etching cream and wait

After about 5 minutes, I rinse off the etching cream.  The MSDS mentions that baking soda can help neutralize some of the acids; you’ll want to flush everything with lots of water.  Here’s the final result:

Final result of vinyl-resisted glass etch

More photos via flickr: http://flic.kr/s/aHsjz5247h

Zurich Bog’s Carnivorous Plants

8 Jun

Zurich Bog, an advanced sphagnum bog or peatland in upstate New York, was given to the Bergen Swamp Preservation Society for preservation on December 10, 1957 by Lyman Stuart and the Newark School District.  Over 50 years later, the bog supports a cornucopia of native plant species including carnivorous sundews (Drosera rotundifolia) and purple pitcher plants (Sarracenia purpurea). I had the pleasure of hiking the bog on April 30th, 2011 with environmental scientist Valerie George as a guide and my girlfriend Meaghan Boyle.  Non-carnivorous plants you might find in the bog include: “water willow, highbush blueberry, mountain holly, black huckleberry, small cranberry, and several species of orchids” (Johnson, 234).  I captured the following two images of the bog’s carnivores soon after the last frost in the area and before anything had begun flowering:

Drosera rotundifolia in the Zurich Bog

The species of sundew above was “made famous by Charles Darwin’s tireless and hideous experiments upon it” (D’Amato, 136).  It captures insects with its sticky glue-like dew and tentacles that slowly wrap around the prey as it struggles.

Sarracenia purpurea in the Zurich Bog

The purple pitcher plant above belongs to the American pitcher plants (Sarracenia), a different genus than the tropical pitcher plants (Nepenthes).  The collar of the purple pitcher is “covered in bristly, downward-pointing hairs.  Insects often cling to and slip from these hairs, which are wet with nectar” (D’Amato, 75).

According to Valerie, early June is the best time to observe the purple pitcher plants in flower.  Definitely check it out if you are near upstate New York, and hike with respect in these fragile peatlands.

Works Cited:

D’Amato, Peter.  The Savage Garden: Cultivating Carnivorous Plants.  Ten Speed Press, 1998.  Print.

Johnson, Charles.  Bogs of the Northeast.  University Press of New England, 1985.  Print.

Pygmy Sundew Propagation by Gemmae

28 Apr

I recently ordered a number of carnivorous plants from Cook’s Carnivorous Plants as well as CP Jungle.  One of those plants was a pygmy sundew, Drosera scorpioides.  I wasn’t sure what was going on with the top of the plant at first; was it flowering?  It was actually producing gemmae, or brood bodies, that may be propagated asexually.  These are modified leaves that break free in the wild when struck by rain, and each contains an exact clone of the parent plant.

Here is a 2.5 minute video showing my process for propagating my new Drosera scorpioides from gemmae:

 
This is the plant the same day it was received from Cook’s Carnivorous Plants:

Drosera scorpioides, a pygmy sundew

I fed my plant a moth two days before removing about five of the gemmae, two of which you see in the video above.  For the morbidly curious, the video of the moth being enveloped by the sundew’s tentacles is below.  After about 30 seconds the footage is sped up to 2000%, although you’ll still probably want to skip forward as there is a lot of footage.  Notice there are a few more gemmae on the plant in this video:

Kinect + Blender 3D Scene Reconstruction

7 Apr

One of my current projects is to find a cheap and accurate way to 3D scan faces for the creation of custom coins and memorabilia; mostly, I want my face on a 3D-printable coin which can then be cast more cheaply in metal.  I had the opportunity to borrow a Microsoft Kinect, which has two cameras and a structured-light infrared laser projector.  One camera captures the infrared laser grid as projected into the room and constructs a depth map of the entire view in real time.  The other camera captures visible light, i.e., normal images and video.  I used the Kinect to capture images and depth maps and reconstructed the scene in 3D using Blender.  To dump the data, I used libfreenect‘s ‘record’ program, part of the OpenKinect project.
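If you want real-world distances rather than raw grayscale values, the Kinect’s 11-bit depth readings need a conversion step. A commonly cited approximation from the OpenKinect community is sketched below; the exact calibration varies per device, so treat this as an illustration rather than the formula used in this post:

```cpp
#include <cassert>
#include <cmath>

// Approximate conversion of the Kinect's raw 11-bit depth value to
// meters (an OpenKinect-community curve fit; per-device calibration
// will differ).  The maximum raw value, 2047, indicates "no reading".
double rawDepthToMeters(int raw) {
    if (raw >= 2047) return 0.0;  // sensor could not resolve a depth here
    return 0.1236 * std::tan(raw / 2842.5 + 1.1863);
}
```

For the Blender displacement-modifier approach, the raw grayscale depth map is enough; a metric conversion like this only matters if you need true scale.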

Here’s a camera-panning animation of the result, created in Blender using a displacement modifier on a heavily subdivided plane:

This is the unedited depth map that I took from the ‘record’ program output:

Kinect depth map produced by libfreenect's record utility

I had to scale and move the corresponding image texture to fit the geometry properly, partly due to the slight distance between the cameras.  Here is the slightly altered texture image captured by the Kinect:

Kinect image captured using libfreenect's record utility and slightly edited in GIMP to align

This is the depth data as rendered via Blender’s ambient occlusion:

Blender render showing depth via ambient occlusion

I will soon compare these results to the free version of DAVID-laserscanner.  I’m currently waiting on the arrival of a very cheap laser line module ($2.50, to be exact) that will be used in conjunction with a high-def camera as input to the DAVID laser-scanning software.  Stay tuned.

UPDATE: I’ve attached the .blend file for exploring in Blender.  Textures are embedded.  Blender 2.56 Beta or later is recommended.

The Open Dream Journal Clip

1 Apr

I have wanted to design a simple notebook clip with an LED light for use on a dream journal for quite some time now.  Writing down dreams at night is difficult without a very convenient light source.  Now, with access to a MakerBot 3D printer at Sprout studios, I’ve released an open dream clip design on Thingiverse, an online database of freely available and printable 3D models.  My design uses an LED and a CR2032 battery, components commonly available at hackerspaces that have 3D printers (and RadioShack, although the parts are cheaper online through sites like Digi-Key and Mouser).
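As an aside on the electronics, the textbook way to pair an LED with a battery is to size a series resistor with Ohm’s law, R = (Vsupply − Vforward) / I; in practice, a CR2032’s high internal resistance is why simple coin-cell circuits like this one often get away without a resistor at all. A quick worked sketch (the example voltages and current are illustrative, not measurements from this clip):

```cpp
#include <cassert>
#include <cmath>

// Illustrative Ohm's-law sizing for an LED series resistor:
// R = (supply voltage - LED forward voltage) / desired current.
double seriesResistorOhms(double vSupply, double vForward, double amps) {
    return (vSupply - vForward) / amps;
}

// e.g. a 3 V coin cell, a 2 V LED, and 10 mA target current
// suggest a resistor on the order of 100 ohms.
```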

Here is the dream journal clip in action:

Dream Journal Clip with a white LED

The first revision is pretty rough, and we had to use tape to make it work:

Dream Journal Clip with a red LED

Here is the MakerBot that Jimmie Rodgers kindly tuned and used to print the first test:

MakerBot at Sprout

Planned improvements include:

  • using less plastic
  • making a larger hole for seating the LED
  • making a larger and deeper battery slot
  • adding a plastic tab to hold the battery in place (instead of tape)

Download the source files for the open dream clip on Thingiverse and keep an eye out for improvements shortly.
