(winter show documentation coming)
It’s very exciting to see our installation come to life from the mini model we made weeks earlier. To recap, the project is an interactive storytelling installation that consists of several door-sized frames. Each frame “contains” music, a visualization, and a short paragraph from a poem my partner wrote about death and rebirth. As a participant walks through each frame, the sound and visualization assigned to that frame are triggered.

After my partner’s final presentation last Tuesday and user testing on Wednesday, we decided to change our trigger system from photosensors and laser diodes to ultrasonic sensors. The problem with the previous triggering system was that it was very difficult to aim the lasers at the tiny photosensors, and any slight movement of the structure would shift the lasers’ position, resulting in flickering readings from the photosensors.

Here’s the code for setting up the ultrasonic triggering system:
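Assuming an HC-SR04-style sensor (the pin numbers and the 50 cm threshold below are placeholders, not our exact values), each frame’s sensor is read roughly like this:

```cpp
// HC-SR04-style ultrasonic sensor: trigger and echo pins are placeholders
const int trigPin = 9;
const int echoPin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // send a 10-microsecond pulse to ask the sensor for a reading
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // echo time (microseconds) -> distance in cm (~29 us per cm, round trip)
  long duration = pulseIn(echoPin, HIGH);
  long cm = duration / 29 / 2;

  // someone within ~50 cm of the frame counts as a trigger (placeholder threshold)
  if (cm > 0 && cm < 50) {
    Serial.println(1);
  } else {
    Serial.println(0);
  }
  delay(50);
}
```

The serial value then tells the sketch on the computer which frame was triggered.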

After the user testing, we also thought about how to stop participants at each frame. We decided to signal a stop by playing a speaker sound every time a participant passes a frame, and we also programmed the LED to blink each time a participant passes one.
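A rough sketch of that feedback cue, assuming a piezo speaker on a digital pin (the pins, tone, and timings here are placeholders):

```cpp
const int speakerPin = 8;   // piezo speaker (placeholder pin)
const int ledPin = 13;      // indicator LED (placeholder pin)

void setup() {
  pinMode(speakerPin, OUTPUT);
  pinMode(ledPin, OUTPUT);
}

// called whenever the ultrasonic sensor detects that a participant has passed
void signalStop() {
  tone(speakerPin, 440, 200);     // short beep as the "stop" cue
  for (int i = 0; i < 3; i++) {   // blink the LED a few times
    digitalWrite(ledPin, HIGH);
    delay(150);
    digitalWrite(ledPin, LOW);
    delay(150);
  }
}

void loop() {
  // in the real sketch, signalStop() runs when the distance reading crosses the threshold
}
```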

From this project, we learned a lot about setting up serial communication between two applications, switching voltage with transistors, testing different sensors for the desired result, and so on. Beyond the physical computing itself, we also learned that space is often a limitation and a factor to consider in a project like this. Something more detachable or easier to transport would be worth considering next time. There was a lot of trial and error, but in return, I think we got a lot out of this project.

PCOMP Final Progress

Physical Computing Part:

In our installation, we need some kind of sensor to trigger a projection as the user passes each door frame. We considered different types of sensors: a pressure sensor that the user steps on, or a motion sensor that detects the user’s movement. Then someone suggested using a laser beam to detect whether or not the user has passed through the door. So we ran a test, and the result was not bad; the laser beam can accurately detect a passing object and trigger an event:

Components used to make the laser beam sensor in the video: 1 photoresistor, 1 laser pointer, two 10-ohm resistors, 1 LED. Here’s the code:

int ledPin = 12;   // LED on digital pin 12
int sensor = A0;   // photoresistor on analog pin 0

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (analogRead(sensor) < 750) {   // beam blocked: reading drops below threshold
    digitalWrite(ledPin, HIGH);     // turn the LED on
  } else {
    digitalWrite(ledPin, LOW);      // beam restored: turn the LED off
  }
}
After the initial test with the LED, we ran a test with the same circuit and used it to trigger video playback in p5.js, and we got it to work!

Here’s the code in P5:

let serial;
let portName = '/dev/cu.usbmodem14101'; // placeholder: use your own serial port name
let latestData = 1023;                  // start above the trigger threshold
let vid1;

function setup() {
  createCanvas(windowWidth, windowHeight);
  vid1 = createVideo('sample.mp4');
  vid1.hide();                  // hide the DOM element; we draw the frames ourselves
  serial = new p5.SerialPort();
  serial.on('data', gotData);
  serial.open(portName);
}

function draw() {
  if (latestData < 800) {       // beam blocked: play and draw the video
    vid1.play();
    image(vid1, 0, 0);
  } else {
    vid1.pause();
  }
}

function gotData() {
  let currentString = serial.readLine();  // read the incoming string
  currentString = trim(currentString);    // remove any trailing whitespace
  if (!currentString) return;             // if the string is empty, do no more
  console.log(currentString);             // log the string
  latestData = Number(currentString);     // store it as a number for comparison
}

Fabrication/Construction Part:

PCOMP Final Concept

For the pcomp final, my partner Daisy Lu Wang, from another pcomp section, and I have been discussing an interactive installation involving video and music and focused on storytelling. Lu is a musician who recently composed a series of music pieces about the “circle of life” in Buddhism, and we came up with the idea of building an interactive installation around this theme, using video and sound to tell the story of the “six states of existence” (http://www.onmarkproductions.com/html/six-states.shtml) and letting participants experience the story.

Our idea is to create a series of frames (ideally six, but the number may vary depending on our abilities). As a participant walks through a frame, a sensor detects that they have entered it and triggers video and music. Each frame triggers different videos and music.

How can the videos be projected without a physical surface? That’s a question we’ve been struggling with.

We were thinking maybe we could create some kind of fog screen onto which videos can be projected and through which participants can pass. As a participant passes through one frame, the video projection on the next fog screen is triggered. Each video depicts one state of existence, so participants who complete the journey through all six frames experience all “six states of existence”.



PCOMP Mid-Term Project

Collaboration with Martin Calvino

For our mid-term project, we brainstormed many ideas involving motion sensing and eventually decided on using a Kinect sensor to detect motion and incorporating a sensor to control the visual output. I had no knowledge of the Kinect, and my partner Martin, who had previous experience with it, explained to me how it works: it has an infrared projector that sends infrared light into the room and an infrared camera that reads the light reflected back to see what is farther away and what is closer (so it basically senses the depth of the objects being captured).

Here’s my sketch of how all the parts are connected:

Arduino (light sensor input) → Processing (loads the Kinect input and maps it to the light sensor value) → Projector (visual output controlled by light intensity)

Here’s a quick shot of the result taken by Martin:

In the code, the change in input caused by changing light intensity is mapped to the size of each pixel (the rectangles shown in the video). So as the light intensity increases, the size of the rectangles increases, and as the light intensity decreases, the size of the rectangles decreases.

if (port.available() > 0) {
  val = port.read();                 // light-sensor value from the Arduino
}

PImage img = kinect.getDepthImage();
image(img, 0, 0);
img.loadPixels();

int skip = 1 + int(val);             // higher light intensity -> larger rectangles
for (int x = 0; x < img.width; x += skip) {
  for (int y = 0; y < img.height; y += skip) {
    int index = x + y * img.width;
    float b = brightness(img.pixels[index]);
    fill(b);
    rect(x, y, skip, skip);          // one rectangle per sampled depth pixel
  }
}

The change of input from Arduino is also mapped to the degree of tilting of Kinect:

deg = constrain(deg, 0, 30);  // the Kinect motor only tilts between 0 and 30 degrees
kinect.setTilt(deg);

Material: 20×30 Black Mat-Board, drafting film, two straws, 8×5.5 Muji Box

We do need to think further about the interaction aspect of the project and why people would want to interact with our installation. I think giving a more elaborate context and focusing on the user experience would be the next step in polishing this project.

Arduino p5.js Serial Control

This week, we learned how to send serial output from Arduino to p5.js.
I connected a potentiometer to an Arduino analog input and tried to read the serial output as ASCII characters:
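On the Arduino side, this is just analogRead plus Serial.println (A0 stands in for whichever analog pin the pot’s wiper is on):

```cpp
void setup() {
  Serial.begin(9600);              // open the serial port at 9600 baud
}

void loop() {
  int potValue = analogRead(A0);   // potentiometer wiper on A0, reads 0-1023
  Serial.println(potValue);        // println sends the number as ASCII digits plus a newline
  delay(10);                       // give the receiving side time to keep up
}
```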

However, when I tried to connect my Arduino to one of my p5.js sketches, it didn’t print out any data, and the console only showed this:

I thought it was because I had multiple programs open at the same time, so I closed the Arduino editor, but I still couldn’t get any data in p5.js… (One thing I still need to check: only one program can hold the serial port at a time, and p5.js also needs the p5.serialcontrol app running in the background.)
[still trying to find the cause; hopefully I’ll figure it out]


Servo Control Using Arduino

This week we learned about servo motor control using Arduino, and for this week’s lab I tried to prototype a vending machine using a servo motor.

1. Overview of programming servo motor via Arduino online editor:

Insert the 3-pin header into the servo motor connector. The yellow (signal) wire connects to a digital pin on the Arduino board.

Once the circuit was complete, I connected the Arduino board to my laptop and uploaded the following code:
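The sketch was essentially the Servo library’s basic sweep (the pin, the angles, and the `dispenser` name here are placeholders for illustration):

```cpp
#include <Servo.h>

Servo dispenser;            // hypothetical name for the vending-machine servo

void setup() {
  dispenser.attach(9);      // signal (yellow) wire on digital pin 9 (placeholder)
}

void loop() {
  dispenser.write(0);       // rotate the coiled wire to one end...
  delay(1000);
  dispenser.write(180);     // ...then back to the other
  delay(1000);
}
```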

2. Prototyping a simple vending machine mechanism:

Electronics: Arduino Uno, servo motor, wires

Material: misc wire, a recycled cat-food can, a recycled rectangular box

However, the result was not as successful as I had hoped, because I overlooked the fact that the servo motor only rotates 180 degrees instead of a full 360 degrees. So the result was just as shown in the GIF: the coiled wire rotates back and forth instead of turning consistently in one direction, and the item gets stuck in the coil.

Then I found a hobby motor in my Arduino kit and tried using it to replace the servo motor. But it turns out the hobby motor spins way too fast (almost like a fan), and I wasn’t able to turn the speed down, so the attempt to use a hobby motor was unsuccessful as well.

PCOMP Week 3: Playing with Programming Arduino & Observation

Part 1: Play with programming Arduino

To review this week’s lab, I came up with a simple application I call a “lyrics highlighter”. Basically, I used 3 LEDs, each highlighting a line of lyrics, and connected them to the Arduino board via digital input/output pins. Each LED lights up for a few seconds, equal to the amount of time its lyric line is sung, and then turns off. After a short interval, the next LED lights up as the next line of lyrics is sung. Here’s the demo:
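In outline, the sketch just steps through the LEDs on a timer (the pins and per-line timings below are placeholders, not the real song’s):

```cpp
const int ledPins[] = {2, 3, 4};                   // one LED per lyric line (placeholder pins)
const unsigned long lineMs[] = {3000, 2500, 4000}; // how long each line is sung (placeholder timings)

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  for (int i = 0; i < 3; i++) {
    digitalWrite(ledPins[i], HIGH);   // highlight the current lyric line
    delay(lineMs[i]);                 // hold for as long as the line is sung
    digitalWrite(ledPins[i], LOW);
    delay(500);                       // short interval before the next line
  }
}
```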

Part 2: Observation

For this week’s assignment, I observed the use of an interactive technology in public. I chose the interactive train info display on the Broadway-Lafayette subway station platform.

There are three options at the bottom of the screen: “Maps & Directions”, “Arrivals”, and “Service Status”. I realized that the most-used option is “Arrivals”; most people recognize it right away and tap it without any difficulty or hesitation. I think this function was successfully designed and meets its purpose, since people waiting for trains do want to know when the train will arrive, and they can find out without asking anyone how. I also noticed that most people don’t return to the home/menu page after checking arrival times, but the screen automatically returns to the menu page after being idle for a certain amount of time. I think this is a very helpful feature: since most people don’t bother returning to the menu page after getting the information they want, having the screen reset itself helps the next person find what they need quickly.

However, what seemed to take the longest was when people tried to move to the next page of “Arrivals”. Some people attempted to scroll up and down with their fingers, and others attempted to swipe. The correct way to go to the next page is actually to press the “<” or “>” icons at the bottom of the screen, but a lot of people don’t seem to notice them.

Overall, I think this piece of interactive technology is successfully designed and well placed in an appropriate location. Although there are still small details like the navigation icons on the “Arrivals” page being easy to miss, the overall interface is very clear and straightforward, and most people are able to get the information they need without any difficulty.



Simple light-up-the-LED circuit

Step 1: connect one end of the black wire to GND and one end of the red wire to 5V.

Step 2: connect the other end of the black wire to the “-” column on the breadboard and the other end of the red wire to the “+” column. By doing so, I added a power source to the breadboard.

Step 3: connect one end of a resistor to one of the holes in the “+” column and the other end in the “a” column. Doing so reduces the risk of damaging the circuit by limiting the amount of current flow.

Step 4: connect the longer lead of an LED to one of the holes in the same row as the resistor and the shorter lead to one of the holes in the adjacent column.

Step 5: place a switch adjacent to the LED light and bridge the two components with a wire.

Step 6: Finally, complete the circuit by connecting the “-” end of the switch next to the black wire in the “-” column.

And that completes a simple circuit for turning on an LED with a switch!