Pcomp final – week 2

Project sketch

diffusing light, masking light

Good news, I am now working with a partner!
Alina joined me and we are working together on the hourglass.

We’re making a virtual geometric hourglass that will be like an abstract interactive art piece. The user can hang it on the wall and it will count down. When turned upside down, the count will restart.

Project sketch

 

The hourglass will consist of 16 rows of LEDs in a 16″ canvas frame. We made a timeline and a bill of materials and already ordered the NeoPixel LEDs. Right now we want to get the prototype working properly before the lights arrive. We’ve also been testing the plexi and canvas to see what visual effect we get.
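As a rough sketch of the logic (not our final code), something like this could drive the countdown with the Adafruit NeoPixel library; the pin numbers, the 11-LEDs-per-row layout and the one-minute interval are placeholders.

#include <Adafruit_NeoPixel.h>

#define LED_PIN      6        // data pin for the strip (assumed)
#define TILT_PIN     2        // tilt switch (assumed)
#define NUM_ROWS     16
#define LEDS_PER_ROW 11       // ~180 LEDs over 16 rows
#define ROW_INTERVAL 60000UL  // one row "falls" per minute (placeholder)

Adafruit_NeoPixel strip(NUM_ROWS * LEDS_PER_ROW, LED_PIN, NEO_GRB + NEO_KHZ800);

int lastTilt;
unsigned long lastStep;
int rowsLeft = NUM_ROWS;

void setup() {
  pinMode(TILT_PIN, INPUT_PULLUP);
  strip.begin();
  strip.show();
  lastTilt = digitalRead(TILT_PIN);
  lastStep = millis();
}

void loop() {
  int tilt = digitalRead(TILT_PIN);
  if (tilt != lastTilt) {              // the frame was flipped: restart the countdown
    rowsLeft = NUM_ROWS;
    lastStep = millis();
    lastTilt = tilt;
  }

  if (rowsLeft > 0 && millis() - lastStep >= ROW_INTERVAL) {
    rowsLeft--;                        // one row of "sand" runs out per interval
    lastStep = millis();
  }

  for (int row = 0; row < NUM_ROWS; row++) {
    uint32_t color = (row < rowsLeft) ? strip.Color(255, 180, 40) : 0;
    for (int i = 0; i < LEDS_PER_ROW; i++) {
      strip.setPixelColor(row * LEDS_PER_ROW + i, color);
    }
  }
  strip.show();
}

The idea is simply to turn off one row per interval and relight everything whenever the tilt switch changes state.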

 

Bill of materials

max budget – $200

2 tilt sensors / accelerometer
180 NeoPixel LEDs
Arduino
Plexi, 24″ frosted – ~$20
Canvas – ~$30
Wood – ~$10


Timeline

Nov 12
ordering parts, meeting with Benedetta

Nov 13
Laser cutting and prototype making

Nov 14
Class, regroup and discussion
user testing?

Nov 15
Friday morning, buying wood and plastic
coding

Nov 19
Tuesday ~3 or 4 PM
start arranging

Nov 20
continue to work

Nov 21
class, show progress

Nov 22 – 28
finalizing

Nov 28th
fully working, time left for debugging and testing, revisions etc.

Dec 5
final presentation

Final project proposal

Even though I was excited about the pulse sensor and initially wanted it to be my final project as well, I feel I should experiment with other projects and maybe come back to it in the future.

For my final project this semester, I would like to make an hourglass.
Time is a subject that occupies my mind a lot and I have been wanting to make some form of clock for a while.

There will be three hourglasses, counting a day, an hour and a minute. When the screen/frame is turned, the counting will reset. This hourglass is an abstraction of a physical one.
It will include flat geometric shapes only.

The technical aspect is not yet resolved. For now, I made a Processing sketch that you can see below, and I am thinking of projecting it, although I am open to any ideas for a physical substance that would physically emulate the movement of sand in an hourglass.

The back side of the frame/screen will have a circuit with either a switch or a potentiometer-type sensor; when the frame is turned, the sketch/projection/movement will start over.
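As a first pass at that circuit, something like the following could work, assuming a tilt switch on pin 2 and the Processing sketch listening on the serial port for a "RESET" line; the pin number and the message are placeholders.

const int tiltPin = 2;   // tilt switch on the back of the frame (assumed pin)
int lastState;

void setup() {
  pinMode(tiltPin, INPUT_PULLUP);   // switch wired between the pin and ground
  Serial.begin(9600);
  lastState = digitalRead(tiltPin);
}

void loop() {
  int state = digitalRead(tiltPin);
  if (state != lastState) {
    Serial.println("RESET");        // tell the Processing sketch to start over
    lastState = state;
    delay(200);                     // crude debounce for the mechanical switch
  }
}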

The cost of this project has a wide possible range, depending on the materials. As a decorative artwork, I don’t think the materials should necessarily be the cheapest available, but I also don’t want it to be overpriced.

room_with_glass

hourglass_sketches-01

hourglass_sketches-02

Digital input and LCD print

Going over some of the first labs, partially as a setup for my final project.
I tried the “LCD crystal ball” from the Arduino book, since my project is going to include a screen and perhaps a tilt sensor.

There is something wrong with the screen, because it’s not printing anything but blank characters, but the contrast adjustment with the potentiometer works and the tilt sensor is also working, as you can see in the video. The sensor gives a digital 0/1, which I think will be enough for my hourglass to tell if it has been rotated 180 degrees.

digital in and lcd print from ziv schneider on Vimeo.
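For reference, a stripped-down sketch of the same idea, assuming the wiring from the book’s example (LCD on pins 12, 11, 5, 4, 3, 2 and the tilt switch on pin 6); if code like this still prints blanks, the usual suspects are the contrast pot or the data wiring rather than the sketch.

#include <LiquidCrystal.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);   // RS, EN, D4-D7 (assumed wiring)
const int tiltPin = 6;                   // tilt switch (assumed pin)

void setup() {
  pinMode(tiltPin, INPUT);               // the example uses an external pull-down resistor
  lcd.begin(16, 2);                      // 16x2 character LCD
  lcd.print("Hourglass test");
}

void loop() {
  lcd.setCursor(0, 1);
  if (digitalRead(tiltPin) == HIGH) {    // the sensor only gives 0 or 1
    lcd.print("Flipped ");
  } else {
    lcd.print("Upright ");
  }
  delay(100);
}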

Halloween Photobooth

Outside of class, I worked with a group on this year’s Halloween party photo booth. This was a big learning experience for me, with a very tight deadline. Most of the technical work was done by Alexandra from 2nd year, and I learned a lot just by watching her work, planning the project, troubleshooting and improvising.
It was the first time I was part of a physical project that uses technology and is used by a large group of people. It was an interesting lesson, which is why I decided to blog about it.

The initial idea was to make a “PhotoBoo!”. The user would walk into a very dark space and an extremely bright light would hit them in the face. The photo would be taken with a slight delay, when the person is frightened/shocked/angry.

A few questions we had to answer along the way:
– What will the user hear or see while they are being frightened?
– If there is a delay, how will the camera see anything?
– Should we use a regular webcam or a 5D?
– How do we trigger a flash?
– Or maybe we should trigger clip lights using a PowerSwitch Tail?
– How do we avoid getting too many photos, with the sensor triggered by any human presence?
– How do we build the dark booth?
– How do we keep drunk people from tripping when they walk in the dark?

The whole project was put together by Alexandra using Max. All the different parts were connected by the program – the Arduino, the camera, and the Dropbox folder with the photos.
We ended up using a 5D camera with this part (strangely called a female hot shoe) to trigger the flash.
The flash didn’t work perfectly, and until the last minute we weren’t really sure if it was going to work.
The booth was built in room 50. We used dark curtains and created a fairly big space that could contain a group of people. The way to the booth was lined with whiteboards, and the entrance to the booth faced the sensor so that we wouldn’t get side shots of people. We used a projector and a disco ball to get this effect in the space.

We were troubleshooting until after the last minute, when people were already there, and we didn’t even document properly, but I still think we can take pride in what was achieved within ~3 days.

The photos (the frames were a last-minute improvisation that I deeply regret):

PhotoBoo from ACoym on Vimeo.

The code for the flash (courtesy of Laura)

#define CAMERA_FLASH_PIN 8   // pin wired to the hot shoe adapter
#define BUTTON_PIN 2         // trigger button

void setup()
{
  pinMode(BUTTON_PIN, INPUT);
  pinMode(CAMERA_FLASH_PIN, OUTPUT);
  digitalWrite(CAMERA_FLASH_PIN, LOW);
  Serial.begin(9600); // open serial
  Serial.println("Press the button to trigger the flash");
}

void loop()
{
  if (digitalRead(BUTTON_PIN) == HIGH) {
    // pulse the flash pin for 100 ms
    digitalWrite(CAMERA_FLASH_PIN, HIGH);
    delay(100);
    digitalWrite(CAMERA_FLASH_PIN, LOW);
  }
  else {
    digitalWrite(CAMERA_FLASH_PIN, LOW);
  }
}

Heartalarm – pcomp midterm

Heartalarm

The initial idea for this project came from living in a bad part of Bushwick that I was scared to walk through at night. I had the idea of making a self-defence device/alarm that responds to your level of stress and anxiety. Looking for the right data to trigger this alarm, I arrived at the pulse sensor. I decided to make a basic alarm that would be triggered when the pulse goes above a certain level. The long-term plan was to have the small, wearable device connect to Wi-Fi and report any irregularities online to your spouse/parent/person of your choice. Feedback I got from the class while presenting my idea:

  • There should be another form of input to trigger the alarm so that it doesn’t go off for the wrong reason. I should look into combining more sensors, like distance and light, so that: if it is dark + someone is getting closer to you + your heart rate is very high > then the alarm will go off (a rough sketch of that condition follows this list).
  • There should be some form of approval from the user that there is indeed a need for help.
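A rough sketch of that combined condition, assuming a pulse sensor on A0, a photocell on A1 and an analog distance sensor on A2; the pins and thresholds are made up, and the alarm sound itself would be played from Processing, which listens for the serial message (see Failures below).

const int pulsePin = A0;     // pulse sensor signal (assumed pin)
const int lightPin = A1;     // photocell: a low reading means it is dark
const int distPin  = A2;     // analog distance sensor: a high reading means something is close

const int PULSE_THRESHOLD = 600;   // placeholder values
const int DARK_THRESHOLD  = 200;
const int CLOSE_THRESHOLD = 500;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool highPulse = analogRead(pulsePin) > PULSE_THRESHOLD;
  bool isDark    = analogRead(lightPin) < DARK_THRESHOLD;
  bool isClose   = analogRead(distPin)  > CLOSE_THRESHOLD;

  // only raise the alarm when all three conditions agree
  if (highPulse && isDark && isClose) {
    Serial.println("ALARM");   // Processing picks this up and plays the sound
  }
  delay(20);
}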

Failures

  • Failure to work with a piezo – The code that came with the sensor used an interrupt that overlapped with the Tone library, so the piezo and the sensor couldn’t work at the same time. After trying to troubleshoot this until the last minute, I ended up using Processing for sound.
  • The reading was not entirely precise – I should have worked further to improve the code and normalise the values of the output.
  • Not putting enough work into the interface and user experience – Being too absorbed in getting the basic piezo and sensor to work, I didn’t pay enough attention to the bigger picture, which was the main challenge of this exercise.

Lessons (Hopefully) Learned

  • Plan ahead
  • Get a plan B, and C
  • Think big and try more solutions, experiment.
  • More focus on user interface.

Some Pride

  • Making the project significantly smaller by using a shield, which made it pocket-sized.
  • Learning how to solder.
  • Connecting Arduino and Processing.

Heartalarm

Heartalarm

Pulse Sensor ampd from ziv schneider on Vimeo.

Sketching for mid term

For this class we had to read “Sketching User Experiences: The Workbook”.
I recently participated in Sketch Camp NYC, which took place at ITP, and I suddenly realised how detached my mind and hand have become when it comes to sketching. I used to sketch constantly from high school through undergrad, but it seems this skill has grown somewhat rusty ever since I started working with the computer full time.

So, I opted for a digital sketch of the project and made a note to myself to befriend the pen and pencil once again. I think sketching by hand on paper can really help, not only in getting your idea across but also in figuring out very quickly what it is that you are going to make, which sometimes only becomes clear while visualising. And maybe an idea has many problems that are only revealed by rapidly sketching it. The computer sometimes adds boundaries that make it difficult to convey our initial, raw ideas.

This sketch doesn’t show the product itself. It shows the situation in which it would be used and the sensor options. The idea is to create a self-defence device. I want to create something wearable that would detect stress and sense when you’re in trouble. It would then set off an alarm and send a signal with your location. The first option I thought of was a pressure sensor, but my thought was: what if you are in such a position that you can’t even press it? Looking for something more discreet that would take no effort to operate, I found heart rate an interesting piece of data to use.

When showing the sketch in class and getting feedback, an interesting point made by Kate was: maybe there should be another step of approval in order for the alarm to be triggered.
A button pressed to say “This is not false, I am really in some deep shit”.

Quahog Main Street-2

Public space interaction observation

I hope that choosing the ‘Tisch’ building elevator isn’t the biggest cliché after the ATM.

Analysing its level of interactivity (as Crawford said, the interactivity of things can be measured on a scale rather than by a yes/no test), I would put the elevator somewhere around medium on that scale. It receives an input from humans and acts upon it, but not in a very sophisticated way. The human chooses to go up or down, and then chooses a floor. The elevator receives the choice, processes it and goes to the specified floor. It gets directions from the human but does not conduct a dialogue with them. The “thought” is not very complex: you tell it what to do and it does it.

What interests me the most in elevators is the feedback, a crucial aspect that didn’t receive proper treatment in this specific elevator.

EXHIBIT A

How fast can you tell which of these floors is selected? What if you’re in a rush? The lack of contrast between the on/off lights and the rest of the panel leads many people to press the buttons they need twice or more, just in case. I heard that somewhere there is an elevator with an undo function: pressing a button twice cancels the selection. I hope this is not an urban legend, but if that function were used in this elevator it would be a disaster, because the second press would lead to cancellation as well as confusion.

EXHIBIT B

Whoever designed this panel decided to give the up/down arrows and the floor numbers the same graphic treatment. Instead of placing the arrows at a height that reads immediately, eye level, these arrows require lifting your head. Once your head is up, there is also the time it takes to tell what you are looking at.

EXHIBIT C

If you are not on the ground floor, telling whether the elevator goes up or down actually involves peeking into it. I think placing the arrows in a spot that is more visible and noticeable from the outside would be very helpful.

elevator buttons, triangle_floorarrows

Our Fantasy Machine

the Pet's Side

Do you miss your pet? Because we sure do miss ours!
For the first physical computing class assignment, we had to imagine a machine that we would like to exist, and to build a very initial prototype of that machine. Together with Alon and Roy, we made a machine that allows you to pet your cat (or dog) from afar. The furry box would be very welcoming and inviting for the pet. The owner would be able to send out signals to the pet and would also be notified when the pet was in the box.

We were thinking that heat and vibration sensors would be used to transfer the feeling to both sides. The pet would receive heat and vibrations, and the owner would also sense that on their side and would be able to feel when the pet is purring.

owner_side fantasy_machine_2

The real 8-ball dilemma

pessimistic 8ball

pessimistic 8ball

In Chris Crawford’s “The art of interactive design”, he writes about things that are mistakenly perceived as interactive and it made me re-think about different products.

For example, the magic 8 ball, one of my all-time favourite objects and a life companion – is it interactive? When I ask the 8 ball a question, it responds to the energy produced by my hand and replies with an answer. In a way, it is speaking to me; we are having a conversation. But can random text be considered speaking?

The 8 ball does not process my question. The answers are chosen by chance and therefore it only creates the illusion that it’s listening to me. Maybe it’s a bit like the un-listening jerk that Crawford mentions, not a good conversation partner.
On the other hand, it does process some of my input – the physical aspect of it – translating the energy of my hand gesture in order to shuffle the possibilities, and I cannot predict the answer for sure.

Although it does not think and listen in the traditional way in order to process the question we ask, I would say the 8 ball is very good at making us believe that it does.