Smart Cane Final Presentation


My PComp final is a smart cane prototype.  With an integrated accelerometer, Arduino Micro, Bluetooth module, LCD screen, and battery, the PComp-enabled prototype is meant to be used in an industrial design process as a high-resolution play-testing tool: a way to live out potential use cases for a final design in the hope of uncovering unforeseen scenarios.  For the PComp final, the prototype realizes one use case: a fall, either of the user while using the cane or of the cane alone, separate from the user.  The prototype recognizes when it hasn't been upright for three seconds, at which point it tells a p5 sketch to listen for the keyword "help," sent over a separate Bluetooth headset connection and recognized by the p5.Speech library included in the code.  The p5 sketch has ten seconds to hear "help" before the Arduino code idles the "fallen" status, disengaging the p5 sketch from waiting for the keyword.  This logic accommodates the most likely scenario: a false alarm in which the cane falls over, but not its user.  If the p5 sketch does receive the word "help" from the Bluetooth headset, however, it fires off an AJAX call to a localhost URL.  A node server listening in the background waits for that URL to be hit, then initiates a Twilio-enabled function that sends a text-to-speech message over a call to my phone.  The message tells another, hypothetical user that the cane's user has fallen and asked for help, and that they'll be connected to that user over the phone right away.
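The timing described above can be sketched as a small state function (plain JavaScript with illustrative names; the real timing lives in the Arduino sketch further down, where three seconds tipped over opens the listen window and the window closes ten seconds later):

```javascript
// Illustrative sketch of the fall-detection timing, not the actual code.
// msTipped is how long the cane has been away from upright.
function caneStatus(msTipped) {
  if (msTipped < 3000) return "ok";         // not yet treated as a fall
  if (msTipped < 13000) return "listening"; // p5 listens for the keyword "help"
  return "idle";                            // false alarm assumed; stop listening
}
```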





Final assembly:








Smart Cane Print from Kenzo Nakamura on Vimeo.



p5 Code
libraries used: p5.speech (speech recognition), p5.serialport

[code language="javascript"]
var inData;

var listeningActive = false;
var twilioSwitch = false;

var recog = new p5.SpeechRec(); // p5.speech recognition object
recog.continuous = true;
recog.resultString = "nothing";

var serial = new p5.SerialPort(); // make a new instance of the serialport library

function setup() {
 createCanvas(600, 300);
 serial.on('data', serialEvent); // callback for new data coming in
 serial.list(); // list the serial ports
 serial.open("/dev/cu.AdafruitEZ-Link74a7-SPP"); // open the Bluetooth serial port
 recog.onResult = function() { // triggers when speech is recognized
  console.log('resultString: ', recog.resultString);
 };
 recog.onEnd = function() { // restart recognition whenever it stops
  console.log('we ended');
  recog.start();
 };
 recog.start(); // start listening
}

function draw() {
 background(255);
 text("Arduino " + inData, 30, 30);
 text("BT Headset " + recog.resultString, 30, 150);
 text("Twilio Switch " + twilioSwitch, 30, 270);

 if (inData === "Listen") {
  listeningActive = true;
 } else {
  listeningActive = false;
 }
 if (inData === "Flat" || inData === "Idle") {
  recog.resultString = "nothing";
  twilioSwitch = false;
 }
 checkListeningStatus();
}

function serialEvent() {
 inData = serial.readLine();
}

function checkListeningStatus() {
 if (listeningActive) {
  if (recog.resultString.indexOf('help') > -1 && twilioSwitch === false) {
   console.log('detected keyword');
   twilioSwitch = true;
   $.ajax({
    method: 'GET',
    url: '', // local node server URL
    success: function(res) {
     // serial.write("C"); // send one character to the Arduino to
     // show "Calling family and emergency services" on its LCD
    }
   });
  }
 }
}
[/code]

Arduino Code
libraries used: OLED_I2C (128 x 64 OLED library)

[code language="cpp"]
#include <OLED_I2C.h>

#define SDA 2
#define SCL 3

OLED myOLED(SDA, SCL); // 128 x 64 OLED on the pins defined above

extern uint8_t SmallFont[];

int xVal = 0;
int yVal = 0;
int zVal = 0;

bool showHelp = false;
bool started = false;
bool Up = false;

long startTime;
long timeNow;
long timer = 13000;    // full window before the fallen status idles
long helpStart = 3000; // time tipped over before asking if help is needed

void setup() {
 Serial.begin(9600);  // USB serial, to the p5 sketch
 Serial1.begin(9600); // Bluetooth module on the Micro's hardware serial
 myOLED.begin();
 myOLED.setFont(SmallFont);
}

void loop() {
 xVal = analogRead(A2);
 yVal = analogRead(A1);
 zVal = analogRead(A0);

 myOLED.clrScr();

 if ((xVal > yVal) && (yVal > zVal) && (xVal < 360) && (zVal < 310)) { // upright
  Serial.println("Flat");
  Serial1.println("Flat");
  startTime = millis();
  showHelp = false;
  started = false;
  Up = true;
 } else {
  if (started == false && Up == true) { // the cane just tipped over
   startTime = millis();
   started = true;
   Up = false;
  }
  timeNow = millis();

  if (((timeNow - startTime) > helpStart) && ((timeNow - startTime) < timer)) {
   showHelp = true;
   Serial.println("Listen"); // tell the p5 sketch to listen for "help"
   Serial1.println("Listen");
  }
  if ((timeNow - startTime) > timer) { // window passed: assume a false alarm
   showHelp = false;
   started = false;
   Serial.println("Idle");
   Serial1.println("Idle");
  }
 }

 if (showHelp == true) {
  int y = 27;
  int i = 13;
  myOLED.print("Do you need help?", i, y);
 }

 if (Serial.available() > 0) {
  char inByte = Serial.read();
  int y = 27;
  int i = 13;
  if (inByte == 'C') { // p5 confirms the keyword was heard
   myOLED.print("Calling family and emergency services.", i, y);
  }
 }

 myOLED.update();
}
[/code]

node / javascript code:
libraries used: node http, twilio

[code language="javascript"]
var http = require('http');

// accountSid and authToken come from the Twilio console
// (redacted here; use your own credentials)
var accountSid = 'ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX';
var authToken = 'your_auth_token';
var client = require('twilio')(accountSid, authToken);

http.createServer(function (req, res) {
 console.log('server hit');

 // Place the outbound call; Twilio fetches TwiML from `url`
 // and reads the text-to-speech message over the call.
 client.calls.create({
  url: '', // TwiML URL for the call
  to: "+16503150610",
  from: "+16508259667"
 }, function(err, call) {
  if (err) { console.log(err); }
 });

 res.writeHead(200, {'Content-Type': 'text/plain'});
 res.end('calling');
}).listen(1337);

console.log('TwiML servin\' server running on port 1337');
[/code]






Modern & Accessible fashion brands



A sign for Uniqlo is shown on its store, Tuesday, Oct. 11, 2011 in New York. The Japanese retailer is opening a Fifth Avenue flagship store, Friday, Oct. 14. (AP Photo/Mark Lennihan)




Shape theory

Two triangles of the same size sit on opposite ends.


Each makes an extrusion toward the other at an angle; they meet to create a triangle of a larger size.


The bottom intersection of the two triangles creates a point at which four sides meet.  A cross-section of those four sides is used to create another extrusion for the cane shaft.

These primary shapes are then refined with fillets, surface crown, and smooth transitions.






Finals Progress

Final components:
(Handle to be refined & protoboard to be fabricated w/ sensors)



Testing out protoboard (Round 1):




Four generations of circuit boards.


  1. Breadboard (Functional)
  2. Soldered wires (Functional)
  3. Only solder (incomplete)
  4. Copper Tape (Functional)


Now I'm prepared to solder my final circuit board to its sensors & modules.

Modeling the above circuit board in order to create a precise enclosure:


Prototype final enclosure to come soon.

Week 11

  • pivoted the target user from the elderly who need walking aids to a younger generation that needs them
  • made the first cane handle / sensor enclosure
  • collected all the sensors to be used in the smart cane's final scenario
  • started the Arduino sketch

    Considering issues with seniors’ vision and a general sense that the elderly might not want to be bothered with another digital device, I have decided to focus on younger people who still need help walking.  After some research, it seems that fashionable, or simply non-clinical, options are few and far between.  To some, given our era, a digitized accessory might even be welcomed.


Week 10

Final Project Plan:
I am working to integrate physical computing into my existing workflow as an industrial designer.  I’ll be creating a final design for a smart cane whose prototype will be my PComp final.  Though the final design would use an Apple Watch, the prototype will consist of an accelerometer, a microphone (with speech recognition, which I’ve heard is easy to code?), fingerprint recognition, Bluetooth LE, and a (dummy?) LED screen.

11/14 – 1:1 prototypes printed, finalize prototype 3D model, code roughed out
11/21 – first print prototype, code progress
11/28 – final print prototype, code completed
12/5 – polish model, refine code
12/12 – presentation prep

– Cane Head (3D printed plastic enclosure)
– Sensors: accelerometer, fingerprint, microphone
– Bluetooth LE
– Cane Body (Metal)













PCOMP Midterm

After two and a half weeks, Tiri’s and my physical computing midterm project is finally finished!  It was a steady and productive back-and-forth between partners as we dialed in our concept and built toward our designs.  The project started off with my unoriginal idea of building a clapper light (it turns out my body is as resistant to moving as my mind).


As taught by the ITP faculty, we divided our project into components and built them one by one, adding layers of complexity with every step.  First was getting a simple LED to respond to any sound caught by our microphone.  Then came coding in Arduino for that same light to respond to different combinations of sound.  Once that worked, we moved on to integrating a different light source.  The intention was to use an incandescent light bulb in our setup, but we quickly realized that converting DC to AC would have been too complicated for two beginners in a two-week project.  In scrambling for an alternative light source, however, our project took on a different and better personality.  Tiri’s interest in the fine arts had her looking at really interesting light installations.  After taking in the inspiration and assessing the time left and our fabrication capabilities, we decided to build an infinity mirror.  This required that we use Neopixels (actually, a cheaper alternative), and the Neopixels required additional code.


Playing with the light, we noticed a frustrating user experience.  Given the simplicity of our microphone sensor (and perhaps the code it’s paired with), the microphone was picking up unexpected sounds, and often not picking up the claps intended for it.  It became apparent that, for a more satisfying experience of interacting with the light setup, we would need a “clap counter” that would let the user know which sounds were recognized by the sensor.  Enter serial communication to p5.js!  Building off of Synthesis day and essentially reverse-engineering the provided code, we built a GUI that tells the user: 1) how many claps the microphone has registered, 2) the strength of the light, as dictated by the number of claps, and 3) the color of the light, also dictated by the number of claps.
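The three GUI readouts could be sketched roughly like this (the names and scaling formulas here are illustrative assumptions, not our actual code, which read its values over serial from the Arduino):

```javascript
// Illustrative sketch of the GUI state driven by the clap count.
function guiReadout(clapCount) {
  return {
    claps: clapCount,                        // 1) claps the mic has registered
    strength: Math.min(clapCount * 25, 100), // 2) light strength, capped at 100%
    hue: (clapCount * 60) % 360              // 3) light color, cycling around the wheel
  };
}
```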

Mirror Mirror Screenshot Video

The Arduino code changed several times over the course of the project.  When the Neopixels were introduced, we got another layer of control.  The resulting logic for our clapper product: when the first clap is registered, a timer starts.  With every loop, the code checks the current time against the start of the timer.  The code listens for claps until that timer hits three seconds.  For one clap, the code changes the state of the light from on to off and vice versa.  For two claps, the code dims the LEDs.  Three claps brightens them.  Four changes the hue.  When the code hears four claps, it changes the light’s hue immediately rather than waiting for the timer to hit three seconds.  We originally had the least-often-needed functionality (on/off) set to what we thought was the hardest clap combination, four claps, but working on the floor, it became apparent that this was in fact the easiest, most likely combination for the code to hear.  The sensitivity of the mic had it catching many unintended sounds, and had our light turning on and off several times over a few seconds.  We re-engineered our code so that on/off happened with one clap, and the changing of hue with four.  As people walked and talked around our setup, instead of a dizzying strobe, we saw a more pleasing changing of colors.
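The final clap-to-action mapping can be sketched as a simple dispatch (function and action names are illustrative, not from our actual Arduino sketch):

```javascript
// Illustrative sketch of the clap dispatch after the re-engineering:
// one clap toggles, four claps change hue (and fire immediately,
// without waiting out the three-second window).
function clapAction(clapCount) {
  switch (clapCount) {
    case 1: return "toggle";    // on/off, the easiest combination to hear
    case 2: return "dim";       // lower LED brightness
    case 3: return "brighten";  // raise LED brightness
    case 4: return "changeHue"; // cycle the color
    default: return "none";
  }
}
```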


Mid Term Progress

Tiri and I have decided to build a clapper light.  But because a simple on/off interaction at the event of a single clap lacks a rich enough conversation, we are discussing different ways to add layers to our project.  Perhaps 1, 2, or 3 claps takes the light to low, medium, or high.  Or perhaps 1, 2, and 3 claps dim the light, turn it up, and turn it on/off, respectively.  Maybe our p5 element could show our sound and light levels, even a log of past state changes, and we can make a fancy enclosure?!


[WEEK 7]

Serial Input to p5 Lab:



  • the graph was slow to display the inputs from the sensors
  • the last section of the lab resulted in a sketch that wouldn’t render
  • I’m still a little unclear as to what all the “serial.on(…” functions are doing


Intro to Serial Communications:



Synthesis & HW [WEEK 6]

Synthesis, for me, was not a success story.

It was startlingly awesome to finally connect the digital world to the physical one with the examples given.  And finally figuring out the fundamentals of serial communication, and seeing how it works in reality, cleared up a lot of mental confusion.  But when it came to making our own project work, getting our sensors to speak with our sketch, my partner and I got lost in a sea of code.


It doesn’t help that we decided to use two variables instead of one.  Nor did it help to use dense (albeit simple) code.  But somewhere among the multiple file listings, the serial open, the splitting of the string into two variables, and applying the sensor readings to the right code variables, we tripped up and couldn’t find a solution.

I hope to redo this assignment, this time with a single variable.




Homework assignment:




Tone Lab

PCOMP – Observation [WEEK 4]

[photos are not my own]


When I arrived in New York City a month ago, I sat down to rest on a bench on the sidewalk.  I was taking in the marvel of the Big Apple when I noticed a tall silver monolith standing quietly next to me.  I hadn’t noticed it before I sat down.  I studied it more.  It had signs and symbols and “NYC” printed neatly on all sides.  Did it belong to the city?  I looked closer.  Its proportions were handsome.  Its lines were neat and clever.  The details gave it rhythm.  Great design – too good – no way it was a public work!


I got up from the bench and approached the statue.  Audio – Directions – USB Power – Internet[!] – LinkNYC.  Google search: “LinkNYC”.  LinkNYC is a first-of-its-kind communications network that will replace over 7,500 pay phones across the five boroughs with new structures called Links.  Each Link will provide superfast, free public Wi-Fi, phone calls, device charging and a tablet for access to city services, maps, and directions.


A free Wi-Fi router and computer operated by the city for the people and tourists of NYC?!  How progressive and cool!  Designed by Antenna.  The government contracting a legitimate and well-decorated design studio?!  How informed and tasteful!  Wait – the designer is an alumnus of NYU’s ITP – omg, are you serious, I f****** love this thing!  Will be available soon.  Can’t wait to try it!  I can see it now: happy tourists looking up hip stores with the Link and trendy New Yorkers sitting close by with top-shelf laptops, working on the next big thing.  How cool.  New York City does it again!  Moving out here, going to grad school, paying all this money was totally the right decision!


A month has passed since.  Some Links are actually operating, just not at every location.  And the dream, it turns out, has been (somewhat) far from reality.  These seemingly useful and fool-proof devices have, in some cases, been co-opted by the public in ways unforeseen by the designers and managers of the project.  The biggest problem the Links faced in the real world was prolonged personal use.  (Usually) homeless people would set up camp at these powerful devices, essentially barring the rest of the public from using them.



In extreme cases, homeless men were using the Links and their public web browsers for lewd acts.  Even with the city disabling porn sites, they were finding ways around the filters.  The Links’ web browsers have since been disabled.



Though the Links have mostly been used in the ways they were designed for, it is a shame either that some of humanity can be publicly lewd or that the designers failed to recognize this tendency in the public.  Though the case I’m writing about might not have a salient user-experience point, I think it illuminates part of Norman’s discussion on designing for context and the extremes of the human condition.  A better connection can be made to an excellent class I took, taught by a former senior UX designer at Wikipedia.  The class was called “Designing for Evil,” and it implored designers to consider, before releasing anything to the world, the worst possible ways their designed object could be used: to imagine putting it into the hands of the world’s trolls and only trolls.  An innocent lapse in the design process resulted in an unexpected setback for the Links, but nothing that can’t be iterated on and corrected over time.