Midterm Project – Quantify Sentiment

In ITP, Quantified Self


Social sentiment is a difficult thing to think about in quantifiable terms. Humans sense the emotions of others through inputs such as facial expressions, vocal tone, and body language. However, our brains are not trained to extrapolate quantifiable data from our senses; instead, we learn over time to react intuitively and empathetically to these influences. This project is designed to find exactly what is quantifiable about social sentiment, which will inform the development of a device that projects a visual or auditory response back into the same social setting to encourage positive moods or to serve as therapy.


While our initial experiments will be conducted at specified locations, the idea of quantifying sentiment can easily be applied in the “Quantified Self About Town” context. Placing our sensor hub in various locations (e.g. schools, subway stations, hospitals, bars, etc.) would allow us to harvest data useful to a multitude of different sectors, including urban planning, architecture, and interior design, among others. Another approach could focus on the individual user, letting people remotely view the sentiment of specific locations.


Lit Review

1. The Social Life of Small Urban Spaces, William H. Whyte

William H. Whyte made the seminal film “The Social Life of Small Urban Spaces,” which aims to identify the successes and failures of various urban architectural designs. Whyte focused on the movement of groups of people through urban spaces and the environmental influences on their behavior. His most relevant conclusion from the film is that light has a noticeable effect on the congregation of groups of people in public spaces. – https://archive.org/details/SmallUrbanSpaces


2. R. Murray Schafer

Schafer is a composer and environmentalist whose body of work is concerned mostly with the conservation of natural soundscapes. However, he has written a number of essays which explore the effect that sound — and specifically “noise” — has on human psychology. – https://en.wikipedia.org/wiki/R._Murray_Schafer


3. The Thermometer of Social Relations, Hans IJzerman and Gün R. Semin

An essay which explores the connection between temperature-based metaphors in speech and the actual temperature of various types of social interaction. Social proximity is tied directly to temperature differences, creating the associations with temperature that lead to phrases such as “giving the cold shoulder.” – http://pss.sagepub.com/content/20/10/1214.full.pdf+html



1. The Social Life of Small Urban Spaces. Dir. William H. Whyte. 1980. Film.

2. Schafer, R. Murray. The Tuning of the World. New York: A.A. Knopf, 1977. Print.

3. Semin, Gün R., and Hans IJzerman. “The Thermometer of Social Relations.” Psychological Science. Web. 13 Mar. 2015. <www.academia.edu>. (unpublished, uncorrected proof)



We will gather data from various sensors within a social setting in the hope that it reflects the sentiment of that space. This data collection will happen in conjunction with a qualitative survey, to be taken by randomly selected subjects and designated participants. The survey data will then be correlated with the sensor data in hopes that certain parallels become apparent. Whatever parallels we find will be combined with further research into color psychology and visual/audio stimulus therapy to inform the development of the responsive device.
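As an illustration of what that comparison step could look like, one standard way to check for parallels between survey scores and a sensor metric is the Pearson correlation coefficient. Neither this statistic nor the function name comes from the project itself; it's just a sketch of the analysis:

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch only: Pearson correlation between two equal-length
// series, e.g. averaged survey mood scores and a sensor metric per session.
// Returns a value in [-1, 1]; values near +/-1 suggest a strong linear parallel.
double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    const int n = static_cast<int>(x.size());
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (int i = 0; i < n; ++i) {
        sx  += x[i];        sy  += y[i];
        sxx += x[i] * x[i]; syy += y[i] * y[i];
        sxy += x[i] * y[i];
    }
    double cov = sxy - sx * sy / n;   // scaled covariance
    double vx  = sxx - sx * sx / n;   // scaled variance of x
    double vy  = syy - sy * sy / n;   // scaled variance of y
    return cov / std::sqrt(vx * vy);
}
```

A coefficient alone won't prove causation, but it would at least flag which sensor streams track the survey responses closely enough to merit a closer look.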


Tech and circuit

This project will employ sound sensors, infrared motion sensors, infrared temperature sensors, and piezo vibration sensors. These sensors will send their data to an Arduino hub, which will record it either to an SD card, via an SD shield, or serially to a computer. The data will be visualized in a Processing sketch. The feedback device will receive the data through serial communication and use tri-color LED strips and speakers to generate its response.


[Breadboard diagram: sentiment circuit]




Code and initial visualization

I’ve provided the whole code below, but I also wanted to highlight two particular challenges we overcame with the microphone and the piezo. First, we were surprised when the values from the mic sensor didn’t behave as expected: a silent room displayed amplitude values similar to those of a noisy room. What we realized is that the range between the peaks and the valleys is what determines the amount of noise. The larger the range, the more noise is being recorded. Here is the function we wrote to take the minimum and maximum values and determine the difference.
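The original function isn't reproduced here, but the idea can be sketched as follows (function and variable names are hypothetical; on the Arduino this would run over `analogRead()` samples collected in a timed loop):

```cpp
#include <algorithm>

// Hedged reconstruction of the peak-to-valley approach: scan a window of
// raw mic samples, track the minimum and maximum, and return their
// difference. A quiet room hovers near its baseline (small range), while
// a noisy room swings widely (large range).
int micRange(const int* samples, int count) {
    int lo = samples[0];
    int hi = samples[0];
    for (int i = 1; i < count; ++i) {
        lo = std::min(lo, samples[i]);
        hi = std::max(hi, samples[i]);
    }
    return hi - lo;  // peak-to-peak amplitude of the window
}
```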



Once we resolved that, getting reliable readings from the piezo became the next priority. The problem was that the readings were very erratic and would establish different ranges based on residual vibrations. Our goal was to smooth out these readings by averaging them over a certain number of iterations. Here’s the function we used for that.
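Again the original code isn't shown here, but the smoothing approach it describes can be sketched like this (names are placeholders, not the project's actual identifiers):

```cpp
// Hedged sketch of smoothing by averaging: sum a fixed number of raw
// piezo readings and return their mean, so a single spike from residual
// vibration can't dominate the reported value.
int smoothPiezo(const int* readings, int count) {
    long sum = 0;  // long guards against overflow of 16-bit int sums on AVR Arduinos
    for (int i = 0; i < count; ++i) {
        sum += readings[i];
    }
    return static_cast<int>(sum / count);  // averaged reading
}
```

The trade-off with this kind of running average is responsiveness: a larger window smooths more aggressively but reacts more slowly to a genuine new vibration.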



Finally, here’s our entire code for the midterm version of the project. As you’ll notice, we’re only passing the difference of the audio waves along over serial. We did this because we’ve only created a visualization for the audio so far; we’ll pass the other variables over serial as we progress.
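As a hypothetical stand-in for the serial step, not the project's actual code: sending just the audio difference amounts to writing one value per line, which a Processing sketch could read back with something like `readStringUntil('\n')`. The framing below, including the idea of appending future sensor values as comma-separated fields, is an assumption:

```cpp
#include <sstream>
#include <string>

// Hypothetical serial framing: one audio-difference reading per
// newline-terminated line. Later sensor values (motion, temperature,
// vibration) could be appended as comma-separated fields before the
// newline without breaking the line-based receiver.
std::string frameAudioDifference(int difference) {
    std::ostringstream line;
    line << difference << '\n';  // newline delimits one complete reading
    return line.str();
}
```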



The Processing visualizations are really there to help us make sense of the readings and to provide some visual documentation of them for later use.






The experiment might be biased by our hope for a specific outcome: we will be looking for a particular correlation to support our hypothesis rather than generating a hypothesis from our observations. We also need to find the appropriate context to frame these measurements for an improved chance at meaningful results.


Midterm milestones

3/17: Sensors and hub device prototype, Processing sketch receiving data
3/24: device and visualization beta, data analysis ready for presentation
TBD: feedback device
