Module 3 Activity Research

Weekly Activity Template

Tina Ye


Project 3


Module 3

This module explores the applications we plan to use for our final project: TouchDesigner and Arduino. We did a lot of research on the topic we plan to pursue and ran many tests, such as trying out sensors and building visuals in TouchDesigner. This process is important because it gives us a better understanding of the applications and tools we are using and builds the skills we need to create the final project more smoothly.

Workshop

We created prototypes to complete the challenge; the props included a recipe card, ingredients, and cooking tools to support the apple pie scenario. The smart kitchen helps users by giving them instructions. When the user opens the smart fridge, it shows which ingredients are needed to make an apple pie by displaying the recipe, and it can also show expiry alerts for those ingredients. It is helpful because it communicates with users, offering suggestions and guiding the next steps. This is the apple pie recipe provided by the smart fridge. It lists the simple ingredients needed to make an apple pie; the recipe matters because it gives the main instructions the user follows to start baking. This is the smart cleaning robot, which detects whether the environment is clean and wipes up wet spills in the kitchen. The robot identifies the mess automatically and responds without the user controlling it, which also demonstrates how the smart kitchen supports the user during cooking. This sign is important because it warns the user that the oven is starting to preheat. After the user agrees to make the pie, the oven receives the recipe data from the fridge and begins warming up, which shows how different smart devices can share information.

Activity 1: My Research

Looking at the live chart in the Arduino Serial Plotter, the pulse sensor we used was clearly active when we interacted with it by holding it in our hands. This chart shows the real-time pulse-sensor data while we hold the sensor. The next chart shows the real-time force-sensor data when we apply pressure: the reading sits at its maximum of 1023 when no one is pressing the sensor and drops when someone presses on it. This is the first version of our test of the two sensors together on the Arduino; it streams the data separately, showing real-time force data first and then pulse data once someone holds the pulse sensor. However, we found this approach did not work well for sending data to TouchDesigner, so we switched to a different way of showing the real-time data. We connected the Arduino to both sensors and checked the live values in the Serial Monitor, which helped me confirm whether the sensors were working. The Serial Monitor now shows the real-time data from both sensors at the same time. This is the final version of our Arduino test: it prints the two values from the sensors we are using for this project, and this data is sent to TouchDesigner to detect and trigger the visuals.
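
For reference, here is a minimal host-side sketch of how that combined stream can be read outside the Arduino IDE, assuming the Arduino prints both readings on one comma-separated line; the port name, baud rate, and value order below are assumptions rather than our exact setup.

import serial  # pyserial

PORT = "/dev/ttyUSB0"  # assumed port name; on Windows this would be e.g. "COM3"
BAUD = 9600            # must match the Serial.begin() rate in the Arduino sketch

# Read one "force,pulse" line per loop and print the two values.
with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            force_text, pulse_text = line.split(",")
            force, pulse = int(force_text), int(pulse_text)
        except ValueError:
            continue  # skip partial or malformed lines
        print(f"force: {force:4d}  pulse: {pulse:4d}")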

Activity 2: My Research

We watched video tutorials on YouTube to learn how to let TouchDesigner read the data coming from our sensors through the Arduino. I learned that we should first read the incoming serial data and then use a DAT to operator so TouchDesigner always reads the last updated values. We also learned how to create the visuals by searching YouTube for tutorials on the kinds of visuals we wanted. This image shows how to present the visuals in different ways. This is the visual we created in TouchDesigner: soft patterns drifting around, with the goal of making users feel calm while they watch it. The tutorial taught us how to use TouchDesigner, such as how to change the color and the texture. This is one of the steps we used when changing the texture. This step shows how to add animation to the noise, which is useful for making the visuals move and works well when we want them to be dynamic.
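
As a sketch of the "last updated data" step in TouchDesigner's Python, the snippet below reads the newest line from a Serial DAT, assuming it is named serial1 and stores one incoming "force,pulse" message per row; the operator name is a placeholder, and in our actual network the DAT to operator handles this without any scripting.

# Illustrative TouchDesigner Python, for example placed in a Text DAT.
# Assumes a Serial DAT named 'serial1' holding one "force,pulse" line per row;
# the operator name is a placeholder, not necessarily the one in our network.
def latest_sensor_values():
    serial_dat = op('serial1')
    if serial_dat.numRows == 0:
        return None, None
    last_line = serial_dat[serial_dat.numRows - 1, 0].val
    try:
        force_text, pulse_text = last_line.split(',')
        return int(force_text), int(pulse_text)
    except ValueError:
        return None, None  # ignore partial or malformed lines

force, pulse = latest_sensor_values()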

Additional Research or Workshops

This GIF demonstrates how the force sensor shows real-time data when we hold it and when we leave it alone. This GIF demonstrates how the pulse sensor shows real-time data when we hold it in our hands. This GIF demonstrates how the two sensors interact with users who hold both at the same time; the data still changes when we leave one of the sensors alone, for example holding the pulse sensor while leaving the force sensor untouched. This GIF shows TouchDesigner dynamically detecting the data coming through the sensors.
This GIF shows TouchDesigner dynamically reading the data from the Arduino while we interact with both sensors with our hands. These are the final prototypes we will use for our project: a seat cushion for the force sensor and a wristband for the pulse sensor. This shows how we wear the wristband, with the pulse sensor placed on the inner side so it touches the user and reads data through the wrist. The force sensor is placed inside the seat cushion to detect the force data when the user sits on it. This shows the final setup of our project, with the user resting on the cushion and the pulse sensor held in place by the wristband.



Project 3 Final Prototype

This project continues the research directions of Projects 1 and 2, focusing on how real-time environmental data can influence visual output to help users relax.

In this stage, a force sensor and a pulse sensor replace the light sensor from Project 2 to activate and shape the media experience, maintaining our goal of designing a device that helps alleviate stress in modern life.

The prototype for Project 3 uses a force sensor and a pulse sensor connected to an Arduino to detect changes in pressure and user heart rate. When the user leans on or sits against the cushion, the sensor values change. These changes trigger TouchDesigner to display predefined visuals through a projector. Depending on the user’s heart rate, the system presents either fast and intense geometric animations or calm and soothing geometric motions. It can serve as a reminder device or assist users in calming down, meditating, or falling asleep. The prototype consists of a cushion with a built-in force-sensor circuit and a wristband with an integrated pulse sensor, requiring only that the user lean on the cushion and wear the band to activate the system.
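
As a rough sketch of that mapping, here is how the heart-rate value could choose between the two visual modes, assuming the pulse reading has already been converted to beats per minute; the 80 BPM threshold, the speed values, and the operator name in the final comment are placeholders rather than final design choices.

CALM_SPEED = 0.2      # slow, soothing motion
INTENSE_SPEED = 1.5   # fast, energetic motion
BPM_THRESHOLD = 80    # placeholder cut-off between the two modes

def animation_speed(bpm):
    # Return the animation speed for the current heart rate.
    return INTENSE_SPEED if bpm >= BPM_THRESHOLD else CALM_SPEED

# In TouchDesigner this value could drive an operator parameter, for example:
# op('noise1').par.period = 1.0 / animation_speed(bpm)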

Through this process, we explored how to connect multiple sensors and transform incoming external data into responsive media behaviors. By experimenting with particle animations, effects, and interaction mapping, we were able to convert sensory input into distinct visual experiences that support users in self-assessment and relaxation.

