Week 5 blog updates

Lilly Moore

2/6/25

Rob Duarte 

ART 4925

    This week we went over using Arduino and TouchDesigner. On Tuesday we used our toolkit, an Arduino, and TouchDesigner to read a potentiometer. Our input was turning the knob, and our output was different coordinates in the TouchDesigner software. Then we were sent home to complete the TouchDesigner tutorial, which was quite easy and straightforward. 
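The potentiometer exercise boils down to scaling a raw analog reading into a normalized number TouchDesigner can map onto coordinates. Here is a small sketch of that scaling step in plain Python (my own illustration, not class code; it assumes the Arduino prints one 10-bit reading per serial line):

```python
# Sketch (assumption: Arduino prints one 10-bit ADC reading per line,
# e.g. "512\n"). Normalize it to 0.0-1.0, the kind of value
# TouchDesigner then maps onto on-screen coordinates.

def normalize_pot(raw_line, adc_max=1023):
    """Parse one serial line into a 0.0-1.0 float."""
    value = int(raw_line.strip())
    # Clamp in case of a glitchy reading, then scale by the ADC range.
    value = max(0, min(adc_max, value))
    return value / adc_max

print(normalize_pot("0\n"))     # knob fully down
print(normalize_pot("512\n"))   # roughly mid-turn
print(normalize_pot("1023\n"))  # knob fully up
```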

    Thursday we started thinking more about our project. Through my research I discovered that it is not strictly necessary to have the Xbox Kinect motion detector to complete this project; a webcam would apparently work. My only concern, and Rob agreed, is that there won't be enough light in the room to detect the figure except where the projector is somewhat shining. I think it is worth a shot to see if it works so I wouldn't have to go check out the extra piece of equipment. If not, I will try the Kinect, or use an AI camera that Rob recommended. My next steps are to figure everything out in TouchDesigner and code (Python is recommended), then check out the projector and find a good room and setup for the interactive experience. 

    This coming week my focus is to complete the TouchDesigner tutorial Rob is assigning us. I still feel inexperienced with the software, and in order to complete my first project it is necessary to have a better understanding. 

    This is my conversation with ChatGPT to prepare for the project:

Starting this spotlight tracking effect in TouchDesigner involves using real-time body tracking and projection mapping. Here’s how to begin:


Step 1: Set Up the Camera Input

  1. Add a Video Device In TOP

    • This brings in your webcam or depth camera feed.
    • Connect it to a Level TOP to adjust brightness/contrast if needed.
  2. Use a Background Subtraction or Blob Tracking Method

    • If using a regular webcam, use the Difference TOP (to compare against an empty background).
    • If using PoseNet (body tracking):
      • Add a Kantan Mapper or Blob Track TOP for detecting movement.
      • You can also use a CHOP Execute DAT to extract movement coordinates.
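To make sure I understood the background-subtraction idea behind the Difference TOP, here is a rough plain-Python sketch of it (my own illustration; frames are just nested lists of grayscale values, and the threshold is a made-up number):

```python
# Rough illustration of what the Difference TOP does: compare the live
# frame against a stored empty-background frame and keep only the pixels
# that changed more than a threshold. Frames here are nested lists of
# grayscale values (0-255); the threshold of 30 is an arbitrary choice.

def difference_mask(background, frame, threshold=30):
    """Return a 0/1 mask marking pixels that differ from the background."""
    return [
        [1 if abs(f - b) > threshold else 0 for f, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

bg    = [[10, 10, 10], [10, 10, 10]]
frame = [[10, 200, 10], [10, 210, 10]]   # a bright figure in the middle
print(difference_mask(bg, frame))        # [[0, 1, 0], [0, 1, 0]]
```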

Step 2: Extract Position Data

  1. Use a Blob Track TOP (for simple tracking)

    • Detect the largest moving shape (the person).
    • This outputs X and Y coordinates in the scene.
  2. Use PoseNet (for precise tracking)

    • Install TouchDesigner PoseNet COMP from the Palette (for TensorFlow-based body tracking).
    • Extract the center body coordinates (e.g., neck, torso).
    • Convert the coordinates into a CHOP.
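The position-extraction step in either approach comes down to turning detected pixels into one X/Y point. A minimal sketch of that idea in plain Python (mine, not TouchDesigner code): take a 0/1 motion mask and compute the centroid of the "on" pixels, roughly the coordinate a tracker would report for the moving shape.

```python
# Sketch of the position-extraction step: given a 0/1 motion mask,
# find the centroid of the "on" pixels -- roughly the X/Y a blob
# tracker would report for the moving shape.

def blob_centroid(mask):
    """Return (x, y) centroid of all 1-pixels, or None if the mask is empty."""
    xs, ys = [], []
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

print(blob_centroid([[0, 1, 0], [0, 1, 0]]))  # (1.0, 0.5)
```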

Step 3: Create the Spotlight Effect

  1. Add a Circle TOP

    • This will be the spotlight.
    • Apply a Blur TOP to make it soft and glowing.
  2. Control the Position with CHOPs

    • Connect the tracked X/Y position CHOP to a Transform TOP (which moves the circle based on the body).
    • Use a Math CHOP to scale the values to match the projector resolution.
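The Math CHOP scaling step is just a linear remap from the tracker's normalized 0-1 range to projector pixels. A tiny sketch of the arithmetic (my own; the 1920x1080 resolution is an assumption):

```python
# My own sketch of the Math CHOP step: scale normalized 0-1 tracking
# coordinates up to projector pixels (1920x1080 assumed here).

def to_projector(nx, ny, width=1920, height=1080):
    """Map normalized (nx, ny) in 0-1 to projector pixel coordinates."""
    return (nx * width, ny * height)

print(to_projector(0.5, 0.5))  # (960.0, 540.0) -- center of the screen
```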

Step 4: Output to the Projector

  1. Add a Window COMP

    • Set the resolution to match the projector.
    • Drag the final spotlight TOP output into this Window COMP.
  2. Calibrate the Projector Mapping

    • Use a Kantan Mapper or Corner Pin TOP to align the projection to match real-world movement.

Extra Enhancements

✔ Add Smooth Motion – Use a Lag CHOP to create a natural following effect.
✔ Adjust Spotlight Size – Scale the Circle TOP dynamically based on movement speed.
✔ Multiple People? – Use multiple Blob Track or PoseNet instances.
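The Lag CHOP smoothing mentioned above is essentially exponential easing: each frame, move a fraction of the way toward the new tracked position instead of jumping straight there. A small sketch of that idea (mine; the 0.2 smoothing factor is an arbitrary choice):

```python
# Sketch of the Lag CHOP idea: instead of snapping to the new tracked
# position, ease toward it a fraction at a time so the spotlight follows
# the person with a natural trailing motion.

def lag(current, target, smoothing=0.2):
    """Move a fraction of the way from current toward target each frame."""
    return current + smoothing * (target - current)

x = 0.0
for _ in range(5):          # five frames chasing a target at 1.0
    x = lag(x, 1.0)
print(round(x, 3))          # 0.672 -- most of the way there, no sudden jump
```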


Would you like to track movement using PoseNet (AI-based tracking) or keep it simple with blob tracking?





