mySynth;

Processing

Design Brief

Create a music-making system using Arduino to connect triggers and controls and Processing to play sounds. You will need to use the Sound library for Processing. Start with the mode - will this be used for live performance or pre-recorded music? How will you make the noise - by triggering sampled audio files or by generating synthetic sounds yourself? What kind of visuals will accompany your music? Consider the interface: Which sensors will you use and which parameters will they control? How will the controls be laid out? Make an enclosure out of cardboard to house your Arduino and attach your triggers and controls to it.

  • Create a physical interface to control your music machine

  • Create a software program to obey your interface

  • Create visuals to accompany your music

Software:

The Inspo

Music and visuals have always been closely tied: both stimulate the senses, evoke emotions, and create unique experiences. This is how I built an interactive system that connects sound synthesis with dynamic visuals in Processing. By pairing custom sound waves with corresponding visual effects, I wanted to merge auditory and visual art forms. Through this project, I explored the process of combining code, music, and real-time interaction to create a synesthetic experience.

Drawing inspiration from generative art and sound synthesis, I envisioned a system where the user could play notes on a virtual synth and see corresponding visual elements that react in real-time. I wanted the experience to be immersive, where the user’s interaction directly influenced the visuals and sound. This led to the development of my custom synth and the accompanying visual effects in Processing.

The Process

Building The Synth:

To create the musical aspect of this project, I began by using Processing’s sound library. I wanted to start simple—working with basic waveforms like sine and square waves to generate different tones. Each note was tied to a specific frequency (inspired by the notes of a piano, from C4 to C5). This simple approach allowed me to focus on sound synthesis, while still leaving room for exploration.

Here’s a basic snippet of how I defined the frequencies for each note:

// Frequency values for musical notes (C4 to C5)
float[] notes = {261.63, 277.18, 293.66, 311.13, 329.63, 349.23, 369.99, 392.00, 415.30, 440.0, 466.16, 493.88, 523.25};
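These values aren't arbitrary: they follow twelve-tone equal temperament, where each semitone multiplies the frequency by 2^(1/12), anchored at A4 = 440 Hz. As a quick sanity check (plain Java here rather than a Processing sketch; the class and method names are my own, for illustration), the whole table can be regenerated from that formula:

```java
public class NoteFreqs {
    // Twelve-tone equal temperament: f(n) = 440 * 2^((n - 69) / 12),
    // where n is the MIDI note number (60 = C4, 69 = A4, 72 = C5).
    static double midiToFreq(int midi) {
        return 440.0 * Math.pow(2.0, (midi - 69) / 12.0);
    }

    public static void main(String[] args) {
        // Regenerate the 13-entry notes[] table from C4 (MIDI 60) to C5 (MIDI 72)
        for (int midi = 60; midi <= 72; midi++) {
            System.out.printf("%.2f%n", midiToFreq(midi));
        }
    }
}
```

Printing with two decimal places reproduces the hard-coded array exactly, 261.63 through 523.25.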

To add dynamics to the sound, I incorporated an envelope that controlled how the sound evolved over time—specifically, the attack, sustain, and release phases. This created a more organic sound when each note was played, providing a richer auditory experience. Here’s how I set up the envelope:

Env env;
float attackTime = 0.001;
float sustainTime = 0.004;
float sustainLevel = 0.3;
float releaseTime = 2.4;
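Processing's Env shapes each note's amplitude through those stages when env.play() is called. As a rough mental model of what the parameters do (my own linear approximation for intuition; the library's actual curve shapes may differ), the amplitude over time looks like this piecewise function:

```java
public class EnvSketch {
    // Rough linear model of an attack-sustain-release envelope:
    // ramp 0 -> peak over attackTime, hold sustainLevel for sustainTime,
    // then decay linearly to 0 over releaseTime. Approximation only;
    // Processing's Env may use different internal curves.
    static double amplitude(double t, double attack, double sustain,
                            double sustainLevel, double release, double peak) {
        if (t < 0) return 0;
        if (t < attack) return peak * (t / attack);               // attack ramp
        t -= attack;
        if (t < sustain) return sustainLevel;                     // sustain hold
        t -= sustain;
        if (t < release) return sustainLevel * (1 - t / release); // release decay
        return 0;                                                 // note finished
    }

    public static void main(String[] args) {
        double a = 0.001, s = 0.004, lvl = 0.3, r = 2.4;
        System.out.println(amplitude(0.0005, a, s, lvl, r, 1.0)); // mid-attack
        System.out.println(amplitude(0.003,  a, s, lvl, r, 1.0)); // during sustain
        System.out.println(amplitude(3.0,    a, s, lvl, r, 1.0)); // after release
    }
}
```

With the values above, the attack is nearly instantaneous (1 ms) and the sustain very short, so what you mostly hear is the long 2.4-second release tail that gives each note its lingering quality.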

I also included two oscillators, a sine wave and a square wave, to create different sounds:

SinOsc sineWave;
SqrOsc sqrWave;

void setup() {
  sineWave = new SinOsc(this);
  sqrWave = new SqrOsc(this);
}

Linking Sound with Visuals:

The real challenge came when I tried to create visuals that responded to the music. I decided to use colorful, animated circles to represent each note. The visual elements would be reactive—changing in size, color, and behavior based on the frequency of the note being played.

Here’s how I set up the visualization logic:

float hue = map(noteFreq, 100, 5000, 0, TWO_PI); 
VisualPoint vp = new VisualPoint(x, y, noteFreq);

I wanted the visuals to reflect the tone and character of the music, so I mapped the frequency of each note to a hue on the HSB color wheel (Processing's HSB color mode). Lower frequencies (like C4) mapped to one part of the color wheel (e.g., red), while higher frequencies (like C5) mapped to another (e.g., blue or purple). This gave each note a distinct visual representation.
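Processing's map() is plain linear interpolation, and writing it out exposes a quirk: against the 100-5000 Hz range, the C4-C5 octave (roughly 262-523 Hz) occupies only a narrow arc of the color wheel, which may be why the final version of the code maps by note index instead. A standalone Java version of the same calculation (map() reimplemented here, since it's a Processing built-in):

```java
public class HueMap {
    // Equivalent of Processing's map(value, start1, stop1, start2, stop2):
    // linear interpolation from one range to another.
    static double map(double value, double start1, double stop1,
                      double start2, double stop2) {
        return start2 + (stop2 - start2) * (value - start1) / (stop1 - start1);
    }

    public static void main(String[] args) {
        double TWO_PI = 2 * Math.PI;
        // With the sketch's 100-5000 Hz input range, both ends of the
        // octave land near the start of the hue wheel:
        System.out.printf("C4 hue: %.3f%n", map(261.63, 100, 5000, 0, TWO_PI));
        System.out.printf("C5 hue: %.3f%n", map(523.25, 100, 5000, 0, TWO_PI));
    }
}
```

Both hues come out below 0.55 radians out of a full 6.28, so all thirteen notes cluster in the red-orange region; mapping the note index 0..12 across 0..TWO_PI spreads them over the whole wheel.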

Here's the VisualPoint class that handles the display of the circles, with color based on the note's frequency:

class VisualPoint {
  PVector position;
  float hue;
  float size;
  float lifeSpan;
  
  VisualPoint(float x, float y, float freq) {
    position = new PVector(x, y);
    hue = map(freq, 100, 5000, 0, TWO_PI);  
    size = random(10, 30);  
    lifeSpan = 1.0;  
  }
  
  void update() {
    size += random(-1, 1);  
    lifeSpan -= 0.01; 
  }
  
  void display() {
    fill(hue, 0.8, 0.8, lifeSpan);
    noStroke();
    ellipse(position.x, position.y, size, size);
  }

  boolean isFinished() {
    return lifeSpan <= 0; 
  }
}

Adding Depth and Interaction:

Once the basic synthesis and visuals were in place, I began refining the experience. I added a low-pass filter whose cutoff frequency tracked the user's mouse position, adding a layer of interactivity that let users shape the tone and character of the music.

This was implemented by mapping the mouseY position to the filter's cutoff frequency, from 5000 Hz at the top of the window down to 100 Hz at the bottom:

LowPass lowPass;
float LPfreq = 2500;

void setup() {
  lowPass = new LowPass(this);
  lowPass.process(sqrWave);  
}

void draw() {
  if (mousePressed) {
    LPfreq = map(mouseY, 0, height, 5000, 100);
  }
  lowPass.freq(LPfreq); 
}

Additionally, I kept the circles' gradual fade-out but slowed it down, leaving them visible longer so users had more time to engage with the evolving visuals. The change was a single constant:

lifeSpan -= 0.005;  
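That one constant controls how long a circle lives: starting from lifeSpan = 1.0, a decrement of 0.005 per frame keeps it on screen for 200 frames, roughly 3.3 seconds at Processing's default 60 fps, versus 100 frames (about 1.7 s) with the earlier 0.01 step:

```java
public class FadeTime {
    // Frames a circle survives, given a per-frame lifeSpan decrement
    static int framesVisible(double lifeSpan, double decayPerFrame) {
        return (int) Math.ceil(lifeSpan / decayPerFrame);
    }

    public static void main(String[] args) {
        System.out.println(framesVisible(1.0, 0.01));  // original decay: 100 frames
        System.out.println(framesVisible(1.0, 0.005)); // slower fade: 200 frames
        // at 60 fps: ~1.7 s versus ~3.3 s on screen
    }
}
```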

Lastly, here is the code for playing a note and triggering the visuals:

void playNoteWithVisuals(float noteFreq) {
  playNote(noteFreq);  
  
  // Create a burst of visuals at the center
  for (int i = 0; i < 10; i++) {
    float angle = random(TWO_PI);
    float radius = random(50, 200);
    float x = width * 0.5 + cos(angle) * radius;
    float y = height * 0.5 + sin(angle) * radius;
    visualPoints.add(new VisualPoint(x, y, noteFreq));
  }
}

The Final Result = mySynth;

The final result is an interactive system where users can play musical notes on a virtual keyboard (using laptop keys) and see corresponding visuals. Each note is represented by a circle that varies in size and color depending on the frequency of the note played. The visuals are dynamic, with the circles growing, shrinking, and fading based on both the sound and user interaction. The system reacts to both the input (which note is played) and the environment (such as mouse movement), making the experience feel responsive and alive.

FINAL CODE

import processing.sound.*;

// Frequency values for notes (C4 to C5)
float[] notes = {
  261.63, 277.18, 293.66, 311.13, 329.63, 349.23, 369.99, 392.00, 415.30, 440.0, 466.16, 493.88, 523.25
};

// Envelope parameters
Env env;
float attackTime = 0.001;
float sustainTime = 0.004;
float sustainLevel = 0.3;
float releaseTime = 2.4;

// Sound oscillators
SinOsc sineWave;
SqrOsc sqrWave;

// Low pass filter
LowPass lowPass;
float LPfreq = 2500;

// Visuals
ArrayList<VisualPoint> visualPoints = new ArrayList<VisualPoint>();

void setup(){
  size(800, 600);
  
  // Initialize sound components
  sineWave = new SinOsc(this);
  sqrWave = new SqrOsc(this);
  env = new Env(this);
  lowPass = new LowPass(this);
  lowPass.process(sqrWave);

  // Use HSB color mode
  colorMode(HSB, TWO_PI, 1, 1, 1);
}

void draw(){
  background(0, 0, 0);

  // Adjust filter frequency based on mouse position
  LPfreq = map(mouseY, 0, height, 5000, 100);
  lowPass.freq(LPfreq);

  // Update the size of circles based on mouse position
  float sizeAdjustment = map(mouseY, 0, height, 2, 0.5);

  // Update and draw the visual points
  for (int i = visualPoints.size() - 1; i >= 0; i--) {
    VisualPoint vp = visualPoints.get(i);
    vp.update(sizeAdjustment);
    vp.display();
    
    // Remove old visual points to prevent memory overflow
    if (vp.isFinished()) {
      visualPoints.remove(i);
    }
  }
}

void keyReleased(){
  switch(key){
    case 'a': playNoteWithVisuals(0); break;  // C
    case 'w': playNoteWithVisuals(1); break;  // C#
    case 's': playNoteWithVisuals(2); break;  // D
    case 'e': playNoteWithVisuals(3); break;  // D#
    case 'd': playNoteWithVisuals(4); break;  // E
    case 'f': playNoteWithVisuals(5); break;  // F
    case 't': playNoteWithVisuals(6); break;  // F#
    case 'g': playNoteWithVisuals(7); break;  // G
    case 'y': playNoteWithVisuals(8); break;  // G#
    case 'h': playNoteWithVisuals(9); break;  // A
    case 'u': playNoteWithVisuals(10); break; // A#
    case 'j': playNoteWithVisuals(11); break; // B
    case 'k': playNoteWithVisuals(12); break; // C5
  }
}

// Plays the note and triggers visuals
void playNoteWithVisuals(int noteIndex) {
  playNote(noteIndex);

  // Create a burst of visuals at the center
  for (int i = 0; i < 10; i++) {
    float angle = random(TWO_PI);
    float radius = random(50, 200);
    float x = width * 0.5 + cos(angle) * radius;
    float y = height * 0.5 + sin(angle) * radius;
    visualPoints.add(new VisualPoint(x, y, noteIndex));
  }
}

// Plays the same note on the sine and square wave oscillators
void playNote(int noteIndex) {
  float noteFreq = notes[noteIndex];
  sineWave.play(noteFreq, 0.75);
  env.play(sineWave, attackTime, sustainTime, sustainLevel, releaseTime);
  sqrWave.play(noteFreq, 0.5);
  env.play(sqrWave, attackTime, sustainTime, sustainLevel, releaseTime);
}

// Class to handle visual points
class VisualPoint {
  PVector position;
  float hue;
  float size;
  float lifeSpan;

  // Constructor for note visualization (uses note index)
  VisualPoint(float x, float y, int noteIndex) {
    position = new PVector(x, y);
    hue = map(noteIndex, 0, notes.length - 1, 0, TWO_PI); 
    size = random(20, 60); 
    lifeSpan = 1.0;
  }
  
  // Constructor for frequency visualization (uses frequency)
  VisualPoint(float x, float y, float freq) {
    position = new PVector(x, y);
    hue = map(freq, 100, 5000, 0, TWO_PI);  
    size = random(20, 60);  
    lifeSpan = 1.0;
  }
  
  // Adjust size based on mouse position
  void update(float sizeAdjustment) {
    size *= sizeAdjustment;  
    size = constrain(size, 10, 120);  
    lifeSpan -= 0.005;  
  }
  
  void display() {
    fill(hue, 0.8, 0.8, lifeSpan);
    noStroke();
    ellipse(position.x, position.y, size, size);
  }
  
  boolean isFinished() {
    return lifeSpan <= 0;
  }
}