
ICAM Senior Project - VIS 160A
PALM & PETAL
a 3D canvas where nature and space collide
Overview
Palm & Petal is an interactive 3D environment where users cultivate a living landscape using hand gestures. Through real-time hand tracking, participants draw stems in space, trigger blooms, generate wind and vines, release butterflies, and navigate the world, all without touching a device.
Built with p5.js (WEBGL), MediaPipe Hands, and Tone.js, the project blends computer vision, generative graphics, and ambient sound to create a responsive ecosystem. Palm & Petal explores how natural movement can become a creative interface, transforming the body into a tool for shaping digital environments.
Timeline
January - March 2026
Tools
JavaScript, MediaPipe Hands, p5.js (WEBGL mode), Tone.js, HTML + CSS, WebRTC
Why did I choose this as my project?
I've always loved flowers and doodling.
Since I was young, I've been drawn to anything floral and cute. My room was always decorated with flowers, and so were my notebooks, since I also loved to draw.


I also loved playing Quick, Draw! and other doodling games
Quick, Draw! is an AI-powered browser game by Google where players have 20 seconds to doodle a prompted object (e.g., "banana" or "aircraft carrier") while a neural network tries to guess it.
Users can doodle on the screen, and I loved the interactive nature of it.
Inspiration
I was inspired by Torin Blankensmith and his creations in TouchDesigner
Torin is a freelance creative technologist focusing on mixed reality installations and interactive experiences. I originally planned to build this prototype in TouchDesigner, and I looked through his Genuary videos, a series where he created a new interactive visual and audio piece in TouchDesigner every day of January.
My Iterations
Building the visual system
MediaPipe Integration
I integrated MediaPipe Hands which gives you 21 landmark points tracking your hand in real-time. The setup was straightforward - import the library, create a Hands object, point it at the webcam video element, and it starts outputting hand data.
The tricky part was gesture detection. I wrote functions to check finger positions and distances.
Pinch gesture
I check if the thumb tip (landmark 4) and index tip (landmark 8) are very close together (distance < 0.05 in normalized space).
Open Palm
I measure the distance from the wrist (landmark 0) to the middle fingertip (landmark 12). If it's above 0.23, your hand is open.
Peace Sign
I check if the index and middle fingers are extended (their tips above their middle joints on screen, i.e., smaller Y values in MediaPipe's coordinates) while the ring and pinky are curled.
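The three checks above can be sketched as small pure functions over the landmark array. The helper names here are my own, but the landmark indices and thresholds come from the descriptions above (MediaPipe points are normalized to 0–1, with Y increasing downward in the image):

```javascript
// Distance between two normalized landmarks.
function dist2D(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Pinch: thumb tip (4) and index tip (8) nearly touching.
function isPinch(lm) {
  return dist2D(lm[4], lm[8]) < 0.05;
}

// Open palm: wrist (0) far from the middle fingertip (12).
function isOpenPalm(lm) {
  return dist2D(lm[0], lm[12]) > 0.23;
}

// A finger counts as extended when its tip sits above its middle (PIP)
// joint on screen, i.e. has a smaller Y value.
function isExtended(lm, tip, pip) {
  return lm[tip].y < lm[pip].y;
}

// Peace sign: index (8/6) and middle (12/10) extended,
// ring (16/14) and pinky (20/18) curled.
function isPeace(lm) {
  return isExtended(lm, 8, 6) && isExtended(lm, 12, 10) &&
         !isExtended(lm, 16, 14) && !isExtended(lm, 20, 18);
}
```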
Drawing in 2D
I began with a basic 2D canvas in p5.js where you could pinch to place simple flower sprites. At first I was just experimenting with gesture interactions: drag to draw lines, pinch again to place a flower head. However, these flowers felt like static images with no depth or organic feeling.
I played around with different colors and tried various drawing methods, but it felt too disconnected from the natural gesture of arranging flowers. That's when I decided to pivot to a 3D canvas.
3D features
Building the 3D Foundation, Moving to WEBGL
I switched to p5.js WEBGL mode to get proper 3D rendering. I started by creating a simple ground plane - just a flat surface at y=100 where all the flowers would be anchored. At first I was eyeballing the camera distance and position, adjusting the numbers until it felt right. I settled on a camera distance of 760 units and a height of -220 to get that nice overhead "ikebana" viewing angle.
The coordinate system took some trial and error. MediaPipe gives normalized hand coordinates (0-1), so I had to map those to my 3D world space. I started with a smaller range (-420 to 420) but later expanded it to (-900 to 900) to give more drawing freedom when rotating the camera.
Designing Five Authentic Flower Types
I also researched traditional Japanese ikebana and created five distinct flower types - Cherry Blossom, Lotus, Chrysanthemum, Peony, and Tulip. Each one got its own characteristics stored in an object:
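The shape of that per-type object looks roughly like this; the field names and values below are illustrative stand-ins, not the project's real data:

```javascript
// Per-flower-type characteristics (illustrative values).
const FLOWER_TYPES = {
  cherryBlossom: { petals: 5,  petalHue: 340, stemHue: 100, size: 0.8 },
  lotus:         { petals: 12, petalHue: 320, stemHue: 110, size: 1.2 },
  chrysanthemum: { petals: 21, petalHue: 45,  stemHue: 95,  size: 1.0 },
  peony:         { petals: 16, petalHue: 0,   stemHue: 105, size: 1.3 },
  tulip:         { petals: 6,  petalHue: 15,  stemHue: 100, size: 0.9 },
};
```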

Grounding realistic flowers
For the flower geometry, I started simple - just 3D lines for stems and ellipses for petals. I used beginShape() and vertex() to draw the stem path, connecting all the points from where your hand moved. The petals were arranged radially around the flower head using rotation - I'd rotate by TWO_PI divided by the petal count, draw an ellipse, repeat.
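The radial layout boils down to evenly spaced rotation angles. This sketch isolates just that angle math (the p5.js drawing calls themselves are omitted):

```javascript
const TWO_PI = Math.PI * 2;

// One rotation angle per petal: TWO_PI divided by the petal count.
// In the draw loop, each angle becomes rotate(step) followed by an
// ellipse() call for the petal.
function petalAngles(petalCount) {
  const step = TWO_PI / petalCount;
  return Array.from({ length: petalCount }, (_, i) => i * step);
}
```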
The ground plane was easy - just a rotated plane primitive positioned at y=100. I added a subtle grid later by drawing intersecting lines to help with depth perception.

Early example of the 3D plane and drawn flowers
Adding Visual Polish, A Starry Background
I wanted the garden to be more fantastical. Inspired by the infinite nature of space, I went for a cosmic outer-space vibe and created 300 stars scattered throughout the 3D space. Each star is just a sphere with a random position, size (1-5 units), and hue (full spectrum, 0-360). I gave them a twinkling animation using sin(frameCount) to modulate their alpha, making them pulse gently.
The background itself is just background(0, 0, 5) - almost pure black with a tiny bit of brightness to keep it from being pitch black.
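A minimal sketch of that starfield: 300 stars with random position, size, and hue, plus a sin-based twinkle. The scatter range, field names, and alpha constants here are assumptions; in the sketch the alpha would feed into the star's fill color:

```javascript
// Generate `count` stars scattered through the 3D scene.
function makeStars(count, rand = Math.random) {
  return Array.from({ length: count }, () => ({
    x: (rand() - 0.5) * 2000,
    y: (rand() - 0.5) * 2000,
    z: (rand() - 0.5) * 2000,
    size: 1 + rand() * 4,        // 1-5 units
    hue: rand() * 360,           // full spectrum
    phase: rand() * Math.PI * 2, // desynchronize the twinkle
  }));
}

// Gentle alpha pulse driven by the frame count, as in sin(frameCount).
function twinkleAlpha(star, frameCount) {
  return 60 + 40 * Math.sin(frameCount * 0.05 + star.phase);
}
```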

Updated 3D Canvas in Cosmic Environment
Adding audio
Using Tone.js for interactive and responsive sound
I used Tone.js because it's perfect for generative music. Here's an overview of what I included.
Pad Synth
A polyphonic synthesizer playing a C-Am-F-G chord progression that loops continuously. It has a long attack (1.2s) and release (2.5s) to create that ambient, atmospheric drone.
Blip Synth
A simple triangle wave synth with a very short envelope (0.01s attack, 0.08s decay) for gesture feedback. When you draw it plays C5, when you plant it plays E5, when you erase it plays A4.
Lowpass Filter
A filter that sweeps when you bloom flowers - it ramps from 950Hz up to 2600Hz over 0.08 seconds, creating that satisfying brightness, then slowly returns to normal. It gives tactile audio feedback for the gesture.
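The musical data behind those synths can be sketched in plain JavaScript. The gesture notes come from above; the chord voicings are my assumption of how a C-Am-F-G loop might be spelled out (the actual playback would go through Tone.js's PolySynth with the envelope values described):

```javascript
// Looping pad progression: C - Am - F - G (voicings are illustrative).
const PAD_PROGRESSION = [
  ['C4', 'E4', 'G4'], // C
  ['A3', 'C4', 'E4'], // Am
  ['F3', 'A3', 'C4'], // F
  ['G3', 'B3', 'D4'], // G
];

// Blip-synth feedback notes per gesture.
const GESTURE_NOTES = { draw: 'C5', plant: 'E5', erase: 'A4' };

// Which chord the looping pad plays at a given bar index.
function padChordAt(bar) {
  return PAD_PROGRESSION[bar % PAD_PROGRESSION.length];
}
```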
Making it Multiplayer
Growing your garden with others, PeerJS for Real-Time Sync
I wanted people to be able to collaborate, so I added PeerJS which handles all the WebRTC complexity. Here's how it works:
Creating a room
When you click "Create Room", I generate a random 6-letter code (like "A3F9K2"), create a Peer with the ID garden-${code}, and wait for connections.
Joining a Room
When you enter a code, I create a new Peer with a random ID and connect to the host's garden-${code} ID. PeerJS handles all the signaling and NAT traversal.
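The room-code scheme is simple enough to sketch in a few lines. The character set below is an assumption (the example code "A3F9K2" suggests letters and digits); only the 6-character length and the garden-${code} ID come from the description above:

```javascript
// Alphanumeric set for room codes (an assumption; chosen to avoid
// easily confused characters like O/0 and I/1).
const CODE_CHARS = 'ABCDEFGHJKLMNPQRSTUVWXYZ23456789';

// Generate a random 6-character room code like "A3F9K2".
function makeRoomCode(rand = Math.random) {
  let code = '';
  for (let i = 0; i < 6; i++) {
    code += CODE_CHARS[Math.floor(rand() * CODE_CHARS.length)];
  }
  return code;
}

// The PeerJS ID the host registers under; joiners connect to this ID.
function hostPeerId(code) {
  return `garden-${code}`;
}
```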
Data synchronization on a multiplayer environment
When you plant a flower, I broadcast it to all connected peers. I serialize only what's needed - the path points, colors, petal count, and type. On the receiving end, I deserialize this data and create a new Flower object with those properties.
When someone erases, I broadcast just the flower ID. Everyone deletes the matching flower from their garden array.
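The sync logic above can be sketched as three small pure functions. Field names are assumptions about the project's Flower shape; the idea is sending only what a peer needs to rebuild or delete a flower:

```javascript
// Reduce a flower to the fields worth sending over the wire.
function serializeFlower(flower) {
  return {
    id: flower.id,
    type: flower.type,
    path: flower.path,           // array of {x, y, z} stem points
    petalCount: flower.petalCount,
    petalHue: flower.petalHue,
  };
}

// On the receiving end this would construct the project's Flower class;
// here we just rebuild a plain object with the same shape.
function deserializeFlower(data) {
  return { ...data };
}

// Erase messages carry only the flower ID; each peer filters its own array.
function applyErase(garden, id) {
  return garden.filter((f) => f.id !== id);
}
```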

The Final Garden, Palm & Petal
A multiplayer ikebana canvas
A multiplayer 3D ikebana garden where you use natural hand gestures to plant authentic Japanese flowers in a cosmic starry space. Up to 6 people can collaborate in real-time, each adding their own flowers to create a shared living artwork. The responsive drawing system makes it feel like you're physically arranging flowers, and the audio responds to every gesture, creating an immersive, meditative experience.
Thanks for swinging by...
Let’s build something together!





