GILGA Test Project #1

Reimagining the
Live Visual Performance
Workflow

How my frustration with existing tools led me to build something new from the ground up. Submitted for the GILGA Intern position.

Sagi Kahane-Rapport · AFI MFA Directing Fellow · March 2026
Two industry-standard tools,
neither built for the job.

As a hobby, I do live visual performance (VJing). It's a lot of fun, and it lets me connect with artists across mediums I don't usually touch. When I'm performing alongside a DJ, the visuals need to feel alive — reactive to the beat, the energy, the moment. But two of the most dominant tools, Resolume and TouchDesigner, each fail in a different, critical way.

Resolume

Industry-standard VJ software. Pre-rendered clip playback with basic effects. It looks polished out of the box, but I hit a wall fast: no custom shaders, limited audio reactivity, and no way to generate visuals procedurally. I could generate new ideas on the fly, but only within the boxes someone else had already made.

Ceiling too low

TouchDesigner

Incredibly powerful node-based environment. It can do anything — in theory. In practice, building a performance-ready system took me weeks of patching, and tweaking a parameter mid-set meant navigating a spaghetti graph of nodes. Even after all that prep, it still wasn't the live, reactive performance experience I was looking for.

Floor too high
Me. And everyone else in the room.

It's cool to go to a large-scale live concert and watch a thoroughly rehearsed, carefully prepared show that follows a specific setlist. But when musicians are just jamming, or when artists are creating on the fly, pre-rendered visuals don't cut it.

🎬

Me (and other VJs)

I was forced to choose between creative control (TouchDesigner) and live reliability (Resolume). Worst of all, I couldn't iterate on the fly and let my creativity run free.

🎧

The DJs I work with

They want visuals that feel like part of their music, not a separate show running alongside it. Right now they have to over-communicate cues or just accept generic looks.

🎉

The audience

They might not know what's wrong, but they feel it. When the visuals can't keep up with the music, the show feels like a screensaver behind a DJ — not the integrated experience they came for.

So I built it myself.

Lightbridge is a VJ performance system I designed and coded from scratch. I wanted the creative depth of TouchDesigner with the live reliability of Resolume — in a single app purpose-built for performing alongside a DJ. So that's what I made.

What started as a basic chain with a single ring source has grown into a full performance system — 40 sources, 32 effects, projection mapping, MIDI control, and beat-synced modulation.

Lightbridge v0.1.0 — early prototype

Early prototype — one source, basic parameters, no effects chain

Lightbridge v0.4.0 — current state

Current build — 3D helix source, stacked effects, dual preview, color palette

40 Visual Sources · 32 Real-time Effects · 120 FPS Target · 4K Output Resolution

🎵 6-Band Audio Analysis

Real-time FFT splits audio into sub-bass, kick, low-mid, mid, presence, and brilliance bands. Every visual parameter can be driven by any band.
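The band split can be sketched like this: given magnitude data from an FFT (e.g. a Web Audio AnalyserNode), each band is just the mean of the bins inside its frequency range. The band edges in Hz here are my own guesses at sensible boundaries, not Lightbridge's actual values.

```typescript
// Six bands, named as in the text; the Hz edges are illustrative assumptions.
const BANDS: [string, number, number][] = [
  ["sub-bass", 20, 60],
  ["kick", 60, 120],
  ["low-mid", 120, 500],
  ["mid", 500, 2000],
  ["presence", 2000, 6000],
  ["brilliance", 6000, 20000],
];

// magnitudes: one value per FFT bin (e.g. from AnalyserNode frequency data).
// binHz: width of one bin in Hz, i.e. sampleRate / fftSize.
function bandEnergies(magnitudes: Float32Array, binHz: number): number[] {
  return BANDS.map(([, lo, hi]) => {
    const first = Math.max(0, Math.floor(lo / binHz));
    const last = Math.min(magnitudes.length - 1, Math.floor(hi / binHz));
    let sum = 0;
    for (let i = first; i <= last; i++) sum += magnitudes[i];
    return sum / Math.max(1, last - first + 1); // mean magnitude in the band
  });
}
```

Each of the six values can then feed the modulation engine as a normalized signal.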

⏱ Beat Detection (WASM)

A custom C-to-WebAssembly beat tracker runs in a Web Worker, detecting BPM and beat phase from live audio input — no manual tap tempo needed.
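The actual tracker is C compiled to WebAssembly; as a rough illustration of the underlying idea, here is the classic energy-flux approach in TypeScript: flag a beat when a frame's energy jumps well above the recent average. This is a toy sketch, not Lightbridge's real algorithm or tuning.

```typescript
// Toy energy-based onset detector: compares each hop's energy to a
// short rolling history and fires when it spikes past a threshold.
class OnsetDetector {
  private history: number[] = [];
  constructor(
    private historyLen = 43,  // ~1s of history at ~43 hops/s
    private threshold = 1.5   // energy must exceed 1.5x the local average
  ) {}

  // samples: one hop of mono audio. Returns true if this hop looks like a beat.
  push(samples: Float32Array): boolean {
    let energy = 0;
    for (const s of samples) energy += s * s;
    const avg =
      this.history.reduce((a, b) => a + b, 0) / Math.max(1, this.history.length);
    this.history.push(energy);
    if (this.history.length > this.historyLen) this.history.shift();
    return this.history.length > 1 && energy > this.threshold * avg;
  }
}
```

A real tracker adds inter-onset timing to estimate BPM and phase; running it in a Web Worker keeps the analysis off the render thread.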

🎮 Modulation Engine

Any parameter can be modulated by audio, beat, LFOs, or envelopes. Link a ring's radius to the kick drum. Shift hue on every beat. Stack modulators.
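Stacked modulation reduces to a simple evaluation rule: a parameter's final value is its base plus every attached modulator's contribution, clamped to its range. The names and shapes below are illustrative, not Lightbridge's actual API.

```typescript
// A modulator pairs a normalized signal (audio band, LFO, beat envelope...)
// with a depth. Multiple modulators on one parameter simply sum.
type Modulator = {
  amount: number;                 // depth, -1..1
  source: (t: number) => number;  // normalized signal, 0..1
};

type Param = { base: number; min: number; max: number; mods: Modulator[] };

function evalParam(p: Param, t: number): number {
  const span = p.max - p.min;
  const v = p.mods.reduce(
    (acc, m) => acc + m.amount * m.source(t) * span,
    p.base
  );
  return Math.min(p.max, Math.max(p.min, v)); // clamp to the allowed range
}

// "Link a ring's radius to the kick drum": the kick band drives radius.
const radius: Param = {
  base: 0.2, min: 0, max: 1,
  mods: [{ amount: 0.5, source: () => 0.8 /* stand-in for kick energy */ }],
};
```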

⚙ Reaction & Decay

Adjustable reaction speed and decay curves let you dial in exactly how the visuals breathe with the music — from tight percussive hits to slow ambient swells.
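One common way to get this behavior, which I assume here for illustration, is an asymmetric exponential follower: rises track the input quickly (reaction), falls relax slowly (decay). The per-frame coefficients are placeholders.

```typescript
// Asymmetric smoothing: a fast coefficient on the way up, a slow one on
// the way down, so percussive hits land tight and tails breathe out.
function makeFollower(reaction: number, decay: number) {
  let value = 0;
  // input: raw band energy 0..1; returns the smoothed value.
  return (input: number): number => {
    const k = input > value ? reaction : decay;
    value += (input - value) * k;
    return value;
  };
}
```

Setting decay near 1 gives tight percussive response; setting it near 0 gives slow ambient swells.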

🔗 Flat Accumulator Model

Sources and effects live in a single ordered chain. Sources composite onto an accumulator, effects process the composite. Order matters — and you can reorder on the fly.
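The chain model can be sketched in a few lines. Frames are plain numbers here for illustration (the real system composites GPU textures), but the structure is the point: one ordered list, folded left to right.

```typescript
// One ordered chain: sources add to the running accumulator,
// effects transform whatever has been composited so far.
type ChainNode =
  | { kind: "source"; render: () => number }
  | { kind: "effect"; apply: (frame: number) => number };

function runChain(chain: ChainNode[]): number {
  return chain.reduce(
    (acc, node) =>
      node.kind === "source" ? acc + node.render() : node.apply(acc),
    0
  );
}

// Order matters: the effect only sees what was composited before it.
const chain: ChainNode[] = [
  { kind: "source", render: () => 1 },
  { kind: "effect", apply: (f) => f * 10 }, // sees 1, outputs 10
  { kind: "source", render: () => 1 },      // composites after: 11
];
```

Moving the effect to the end of that chain changes the result, which is exactly why live reordering is a performance tool and not just housekeeping.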

🎨 40 Procedural Sources

From 3D tunnels and particle simulations to fluid dynamics and fractals — every source is generated in real-time via custom GLSL shaders. No pre-rendered clips.

✨ 32 Effects

Bloom, feedback, kaleidoscope, ASCII, pixel sort, color grading, barrel distortion — stack as many as you want, reorder freely, modulate everything.

🎨 Per-Source Effects

Effects can apply to individual sources before compositing, or to the entire accumulated output. This distinction is critical for complex layered looks.

🖱 Projection Mapping

Built-in corner-pin warping, mesh subdivision (up to 8×8 grid), Catmull-Rom smoothing, and per-edge soft blending. Multi-projector setups work out of the box.
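The mesh-subdivision part can be illustrated with bilinear grid generation: place an (n+1)×(n+1) grid of vertices between the four dragged corners. (Strict corner-pin warping uses a projective homography; bilinear interpolation is the simplified version shown here, and the smoothing pass would then run Catmull-Rom over these control points.)

```typescript
type Pt = { x: number; y: number };

// Build the warp mesh: interpolate along the top and bottom edges,
// then between them, for every grid vertex.
function cornerPinGrid(tl: Pt, tr: Pt, br: Pt, bl: Pt, n: number): Pt[] {
  const pts: Pt[] = [];
  for (let row = 0; row <= n; row++) {
    const v = row / n;
    for (let col = 0; col <= n; col++) {
      const u = col / n;
      const top = { x: tl.x + (tr.x - tl.x) * u, y: tl.y + (tr.y - tl.y) * u };
      const bot = { x: bl.x + (br.x - bl.x) * u, y: bl.y + (br.y - bl.y) * u };
      pts.push({ x: top.x + (bot.x - top.x) * v, y: top.y + (bot.y - top.y) * v });
    }
  }
  return pts;
}
```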

🖥 Multi-Display

Output to any connected display. Preview stays on your laptop, clean output goes fullscreen on the projector. Hot-plug detection auto-adapts when displays change.

💾 Venue Presets

Save your mapping configuration per venue. Walk into a space you've played before, load the preset, and you're aligned in seconds.

📷 Clean Output

The output window is a dumb renderer — no UI, no cursor, no overlays. It receives fully composited frames from the main window via IPC at 60fps.

🎥 Performance Sets

Organize your show into sets — each with its own chain, modulation routing, and color palette. Switch between looks instantly during a performance.

🎹 MIDI Control (APC40)

Full APC40 MkII protocol support. Map any parameter to any fader, knob, or button. The physical controller becomes an extension of the software.
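The "map anything to anything" part boils down to a lookup plus a range scale: an incoming 0..127 controller value is rescaled into the target parameter's range. The mapping-table shape here is illustrative, not Lightbridge's actual one.

```typescript
// One entry per learned mapping: which CC number drives which parameter.
type Mapping = {
  cc: number;
  min: number;
  max: number;
  set: (value: number) => void; // writes into the parameter
};

// Called for every incoming MIDI control-change message.
function handleCC(mappings: Mapping[], cc: number, raw: number): void {
  for (const m of mappings) {
    if (m.cc !== cc) continue;
    m.set(m.min + (raw / 127) * (m.max - m.min)); // scale 0..127 into range
  }
}
```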

🎨 Color Palettes

7-swatch HSB palettes per set. Sources bind to palette channels. Change one swatch, every bound source updates — instant global color shifts mid-performance.
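The binding mechanism is essentially an observer per palette channel: sources subscribe to a swatch index, and changing that swatch notifies everything bound to it. Class and method names here are my own sketch, not the real API.

```typescript
type HSB = { h: number; s: number; b: number }; // each component 0..1

class Palette {
  private swatches: HSB[];
  private listeners: ((c: HSB) => void)[][];

  constructor(size = 7) {
    this.swatches = Array.from({ length: size }, () => ({ h: 0, s: 0, b: 1 }));
    this.listeners = Array.from({ length: size }, () => []);
  }

  // A source binds to a channel and is retinted on every change.
  bind(channel: number, onChange: (c: HSB) => void): void {
    this.listeners[channel].push(onChange);
    onChange(this.swatches[channel]); // push the current color immediately
  }

  // Change one swatch: every bound source updates at once.
  setSwatch(channel: number, c: HSB): void {
    this.swatches[channel] = c;
    for (const fn of this.listeners[channel]) fn(c);
  }
}
```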

💾 Show Files

Your entire performance state serializes to a single show file. Auto-save, version migration, atomic writes. Prep at home, perform with confidence.
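Version migration is the interesting part of a long-lived show-file format: each step upgrades one version, and loading runs every step from the file's version up to current. The field names below are invented for illustration.

```typescript
type Migration = (doc: Record<string, unknown>) => Record<string, unknown>;

// One entry per version step. The v1 -> v2 step is a hypothetical
// example: pretend sets gained a color palette field in v2.
const MIGRATIONS: Record<number, Migration> = {
  1: (doc) => ({ ...doc, palette: doc.palette ?? [], version: 2 }),
};

const CURRENT_VERSION = 2;

function migrate(doc: { version: number } & Record<string, unknown>) {
  let out: Record<string, unknown> = doc;
  for (let v = doc.version; v < CURRENT_VERSION; v++) {
    const step = MIGRATIONS[v];
    if (!step) throw new Error(`no migration from v${v}`);
    out = step(out);
  }
  return out;
}
```

The atomic-write half pairs naturally with this: write the serialized state to a temp file, then rename it over the old show file, so a crash mid-save never corrupts the show.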

How Lightbridge compares.

I'm not trying to replace every use case — I built this specifically for live VJ performance alongside a DJ, with a focus on being able to iterate and engage creatively live in the moment.

Capability | Resolume | TouchDesigner | Lightbridge
Custom shader sources | ✗ | ✓ | ✓ 40 built-in
Real-time audio reactivity | Basic | ✓ (complex setup) | ✓ Zero config
Beat detection | ✗ (tap tempo) | Via plugin | ✓ Built-in WASM
Live-tweakable parameters | ✓ | Node graph navigation | ✓ Flat sliders
MIDI controller support | ✓ | ✓ | ✓ APC40 native
Show file / session prep | Composition files | ✗ (project files) | ✓ Full show state
Projection mapping | ✓ | ✓ | ✓ + venue presets
Performance set switching | Layer presets | ✗ | ✓ Instant
Time to performance-ready | Hours | Weeks | Hours
Because it already does.

OK, so I'm cheating a little bit. I've been working on this for the past 3 weeks. But it's a great case study for a problem in the wild that I spotted and chose to fix!

I built this because I needed it

Every design decision comes from real pain I felt mid-performance — hours spent wrestling with tools that weren't built for live work, not theoretical feature requests.

The right level of abstraction

Resolume was too rigid for me. TouchDesigner was too open-ended. I wanted something in the middle: deep enough for custom visuals and modulation routing, structured enough that I can prep a show, rehearse it, and trust it on stage.

Real code, not a mockup

I wrote this in Electron + Three.js + React + TypeScript, assisted by Claude Code. It includes custom-written GLSL shaders, C-to-WASM beat detection, 40 procedural sources, 32 real-time effects, full MIDI protocol support, show file serialization, and projection mapping.

The process fix is the product

My old workflow — author in TouchDesigner, wrestle with OSC controls, Syphon into Resolume, be limited in manipulation — is gone. In Lightbridge, what I see is what I perform. The creative tool and the performance tool are the same thing.

Don't take my word for it.

I know this was supposed to be a hypothetical problem-solving exercise, but I only started developing this app on March 6th. I've been living in exactly this kind of problem-solving space for the last few weeks, and it just felt right to share. I found a problem that frustrated me, did some research, and built an entire application to solve it. Here's a tiny taste of what that looks like — click below and make some noise.

CLICK TO ENABLE MICROPHONE Play music, talk, clap — watch it react

Thanks for reading! Hopefully this shows you a little bit about how I think and what I do when something isn't working. Looking forward to hearing from you!