
We ❤️ Open Source

A community education resource


Creating live concert visuals in React with Shaders

A step-by-step guide to building concert visuals using React and Shaders.

I perform live regularly as Messica Arson, and I create experimental electronic music with code and modular synthesizers. I’ve been creating visuals for my music as a form of world-building. When I play live, I want to take you into a world that exists for only 20 minutes or so, full of loud sounds, fuzziness, glitches, and feedback.

For the past 7 years or so, I’ve been making visuals with Hydra, a browser-based visual synthesis tool inspired by an analog video synthesizer. Hydra is one of the most widely used tools for live coding. Live coding is the act of writing and running code in real time to create art. While working with Hydra has been an accessible entry point into creating visuals with code, I’ve been looking for a way to take my visuals to the next level and create visuals that truly represent my own voice.

I recently discovered Shaders, a declarative, component-based library for creating visuals in the browser. Using Shaders in a custom workflow has enabled me to create unique, stunning visuals in a lightweight, intuitive way.

This tutorial will walk you through my process for creating visuals and show you how to create your own visuals in this way.

Building in public

Now that I’ve been creating visuals for some time, whenever I see shaders used at a sporting event or concert, I try to figure out what’s happening under the hood. While the core of my current visual workflow is built on Shaders, I have created extensions for audio-reactive visuals and plan to add customizations designed for my own live performances, such as more robust ways to work with feedback and, potentially, integrations with other live coding frameworks. To add transparency, I decided to make my workflow public.

I named this package fort-reno after the historic DC DIY show venue, where every band has a stunning visual backdrop. It is also inspired by the custom tools and workflows of other live coders, such as La Habra, Viswasm, and Murrelet.

Read more: Turning code into music with Sonic Pi

Getting started with fort-reno

Using fort-reno, you can quickly get started making visuals with React. The underlying shader components can also be used in Vue, Svelte, or Solid.

Since I often play in spaces without reliable internet, fort-reno is intentionally designed to run locally. It also includes customizations for audio reactivity, basic feedback, and other needs that may arise during live performance.

To get started creating visuals using fort-reno, you can follow these steps:

1) Clone the fort-reno repository.

2) Navigate to fort-reno in your terminal:

```bash
cd fort-reno
```

3) Install the required dependencies:

```bash
npm install
```

4) Run in development mode:

```bash
npm run dev
```

5) Open the local URL shown in your terminal, which should be http://localhost:5173.

You should see something similar to the following in your browser:

Creating custom visuals using fort-reno - aurora rays example

This example is running this code:

```jsx
import { Shader, Aurora, Godrays } from "shaders/react";

export default function App() {
  return (
    <div style={{ width: "100vw", height: "100vh", background: "#05070d" }}>
      <Shader>
        <Aurora
          colorA="#FFFFFF"
          colorB="#C4C4C4"
          colorC="#00FFFF"
          colorSpace="oklch"
          balance={50}
          intensity={95}
          curtainCount={10}
          speed={25}
          rayDensity={90}
          height={20}
          center={{ x: 0.5, y: 0.5 }}
          seed={0}
        />
        <Godrays
          blendMode="multiply"
          intensity={45}
          contrast={1}
          balance={30}
          speed={10.3}
        />
      </Shader>
    </div>
  );
}
```

In this example, the React component renders a full-screen shader scene using the shaders/react library. Inside the Shader container, Aurora creates the main animated northern-lights effect with a white, gray, and cyan palette, centered in the middle of the screen and tuned for bright, dense, fast-moving curtains. Godrays is layered on top with multiply blending to darken and shape the image with animated light-beam texture. The outer div fills the viewport and provides a very dark background, so the shader effect reads cleanly.
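Multiply blending itself is simple arithmetic: each output channel is the product of the two layers' channels, so the result can only get darker. A minimal sketch of the per-channel math (the helper below is my own illustration, not part of shaders/react):

```javascript
// Multiply blending: combine two color channels (normalized to 0-1)
// by multiplying them. Since both inputs are at most 1, the result is
// always as dark or darker than either input -- which is why the
// Godrays layer darkens and shapes the Aurora underneath it.
function multiplyBlend(a, b) {
  return a * b;
}

console.log(multiplyBlend(0.8, 0.5)); // a bright channel times a mid one -> 0.4
console.log(multiplyBlend(1.0, 0.3)); // white is the identity -> 0.3
```

Blending with white (1.0) leaves the lower layer unchanged, while blending with black (0.0) removes it entirely, which is why multiply works well for carving texture out of a bright scene.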

Creating custom visuals using fort-reno

Most of the logic for creating live visuals lives in src/App.jsx. You can replace that file with the following to create a pink marbled texture:

```jsx
import { Shader, Plasma } from "shaders/react";

export default function App() {
  return (
    <div
      style={{
        width: "100vw",
        height: "100vh",
        margin: 0,
        overflow: "hidden",
        background: "#FF3659",
      }}
    >
      <Shader style={{ width: "100%", height: "100%", display: "block" }}>
        <Plasma
          density={2}
          speed={20}
          intensity={1.5}
          warp={0.6}
          contrast={0.8}
          balance={66}
          colorA="#FF3659"
          colorB="#c4c4c4"
          colorSpace="oklab"
        />
      </Shader>
    </div>
  );
}
```

You should now see something similar to this in your browser at http://localhost:5173.

Creating custom visuals using fort-reno - pink texture example

Here, Plasma generates the moving pink marbled texture: a flowing, glowing surface built from a pink-to-gray blend in the oklab color space, with moderate density, fast motion, a bit of warping, and slightly softened contrast to keep the result smooth rather than harsh.

Live performance

While performing live, you will typically start with a base scene and gradually adjust parameters such as intensity and speed. From there, you can introduce new elements, transition between scenes slowly, change values, and add parameters as you go.
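One way to script these gradual transitions is to interpolate a parameter between its old and new value over a time window, and pass the interpolated value as the shader prop on each render. A minimal sketch, assuming a simple linear ramp (the `lerp` and `rampParam` helpers are my own, not part of fort-reno):

```javascript
// Linear interpolation between start and end, with t clamped to [0, 1]
// so the parameter holds steady once the transition finishes.
function lerp(start, end, t) {
  return start + (end - start) * Math.min(Math.max(t, 0), 1);
}

// Ramp a shader parameter (e.g. Plasma's speed) across a transition
// window: elapsedMs is time since the transition began, durationMs its length.
function rampParam(from, to, elapsedMs, durationMs) {
  return lerp(from, to, elapsedMs / durationMs);
}

// Example: fade Plasma's speed from 20 to 60 over ten seconds.
console.log(rampParam(20, 60, 5000, 10000));  // halfway through -> 40
console.log(rampParam(20, 60, 20000, 10000)); // past the end, held -> 60
```

In a React component you might drive `elapsedMs` from a `requestAnimationFrame` loop or a timer in `useEffect`, then render `<Plasma speed={rampParam(20, 60, elapsedMs, 10000)} />` so the change unfolds smoothly instead of jumping.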

As a way of learning creative coding, it helps to start with an example, change the values, and add different components until the result looks unique to you.

Conclusion

This is just the start of how you can make live-coded concert visuals. You can add any of the components listed in the Shaders documentation. Be sure to check out more examples, or explore ways of playing with feedback and audio reactivity.
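For audio reactivity, one common pattern is to sample the frequency spectrum of live audio and fold it into a single value you can feed into a shader prop like `intensity`. A minimal sketch of the mapping step, assuming the browser's Web Audio API as the source (the `audioToIntensity` helper and its averaging/scaling choices are my own, not part of fort-reno):

```javascript
// Map a byte-frequency array (as produced by an AnalyserNode's
// getByteFrequencyData, values 0-255) to a 0-100 value suitable
// for a shader prop such as intensity.
function audioToIntensity(freqData) {
  if (freqData.length === 0) return 0;
  const avg = freqData.reduce((sum, v) => sum + v, 0) / freqData.length;
  return Math.round((avg / 255) * 100);
}

// In the browser, freqData would come from something like:
//   const analyser = audioContext.createAnalyser();
//   const data = new Uint8Array(analyser.frequencyBinCount);
//   analyser.getByteFrequencyData(data); // refresh each animation frame
console.log(audioToIntensity(new Uint8Array([0, 255]))); // -> 50
```

Calling this every animation frame and passing the result into a shader component is enough to make the visuals pulse with the music; from there you can weight certain frequency bins to react to bass or vocals specifically.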
