Afterglow: Live Coding for Light Shows


My name is James Elliott. I’m going to be
talking to you about something that’s been devouring my life for the last year and a
bit, which I call Afterglow. So I’m a software developer. I work at Singlewire
Software with a bunch of other guys who are here, and I also sometimes DJ. And in the
process of doing that I started gathering some entertainment lights because I thought
they’d be cool and add a neat element to the shows… and that’s a real slippery
slope! [audience laughter] You start getting a couple and then you realize
“well, yeah that’s neat, but what if I had them hooked together, and could control them
for real, and then well, I’ve got this control system, I need to get some more lights to
make it worth it,” and, it snowballs from there. But, not long after—Well, we had been doing
shows for about a year or two and I realized I really didn’t like the software that we
had. There were a couple things that I thought were particularly flawed in it. The main one
was how it worked with other controllers. Let’s see, you know, these MIDI controllers
that give you nice grids for working with all your light cues, and we had one of those
and Chris, my partner behind Ryan there, did a really nice job of building a set of cues
on that and mapping them to the software that we had, but just because of limitations in
the software, if you touched the mouse instead of the controller, there was no feedback between
the two, so it would look like on the controller it was off, but the computer thought it was
on, so you could get things out of sync. It had flaky behavior. And it didn’t have a
great user interface. There were some generators you could use to build programmed effects, but they were very limited, and you could only sort of render them ahead of time and then run them. There was no way to synchronize it to music, so you’d have to sit there tapping the tempo and trying to line it up, but just guessing. And there was no support
for some of the newer lights we had, which in addition to having RGB channels, they had
white, amber and even sometimes UV channels. So, with all those things I said “well,
hmm, this stuff isn’t as good as I want.” So, I was—well, we were starting to use Clojure
at work for a couple of our projects, it’s a really neat language with all kinds of great
properties. But then I stumbled across this project called Overtone—I think actually Ryan, you
mentioned that to me—so I went and watched a presentation on that. And what Overtone
is, is it’s a music synthesis system written in Clojure so it takes a really functional
approach to how you organize the sounds and how you string them together. And I said “well
wow, if you can do that in Clojure, and it’s responsive enough to do it to the music and have
good timing with it, I bet I could build a lighting control system in Clojure!” So that
was the start of the slippery slope. So Afterglow is a lighting control system.
It’s also an experimental platform where I try out different ideas. It’s becoming
a growing integration hub, as you will see it talks to a lot of different kinds of things, and it’s spinning off other open source projects of its own— ah, I didn’t mention that,
but it’s an open source project. So, a little bit about doing light shows.
It’s an impractical hobby. This is what our living room looked like [audience laughter]
for about two weeks before we did our last show a few months ago. And… if I can get this
thing running… Oh. It’s coming out the wrong speakers. [background music] But this
gives you a sense of what our full rig looks like since I didn’t have room or time to
set that up for this show. But you’ll see that that light there is one of the ones you
see on the rig, that one’s one of the ones down there… [Audience] What does the cat think of this?
[Laughter] You know, he’s kind of used to it. [Audience] It seems like—I mean a little
laser pointer most cats go nuts… [Audience] These are all LEDs? Yeah, these are all LEDs. All right. So, given that, a little bit about
what tonight is not, and what it is. So what this is not, this is not an actual performance
by Deep Symmetry, this is not a full-blown light show, we don’t have a DJ set to play
for you, we don’t have fog, which is really nice for adding atmospheric effects to the
beams of light moving around. We don’t have most of our lights and lasers, we don’t
have a dance floor, we don’t have PLUR—well, actually, I hope there is. Those of you who
are not familiar with the electronic music scene, PLUR stands for Peace, Love, Understanding,
and Respect, it’s one of the things that people like to foster at a lot of electronic
music events. So hopefully all of our Clojure meetups have that anyway. [Audience] I want my money back! [Laughter] All right, I’ll give you all the money back. [Audience] Yeah, full refund. [Laughter] So what is tonight? Well it’s an introduction
to many of the cool building blocks that go into our shows. I’m going to start with
some of the underlying technologies and how I use them from Clojure. A technical exploration
of what we use. And one really fun use of the Clojure environment. All right, so before
I begin, any questions? Ok! So, these lights that you see here, which I haven’t powered up yet, I need to do that… are controlled using a protocol called DMX 512. That stands for
Digital Multiplex, and it’s a fairly old protocol; it’s been around since maybe the
’80s. It evolved from the big banks of theatrical dimmers that
you’d see if you went behind stage in an old theater, these giant levers that you’d use
to move huge rheostats and bring up and down the lights that all worked on different channels.
So this took that and put it into a single cable that you string in a daisy-chain fashion
between the lights. And each cable can carry 512 channels, and each of those channels can
have 256 different levels. It doesn’t carry the power, it’s just a signaling protocol
and each one of those channels can be, well, it started out as being brightness, since
that was all you used to do, but then people started building fixtures that had more effects
and features. So it might be you have three channels for your red, green, and blue brightness,
and maybe a separate channel for your dimmer that adjusts the light’s overall brightness
up and down. Or it might just pick a color. That light there, as I’ll show you in a bit,
has a color wheel, so depending on the value of the channel it spins to a different color.
It might change the focus, the depth at which the beam comes to a sharp point. Or it might
rotate the light, or it might pick a template or some other kind of special effect. So when
you assign the light, hook up the light, you give it an address, so you see the 285 sitting
on the base of that one there, that one is listening starting at channel 285. Now, it
listens to more than one channel, it’s got a bunch of different features, so it listens to
I think about 16 or 18 channels, starting with 285. So with each of these channels, 40 times a
second, which is the typical refresh rate, you can send a new value for every channel.
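A frame of DMX is easy to picture in code. Here is a minimal sketch in Python of one universe and one fixture listening at address 285; the specific channel assignments (pan, tilt, dimmer) are illustrative, not a real fixture’s channel map:

```python
# A DMX universe is just 512 byte-sized channel values, resent about
# 40 times a second. Channel numbers are 1-based on the wire, so
# channel 285 lives at index 284 of the frame.

FRAME_RATE = 40            # typical refresh rate, frames per second
universe = bytearray(512)  # one frame: every channel defaults to 0

BASE = 285                 # the address dialed into the fixture
universe[BASE - 1] = 128   # first channel it hears, e.g. pan
universe[BASE] = 255       # second channel, e.g. tilt
universe[BASE + 2] = 200   # a few channels later, e.g. dimmer

# To run a show you'd rebuild and resend this snapshot of all 512
# channels every 1/40 s = 25 ms.
print(len(universe), universe[284], universe[285])
```

So “spewing out an awful lot of numbers, very quickly” really is just refilling and resending that byte array 40 times a second.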
So in order to make a light show happen, all you need to do is spew out an awful lot of
numbers, very quickly. All right, that’s it, the end. [Laughter] So I didn’t have to write the software to
talk DMX directly myself, there’s a really cool open source project called the Open Lighting
Architecture that I use, and it talks to lots of different hardware, but I’m using in
my case this thing called an Enttec DMX USB Pro. It takes a USB connector on one end and
feeds out two universes of DMX on this side. It can also speak over Ethernet or similar
networks using a protocol called ArtNet or a more modern one that our friends at Electronic
Theatre Controls in Middleton designed called sACN or E1.31. They also have images for Raspberry
Pi that run their software; it doesn’t run on Windows very well, so if you’re stuck with Windows, they recommend you buy a Raspberry Pi to run it on. And then…
it has a daemon which responds to remote procedure calls using Google Protobuf which is a pretty
neat binary protocol so I wrote a Clojure binding for the Open Lighting Architecture,
and then at the request of the Open Lighting Architecture people, I spun that off into
its own open source project called ola-clojure. So let’s pop out of the slides for a minute and look at
the OLA daemon. It has an embedded web server which we’re looking at here.
And so I’ve got a couple different universes running right now, I’m actually running
two universes to those lights up there. Everything except for that white strip is running on
one, and that white strip is a brand new light that I just got for this demo because I wanted
to talk to you about pixel mapping, so you guys gave me an excuse to “invest.” [laughter] And so let’s take a look at universe 1,
which is what most of the lights are connected to. Let’s go to that one. And go to the
DMX console. I happen to know that that moving spot there is sitting on channel 1. So if
I pull over this fader here you can see it makes the light turn around. So that channel
controls which direction that light is aiming. Now it’s pretty fidgety, and this is a hard
one to control. There’s a second channel that is for fine positioning. If I move it all the way you can barely see it move back and forth. And then similarly this adjusts
the tilt level. Let me jump over to another light for a moment.
We can come back to this one. So I know 187 is the dimmer for that tallest light there. So I turn that up, nothing happened. Why is that? Well because in addition to having the
dimmer, it’s got individual red, green, and blue controls for each of the heads that
are on there. So this next one’s the red channel on that head. So if I adjust that,
the red gets brighter and dimmer. And if I add a little bit of green, we can make yellow.
And then the main dimmer will dim the whole thing. So then this is the red channel for
the next one and so on. So that’s how you can put together different colors on
different heads for these lights. [Audience] Can you go through the whole
spectrum with that? You can. Yeah. I’ll be showing you guys
later. That’s a lot easier to do once you’ve got Afterglow running because it’ll do the
calculations for you, given the hue, what the red, green, and blue values are that you
want to put out. All right. Let’s jump back to channel 1,
I was going to show you a couple other things on the Torrent before we move on. Ok, so,
channel 9 is the shutter for the Torrent. Like an old light would have a physical shutter,
this is just an electronic one. So that didn’t seem to do anything, but if I turn up the
dimmer you can see that there’s some light coming out of that one now. So in addition
to being open or closed, the shutter can do things like strobing… there’s a fast strobe,
or it can do different kinds of throbbing patterns. It’s really rather finicky to try
to adjust these things by hand, which is why you end up writing lighting control software. [Ryan] Or, why YOU end up writing lighting
control software. [laughter] Ok, so I mentioned the idea of focus, channel
11 is focus for this particular light, you can kind of see that [audience agrees] makes
crisp lines there, and why do you want to focus that? Well, I’ll show you more in
a moment, let’s go back to my concept of color. So channel 5 is the color channel for
this, if I crank this up, suddenly it’s red. So you can see the colors swooping in
there. The wheel inside this is moving to present a different color. And if I go high
enough it starts spinning the wheel. Back to white… and [Audience] So is there an actual physical wheel? Yes, there is an actual physical wheel in there.
In addition to spinning the color wheel, channel 6 is a gobo wheel, and that lets you put patterns
which can have multiple colors in them like this. So that’s one gobo wheel, and you
can make it spin or shake, someone had that concern about having seizures [laughter, inaudible] Channel 7… oh right, that’s the rotation of the gobo. Let’s go back to that one… And then when there is a gobo placed, channel 7 will rotate it. And then channel 8 is another
gobo. Let’s pull this one up so you can see it. And now it looks really blurry because
they’re in different focal planes so you’d go back to your focus and then refocus. So
you can combine gobos, you can focus back and forth between them, do all kinds of stuff.
And then before we go on to demonstrating the other software, I’ll show you the prism.
So in addition to having all those other things you can add this prism which divides the beam
up into three. So you can have multiple copies of the beam, you can even make it spin. So
combining all that stuff, as you can imagine, which will be easier to do with Afterglow, is
how you build up really interesting looks. So that is just using the lights raw via DMX,
via OLA, so let’s look at how you do it in Clojure. So there’s a couple of concepts we’re
going to look at tonight, and I’m going to introduce them now, then I’ll show you
them in action and then we’ll go back and talk about them at the end a bit. First of
all there’s this thing called a fixture definition, which is telling Afterglow how
the heck do I control that light, or that light. It needs to know what the channels
are, what the values of those channels mean, sometimes they have subdivisions within the
channel that do different things, and also Afterglow does more modeling of the lights
than any other lighting software that I know, because it models physically how the light
moves in response to the different control channels, so I told it where in space these
lights are, then I can tell it I want to aim at that point on the ceiling, and it will do
the trigonometry to figure out how those lights have to be pointing and then it will do the
math to figure out what DMX values to send to those lights. It’s actually an optimization
problem something like robotics, ’cause there’s several different combinations of
values you can use that will point in the same direction, and you want to get there
with the least movement, from where you are now. So those are fixture definitions, and those
are just Clojure maps in the end, but generally there’s a function that builds the map because
you can have different parameters, the lights have different modes they can be in, things
like that. Show space: in order to be able to think about
space like that, it has a notion of where the origin is. The origin is on the floor,
right between the center of mass of those two lights that are sitting on the edge of
the table. And the Z axis extends out positive into the audience and negative behind them, the Y
axis goes to the ceiling and down to the floor, which is zero at the floor, and then the X
axis goes this way, so positive is that way, negative that way. So that’s how you express
where the lights are and how it figures out where to aim things. Ok, now, we’re dealing with music, so that
involves time, and so there’s a metronome that has a really fundamental role inside
Afterglow. It has the notion of how many beats per minute are going on, and it also knows
how many beats per bar you’ve got, and then how many bars in a phrase. Electronic music
is often divided up into eight-bar phrases of four-beat bars, and so if you
have your lights build to a climax at the right point in that phrase it looks really
nice to the music. In order to do things that look interesting
you’re going to want to have the lights change over time, so you have oscillators
for that, we’ll talk more about those later when we can actually see them. And then to
do things like colors that change over space or time you have dynamic parameters that
can be fed by oscillators or can be spatial or a combination of both. And then you of
course want to have variables which you can stick things in and base your cues on. Building onto those you do effects. What’s
an effect? It’s a look on the lights. It’s “I want these lights to do something.”
We’ll talk more about them when we have a chance to see them. Along with
effects, you’ve got assigners; assigners do things to lights,
and there’s a rendering loop we’ll talk about once we’ve seen a bit more. Kinds of effects: Dimmer. Make the lights brighter or dimmer. Color. Make a color happen somewhere. A spatial effect aims in a particular
direction or at a particular point. A function effect can do something like pick the gobo
that looks like a biohazard symbol, or, you know, strobe. And then compound effects you
build other effects with. All right, so, how do we see all these things? Let’s look at
our friend the Clojure REPL to begin with. So here we are in Emacs, I think the first
meetup I saw here at Bendyworks was about how to use CIDER in Emacs for Clojure and
that’s my favorite environment as well. So that’s my project definition, and I just
fired up the REPL before things started, and to get things going I’m going to say “use
sample-show” which creates a bunch of lighting fixture definitions and whatnot. And done…
And I’ll say “show/start!”. When I do that you’re going to see immediately the
lights are going to go back to their home positions because right now Afterglow is not
interacting with OLA, but as soon as I say show/start! it’s going to start blasting out its frames
of DMX information 40 times a second. And since there are no effects running,
it’s going to set everything to zero, so the lights are going to go back to their defaults.
Like that. All right! So let’s make something happen.
Well, I have a… Hmm! [laughter] Ah, oh, I know! We’re having battling, ah, lighting
programs. OLA was still running and sending its frame values, so… [Audience] At 41 frames per second or something. It wouldn’t be a demo if something didn’t
work, not quite right. [laughter] Ok! So I wrote this little cheat function and this
is—I’m sort of walking through the README on the project at this point, how to get started
with it—so “fiat lux” means “let there be light” in Latin, and that’s what it does.
So it creates a slate-blue color effect and it applies it to all the fixtures it knows
about in the show. And, so that’s at full brightness, so if we wanted to control the
brightness we could add another effect. And we’ll do this one by hand. So add-effect!
is one of our options here, that’s the one we want. When you add an effect you give it
a key, which is how you can refer to it later if you want to stop it, and it also lets you
group them so they can be mutually exclusive. And what do I want to have my dimmers at?
Let’s do a global dimmer effect and level 100 so that’s going to be a little less
than half, so things will get dimmer. And things got dimmer. Except for that one,
that new one I was talking about, with all the LEDs on it, that takes up its own whole
universe, that one doesn’t have a dimmer. So a dimmer effect has no effect on it. And
this was something I hadn’t encountered myself, but there was a guy in Russia who
was exploring Afterglow to use with his band, and he said “I have these lights that don’t
have dimmers, what do I do?” So I thought about it for a while and I created something
called a virtual dimmer which… There it is, add-virtual-dimmers, thank you CIDER for
reminding me how to use it. [laughter] Virtual dimmer, we pass that with true; keep your
eye on the one big LED bar and you’ll see it get a bit dimmer. If I remember the
right key. There. So what the virtual dimmer does is it creates
a color effect that takes the output of the other color effects, runs after them and acts
as though a dimmer had been applied to them, so it reduces the RGB values as though there
was a dimmer on the light. So that is important for lights like this. Ok, that’s dimmer effects.
A color effect is the next one. Global color effect takes a keyword that you
can pick a color with. [Audience] Missing a bang! Oops! Thank you. Stack trace.
Boom! That’s more orange. [Audience] That is more orange. And then let’s look at some oscillator effects. So I’m using the oscillators namespace to
build an oscillated parameter, and, I’m using a sawtooth wave, which just does a ramp,
and then we will include our add-virtual-dimmer so that it will apply to all of them.
And, there we go! So now we have an oscillator that’s running
in sync with the metronome and it is ramping up the lights once on each beat. [Audience] So when you run that add-effect with
the same keyword does it replace the old one? Exactly, it replaces whatever else is running
with the same key. So if we want to change the speed of that,
I just adjust the metronome. Set the BPM… The metronome is attached to the show
variable, and right now it’s at 120, which is the default, so if we drop it to 70 it’ll slow
down. Like that. All right! Well, as you can see, you could run a light show like this,
but… you know, this is not going to be all that practical. So how would you do it? Let’s first look at the web interface. So
this start-web-server thing, if you run Afterglow as a jar, it starts out with this on by default.
But I’m going to tell it to listen on port 1600 and open the browser, and, there it is. So this is a Luminus web interface, those
of you who’ve built Clojure web projects before, and the main purpose of it, well,
you can pop in here and type Clojure expressions if you need to and you haven’t got a REPL,
but the main point of it is this show control interface here. So what we’re looking at there
is a cue grid. That’s an 8 by 8 grid and those cues are designed to run effects when
you click on them. The colors are all designed to be sort of mnemonic, you can assign any
color you want to a cue, and some of them as you can see are changing color and those
are sort of trying to represent the fact that the cue itself changes color. So let’s turn all the dimmers on, that took
over from the other dimmers we had been using, so that’s very similar to what we had running
before. Let’s speed up the metronome a bit, this is a lot easier than typing an expression.
Ok! Now, what did I want to show on this one? Ok, yeah, everyone likes strobes. So these
strobes are another [Audience] Well, not everyone! [laughter] But you know that gives you a sense of how
you could really run a show. You can pick different fixture groups, and strobe them,
and I’ll show you more things about that when we’re talking about physical controllers.
And like I said there’s relationships between effects, so notice how a lot of these things
get dim when I click this All Saw, that’s all the other dimmer effects, for any fixture,
dim themselves to say, hey, if you run me, it’s going to kill that other one, because
they’re mutually exclusive. Now, if I just hit it on the Torrent, that only dims the
column of effects that are affecting the Torrent, that fixture on the right there. So I could
also do a different dimmer effect on that light without creating conflict. So that visual relationship between the cues
is a nice thing to be able to see. There’s more cues than can fit on the screen, so you
can scroll through them by using this thing here. So let’s look at a very different
kind of light that’s just kind of been sitting there. This guy is a little laser. Turn it
on. Now, one thing I haven’t shown you here is while these things are running, each
effect has a row in this table down here, showing you that it’s running, you can end
it down there, and you can also, if the cue has variables associated with it, there is
a web interface for adjusting them. So this is the speed that that little laser is rotating,
and you can adjust it there. Similarly, that light there, we can make it go slower or faster
by adjusting these things. All right. Oh! Yeah. One more thing before we move on. [multiple speakers] So these are color cues on the bottom here
and, lots of white, turn down the grand master for this. It’s basically a… all the dimmers
are routed through this grand master, so if you turn it down it scales down all the dimmer
effects by that amount. That was another thing our other software didn’t do, and we were
in some small rooms where these lights were just overpowering. But you can turn down the
brightness of the whole show using this grand master. So that color effect I’ve got running defaults
to white but we can pull up a little color picker here and pick a different color. And
then it also shows in the… oops… That happens sometimes with that, that’s another
open source component I’m using for the color thing, let me reload the page, it gets
flaky sometimes. There, yep, OK, so now the cue shows that color while it’s running.
If I end it, it goes away and goes back to using the default. But if I go in here and
make it green and say “save” now it remembers that. So I’ve just basically saved that
as the default color for that cue from that point on. Which is a handy thing to be able
to do running a show. All right, I’m going to show you a little
bit with the Torrent again so you can see macros and then we’ll move on to the next fun topic. All right, so we’ll go over here, and let’s
see, start by aiming at the ceiling, all right. Let’s go up here, and let’s build a cool
effect. So we’ve got a swirl, and simultaneously a sunflower; those are two different gobos
there right now, and you can’t really see them at the moment because neither’s in focus.
But if we change the focus, we can sort of go back and forth, we can focus between seeing
that one and that one. So, neat thought, that’s kind of cool going back and forth like that,
well, let’s make it do it by itself. So I can do a focus sine, which is another cue
that I built, that goes back and forth between two focus points. That actually is not too
bad there, it’s a little extreme, so let’s not quite go up to 100, and not quite go down
to zero. So that looks pretty cool, that’s a nice lighting effect all by itself, but
let’s throw the prism in, make it spin some… So if you had some fog going and that was
what was going on on the floor, that would be a really nice lighting effect. But it
took me a fair bit of effort to put it together. Well that’s what macros are for. So if you
go down here and say Make Macro, it adds these little check boxes, to all the effects, and
you can pick the ones you want to include, so I’ll get the two gobos, the focus sine,
prism, and aiming the Torrent, so I go down and say “Sunflower
swirl” and then all I have to do is pick somewhere in the cue grid where I want it
to be, so I’ll say there, suddenly I’ve got a new cue. So if I hit it again, it will take it over,
you don’t see anything because it’s doing the same thing it was before, but now it will
stop, and if I ever want to have all those things come back again, I just have to hit
this one thing, then boom! [Audience] So when you save that, does it
go into a config file for use later, or is it just into the live REPL session? Macros right now get written to a file, as
Clojure expressions, and you can edit them into your show definition later. I’m planning
to throw a database in here at some point, but I haven’t needed it yet. I apologize for this garish continuity error,
but the camera stopped working at this point during the presentation, and we didn’t notice it for
a few minutes, so I’m going to have to reconstruct this segue in my home rehearsal space, also
known as the edge of my kitchen. So we were looking at the Afterglow web interface,
and that was clearly a big improvement in terms of expressive power and performance
ease compared to a raw Clojure REPL, but it’s not everything you might want. For example,
even if you’re really quick with a mouse, you can only click one cue at a time using
the web interface because there’s only one mouse pointer. There are other things that
you might want to do more expressively, and for that you really want to turn to something
like a MIDI controller. In fact, I happen to have an Ableton Push which is an incredibly
well designed MIDI controller, which Ableton created to use with their Live digital audio
workstation software. It’s got this big grid of pressure sensitive pads, which you
can press multiple ones at the same time, and also it can tell how hard you’re pressing
them. It’s got a beautiful text display here which can convey lots of information,
and these touch sensitive encoders which can react as soon as you touch them to update
the interface and then give you feedback as you’re doing things. So I’d love to be able to control Afterglow
with something like this. Ableton documented some aspects of the Push, but the fact that
it worked entirely with MIDI meant that you could use a program like MIDI Monitor to watch
all the communication between Live and the Push and figure out exactly how it worked. So Afterglow can be used with any kind of
MIDI controller that you might have around including just incredibly inexpensive ones. You can
map buttons or knobs or whatever on the controller to trigger cues, you can map faders and knobs,
encoders, to adjust cue variables. But if you happen to have a really really awesome
controller like a Push around, and you configure Afterglow to watch for it, as soon as you
plug it in Afterglow will notice and bring up the custom interface that I created for
the Push on it. In fact, I actually started working on this Push interface even before
I started the web interface. Here’s what happens when you connect it.
It gives you a little gratuitous starting animation that I just couldn’t resist putting
in there, and shows you information about the cues up top there and the cue grid there. And at this point we can actually teleport back to
the demonstration that was already in progress, because the camera is working again. And when we did that, this little menu appeared
on the screen here, “Linked Controller,” so if I choose Ableton Push now everything…
this cue grid that you see on this is tied to the cue grid that you see on the screen
there, so if I scroll around on this, it scrolls that around as well. So that lets you see
what cues you’re looking at on this, and then you can actually hit them on the controller. Now one of the things that is immediately
nice about that is you can say you want to strobe the Torrents and the Weather Systems,
you can hit them both at the same time. But what’s also really cool about these controllers
is they’re pressure sensitive. So the harder I push it, the brighter and more intense
the strobe gets. Or you can do other, completely different
kinds of effects, like this one I call Bloom, where it’s going to draw a little circle, and
the harder I push it, the bigger the circle gets until it spreads all over the
whole lighting grid. So, oh! Yeah, so let me just show a couple
more things about this controller, and then I’ll tell you a tale of woe. So you can
adjust the metronome here too, you can change the BPM by turning this, say what beat you
are on with this, you can tap your tempo out, so it’s easier to do it there than it is
on the mouse. But along the top here you have the list of
effects that are running, and you can adjust their cue variables as well. So this is the
grand master here, as I turn this it adjusts the brightness of that. Or if I [dog enters, inaudible]…
so you can adjust parameters of the cues there as well. And if you have a color cue going, which I
don’t think I do at the moment, scroll to it… bring up, oh no that’s a rainbow cue—ah!
I forgot to show you those. [Audience chuckles] We’ll go with the green cue. So I’ve got
this green cue running, it’s right here, and if I touch one of the encoders,
it gives me a palette here on its screen so I can pick a color by just pressing one of those buttons.
Which in a light show can be handy. Then I can adjust the saturation by
turning this, I don’t know if you can see, this little bar graph showing what the saturation is,
and if I’d scrolled up over here you’d see that it was… oh no you wouldn’t
because it’s just showing the color there. So those are really nice features, you can
also adjust the saturation or any other variable by using this touch strip here. And just about
the time when I finished getting this interface implemented, and working really nicely on
the Push, Ableton announced the Push 2. [Laughter] And they had this big program where they would
take your old Pushes and, you know, give you a huge discount on getting the Push 2, and then they were
giving the old ones to schools to work with and whatnot. And there was no real documentation on the
MIDI interface for this thing, we figured it out ourselves, but the reason we were able
to do that is that it speaks pure MIDI, and there are tools: there's a program called
MIDI Monitor that you can run on the Mac that shows you all the MIDI messages going in and
out, so you can watch while Live is controlling this thing and see the MIDI messages that
are going back and forth, and figure out what does what. That’s how we figured out how
to make these things do this cool stuff. Even the display, that text area, those are MIDI
System Exclusive messages, it just sends a big wad of bytes that set the display. I borrowed a Push 2 from a friend
who's the local Ableton rep, and bad, bad news… the display was not showing up at
all on MIDI. The display was a brand new feature on the Push 2, it’s a beautiful RGB monitor
basically, built into the thing, MIDI doesn’t have the bandwidth for that. So, I was like
“I’m never going to be able to figure out how to reverse-engineer that!” So, I
had this orphaned lighting control system, on a hardware platform that had just been killed! So in response to that I decided to spend a little
time working with Novation. These guys make controllers that aren't nearly as fancy as the Push overall,
they don't have displays at all, or the touch-sensitive encoders, but they
have a nice pressure-sensitive grid. And they have the advantage of
having documentation! [Laughter] And not only that, but they sent me some hardware
so that I could map their whole line, which was pretty cool. So, you can have multiple
controllers running at once with Afterglow, you can have both the Push and this running,
and that menu now has both of them in it. So you can have this one looking at one part
of the cue grid, doing things, that one looking at another part, you can even open multiple
web windows, and have each of them tied to a different controller, if you have a lot
of screen space. Kinda nuts. Ah, so that one tided me over. But then, right
as we were about to do our last show, one of my friends said “hey, have you seen what
Ableton posted on GitHub?” And, lo and behold, they had open-sourced a bunch of really good
documentation for the Push 2. And, so, I of course immediately had to get one. [Laughter] [Audience] You should see the boxes
that show up at Singlewire! [Laughter] So, the way you control the display on
the Push 2 is via USB as well, there’s a USB serial connection, and they had some sample
code using LibUSB to run it, which I looked at and ported to a Java library, which I call Wayang,
which became its own open source project too. So anyone else who wants to use the Push
2 from Java can use my Wayang library. Ah, perhaps if I hook up the USB, um, oh that’s power,
we need power, there we go! Let’s do that again. So now I have this lovely graphical interface
where you can see the metronome scrolling along and it goes to the beats, the bars,
and the phrase growing in the background. And you can see the little color wheel there
as you pick different intensities… So it’s the same sort of stuff that you
can do on the original Push, only now with this awesome graphical interface. So for example
when you do a sawtooth dimmer effect, you can see the sawtooth wave right here,
showing you the moment that’s now, as well as the moments before and after in time. Or, [inaudible] There’s a triangle wave, a sine wave, a square wave,
which just flashes on and off… OK! So that’s the Push 2, and I was really
happy to be able to make that work as well. [Audience] So are you sending bitmaps
40 times a second to that thing? That’s exactly right. What Wayang does,
is it gives you an offscreen graphics context that is the size and shape of the screen,
and you perform whatever Java drawing operations in that you want, and you call a function
Wayang.sendframe, which blasts them over the USB connection to the display. The offscreen
buffer is structured in the same bit arrangement as the screen does, so it’s a pretty efficient
operation to package it up and send it to the display. All right! Just a couple more things to show,
and then we can move on. OK Let’s do… So I didn’t really talk about this before.
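To make that answer concrete, the draw-then-send cycle can be sketched in plain Java. Wayang takes care of the real pixel layout and the USB transfer, so treat the 5-6-5 packing and the names below as illustrative assumptions; only the 960×160 display size comes from the actual hardware.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Sketch of the Wayang idea: draw into an offscreen image the size of the
// Push 2 display, then pack each pixel into a 16-bit word before blasting the
// buffer over USB. The exact bit layout and the transfer itself are simplified
// assumptions here; Wayang handles those details for you.
public class PushFrameSketch {
    static final int WIDTH = 960, HEIGHT = 160;  // the real Push 2 display size

    /** Pack one ARGB pixel into a 16-bit 5-6-5 word (illustrative layout). */
    public static int packPixel(int argb) {
        int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3);
    }

    /** Convert a whole frame; in Wayang this buffer would then go out over USB. */
    public static short[] packFrame(BufferedImage frame) {
        short[] packed = new short[WIDTH * HEIGHT];
        int i = 0;
        for (int y = 0; y < HEIGHT; y++)
            for (int x = 0; x < WIDTH; x++)
                packed[i++] = (short) packPixel(frame.getRGB(x, y));
        return packed;
    }

    public static void main(String[] args) {
        BufferedImage frame = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = frame.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 100, HEIGHT);  // any Java drawing operations work here
        g.dispose();
        System.out.println(packFrame(frame).length);  // 153600 pixels per frame
    }
}
```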
This is an example of a spatial effect, and what it does is it takes
the dimensions of your whole lighting grid, and assigns them all the colors of the rainbow.
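As a rough sketch of that spatial idea (the names and the simple linear rig model are mine, not Afterglow's API), mapping a fixture's position across the rig onto the hue circle might look like:

```java
// Sketch of a spatial rainbow: each fixture's x-position across the rig is
// mapped linearly onto the hue circle (0 = red ... 360 = back to red).
public class SpatialRainbow {
    /** Map a position within [minX, maxX] to a hue in [0, 360). */
    public static double hueForPosition(double x, double minX, double maxX) {
        if (maxX == minX) return 0.0;           // degenerate rig: everything red
        double fraction = (x - minX) / (maxX - minX);
        return (fraction * 360.0) % 360.0;      // the far edge wraps back to red
    }

    public static void main(String[] args) {
        // A light at the left edge of a 4-meter rig is red; mid-rig is hue 180.
        System.out.println(hueForPosition(0.0, 0.0, 4.0));
        System.out.println(hueForPosition(2.0, 0.0, 4.0));
    }
}
```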
So you’d asked about this earlier. So we’re going from a hue of zero, which is red,
all the way up to a hue of 360, which is back to red. And the physical location of the light
in space is what’s assigning it a color. That’s one way of doing color stuff, we
can also do it… oops… over time. So this is shifting that over time, or just doing
a rainbow over the whole grid. So those are spatial effects, and what I want
to show you now is tempo synchronization. So that’s why I brought the speakers. OK, Traktor, why are you not being happy? Hmm, strange. I should be able to grab this
controller and pick a different track, but it’s not letting me, so we’ll do it the other way. Oh maybe because we haven’t got that set.
Hmm, whatever! So if we have this playing [music begins]
I can sit here and tap the tempo… You can get pretty close that way. And actually, to make it clearer what’s going on,
I want to do an effect that’s designed for
helping with metronome synchronization. Now you can see better. You can do it that way, but it’s not all that easy,
and let me tell you— when you’re doing a show for twelve
hours, and you’re sitting there having to tap it and trying to figure out where the
beats are, and where the downbeats are, it gets tiring fast. So, one of the early things
I did was add this sync capability, and Afterglow has noticed that Traktor is running. I’ve
got a custom mapping file that I wrote for Traktor that sends, in addition
to MIDI clock, messages telling Afterglow where in the beat it is
right now, as well as which of the decks is currently playing, so Afterglow can figure
out not just the tempo but exactly where in the beat it is. So once I do that, the BPM locks right in
to 150, which is what it is, and as soon as I hit tap tempo once,
it’ll be right on the down beat. There we go. And, so even if I go into Traktor and start
messing around with tempo, of course it gets to a part of the track
where there isn’t much rhythm… But even if I go and change,
start messing with the tempo, Afterglow just stays right locked in to it. So that is a much nicer way
to run a light show. And, let’s kill you…
so one last step I think. So like I said before we’re not doing an
actual light show, but I’ll just show you a few of the pieces together
that would go into making one. [music begins] Nope! [laughter] [Audience] Demo time! What I can’t figure out is why the…
what is the problem with rainbow, oh! Demos. So! That was a very poorly executed demo
of a light show, which is why… It’s interesting—you’d think that preparing for a
presentation and preparing to DJ were similar? They’re not! You have to get into
a completely different place in your head when you’re going to talk about and explain
something than if you’re going to get up here and perform music. But. Hopefully you got
an idea of what I was talking about there. So that’s synchronization with Traktor,
which is nice because this is our much more portable gear, but
when we’re doing bigger shows we use… that gear there, which is Pioneer CDJs and a mixer,
which is what you’re going to see at a big festival, or big club, and they don’t synchronize in the same
way as Traktor does, obviously. But they do talk to each other over Ethernet,
so I said, huh, these guys sync to each other, I wonder if I can figure out how they do it?
So I fired up Wireshark, and started watching the traffic between them, and I was able to
reverse-engineer the Pioneer DJ Link protocol, which is how they synchronize with each other,
and now Afterglow is quite happy to eavesdrop on that and synchronize your light show to what
the DJs are doing on their professional club gear. And that has turned into its own open source
project last month, called… well, dysentery is the first of them [laughter] yes, I’m
glad people got that, that is a Clojure project which is designed to help probe the protocol
and learn more about it, it just brings up windows showing you what’s coming and what
we think we know. That led to beat-link, which is a Java library that implements what we
learned from dysentery, and then beat-link led to beat-link-trigger which was used at
the Electric Elements music festival in Ontario, Canada, four weekends ago now, to synchronize
the visuals on that big stage with what the DJ was playing on his CDJs,
which was quite fun. And yeah, we’re taking it in other directions.
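As a taste of what that eavesdropping looks like, here is a minimal sketch: beat packets arrive as UDP broadcasts (port 50001, per the dysentery analysis), and Pro DJ Link packets open with the same ten-byte header. Everything beyond this header check is left to beat-link; the class and method names here are mine.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Sketch of the kind of eavesdropping dysentery and beat-link do: listen for
// the UDP datagrams CDJs broadcast, and recognize them by their magic header.
// Real packet parsing (BPM, beat-within-bar, and so on) lives in beat-link.
public class DjLinkSniffer {
    // The ten-byte header shared by Pro DJ Link packets ("Qspt1WmJOL").
    static final byte[] MAGIC = "Qspt1WmJOL".getBytes(StandardCharsets.US_ASCII);

    /** Does this datagram look like a Pro DJ Link packet? */
    public static boolean looksLikeDjLink(byte[] packet) {
        return packet.length >= MAGIC.length
            && Arrays.equals(Arrays.copyOf(packet, MAGIC.length), MAGIC);
    }

    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(50001)) {
            byte[] buf = new byte[1500];
            DatagramPacket dp = new DatagramPacket(buf, buf.length);
            socket.receive(dp);  // blocks until a CDJ announces a beat
            byte[] data = Arrays.copyOf(dp.getData(), dp.getLength());
            System.out.println("DJ Link packet? " + looksLikeDjLink(data));
        }
    }
}
```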
Everyone wants to get the track metadata— the CDJs, you know, know the artist and title,
so if you could see what’s loaded up on the different CDJs, it could help you prepare
your light show even better. I’ve gotten close. They do talk to each
other, it’s a TCP connection directly between the CDJs, so I had to get a managed switch
so I could span the ports and see that traffic. I’ve gotten to the point where I’m replaying
everything up to the point where they spit out the track information, and I don’t get
it back yet. So, I’m working with a guy in Portugal on that one, and hopefully we’ll have
that cracked soon, which would be awesome. [Audience] Does, do, does Traktor and the
CDJs have like metadata about the part of the song that you’re entering that would
be interesting for lighting? There are cues, and I’ve already figured
out how to do that, so when you’re running the CDJs you have those cue markers in the
tracks, and the DJ will have a cue marker on the player that’s over here, and he’s
looking at this one, ready to start the thing, and he can watch the countdown. I can display
the countdown in beat-link or in dysentery, and I’m going to add that to Afterglow,
I haven’t yet. I’ve been mostly working on beat-link-trigger until I started preparing
for this demo, but that’s going to be, yeah, getting the cue countdown in there, and getting
the track metadata, it’s going to be super sweet. One last little gratuitous demo, and we’re
reaching the end. [chuckles] When I talk about this being an integration
hub, for lots of different kinds of things, there’s a protocol called Open Sound Control
which is like MIDI, only it’s much more modern. It’s network based, so it goes
over TCP, actually, or UDP. In addition to having a numerical value, you can have
other types: strings, floats, integers, I forget what all, and there are hierarchical
addresses, like the different hierarchical paths you might have in a web application, so
you can get or set hierarchically named things. So that’s a much richer interface. There’s a program called TouchOSC… let
me set this up here, which runs on an iPad, so… go to the aimers… OK, so those two guys on
the edges are being aimed by this aim cue here. And I’ve got TouchOSC running on this.
Now we’ll see if this works or not, this is not my normal network that I use this for,
so it could well… yep! Looks like it’s working. So here I’ve got those two cues that are
turned on, can turn those cues on and off with this, but this is the room we’re in,
since it knows the dimensions of the room, I can say here’s the point in the room I
want you guys to aim at, and like I said, it does the trigonometry to figure
out, for each of those two lights, how to aim it at that spot on the ceiling, or,
this is the height, so let’s go down to the floor. That is TouchOSC talking to Afterglow cues.
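The trigonometry behind that aim cue can be sketched as follows. The coordinate conventions here are assumptions, and Afterglow additionally folds in how each fixture was hung (its rotations) before producing DMX values:

```java
// Sketch of the math behind an aim cue: given a fixture's location and a
// target point (both in the show's coordinate space, in meters), work out the
// pan and tilt angles that point the head at the target.
public class AimMath {
    /** Returns {panDegrees, tiltDegrees} to aim from the fixture at the target. */
    public static double[] aim(double fx, double fy, double fz,
                               double tx, double ty, double tz) {
        double dx = tx - fx, dy = ty - fy, dz = tz - fz;
        double pan  = Math.toDegrees(Math.atan2(dx, dz));                 // left/right
        double tilt = Math.toDegrees(Math.atan2(dy, Math.hypot(dx, dz))); // up/down
        return new double[] { pan, tilt };
    }

    public static void main(String[] args) {
        // Two fixtures a meter apart aimed at the same point get different
        // angles, which is exactly the aim-versus-direction distinction.
        double[] left  = aim(-0.5, 0, 0, 0, 2, 2);
        double[] right = aim( 0.5, 0, 0, 0, 2, 2);
        System.out.printf("left pan %.1f, right pan %.1f%n", left[0], right[0]);
    }
}
```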
And it worked! [Audience] So is TouchOSC a
product by the OSC folks? OSC is an open protocol, TouchOSC is by a
German company called Hexler, there’s a lot of Germans involved here! Ableton is a
German company, Traktor is, er, Native Instruments is a German company, Hexler is German, they’re
really into this electronic music stuff. I think TouchOSC is like five bucks, ten bucks
for your iPad, or iPhone, not bad at all. And yeah, you can put whatever cue interfaces you
want on there. I’ve got Sparkle mapped as well. All right! So we’re coming close to the
end here, and this is just something I’m not showing you tonight, I mentioned we had
another laser that I just didn’t have time to set up, and do all the safety tests for here,
but I’ll give you a look at what it can do. This is our pride and joy, it’s a 30,000
point-per-second RGB analog color mixing laser, and it has its own software package
that controls it, called Pangolin Beyond, but Pangolin Beyond can talk over a UDP protocol
called PangoTalk, so of course I built Afterglow extensions to control Beyond
from an Afterglow show. And yeah, with haze, that thing is really quite beautiful. All right, so that’s it for the gratuitous
demos. I just wanted to talk a bit more about how this relates to Clojure again. So you
saw a bunch of effects happening there; what is an effect? Well, in Afterglow, an effect
is an implementation of a Clojure protocol, and that protocol has three methods. First, on every frame we ask the effect “are
you still active?” because it might not be. In which case, we just remove it from
the list of active effects. And we pass it, well, with any protocol you pass the instance
of the object you’re calling, and the show, because it might need to look at variables
in the show, and that’s where you store variables, and then snapshot is a metronome
snapshot. So at the start of each frame we take a snapshot of the metronome saying
what point in time this frame is being rendered for. The reason we do that is if it takes a little
while to think about the different effects across the grid, we want this light over here
to be synced up with that light down there. So rather than using the actual “now,” we
have them all run off the same snapshot. So, if the effect is still active we call generate,
which we pass those things to as well. Generate spits back a list of assigners. I’ll talk
about assigners in a moment. Then if you want the effect to end, you call the end method,
or end function in the protocol, and end can either return “yep, I’m done” or it can say
“hold on a bit, I’ve got a little cleanup to do.” So your effect might have some sort of
cool animation it runs as you’re leaving. So those are the three effect protocol parts,
what are assigners? Assigners are records implementing another protocol,
and they basically say “I want something to happen.” So, it has a kind, which can be a channel,
just set a particular DMX channel to a particular value, a function, meaning like “pick the sunflower
looking thing”… a color, a direction, meaning point that way, or aim,
which is similar to but different from direction: aim means point at that particular point, so
if you assign two different lights the same direction, they’ll both point the same way,
if you assign them both the same aim, they’ll both point at the same point. And then custom, you can build your own stuff.
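Afterglow defines all of this as Clojure protocols; purely as an illustration, the same shape might be rendered in Java like this (every name is a paraphrase, and Show and Snapshot are stand-ins for the show state and the per-frame metronome snapshot):

```java
import java.util.List;

// Illustrative Java rendering of the three-part effect protocol described
// above. Afterglow itself defines this in Clojure; this is only a sketch.
public class EffectSketch {
    interface Show {}      // show state, including its variables
    interface Snapshot {}  // metronome snapshot: the instant this frame renders
    interface Assigner {}  // "I want something to happen": channel, color, aim...

    interface Effect {
        boolean stillActive(Show show, Snapshot snapshot);      // asked every frame
        List<Assigner> generate(Show show, Snapshot snapshot);  // produce assigners
        boolean end(Show show, Snapshot snapshot);  // true: done now; false: winding down
    }

    /** A trivial effect that stays active for a fixed number of frames. */
    static class CountdownEffect implements Effect {
        private int framesLeft;
        CountdownEffect(int frames) { framesLeft = frames; }
        public boolean stillActive(Show show, Snapshot snapshot) { return framesLeft > 0; }
        public List<Assigner> generate(Show show, Snapshot snapshot) {
            framesLeft--;
            return List.of();  // a real effect would return channel/color/aim assigners
        }
        public boolean end(Show show, Snapshot snapshot) { framesLeft = 0; return true; }
    }
}
```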
So that’s how I pulled in Pangolin Beyond, that I was just talking about,
or setting variables, or things like that. So how that all works is there’s a multimethod,
so you run these assigners, and then they’re merged so that everything that’s affecting
the same object gets sorted by priority, and then, rather than just one winning, they’re run in
order, and each assignment can take what was coming before it, and modify it if it wants to.
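That run-in-priority-order merge can be sketched like this, with a plain brightness number standing in for real assignment values (the class and record names are mine):

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of the merge step described above: everything affecting one target is
// sorted by priority and run in order, and each assignment receives whatever
// value the previous one produced, which it may pass through or modify.
public class AssignmentChain {
    record Assignment(int priority, UnaryOperator<Double> resolve) {}

    /** Run all assignments for one target, lowest priority first. */
    public static double resolve(List<Assignment> forOneTarget, double initial) {
        return forOneTarget.stream()
            .sorted((a, b) -> Integer.compare(a.priority(), b.priority()))
            .reduce(initial,
                    (value, assignment) -> assignment.resolve().apply(value),
                    (a, b) -> b);
    }

    public static void main(String[] args) {
        // A color effect sets full brightness; a virtual dimmer running after
        // it (higher priority) scales the incoming value down, as described.
        List<Assignment> chain = List.of(
            new Assignment(0, prev -> 1.0),         // base color effect: full on
            new Assignment(10, prev -> prev * 0.5)  // virtual dimmer: halve it
        );
        System.out.println(resolve(chain, 0.0));    // 0.5
    }
}
```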
So that’s how the virtual dimmer works, it takes the color, and then modifies it to
dim it. And in the end there’s a multimethod that
takes those assignments and figures out how to turn them into DMX buffers to get blasted
out to the lights. To build things that are real light shows you need more
than simple effects, and so there’s some basic compounding that you can use. A scene is just a bunch
of effects; whenever you run the scene, it runs all of them. A fade takes one effect and
smoothly moves it into another effect. That might be changing colors from red to blue
smoothly, or changing direction from this to this smoothly, or changing from one scene
to another scene. And then chases are basically fades that go
between a lot of different effects, and then a blank is just nothing happening; you might
have part of your chase where nothing is going on. You can have them be conditional, you can have
them set variables, you can build stuff up with that. So that’s Afterglow as it is today. What possibly
is there left to do? Well, I want to make more different kinds of effects, more different kinds of cues.
I’d love to collaborate with a real lighting designer. I’m not one. I’m a software developer who DJs,
I have some ideas about how to make light shows pretty, but I have a bunch
of friends who work in theater, who are all super busy, but if I could work
with one of them, you know, take their knowledge of what looks great on stage with my knowledge
of how to make it happen on the lights, I think we could get some really cool stuff going on. Another obvious direction is audio analysis.
In addition to syncing up those other ways, you could actually listen to the music and figure out where the beats are, and match the tempo that way. That’s the easy part,
the fun part is when you start doing fast Fourier transforms to figure out where in
the audio spectrum the energy is, and doing visualizers like you see in iTunes, but having
the visualizer across the whole lighting grid; that would be kind of cool. And a different kind of visualizer: Afterglow
already knows where all the lights are, and where they’re pointing and what they do,
it would be nice to be able to get a preview, so that I don’t have to sit and hook up all the lights
just to see “how does that effect look?” Just render it, so I can see it on
the screen as an approximation. And I have some work in that
direction going, using WebGL, but the approach I took, while it looked beautiful, kills the GPU, because I was doing
ray tracing through a field of fog. It works great with three lights; with ten, not
so much. So anybody who is good with graphics,
I’d like to work with on that. I was thinking, a smartphone app to help you
figure out where your light is, because right now you have do some, get out your angle measure
and do some trigonometry in your head to figure out, here’s the normal orientation for the
light, but it’s hanging this way, what kind of rotations did I do? That’s the most tedious
part of putting the show together. So if you could have your camera pointing at the light,
and it could recognize a feature on the light, and then use the compass and accelerometer
to figure out how the phone is oriented, it could do that work for you,
which would be pretty slick. That’s what I want to do, anything else
you guys want to see? All right! Well if you want to look a bit
deeper, you can find all this stuff on GitHub. My GitHub account is brunchboy. And Afterglow’s
the main project, and beat-link is the other one I was talking about, that’s synchronizing
with the Pioneer gear, Wayang is the one that draws the pictures on the Push 2 display.
And that’s my email address. So thanks a lot for watching!
[applause]
