

This is a pre-recorded clip, but you can check out the current live stream at dreammachine.ai.

Before we get into the specifics of Dream Machine, could we hear a little about who you both are and what you do?

James: I’m a producer, songwriter, sound designer, I make Reaktor instruments, a whole bunch of stuff. Most recently I’ve embarked on a project called Dream Machine, which is a 24/7, procedurally-generated radio station.

Dan: I’m Dan, a buddy of James. My background is in creative direction, and I run a studio called Space Agency. James showed me the engine he was building in Ableton, and all the cool characters he was doing, and we basically sat down and put together this Dream Machine thing. I help out with the planning, integrations, collateral materials, socials, partnerships and strategy, as well as working with James on new versions of ideas.

Where did the initial idea for Dream Machine come from?

James: It started with the concept of being able to procedurally generate beats. I was making another project a couple of years ago called Twitch Plays The Synth – a custom Reaktor synthesizer you could control with Twitch chat commands that ran 24/7 – and I had this idea: “what if I ran a 24/7 lofi stream?” Initially we took a bunch of samples and drum loops, put everything in the same key, and randomized them. From there it turned into something that doesn’t use samples at all. We’re generating MIDI notes, generating the mix elements; all the production is happening in a similar fashion.

I was also doing 3D modelling at the time, and I was making these little blobby characters. I was putting them on Instagram for fun, and Dan’s like, “Yo, you should combine your 3D dudes with your lofi generator thing.” Next thing you know, Dream Machine is born. I had to figure out how to do the visuals for the stream as well, so I had to learn Unreal Engine, make Ableton talk to Unreal Engine, and then get that streaming 24/7. It turned into this huge endeavour, it was crazy. Every two to three minutes, there’s a new beat and a new visual. But it all starts in Ableton with a bunch of MIDI clips, Max for Live devices, Native Instruments plugins, all sorts of stuff.

What did getting Dan on board bring to the project?

James: The audio engine and the Unreal Engine, Dan married those two and brought the concept to life. A project like this, you need other people involved, and Dan’s been the other guy planning what to do with it, because there’s a question at the end of the day, when you’ve spent hours and hours on this thing: “well, what are you going to do with it?” Dan’s finding creative ways to bring Dream Machine not only to people on the internet where it lives, but to things in real life.

Could you go through how Dream Machine is put together and how each of its constituent parts are talking to each other?

James: We have a Mac Mini that runs Ableton, and beside it is a gaming PC with an Nvidia graphics card. Each has an audio interface, a Steinberg UR22 MK2. Ableton sends audio and MIDI from the Mac Mini out from one of the interfaces, and the other interface connected to the PC picks that up. In terms of the software that’s running and how it works, it’s essentially just an Ableton Live session. The audio goes into OBS and Unreal Engine, and the MIDI goes to Unreal Engine. Everything happens in Ableton, from the MIDI clip generation, to the samples that are loaded in, to the plugins that we may or may not be using.

What’s been the trickiest thing to get right with this system?

James: The hardest thing to generate is rhythm. It’s really easy to generate notes that go well together, but music is all about rhythm. At first it was just jumping through a bunch of different chord clips, and it would sound slightly aleatoric, meaning it would sound random. There’d be times where your brain would pick up a chord progression or some sort of coherent harmonic structure, but it was lacking a little bit.
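James’s point about rhythm being the hard part can be seen even in a toy generator. The sketch below is a hypothetical Python example, not the actual Dream Machine engine (which lives in Ableton, Max for Live, and Native Instruments plugins): picking diatonic chords at random almost guarantees notes that “go well together,” while the rhythm line is left as independent coin flips, which is exactly the kind of aleatoric result he describes having to engineer away.

```python
import random

# Pitches of one octave of C minor as MIDI note numbers.
# Any random choice from these will sound "in key".
C_MINOR = [60, 62, 63, 65, 67, 68, 70]

def random_chord(scale):
    """Stack diatonic thirds on a random scale degree: harmony is the easy part."""
    degree = random.randrange(len(scale))
    return [scale[(degree + i) % len(scale)] + 12 * ((degree + i) // len(scale))
            for i in (0, 2, 4)]

def random_rhythm(steps=16, density=0.4):
    """A naive rhythm: an independent coin flip per 16th-note step.
    This is what tends to sound random -- real grooves need structure
    (strong beats, repetition, syncopation rules), not just chance."""
    return [random.random() < density for _ in range(steps)]

if __name__ == "__main__":
    random.seed(7)
    pattern = random_rhythm()
    # Place a random diatonic chord on every step the coin flip selected.
    clip = [(step, random_chord(C_MINOR)) for step, hit in enumerate(pattern) if hit]
    for step, chord in clip:
        print(f"step {step:2d}: {chord}")
```

Even in this sketch the asymmetry is visible: the chords are diatonic by construction, but nothing constrains where they land in time, so the output drifts rather than grooves.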
