Modulations


Director Robert Lepage is a living treasure of the Quebec creative scene. With his mad blocking skills and sense of perspective, I think of him as the Pythagoras of stage and screen geometry, but given his non-Euclidean witchcraft, maybe Lovecraft is a more apt comparison.

Let me give you an example: in his film Possible Worlds there’s a scene where the protagonist walks through one particular door in a row of doors opening into a hospital cafeteria. Most directors would consider that a rather pedestrian scene and film it with commensurate attention to detail, maybe adding some backlighting to make it dramatic. Lepage, on the other hand, shot it from a perspective that takes one’s breath away, and the row of doors clearly becomes a metaphor for possible passages to the possible worlds of the film’s title.

Lepage’s alchemy of perspective, then, is a perfect match for the multi-perspective technology of FRAGMENTATION, the installation that kicked off the first edition of BIAN, the international biennial of digital art.

FRAGMENTATION is an adaptation of three scenes from the show LIPSYNCH, directed by Robert Lepage, for the ReACTOR six-screen 3D projection system designed by Sarah Kenderdine and Jeffrey Shaw. (Capsule bios of Kenderdine and Shaw are here.)

Credit: Volker Kuchelmeister

[spoiler alert! If you're in Montreal, read no further: plan a trip to the Musée des Beaux-Arts de Montréal between now and August 5th, 2012, because what I'm about to explain is a lot more fun when you walk around the exhibit and have the concept come together through your own experience.]

The scenes of FRAGMENTATION were filmed from multiple perspectives simultaneously and are projected from multiple perspectives on ReACTOR. As the viewer walks around the hexagon, he sees the scene from six different perspectives, one of which is the “staging” perspective, framed to bring the elements of the stage – actors and props – together into a coherent whole.

From the other five perspectives/screens, one sees fragments of the props, such as a table, chair and piano, in pieces scattered across the stage in such a way that they form a coherent whole only when viewed from the sixth, “staging” perspective.

At the climax of the 11-minute loop, a major character staggers across the stage and leaves, in the process moving his body through tables, chairs and the piano – a violation of physical law that can only be seen from the staging perspective. From the other five screens one sees the performer moving between widely spaced props – the fragmented or deconstructed view of the action. In the end, the only “real” items on the stage are a single chair and a lonely brain on a pedestal.

The BIAN continues in the weeks and months ahead, and includes work by Carsten Nicolai (Alva Noto) and Ryoji Ikeda. If you’re in Montreal, check it out – most of the exhibitions and activities are free, and some overlap with the Elektra and Mutek festivals. Big up to Alain Thibault, artistic director of the BIAN. The lineup of events is impressive, and Alain must be exhausted and proud.

It occurs to me that it’s been ages since I posted an actual, you know, Reaktor tutorial around these parts. Which is part of what this blog was originally created for! So here’s something nice and meaty: a tutorial on basic algorithmic music generation techniques in Reaktor.

What this tutorial and its associated instruments will do is give you a basic feel for how events in Reaktor can be wrangled and manipulated into musically meaningful forms. What it WON’T do is compose your next masterpiece or write a top ten hit:

[soundcloud url="http://api.soundcloud.com/tracks/43507376" iframe="true" /]

However, what you might do is take some of these techniques and create instruments that go further towards realizing your musical ambitions. I’ve always thought of Reaktor as a tool that blurs the distinction between instrument and composition. If you think along the same lines then you will find this information useful. Happy building!
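
To make “wrangling events” a little more concrete before you dive into the PDF, here’s the flavor of one staple technique: a random walk over a scale. This is a sketch in Python rather than Reaktor patching – in Reaktor you’d build it from a clock, a counter and a lookup table – and the scale, root and ranges are my own choices, not the tutorial’s:

```python
import random

# Random-walk melody generator: step up or down a scale table,
# the event-stream equivalent of a clock + counter + lookup in Reaktor.
SCALE = [0, 2, 3, 5, 7, 8, 10]  # natural minor, as semitone offsets
ROOT = 48                        # MIDI note C3

def random_walk(steps=16, start=7, max_leap=2):
    """Yield MIDI note numbers by wandering along the scale."""
    degree = start
    for _ in range(steps):
        degree += random.randint(-max_leap, max_leap)
        degree = max(0, min(degree, 2 * len(SCALE) - 1))  # clamp to 2 octaves
        octave, idx = divmod(degree, len(SCALE))
        yield ROOT + 12 * octave + SCALE[idx]

print(list(random_walk()))
```

Swap the scale table or bias the leap distribution and the character of the line changes completely – exactly the kind of knob the tutorial instruments expose.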

Download the tutorial here (PDF and instruments)

 

I was experimenting with TouchOSC’s new MIDI capabilities and created this layout:

There are four octaves of keys – it was a bit of a pain to go through them one after another and adjust the pitches, so I figured I’d share the fruits of my labor. The other controls are just X, Y and XY sliders that send MIDI CC messages. They’re arranged in a “well, let’s use up this space and see what happens” pattern. I figure I’ll fine-tune it after playing it for a while.
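
If anyone wants to build a similar layout from scratch, the fiddly part – assigning a pitch to every key – can at least be precomputed. A hypothetical sketch; the base note is my assumption, not necessarily what this layout uses:

```python
# Precompute MIDI note numbers for a four-octave key layout.
BASE_NOTE = 36               # assumption: C2; shift to suit your layout
BLACK = {1, 3, 6, 8, 10}     # semitone offsets of the black keys

for i in range(4 * 12):
    note = BASE_NOTE + i
    color = "black" if note % 12 in BLACK else "white"
    print(f"key {i:2d}: MIDI note {note} ({color})")
```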

Download it here.

Because someone asked on a forum, I decided to write a quick note on how to set up the ports and addresses for Reaktor, Konkreet Performer and TouchOSC.

First, get Reaktor set up. This is straightforward – generally as simple as activating Reaktor to receive OSC. Don’t worry about the outgoing OSC setup for now; the goal is to get Reaktor listening. Take note of the IP address and “local port”.
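
Once Reaktor is listening, you can sanity-check the connection by firing a test message at that IP address and port from anywhere on the network. A minimal sketch using the python-osc package (pip install python-osc); the /test address is arbitrary and the IP and port are placeholders for the values you noted:

```python
from pythonosc.udp_client import SimpleUDPClient

REAKTOR_IP = "192.168.0.10"  # placeholder: Reaktor's IP address
REAKTOR_PORT = 10000         # placeholder: Reaktor's "local port"

client = SimpleUDPClient(REAKTOR_IP, REAKTOR_PORT)
client.send_message("/test", 0.5)  # a float should show up as incoming OSC in Reaktor
```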

It’s always been a sore point for me that Reaktor can only receive OSC messages when running stand-alone. Open it as a plugin in a host and the OSC support disappears. Now I can salve my angst somewhat with the latest release of TouchOSC, which can send both OSC and standard MIDI signals.

Feast your eyes on the new editor:

See that new “MIDI” tab on the left? Ooooooooooo!

This means I can now use the same interface to control an instrument whether it’s running standalone or in a host. Granted, the resolution of MIDI is lower than OSC’s, among other drawbacks, but I’d be using MIDI anyway in a host, and would otherwise have to break out a box of knobs and get used to a different control surface to use the same instrument. This is definitely an improvement! Not to mention all the other neat tools and toys that I can now address directly with TouchOSC.
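
To put a number on that resolution gap: a MIDI CC squeezes a continuous control into 7 bits – 128 steps – where OSC can carry a full float. A few lines of illustration:

```python
value = 0.300                 # a fader position between 0.0 and 1.0
cc = round(value * 127)       # what a MIDI CC can transmit: 38
recovered = cc / 127          # what the receiver reconstructs: 0.2992...
print(cc, recovered)
```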

Get the 1.7 update in the App Store and download the new 1.5 version of the editor here.

I’ve added some enhancements and fixed some bugs.

  • Most notably, there is now a key/slice copy and paste function. If you want to start from an existing key/slice, copy it to a new key and modify it, here’s what you do: press the key (middle C, for example) that contains the settings you want to copy, click the copy button, then press the destination key (C sharp, for example) and click the paste button. Then you can make the new copy as subtly or vastly different as you please.
  • I’ve nailed down a crash problem that some of you may have been encountering when playing different slices very quickly from a MIDI keyboard (or the computer keyboard for that matter).
  • All the parameter IDs have been sorted and compressed to aid VST automation in a host.

The 1.5 beta can be downloaded here. The archive password you received previously will work with this new version. 1.5 should work perfectly but I’m calling it a beta until more people than me have tested it!

For more information and to purchase, click here.

After a few too many Pan Galactic Gargle Blasters it’s time to dig out the ol’ Arcturan UltraTheremin and rock out on Squornshellous Zeta until you pass out on a living mattress.

I built this as a learning project, to figure out how to map Konkreet Performer’s nodes to voices in Reaktor. If you’re interested, you can download it here – you’ll need an iPad and Konkreet Performer to really make use of it. (It also works with TouchOSC – layout included in the download – if you don’t mind a slightly less freakish playing experience.)

[soundcloud url="http://api.soundcloud.com/tracks/13024702"]

Here’s how it works: set Konkreet Performer up with three nodes and a ribbon. The three nodes control pitch and intensity – angle from left to right is mapped to pitch, and proximity to the center node controls vibrato depth, vibrato rate and filter resonance. As pitch increases, the filter cutoff and pulse width also change.
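
For the curious, here’s the mapping logic in sketch form. This is an illustration of the idea, not Konkreet Performer’s actual message format – the coordinate system, ranges and names are all my assumptions:

```python
import math

def node_to_controls(x, y, cx=0.5, cy=0.5):
    """Map one node's (x, y) position, in 0..1, to pitch and expression."""
    angle = math.atan2(y - cy, x - cx)                    # node angle around center
    pitch = 36 + (angle + math.pi) / (2 * math.pi) * 48   # spread over 4 octaves
    distance = math.hypot(x - cx, y - cy)
    proximity = max(0.0, 1.0 - distance / 0.5)            # 1.0 at the center node
    return {
        "pitch": pitch,
        "vibrato_depth": proximity,
        "vibrato_rate": 1.0 + 7.0 * proximity,            # say, 1..8 Hz
        "filter_resonance": proximity,
    }

print(node_to_controls(0.8, 0.4))
```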

The ribbon at the bottom controls voice fanning – in the center position, all voices are centered. At left and right, the voices fan out so that voices one and three are panned out left and right, respectively. Try it and you’ll see how it works.
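
The fanning itself reduces to scaling a fixed pan spread by how far the ribbon sits from center – something like this, with the ribbon range and voice indexing assumed:

```python
def fan_pans(ribbon, n_voices=3):
    """ribbon: -1.0 (hard left) .. 0.0 (center) .. 1.0 (hard right).
    Returns a pan position in -1..1 for each voice."""
    offsets = [i - (n_voices - 1) / 2 for i in range(n_voices)]  # e.g. -1, 0, 1
    return [max(-1.0, min(1.0, off * abs(ribbon))) for off in offsets]

print(fan_pans(0.0))  # [0.0, 0.0, 0.0]: all voices centered
print(fan_pans(1.0))  # [-1.0, 0.0, 1.0]: voices one and three fanned out
```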

There’s also a pitch correction macro so you can select a key and scale type to play in tune with a track. Enjoy, and link me up if you make some interesting noise with it.
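
Under the hood, pitch correction of this sort amounts to snapping each incoming pitch to the nearest degree of the chosen key and scale, along these lines (the scale tables are illustrative):

```python
SCALES = {
    "major": [0, 2, 4, 5, 7, 9, 11],
    "minor": [0, 2, 3, 5, 7, 8, 10],
}

def snap(note, key=0, scale="minor"):
    """Return the scale note nearest to the incoming MIDI note."""
    candidates = [12 * octave + key + degree
                  for octave in range(11)
                  for degree in SCALES[scale]]
    return min(candidates, key=lambda c: abs(c - note))

print(snap(61))  # C#4 (61) snaps down to C4 (60) in C minor
```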

And you probably thought the polyphonic theremin was a joke!

[YouTube video]

BTW, if you’re interested in using Reaktor with this kind of touch surface, please vote on my poll at Reaktor Tips.

Any sound generator in Reaktor can be modified to give MIDI note control over the output level. Here’s how to do it in Spacedrone:

Easy peasy. An ADSR envelope multiplies the signal going to the output. You can also use a selective note gate module instead of a vanilla gate module so that only one specific note triggers output. Adjust the attack, decay, sustain and release to taste.
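
If you want the multiply spelled out, here’s the same idea in numpy, with white noise standing in for Spacedrone’s generator (the envelope segments are linear, purely for illustration):

```python
import numpy as np

def adsr(gate_len, sr=44100, a=0.01, d=0.1, s=0.7, r=0.5):
    """Linear ADSR envelope for a gate lasting gate_len samples."""
    att = np.linspace(0.0, 1.0, int(a * sr))
    dec = np.linspace(1.0, s, int(d * sr))
    sus = np.full(max(0, gate_len - len(att) - len(dec)), s)
    rel = np.linspace(s, 0.0, int(r * sr))
    return np.concatenate([att, dec, sus, rel])

sr = 44100
drone = np.random.uniform(-1, 1, 2 * sr)     # stand-in for Spacedrone's output
env = adsr(gate_len=int(1.5 * sr), sr=sr)    # "note" held for 1.5 seconds
n = min(len(drone), len(env))
shaped = drone[:n] * env[:n]                 # the whole mod: source * envelope
```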

Download the modded ensemble here.

If you’re looking for an interesting way to manipulate recorded samples of Spacedrone or other audio material, try my sampler pack.

UPDATE: I’ve added pitch control too. It doesn’t work the way a normal synth would, because the pitches of individual voices have a random factor, but you can control the range.
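
Roughly speaking, each voice behaves like this, where the control sets the spread rather than an exact pitch (the ensemble’s actual distribution may differ):

```python
import random

def voice_pitches(center, spread, n_voices=6):
    """Each voice lands near the requested pitch, give or take the spread."""
    return [center + random.uniform(-spread, spread) for _ in range(n_voices)]

print(voice_pitches(60, 3))  # six detuned pitches around middle C
```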

I recently released an updated version of my ParamDrum 3 instrument. I liked the new features and wanted to get it out to people, but had some misgivings about the quality of the sample maps and sequences I’d created for it. To be blunt, they were boring… or worse. They say hindsight is 20/20, but sometimes it’s more like waking up with a hangover. If I woke up with my arm under those beats, I’d chew it off to get away.

Today I went back to the sample maps that I used in the original ParamDrum, which were taken from the Reaktor factory library instrument Massive. Within minutes, I made a groove and some variations that I was quite pleased with:

[soundcloud url="http://api.soundcloud.com/tracks/11762955"]

Hear how that has a certain… slinkiness? That’s what I’m after. But it just won’t come unless I’m inspired – with the wrong samples it’s like trying to French kiss a cinderblock.

I learned two lessons from this. Number one, trust your instincts – if you think something sucks, chances are there’s a reason for that; modesty isn’t always false. Number two, sample-based music is nothing without good samples, and working with bad samples, or samples that just aren’t scratching your itch, is a recipe for tedium.

My next move, drum-wise, is to devise a good way of creating one-shots that aspire to the fun factor of the Massive sample maps. I was working in Maschine last night, importing and slicing field recordings and applying effects, and that looks to me like a decent approach to creating one-shots with zing. What I’m doing now is creating variations on a sample with automation in Maschine, then exporting and slicing the file into one-shots in Reaper – which has superb beat detection and slicing built right in.
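
If you’d rather script that last step, the slice-at-onsets part can be roughed out in a few lines with the librosa and soundfile packages – though Reaper’s detection is more refined, and the file names here are placeholders:

```python
import librosa
import soundfile as sf

# Slice a field recording into one-shots at detected onsets.
y, sr = librosa.load("field_recording.wav", sr=None, mono=True)
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="samples", backtrack=True)
bounds = list(onsets) + [len(y)]

for i, (start, end) in enumerate(zip(bounds, bounds[1:])):
    sf.write(f"oneshot_{i:03d}.wav", y[start:end], sr)
```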

But you, dear reader, probably have some techniques of your own for creating one-shots and I’d like to hear what they are. Do you have favorite effects or effects chains to apply to percussive samples? The wilder the better. Convolution, layering, bitcrushing, flanging? Share if you’re willing.