Cumbia Cosmonauts Music Video

Above : more proof that Space Is The Place…. at least when it comes to Mexi-Australian tropical bass genres.

Those are the fruits of a few quick projection and filming sessions with the Cumbia Cosmonauts, featuring custom graphics made by the CC VJ – Martin Hadley (I especially liked his spaceship control deck!). I'd like to think that if there's ever a Mexi-Australian space program, it looks something like this… ie has that Ed Wood-in-space vibe about it, maybe with styling by Lee Scratch Perry & Sun Ra.

The Cumbia Cosmonauts are a Melbourne band celebrated around the world for their take on Mexico's cumbia music, so fittingly, their new album, Tropical Bass Station, comes out on the Berlin label Chusma Records on Nov 23, 2012. The track 'Our Journey To The Moon (And Back)' comes from that album.

(Other recent video projects)

by j p, November 21, 2012 0 comments

Dr Seuss + Elefant Traks At The Sydney Opera House


4 days later, and I'm still buzzing from the Elefant Traks vs Dr Seuss show at the Sydney Opera House.

Developed and performed for the Graphic Festival, it was an audacious project – inside a tiny time frame, create 18 songs and animations to reinterpret or remix the books of Dr Seuss for the stage. It never felt like enough time – and yet the amazing zoo / crew at Elefant Traks pulled it together and nailed a dynamic audiovisual smorgasbord (one that apparently moved some of the Seuss publishing folk to tears!).

My role was to develop and live-trigger the animations for the show, which was akin to developing a feature film in 6 or so weeks… while liaising with around 20 different musicians… "hey man, I've got this new idea for a beat / I'll get you those lyrics soon… etc etc" – so I wasn't surprised to find myself still rendering out clips on stage, right up to the last minute.

I’m going to put up some more animation info later, over at skynoise.net/projects, but for now, while still floating, I wanted to put out a huge thank you to:

– Jono ‘Dropbear‘ Chong + Darin Bendall, who did an amazing job, animating half of the tracks between them.
– Urthboy – who oversaw the crazy production, as well as performed throughout the show
– Unkle Ho, who helped tie together the visual production, and developed his own flash-based interactive visuals for the show, AV jamming on a wii-board to Green Eggs & Ham, with Jim from Sietta + Angus from Hermitude.
– Luke Snarl Dearnley, who did a stellar job as technical producer, keeping the whole show smooth as butter.
– Owen Field, who covered all the logistics with grace and calm…

And that list could go on and on – there were endless Elefants who were such a pleasure to collaborate with…

Some Elefant clips:

X-Continental, a clip I did for the Herd back in 2001.
Urthboy, Ozi Batla, Solo, The Tongue and L-FRESH: Cipher at the Opera House
and below, Dropbear's fantastic animation for 'And To Think That I Saw It on Mulberry Street', which was performed as the first track of the show, by Urthboy, Jane Tyrrell + Angus from Hermitude. Ozi Batla had just given his show-intro in an aviator costume, and a hooded Urthboy came on to do a quick rap about Dr Seuss, before pulling back the hood as the lights came up, the decks started up, and MCs roamed the stage with this as backdrop:

by j p, November 15, 2012 0 comments

A Tribute to Runwrake, 1965-2012, RIP

Runwrake RIP

runwrake

I cried last night, upon reading that John Wrake, aka Run Wrake, had passed away.

I’d first learned of his cancer diagnosis a few months ago, after wandering once again to his youtube page, and noticing a short and simple message underneath his most recent short film:

Run Wrake youtube
Down With The Dawn features Run Wrake's usual virtuosic animation, but knowing that this 8-minute short film was his response to being diagnosed with cancer made it quite confronting viewing. I was shocked then, but somehow presumed he was turning things around, that he was on the slow path to recovery, that although tragic, everything would be okay.

Award-Winning “Rabbit” Director Run Wrake Dies, 47
The Most Talented Dude You’ve Never Heard Of
A Genius Gone Too Soon. British Animator Run Wrake Loses Battle With Cancer.

"It is with incredible sadness that I have to let you know that our darling Run passed away very suddenly at 5am on Sunday morning as an end result of his cancer. He had spent a beautiful Saturday with his two children Florence and Joe, his sister Fiona and myself. We left him at 7pm doing what he loved best – drawing and animating with peg bar and paper.
I was with him for his last moments. We love you Run.
Lisa Wrake.”

runwrake screenshot medley

Above, hard-drive snapshot of some of my favourite RunWrake animations.

I first learned of Run Wrake around 10 years ago, through his compilation Gas DVD, "Dinnertime". Somehow it had lain unwatched in a pile of media for a few months, until late one evening I spied it again and lazily inserted it, then pressed play. What followed was dizzying and overwhelming – that mix of exhilaration and exhaustion when discovering an artist so consistently good, so relentlessly inventive, and so utterly prolific that you're left wondering if they exist under different laws of time and space.

A few years later, I was thrilled when Run Wrake agreed to an interview (published in 3D World magazine, as well as skynoise in 2006).

A snippet below:

Where did ‘Run Wrake’ come from? 
Actually a nickname earned whilst keeping wicket particularly badly during a game of cricket aged 11. A friend was sent in for sarcastically shouting "Run", as the ball went thru' my legs for four.

With so much animation under your belt, what has it taught you?
It’s taught me that I’m very lucky to have the desire and ability to scrape a living doing what I enjoy, and that you will never make a piece of work with which you are entirely satisfied.

To what extent do you storyboard your clips? Or how do you approach narrative?
"Rabbit" is the first film that I have rigorously boarded, with a view to telling a story, and I thoroughly enjoyed the discipline.

Any desire for feature films, or longer works?
Absolutely, watch this space*.

(*As of 2012: Wrake was developing an animated feature, The Way to a Whole New You, with writer Neil Jaworski for BBC Films.)

One of my questions was whether Run Wrake had ever animated a skateboarder, and he was kind enough to add a note at the end saying that he'd done an ad featuring a skater, and that he'd attached a little quicktime movie of it for me. One of those wow moments – a favourite artist sending me something they'd made?? Below, a screenshot sequence from it, which demonstrates one of his trademark 'perpetual zoom outs'…

Runwrake skateboarding

runwrake dvd

A glimpse at his biography (have you seen a more delightful online CV?) showed some of how this was all possible. Run Wrake had gone through the Chelsea College of Art and Design, and the Royal College of Art, before achieving a breakthrough with his 1990 student film Anyway on MTV's Liquid Television. With Anyway, several strengths were already evident – an eagerness to playfully deconstruct form, an ability to adapt and incorporate many kinds of media and animation styles, and an incredible capacity for fluid transitions – smoothly morphing into wildly different scenarios or character transformations.

The DVD documents the development of all those strengths, as well as introducing others – a highly attuned sense of animation rhythm and pacing, and a flair for visualising sound and loops. That kinship with music was partially nurtured over time by his job as an illustrator for NME magazine (the DVD includes a virtual gallery of these illustrations, narrated by a flying turtle-armed boy), but is most evident across his trajectory of music videos, most notably those with long-time collaborator, Howie B.

Runwrake

Some favourite moments?

The alarm clock sequence within ‘What is that?’

How he plays with loops, one minute into Music for Babies by Howie B. (At time of writing, vimeo had just made the clip a ‘staff-pick’, in honour of Run Wrake’s passing.)

The intro sequence to ‘Jukebox’ – no, actually, just all of it…

The ‘Buttmeat‘ clip for Howie B. (All those liquid visual transitions!)

Music video directed by Run Wrake for Spacer’s 2001 single ‘The Beamer‘. (Love the scene transitions, and the disregard for time/space conventions).

And below – a sequence transition from Lessons in Smoking (this video link showcases it better), produced by Run Wrake for his compilation Gas DVD, "Dinnertime".

runwrake lessons in smoking

With all that under his belt, it's easier to understand how he got to describe his career highlights as including…

"my first job, commissioned by an Elvis-suited Jonathan Ross to make a title sequence…making Jukebox, my first animate! commission, a two year slog…meeting and working with Howie B, initially on a short film to accompany the release of his album Music For Babies, and subsequently on a series of freeform promos…presenting storyboards to Roy Lichtenstein in his New York studio for U2's Popmart Tour visuals…and the critical acclaim for Rabbit, a short film completed in 2005."

runwrake rabbit

Less easy to understand is why Run Wrake wasn't better known, even amongst animators. Even though he worked on U2 tours, and Rabbit won plenty of awards, it still felt like there was an animation giant walking amongst us, without enough recognition of how much terrain his work covered. That was at least partially remedied earlier this year, with a Run Wrake retrospective at the Ottawa International Animation Festival, its title referencing one of his favourite characters:

RUN WRAKE: MEATHEADS, RABBITS AND THE DAWN

Runwrake meatheads

Below, artwork recently donated by Run Wrake, 'On The Brink of Manhood' – part of CEL: an online fundraising project to keep the Animate Collect online.

runwrake

Runwrake.com
Runwrake on youtube
Runwrake on vimeo
Runwrake reworking personal home movies for live Audio Visual sets. ( Yes, he VJ-ed occasionally! )
A video interview with Run Wrake about his animation process.

RIP Runwrake… thanks for adding your splash of colour to the world.

by j p, October 27, 2012 1 Comment

VDMX 5 Review

VDMX 5 interface

(Above, a messy example VDMX interface of mine. Click screenshot to see full version)

Here's a review that's been brewing since I got my review copy back in 2005 (when VDMX first turned 5, says the Vidvox software museum*). Now that it's 2012 and we're at beta version 8.0.8.1, it seems as good a time as any to declare VDMX 5 ripe and ready. Let's do this.

What is VDMX 5?

VDMX 5 = A ‘modular, highly flexible realtime performance video application‘ developed by vidvox.net.

What does that even mean? The six-word executive summary by @Protostarrr – 'A hipsters version of After Effects' – is cute, but misses a crucial difference: VDMX is software built for real-time usage. No waiting around for rendering – it means live adjusting, manipulating and sequencing of video clips and video parameters, during a theatre performance, while musicians play on stage, within an installation, or to create some hybrid of what might be called live cinema. Just as hiphop and electronic music producers have long been playing live with audio samples, we now have the ability to shift from a studio production mentality towards using video samples in a live setting. This means VDMX must be capable of letting its users adapt and respond to any unfolding events – and the importance of having that flexibility is reflected in how Vidvox define their software:

"VDMX5 is a program that lets you assemble custom realtime video processing applications. This is an important distinction – instead of being stuck with a fixed processing engine and a static interface, it gives you the freedom to assemble not only whatever custom processing backend you desire, but it allows you a great deal of creative control over how you wish to interact with your backend."

VDMX interfaces

(Example search for ‘VDMX interface’ )

So what can VDMX 5 do? 

– Trigger separate clips for playback across different projectors (a desktop with multiple outputs, or an external graphics card for a laptop, is also needed)
– Mix several clips together to create layered collages and compositions (multi-blend mode options / compositing options / cross-fade options / customisable quartz transition modes)
– Map separate video layers onto physical objects (VDMX5 has basic perspective mapping functions, or can send video layers via syphon to other mapping software)
– Organise video layers into groups (which allows composition or FX parameters to be adjusted per layer or per group)
– Re-route any video layers into other layers / compositions (enables easy creation of visual feedback loops, or addition of more organic complexity with FX)
– Adjust or control any video parameter or FX parameter easily with an onscreen slider or button – and in turn, control these by various data sources (eg mouse / midi / audio analysis from built-in laptop microphone / LFO oscillators and wave values / midi + OSC controllers / wii controller / iOS or android controller etc – see the OSC sketch after this list), and these values can be flexibly refined by using a range of in-built math behaviours (eg invert values, smooth values, multiply values etc).
– Build Control Surface Plug-ins – which are ways to consolidate various controls into a customised interface (eg have 4 meta sliders, each of which may control any number of other parameters, when activated)
– Capture camera inputs, and apply effects to these. Can also record and play back camera samples in real-time.
– Capture the visual output from a window of any other application running, and re-route this through the VDMX signal chain (eg mix in a live webcast from a browser, bring in a photoshop sketching window, bring in a skype window etc)
– Record your clip-triggering and visual FX experiments to disk (fast and reliable, records directly into a VDMX media bin for immediate re-triggering / remixing / recording etc)
– Use a built-in step sequencer for arranging clip-triggering or FX over time.
– Save and trigger presets in extensive ways (global, per layer, per FX chain, and per slider. And more recently, we can cut and paste parameter settings between sliders. Very useful for quickly copying refined parameter and interactivity settings from one effect to another.)
– Tightly integrate customised quartz composer patches and FX, including customised interface elements – where each of these can be controlled by the various methods described above. (It's hard to overemphasise how useful and powerful this is.)
– Use flash, text and HTML files, as well as Freeframe FX.
– New: send DMX (ArtNet) data – to control / interact with lights / lighting desks… (I'm yet to play with this, but it's a great addition. Requires a computer-to-DMX box such as the Enttec ODE.)
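Since any of those sliders can be driven by OSC, one quick way to poke at this from outside VDMX is a few lines of Python. A minimal sketch, assuming the third-party python-osc library, and assuming the port and address path match whatever you've assigned in VDMX's OSC preferences (both values below are placeholders):

```python
# Sweep a (hypothetical) VDMX slider over OSC, like a slow external LFO.
# Port and address path are placeholders - match them to your own
# VDMX OSC preferences / control assignments.
import math
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 1234)  # VDMX's OSC input port (assumed)

for step in range(200):
    value = 0.5 + 0.5 * math.sin(step * 0.1)       # oscillate between 0 and 1
    client.send_message("/layer1/opacity", value)  # hypothetical address path
    time.sleep(0.05)
```

(Run it while VDMX is listening, and whichever slider you've assigned to that address should sweep back and forth.)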

There's much more, but you get the idea – it's flexible, and can be adapted to suit your project-by-project needs. These open-ended possibilities are both a strength and weakness of VDMX – it's fantastic being able to make your own customised interface to suit a particular workflow or project, but first-time users can find it daunting to approach.

Below, an example of 3 layers being mapped to suit particular shapes. (The canvas controls can be enlarged for easier mapping / alignment, with pixel increment adjustments on corners, available by pressing arrow keys )

VDMX 5 interface

Understanding the VDMX Workflow

With the above multitude of options, getting to know the ropes is pretty important. Here are a few learning pathways:

1. Plug N Play… aka 'explore': Even within the downloadable demo software, VDMX5 comes with built-in template projects that can be accessed through the top-of-screen menu. These can be easily modified and used as a foundation for your own projects. Playing with each template will show some of the features and variety on offer.
2. Vidvox Wiki : Extensive, detailed listing and explanation of the program's various parameters. Read over, then go back to step 1 and play some more.
3. tutorials.vidvox.net : In-depth video tutorials from the pixelated horse's mouth.
4. VDMX forums : Over time, I've probably learnt more about the program here than anywhere else – as with any software of depth, the possible solutions to any particular problem are multiple and varied, and I'm regularly learning new ways to use VDMX through the discussions here. The developers also contribute frequently, debugging problems, clarifying how various aspects work, and helping point beginners in the right direction.

Some Example VDMX Projects

Aka – here are some links to material I've used VDMX for.

– Compositing video for 3 different projections and walls at Cockatoo Island, Sydney. (Pattern Machine, at Underbelly performance)
– Generating and recording audio-reactive visual textures with VDMX and quartz (visual backdrops for Audego)
– Generating textures and audio-reactive elements, then mapping these to suit the physical shapes I'm projecting onto (Mat Cant music video)
– Triggering live video onstage with Gotye (so the right part of each animation happens when the live musicians reach the chorus etc)

VDMX Elsewhere:

How to set up the VDMX basics.. 
Learning VDMX at the Audiovisual Academy, Videos – Part I, Part II and Part III.
A 32 minute intro to VDMX (via visual-society)
Iso50 overview of how he uses VDMX …
Connect VDMX to Madmapper (via official Madmapper blog)
How to send multiple outputs from VDMX to Madmapper… (via destroythingsbeautiful)
(Actually, Destroythings is destroying things for VDMX – mostly VDMX-ready quartz patches.)
Making loops live with the Wii and VDMX ( 4 video tutorials via moongold)
VJ Kung Fu: AV Sequencing with Live + VDMX + Monome
Using VDMX to create stop motion animation – by synchronising video projection playback with the sequencing of time lapse photos. (by Zealousy )
How to use a window from any other mac software, within VDMX (eg for live photoshop painting etc / via 1000errors)
How to create a 16-frequency graphic equalizer for iPad Lemur, to use sound for controlling various FX in VDMX. (via 1000errors)
Creating a Sound Visualizer with VDMX + Unity 3D (via creativeapplications)
Experiments with Quartz Composer patches in VDMX (via Goto10 at vimeo)
Telecommuting the mix: VDMX, Syphon, CamTwist, and Skype (via noisepages)

Requirements :

  • Mac computer with an Intel processor
  • Mac OS X 10.6 or later
  • NVIDIA or ATI Graphics Card
  • 4+ GB of RAM
$349US – Refreshingly, this licenses the user to run VDMX on up to three different computers for personal use. On one level it's a very generous licence – but on the other, it's merely acknowledging the likely practices of most digital artists (across many workplaces, home, venues, installations, multi-screen set-ups etc). At any rate, very handy.
Educational pricing = $199
There’s also a ‘Starving Artist Discount’ – ‘Put your skills to work helping out the VDMX community and you can get a license of VDMX5 for only $199 USD.’

Verdict?

While VDMX 5 is overkill for some people, and others might prefer the complexities of, say, MAX/MSP or coding their own software, for me it strikes a great balance of depth and accessibility. Complex results and interfaces are possible with relatively little mental investment. Once that initial learning has happened, it's a very versatile tool, easily refined to suit each project (eg for this gig, let's make the playback timeline fill the whole screen, so we can fine-tune tiny little loops more easily – or let's create 3 media bins so it's very clear which samples to trigger for each of 3 stage characters – or let's emphasise the FX palette here… etc etc). VDMX 5 has evolved over many years, taking on board much user feedback, as well as introducing users to better ways of approaching video signals and all manner of nuanced interface elements and processes. There is a lot of significant functionality in the program, but it's in the nuanced details of those features that the merits of VDMX 5 really come into play. Take it for a test drive…

 

[[*VDMX software museum visitors and yesteryear software interface fetishists might also like: VDMX 2 review (2002) or VDMX 4 review (2003) ]]

by j p, September 20, 2012 6 Comments

Ghostly Forest Projections Near Hanging Rock

TZU beautiful ghost projections in forest

For the TZU 'Beautiful' music video, I recently found myself out near Hanging Rock with plastic-wrapped laptop, projector, camera, lights, and a mini-crew – filming ghost projections in the night winter rain. With the weather drastically mismatching the supposed forecast and slowing everything to a snail's pace, we salvaged the situation as best we could, reworking the storyboard around some of the less exposed areas, and soldiering on until about 5am. Not the end result we'd aimed for, but am happy with what we managed in the circumstances. So it goes. Full credits/links, and a series of behind-the-scenes photos, over at the project page.

by j p, September 17, 2012 0 comments

Crossfading Laptops with the *spark d-fuser

NOW READY FOR ORDERING….

“The *spark d-fuser lets you crossfade between laptops. Whether switching between presenters or pushing avant-garde pixels, hands-on control for mixing DVI and VGA signals is now available in a compact and affordable package.

If you want to know more or see it in action, jump straight to the demo video below. If you’ve been following the project, the message is simple: pay and yours will be produced. Orders are being taken on September 5th, the manufacturing run will then take six weeks from there. Price: £710 ex. VAT, £852 inc. VAT.”

We have no jetpacks, but soon it seems, we will have affordable mixing of digital video signals, thanks to the herculean efforts of 1 x Toby Harris aka *spark aka ‘card carrying Timelord amongst VJs’.

Rattling along in the tube, in between bankers reading 50 Shades of Kindles… Toby envisioned a better world, a world where VGA and DVI signals could be mixed without repercussions, and a world where smooth crossfading could happen with a device carried in your backpack. It was also a world that he would have to build himself, and a couple of years down the track, here we are. In between priming conveyor belts and supervising factory elves, Toby was kind enough to answer these questions:

What have you enjoyed about using your prototypes during performances?

The mixer for me is in support of the laptop, and damn have I enjoyed pushing crazy pixels with my laptop. Using it two-up in a D-Fuse show with Mike, I’m freed from the need for it always to be my mix on screen, so I can rip down, prepare and experiment with the mix. Makes me push things much further! That, and I’m freed from the fear of my bleeding-edge software taking down the whole show.

The surprise for me was the tap buttons, I love them. The original prototype didn’t have them, I envisaged a cross fade from one to the other and not much else. But in the expression of interest form, lots of people asked, so on they went… and wow, tapping in a slight variation of the main laptop’s mix is a really powerful thing.

What sorts of firmware additions would you like to see / develop? (you mentioned multiply mode as an option once?)

Mix modes are in the realm of possibility. The processor has the power to compute a soft-edged key for every pixel, so there’s some per-pixel computing power to play with. Additive is the bangs-for-your-buck upgrade here, and I think would really creatively transform what is possible with the mixer as you get the ability to truly composite the two sources together. I talk about this at the end of the demo video, and I’m really trying to make it happen.

I'd love to see the processor lose its line limit of 2048 pixels; there's the naive observation that TripleHead 800×600 should be possible, given that's actually fewer pixels to process than the 1920×1080 it definitely can handle. TV-One have in a way already answered this in the 1T-C2-750's sibling, the 760. It can do 2880×900, but at the price of being able to fade both sources.

You have to realise however it’s TV One’s processor, and the firmware that runs it is very much their core product, their IP. There’s no possibility of them giving it to us to do, and them doing anything for us is a decision intertwined with their wider business plans. I wish it weren’t so, but the sheer fact they designed the 750 and produced it for an affordable price is something to wonder at.

Why release the firmware as Open Source?

The frustrations above should go some way to answering that! If you need to tweak, extend or optimise, it's in your own hands, and in the best case that gets shared back to all. Simply put, it's what I would want if I were in the community buying one. There is more to it than that, and there certainly are risks, so let's call it an experiment and see how it plays out.

Why has the video hardware world been so slow in releasing affordable digital mixers?

Well, one thing I can say is that this project has been one of the most ridiculous things I’ve ever done in terms of effort and reward – if I had an eye on the bottom line I’d have stuck to bespoke development and on-site fees! There’s obviously a quantity sold at which point that changes, but I’m not sure that quantity is comfortably within the VJ market, and I’m doubly not sure of that if you have the overheads of a worldwide corporation.

I’m surprised however that VJs haven’t been able to co-opt generic presentation kit, the 750 is as close as I’ve seen.

In what kinds of ways have you played (live) with the OSC / DMX and ethernet capacities?

The simple answer is I haven’t – the ability to have that is everybody’s gift back to me for doing this project, along — hopefully — with additive mixing. Come the first D-Fuse gig with the new controller, we’ll be rocking the OSC out. Finally we can cut between visual laptops and have the audio follow!

Orders for the *spark d-fuser are being taken on September 5th, with a manufacturing run from the 10th September. Price: £710 ex. VAT, £852 inc. VAT

[Another interview with Toby, about Live Cinema, in 2008]

by j p, August 30, 2012 0 comments

Making Music Videos With Portable Jungles

Aka A Music Video in 3 Steps:

1. Re-created Middle-Earth on a kitchen tabletop… (hello – every backyard plant we have, hello – fallen moss-covered branches from the park across the road, hello – turntable, hello – flashing bicycle lights, hello – wonky lampshades and plastic toys.)

2. Made some custom animations (Quartz composer, After Effects), and projected these onto Middle-Earth, using software to manipulate the projections (VDMX, Madmapper, quartz composer).

3. Recorded the results (Canon 7D, various lenses), and edited together (Premiere).

Song = Make Believe (Original mix), by Mat Cant (Scattermusic)

And yes, animated, directed and edited by myself in a whirlwindy short amount of time.

Next up, a floating forest of bonsai plants strapped to drones..

by j p, August 28, 2012 2 Comments

Learning With Quartz Part 5: Using Twitter Hashtags + RSS Feeds in VDMX

[[ UPDATE : A while ago – Twitter changed the way it deals with RSS, thereby breaking the technique below… ]]

(( For possible solutions, try this labnol work-around (video explanation)
or try this discussion over at the Inklen / Serato QC forums... ))

As part of doing live video for an event a few months ago, I was asked about displaying a live twitter feed for it.
“I can probably take care of that.” Which meant…

QUARTZ COMPOSER WRESTLING

Ingredient 1:

The basic RSS patch that comes with Quartz Composer…

RSS quartz composer patch

(Entering the skynoise RSS feed URL into the patch on the left, generates the output in the viewer on the right.)

Ingredient 2:

Generating an RSS feed from a hashtag.

Although Twitter doesn’t offer up RSS feeds, it turns out they can be generated by using the following URL

http://search.twitter.com/search.atom?q=%23hashtag

and replacing the word hashtag with your word of choice eg

http://search.twitter.com/search.atom?q=%23KONYtattoos

http://search.twitter.com/search.atom?q=%23cannibalsvisitingIKEA

http://search.twitter.com/search.atom?q=%23butmorriseysaysmeatismurder etc

Which means – this URL can be used to populate your Quartz patch with any tweets published by people using that hashtag.
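For the record, that URL construction is trivial to script. A minimal sketch in Python – and per the update at the top of this post, Twitter has since retired this Atom endpoint, so treat it as historical:

```python
# Build the (since retired) Twitter Atom search URL for a hashtag.
# %23 is simply the URL-encoded '#' character.
from urllib.parse import quote

def hashtag_feed_url(tag):
    return "http://search.twitter.com/search.atom?q=" + quote("#" + tag)

print(hashtag_feed_url("KONYtattoos"))
# -> http://search.twitter.com/search.atom?q=%23KONYtattoos
```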

So I edited that quartz composer file so that each tweet was composited the way I wanted, in a 16:9 image, with details like the date turned off.

(Uploaded here: http://skynoise.net/qtz/RSS_twitter-test2.qtz.zip )

Problems:

The Quartz patch didn't seem to continually update from the twitter RSS feed – after a while (10-15 tweets?) it started cycling back through older tweets in a loop. I also couldn't figure out how to display the author alongside the text… I tried looking through the RSS info to find author parameters, then seeing where these might be adjusted within the quartz patch – no dice.

Solution: ( via @lumabeamerz / aka Mr.Coge : check his software out! )

I posted a description of the problem to pastebin and asked on twitter.. and @lumabeamerz kindly wrote back *and* adjusted the quartz patch, noting…

"If you put your mouse pointer for a moment to a structure's output, you will see what is "flowing", like this:

So, 0-4 are indexes, "…" are keys. Basically, we need the member of the key "authors", which will give us another structure. The index 0 member of that structure is good for us, and gives us another structure. From the last structure, we can extract the name with the key "name". It is simple if you are a programmer, since the method is the same in Obj-C land for accessing structures. For the updating, I connected a Signal patch to the RSS patch's update signal input, so it actually refreshes in a 60sec. period."
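If it helps to see that traversal outside of Quartz, here's the same logic in Python terms (hypothetical data – the point is just that QC structures behave like nested dicts and lists):

```python
# One feed entry, reduced to the relevant bits (hypothetical data).
entry = {
    "title": "example tweet text",
    "authors": [{"name": "someuser (Some User)"}],
}

# key "authors" -> a structure (list); index 0 -> another structure (dict);
# key "name" -> the author's name, exactly as described above.
author_name = entry["authors"][0]["name"]
print(author_name)  # -> someuser (Some User)
```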

Here's the final quartz patch, edited by @lumabeamerz – which continuously updates any tweets from a particular hashtag, and displays the author's name alongside. Maybe it's a useful template for you to modify however you wish?

http://skynoise.net/qtz/RSS_twitter-test3-lumabeamerz-edit.qtz.zip

And below, the patch inside VDMX:

vdmx twitter rss quartz patch

(See above: when triggered from a VDMX media bin, the quartz patch gives an option for writing in your preferred hashtag. If you want to add more controls, such as changing the colour, size or position of the text, you can publish the relevant parameters in quartz, so they become available for use inside VDMX.)

For more industrial options – see the *spark screenrunner

Recently Gathered Quartz Composer links: 

– cv at github shares 14 example quartz patches as tutorials, available for download and playing with.

– Superfleamedia's 3-part tutorial about using iterators in Quartz Composer

– http://quartzcomposer.nodepond.com/ – Patches and various how-to’s, by Nodepond – specialists in Mac OS-X and iOS graphic tools.

– Pixelnoizz on how to turn a quartz patch into a Pixelmator filter

Learning Quartz Composer Part 1
Learning Quartz Composer Part 2
Learning With Quartz Part 3: DIY Anchor Rotation FX for VDMX
Learning With Quartz Part 4: 3D Objects With Video Textures in VDMX

by j p, July 18, 2012 7 Comments

Resolume Avenue 4.1 + Arena 4.1 Review

A recent major upgrade at the good ship Resolume takes both Resolume Avenue + Resolume Arena to version 4.1.

Resolume Avenue 4.1
Pitched as an HD video mixing desk, this does everything you'd expect of modern VJ software. Detailed specs listed here.

Strengths at a glance?
– Instrument-like interface – built for performance. The screen layout has a simplicity that avoids complex navigation; it's built for quick and easy triggering, and gives an easy visual overview of the overall composition, selected layers, and selected clips. There are plenty of nice interface touches – eg click a column for all clips in that column to play on each layer (makes for easy switching between combinations of clips), or adjust transition timings for new clips in each layer to trigger as fades or cuts, and custom dashboards for quick viewing of FX parameters.

( Above : the tightly arranged mixing section )

– Solid playback of video, audio and image files. (Includes the very fast custom codec, DXV – see below for a codec comparison.)
– Audiovisual effects and integration – many features including VST FX for audio, easy combining of audio and visual effects, and the ability to easily crossfade just the video, audio, or both at once.
– Plays back interactive compositions – eg quartz composer, and flash animations including AS2 and AS3 scripting. It also handles FFGL plugins, eg the IR Mapio plug-in, which can be used for mapping videos to projection surfaces.
– midi / osc – use any hardware or virtual controller – and nicely, it includes preferences for starting clips an adjustable few milliseconds further into the timeline, to deal with delays that midi triggering can incur.
– BPM Tempo + Snapping – everything can be linked to the global BPM (beats per minute) tempo to create a fully synchronized audiovisual performance. Automatically pitch the audio and video and synchronize parameter automation. Use the beat snap function to trigger clips in sync with the beat. (A rough arithmetic sketch of what this involves follows this list.)
– Audio analysis – utilise environmental sound to bounce any clip parameter to the music.
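As a back-of-envelope illustration of the arithmetic behind beat-snapping and that midi latency preference (my own sketch, not Resolume code):

```python
# How many frames does one beat span, and how far into a clip should playback
# start in order to absorb a known MIDI trigger delay? (Illustrative only.)
def frames_per_beat(bpm, fps=25.0):
    return fps * 60.0 / bpm

def latency_offset_frames(latency_ms, fps=25.0):
    return fps * latency_ms / 1000.0

print(frames_per_beat(120.0))       # -> 12.5 frames per beat at 25 fps
print(latency_offset_frames(20.0))  # -> 0.5 frames for a 20 ms delay
```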

New to 4.1?
– Native Syphon support on MacOSX. This means you can re-route Resolume into other video apps, and vice versa. (hello madmapper, VPT, quartz, jitter, VDMX, Modul8, Unity Game Engine etc)
– Layer Routing – 'create a clip that takes the input from any layer below'. (All kinds of remixing and compositing capacity.)

And The Flipside?
Some complexity and versatility is lost with the design decision to streamline the interface for performance – eg you can't easily preview, compare or adjust effects on 2 or more layers at once. I also asked a group of Australian VJs who've used Resolume much more than me what they'd like improved:

– some still prefer earlier versions for stability reasons (V4 crashes – with V3 decks loaded / when too many notes are fired in Ableton)
– lack of a search function in the file browser + effects list
– needs a text tool
– needs an audioreactive timeline like v2.0
– midi can sometimes be inconsistent

Generally though, people seem pretty happy with the evolution to version 4.
( Thanks to VJ Zoo / Richard De Souza / Vdmo Kstati / Simon Kong for input )

Avenue 4.1 Requirements

– €299.00 for 1 computer. (50% discount available for staff/students.) (Includes all 4.x.x updates, eg the 4.1.1 that came out just as I got ready to press publish.)
– Windows 7 or XP: ATI Radeon 9600 or better. NVIDIA GeForce FX 5200 or better. 2GB RAM.
– Mac OSX 10.4.9 or later. Intel processor. Quartz Extreme graphics card (Resolume is not compatible with integrated Intel graphics processors). 2GB RAM. DXV Codec.

Resolume Arena 4 Media Server

Arena has all the features of Avenue aaaaand…

– Screen Warping & Video Mapping – in the advanced output window you can now create as many slices from your composition as you like, and position and transform them to your liking – good for multi-surface projections. New – route layers directly to slices, masking and cropping now added to the Advanced Output of Arena – and use bezier curves to map video onto curved screens.

– Soft Edge – with soft edging you can seamlessly project 1 widescreen image with 2 or more projectors, or wrap the composition around for 360-degree seamless projection.

– SMPTE Timecode Input – "With SMPTE Timecode input you can run your clips in sync with anything you want. Lights, lasers, even fireworks!"

– DMX Input – control Arena from a lighting desk using DMX. It works similarly to MIDI, so it's very easy to configure. Input can be done via ArtNet or an Enttec DMX USB Pro.

These features will appeal to some (esp perhaps PC users, who don't have access to the easy mapping capacities of the mac-only Madmapper), making Arena an interesting option compared to much more expensive hardware media servers.

resolume arena 4

(Above: the mapping related bits inside Arena’s advanced output. )

Arena 4.1 Requirements

(Same tech requirements as Avenue 4.1)

€699.00 for 1 computer. (50% discount available for staff/students)

BONUS ROUNDS:

Check out the Resolume Manual / Resolume forum and the Resolume ‘Stock Footage Shoppe’ which now includes customisable quartz patches alongside stock footage.

“We’re always saying generative content is the future, so it’s about time we proved it! The original Quartz Composer patch is included to create endless variations yourself”

DXV + High End Codec Comparison

H.264 helped popularise web video with its intensive file shrinking while maintaining a lot of visual quality. It's terrible for real-time VJ software though, because of the relatively painful CPU intensity required to decode it. I do use it sometimes (the Canon DSLRs record natively as H.264, and sometimes I'll just throw a clip straight in and it'll play fine), but it's generally best avoided. Photo Jpeg at 75% seems to work nicely on most platforms, though Apple Pro Res and AIC give slightly better image quality when a gig / clip needs more detail. So where does DXV fit into this?

"Regular video codecs require the CPU to decompress the frames before they can be pushed to the videocard for display. With our DXV Codec, video frames are decompressed by the videocard's GPU, which can do this much more efficiently than the CPU. The DXV codec can also store the alpha channel. This is essential for preserving translucency in complex video compositions. Hardware accelerated playback is only done when played in Resolume – the video will play in any other software but it will not be hardware accelerated."

A quick lo-fi test for comparison’s sake then?

1 minute 1920 x 1080 / Pro Res video in Resolume Avenue 4.1 with no FX.
(On a 2010 macbook pro running the usual hundred apps / million browser tabs open)

Codec                   | File Size | CPU Playback
Pro Res                 | 1.41gb    | 116%
DXV                     | 1.37gb    | 54%
AIC                     | 380mb     | 126%
PhotoJpeg @ 75% quality | 405mb     | 96%
H264                    | 50mb      | 170%

(Note: Pro Res + AIC play/encode on Mac, but only play back on PC)

Other high end codecs worth a look at? AJA software codec / DNxHD / Cineform HD / Sheer codec by Bitjazz

 Resolume 3 Review, 2008

by j p, July 17, 2012 0 comments

Higgs Boson Explained in a Cake Recipe

Thank you, @fustar! (+ fustar.info).

“It’s like zoologists trying to determine if they’ve discovered a new species of butterfly by looking at a meadow through binoculars. From 16 miles away.”

From the above Guardian article, I learnt that all the matter we can see in the universe accounts for only 4% of the total, and the Higgs Boson might help us figure out the other 96%. That seems odd in a Douglas Adams kind of way, that we’re so blindly adrift. And he would’ve been thrilled today if he were still around. Though he’s probably flicking someone with a towel somewhere inside that 96% we can’t see.

The Higgs boson, the Guardian also explains – is the force-carrying particle that helps transmit the effects of the Higgs field. And the Higgs field? A kind of cosmic “treacle” spread through the universe. As particles travel through the treacle they slow down, lose energy and get heavier. Kind of like cake mixing really.

And so, a significant day for science, a landmark that’ll be looked back on in decades and centuries to come apparently.

And then the next thing in my feed full-screen floors me… I don't mean to get all Californian on your ass – but these are some of the thickest slow-motion barrels I've ever seen anyone surf through – we're talking some serious overhanging slabs of ocean here. We're talking some serious mass. Try keeping your jaw up at 2.29, 3.33, 4.29, 5.23… Physics eh?

[[That was by Chris Bryan. Also check out his 12 minute show-reel shot with the same super-slo-mo Phantom cameras (in an underwater body).]]

by j p, July 5, 2012 0 comments

Amon Tobin Taxed : ISAM AV in Melbourne

amon tobin isam melbourne performance

Apologies if you've arrived in search of Tobin Tax discussions (a tax on market speculation proposed by Nobel Laureate James Tobin as a way of managing exchange-rate volatility…). Here we have only a murky swamp of audiovisual performance questions, all of them generated by Amon Tobin's recent performance of ISAM in Melbourne.

ISAM?
– AKA  >> An album. And a projection mapped, audiovisual extravaganza, premiered at Mutek, 2011.
– Lengthy behind-the-scenes article, including video and storyboard examples, over at the blog for Touch Designer, the software used to run the show.
– Pixel creation by V Squared Labs and Leviathan, production design by Alex Lazarus, stage design and set design by Vitamotus.
– And yes, the stage set was designed to fit within a few inches of the biggest possible travel container. Precision mapped, video glimpse.

Ramblings / Rants / Observations

1. First up – some congratulations are in order – ISAM’s a stunning and well fleshed out achievement, raising the technical benchmark for live audiovisual shows. And perhaps because of that, the next 22 points are an assortment of thoughts the show triggered during and after. What is this live audiovisual thing, anyway?

2. Some of my favourite bits were the more imaginative transitions – the projected video morphing between different content, while simultaneously transforming the type of perspective being overlaid on the cubes. (eg shifting from a scene where the three dimensionality of the cubes was being emphasised – with each surface mapped as walls / textures of a spaceship… and then a zooming morph that 'flattened' everything to more of a cinema screen showing a spaceship shrinking away into the distance…). I remember thinking these bits appealed because they were moments where imagination seemed more dominant than visual plug-ins or render farm hours.

3. How much of this was real-time, how much was choreographed? Does this matter? Paging Toby. From my vantage point, it was clear that the visual director/ technician / booth operator was doing very little behind his console for many of the tracks, arms literally folded as he leaned against the desk.

4. “It’s like we’re watching something that happened 12 months ago”, said a friend, referencing the online saturation of ISAM over the last year, and perhaps, that the visual creativity was mostly employed long ago in pre-production, rather than the live arena.

5. If you were looking at the audience for most of the gig, it would’ve been hard to tell whether they were at a screening or a concert. Does this matter?

6. After the visual avalanche – the outside world seemed more vivid… and on several occasions while riding my bicycle home afterwards, I found myself thinking – this is way more beautiful than the show – the crisp and vivid silhouettes of spotlit cathedral architectural elements (flanked by fluffy night clouds), later – a golden second floor window streaming light out onto a bunch of autumn leaves, composited within a mostly pitch black sky, and while crossing train tracks, looking right and catching bright red train lights reflected in the curve of shiny metal train tracks, glistening lines in the dark. ISAM is visually exquisite – but something bugged me – and it's possibly the complete emphasis on synthesised environments, which always seek to simulate but can never quite get there. A lot of the charm of Amon Tobin's music comes from the way he uses field recordings and everyday sounds as part of his heavy digital processing – I wonder why the visual aspects weren't considered the same way?

7. The window reveals of Amon inside the structure worked great, the lighting perfectly compositing him within the overall picture. Where and how else could rear lighting have been used playfully with the structure?

8. The music! Oh yes. Almost forgot. Periodically closed my eyes, and alternately felt the music was less – or more – interesting without the visuals. Lots of the set featured luscious sound, but after a while the compositions struggled to distinguish themselves from the rest of the set. The bass-heavier finale hinted at directions it could've gone in, and friends mentioned the after-show DJ set by Amon was musically much better.

9. Wonder how modular / flexible that structure and overall software system they have is – how easily could they re-fit / re-model the shapes differently in another space?

10. What would the combined creative crew do differently if they were starting this project from scratch again? How much could their combined system be considered a platform / foundation for a more flexible / organic approach next time?

11. If it did feel a little bit like early cinema viewings, where the audience swoons at a train coming at the camera, where will the process of video-mapped surfaces go from here?

12. Are we starting to see an increasing divide between what’s possible with a few clever friends with some imaginatively deployed DIY tools – and larger scaled spectacles with matching budgets and limitations?

13. What’s with the overdose of mechanical sci-fi imagery? Galactic Giger transmissions might be contributing to the oversaturation, but surely there’s room for a projection mapping palette that expands beyond a Quake skin, to aesthetics that might include organic shapes, lifeforms, or even characters (whether narrative driven or not)? Not everyone has to be Gangpol Und Mit, (1000 people band, yo!) but other spatial composition palettes are possible!

14. Yes, there was more than visual-machine-lust, we also got geometric deconstruction – sophisticated even, but still, there it was, almost like a pop-up logo for a software plug-in: the virtual form outlined, crumbled and rebuilt. What other ways can we play with spatial composition?

amon tobin in Melbourne, 2012

15. I was definitely transported at times, enchanted.

16. DJ as cloud? The kinect bit was effective, transforming the stage cubes with a swarm of Amon dots as he manipulated his DJ booth gear. It was kinda arbitrary though, the equivalent of stepping on a giant FX pedal during a song and letting it utterly dominate the composition. Which is fine… but yes, trains coming towards the camera. Great effect – how could it be integrated meaningfully?

17. The audience was often quiet, near motionless. Weird for a music gig of an esteemed producer. A constant nearby soundtrack: people laughing in an excitable disbelief (kind of like a cough-laugh), as though they were watching a sideshow circus of things that possibly shouldn't be happening.

18. Overheard outside: “I think Melbourne is now out of weed.”

19. No matter the cost of the algorithm or render farm rental, an object artificially shattering into pieces still looks like '90s desktop publishing. Or Lawnmower Man. If you're contractually bound to deconstruct, stylise it to some interesting degree, or show us some actual ruins, or some hybrid of your go-pro kite in Baghdad and your abstract shape plug-in of the moment.

20. So you’ve watched / listened to ISAM in Montreal, Melbourne and Manchester. How different were those shows? How different could they be?

21. Was that a pair of stacked Christie projectors?

22. Who knew the Palace in Melbourne went 4 levels high? New to me, and quite the vantage point from up there – it gave the cubes a much more pronounced 3D effect; from the floor it definitely felt much more cinema-screen flat rather than 3-dimensional.

23. I was impressed by ISAM rather than seduced by it. It's undeniably a really accomplished show and deserves the praise, but I'm wondering if it left other live-video-heads with similarly mixed feelings about what was sacrificed as part of the production upscaling.

Totally curious as to what any other folks thought.

PS. I did a phone interview with Amon a long time ago, and he seems like a lovely guy.

 

by j p, June 26, 2012 5 Comments

Animation for Attack of the Cats by Sampology

Audiovisual turntablist Sampology recently commissioned a music video from me for the interlude track on his upcoming album. The catch: it had to be 100% animated cats.

IF Felinology = the study of cats and their anatomy,
THEN: GIF-felinology = kind of like the torture scene from A Clockwork Orange, where eyes are propped open by pins… and as a piercing cat midi loop blares, an avalanche of cat tumblr feeds scratches away at the poor researcher's eyes, until nothing remains but twitching and furballs.

I survived though – and am now a semi-professional animated cat-gif expert (contact me for rates). Catch the clip below ( or over at the projects page, where I now archive + document my video activities), or see it on the big screen during Sampology’s upcoming Apocalyptic AV tour.

Next up: more apocalypse, preparing ‘post-apocalyptic visual backdrops’ for upcoming TZU tour…

Elsewhere: holy felines, batman!

by j p, April 19, 2012 0 comments

Learning With Quartz Part 4: 3D Objects With Video Textures in VDMX

quartz composer dancing with syphon recorder and alpha channels

Warning: playing around long enough with 2D billboards and sprites (to display images) inside Quartz Composer will eventually lead you to a third dimension. Fear not, inept explorers before you have survived, and they've even left some string at the edge of the cave:

Playing with 3D in QC

1. Create a new clear patch. This creates a blank background; choose its colour in the inspector.
(Type 'clear' into the library window, then drag and drop a 'clear' patch onto your editor window.)

2. Drag a ‘cube’ onto your window too.
Drag a movie from your hard-drive into your window. Attach the image output to one of the image inputs of the cube.
Play with the rotation and x / y / z values of the cube in the inspector, and you should see video on one side of the cube.
( Tip – When adjusting Quartz parameters, adjust at 10 x speed by pressing shift at the same time, helpful for quicker compositing.)
You now have video playing on a 3D object in QC. Woo!

To easily put video on all surfaces – create an input splitter, by right-clicking on the cube. Select 'Insert input splitter' and choose the cube image input you've been using. Drag from the output of the input splitter to as many of the cube image surfaces as you wish.

3. For easier 3D space manipulation try using a trackball. Drag one in from the library. Click and drag to select your cube(s) and movie files, then press command + x to cut them. Leave the clear patch. Double-click the Trackball patch to go within it. Paste (command + v) your cube here. Now try clicking and dragging on your QC viewer window – the trackball enables this more intuitive navigation / perspective alignment. (Click edit parent in the editor window to go back to your upper layer patches at any time.)

(Similarly, the 3D Transform patch can be used to allow easier control of 3D space. For example, placing several 3d objects inside a 3D transform, allows easy perspective adjustment of the whole scene by changing the 3D transform parameters.)

4. To add complexity – drag a 'Replicate in Space' patch from the library into your editor window, and place it beside your cube. Again select your cube(s) + movie(s) and cut these. Double click to go inside the Replicate patch, then paste them inside. Play around with the Replicate parameters and watch the results.
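If it helps to demystify what Replicate in Space is doing, conceptually it's just a loop – each copy inherits the previous copy's transform plus a fixed increment. A rough Python analogy (the parameter names here are mine, not QC's):

```python
# Conceptual analogy of a 'Replicate in Space' patch: N copies of an object,
# each rotated, offset and scaled a little more than the last.
def replicate(copies, rotation_step=15.0, z_step=-0.4, scale_step=0.9):
    transforms = []
    rotation, z, scale = 0.0, 0.0, 1.0
    for _ in range(copies):
        transforms.append({"rotation_y": rotation, "z": z, "scale": scale})
        rotation += rotation_step  # each copy rotates a bit further...
        z += z_step                # ...recedes a bit deeper...
        scale *= scale_step        # ...and shrinks a bit more
    return transforms

for t in replicate(4):
    print(t)
```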

quartz composer and 3d

3D experiment quartz patch ( Update the movie location to match a video file of yours, or delete the included movie link, and drag a movie into the editor)

Playing with your 3D QC file inside VDMX

Option 1: Drop it into the media bin and it should play back the 3D model with its pre-connected movie.

Option 2: To use the QC file as an effect – allowing any clip being played in VDMX to appear on the cube – we need to do a few things :

– So that the video input is user-defined rather than pre-defined, the root macro patch needs an input splitter of type 'image', set up to receive the incoming video stream.

– To enable any of the QC parameters to be adjusted live inside VDMX, we need to adjust the relevant parameters in the QC patch. This is called ‘publishing an input in quartz’ ( and enables that QC parameter to be adjusted within VDMX –  see instructions). However – slight complication – these need to be published in the topmost layer of quartz ( ie the root macro patch) to be accessible in VDMX. So if you’ve published an input within a subpatch of your main patch, this won’t show up in VDMX. To solve this, publish the input at the layer of QC you are in ( eg inside a replicate in space patch), then go up one level – this published input will now appear listed in that patch ( eg replicate in space). Repeat the process of right clicking to publish again, and it will appear in the next patch up, and so on until it appears in the root macro patch.

– Save the QC within your VDMX QC FX folder. Select it from assets ( if needed, refresh via VDMX > prefs > User paths ). Whatever VDMX clip is playing will now be composited onto the cubes. Dude! OH: and click-dragging in the VDMX preview window works for the trackball navigation, the same way it does within QC.

3D experiment quartz patch for VDMX  (Drop into VDMX QC folder. Change parameters in this QC patch, save as new name in the same QC FX folder, and you’ve got as many new 3D compositing tools as you want.)

Playing With 3D Models

Download and install Vade’s v002 Model Loader – which allows you to “load 35+ 3D model formats into Quartz Composer, including animation data.”

Drag the v002 model importer into your editor. For ‘model path’, enter the address of a 3D model. (Drag a 3D model to the editor, click it, select path in inspector, copy and paste into v002.)

Connect an image or video to the v002  Model Importer ‘image’ input, to texture your model.

Read the included notes for more fun – including models with embedded 3D animation parameters.

Recording Your QC Experiments

Install Syphon Recorder.
Install the Syphon for Quartz Composer plug-in.
Put a 'Syphon Server' patch into your top-level Quartz editor window. To use the OpenGL scene as the source, set this patch to be the last (top) layer rendered. (ie – click on the right of your syphon server box, and ensure its number is higher than the other layers.) This enables the Quartz output to be displayed within Syphon Recorder.

Open 'Syphon Recorder'. Your quartz video should already be visible. Adjust recording preferences to suit, and hit record. It seems to manage 1080P HD recordings fine (even on this 3-year-old macbook pro), and it even records alpha channels (ie you can record audio-responsive 3D models on a transparent background, for easy compositing later into the likes of After Effects).

(See also: QTZ Rendang – free software for rendering out quartz patches. Potentially useful for non-real-time rendering of intensive patches that give slow playback. Haven't tested that yet though.)

Special shout-outs:

– to Vade for providing the v002 model importer, Syphon, and the Syphon Recorder, which make a lot of the above possible.

– To VJ Shakinda ( co-author of a forthcoming book on QC with @momothemonster) for his trilogy of 3D tutorials on youtube, which really helped me get to grips with 3D in QC. If you found this stringy guide helpful, wait until you’re bungee jumping with this fella:


Above, 3d objects moving to music in 4 mins. Also tasty: 3d beat reactive scenes – part 1 / part 2 / part 3.

Previously on Breaking Bad:

Learning With Quartz Part 3: DIY Anchor Rotation FX for VDMX

Learning Quartz Composer Part 2

Learning Quartz Composer Part 1

by j p, April 16, 2012 0 comments

Weirdcore, Aphex Twin + Die Antwoord in Melbourne

aphex twin future music festival weirdcore

Aphex Twin played in Melbourne recently, which also meant a chance to catch the pixel mutations of his regular tour VJ, Weirdcore. After a summer of stage screens saturated with glossy, void-of-personality motion graphics templates, it was refreshing to catch live tour visuals that were ambitious, sophisticated and raw – very obviously being generated and manipulated live, mistakes and all. Probably not many other ways to approach a set for Aphex, and apparently he does improvise a different set every single time.

Below, a diagram from Weirdcore’s artist profile at vidvox.net, explaining his video set-up**.

"With additional FX programming from Andrew Benson, Flight404, and Vade, this is one of the wildest tour setups we've seen in a while… but you wouldn't expect anything less for the world's most known electronic musician. Pitchfork may have said it best, "First, we can't really talk about anything until we talk about the visuals."

weirdcore aphex twin video set-up

[[ **”UPDATE: Weirdcore mentions that diagram is now dated (was for his 2011 set-up), and he intends to shift everything to jitter, with one computer direct to LEDs, much simpler and less likely to fuck up.” ]]

Aphex Twin, kinect, weirdcore, melbourne,

Above : 1 x Richard D. James + 1 x Kinect, flanked by L.E.D. Screens of the face-replaced crowd at the Palace, Melbourne.

Below : The Kinect in action at The Future Music Festival ( Bonus Melbourne software plug : kinectar.org ).

Aphex Twin, kinect, melbourne, future music festival, lasers, weirdcore,

Aside from general pixel mangling and the fast and fluid Weirdcore style – I was curious to see how effective the live face-replacing would be (snippets of it in live Aphex shows have been glimpsed online for a couple of years now). The software and camera set-up seemed to take a while to tune, but when it locked in the effect was mesmerising, updating fast enough to cope with panning cameras of a boisterous crowd, all while being relentlessly modified and further manipulated.

The face-tracking was also put to work in a section of the show that is customised for each location. Weirdcore had asked for feedback on the list of Aussie celebrities he'd compiled, so I threw him a few more names, including 1 x Wally. Crowd reactions varied, but were probably loudest for the photo below of Julia Gillard – Weirdcore mentioned later that, for whatever reason, unlike in most other countries, the response at Australian gigs had been loudest for politicians.

Aphex Twin, Gotye, Weirdcore, Melbourne

Aphex Twin, weirdcore, julia gillard, melbourne

Aphex + Die Antwoord Live

die antwoord, aphex twin, melbourne

[[ Die Antwoord? If needing a crash course in South African cartoon-rave-gangsterism – spend 15 minutes of your life inside the short film Umshini Wam, made with Harmony Korine (Gummo, Kids, etc). ]]

If the Aphex Palace gig wasn't overloaded enough, 3/4 through the set – on top of a cacophony of Aphex – Die Antwoord burst out from backstage in orange fluoro suits and rapid-fire a couple of tracks, one of which has Ninja rapping from above the crowd he has stage-dived into. There are almost enough camera phones in the air for Ninja to get back onstage by walking across a bridge made of gadgets. Another highpoint of weirdness is reached when Ninja + Yo-Landi rasp in their South African accents, 'Aussie, Aussie, Aussie', and the crowd eats it up, barking enthusiastically back 'Oi, Oi, Oi!'. Somehow it all makes sense, including the idea that future Die Antwoord videos are to be made by Weirdcore and later by Chris Cunningham. One extended global mutant franchise. And yes, after Die Antwoord depart, Aphex still has plenty in reserve (as do the lighting and laser operators), so by the time it's done, we can only depart exhausted.

**

Thanks to Weirdcore and his tech companion Fede for taking some tour time out to meet up and chat about pixels.

Be sure to check out his video projects, including works for MIA, Chuck D, Cassette Playa + Simian Mobile Disco etc, and this Weirdcore video interview at the Creators Project.

(Thanks also to juanjocruz for letting me use his zoomed-in photos, the best on flickr for the Melbourne show.)

 

by j p, April 10, 2012 0 comments

The Science Fiction Curves of Moebius, RIP 2012.

moebius

R.I.P. Moebius, aka Jean Giraud (1938-2012).

Sadly, we’re now without a living Moebius. All the more reason to revisit (or explore anew) his wild and fantastic creations. Most people might recognise his graceful intergalactic palette as some of the best parts of Heavy Metal magazine (originally Metal Hurlant in France) – but even the comic-less would be familiar with his work and influence on films such as Alien, Tron + Blade Runner. Indeed, we’ve lost a gentle giant.

Some Moebius links to dive into?

official site ( French )
Moebius on wikipedia
Airtight garage, a fan site / tumblr-collage tribute (named after one of his books).
There are several feature-length Moebius documentaries available. Here’s a video interview with Moebius drawing on a Cintiq screen-tablet.
And then there’s the Moebius and Alejandro Jodorowsky film version of Dune they never got to make. It was also to feature Salvador Dali and Giger amongst others, so the documentary about this, Unseen Dune (see trailer), should at least be pretty interesting.

If looking to do an out-of-print Moebius comic binge on your iPad (including, say, The Incal, his series with Jodorowsky)… Stanza is a free app that can read CBR + PDF comics.

[Update: 2 great, long-read, illustrated Moebius tributes – on sci-fi sites io9 + Tor.com]

by j p, March 16, 2012 0 comments