Original author: 
Cory Doctorow

Here's a mesmerizing gallery of "Fabergé Fractals" created by Tom Beddard, whose site also features a 2011 video of Fabergé-inspired fractal landscapes that must be seen to be believed. They're all made with Fractal Lab, a WebGL-based renderer Beddard created.

Fabergé Fractals by Tom Beddard, using his WebGL-based fractal engine, Fractal Lab. (via Colossal)     

Original author: 
Adriana de Barros

Steady As She Goes (3D Styrofoam sculpture) by Like Minded Studio

“Steady As She Goes” is the title of this 3D printed sculpture by Luca Ionescu from Like Minded Studio.


Photos © Andy Lewis

Via Behance Network
Original author: 
Casey Johnston

All the bits and pieces that go into a pair of virtual reality goggles.


iFixit posted a teardown of the Oculus Rift headset Wednesday to see what, exactly, the virtual reality headset is made of. The teardown reveals the types of screens and controllers the Oculus Rift uses, and though the score is preliminary, iFixit gave it a user repairability score of 9 out of 10, unusual in these times of glue, tape, and Torx screws.

The Oculus Rift uses one 1280×800 LCD that is split down the middle to show one image each to the right and left eye to create a 3D image. The display is an Innolux HJ070IA-02D 7-inch LCD panel, provided by the same distributor rumored to be Apple’s source for replacement iPad mini screens. A custom-designed Oculus Tracker V2 board pings to track the headset's motion at a 1000Hz refresh rate.
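As a quick sanity check on those numbers (a plain Java sketch, not from the teardown; only the panel dimensions above are iFixit's), splitting the panel down the middle leaves each eye with half the horizontal resolution:

```java
// Per-eye resolution of a single shared panel, split vertically down the middle.
public class RiftDisplay {
    public static void main(String[] args) {
        int panelW = 1280, panelH = 800;   // Innolux HJ070IA-02D, per iFixit
        int eyeW = panelW / 2;             // each eye sees half the width
        System.out.println("Per eye: " + eyeW + "x" + panelH);
        System.out.printf("Per-eye aspect ratio: %.2f%n", (double) eyeW / panelH);
    }
}
```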

The chips inside the device include an STMicroelectronics 32F103C8 Cortex-M3 microcontroller with a 72MHz CPU and an Invensense MPU-6000 six-axis motion tracking controller that has both a gyroscope and accelerometer. There is also a chip named A983 2206, which iFixit suspects is a “three-axis magnetometer, used in conjunction with the accelerometer to correct for gyroscope drift.”
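To illustrate the kind of drift correction iFixit is describing, here is a generic complementary-filter sketch in plain Java. This is not Oculus's actual sensor-fusion code (that isn't public); it only shows how blending in an absolute heading reference keeps an integrated gyro signal from wandering:

```java
// Minimal complementary filter: pure gyro integration drifts over time,
// so each step is nudged toward an absolute reference (e.g. a magnetometer).
public class YawFilter {
    static final double ALPHA = 0.98; // trust the gyro 98%, the reference 2%

    static double step(double yaw, double gyroRate, double dt, double magYaw) {
        double gyroYaw = yaw + gyroRate * dt;          // dead-reckoned estimate
        return ALPHA * gyroYaw + (1 - ALPHA) * magYaw; // pull toward magnetometer
    }

    public static void main(String[] args) {
        double yaw = 0, trueYaw = 0;      // headset is actually stationary
        double gyroBias = 0.5;            // degrees/second of spurious rotation
        for (int i = 0; i < 1000; i++)    // 1000 steps at 1kHz = one second
            yaw = step(yaw, gyroBias, 0.001, trueYaw);
        // Raw integration of the biased gyro would read 0.500 after one second;
        // the fused estimate stays close to the true heading of 0.
        System.out.printf("Fused yaw after 1s: %.3f%n", yaw);
    }
}
```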



Though three different actors have played Bruce Banner on the big screen in the past decade, only one can really say he also played the Hulk. When Mark Ruffalo appeared as the iconic character in last summer’s The Avengers, he was the first actor to actually portray the Hulk half of the character in motion-capture. (His predecessors Edward Norton and Eric Bana let the CG artists take over that part.)

The decision paid off, as fans enthused that Ruffalo’s Hulk was the best they’d seen in recent memory. Now a new video shows just how Industrial Light & Magic pulled off the trick. Hit the jump to check it out.

Founded by George Lucas in the ’70s for Star Wars, ILM has been behind some of the most notable cutting-edge effects work of the past few decades. Last year alone, they contributed work to Red Tails, The Hunger Games, Battleship, Cloud Atlas, and Paranormal Activity 4, in addition to The Avengers. Grantland posted this video from their studios.

No matter how often I watch these behind-the-scenes videos, I’m always startled by the massive difference between what I saw in the theaters (i.e., the Hulk stomping around Manhattan) and what was actually filmed on set (i.e., Ruffalo in a goofy hat prancing around a green-walled room).

As we’ve said before, part of the “problem” with VFX is that the really good work is so seamlessly done it’s easy not to notice it at all. But videos like this one demonstrate just how much skill and labor have to go into making scenes like this one work — and, unfortunately, the recent VFX protests highlight just how badly these talented artists are treated in return.

If you’re curious to see more of Grantland’s series on the famed effects company, here are Parts 1 (From Star Wars to Today, Inside ILM Headquarters) and 2 (The Evolution of Filmmaking Technology at Lucasfilm).


AudioGL presents a new vision of visual music creation, extended into space. Images courtesy the developer.

Here in flatland, ideas for musical interfaces may have become largely well-trodden. Not so in the third dimension. And so, one of the most unusual audiovisual interfaces has now hit beta, ready for you to explore. And that does mean “explore”: think navigation through spinning, animated galaxies of musical objects in this spatial modular sound environment. With the beta available, you can determine whether that’s a bold, new final frontier, or just the wheel, reinvented.

The work of Toronto-based artist and engineer Jonathan Heppner, AudioGL is a stunning vision of music creation in 3D space, with modular synths, advanced user-editable modulation, and a freely-navigable, open-ended spatial workspace.

There is a ticket for entry. While marked “beta,” the developer has admitted he needs money. And so, a trip into the space elevator will cost you US$80 for a fully-enabled license. You can try a save-disabled version for free, however, which isn’t necessarily a deal-killer for software of this nature; I’d mark this one down practically to crowd-funding for those who like the concept. (For an open-source take on graphical, spatial music sequencing, check out IanniX – and it does seem this sort of experimentalism could benefit from open licenses.) One caveat on the beta licenses: they won’t apply to the finished version. (It seems working something out there and talking about it publicly would encourage more beta users.)

This is the first beta; upcoming betas are due every 2-3 months, says the author. There’s already a lot there:

  • Immersive 3D interface
  • Preset instruments
  • Modular synth
  • Sample-accurate automation
  • Envelopes
  • Project-wide modulation
  • MIDI support
  • Sample import
  • Audio export

AudioGL isn’t limited to compelling 3D ideas. Project-wide modulation means networks of transformations that work across a scene.

For fine-grained editing of user envelopes, AudioGL does offer a more conventional 2D view.

At the top of the to-do list: ReWire, VST instruments and effects, and enhanced tempo change and modulation. Further down the line, says the developer, are DAW-style features like arrangement and project management.

No new videos of this build, but an impressive previous video is available below.


Peter Curet: Generative Origami (built w/ Modelbuilder)

Corrected: The 3D printed trophy is by Chevalvert.

I just found this by accident, and what a nice accident it was: Peter Curet took my Processing Paris Master Class back in March, and subsequently produced the above video of origami structures continuously being created and unfolding. Not only is it a great piece, it was apparently built with Modelbuilder. (Soft rendering courtesy of joons-renderer, which plugs the Sunflow radiosity renderer into Processing.)

I guess I finally get to feel a fraction of the pride Karsten Schmidt must feel seeing people doing awesome things with Toxiclibs. Not that I’m anywhere near reaching the awesomeness of his Toxiclibs Community showreel, but it’s a very good start.

Not that Peter’s origami is the first time Modelbuilder has been spotted in the wild. Last year Paris studio Chevalvert used it to produce this 3D-printed trophy for a dance award, and Greg Borenstein’s O’Reilly book Making Things See demonstrates how to combine Modelbuilder with a Kinect.

Do you know of any other examples of Modelbuilder being part of a project that made it past the “Messy Sketch” stage and on to the next level of “Thing of Beauty”? Let me know: marius at mariuswatz com.

Better yet, post it to Flickr in the new Modelbuilder group I just created:

Peter Curet: Conway Origami City

BLoc-D: IDILL 2011


It starts as just another toy to play around with in a few minutes of distraction in your Web browser – as if the Web were short on distraction. But then, something amazing can happen. Like a musical Turing Test, you start to get a feeling for what’s happening on the other side. Someone’s stream of colored dots starts to jam with your stream of colored dots. You get a little rhythm, a little interplay going. And instead of being a barrier, the fact that you’re looking at simple animations and made-up names and playing a pretty little tune with complete strangers starts to feel oddly special. The absence of normal interpersonal cues makes you focus on communicating with someone, completely anonymously, using music alone.

Dinah Moe’s “Plink” is the latest glimpse of what Web browser music might be, and why it might be different than (and a complement to) other music creation technology. You can now create private rooms to blow off steam with a faraway friend, or find new players online. It’s all powered by the Web Audio API, the browser-native, JavaScript-based tools championed by Google. That means you’ll need a recent browser (Chrome only at the moment; this is a Chrome Experiment), and mobile browsers won’t be able to keep up. But still, give it a try – I think you may be pleasantly surprised. (Actually, do it right now, as you’ll probably be doing it with other CDM readers. I expect greater things!)

Thanks to Robin Hunicke, who worked on multiplayer design and play for That Game Company’s Journey on PS3 and now on the browser MMO Glitch. I think her friends were more musical than most, because the place came alive after she linked from Facebook.

The browser is becoming a laboratory, a place to quickly try out ideas for music interaction, and for the code and structure that describe music in a language all their own. As in Plink, it can also benefit from being defined by the network and collaboration.

Dinah Moe’s experiments go in other directions, as well. In Tonecraft, inspired by the 3D construction metaphor of Minecraft, three-dimensional blocks become an alternative sequencer.

There are many reasons not to use Web tools. The Web Audio API still isn’t universal, and native options (like Google’s Native Client) have their own compatibility issues, stability concerns, and – because of security – they don’t do all the things a desktop application will. Desktop music tools are still more numerous, more powerful, and easier to use, so if you’re a reader out there finishing a thesis project, you might look elsewhere. (Actually, you’re probably in trouble, anyway, by any nation’s academic calendar, given it’s the First of May, but I digress.)

But think instead of this as another canvas, and the essential building blocks of interface design, code, and networking as shared across browsers and desktop apps. Somehow, in the light of the Internet, its new connectedness, and its new, more lightweight, more portable code and design options, software is changing. That transformation could happen everywhere.

If you need something to help you meditate on that and wait for a revelation to occur to you, I highly recommend watching a soothing stream of dots and some pleasing music as you jam with your mouse.

Of course, in the end, like a digital mirror, it might inspire you to go out to the park with a couple of glockenspiels and jam the old-fashioned way. But maybe that’s another reason to make software.

(Here’s a video, in case you’re not near a browser that supports the app!)

More, plus reflections on adaptive music:



While experimenting with ways to calculate organic mesh surfaces I’ve tried to avoid 3D Bezier patches, since setting up control points programmatically is a bit of a pain. 2D is bad enough. But, as so often happens, I’ve found myself in a situation where I need a structure that is best described as a Bezier patch.

Paul Bourke comes to the rescue with sample code written in C, which took all of 5 minutes to port to Processing. The code below is all Bourke’s apart from the rendering logic. If you don’t know his repository of miscellaneous geometry code and wisdom, run and have a look. It’s proven invaluable over the years.

An applet version of this sketch can be seen on

Code: bezPatch.pde

// bezPatch.pde by Marius Watz
// Direct port of sample code by Paul Bourke.
// Original code:

int ni=4, nj=5, RESI=ni*10, RESJ=nj*10;
PVector outp[][], inp[][];

void setup() {
  size(600, 600, P3D);
  build();
}

void draw() {
  // Rendering logic (not Bourke's): draw the patch as wireframe quad strips.
  background(0);
  translate(width/2, height/2, 0);
  rotateX(PI/3);
  scale(60);
  stroke(255);
  noFill();

  for(int i=0; i<RESI-1; i++) {
    beginShape(QUAD_STRIP);
    for(int j=0; j<RESJ; j++) {
      vertex(outp[i][j].x, outp[i][j].y, outp[i][j].z);
      vertex(outp[i+1][j].x, outp[i+1][j].y, outp[i+1][j].z);
    }
    endShape();
  }
}

void keyPressed() {
  if(key==' ') build(); // space bar generates a new random patch
}

void build() {
  int i, j, ki, kj;
  double mui, muj, bi, bj;

  outp=new PVector[RESI][RESJ];
  inp=new PVector[ni+1][nj+1];

  // Control points on a regular grid, with random Z offsets
  for (i=0;i<=ni;i++) {
    for (j=0;j<=nj;j++) {
      inp[i][j]=new PVector(i,j,random(-3,3));
    }
  }

  for (i=0;i<RESI;i++) {
    mui = i / (double)(RESI-1);
    for (j=0;j<RESJ;j++) {
      muj = j / (double)(RESJ-1);
      outp[i][j]=new PVector();

      // Weighted sum of control points, using Bernstein blending functions
      for (ki=0;ki<=ni;ki++) {
        bi = BezierBlend(ki, mui, ni);
        for (kj=0;kj<=nj;kj++) {
          bj = BezierBlend(kj, muj, nj);
          outp[i][j].x += (inp[ki][kj].x * bi * bj);
          outp[i][j].y += (inp[ki][kj].y * bi * bj);
          outp[i][j].z += (inp[ki][kj].z * bi * bj);
        }
      }
      outp[i][j].add(new PVector(-ni/2,-nj/2,0)); // center the patch
    }
  }
}

double BezierBlend(int k, double mu, int n) {
  int nn, kn, nkn;
  double blend=1;

  nn = n;
  kn = k;
  nkn = n - k;

  // Binomial coefficient computed incrementally, interleaving
  // multiplications and divisions to limit overflow
  while (nn >= 1) {
    blend *= nn;
    nn--;
    if (kn > 1) {
      blend /= (double)kn;
      kn--;
    }
    if (nkn > 1) {
      blend /= (double)nkn;
      nkn--;
    }
  }
  if (k > 0)
    blend *= Math.pow(mu, (double)k);
  if (n-k > 0)
    blend *= Math.pow(1-mu, (double)(n-k));
  return blend;
}
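Bourke’s BezierBlend() computes the Bernstein basis function C(n,k)·mu^k·(1-mu)^(n-k). A useful property: for a fixed n, the blends at any mu sum to 1, which is what keeps the patch inside the convex hull of its control points. A standalone check in plain Java (same algorithm, lifted out of the sketch):

```java
// Bernstein basis sanity check: for fixed n and mu, the sum over k equals 1.
public class BlendCheck {
    static double bezierBlend(int k, double mu, int n) {
        int nn = n, kn = k, nkn = n - k;
        double blend = 1;
        // Binomial coefficient, interleaving multiplies and divides
        while (nn >= 1) {
            blend *= nn--;
            if (kn > 1) blend /= kn--;
            if (nkn > 1) blend /= nkn--;
        }
        if (k > 0) blend *= Math.pow(mu, k);
        if (n - k > 0) blend *= Math.pow(1 - mu, n - k);
        return blend;
    }

    public static void main(String[] args) {
        int n = 4;
        double mu = 0.3, sum = 0;
        for (int k = 0; k <= n; k++) sum += bezierBlend(k, mu, n);
        System.out.printf("Sum of blends: %.6f%n", sum); // partition of unity
    }
}
```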



“Prismatica” is a new piece by Kit Webster. He attached pyramid-shaped crystals to an LCD screen and made a programmed geometric animation precisely mapped to the vertices of the crystals, illuminating them individually and in formation. The animations are further refracted through the geometry of the crystals in accordance with the shifting perspective of the observer, which in turn alters the way the illuminations appear and interact with reflections of surrounding lights within the space.


Building on the original Midi Fighter, a 4×4 array of arcade push-buttons, the Midi Fighter 3D adds interactive, light-up color feedback and gyroscope-powered motion sensing. The work of electronic music site DJ Tech Tools, it’s an impressive-looking piece of work. But if you’re not interested in the “3D” sensing, don’t overlook the clever color feedback and bank shifting, which could prove as much of a draw.

The Midi Fighter 3D, announced today, will ship in April at US$249. There are no orders yet, but there is a preorder list.

DJ Tech Tools is pushing the 3D orientation functionality. In a good way, it mirrors a bit of the branding and design we see from Nintendo (well, at least that “3D” moniker). If you don’t mind moving your controller around as you play, it looks like it can do some impressive things. Dan White of DJTT explains how it works to CDM:

The 3D uses a gyroscope and a compass to track the position of the controller in space. The gyroscope tracks relative position (meaning angling the controller towards any of its sides), and the compass tracks rotation along the same plane that the controller is on (think turning the controller like a steering wheel).

While the sensing may not appeal to everybody, the big advantage here is adding continuous control of parameters (which buttons obviously lack) in a way that’s gestural and integrated into the design.
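To make that concrete, here is a hypothetical mapping from a tilt angle to a 7-bit MIDI CC value, in plain Java. The function name and scaling are invented for illustration; DJTT hasn’t published how the 3D actually maps motion to MIDI:

```java
// Hypothetical: map a tilt angle (-90..90 degrees) to a 7-bit MIDI CC value.
public class TiltToCC {
    static int tiltToCC(double tiltDegrees) {
        // Clamp to the usable range, then rescale to 0..127
        double clamped = Math.max(-90, Math.min(90, tiltDegrees));
        return (int) Math.round((clamped + 90) / 180.0 * 127);
    }

    public static void main(String[] args) {
        System.out.println("Flat: " + tiltToCC(0));        // mid-range value
        System.out.println("Full tilt: " + tiltToCC(90));  // maximum value
    }
}
```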

A wrist-strap will be available, and designed in such a way that you can access all the controls, including even those on the side.

At $249, though, fans of the original could easily justify the purchase based solely on the new light-up, assignable color indicators on the buttons. Apart from looking cool, they promise to make elaborate control setups possible, with the aid of bank controls and lots of customization in the software. You get four banks of controls via the top, but there are also six nicely-integrated triggers on the side which can be used for whatever you like. That could give you more banks, effect kill switches, or some other function you haven’t thought of yet. The firmware can send up to 68 unique Control Change messages and 70 button messages, so presumably DJTT is betting – as they have with their other product line – on lots of preset ideas for different performance rigs and styles.
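One plausible accounting for the 70 button messages, given the layout described above (a quick arithmetic check in plain Java; how the 68 CCs are allocated isn’t spelled out, so only the button count is checked here):

```java
// Four banks of the 4x4 grid, plus the six side triggers, gives 70
// distinct button messages, matching the firmware figure quoted above.
public class MessageCount {
    public static void main(String[] args) {
        int banks = 4, gridButtons = 4 * 4, sideButtons = 6;
        int buttonMessages = banks * gridButtons + sideButtons;
        System.out.println("Button messages: " + buttonMessages);
    }
}
```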

All of this communication happens via MIDI, so using it with your favorite software is a cinch.


  • Included configuration software
  • Customizable RGB arcade buttons: 4 x 4 button array, with individually-addressable light-up RGB feedback on each button
  • Four banks, six side buttons
  • 3D motion tracking of five movements

It’s hard not to notice the cable in the images. DJ Tech Tools tells us that’s their own DJTT USB cable, which will be bundled with the hardware and also available separately. They say it’s a “high-quality” USB cable – I’m guessing the main test is whether it can stand up to moving the hardware around, since it isn’t wireless. Having right-angle USB cables is hugely useful in tight corners, though; Hosa was showing off something like that at NAMM and I’m happy to replace my USB collection with them.

Also worth noting: DJTT says they’re applying for a patent on the five-way motion control tracking method they’ve developed. (I find the patent process to be pricey and arcane, personally, but I’ll be interested to see how it goes for them!)

$249 seems to me a really good deal for this gear, but if you liked the brute-force simplicity of the original controller – and its greater customization options – the Classic remains available, starting at US$119.99.

More details:
Introducing the Midi Fighter 3D [DJ Tech Tools]

Images courtesy DJ Tech Tools. And yes, we’ve got high-res images, so click for big, gear-pr0n-ny closer looks.
