
Turing machine

Original author: Mitchell Whitelaw

At CODE2012 I presented a paper on "programmable matter" and the proto-computational work of Ralf Baecker and Martin Howse - part of a long-running project on digital materiality. My sources included interviews with the artists, which I will be publishing here. Ralf Baecker's 2009 The Conversation is a complex physical network, woven from solenoids - electro-mechanical "bits" or binary switches. It was one of the works that started me thinking about this notion of the proto-computational - where artists seem to be stripping digital computing down to its raw materials, only to rebuild it as something weirder. Irrational Computing (2012) - which crafts a "computer" more like a modular synth made from crystals and wires - takes this approach further. Here Baecker begins by responding to this notion of proto-computing.

MW: In your work, especially Irrational Computing, we seem to see some of the primal, material elements of digital computing. But this "proto" computing is also quite unfamiliar - it is chaotic, complex and emergent, we can't control or "program" it, and it is hard to identify familiar elements such as memory vs processor. So it seems that your work is not only deconstructing computing - revealing its components - but also reconstructing it in a strange new form. Would you agree?

RB: It took me a long time to adopt the term "proto-computing". I don't mean proto in a historical or chronological sense; it is more about its state of development. I imagine a device that refers to the raw material dimension of our everyday digital machinery. Something that suddenly appears due to the interaction of matter. What I had in mind was, for instance, the natural nuclear fission reactor in Oklo, Gabon, discovered in 1972. A conglomerate of minerals in a rock formation formed the conditions for a functioning nuclear reactor, all by chance.

Computation is a cultural and not a natural phenomenon; it includes several hundred years of knowledge and cultural technics, these days all compressed into a microscopic form (the CPU). In the 18th century the mechanical tradition of automata and symbolic/mathematical thinking merged into the first calculating and astronomical devices. The combinatoric/hermeneutic tradition (e.g. Athanasius Kircher and Ramon Llull) has also been very influential for me. These automatons/concepts were philosophical and epistemological. They were dialogic devices that let us think further, much against our current utilitarian use of technology. Generative utopia.


Schematic of Irrational Computing, courtesy of the artist.

MW: Your work stages a fusion of sound, light and material. In Irrational Computing for example we both see and hear the activity of the crystals in the SiC module. Similarly in The Conversation, the solenoids act as both mechanical / symbolic components and sound generators. So there is a strong sense of the unity of the audible and the visual - their shared material origins. (This is unlike conventional audiovisual media for example where the relation between sound and image is highly constructed). It seems that there is a sense of a kind of material continuum or spectrum here, binding electricity, light, sound, and matter together?

RB: My first contact with art or media art came through net art, software art and generative art. I was totally fascinated by it. I started programming generative systems for installations and audiovisual performances. I like a lot of the early screen-based computer graphics/animation stuff. The pure reduction to wireframes, simple geometric shapes. I had the feeling that in this case concept and representation almost touch each other. But I got lost working with universal machines (Turing machines). With Rechnender Raum I started to do some kind of subjective reappropriation of the digital. So I started to build my very own non-universal devices. Rechnender Raum could also be read as a kinetic interpretation of a cellular automaton algorithm. Even though the Turing machine is a theoretical machine, it feels very plastic to me. It is a metaphorical machine that shows the conceptual relation of space and time. Computers are basically transposers between space and time, even without seeing the actual outcome of a simulation. I like to expose the hidden structures. They are more appealing to me than the image on the screen.

MW: There is a theme of complex but insular networks in your work. In The Conversation this is very clear - a network of internal relationships, seeking a dynamic equilibrium. Similarly in Irrational Computing, modules like the phase locked loop have this insular complexity. Can you discuss this a little bit? This tendency reminds me of notions of self-referentiality, for example in the writing of Hofstadter, where recursion and self-reference are both logical paradoxes (as in Gödel's theorem) and key attributes of consciousness. Your introverted networks have a strong generative character - where complex dynamics emerge from a tightly constrained set of elements and relationships.

RB: Sure, I'm fascinated by these kinds of emergent processes, and how they appear on different scales. But I always find it difficult to use the attribute "consciousness". I think these kinds of chaotic attractors have a beauty of their own. However closed these systems look, they are always influenced by their environment. The perfect example for me is the flame of a candle: a very dynamic, complex process communicating with its environment, which generates the dynamics.

MW: You describe The Conversation as "pataphysical", and mention the "mystic" and "magic" aspects of Irrational Computing. Can you say some more about this aspect of your work? Is there a sort of romantic or poetic idea here, about what is beyond the rational, or is this about a more systematic alternative to how we understand the world?

RB: Yes, it refers to another kind of thinking. A thinking that runs against "cause and effect". A thinking of hidden relations, connections and uncertainty. I like Claude Lévi-Strauss' term "The Savage Mind".


theodp writes "Microsoft's promotion of Julie Larson-Green to lead all Windows software and hardware engineering in the wake of Steven Sinofsky's resignation is reopening the question of what is the difference between Computer Science and Software Engineering. According to their bios on Microsoft's website, Sinofsky has a master's degree in computer science from the University of Massachusetts Amherst and an undergraduate degree with honors from Cornell University, while Larson-Green has a master's degree in software engineering from Seattle University and a bachelor's degree in business administration from Western Washington University. A comparison of the curricula at Sinofsky's and Larson-Green's alma maters shows there's a huge difference between UMass's MSCS program and Seattle U's MSE program. So, is one program inherently more compatible with Microsoft's new teamwork mantra?"



Read more of this story at Slashdot.


Alan Turing slate statue at Bletchley Park museum

Flickr user Duane Wessels

June 23 marks the 100th birthday of Alan Turing. If I had to name five people whose personal efforts led to the defeat of Nazi Germany, the English mathematician would surely be on my list. Turing's genius played a key role in helping the Allies win the Battle of the Atlantic, the long struggle against the Third Reich's U-boats whose outcome depended on the cracking and re-cracking of Germany's Enigma cipher. That single codebreaking victory helped give the Allies control of the Atlantic shipping lanes, eventually setting the stage for the 1944 invasion of Normandy.

Alan Turing's Year

2012 is billed as the "Alan Turing Year," and a lengthy compendium of past and future Alan Turing events can be found at the Centenary site hosted by the United Kingdom's Mathematics Trust. The big gathering taking place right now is the Alan Turing Centenary Conference in Manchester.



The original question was: I've heard somewhere that they're also trying to build computers using molecules, like DNA. In general, would it work to try to simulate a factoring algorithm using real-world things, and then let the physics of the interactions stand in for the computer calculation? Since the real world does all kinds of crazy calculations in no time.

Physicist: The amount of time that goes into, say, calculating how two electrons bounce off of each other is very humbling. It's terribly frustrating that the universe has no hang-ups about doing it so fast.

In some sense, basic physical laws are the basis of how all calculations are done. It's just that modern computers are "Turing machines", a very small set of all the possible kinds of computational devices. Basically, if your calculating machine consists of the manipulation of symbols (which in turn can always be reduced to the manipulation of 1s and 0s), then you're talking about Turing machines. In the earlier epoch of computer science there was a strong case for analog computers over digital computers. Analog computers use the properties of circuit elements like capacitors, inductors, or even just the layout of wiring, to do mathematical operations. In their heyday they were faster than digital computers. However, they're difficult to design, not nearly as versatile, and they're no longer faster.
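To make the "manipulation of symbols" idea a bit more concrete, here is a minimal sketch (my own illustration, not anything from the original post) of a Turing machine simulator in Python. The rule table implements a toy machine that appends a 1 to a block of 1s; the function, state and symbol names are invented for the example.

# Minimal Turing machine: a tape, a head, a state, and a rule table.
# The example rules below append a '1' to a block of 1s (a unary incrementer).

def run_turing_machine(rules, tape, state="start", blank="0", max_steps=1000):
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        # Grow the tape with blanks if the head walks off either end.
        if head < 0:
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):
            tape.append(blank)
        symbol = tape[head]
        state, new_symbol, move = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape)

# Rule table: (state, symbol read) -> (next state, symbol to write, head move).
rules = {
    ("start", "1"): ("start", "1", "R"),  # skip over the existing 1s
    ("start", "0"): ("halt", "1", "R"),   # write one more 1, then halt
}

print(run_turing_machine(rules, "1110"))  # prints "1111"

Everything a modern digital computer does can, in principle, be reduced to table lookups of this kind; the differences are only in speed and in the size of the rule table.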

Nordsieck's Differential Analyzer was an analog computer used for solving differential equations.

Any physical phenomenon that represents information in definite, discrete states is doing the same thing a digital computer does; it's just a question of speed. To see other kinds of computation it's necessary to move into non-digital kinds of information. One beautiful example is the gravity-powered square root finder.

Newtonian physics used to find the square root of numbers. Put a marble next to a number, N, (white dots) on the slope, and the marble will land on the ground at a distance proportional to √N.

When you put a marble on a ramp the horizontal distance it will travel before hitting the ground is proportional to the square root of how far up the ramp it started. Another mechanical calculator, the planimeter, can find the area of any shape just by tracing along the edge. Admittedly, a computer could do both calculations a heck of a lot faster, but they're still decent enough examples.
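For the curious, the square-root behaviour drops straight out of Newtonian mechanics, and it only takes a few lines to check. The sketch below uses made-up ramp dimensions and ignores the marble's rotation, so treat it as a back-of-the-envelope illustration rather than a model of the actual device.

import math

g = 9.81             # gravitational acceleration, m/s^2
table_height = 1.0   # height of the ramp's exit above the floor, m (made up)

def landing_distance(start_height):
    # Exit speed from energy conservation (rotation ignored), fall time from free fall.
    exit_speed = math.sqrt(2 * g * start_height)
    fall_time = math.sqrt(2 * table_height / g)
    return exit_speed * fall_time   # = 2 * sqrt(start_height * table_height)

for h in (0.25, 0.5, 1.0):
    print(h, round(landing_distance(h), 3))

The exit speed goes as the square root of the drop, while the fall time is fixed by the table height, so the landing distance is proportional to the square root of the release height: quadrupling the height doubles the distance.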

Despite the power of digital computers, it doesn't take much looking around to find problems that can't be done efficiently on them, but that can be done using more "natural" devices. For example, solutions to "harmonic functions with Dirichlet boundary conditions" (soap films) can be fiendishly difficult to calculate in general. The huge range of possible shapes that the solutions can take means that often even the most reasonable computer program (capable of running in any reasonable time) will fail to find all the solutions.

Part of Richard Courant's face demonstrating a fancy math calculation using soapy water and wires.

So, rather than burning through miles of chalkboards and swimming pools of coffee, you can bend wires to fit the boundary conditions, dip them in soapy water, and see what you get. One of the advantages, not generally mentioned in the literature, is that playing with bubbles is fun.
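For contrast, here is roughly what the brute-force numerical route looks like: a Jacobi relaxation solver for Laplace's equation with Dirichlet (fixed) boundary values on a small square grid. The grid size, boundary values and tolerance are arbitrary choices of mine, and a flat grid is a far easier problem than a real soap film stretched over an arbitrary wire frame, so this is only a toy sketch of the idea.

# Jacobi relaxation for Laplace's equation on an N x N grid with fixed
# (Dirichlet) boundary values -- a toy stand-in for the soap-film problem.

N = 20
grid = [[0.0] * N for _ in range(N)]

# Fixed boundary: one edge held at height 1 (like a bent wire), the rest at 0.
for j in range(N):
    grid[0][j] = 1.0

for sweep in range(10000):
    new = [row[:] for row in grid]
    max_change = 0.0
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            # Each interior point relaxes toward the average of its neighbours.
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j] +
                                grid[i][j - 1] + grid[i][j + 1])
            max_change = max(max_change, abs(new[i][j] - grid[i][j]))
    grid = new
    if max_change < 1e-5:   # stop once the surface has settled
        break

print("converged after", sweep + 1, "sweeps; centre height =", round(grid[N // 2][N // 2], 3))

The soap film does the equivalent of all those sweeps instantly, for any wire shape you can bend.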

Today we're seeing the advent of a new type of computer, the quantum computer, which is kinda-digital/kinda-analog. Using quantum mechanical properties like superposition and entanglement, quantum computers can (or would, if we can get them off the ground) solve problems that would take even very powerful normal computers a tremendously long time to solve, like integer factorization. "Long time" here means that the heat death of the universe becomes a concern. Long time.
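For a rough sense of scale on the factoring example: the obvious classical approach is trial division, sketched below (my own illustration, not from the post). Its running time grows roughly with the square root of the number being factored, which is why numbers with hundreds of digits, of the kind used in cryptography, are effectively out of reach for it, whereas Shor's quantum algorithm would handle them in polynomial time.

def trial_division(n):
    # Naive factoring: try every candidate divisor up to sqrt(n).
    # Fine for small n; hopeless for cryptographic-sized numbers.
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(2021))   # [43, 47]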

Aside from actual computers, you can think of the universe itself, in a... sideways, philosophical sense, as doing simulations of itself that we can use to understand it. For example, one of the more common questions we get is along the lines of "how do scientists calculate the probability/energy of such-and-such chemical/nuclear reaction?". There are certainly methods to do the calculations (have Schrödinger equation, will travel), but really, if you want to get it right (and often save time), the best way to do the calculation is to let nature do it. That is, the best way to calculate atomic spectra, or how hot fire is, or how stable an isotope is, or whatever, is to go to the lab and just measure it.

Even cooler, a lot of optimization problems can be solved by looking at the biological world. Evolution is, ideally, a process of optimization (though not always). During the early development of sonar and radar there were (and still are) a number of questions about what kind of "ping" would return the greatest amount of information about the target. After a hell of a lot of effort it was found that the researchers had managed to re-create the sonar pings of several bat species. Bats are still studied for what the universe has already "calculated" about optimal sonar techniques.

You can usually find a solution through direct computation, but sometimes looking around works just as well.


alphadogg writes "Judea Pearl, a longtime UCLA professor whose work on artificial intelligence laid the foundation for such inventions as the iPhone's Siri speech recognition technology and Google's driverless cars, has been named the 2011 ACM Turing Award winner. The annual Association for Computing Machinery A.M. Turing Award, sometimes called the 'Nobel Prize in Computing,' recognizes Pearl for his advances in probabilistic and causal reasoning. His work has enabled creation of thinking machines that can cope with uncertainty, making decisions even when answers aren't black or white."



Read more of this story at Slashdot.
