
Recursion theory


Alan Turing slate statue at Bletchley Park museum

Flickr user Duane Wessels

June 23 marks the 100th birthday of Alan Turing. If I had to name five people whose personal efforts led to the defeat of Nazi Germany, the English mathematician would surely be on my list. Turing's genius played a key role in helping the Allies win the Battle of the Atlantic, a naval campaign against the Third Reich whose success depended on the cracking and re-cracking of Germany's Enigma cipher. That single espionage victory gave the Allies control of the Atlantic shipping lanes, eventually setting the stage for the 1944 invasion of Normandy.

Alan Turing's Year

2012 is billed as the "Alan Turing Year," and a lengthy compendium of past and future Alan Turing events can be found at the Centenary site hosted by the United Kingdom's Mathematics Trust. The big gathering taking place right now is the Alan Turing Centenary Conference in Manchester.



Hi all,

I'm a soon-to-be college graduate with a math major, a comp sci minor, and a statistics minor. I'm looking for something interesting and related to comp sci to learn this summer. I hope whatever I study will be very interesting and will also improve my programming and problem-solving abilities.

Here are my ideas so far:

  1. Learn Haskell. I've never done anything functional, and I hear Haskell is interesting and makes you a better programmer.

  2. Learn C. Haven't really done any low-level stuff.

  3. Algorithms. I took an algorithms class, but it wasn't too rigorous.

  4. Machine learning. (Seems interesting)

  5. Natural language processing. (Seems interesting)

  6. Set theory and databases. (My job next year will be working with databases)

I'd appreciate any input on what seems like the most interesting or what other suggestions you have. (Don't suggest Project Euler, I do that already).

Thanks!

Edit: Thank you, everybody! I think I'm going to learn a functional language, and that language will be Scheme (or Racket), since I found SICP more awesome than the Haskell resources. Alongside that, I'll keep doing Project Euler and pick up Emacs. Thanks for the advice!

submitted by cslmdt


The original question was: I’ve heard somewhere that they’re also trying to build computers using molecules, like DNA. In general, would it work to simulate a factoring algorithm using real-world things, and then let the physics of the interactions stand in for the computer calculation? Since the real world does all kinds of crazy calculations in no time.

Physicist: The amount of time that goes into, say, calculating how two electrons bounce off of each other is very humbling.  It’s terribly frustrating that the universe has no hang-ups about doing it so fast.

In some sense, basic physical laws are the basis of how all calculations are done.  It’s just that modern computers are “Turing machines”, a very small set of all the possible kinds of computational devices.  Basically, if your calculating machine consists of the manipulation of symbols (which in turn can always be reduced to the manipulation of 1s and 0s), then you’re talking about Turing machines.  In the earlier epoch of computer science there was a strong case for analog computers over digital computers.  Analog computers use the properties of circuit elements like capacitors, inductors, or even just the layout of wiring, to do mathematical operations.  In their heyday they were faster than digital computers.  However, they’re difficult to design, not nearly as versatile, and they’re no longer faster.
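That symbol-manipulation picture can be made concrete in a few lines. Here's a minimal, illustrative sketch (the simulator interface, state names, and rule table are all my own, not from any particular text): a two-rule Turing machine that adds 1 to a binary number written on the tape least-significant bit first.

```python
# Illustrative only: a tiny one-tape Turing machine simulator, plus a
# two-rule machine that increments a binary number stored LSB first.

def run_turing_machine(tape, rules, state="start", blank="0", max_steps=1000):
    """Run a one-tape Turing machine until it halts (or give up)."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# (state, symbol) -> (symbol to write, head move, next state).
# Flip trailing 1s to 0s until the first 0, flip that to 1, then halt:
# that's binary increment when the tape is least-significant bit first.
rules = {
    ("start", "1"): ("0", "R", "start"),
    ("start", "0"): ("1", "R", "halt"),
}

print(run_turing_machine("110", rules))  # "001": 3 + 1 = 4, LSB first
```

Everything a modern digital computer does can, in principle, be reduced to steps like these; the difference is purely one of speed.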

Nordsieck's Differential Analyzer was an analog computer used for solving differential equations.

Any physical phenomenon that represents information in definite, discrete states is doing the same thing a digital computer does; it’s just a question of speed.  To see other kinds of computation it’s necessary to move into non-digital kinds of information.  One beautiful example is the gravity-powered square root finder.

Newtonian physics used to find the square root of numbers. Put a marble next to a number N (white dots) on the slope, and the marble will land on the ground at a distance proportional to √N.

When you put a marble on a ramp, the horizontal distance it will travel before hitting the ground is proportional to the square root of how far up the ramp it started.  Another mechanical calculator, the planimeter, can find the area of any shape just by tracing along the edge.  Admittedly, a computer could do both calculations a heck of a lot faster, but they’re still decent examples.
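As a sanity check on that proportionality, here's a quick sketch of the ramp in code. The release-height scale, the table height, and the frictionless, non-rotating marble are all simplifying assumptions of mine, chosen to keep the physics to two square roots.

```python
import math

# Gravity-powered square-root finder: a marble released at height
# proportional to N leaves the table edge horizontally and lands at a
# distance proportional to sqrt(N). Friction and rotation are ignored.

G = 9.81             # gravitational acceleration, m/s^2
TABLE_HEIGHT = 1.0   # drop from table edge to floor, m

def landing_distance(n, scale=0.01):
    """Horizontal landing distance for a marble released at height scale*n."""
    h = scale * n                        # release height on the ramp
    v = math.sqrt(2 * G * h)             # exit speed at the table edge
    t = math.sqrt(2 * TABLE_HEIGHT / G)  # free-fall time to the floor
    return v * t                         # = 2 * sqrt(h * TABLE_HEIGHT)

# The distances scale as sqrt(N): quadrupling N doubles the distance.
for n in (1, 4, 9, 16):
    print(n, round(landing_distance(n), 4))
```

Note that g cancels out of the final product, which is why the same ramp "computes" square roots regardless of how hard gravity happens to be pulling.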

Despite the power of digital computers, it doesn’t take much looking around to find problems that can’t be efficiently done on them, but that can be done using more “natural” devices.  For example, solutions to “harmonic functions with Dirichlet boundary conditions” (soap films) can be fiendishly difficult to calculate in general.  The huge range of possible shapes that the solutions can take means that often even the most reasonable computer program (capable of running in any reasonable time) will fail to find all the solutions.

Part of Richard Courant's face demonstrating a fancy math calculation using soapy water and wires.

So, rather than burning through miles of chalkboards and swimming pools of coffee, you can bend wires to fit the boundary conditions, dip them in soapy water, and see what you get.  One of the advantages, not generally mentioned in the literature, is that playing with bubbles is fun.

Today we’re seeing the advent of a new type of computer, the quantum computer, which is kinda-digital/kinda-analog.  Using quantum mechanical properties like superposition and entanglement, quantum computers can (or would, if we could get them off the ground) solve problems that would take even very powerful normal computers a tremendously long time to solve, like integer factorization.  “Long time” here means that the heat death of the universe becomes a concern.  Long time.
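For a sense of why factoring earns that reputation: the obvious classical method, trial division, does work that grows with the square root of the number being factored, which is hopeless at cryptographic sizes (Shor's quantum algorithm would, in principle, do it in polynomial time). A quick sketch:

```python
# Trial division: the simplest classical factoring method. It checks
# divisors up to sqrt(n), so its running time explodes as n grows;
# this is why factoring is the canonical "long time" problem.

def trial_division(n):
    """Return the prime factors of n in nondecreasing order."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:            # whatever is left is itself prime
        factors.append(n)
    return factors

print(trial_division(2012))  # [2, 2, 503]
```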

Aside from actual computers, you can think of the universe itself, in a… sideways, philosophical sense, as doing simulations of itself that we can use to understand it.  For example, one of the more common questions we get is along the lines of “how do scientists calculate the probability/energy of such-and-such chemical/nuclear reaction?”  There are certainly methods to do the calculations (have Schrödinger equation, will travel), but really, if you want to get it right (and often save time), the best way to do the calculation is to let nature do it.  That is, the best way to calculate atomic spectra, or how hot fire is, or how stable an isotope is, or whatever, is to go to the lab and just measure it.

Even cooler, a lot of optimization problems can be solved by looking at the biological world.  Evolution is, ideally, a process of optimization (though not always).  During the early development of sonar and radar there were (and still are) a number of questions about what kind of “ping” would return the greatest amount of information about the target.  After a hell of a lot of effort it was found that the researchers had managed to re-create the sonar pings of several bat species.  Bats are still studied for what the universe has already “calculated” about optimal sonar techniques.

You can usually find a solution through direct computation, but sometimes looking around works just as well.


alphadogg writes "Judea Pearl, a longtime UCLA professor whose work on artificial intelligence laid the foundation for such inventions as the iPhone's Siri speech recognition technology and Google's driverless cars, has been named the 2011 ACM Turing Award winner. The annual Association for Computing Machinery A.M. Turing Award, sometimes called the 'Nobel Prize in Computing,' recognizes Pearl for his advances in probabilistic and causal reasoning. His work has enabled the creation of thinking machines that can cope with uncertainty, making decisions even when answers aren't black or white."






Yesterday's keynote at the 28th Chaos Computer Congress (28C3) by Meredith Patterson on "The Science of Insecurity" was a tour-de-force explanation of the formal linguistics and computer science that explain why software becomes insecure, and an explanation of how security can be dramatically increased. What's more, Patterson's slides were outstanding Rageface-meets-Occupy memeshopping. Both the video and the slides are online already.


Hard-to-parse protocols require complex parsers. Complex, buggy parsers become weird machines for exploits to run on. Help stop weird machines today: Make your protocol context-free or regular!

Protocols and file formats that are Turing-complete input languages are the worst offenders, because for them, recognizing valid or expected inputs is UNDECIDABLE: no amount of programming or testing will get it right.

A Turing-complete input language destroys security for generations of users. Avoid Turing-complete input languages!
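In practice, keeping an input language regular means the entire question "is this input valid?" is decided by one finite automaton before any handler code runs. Here's a toy sketch of that idea; the message format is made up for illustration, not taken from the talk.

```python
import re

# A made-up, regular message grammar: "GET <name>\n", where <name> is
# 1-32 lowercase letters. Because the language is regular, one compiled
# pattern (a finite automaton) decides validity for the whole input,
# up front, with no ambiguity for an attacker's "weird machine" to exploit.
MESSAGE = re.compile(r"\AGET [a-z]{1,32}\n\Z")

def recognize(raw: bytes) -> bool:
    """Accept or reject the entire input before any handler code runs."""
    try:
        text = raw.decode("ascii")
    except UnicodeDecodeError:
        return False  # bytes outside the grammar's alphabet: reject
    return MESSAGE.match(text) is not None

print(recognize(b"GET index\n"))          # True
print(recognize(b"GET ../etc/passwd\n"))  # False: '.' and '/' aren't in the grammar
```

The point of the slogan is that this reject-everything-else property is only guaranteed when the grammar is regular or context-free; once the input language is Turing-complete, no recognizer of this kind can exist.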

Patterson's co-authors on the paper were her late husband, Len Sassaman (eulogized here), and Sergey Bratus.

LANGSEC explained in a few slogans
