Apparently, I'm a Bitcoin miner now, and it looks like I'm actually pretty good at it. Ars is currently in possession of one of the elusive but very real Butterfly Labs Bitcoin Miners. It's a tiny little black box that fits in the palm of my hand, and it contains a specialized ASIC adept at chewing through SHA-256 cryptographic functions—exactly the kind of calculations necessary to bring more Bitcoins into the world. Turns out, it's very good at what it does: it computes hashes at the rate of about 5.3 billion per second.
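At its core, the computation the ASIC races through is brute-forcing double-SHA-256: hash a candidate block header, check whether the result falls under a difficulty target, bump a nonce, repeat. Here's a toy sketch of that loop in Python (illustrative only; the header layout and zero-byte "difficulty" here are simplifications, not Bitcoin's real block-header format or target encoding):

```python
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    """Bitcoin hashes block headers with two rounds of SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_base: bytes, difficulty_zero_bytes: int = 2) -> int:
    """Brute-force a nonce until the (byte-reversed) hash starts with
    the required number of zero bytes -- a toy stand-in for Bitcoin's
    real difficulty target."""
    nonce = 0
    target_prefix = b"\x00" * difficulty_zero_bytes
    while True:
        header = header_base + struct.pack("<I", nonce)
        digest = double_sha256(header)[::-1]  # conventionally shown big-endian
        if digest.startswith(target_prefix):
            return nonce
        nonce += 1
```

A CPU grinds through this loop a few million times per second; the Butterfly Labs box does the equivalent work thousands of times faster because the double-SHA-256 pipeline is baked directly into silicon.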
The Butterfly Labs Bitcoin Miner. Lee Hutchinson
The Miner disassembled. That 80mm fan gets pretty darn loud. Lee Hutchinson
I've got any number of computers around the house here to try the Butterfly Labs box out with, but I took the masochistic route and chose to try it out on OS X. This took quite a bit of back-and-forth with John O'Mara, creator of the popular MacMiner Bitcoin mining application. After several hours of troubleshooting, we eventually arrived at success. Here it is, happily churning away:
The Butterfly Labs Bitcoin Miner chewing its way through calculations at more than five billion hashes per second. Lee Hutchinson
According to my trusty Kill-A-Watt, the miner is drawing a pretty constant 50 watts at a similarly constant 0.73 amps. Its 80mm fan is whirring at what can only be described as "hair dryer" levels. According to MacMiner, the ASIC is generating a fair amount of heat, too—it's reporting a temperature of more than 80C.
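Taking the Kill-A-Watt and MacMiner figures above at face value, the unit's energy efficiency falls out of simple division:

```python
# Back-of-the-envelope efficiency from the figures reported above.
hash_rate = 5.3e9     # hashes per second
power_draw = 50.0     # watts, i.e. joules per second

hashes_per_joule = hash_rate / power_draw
print(f"{hashes_per_joule / 1e6:.0f} MH per joule")  # prints "106 MH per joule"
```

That's roughly 106 million hashes for every joule consumed, orders of magnitude beyond what a GPU of the era could manage per watt.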
Ilya Grigorik discusses in detail how to construct a mobile website that loads as quickly as possible: a site that not only renders in 1 second, but is also visible in 1 second. With hard statistics as evidence to show why this matters, Ilya discusses techniques for delivering a 1,000-millisecond experience.
Stanford Professor Andrew Ng is bringing back the idea of an artificial intelligence that can think like a person. With Google's Deep Learning project, he's creating machines that take a multi-layered approach to information, building up knowledge and figuring out concepts by passing data between various networks that can each recognize a small piece of it. The approach is designed to mimic how the human brain processes information with neural networks, and it's starting to work — last year, Google's "brain" figured out how to identify cats in YouTube videos without being told that the concept of "cat" existed. Wired has profiled Ng and his work on brain-like computers, a project that also ties into current government-funded brain...
Vint Cerf knows a thing or two about the internet. Along with Bob Kahn, Cerf is credited with developing the initial TCP internet standard over thirty years ago, and now spends his time looking into the future of computer networks in his role as Google's chief internet evangelist. As part of his research, Cerf has worked with NASA for many years developing an "interplanetary internet" that can facilitate communications in high latency environments like space — something that the agency used to control a LEGO robot on Earth from the International Space Station back in 2012. Wired's in-depth interview takes a look at where it all began, the challenges the project faces, and where it can go next.
Chris Devereaux has recently been reading up on "Literate Programming," an innovative (and mostly unused) approach to programming developed by Donald Knuth in the early 1980s. It got him thinking about new ways of implementing tests. "Well-written tests, especially BDD-style specs, can do a better job of explaining what code does than prose does," he writes in his question. Also, tests "have the big advantage of verifying their own accuracy." Still, Chris has never seen tests written inline with the code they test. Why not?
See the original question here.
AMD wants to talk about Heterogeneous Systems Architecture (HSA), its vision for the future of system architectures. To that end, it held a press conference last week to discuss what it's calling "heterogeneous Uniform Memory Access" (hUMA). The company outlined what it's doing and why, reaffirming the things it has been saying for the last couple of years.
The central HSA concept is that systems will have multiple different kinds of processors, connected together and operating as peers. The two main kinds of processors are conventional: versatile CPUs, and the more specialized GPUs.
Modern GPUs have enormous parallel arithmetic power, especially floating point arithmetic, but are poorly suited to single-threaded code with lots of branches. Modern CPUs are well suited to single-threaded code with lots of branches, but less well suited to massively parallel number crunching. Splitting workloads between a CPU and a GPU, using each for the work it's good at, has driven the development of general-purpose GPU (GPGPU) software and tooling.
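The split can be caricatured in a few lines (pure Python standing in for both kinds of hardware; the function names are invented for illustration, not drawn from any GPGPU API):

```python
# Branch-heavy, serial work: each step depends on the last and takes a
# data-dependent path -- the kind of code a single fast CPU core handles well.
def collatz_steps(n: int) -> int:
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Uniform arithmetic over a large array: every element is independent, so
# on a GPU each iteration of this loop could run as its own thread.
def saxpy(a: float, xs: list[float], ys: list[float]) -> list[float]:
    return [a * x + y for x, y in zip(xs, ys)]
```

HSA's pitch is that code like the second function should run on the GPU without the copy-to-device, copy-back choreography that GPGPU programming requires today, because both processors share one coherent view of memory.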
Nerval's Lobster writes "Facebook's Graph Search is an ambitious project: give users the ability to search through the social network's vast webs of data via natural-language queries. But that's much easier said—so to speak—than done. Although human beings think nothing of speaking in 'natural' language, a machine must not only learn all the grammatical building-blocks we take for granted—it needs to compensate for the quirks and errors that inevitably pop up in the course of speech. The Facebook team tasked with building Graph Search also knew that the alternate option, keyword-based search, wasn't a viable one. 'Keywords, which usually consist of nouns or proper nouns, can be nebulous in their intent,' Facebook engineering manager Xiao Li wrote in an April 29 posting on Facebook's blog. 'For example, "friends Facebook" can mean "friends on Facebook," "friends who work at Facebook Inc," or "friends who like Facebook the page."' That left the team with building a natural-language interface. The posting digs deep into the elements of the backend, including everything from 'parse trees' to a lexical analysis system."
Read more of this story at Slashdot.
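The ambiguity Li describes can be made concrete with a toy parser (nothing like Facebook's production system; the grammar and shapes here are invented purely for illustration):

```python
# A toy recursive-descent parse for queries of the shape
# ENTITY "who" RELATION OBJECT, e.g. "friends who work at Facebook".
# Real natural-language search needs far richer grammars and parse trees.
def parse(query: str):
    tokens = query.split()
    head = tokens[0]
    if len(tokens) == 1:
        return (head,)                  # bare entity: "friends"
    if tokens[1] != "who":
        raise ValueError("expected a relative clause after the entity")
    relation = " ".join(tokens[2:-1])   # e.g. "work at", "like"
    obj = tokens[-1]
    return (head, (relation, obj))      # a two-level parse tree
```

Note that the bare keyword query "friends Facebook" doesn't fit this grammar at all, which mirrors Li's point: keywords carry no relation, so the system can't tell which of the three readings the user meant without a structured, natural-language parse.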
When Facebook Home was first unveiled, many observers wondered how Google would react to the overhauled Android experience. As it turns out, company executives have had nothing but kind words for the effort. First there was Eric Schmidt, who described Home as "a tremendous endorsement" of the Google Play ecosystem. Now none other than Matias Duarte (who himself led Android's dramatic visual transformation) has chimed in. "The new Facebook Home shows an incredible amount of polish and attention to design detail, and that didn’t come from a hardware manufacturer," Duarte said in an interview with ABC News.
"With the Home experience, they did a nice job expressing the Facebook experience, but so much of the Google design experience with...
hypnosec writes "Google has publicly released the kernel source code for Google Glass, just a couple of days after the wearable gadget was rooted by Jay Freeman. In releasing the code, Google noted that its current location is only temporary and that it will move to a permanent home soon: 'This is unlikely to be the permanent home for the kernel source, it should be pushed into git next to all other android kernel source releases relatively soon.'"