Original author: 
John Timmer

(Image credit: Stockton.edu)

How do ethics and the free market interact? As the authors of a new paper on the topic point out, the answer is often complicated. In the past, Western economies had vigorous markets for things we now consider entirely unethical, like slaves and Papal forgiveness for sins. Ending those practices took long and bloody struggles. But was this because the market simply reflects the ethics of the day, or does engaging in a market alter people's perception of what's ethical?

To find out, the authors of the paper set up a market for an item that is ethically controversial: the lives of lab animals. They found that, for most people, keeping a mouse alive, even at someone else's cost, is only worth a limited amount of money. But that amount goes down dramatically once market-based buying and selling is involved.

The research was done at the University of Bonn, whose biology department includes researchers who study mouse genetics. As Mendel showed, genes at different loci are inherited independently, so as these researchers breed mice to get a specific combination of genes, they'll inevitably get mice that have the wrong combination. Since proper mouse care is expensive and lab mice typically live a couple of years, it's standard procedure to euthanize these unneeded mice.
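To put rough numbers on that breeding math (a purely illustrative sketch, not taken from the paper): if each desired gene comes from crossing mice heterozygous at that locus, each pup has about a 1-in-4 chance of the right genotype per locus, and the chances multiply across independently assorting loci.

```python
# Illustrative only: expected fraction of offspring carrying the desired
# genotype at every one of n independently assorting loci, assuming a
# heterozygous x heterozygous cross gives a 1/4 chance per locus.

def fraction_with_desired_genotype(n_loci, p_per_locus=0.25):
    """Expected fraction of pups with the target genotype at all n loci."""
    return p_per_locus ** n_loci

# With two loci, only 1 pup in 16 has the right combination on average;
# the other 15 are the "wrong combination" mice the article describes.
print(fraction_with_desired_genotype(2))   # 0.0625
```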

Original author: 
Dan Goodin

(Image credit: Wikipedia)

Federal authorities have accused eight men of participating in 21st-century bank heists that netted a whopping $45 million by hacking into payment systems and eliminating withdrawal limits placed on prepaid debit cards.

The eight men formed the New York-based cell of an international crime ring that organized and executed the hacks and then used fraudulent payment cards in dozens of countries to withdraw the loot from automated teller machines, federal prosecutors alleged in court papers unsealed Thursday. In a matter of hours on two separate occasions, the eight defendants and their confederates withdrew about $2.8 million from New York City ATMs alone. At the same times, "cashing crews" in cities in at least 26 countries withdrew more than $40 million in a similar fashion.

Prosecutors have labeled this type of heist an "unlimited operation" because it systematically removes the withdrawal limits normally placed on debit card accounts. These restrictions work as a safety mechanism that caps the amount of loss that banks normally face when something goes wrong. The operation removed the limits by hacking into two companies that process online payments for prepaid MasterCard debit card accounts issued by two banks—the National Bank of Ras Al-Khaimah PSC in the United Arab Emirates and the Bank of Muscat in Oman—according to an indictment filed in federal court in the Eastern District of New York. Prosecutors didn't identify the payment processors except to say one was in India and the other in the United States.
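As an illustration of why those caps matter, here is a hypothetical sketch (the class and field names are invented, not the processors' actual systems): a per-card daily limit bounds the bank's loss even when card data is stolen, and the "unlimited operation" amounts to deleting that one check.

```python
# Hypothetical illustration of a per-card withdrawal limit as a loss cap.
# All names and logic here are invented for explanation.

class PrepaidCard:
    def __init__(self, balance, daily_limit):
        self.balance = balance
        self.daily_limit = daily_limit  # the cap an "unlimited operation" removes
        self.withdrawn_today = 0.0

    def withdraw(self, amount):
        # This limit check is the safety mechanism: even a cloned card
        # can only drain daily_limit per day, not the whole balance.
        if self.withdrawn_today + amount > self.daily_limit:
            return False
        if amount > self.balance:
            return False
        self.balance -= amount
        self.withdrawn_today += amount
        return True

card = PrepaidCard(balance=1_000_000, daily_limit=500)
print(card.withdraw(400))   # True
print(card.withdraw(400))   # False: blocked by the daily limit
```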

Original author: 
Sean Gallagher


Alpha.data.gov, an experimental data portal created under the White House's Open Data Initiative. (Image credit: Data.gov)

President Barack Obama issued an executive order today that aims to make "open and machine-readable" data formats a requirement for all new government IT systems. The order would also apply to existing systems that are being modernized or upgraded. If implemented, the mandate would bring new life to efforts started by the Obama administration with the launch of Data.gov four years ago. It would also expand an order issued in 2012 to open up government systems with public interfaces for commercial app developers.

"The default state of new and modernized Government information resources shall be open and machine readable," the president's order reads. "Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable." The order, however, also requires that this new "default state" protect personally identifiable information and other sensitive data on individual citizens, as well as classified information.

Broadening the “open” mandate

The president's mandate was initially pushed forward by former Chief Information Officer of the United States Vivek Kundra. In May of 2009, Data.gov launched with an order that required agencies to provide at least three "high-value data sets" through the portal.

Original author: 
Lee Hutchinson

Apparently, I'm a Bitcoin miner now, and it looks like I'm actually pretty good at it. Ars is currently in possession of one of the elusive but very real Butterfly Labs Bitcoin Miners. It's a tiny little black box that fits in the palm of my hand, and it contains a specialized ASIC adept at chewing through SHA-256 cryptographic functions—exactly the kind of calculations necessary to bring more Bitcoins into the world. Turns out, it's very good at what it does: it computes hashes at the rate of about 5.3 billion per second.
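The "chewing through SHA-256" work can be sketched in a few lines. This is a toy version with dummy header bytes and an artificially easy target, not the real Bitcoin header layout; it only shows the shape of what the ASIC repeats billions of times per second.

```python
# Toy mining loop: double SHA-256 over a candidate "header" with an
# incrementing nonce, checking each digest against a difficulty target.
# The header bytes and target here are dummy values for illustration.
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_prefix: bytes, target: int, max_nonce: int = 1_000_000):
    for nonce in range(max_nonce):
        digest = double_sha256(header_prefix + nonce.to_bytes(4, "little"))
        # Bitcoin compares the hash, read as a little-endian integer,
        # against the target; smaller than target means a valid share.
        if int.from_bytes(digest, "little") < target:
            return nonce, digest.hex()
    return None

# A very easy toy target so the loop finishes almost immediately.
result = mine(b"dummy-block-header", target=2**248)
print(result)
```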


The Butterfly Labs Bitcoin Miner. (Image credit: Lee Hutchinson)


Close-up of the ASIC's heat sink and some of the motherboard components. (Image credit: Lee Hutchinson)


The Miner disassembled. That 80mm fan gets pretty darn loud. (Image credit: Lee Hutchinson)

I've got any number of computers around the house here to try the Butterfly Labs box out with, but I took the masochistic route and chose to try it out on OS X. This took quite a bit of back-and-forth with John O'Mara, creator of the popular MacMiner Bitcoin mining application. After several hours of troubleshooting, we eventually arrived at success. Here it is, happily churning away:


The Butterfly Labs Bitcoin Miner chewing its way through calculations at more than five billion hashes per second. (Image credit: Lee Hutchinson)

According to my trusty Kill-A-Watt, the miner is drawing a pretty constant 50 watts at a similarly constant 0.73 amps. Its 80mm fan is whirring at what can only be described as "hair dryer" levels. According to MacMiner, the ASIC is generating a fair amount of heat, too—it's reporting a temperature of more than 80C.
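From those two figures, the miner's energy efficiency works out directly; this is simple arithmetic on the numbers reported above, nothing more.

```python
# Back-of-the-envelope efficiency from the figures in the article:
# ~50 watts sustained while hashing at ~5.3 billion hashes per second.
watts = 50
hashes_per_second = 5.3e9

joules_per_gigahash = watts / (hashes_per_second / 1e9)
nanojoules_per_hash = watts / hashes_per_second * 1e9

print(round(joules_per_gigahash, 2))   # 9.43 J per gigahash
print(round(nanojoules_per_hash, 2))   # 9.43 nJ per hash
```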

Original author: 
Cyrus Farivar

When we left off, former software impresario and all-around goofball John McAfee had fled Central America and landed in, of all places, Portland, Oregon. Now he's speaking publicly, offering up stories about his life like:

I had my right testicle shattered by a hammer in 1974 when I ran afoul of some local drug barons in Oaxaca. It's the size of a grape now and shaped like a small frisbee.

And:

I was also taking more drugs weekly than most of you will do in a lifetime, and I was a totally indiscriminate user. Whatever came across my desk went up my nose, down my throat, in my veins or up the nether region.

The stories get stranger from there.

Original author: 
Jon Brodkin


The Arduino Due. (Image credit: Arduino)

Raspberry Pi has received the lion's share of attention devoted to cheap, single-board computers in the past year. But long before the Pi was a gleam in its creators' eyes, there was the Arduino.

Unveiled in 2005, Arduino boards don't have the CPU horsepower of a Raspberry Pi. They don't run a full PC operating system either. Arduino isn't obsolete, though—in fact, its plethora of connectivity options makes it the better choice for many electronics projects.

While the Pi has 26 GPIO (general purpose input/output) pins that can be programmed to do various tasks, the Arduino Due (the latest Arduino, released in October 2012) has 54 digital I/O pins, 12 analog input pins, and two analog output pins. Among those 54 digital I/O pins, 12 provide pulse-width modulation (PWM) output.

Original author: 
Stack Exchange

(Image credit: Stack Exchange)

This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 100+ Q&A sites.

Chris Devereaux has recently been reading up on "Literate Programming," an innovative (and mostly unused) approach to programming developed by Donald Knuth in the early 1980s. It got him thinking about new ways of implementing tests. "Well-written tests, especially BDD-style specs, can do a better job of explaining what code does than prose does," he writes in his question. Also, tests "have the big advantage of verifying their own accuracy." Still, Chris has never seen tests written inline with the code they test. Why not?

See the original question here.
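For what it's worth, one long-standing mechanism does put tests inline: Python's standard doctest module embeds runnable examples directly in a function's documentation. The `rotate` function below is a made-up illustration, not something from the thread.

```python
# Python's doctest: the examples in the docstring are both documentation
# and executable tests, which is close to the idea Chris is asking about.

def rotate(items, n):
    """Rotate a list left by n positions.

    >>> rotate([1, 2, 3, 4], 1)
    [2, 3, 4, 1]
    >>> rotate([1, 2, 3, 4], 0)
    [1, 2, 3, 4]
    """
    n %= len(items)
    return items[n:] + items[:n]

if __name__ == "__main__":
    import doctest
    doctest.testmod()   # runs the docstring examples as tests
```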

Original author: 
Peter Bright

(Image credit: AMD)

AMD wants to talk about Heterogeneous Systems Architecture (HSA), its vision for the future of system architectures. To that end, it held a press conference last week to discuss what it's calling "heterogeneous Uniform Memory Access" (hUMA). The company outlined what it was doing, and why, both confirming and reaffirming the things it has been saying for the last couple of years.

The central HSA concept is that systems will have multiple different kinds of processors, connected together and operating as peers. The two main kinds of processors are the conventional, versatile CPUs and the more specialized GPUs.

Modern GPUs have enormous parallel arithmetic power, especially floating point arithmetic, but are poorly suited to single-threaded code with lots of branches. Modern CPUs are well suited to single-threaded code with lots of branches, but less well suited to massively parallel number crunching. Splitting workloads between a CPU and a GPU, using each for the work it's good at, has driven the development of general-purpose GPU (GPGPU) software.

Original author: 
Casey Johnston


Why are there so many password restrictions to navigate? Characters want to be free. (Image credit: Daremoshiranai)

The password creation process on different websites can be a bit like visiting foreign countries with unfamiliar social customs. This one requires eight characters; that one lets you have up to 64. This one allows letters and numbers only; that one allows hyphens. This one allows underscores; that one allows @#$&%, but not ^*()[]!—and heaven forbid you try to put a period in there. Sometimes passwords must have a number and at least one capital letter, but no, don’t start the password with the number—what do you think this is, Lord of the Flies?

You can’t get very far on any site today without making a password-protected account for it. Using the same password for everything is bad practice, so new emphasis has emerged on passwords that are easy to remember. Sentences or phrases of even very simple words have surfaced as a practical approach to this problem. As Thomas Baekdal wrote back in 2007, a password that’s just a series of words can be “both highly secure and user-friendly.” But this scheme, as well as other password design tropes like using symbols for complexity, does not pass muster at many sites that specify an upper limit for password length.
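Baekdal's point can be made concrete with a quick entropy comparison. The word-list size and character pool below are illustrative assumptions, not his figures, and both assume each element is chosen uniformly at random.

```python
# Rough entropy comparison of the two schemes the article mentions.
# Assumes uniformly random choices; pool sizes are illustrative.
import math

def entropy_bits(pool_size, length):
    """Bits of entropy for `length` uniform picks from `pool_size` options."""
    return length * math.log2(pool_size)

# An 8-character password from letters, digits, and a handful of symbols...
complex_pw = entropy_bits(pool_size=70, length=8)
# ...versus a four-word passphrase drawn from a 2,048-word list.
passphrase = entropy_bits(pool_size=2048, length=4)

print(round(complex_pw, 1))    # 49.0 bits
print(round(passphrase, 1))    # 44.0 bits
```

The passphrase holds its own against the symbol-laden password while being far easier to remember, and adding a fifth word pushes it to 55 bits; yet it is exactly the kind of password a length cap rejects.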

Most sites seem to have their own particular password bugaboos, but it's rarely, if ever, clear why we can't create passwords as long or short, or as varied or simple, as we want. (Well, the argument against short and simple passwords is concrete, but the others are not immediately obvious.) Whatever generation scheme you favor, some site will find a problem with it: a multi-word passphrase is too long and has no symbols; a gibberish password is too short, and what's the % doing in there?

Original author: 
Stack Exchange

(Image credit: Stack Exchange)

This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 100+ Q&A sites.

It's a response often encountered during technical interviews: "OK, you solved the problem with a while loop, now do it with recursion." Or vice versa. Stack Exchange user Shivan Dragon has encountered the problem and he knows how to answer: show that you're able to code both ways. Give the interviewer what he wants. But which method is generally preferable? A few more experienced programmers respond.

See the original question here.
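For a concrete version of the exercise (a standard textbook example, not code from the thread), here is the same function written both ways:

```python
# The interview exercise in both forms: factorial with a while loop
# and with recursion.

def factorial_iterative(n):
    result = 1
    while n > 1:
        result *= n
        n -= 1
    return result

def factorial_recursive(n):
    # Note: CPython has no tail-call optimization, so very large n will
    # hit the recursion limit; the loop version has no such ceiling.
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

print(factorial_iterative(5), factorial_recursive(5))   # 120 120
```

The recursive version mirrors the mathematical definition more closely; the iterative one uses constant stack space. Which is "preferable" usually comes down to exactly the trade-offs the answerers discuss.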
