Technology Lab

Original author: Dan Goodin


Federal authorities have accused eight men of participating in 21st-century bank heists that netted a whopping $45 million by hacking into payment systems and eliminating withdrawal limits placed on prepaid debit cards.

The eight men formed the New York-based cell of an international crime ring that organized and executed the hacks and then used fraudulent payment cards in dozens of countries to withdraw the loot from automated teller machines, federal prosecutors alleged in court papers unsealed Thursday. In a matter of hours on two separate occasions, the eight defendants and their confederates withdrew about $2.8 million from New York City ATMs alone. At the same time, "cashing crews" in cities in at least 26 countries withdrew more than $40 million in a similar fashion.

Prosecutors have labeled this type of heist an "unlimited operation" because it systematically removes the withdrawal limits normally placed on debit card accounts. These restrictions work as a safety mechanism that caps the amount of loss that banks normally face when something goes wrong. The operation removed the limits by hacking into two companies that process online payments for prepaid MasterCard debit card accounts issued by two banks—the National Bank of Ras Al-Khaimah PSC in the United Arab Emirates and the Bank of Muscat in Oman—according to an indictment filed in federal court in the Eastern District of New York. Prosecutors didn't identify the payment processors except to say one was in India and the other in the United States.

Read 3 remaining paragraphs | Comments

Original author: Sean Gallagher


Alpha.data.gov, an experimental data portal created under the White House's Open Data Initiative. (Credit: Data.gov)

President Barack Obama issued an executive order today that aims to make "open and machine-readable" data formats a requirement for all new government IT systems. The order would also apply to existing systems that are being modernized or upgraded. If implemented, the mandate would bring new life to efforts started by the Obama administration with the launch of Data.gov four years ago. It would also expand an order issued in 2012 to open up government systems with public interfaces for commercial app developers.

"The default state of new and modernized Government information resources shall be open and machine readable," the president's order reads. "Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable." The order, however, also requires that this new "default state" protect personally identifiable information and other sensitive data on individual citizens, as well as classified information.

Broadening the “open” mandate

The president's mandate was initially pushed forward by former Chief Information Officer of the United States Vivek Kundra. In May of 2009, Data.gov launched with an order that required agencies to provide at least three "high-value data sets" through the portal.

Read 6 remaining paragraphs | Comments

Original author: Jon Brodkin


The Arduino Due. (Credit: Arduino)

Raspberry Pi has received the lion's share of attention devoted to cheap, single-board computers in the past year. But long before the Pi was a gleam in its creators' eyes, there was the Arduino.

Unveiled in 2005, Arduino boards don't have the CPU horsepower of a Raspberry Pi. They don't run a full PC operating system either. Arduino isn't obsolete, though—in fact, its plethora of connectivity options makes it the better choice for many electronics projects.

While the Pi has 26 GPIO (general purpose input/output) pins that can be programmed to do various tasks, the Arduino Due (the latest Arduino, released in October 2012) has 54 digital I/O pins, 12 analog input pins, and two analog output pins. Among those 54 digital I/O pins, 12 provide pulse-width modulation (PWM) output.
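
For a sense of what programming those pins looks like, here is a minimal Arduino sketch (the boards are programmed in a C++ dialect) that reads one of the analog inputs and echoes it to a PWM-capable digital pin. The pin choices below are arbitrary examples, not anything specified in the article.

    // Minimal sketch: read an analog input and mirror it as a PWM duty cycle.
    // Pin numbers are illustrative; any of the Due's PWM-capable pins would do.
    const int pwmPin = 9;      // one of the PWM-capable digital I/O pins
    const int sensorPin = A0;  // one of the analog input pins

    void setup() {
      pinMode(pwmPin, OUTPUT);
    }

    void loop() {
      int reading = analogRead(sensorPin);       // 0-1023 at the default resolution
      int duty = map(reading, 0, 1023, 0, 255);  // rescale to the 8-bit PWM range
      analogWrite(pwmPin, duty);                 // pulse-width modulation output
      delay(10);
    }

Driving motors or dimming LEDs is exactly the kind of job those 12 PWM-capable pins are for.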

Read 64 remaining paragraphs | Comments

Original author: Stack Exchange


This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 100+ Q&A sites.

Chris Devereaux has recently been reading up on "Literate Programming," an innovative (and mostly unused) approach to programming developed by Donald Knuth in the early 1980s. It got him thinking about new ways of implementing tests. "Well-written tests, especially BDD-style specs, can do a better job of explaining what code does than prose does," he writes in his question. Also, tests "have the big advantage of verifying their own accuracy." Still, Chris has never seen tests written inline with the code they test. Why not?
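
By way of illustration only (nothing here comes from the question itself), tests living inline with the code they test could look something like this plain C++ sketch, where a small self-check sits in the same file as the function it documents and is compiled only when a test flag is defined:

    // Hypothetical example of an inline test; the function and flag names are invented.
    #include <cassert>

    // The production code under test.
    int clamp_to_percent(int value) {
      if (value < 0) return 0;
      if (value > 100) return 100;
      return value;
    }

    // The spec lives right next to the code it describes, doubling as documentation.
    // Build with -DRUN_INLINE_TESTS to compile and run it; omit the flag to strip it out.
    #ifdef RUN_INLINE_TESTS
    int main() {
      assert(clamp_to_percent(-5) == 0);     // values below the range clamp to 0
      assert(clamp_to_percent(42) == 42);    // in-range values pass through unchanged
      assert(clamp_to_percent(150) == 100);  // values above the range clamp to 100
      return 0;
    }
    #endif

Compiled with the flag, the file runs its own spec; compiled without it, the tests vanish from the build, which is one way the arrangement Chris describes could work.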

See the original question here.

Read 21 remaining paragraphs | Comments

Original author: Peter Bright


AMD wants to talk about Heterogeneous Systems Architecture (HSA), its vision for the future of system architectures. To that end, it held a press conference last week to discuss what it's calling "heterogeneous Uniform Memory Access" (hUMA). The company outlined what it was doing, and why, reaffirming the things it has been saying for the last couple of years.

The central HSA concept is that systems will have multiple different kinds of processors, connected together and operating as peers. The two main kinds of processor are the conventional, versatile CPU and the more specialized GPU.

Modern GPUs have enormous parallel arithmetic power, especially floating point arithmetic, but are poorly suited to single-threaded code with lots of branches. Modern CPUs are well suited to single-threaded code with lots of branches, but less well suited to massively parallel number crunching. Splitting workloads between a CPU and a GPU, using each for the tasks it's good at, has driven the development of general-purpose GPU (GPGPU) computing.
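
As a rough sketch of the distinction (plain C++ here, standing in for what would normally be GPU code), the loop below is the sort of embarrassingly parallel arithmetic a GPU excels at: every iteration is independent, so on a GPU each element could be handled by its own thread, with no branches for the hardware to stumble over.

    // SAXPY (y[i] = a*x[i] + y[i]): the textbook data-parallel workload.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    void saxpy(float a, const std::vector<float>& x, std::vector<float>& y) {
      for (std::size_t i = 0; i < y.size(); ++i) {
        y[i] = a * x[i] + y[i];  // each element is independent of the others
      }
    }

    int main() {
      std::vector<float> x(1000, 1.0f), y(1000, 2.0f);
      saxpy(3.0f, x, y);                  // every y[i] becomes 3*1 + 2 = 5
      std::printf("y[0] = %.1f\n", y[0]);
      return 0;
    }

Branch-heavy, inherently sequential code (a parser, say) has no such structure to exploit, which is why the CPU keeps that half of the workload.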

Read 21 remaining paragraphs | Comments

Original author: Stack Exchange


This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 100+ Q&A sites.

It's a response often encountered during technical interviews: "OK, you solved the problem with a while loop, now do it with recursion." Or vice versa. Stack Exchange user Shivan Dragon has encountered the question and knows how to answer it: show that you're able to code it both ways. Give the interviewer what he wants. But which method is generally preferable? A few more experienced programmers respond.
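
For readers who want the exercise made concrete, here is a minimal C++ rendering of both answers, with factorial standing in as an arbitrary example of whatever problem the interviewer might pose:

    // The same computation two ways: a while loop and recursion.
    #include <cstdio>

    // Iterative version: runs in constant stack space.
    unsigned long long factorial_loop(unsigned int n) {
      unsigned long long result = 1;
      while (n > 1) {
        result *= n--;
      }
      return result;
    }

    // Recursive version: mirrors the mathematical definition, one stack frame per call.
    unsigned long long factorial_recursive(unsigned int n) {
      return (n <= 1) ? 1 : n * factorial_recursive(n - 1);
    }

    int main() {
      // Both print 3628800.
      std::printf("%llu %llu\n", factorial_loop(10), factorial_recursive(10));
      return 0;
    }

The loop wins on stack usage; the recursive form often reads closer to the problem's definition. That readability-versus-resources tension is the classic trade-off between the two approaches.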

See the original question here.

Read 19 remaining paragraphs | Comments

Original author: Jon Brodkin

About six weeks ago, users of Facebook's Android application noticed that they were being asked to install a new version—without going to the Google Play app store.

Android is far more permissive than iOS regarding the installation of third-party applications, even permitting installation from sources outside Google Play if the user explicitly allows it. However, it's unusual for applications delivered through the official Google store to receive updates outside of the store's updating mechanism.

Google has now changed the Google Play store policies in an apparent attempt to avoid Facebook-like end runs around store-delivered updates. Under the "Dangerous Products" section of the Google Play developer policies, Google now states that "[a]n app downloaded from Google Play may not modify, replace or update its own APK binary code using any method other than Google Play's update mechanism." A Droid-Life article says the language update occurred Thursday. APK (standing for application package file) is the file format used to install applications on Android.

Read 4 remaining paragraphs | Comments

Original author: Jon Brodkin


Ubuntu 13.04.

The stable release of Ubuntu 13.04 became available for download today, with Canonical promising performance and graphical improvements to help prepare the operating system for convergence across PCs, phones, and tablets.

"Performance on lightweight systems was a core focus for this cycle, as a prelude to Ubuntu’s release on a range of mobile form factors," Canonical said in an announcement today. "As a result 13.04 delivers significantly faster response times in casual use, and a reduced memory footprint that benefits all users."

Named "Raring Ringtail,"—the prelude to Saucy Salamander—Ubuntu 13.04 is the midway point in the OS' two-year development cycle. Ubuntu 12.04, the more stable, Long Term Support edition that is supported for five years, was released one year ago. Security updates are only promised for 9 months for interim releases like 13.04. Support windows for interim releases were recently cut from 18 months to 9 months to reduce the number of versions Ubuntu developers must support and let them focus on bigger and better things.

Read 11 remaining paragraphs | Comments

Original author: Stack Exchange


This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 100+ Q&A sites.

abel is in the early stages of developing a closed-source financial app within a niche market. He is hiring his first employees, and he wants to take steps to ensure these new hires don't steal the code and run away. "I foresee disabling USB drives and DVD writers on my development machines," he writes. But will that be enough? Maybe a better question is: will that be too much?

See the original question here.

Read 18 remaining paragraphs | Comments

Original author: Dan Goodin


Coordinated attacks used to knock websites offline grew meaner and more powerful in the past three months, with an eight-fold increase in the average amount of junk traffic used to take sites down, according to a company that helps customers weather the so-called distributed denial-of-service campaigns.

The average amount of bandwidth used in DDoS attacks mushroomed to an astounding 48.25 gigabits per second in the first quarter, with peaks as high as 130 Gbps, according to Hollywood, Florida-based Prolexic. During the same period last year, bandwidth in the average attack was 6.1 Gbps and in the fourth quarter of last year it was 5.9 Gbps. The average duration of attacks also grew to 34.5 hours, compared with 28.5 hours last year and 32.2 hours during the fourth quarter of 2012. Earlier this month, Prolexic engineers saw an attack that exceeded 160 Gbps, and officials said they wouldn't be surprised if peaks break the 200 Gbps threshold by the end of June.

The spikes are brought on by new attack techniques that Ars first chronicled in October. Rather than using compromised PCs in homes and small offices to flood websites with torrents of traffic, attackers are relying on Web servers, which often have orders of magnitude more bandwidth at their disposal. As Ars reported last week, an ongoing attack on servers running the WordPress blogging application is actively seeking new recruits that could be harnessed into never-before-seen botnets with still more firepower.

Read 9 remaining paragraphs | Comments
