Original author: Soulskill

coop0030 writes "Feel like someone is snooping on you? Browse anonymously anywhere you go with the Onion Pi Tor proxy. This is a fun weekend project from Adafruit that uses a Raspberry Pi, a USB WiFi adapter, and an Ethernet cable to create a small, low-power, portable privacy Pi."



Original author: Sean Gallagher


NSA Headquarters in Fort Meade, MD. (Photo: mjb)

One organization's data centers hold the contents of much of the visible Internet—and much of it that isn't visible just by clicking your way around. It has satellite imagery of much of the world and ground-level photography of homes and businesses and government installations tied into a geospatial database that is cross-indexed to petabytes of information about individuals and organizations. And its analytics systems process the Web search requests, e-mail messages, and other electronic activities of hundreds of millions of people.

No one at this organization actually "knows" everything about what individuals are doing on the Web, though there is certainly the potential for abuse. By policy, all of the "knowing" happens in software, while the organization's analysts generally handle exceptions (like violations of the law) picked from the flotsam of the seas of data that their systems process.

I'm talking, of course, about Google. Most of us are okay with what Google does with its vast supply of "big data," because we largely benefit from it—though Google does manage to make a good deal of money off of us in the process. But if I were to backspace over Google's name and replace it with "National Security Agency," that would leave a bit of a different taste in many people's mouths.



CloudFlare's CDN is built on Anycast, a routing technique that leverages the Border Gateway Protocol (BGP)—the routing protocol that's at the center of how the Internet directs traffic. Anycast is part of how BGP supports the multi-homing of IP addresses, in which multiple routers connect a network to the Internet; from the advertisements of IP addresses reachable through each router, other routers determine the shortest path for network traffic to take to reach that destination.

Anycast lets CloudFlare make the servers it fronts appear to be in many places at once, while using only one IP address. "If you do a traceroute to Metallica.com (a CloudFlare customer), depending on where you are in the world, you would hit a different data center," Prince said. "But you're getting back the same IP address."

That means that as CloudFlare adds more data centers, and those data centers advertise the IP addresses of the websites fronted by the service, the Internet's core routers automatically re-map their routes to those IP addresses. There's no need to do anything special with the Domain Name System (DNS) to load-balance network traffic across sites, other than pointing each site's hostname at CloudFlare's IP address. It also means that when a specific data center needs to be taken down for an upgrade or maintenance (or gets knocked offline for some other reason), the routes can be adjusted on the fly.
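
As a concrete illustration, here's a minimal Python sketch of that behavior, with made-up sites and hop counts (real BGP best-path selection weighs many more attributes than path length): every data center advertises the same prefix, each router picks the shortest path, and withdrawing one site's route shifts traffic to the next-closest site with no DNS change.

    # A toy model of anycast failover (not CloudFlare's actual routing code).
    # Hypothetical path lengths, in hops, from one vantage point to each
    # data center advertising the same anycast prefix.
    advertisements = {
        "seoul":     {"prefix": "203.0.113.0/24", "path_len": 2},
        "san_jose":  {"prefix": "203.0.113.0/24", "path_len": 7},
        "stockholm": {"prefix": "203.0.113.0/24", "path_len": 11},
    }

    def best_route(ads):
        """Pick the advertisement with the shortest path, as a (greatly
        simplified) BGP best-path selection would."""
        return min(ads, key=lambda site: ads[site]["path_len"])

    print(best_route(advertisements))   # -> seoul

    # Take Seoul down for maintenance: withdraw its route, and the same
    # lookup now lands on the next-closest data center automatically.
    del advertisements["seoul"]
    print(best_route(advertisements))   # -> san_jose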

That makes it much harder for distributed denial of service (DDoS) attacks to go after servers behind CloudFlare's CDN: if the attacking machines are geographically widespread, the traffic they generate gets spread across all of CloudFlare's data centers—as long as the network connections at each site aren't overwhelmed.


Illustration: Aurich Lawson / Thinkstock

The corporate data center is undergoing a major transformation the likes of which haven't been seen since Intel-based servers started replacing mainframes decades ago. It isn't just the server platform: the entire infrastructure from top to bottom is seeing major changes as applications migrate to private and public clouds, networks get faster, and virtualization becomes the norm.

All of this means tomorrow's data center is going to look very different from today's. Processors, systems, and storage are becoming more tightly integrated, more virtualized, and better at making use of greater networking and Internet bandwidth. At the heart of these changes are major advances in networking. We're going to examine six specific trends driving the evolution of the next-generation data center and discover what both IT insiders and end-user departments outside of IT need to do to prepare for these changes.

Beyond 10Gb networks

Network connections are getting faster, to be sure. Today it's common to find 10-gigabit Ethernet (10GbE) connections to some large servers. But even 10GbE isn't fast enough for data centers that are heavily virtualized or handling large-scale streaming audio/video applications. As your population of virtual servers increases, you need faster networks to handle the higher traffic loads. Starting up a new virtual server might save you from buying a physical server, but it doesn't lessen the data traffic over the network—in fact, depending on how your virtualization infrastructure works, a virtual server can put far more load on the network than a physical one. And as ordinary enterprises use more audio and video applications in everyday business situations, file sizes balloon too, resulting in multi-gigabyte files that can quickly fill up your pipes—even the big 10Gb internal pipes that make up your data center's LAN.
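
Some back-of-the-envelope arithmetic (my own illustration, not figures from the article) shows how quickly those pipes fill:

    # Rough transfer times for a multi-gigabyte media file at various link
    # speeds, assuming ~70% usable throughput after protocol overhead and
    # contention -- a rough, hedged figure, not a measured one.
    def transfer_seconds(file_gb, link_gbps, efficiency=0.7):
        file_gigabits = file_gb * 8
        return file_gigabits / (link_gbps * efficiency)

    for link_gbps in (1, 10, 40):
        t = transfer_seconds(file_gb=25, link_gbps=link_gbps)  # 25 GB video master
        print(f"{link_gbps:>2} Gbps link: {t:6.1f} s per 25 GB file")
    # ->  1 Gbps link:  285.7 s per 25 GB file
    # -> 10 Gbps link:   28.6 s per 25 GB file
    # -> 40 Gbps link:    7.1 s per 25 GB file

A handful of simultaneous transfers like that, plus the chatter of a dense virtual-server population, and even a 10Gb backbone starts to look cramped.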



The inside of Equinix's co-location facility in San Jose—the home of CloudFlare's primary data center. (Photo: Peter McCollough/Wired.com)

On August 22, CloudFlare, a content delivery network, turned on a brand new data center in Seoul, Korea—the last of ten new facilities started across four continents in a span of thirty days. The Seoul data center brought CloudFlare's number of data centers up to 23, nearly doubling the company's global reach—a significant feat in itself for a company of just 32 employees.

But there was something else notable about the Seoul data center and the other nine facilities set up this summer: despite the fact that the company owned every router and every server in its racks, and each had been configured with great care to handle the demands of CloudFlare's CDN and security services, no one from CloudFlare had ever set foot in them. All that came from CloudFlare directly was a six-page manual instructing facility managers and local suppliers on how to rack and plug in the boxes shipped to them.

"We have nobody stationed in Stockholm or Seoul or Sydney, or a lot of the places that we put these new data centers," CloudFlare CEO Matthew Prince told Ars. "In fact, no CloudFlare employees have stepped foot in half of the facilities where we've launched." The totally remote-controlled data center approach used by the company is one of the reasons that CloudFlare can afford to provide its services for free to most of its customers—and still make a 75 percent profit margin.



Lately I've been trying to rid my life of as many physical artifacts as possible. I'm with Merlin Mann on CDs:

Can't believe how quickly CDs went from something I hate storing to something I hate buying to something I hate merely existing.

Although I'd extend that line of thinking to DVDs as well. The death of physical media has some definite downsides, but after owning certain movies once on VHS, then on DVD, and finally on Blu-ray, I think I'm now at peace with the idea of not owning any physical media ever again, if I can help it.

My current strategy of wishing my physical media collection into a cornfield involves shipping all our DVDs to Second Spin via media mail, and paying our nephew $1 per CD to rip our CD collection using Exact Audio Copy and LAME as a summer project. The point of this exercise is absolutely not piracy; I have no interest in keeping both digital and physical copies of the media I paid for the privilege of owning (er, temporarily licensing). Note that I didn't bother ripping any of the DVDs because I hardly ever watched them; mostly they just collected dust. But I continue to love music and listen to my music collection on a daily basis. I'll donate all the ripped CDs to some charity or library, and if I can't pull that off, I'll just destroy them outright. Stupid atoms!

CDs, unlike DVDs or even Blu-rays, are considered reference quality. That is, the uncompressed digital audio data contained on a CD is a nearly perfect representation of the original studio master, for most reasonable people's interpretation of "perfect", at least back in 1980. So if you paid for a CD, you might be worried that ripping it to a compressed digital audio format would result in an inferior listening experience.

I'm not exactly an audiophile, but I like to think I have pretty good ears. I've recommended buying $200+ headphones and headphone amps for quite a while now. By the way: still a good investment! Go do it! Anyhow, previous research and my own experiments led me to write Getting the Best Bang for Your Byte seven years ago. I concluded that nobody could really hear the difference between a raw CD track and an MP3 using a decent encoder at a variable bit rate averaging around 160kbps. Any bit rate higher than that was just wasting space on your device and your bandwidth for no rational reason. So-called "high resolution audio" was recently thoroughly debunked for very similar reasons.

Articles last month revealed that musician Neil Young and Apple's Steve Jobs discussed offering digital music downloads of 'uncompromised studio quality'. Much of the press and user commentary was particularly enthusiastic about the prospect of uncompressed 24 bit 192kHz downloads. 24/192 featured prominently in my own conversations with Mr. Young's group several months ago.

Unfortunately, there is no point to distributing music in 24-bit/192kHz format. Its playback fidelity is slightly inferior to 16/44.1 or 16/48, and it takes up 6 times the space.

There are a few real problems with the audio quality and 'experience' of digitally distributed music today. 24/192 solves none of them. While everyone fixates on 24/192 as a magic bullet, we're not going to see any actual improvement.

The authors of LAME must have agreed with me, because the typical, standard, recommended, default way of encoding any old audio input to MP3 …

lame --preset standard "cd-track-raw.wav" "cd-track-encoded.mp3"

… now produces variable bit rate MP3 tracks at a bitrate of around 192kbps on average.


(Going down one level to the "medium" preset produces nearly exactly 160kbps average, my 2005 recommendation on the nose.)
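
If you're curious what the nephew's summer project looks like in practice, it boils down to a loop over ripped WAV files. Here's a sketch in Python; the rips/ directory layout is my own invention, and it assumes the lame binary is on your PATH:

    # Batch-encode ripped WAV tracks with LAME's standard preset -- the
    # same command shown above, applied to a whole directory tree.
    import subprocess
    from pathlib import Path

    for wav in sorted(Path("rips").glob("**/*.wav")):
        mp3 = wav.with_suffix(".mp3")
        if mp3.exists():
            continue  # skip tracks that are already encoded
        # Equivalent to: lame --preset standard "track.wav" "track.mp3"
        subprocess.run(["lame", "--preset", "standard", str(wav), str(mp3)],
                       check=True)
        print(f"encoded {wav} -> {mp3}")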

Encoders have only gotten better since the good old days of 2005. Given the many orders of magnitude improvement in performance and storage since then, I'm totally comfortable with throwing an additional 32kbps in there, going from 160kbps average to 192kbps average just to be totally safe. That's still a minuscule file size compared to the enormous amount of data required for mythical, aurally perfect raw audio. For a particular 4 minute and 56 second music track, that'd be:

Uncompressed raw CD format: 51 MB
Lossless FLAC compression: 36 MB
LAME insane encoded MP3 (320kbps): 11.6 MB
LAME standard encoded MP3 (192kbps avg): 7.1 MB
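
You can sanity-check most of those numbers from the bitrates alone (FLAC is the exception; its size depends on how compressible the audio is), and the arithmetic lands within a megabyte or so of the table, the leftover gap being headers and rounding:

    # Approximate sizes for a 4:56 (296 second) track, from bitrate alone.
    seconds = 4 * 60 + 56

    # Raw CD audio: 44,100 samples/s x 16 bits x 2 channels = 1,411.2 kbps.
    cd_kbps = 44100 * 16 * 2 / 1000

    for label, kbps in [("raw CD audio", cd_kbps),
                        ("320kbps MP3", 320),
                        ("192kbps avg MP3", 192)]:
        mb = kbps * seconds / 8 / 1000  # kilobits -> megabytes
        print(f"{label:16} ~{mb:5.1f} MB")
    # -> raw CD audio     ~ 52.2 MB
    # -> 320kbps MP3      ~ 11.8 MB
    # -> 192kbps avg MP3  ~  7.1 MB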

Ripping to uncompressed audio is a non-starter. I don't care how much of an ultra audio quality nerd you are, spending 7× or 5× the bandwidth and storage for completely inaudible "quality" improvements is a dagger directly in the heart of this efficiency-loving nerd, at least. Maybe if you're planning to do a lot of remixing and manipulation it might make sense to retain the raw source audio, but for typical listening, never.

The difference between the 320kbps track and the 192kbps track is more rational to argue about. But it's still 1.6 times the size. Yes, we have tons more bandwidth and storage and power today, but storage space on your mobile device will never be free, nor will bandwidth or storage in the cloud, where I think most of this stuff should ultimately reside. And all other things being equal, wouldn't you rather be able to fit 200 songs on your device instead of 100? Wouldn't you rather be able to download 10 tracks in the same time instead of 5? Efficiency, that's where it's at. Particularly when people with dog's ears wouldn't even be able to hear the difference.

But Wait, I Have Dog Ears

Of course you do. On the Internet, nobody knows you're a dog. Personally, I think you're a human being full of crap, but let's drop some science on this and see if you can prove it.


When someone tells me "Dudes, come on, let's steer clear of the worst song ever written!", I say challenge accepted. Behold The Great MP3 Bitrate Experiment!

As proposed on our very own Audio and Video Production Stack Exchange, we're going to do a blind test of the same 2 minute excerpt of a particular rock audio track at a few different bitrates, ranging from 128kbps CBR MP3 all the way up to raw uncompressed CD audio. Each sample was encoded (if necessary), then exported to WAV so they all have the same file size. Can you tell the difference between any of these audio samples using just your ears?
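
For reference, here's roughly how samples like these can be produced (my reconstruction, not necessarily the exact commands used for this experiment): encode the excerpt at each target bitrate, then decode back to WAV so every file is identical in size and carries no tell-tale metadata. Which preset actually maps to which cheese is the whole secret of the experiment, so the mapping below is invented:

    # Prepare blind-test samples: encode, then round-trip back to WAV.
    import subprocess

    EXCERPT = "excerpt-raw.wav"  # hypothetical two-minute source file

    presets = {
        "limburger": ["-b", "128"],            # 128kbps CBR (for example)
        "cheddar":   ["--preset", "medium"],   # ~160kbps VBR
        "gouda":     ["--preset", "standard"], # ~192kbps VBR
        "brie":      ["--preset", "insane"],   # 320kbps CBR
    }
    for name, flags in presets.items():
        subprocess.run(["lame", *flags, EXCERPT, f"{name}.mp3"], check=True)
        # lame --decode round-trips the MP3 back to an uncompressed WAV.
        subprocess.run(["lame", "--decode", f"{name}.mp3", f"{name}.wav"],
                       check=True)
    # The fifth sample ("feta" here, say) would be the raw excerpt, untouched.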

1. Listen to each two minute audio sample

Limburger
Cheddar
Gouda
Brie
Feta

2. Rate each sample for encoding quality

Once you've given each audio sample a listen – with only your ears please, not analysis software – fill out this brief form and rate each audio sample from 1 to 5 on encoding quality, where one represents worst and five represents flawless.

Yes, it would be better to use a variety of different audio samples, like SoundExpert does, but I don't have time to do that. Anyway, if the difference in encoding bitrate quality is as profound as certain vocal elements of the community would have you believe it is, that difference should be audible in any music track. To those who might argue that I am trolling audiophiles into listening to one of the worst-slash-best rock songs of all time … over and over and over … to prove a point … I say, how dare you impugn my honor in this manner, sir. How dare you!

I wasn't comfortable making my generous TypePad hosts suffer through the bandwidth demands of multiple 16 megabyte audio samples, so this was a fun opportunity to exercise my long dormant Amazon S3 account, and test out Amazon's on-demand CloudFront CDN. I hope I'm not rubbing any copyright holders the wrong way with this test; I just used a song excerpt for science, man! I'll pull the files entirely after a few weeks just to be sure.

You'll get no argument from me that the old standby of 128kbps constant bit rate encoding is not adequate for most music, even today, and you should be able to hear that in this test. But I also maintain that virtually nobody can reliably tell the difference between a 160kbps variable bit rate MP3 and the raw CD audio, much less 192kbps. If you'd like to prove me wrong, this is your big chance. Like the announcer in Smash TV, I say good luck – you're gonna need it.

So which is it – are you a dog or a man? Give the samples a listen, then rate them. I'll post the results of this experiment in a few days.



McGruber writes "The Wall Street Journal is reporting that tech interns are in high demand in the Bay Area. According to the author, 'Technology giants like Google Inc. have been expanding their summer-intern programs, while smaller tech companies are ramping up theirs in response — sometimes even luring candidates away from college.' Meanwhile in NYC, CIOs lament that they are unable to retain 20-something techies, according to a report in Network World. Says one CIO, 'It puts us in a really uncomfortable position to have this kind of turnover because knowledge keeps walking out the door. We invest in training people and bringing them up to speed to where they need to be, and boom they're gone. That has been my biggest struggle and concern.' It's the pay, stupid!"

