technology team

The tech unit's sign, autographed by its members.

The reelection of Barack Obama was won by people, not by software. But in a contest as close as last week's election, software may have given the Obama for America organization's people a tiny edge—making them by some measures more efficient, better connected, and more engaged than the competition.

That edge was provided by the work of a group of people unique in the history of presidential politics: Team Tech, a dedicated internal team of technology professionals who operated like an Internet startup, leveraging a combination of open source software, Web services, and cloud computing power. The result was the sort of numbers any startup would consider a success. As Scott VanDenPlas, the head of the Obama technology team's DevOps group, put it in a tweet:

4Gb/s, 10k requests per second, 2,000 nodes, 3 datacenters, 180TB and 8.5 billion requests. Design, deploy, dismantle in 583 days to elect the President. #madops

make it bounce!

Recently BLITZ was asked about modifying an audio-driven Flash site we completed a few years ago. In that time, the technology driving rich consumer experiences on the web has shifted from a plugin-based platform dominated by Adobe Flash to a reliance on the native capabilities of the browser. This matters because mobile is now where browsing is headed, and mobile browsers don't run Flash. The old project used Flash's ability to extract sound data from a playing song and displayed an on-the-fly visualization of that data, similar to a visual EQ. This ability does not currently exist in most native JavaScript engines, though it has been proposed. That got us wondering whether we could get the same effect running in a non-Flash mobile browser. iOS currently offers no way to extract this data from a sound file at runtime. After spending a couple of hours thinking through the possibilities, we settled on a solution: preprocess the audio file, extract the sound spectrum data ahead of time, and process that data in the browser.

The gist of the data-extraction process looks like this:

An AIR application loads the mp3 file and lets the song play. As it plays, every 100 milliseconds it grabs the sound spectrum and selects parts of it to use. Originally we were planning on using all 256 data points from the right channel, but a single 4-minute mp3 file was spitting out an 11 MB JSON file. Obviously this was a tad excessive… So after some tweaking we settled on grabbing 50 data points (every 5 from 0 to 250), which decreased the file size from 11 MB down to 2.5 MB. Trimming the values down to only three decimal places (thank you, Nick Vincent, for the idea) got it down to 800 KB, a size we were happy with. At each 100-millisecond marker, we grab the data and inject it into an object, using the time in milliseconds (rounded to the nearest 100 ms) as the key. Once we have the object populated, we serialize it to JSON and save it as a text file.
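The extraction step above ran in AIR/ActionScript, but the downsampling and keying logic can be sketched in plain JavaScript. This is a hypothetical illustration, not the actual BLITZ tool: `sampleSpectrum` and `buildTimeline` are made-up names, and `spectrum` stands in for the 256 right-channel values captured at each 100 ms tick.

```javascript
// Keep every 5th of the 256 spectrum values (50 points total),
// trimmed to three decimal places to shrink the serialized JSON.
function sampleSpectrum(spectrum) {
  const points = [];
  for (let i = 0; i < 250; i += 5) {
    points.push(Math.round(spectrum[i] * 1000) / 1000);
  }
  return points;
}

// frames: array of { time: <ms as captured>, spectrum: <256 floats> }.
// Rounds each timestamp to the nearest 100 ms and uses it as the key,
// exactly as the post describes. JSON.stringify(timeline) would then
// produce the text file the browser downloads.
function buildTimeline(frames) {
  const timeline = {};
  for (const frame of frames) {
    const key = Math.round(frame.time / 100) * 100;
    timeline[key] = sampleSpectrum(frame.spectrum);
  }
  return timeline;
}
```

Rounding the key means playback code only ever has to look up multiples of 100, so no searching or interpolation is needed on the browser side.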

Once we had the data, it was just a matter of implementation: drop 50 dots in the DOM, each corresponding to one of the 50 data points of the sound spectrum, and move them around as the song plays. Implementation can vary, so we didn't spend any real time cleaning things up for reuse; the purpose was more of a proof of concept. If we were going to get serious about it, we would write a wrapper JS file that you would pass an audio element and the path to the JSON file. Then you'd hook up your events and let it handle the rest. If anyone has interest in taking this task on, please let us know. We'd actually love to build it out, time permitting.
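The playback side described above can be sketched as follows. This is a minimal, assumption-laden illustration, not BLITZ's code: it assumes a `timeline` object keyed by 100 ms timestamps (parsed from the JSON file), an `<audio>` element, and an array of 50 dot elements; the pixel scaling is arbitrary.

```javascript
// Look up the spectrum frame for the audio element's current position.
// currentTime is in seconds, so multiply by 10 and round to land on a
// 100 ms key (e.g. 0.21 s -> key 200).
function currentFrame(timeline, audioEl) {
  const key = Math.round(audioEl.currentTime * 10) * 100;
  return timeline[key] || null;
}

// Reposition one dot per spectrum value on every animation frame.
// The -100px multiplier is an arbitrary choice for this sketch.
function render(timeline, audioEl, dots) {
  const frame = currentFrame(timeline, audioEl);
  if (frame) {
    dots.forEach((dot, i) => {
      dot.style.transform = `translateY(${-frame[i] * 100}px)`;
    });
  }
  requestAnimationFrame(() => render(timeline, audioEl, dots));
}
```

Because the keys are pre-rounded, each lookup is a single object access per frame, which is cheap enough to run inside `requestAnimationFrame` even on mobile.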

You can view the demo here:

Please note: the demo only works in Safari, Chrome, and on iOS (that was the challenge). You could easily rig this to work in Firefox; it just wasn't a priority.

And you can get the source here:

The source has both the AIR application and a www folder containing the demo. To get the AIR application running, you have to open it in Flash and build it; I didn't create an actual .air file. You will also have to run Sass to compile the CSS (sorry, it's just how we do things here at BLITZ).

Coming from the Flash world, I was very much sick of sound visualization. So please don't take this and just create another sound visualizer… that's lame. If you have any practical use, please let me know. Also, one disclaimer: all of this needs a good deal of polish. Again, it was just a POC spike.

If you’ve read all the way to the end of this article and are interested in doing work like this, we’re always looking for people to join the technology team. Check out our Careers Page for a list of open positions.

Part of the Rockstar San Diego studio, the RAGE team is Rockstar's central technology group. The RAGE engine drives games such as Red Dead Redemption, GTA IV, and Max Payne 3. RAGE is looking for an energetic Web/.NET developer to help expand the features and capabilities of our online technologies. We are looking for someone who is not afraid of new challenges and wants to work with a technology team and company focused on redefining what an online gaming experience can be.

** TO APPLY, PLEASE VISIT **

Monolith Productions, a division of WB Games Inc., seeks a Staff Software Engineer - Game Systems to join the award-winning technology team behind the F.E.A.R.(tm) and Condemned(tm) franchises, creating game systems for exciting, cutting-edge action games on PlayStation 3, Xbox 360, and PC. As a member of our studio's Core Technology team, you will work closely with engineers across all disciplines, providing systems,