
machine learning system

Original author: John Timmer


The D-Wave Two. (Credit: D-Wave)

D-Wave's quantum optimizer has found a new customer in the form of a partnership created by Google, NASA, and a consortium of research universities. The group is forming what it's calling the Quantum Artificial Intelligence Lab and will locate the computer at NASA's Ames Research Center. Academics will get involved via the Universities Space Research Association.

Although the D-Wave Two isn't a true quantum computer in the sense the term is typically used, D-Wave's system uses quantum effects to solve computational problems in a way that can be faster than traditional computers. How much faster? We just covered some results that indicated a certain class of problems may be sped up by as much as 10,000 times. Those algorithms are typically used in what's termed machine learning. And machine learning gets mentioned several times in Google's announcement of the new hardware.

Machine learning is typically used to let computers classify things: whether an e-mail is spam (to use Google's example), or whether an image contains a specific feature, like a cat. You simply feed a machine learning system enough known images with and without cats, and it will identify features shared among the cat set. When you then feed it unknown images, it can determine whether enough of those features are present and make an accurate guess as to whether the image contains a cat. In more serious applications, machine learning has been used to identify patterns of brain activity associated with different visual inputs, like viewing different letters.
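The classify-by-shared-features idea described above can be sketched with a tiny naive Bayes text classifier. This is a minimal illustration, not any system Google uses; the training examples are invented, and real systems use far richer features:

```python
from collections import Counter
import math

def train(examples):
    """examples: list of (text, label). Returns per-label word counts and doc totals."""
    counts = {}          # label -> Counter of word frequencies
    totals = Counter()   # label -> number of training documents
    for text, label in examples:
        counts.setdefault(label, Counter()).update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label with the highest log posterior under a naive Bayes model."""
    vocab = {w for c in counts.values() for w in c}
    best, best_score = None, float("-inf")
    for label, c in counts.items():
        # log prior plus summed log likelihoods, with add-one smoothing
        score = math.log(totals[label] / sum(totals.values()))
        denom = sum(c.values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((c[word] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

examples = [
    ("win free money now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting agenda for tuesday", "ham"),
    ("lunch tomorrow with the team", "ham"),
]
counts, totals = train(examples)
print(classify("claim your free money", counts, totals))  # spam
```

The "features shared among the cat set" in the article correspond here to words that appear more often under one label than the other; unseen words (like "your") are smoothed so they don't veto a classification.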



cylonlover writes "Maya Cakmak, a researcher from Georgia Tech, spent the summer at Willow Garage creating a user-friendly system that teaches the PR2 robot simple tasks. The kicker is that it doesn't require any traditional programming skills whatsoever – it works by physically guiding the robot's arms while giving it verbal commands. After inviting regular people to give it a try, she found that with few instructions they were able to teach the PR2 how to retrieve medicine from a cabinet and fold a t-shirt."



Read more of this story at Slashdot.



Music intelligence startup The Echo Nest, which you might know better as the company powering Spotify Radio and Vevo's recommendations, is announcing two new partnerships today that will give developers access to more data to build their apps with. The company is teaming up with JamBase, which provides concert listings and tour schedules, and SongMeanings, which you've surely come across while Googling to find out what the eff that guy is actually singing about.

The new services will become part of The Echo Nest’s Rosetta Stone project, essentially a data resolution service aiming to offer a common language to music applications. Currently, the platform includes lyrics from LyricFind and musiXmatch, music from Spotify and Rdio, tweets from artists on Twitter, concert tickets from Seatwave, and links to relevant Facebook pages.
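The "common language" idea behind Rosetta Stone is essentially cross-catalog ID resolution: each service's identifier for an artist maps back to one canonical record. Here is a minimal sketch of that pattern; the identifiers and artist below are invented for illustration and are not real Echo Nest data:

```python
# Hypothetical ID-resolution table: namespaced catalog IDs all point to one
# canonical artist record, so an app can hop between services.
CANONICAL = {
    "echonest:AR123": {
        "name": "Example Band",
        "ids": {
            "spotify": "spotify:artist:abc123",
            "jambase": "jambase:artist:98765",
            "songmeanings": "songmeanings:artist:55555",
        },
    },
}

# Reverse index: any namespaced ID -> canonical key.
REVERSE = {
    foreign_id: key
    for key, record in CANONICAL.items()
    for foreign_id in record["ids"].values()
}

def resolve(foreign_id, target_namespace):
    """Translate an ID from one catalog into another via the canonical record."""
    record = CANONICAL[REVERSE[foreign_id]]
    return record["ids"].get(target_namespace)

print(resolve("spotify:artist:abc123", "jambase"))  # jambase:artist:98765
```

With a table like this, a streaming app holding only a Spotify artist ID can look up that artist's concert listings or lyric discussions without knowing the other services' ID schemes.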

Specifically, JamBase will integrate its artist IDs into Echo Nest’s API, to make its concert listings available. At present, JamBase says it has access to the show listings for 80,000 artists across 50 genres who perform in 100,000 venues worldwide.

SongMeanings, a community-oriented website for discussing the meaning behind a song's lyrics, reports having 2 million lyrics with over 1.7 million comments available in its database.

While The Echo Nest isn't yet announcing which of the companies on its API will roll out support for the new additions, it would certainly make sense for music streaming apps to use them to become more intelligent and feature-rich in the future.

In addition to Spotify and Vevo, The Echo Nest has deals with Nokia, EMI, and Clear Channel. Companies building on top of its APIs also include MOG, the BBC, MTV, and Discovr, to name a few. The Echo Nest is backed by $8.31 million in funding from Matrix Partners and Commonwealth Capital.



GTAC 2011: How to Hit a Moving Target in the Cloud

6th Annual Google Test Automation Conference 2011 (GTAC 2011), "Cloudy With A Chance Of Tests"
Computer History Museum, Mountain View, CA, USA, October 26-27, 2011

Presented by Vishal Chowdhary, Microsoft

ABSTRACT: In this session, we share our experiences testing the Microsoft Translator (MT) service. Testing the service divides into two broad areas: (1) testing the machine learning system that produces the translations, and (2) testing the web service that serves translate requests on the Microsoft cloud. The MT service offers translation in 36 different languages for users across the globe, through both a UI and an API. Testing the MT cloud service without user data would be like shooting in the dark. From a product standpoint, we need to answer questions such as: How do we prioritize languages for improvement? How do we divide resources in our data center for effective capacity distribution across languages? From a test perspective, we need to understand how the system is being used so that we can ensure any newly deployed bits will hold up in production. We will discuss three problems: a strategy for testing the machine learning system; how to perform load testing for the translation service running in the cloud; and how to extend the role of test by completely skipping the 'opening bugs' step.

With Microsoft for the past seven years, Vishal Chowdhary is currently with the MSR Machine Translation (MT) team, where he is responsible for …
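The abstract's capacity question, dividing data-center resources across languages based on how the system is actually used, can be sketched as a proportional allocation from observed traffic. This is a rough illustration, not Microsoft's actual tooling, and the request counts are invented:

```python
# Invented per-language-pair request counts, as might be sampled from
# production logs; real traffic would be measured, not hard-coded.
requests = {"en-es": 52000, "en-fr": 21000, "en-de": 15000, "en-ja": 12000}

total = sum(requests.values())
shares = {pair: count / total for pair, count in requests.items()}

# Allocate a fixed pool of worker instances proportionally, with a floor of
# one instance per language pair so low-traffic pairs stay available.
POOL = 20
allocation = {pair: max(1, round(share * POOL)) for pair, share in shares.items()}

print(allocation)  # e.g. {'en-es': 10, 'en-fr': 4, 'en-de': 3, 'en-ja': 2}
```

The same traffic shares can drive a load test: replaying requests in these proportions exercises the service the way production would, which is the abstract's point about testing "without user data" being like shooting in the dark.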
From: GoogleTechTalks
Views: 117
Ratings: 1
Time: 44:00
More in: Science & Technology
