Original author: Ben Cherian

Image copyright isak55

In every emerging technology market, hype seems to wax and wane. One day a new technology is red hot; the next day it’s old hat. Sometimes the hype pans out, and concepts such as “e-commerce” become a normal way to shop. Other times the hype doesn’t meet expectations, and consumers don’t buy into paying for e-commerce using Beenz or Flooz. Apparently, Whoopi Goldberg and a slew of big-name VCs ended up making a bad bet on the e-currency market in the late 1990s. Whoopi was paid in cash and shares of Flooz; at least she wasn’t paid in Flooz alone! When investing, some bets are great and others are awful, but often one only knows the awful ones in retrospect.

What Does “Software Defined” Mean?

In the infrastructure space, there is a growing trend of companies calling themselves “software defined (x).” Often, it’s a vendor repositioning a decades-old product. On occasion, though, it’s smart, nimble startups and wise incumbents seeing a new way of delivering infrastructure. Either way, the term “software defined” is here to stay, and there is real meaning and value behind it if you look past the hype.

There are three software defined terms that seem to be bandied around quite often: software defined networking, software defined storage, and the software defined data center. I suspect new terms will soon follow, like software defined security and software defined management. What all these “software defined” concepts really boil down to is virtualization of the underlying component, plus accessibility through a documented API to provision, operate and manage it.
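
To make that definition concrete, here is a minimal sketch of what provisioning through such an API looks like, using Python and the requests library. The endpoint, resource names, fields and token are invented for illustration; real clouds (AWS, OpenStack and others) follow the same pattern with their own schemas.

    import requests

    # Hypothetical "software defined" REST API -- not any particular vendor's.
    API = "https://sddc.example.com/v1"
    HEADERS = {"Authorization": "Bearer <token>"}

    # Provision a network, a storage volume and a server that uses both:
    # three low-level components, one uniform programming model.
    net = requests.post(f"{API}/networks", headers=HEADERS,
                        json={"cidr": "10.0.0.0/24"}).json()
    vol = requests.post(f"{API}/volumes", headers=HEADERS,
                        json={"size_gb": 500}).json()
    vm = requests.post(f"{API}/servers", headers=HEADERS,
                       json={"image": "ubuntu-12.04",
                             "flavor": "m1.medium",
                             "network_id": net["id"],
                             "volume_id": vol["id"]}).json()

    print("provisioned server", vm["id"])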

This trend started once Amazon Web Services came onto the scene and convinced the world that the data center could be abstracted into much smaller units that could be treated as disposable pieces of technology and priced as a utility. Vendors watched Amazon closely and saw how this could apply to the data center of the future.

Since compute was already virtualized by VMware and Xen, projects such as Eucalyptus were launched with the intention to be a “cloud controller” that would manage the virtualized servers and provision virtual machines (VMs). Virtualized storage (a.k.a. software defined storage) was a core part of the offering and projects like OpenStack Swift and Ceph showed the world that storage could be virtualized and accessed programmatically. Today, software defined networking is the new hotness and companies like Midokura, VMware/Nicira, Big Switch and Plexxi are changing the way networks are designed and automated.

The Software Defined Data Center

The software defined data center encompasses all the concepts of software defined networking, software defined storage, cloud computing, automation, management and security. Every low-level infrastructure component in a data center can be provisioned, operated and managed through an API. Not only are there tenant-facing APIs, but also operator-facing APIs that help the operator automate tasks that were previously manual.

An infrastructure superhero might think, “With great accessibility comes great power.” The data center of the future will be the software defined data center where every component can be accessed and manipulated through an API. The proliferation of APIs will change the way people work. Programmers who have never formatted a hard drive will now be able to provision terabytes of data. A web application developer will be able to set up complex load balancing rules without ever logging into a router. IT organizations will start automating the most mundane tasks. Eventually, beautiful applications will be created that mimic the organization’s process and workflow and will automate infrastructure management.

IT Organizations Will Respond and Adapt Accordingly

Of course, this means the IT organization will have to adapt. The new base level of knowledge in IT will eventually include some sort of programming knowledge. Scripting languages like Ruby and Python will soar even higher in popularity. The network administrators will become programmers. The system administrators will become programmers. During this time, DevOps (development + operations) will make serious inroads in the enterprise, and silos will be refactored, restructured or flat-out broken down.

Configuration management tools like Chef and Puppet will be the glue for the software defined data center. If done properly, the cost of delivering IT services will fall. “Ghosts in the system” will watch all the components (compute, storage, networking, security, etc.) and adapt to changes in real time to increase utilization, performance, security and quality of service. Monitoring and analytics will be key to realizing this software defined future.
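
What might one of these “ghosts in the system” look like? Below is a minimal sketch of the underlying pattern, a reconciliation loop that compares observed state against a policy-derived desired state and converges the infrastructure toward it. All three functions are hypothetical stand-ins for real monitoring and provisioning APIs.

    import time

    def observed_state():
        """Poll monitoring for the live state of the components (stubbed)."""
        return {"web_servers": 3, "cpu_utilization": 0.92}

    def desired_state(metrics):
        """Policy: scale out when utilization is high, in when it is low."""
        n = metrics["web_servers"]
        if metrics["cpu_utilization"] > 0.80:
            return {"web_servers": n + 1}
        if metrics["cpu_utilization"] < 0.20 and n > 1:
            return {"web_servers": n - 1}
        return {"web_servers": n}

    def converge(target):
        """Call the infrastructure APIs until reality matches the target (stubbed)."""
        print("converging to", target)

    for _ in range(3):      # in production this loop would run forever
        converge(desired_state(observed_state()))
        time.sleep(60)      # re-evaluate every minute, adapting in near real time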

Big Changes in Markets Happen With Very Simple Beginnings

All this amazing innovation comes from two very simple concepts: virtualizing the underlying components and making them accessible through an API.

The IT world might look at the software defined data center and say, “This is nothing new; we’ve been doing this since the ’80s.” I disagree. What’s changed is our universal thinking about accessibility. Ten years ago, we wouldn’t have blinked if a networking product came out without an API. Today, an API is part of what we consider a 1.0 release. This thinking now pervades every component of the data center. It’s the Web 2.0 thinking that shaped cloud computing, and now cloud computing is bleeding into enterprise thinking. We’re no longer constrained by the need for deep, specialized knowledge of the low-level components just to get basic access to this technology.

With well documented APIs, we have now turned the entire data center into many instruments that can be played by the IT staff (musicians). I imagine the software defined data center to be a Fantasia-like world where Mickey is the IT staff and the brooms are networking, storage, compute and security. The magic is in the coordination, cadence and rhythm of how all the pieces work together. Amazing symphonies of IT will occur in the near future and this is the reason the software defined data center is not a trend to overlook. Maybe Whoopi should take a look at this market instead.

Ben Cherian is a serial entrepreneur who loves playing in the intersection of business and technology. He’s currently the Chief Strategy Officer at Midokura, a network virtualization company. Prior to Midokura, he was the GM of Emerging Technologies at DreamHost, where he ran the cloud business unit. Prior to that, Ben ran a cloud-focused managed services company.

Original author: Caleb Barlow

Mobile phone image copyright Oleksiy Mark

When it comes to mobile computing, many organizations either cringe at the fear of security risks or rejoice in the business potential. On one hand, mobile is revolutionizing business operations — improving operational efficiency, enhancing productivity, empowering employees and delivering an engaging user experience. On the other hand, sensitive data that used to be housed in the controlled environment of a company desktop, or even a laptop, is now sitting in an employee’s back pocket or purse.

In today’s ultra-connected world, it can seem like threats are all around us. High-profile breaches and attacks from hacker groups have organizations of all sizes — from multinational enterprises to mom-and-pop shops — doubling down on security and making sure there aren’t any cracks in their defenses. Mobile security doesn’t have to be the Achilles’ heel that leads to a breach. New, innovative solutions for securing mobile devices at the application level are rapidly hitting the market, and the latest IBM X-Force report indicates that by 2014, mobile computing will be more secure than traditional desktops. Phones, tablets and other devices are a staple of the 21st-century workplace, and in order to fully embrace this technology, businesses must be certain they’re well protected and secure.

Do You Know Where Your Data Is?

Tackling mobile security can seem like a daunting task. The IBM X-Force report also indicates a 19 percent increase in the number of publicly released exploits that can be used to target mobile devices. Making the task more challenging is the fact that — especially in the case of BYOD — the line between professional and personal data is more blurred on mobile platforms than ever before. According to Gartner, by 2014, 90 percent of organizations will support corporate applications on personal devices. This means that devices being used to connect with enterprise networks or create sensitive company data are also being used for social networking and to download mobile apps, leaving organizations with the predicament of how to manage, secure and patrol those devices. From the point of view of a hacker, a mobile device is an ideal target: it has access to enterprise data as well as personal data that can be used to mount future attacks against the owner’s friends and colleagues.

Mobile apps are a great example of why mobile security tends to raise concerns among security professionals and business leaders. Employees install personal apps onto the same devices they use to access their enterprise data, but are not always careful or discriminating about the security of those apps — whether they are the real version or a manipulated version that will attempt to steal corporate data. According to a recent report by Arxan Technologies, more than 90 percent of the top 100 mobile apps have been hacked in some capacity. Some free mobile apps even demand access to an employee’s contact list in order to function correctly. Just pause and think about that for a second. Would you give your entire contact list to a complete stranger? That’s effectively what you are doing when you install many of these popular applications. If an organization takes a step back and really considers what employees are agreeing to, willingly or not, the results can be troubling. So the challenge remains: how to get employees to recognize and understand just how much risk their mobile device can pose to an enterprise.

Mitigating Mobile Risks: Why It’s Easier Than You Think

Mobile app security and device management do not have to be a company’s security downfall. By employing intelligent security solutions that adapt to the requirements of a specific context, businesses can mitigate operational risk and unleash the full potential of mobility.

The key to mitigating security risks when it comes to mobile devices accessing enterprise data is access control. This may include passcode locks, data protection, and malware and virus prevention. With that said, IT security priorities should focus on practices, policies and procedures, such as:

  • Risk analysis: Organizations must understand what enterprise data is on employee devices, how it could be compromised and the potential impact of the compromise (i.e., what does it cost? What happens if the device is lost? Is the data incidental or crucial to the business?).
  • Securing the application: In the pre-mobile, personal computer era, simply securing the device and the user was sufficient. When it comes to mobile devices, we also need to think about securing the application itself. Since a typical application is downloaded from a store, the end user really has no idea who built it, what it actually does with their data or how secure it is. Corporate applications with sensitive data need to be secure in their own right.
  • Secure mobile access — authentication: Since mobile devices are shared, it’s important to authenticate both the user and the device before granting access, and to look at the context of the user requesting access based on factors like time, network, location, device characteristics and role. If the context appears to be out of line with normal behavior, appropriate countermeasures can be taken (a minimal sketch of such a context check follows this list).
  • Encryption: Simply put, if the data is sensitive, it needs to be encrypted both at rest and in motion on the network.
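
As a rough illustration of the authentication bullet above, here is a toy context check in Python. The signals, weights and thresholds are invented for the example; real products score many more factors.

    KNOWN_DEVICES = {"device-7f3a"}
    WORK_HOURS = range(7, 20)            # 07:00-19:59 local time
    TRUSTED_NETWORKS = {"corp-wifi", "corp-vpn"}

    def risk_score(request):
        score = 0
        if request["device_id"] not in KNOWN_DEVICES:
            score += 2                   # unrecognized device
        if request["hour"] not in WORK_HOURS:
            score += 1                   # unusual time of day
        if request["network"] not in TRUSTED_NETWORKS:
            score += 1                   # untrusted network
        return score

    def access_decision(request):
        score = risk_score(request)
        if score == 0:
            return "allow"
        if score <= 2:
            return "step-up-auth"        # e.g. prompt for a one-time code
        return "deny"

    print(access_decision({"device_id": "device-7f3a",
                           "hour": 23, "network": "cafe-wifi"}))
    # -> "step-up-auth": known device, but odd hour on an untrusted network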

Once an enterprise has defined its security policy — establishing set policies/procedures regarding content that is allowed to be accessed on devices, how it’s accessed and how the organization will handle lost/stolen devices that may contain business data — mobile technology solutions can help ensure that no opening is left unguarded.

So if security concerns are holding you back from “going mobile,” rest assured — there are many companies that have embraced trends like “Bring Your Own Device” without sending their Chief Security Officers into a panic. As long as organizations take the right steps and continually revisit their security posture to ensure that every endpoint is secured and that the proper technology is in place, it really is possible to be confident about your mobile security strategy.

Caleb Barlow is part of the executive team in IBM’s Security division. He manages three portfolios — Application Security, Data Security and Mobile Security. In addition to his day job, Caleb also hosts a popular Internet Radio show focused on IT Security with an audience averaging over 20k listeners per show.

Original author: Amir Efrati

In one of the largest exoduses from Stanford University’s computer-science programs, more than a dozen students have left to launch a startup called Clinkle Corp. that aims to let other students — and eventually anyone — use their mobile devices to pay for goods and services.

Several professors also are funding and advising the company, in what may be the epitome of a Stanford-fueled startup.

Read the rest of this post on the original site »

Original author: Ben Rooney

It was hard to avoid the message at the recent Mobile World Congress in Barcelona. The GSMA, the organizing body, was keen for everyone to believe that Near Field Communication might finally be about to have its day.

NFC has been a decade in the making, and has always been about to be “The Next Big Thing.” It is a contactless radio technology that can transmit data between two devices within a few centimeters of each other. Coupled with a security chip to encrypt data, it promises to transform a wide range of consumer experiences from simple ticketing to the Holy Grail of replacing your cash and payment cards with just your smartphone. The key word there is “promise.”

Read the rest of this post on the original site »


If I were to describe a country where the Internet contributes as large a share of GDP as the country’s health services, education and oil industries do, and where Internet use is growing at nearly twice the rate seen in Europe — driven in large part by growth in private and corporate-backed entrepreneurship — where would you guess?

Looking forward, if such a country has the largest population of Internet and mobile users in its region and one of the largest youth populations in the world; is a large consumer market in the early days of e-commerce; and is a global tourist destination where only about five percent of all travel revenue is booked online — might this be an intriguing investment opportunity?

Am I describing Germany? China? Brazil?

Try Egypt.

Two years after the Arab uprisings, and in the midst of wrestling with significant economic and political change, Egypt is seeing the Internet quietly and steadily grow into a central platform of economic development, as it is around the globe. And according to a new Google-commissioned study by The Boston Consulting Group — Egypt at a Crossroads: How the Internet is Transforming Egypt’s Economy — policymakers, executives and investors alike are poised at a central moment of opportunity to embrace this platform for economic growth, job creation and returns.

David Dean, Senior Partner and Managing Director at The Boston Consulting Group — and one of the authors of the study — told me that this is the latest of fifteen country-wide studies his company has done, and that he was impressed by what he found. “I think the biggest positive surprise was that there are many entrepreneurial companies using the Internet to grow their businesses.” The report highlights a handful of the hundreds of recent Egyptian startups, as diverse as the content portal Masrawy, which now reaches over eight million unique users per month; e-commerce destination Nefsak, which offers over 25,000 products; and Alexandria’s Vimov, whose paid weather app WeatherHD was the fourth-best seller in Apple’s App Store after its recent release. It notes that Vodafone, among other global investors, is making serious commitments both to the infrastructure and to funding startups in the region. “The report makes clear that there is much untapped potential for Egypt’s nascent Internet ecosystem,” Samir El Bahaie, Google’s Head of Policy in the Middle East and North Africa, said — adding that “there is also a great opportunity for investment, economic growth and job creation waiting to be seized.”

The study underscores that the opportunity is now. Egypt’s population of 31 million Internet users is the largest in the Middle East, and while mobile penetration exceeds 100 percent in many parts of the country, the big news is that smartphones — with real computing capabilities — are expected by some to reach 50 percent penetration in the next three to five years. Unmeasured in penetration and GDP figures are what the report calls “ripple effects” on the Egyptian economy and society: The ability to reach new markets, to have better informed consumers, to have greater work efficiencies in the knowledge economy, to have simplified access to government and social services for people to take more control of their lives. Egypt, with its mobile penetration, is especially poised to capture opportunities in mobile banking (as significant success has been seen in Africa) and to fully embrace all the opportunities offered for tourism. Dean notes, in fact, that travel and tourism is “possibly the largest short-term lever that the Internet can have in the country.”

If the opportunity is now, however, so is the potential for missed opportunities. While access to the Internet is growing, there is still a lack of Internet skills in the workforce, even compared to other emerging markets. While business adoption of the Internet as an economic platform in Egypt is competitive among larger enterprises, small- and medium-sized businesses still rank lowest among emerging growth markets. More fundamentally, significant questions remain about the most appropriate, entrepreneurship-driving policies — in areas such as rule of law, copyright protection and lessening the bureaucracy involved in starting a business. “Of course, these are clearly not just questions for Egypt,” Dean explained to me. “What would really be encouraging would be a commitment by the Government to the Internet as an economic factor — which would mean simplifying the process for opening businesses, encouraging investment, demonstrating the benefits of the Internet in the way the government operates, and using the Internet to address some of Egypt’s most pressing problems, such as youth unemployment.”

Google hopes to play a continued role in working with governments like Egypt’s. Studies like these are extremely useful as they provide factual economic data points around the value of the Internet, El Bahaie noted. “We hope to work with the government of Egypt to leverage these data points to unlock the potential of eCommerce and mCommerce and well-informedly create a more enabling business environment for Egyptian small- and medium-sized business, and to help the country reach its full economic potential.”

Christopher M. Schroeder is a leading U.S. Internet entrepreneur and venture investor, a member of the advisory boards of the American University of Cairo School of Business, the regional entrepreneurship portal Wamda.com and incubator Oasis500. He is the author of “Startup Rising: The Entrepreneurial Revolution That’s Remaking the Middle East,” to be published September 2013 by Palgrave/MacMillan. He can be followed on Twitter @cmschroed.


Image copyright kentoh

In a series of articles last year, executives from the ad-data firms BlueKai, eXelate and Rocket Fuel debated whether the future of online advertising lies with “More Data” or “Better Algorithms.” Omar Tawakol of BlueKai argued that more data wins because you can drive more effective marketing by layering additional data onto an audience. While we agree with this, we can’t help feeling like we’re being presented with a false choice.

Maybe we should think about a solution that involves smaller amounts of higher quality data instead of more data or better algorithms.

First, it’s important to understand what data is feeding the marketing ecosystem and how it’s getting there. Most third-party profiles consist of data points inferred from the content you consume, forms you fill out and stuff you engage with online. Some companies match data from offline databases with your online identity, and others link your activity across devices. Lots of energy is spent putting trackers on every single touchpoint. And yet the result isn’t very accurate — we like to make jokes around the office about whether one of our colleagues’ profiles says they’re a man or a woman that day. Truth be told, on most days BlueKai thinks they are both.

One way to increase the quality of data would be to change where we get it from.

Instead of scraping as many touchpoints as possible, we could go straight to the source: The individual. Imagine the power of data from across an individual’s entire digital experience — from search to social to purchase, across devices. This kind of data will make all aspects of online advertising more efficient: True attribution, retargeting-type performance for audience targeting, purchase data, customized experiences.

So maybe the solution to “More Data” vs. “Better Algorithms” isn’t incremental improvements to either, but rather to invite consumers to the conversation and capture a fundamentally better data set. Getting this new type of data to the market won’t be easy. Four main hurdles need to be cleared for the market to reach scale.

Control and Comfort

When consumers say they want “privacy,” they don’t normally desire the insular nature of total anonymity. Rather, they want control over what is shared and with whom. Any solution will need to give consumers complete, transparent control over their profiles. Comfort is gained when consumers become aware of the information that advertisers are interested in — in most cases, the data is extremely innocuous. A recent PwC survey found that 80 percent of people are willing to share “information if a company asks up front and clearly states use.”

Remuneration

Control and Comfort are both necessary, but people really want to share in the value created by their data. Smart businesses will offer things like access to content, free shipping, coupons, interest rate discounts or even loyalty points to incentivize consumers to transact using data. It’s not much of a stretch to think that consumers who feel fairly compensated will upload even more data into the marketing cloud.

Trust and Transparency

True transparency around what data is gathered and what happens to it engenders trust. Individuals should have the final say about which of their data is sold. Businesses will need to adopt best practices and tools that allow the individual to see and understand what is happening with their data. A simple dashboard with delete functionality should do, for a start.
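
To show how little machinery that dashboard needs, here is a minimal sketch in Python, with invented data-point names. The profile is an in-memory dict; a real service would persist it and propagate deletions to downstream buyers.

    profile = {
        "interests.cycling": True,
        "demographics.age_range": "25-34",
        "purchase_intent.autos": "high",
    }

    def show_profile():
        """Transparency: the individual sees every data point held about them."""
        for key, value in sorted(profile.items()):
            print(f"{key} = {value}")

    def delete_data_point(key):
        """Control: the individual has the final say over what is kept."""
        profile.pop(key, None)

    show_profile()
    delete_data_point("purchase_intent.autos")
    show_profile()                       # the deleted point is gone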

Ease of Use

This will all be moot if we make it hard for consumers to participate. Whatever system we ask them to adopt needs to be dead simple to use, and offer enough benefits for them to take the time and effort to switch. Here we can apply one of my favorite principles from Ruby on Rails — convention over configuration. There is so much value in data collected directly from individuals that we can build a system whose convention is to protect even the least sensitive of data points and still respect privacy, without requiring the complexity needed for configuration.
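
For readers unfamiliar with the Rails principle, here is a small Python sketch of convention over configuration: a model’s storage table is derived from its class name by convention, and explicit configuration is needed only to override the default. The classes are invented for illustration.

    class Model:
        table = None                     # optional configuration

        @classmethod
        def table_name(cls):
            if cls.table:                # explicit configuration wins
                return cls.table
            # Convention: "UserProfile" -> "user_profiles"
            name = "".join("_" + c.lower() if c.isupper() else c
                           for c in cls.__name__).lstrip("_")
            return name + "s"

    class UserProfile(Model):
        pass                             # no configuration required

    class Consent(Model):
        table = "consent_records"        # the rare override

    print(UserProfile.table_name())      # -> user_profiles
    print(Consent.table_name())          # -> consent_records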

The companies that engage individuals around how their data is used and collected will have an unfair advantage over those that don’t. Their advertising will be more relevant, and they’ll be able to customize experiences and measure impact to a level of precision impossible via third-party data. To top it off, by being open and honest with their consumers about data, they’ll have strengthened that intangible quality that every brand strives for: Authenticity.

In the bigger picture, the advertising industry faces an exciting opportunity. By treating people and their data with respect and involving them in the conversation around how their data is used, we help other industries gain access to data by helping individuals feel good about transacting with it. From healthcare to education to transportation, society stands to gain if people see data as an opportunity and not a threat.

Marc is the co-founder and CEO of Enliken, a startup focused on helping businesses and consumers transact with data. Currently, it offers tools for publishers and readers to exchange data for access to content. Prior to Enliken, Marc was the founding CEO of Spongecell, an interactive advertising platform that produced one of the first ad units to run on biddable media.


Faced with the need to generate ever-greater insight and end-user value, some of the world’s most innovative companies — Google, Facebook, Twitter, Adobe and American Express among them — have turned to graph technologies to tackle the complexity at the heart of their data.

To understand how graphs address data complexity, we need first to understand the nature of the complexity itself. In practical terms, data gets more complex as it gets bigger, more semi-structured, and more densely connected.

We all know about big data. The volume of net new data being created each year is growing exponentially — a trend that is set to continue for the foreseeable future. But increased volume isn’t the only force we have to contend with today: On top of this staggering growth in the volume of data, we are also seeing an increase in both the amount of semi-structure and the degree of connectedness present in that data.

Semi-Structure

Semi-structured data is messy data: data that doesn’t fit into a uniform, one-size-fits-all, rigid relational schema. It is characterized by the presence of sparse tables and lots of null checking logic — all of it necessary to produce a solution that is fast enough and flexible enough to deal with the vagaries of real world data.

Increased semi-structure, then, is another force with which we have to contend, besides increased data volume. As data volumes grow, we trade insight for uniformity; the more data we gather about a group of entities, the more that data is likely to be semi-structured.

Connectedness

But insight and end-user value do not simply result from ramping up volume and variation in our data. Many of the more important questions we want to ask of our data require us to understand how things are connected. Insight depends on us understanding the relationships between entities — and often, the quality of those relationships.

Here are some examples, taken from different domains, of the kinds of important questions we ask of our data:

  • Which friends and colleagues do we have in common?
  • What’s the quickest route between two stations on the metro?
  • What do you recommend I buy based on my previous purchases?
  • Which products, services and subscriptions do I have permission to access and modify? Conversely, given this particular subscription, who can modify or cancel it?
  • What’s the most efficient means of delivering a parcel from A to B?
  • Who has been fraudulently claiming benefits?
  • Who owns all the debt? Who is most at risk of poisoning the financial markets?

To answer each of these questions, we need to understand how the entities in our domain are connected. In other words, these are graph problems.

Why are these graph problems? Because graphs are the best abstraction we have for modeling and querying connectedness. Moreover, the malleability of the graph structure makes it ideal for creating high-fidelity representations of a semi-structured domain. Traditionally relegated to the more obscure applications of computer science, graph data models are today proving to be a powerful way of modeling and interrogating a wide range of common use cases. Put simply, graphs are everywhere.
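
To make one of those questions concrete, here is the metro example as a graph problem in Python: stations are nodes, adjacent stops are edges, and a breadth-first search returns the fewest-stops route. The network is toy data for illustration.

    from collections import deque

    metro = {
        "Central": ["North", "Museum"],
        "North":   ["Central", "Airport"],
        "Museum":  ["Central", "Airport", "Harbor"],
        "Airport": ["North", "Museum"],
        "Harbor":  ["Museum"],
    }

    def quickest_route(start, goal):
        """Breadth-first search: the first path to reach the goal is shortest."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in metro[path[-1]]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])

    print(quickest_route("North", "Harbor"))
    # -> ['North', 'Central', 'Museum', 'Harbor']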

Graph Databases

Today, if you’ve got a graph data problem, you can tackle it using a graph database — an online transactional system that allows you to store, manage and query your data in the form of a graph. A graph database enables you to represent any kind of data in a highly accessible, elegant way using nodes and relationships, both of which may host properties:

  • Nodes are containers for properties, which are key-value pairs that capture an entity’s attributes. In a graph model of a domain, nodes tend to be used to represent the things in the domain. The connections between these things are expressed using relationships.
  • A relationship has a name and a direction, which together lend semantic clarity and context to the nodes connected by the relationship. Like nodes, relationships can also contain properties: Attaching one or more properties to a relationship allows us to weight that relationship, or describe its quality, or otherwise qualify its applicability for a particular query.

The key thing about such a model is that it makes relationships first-class citizens of the data, rather than treating them as metadata. As real data points, they can be queried and understood in their variety, weight and quality: Important capabilities in a world of increasing connectedness.
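
Here is that model reduced to plain Python, with invented data. Nodes and relationships both carry properties, and because relationships are data rather than metadata, they can be filtered and weighted directly; a real graph database adds indexing, transactions and a query language on top of this structure.

    nodes = {
        1: {"name": "Alice"},
        2: {"name": "Bob"},
        3: {"title": "Graph Databases"},             # a book
    }

    relationships = [
        # (from, TYPE, to, properties): named, directed, property-bearing
        (1, "KNOWS",     2, {"since": 2010}),
        (1, "PURCHASED", 3, {"rating": 5}),
        (2, "PURCHASED", 3, {"rating": 3}),
    ]

    # Relationships are first-class, so we can query them by type and
    # qualify them by property: who strongly recommends this book?
    fans = [nodes[src]["name"]
            for src, rel_type, dst, props in relationships
            if rel_type == "PURCHASED" and dst == 3 and props["rating"] >= 4]
    print(fans)                                      # -> ['Alice']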

Graph Databases in Practice

Today, the most innovative organizations are leveraging graph databases as a way to solve the challenges around their connected data. These include major names such as Google, Facebook, Twitter, Adobe and American Express. Graph databases are also being used by organizations in a range of fields, including finance, education, the Web, ISVs, and telecom and data communications.

The following examples offer use case scenarios of graph databases in practice.

  • Adobe Systems currently leverages a graph database to provide social capabilities to its Creative Cloud — a new array of services for media enthusiasts and professionals. A graph offers clear advantages in capturing Adobe’s rich data model fully, while still allowing for high-performance queries that range from simple reads to advanced analytics. It also enables Adobe to store large amounts of connected data across three continents, all while maintaining high query performance.
  • Europe’s No. 1 professional network, Viadeo, has integrated a graph database to store all of its users and relationships. Viadeo currently has 40 million professionals in its network and requires a solution that is easy to use and capable of handling major expansion. Upon integrating a graph model, Viadeo has accelerated its system performance by more than 200 percent.
  • Telenor Group is one of the top ten wireless Telco companies in the world, and uses a graph database to manage its customer organizational structures. The ability to model and query complex data such as customer and account structures with high performance has proven to be critical to Telenor’s ongoing success.

An access control graph. Telenor uses a similar data model to manage products and subscriptions.

  • Deutsche Telekom leverages a graph database for its highly scalable social soccer-fan website, which attracts tens of thousands of visitors during each match; the database provides painless data modeling, seamless data-model extensibility, and high performance and reliability.
  • Squidoo is the popular social publishing platform where users share their passions. It recently created a product called Postcards: single-page, beautifully designed recommendations of books, movies, music albums, quotes and other products and media types. A graph database serves as the primary data store for the Postcards taxonomy and powers the recommendation engine for what people should do next, ensuring users have a smooth experience.

Such examples illustrate the pervasiveness of connections within data and the power of a graph model to map relationships optimally. A graph database allows you to further query and analyze such connections to provide greater insight and end-user value. In short, graphs are poised to deliver true competitive advantage by offering deeper perspective into data, as well as a new framework to power today’s revolutionary applications.

A New Way of Thinking

Graphs are a new way of thinking for explicitly modeling the factors that make today’s big data so complex: Semi-structure and connectedness. As more and more organizations recognize the value of modeling data with a graph, they are turning to the use of graph databases to extend this powerful modeling capability to the storage and querying of complex, densely connected structures. The result is the opening up of new opportunities for generating critical insight and end-user value, which can make all the difference in keeping up with today’s competitive business environment.

Emil is the founder of the Neo4j open source graph database project, which is the most widely deployed graph database in the world. As a life-long compulsive programmer who started his first free software project in 1994, Emil has with horror witnessed his recent degradation into a VC-backed PowerPoint engineer. As the CEO of Neo4j’s commercial sponsor Neo Technology, Emil is now mainly focused on spreading the word about the power of graphs and preaching the demise of tabular solutions everywhere. Emil presents regularly at conferences such as JAOO, JavaOne, QCon and OSCON.


If 2011 was the year social media arrived as a force in Chinese culture and politics, then 2012 was the year social media supercharged one of contemporary China’s finest forms of cultural and political expression: the Internet meme.

To be sure, the Chinese Internet has been a fertile producer of memes for quite some time. One of 2011’s great Internet moments — the Ministry of Railways spokesman’s haughty and ultimately career-ending effort to explain the burial of passenger cars after a deadly high-speed train crash in Wenzhou — is still going strong a year and a half later. And of course there’s 2009’s “grass mud horse,” which appears destined for immortality (and even a modicum of global cross-over) after being adopted by dissident artist Ai Weiwei.

Read the rest of this post on the original site »


The advent of the Salesforce Marketing Cloud and the Adobe Marketing Cloud demonstrates the need for enterprises to develop new ways of harnessing the vast potential of big data. Yet these marketing clouds raise the question of who will help marketers, the front line of businesses, maximize marketing spending and ROI and help their brands win in the end. Simply moving software from on-site to hosted servers does not change the capabilities marketers require — real competitive advantage stems from intelligent use of big data.

Marc Benioff, who is famous for declaring that “Software Is Dead,” may face a similar fate with his recent bets on Buddy Media and Radian6. These applications provide data to people who must then analyze, prioritize and act — often at a pace much slower than the digital world. Data, content and platform insights are too massive for mere mortals to handle without costing a fortune. Solutions that leverage big data are poised to win — freeing up people to do the strategy and content creation that is best done by humans, not machines.

Big data is too big for humans to work with, at least in the all-important analytical construct of responding to opportunities in real time — formulating efficient and timely responses to opportunities generated from your marketing cloud, or pursuing the never-ending quest for perfecting search engine optimization (SEO) and search engine marketing (SEM). The volume, velocity and veracity of raw, unstructured data are overwhelming. Big data pioneers such as Facebook and eBay have moved to massive Hadoop clusters to process their petabytes of information.

In recent years, we’ve gone from analyzing megabytes of data to working with gigabytes, and then terabytes, and then petabytes and exabytes, and beyond. Two years ago, James Rogers wrote in The Street: “It’s estimated that 1 Petabyte is equal to 20 million four-door filing cabinets full of text.” We’ve become jaded to seeing such figures. But 20 million filing cabinets? If those filing cabinets were a standard 15 inches wide, you could line them up, side by side, all the way from Seattle to New York — and back again. One would need a lot of coffee to peruse so much information, one cabinet at a time. And a lot of marketing staff.
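
The cabinet image holds up to a quick back-of-the-envelope check (distances are approximate):

    # 1 PB as filing cabinets, per the quoted estimate
    cabinets = 20_000_000
    width_inches = 15                    # standard cabinet width

    miles = cabinets * width_inches / 12 / 5280
    print(f"{miles:,.0f} miles")         # ~4,735 miles

    # Seattle to New York is roughly 2,400 miles in a straight line, so the
    # row of cabinets stretches there and very nearly all the way back.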

Of course, we have computers that do the perusing for us, but as big data gets bigger, and as analysts, marketers and others seek to do more with the massive intelligence that can be pulled from big data, we risk running into a human bottleneck. Just how much can one person — or a cubicle farm of persons — accomplish in a timely manner from the dashboard of their marketing cloud? While marketing clouds do a fine job of gathering data, it still comes down to expecting analysts and marketers to interpret and act on it — often with data that has gone out of date by the time they work with it.

Hence the rise of big data solutions that leverage machine learning, language models and prediction: self-learning systems that move from using algorithms to harvest information from big data to using algorithms to initiate actions based on that data.

Yes, this may sound a bit frightful: Removing the human from the loop. Marketers indeed need to automate some decision-making. But the human touch will still be there, doing what only people can do — creating great content that evokes emotions from consumers — and then monitoring and fine-tuning the overall performance of a system designed to take actions on the basis of big data.

This isn’t a radical idea. Programmed trading algorithms already drive significant activity across stock markets. And, of course, Amazon, eBay and Facebook have become generators of — and consummate users of — big data. Others are jumping on the bandwagon as well. Rocket Fuel uses big data about consumers, sites, ads and prior ad performance to optimize display advertising. Turn.com uses big data from consumer Web behavior, on-site behaviors and publisher content to create, optimize and buy advertising across the Web for display advertisers.

The big data revolution is just beginning as it moves beyond analytics. If we were building CRM again, we wouldn’t just track sales-force productivity; we’d recommend how you’re doing versus your competitors based on data across the industry. If we were building marketing automation software, we wouldn’t just capture and nurture leads generated by our clients, we’d find and attract more leads for them from across the Web. If we were building a financial application, it wouldn’t just track the financials of your company, it would compare them to public filings in your category so you could benchmark yourself and act on best practices.

Benioff is correct that there’s an undeniable trend: most marketing budgets today are betting more on social and mobile. The ability to manage social, mobile and Web analysis for better marketing has quickly become a real focus — and a big data marketing cloud is needed to do it. However, the real value and ROI come from the ability to turn big data analysis into action, automatically. There’s clearly big value in big data, but it’s not cost-effective for any company to interpret and act on it manually before the trend changes or is over. Some reports find that 70 percent of marketers are concerned with making sense of the data, and more than 91 percent are concerned with extracting marketing ROI from it. Incorporating big data technologies that create action means that your organization’s marketing can get smarter even while you sleep.

Raj De Datta founded BloomReach with 10 years of enterprise and entrepreneurial experience behind him. Most recently, he was an Entrepreneur-In-Residence at Mohr-Davidow Ventures. Previously, he was a Director of Product Marketing at Cisco. Raj also worked in technology investment banking at Lazard Freres. He holds a BSE in Electrical Engineering from Princeton and an MBA from Harvard Business School.


Image via vichie81

Recently, Omar Tawakol from BlueKai wrote a fascinating article positing that more data beats better algorithms. He argued that more data trumps a better algorithm, but that better still is an algorithm that augments your data with linkages and connections, in the end creating a more robust data asset.

At Rocket Fuel, we’re big believers in the power of algorithms. This is because data, no matter how rich or augmented, is still a mostly static representation of customer interest and intent. To use data in the traditional way for Web advertising, choosing whom to show ads to on the basis of the specific data segments they may be in, is one very simple choice of algorithm. But there are many others that can be strategically applied to take advantage of specific opportunities in the market, like a sudden burst of relevant ad inventory or a sudden increase in competition for consumers in a particular data segment. The algorithms can react to the changing usefulness of data, such as data that indicates interest in a specific time-sensitive event that is now past. They can also take advantage of ephemeral data not tied to individual behavior in any long-term way, such as the time of day or the context in which the person is browsing.

So while the world of data is rich, and algorithms can extend those data assets even further, the use of that data can be even more interesting and challenging, requiring extremely clever algorithms that result in significant, measurable improvements in campaign performance. Very few of these performance improvements are attributable solely to the use of more data.

For the sake of illustration, imagine you want to marry someone who will help you produce tall, healthy children. You are sequentially presented with suitors, whom you must either marry or reject forever. Let’s say you start out only being able to look at each suitor’s height, and your simple algorithm is to “marry the first person who is over six feet tall.” How can we improve on these results? Using the “more data” strategy, we could also look at how strong they are, and set a threshold for that. Alternatively, we could use the same data but improve the algorithm: “Measure the height of the first third of the people I see, and marry the next person who is taller than all of them.” This algorithm improvement has a good chance of delivering a better result than just using more data with a simple algorithm.
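
This thought experiment, a variant of the classic secretary problem, is easy to simulate. The sketch below draws heights from an assumed normal distribution (mean 67 inches, standard deviation 4) and prints the average height each rule ends up selecting, so the two strategies can be compared under those assumptions.

    import random

    def suitors(n=100):
        return [random.gauss(67, 4) for _ in range(n)]   # heights in inches

    def threshold_rule(heights, cutoff=72):
        """Marry the first suitor over six feet; settle for the last otherwise."""
        return next((h for h in heights if h > cutoff), heights[-1])

    def observe_then_pick(heights):
        """Watch the first third, then marry the next suitor taller than all of them."""
        k = len(heights) // 3
        best_seen = max(heights[:k])
        return next((h for h in heights[k:] if h > best_seen), heights[-1])

    trials = [suitors() for _ in range(10_000)]

    def average(rule):
        return sum(rule(t) for t in trials) / len(trials)

    print(f"six-foot threshold:         {average(threshold_rule):.1f} in")
    print(f"observe a third, then pick: {average(observe_then_pick):.1f} in")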

Choosing opportunities to show online advertising to consumers is very much like that example, except that we’re picking millions of “suitors” each day for each advertiser, out of tens of billions of opportunities. As with the marriage challenge, we find it is most valuable to make improvements to the algorithms to help us make real-time decisions that grow increasingly optimal with each campaign.

There’s yet another dimension not covered in Omar’s article: the speed of the algorithms and data access, and the capacity of the infrastructure on which they run. The provider you work with needs to be able to make more decisions, faster, than any other players in this space. Doing that calls for a huge investment in hardware and software improvements at all layers of the stack. These investments are in some ways orthogonal to Omar’s original question: they simultaneously help optimize the performance of the algorithms, and they ensure the ability to store and process massive amounts of data.

In short, if I were told I had to either give up all the third-party data I might use, or give up my use of algorithms, I would give up the data in a heartbeat. There is plenty of relevant data captured through the passive activity of consumers interacting with Web advertising — more than enough to drive great performance for the vast majority of clients.

Mark Torrance is CTO of Rocket Fuel, which provides artificial-intelligence advertising solutions.
