Original author: Ben Cherian


In every emerging technology market, hype seems to wax and wane. One day a new technology is red hot; the next day it’s old hat. Sometimes the hype pans out, and concepts such as “e-commerce” become a normal way to shop. Other times the hype doesn’t meet expectations, and consumers don’t buy into paying for e-commerce using Beenz or Flooz. Whoopi Goldberg and a slew of big-name VCs famously made a bad bet on the e-currency market in the late 1990s; Whoopi was paid in cash and shares of Flooz. At least she wasn’t paid in Flooz alone! When investing, some bets are great and others are awful, but often one only knows the awful ones in retrospect.

What Does “Software Defined” Mean?

In the infrastructure space, there is a growing trend of companies calling themselves “software defined (x).” Often, it’s a vendor repositioning a decades-old product. On occasion, though, it’s a smart, nimble startup or a wise incumbent seeing a new way of delivering infrastructure. Either way, the term “software defined” is here to stay, and there is real meaning and value behind it if you look past the hype.

There are three software defined terms that get bandied about quite often: software defined networking, software defined storage, and the software defined data center. I suspect new terms will soon follow, like software defined security and software defined management. What all these “software defined” concepts really boil down to is virtualization of the underlying component, plus accessibility through some documented API to provision, operate and manage it.
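
To make this concrete, here is a minimal sketch, in Python, of what “accessibility through some documented API” looks like in practice. The endpoint, payload fields and token are hypothetical, not any particular vendor’s API:

    # Hypothetical provisioning calls against a software defined data center.
    # The URL, fields and token are illustrative, not a real vendor API.
    import requests

    API = "https://cloud.example.com/v1"
    HEADERS = {"Authorization": "Bearer <token>"}

    # Provision a network, a volume and a VM wired to both: no console, no ticket.
    net = requests.post(API + "/networks", json={"cidr": "10.0.0.0/24"}, headers=HEADERS).json()
    vol = requests.post(API + "/volumes", json={"size_gb": 100}, headers=HEADERS).json()
    vm = requests.post(API + "/servers", json={
        "image": "ubuntu-12.04",
        "flavor": "m1.medium",
        "network_id": net["id"],
        "volume_id": vol["id"],
    }, headers=HEADERS).json()

The same three calls can be scripted, scheduled or embedded in an application, which is precisely the point.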

This trend started once Amazon Web Services came onto the scene and convinced the world that the data center could be abstracted into much smaller units, treated as disposable pieces of technology, and priced as a utility. Vendors watched Amazon closely and saw how this could apply to the data center of the future.

Since compute was already virtualized by VMware and Xen, projects such as Eucalyptus were launched with the intention of being a “cloud controller” that would manage the virtualized servers and provision virtual machines (VMs). Virtualized storage (a.k.a. software defined storage) was a core part of the offering, and projects like OpenStack Swift and Ceph showed the world that storage could be virtualized and accessed programmatically. Today, software defined networking is the new hotness, and companies like Midokura, VMware/Nicira, Big Switch and Plexxi are changing the way networks are designed and automated.

The Software Defined Data Center

The software defined data center encompasses all the concepts of software defined networking, software defined storage, cloud computing, automation, management and security. Every low-level infrastructure component in a data center can be provisioned, operated and managed through an API. Not only are there tenant-facing APIs, but also operator-facing APIs that help the operator automate tasks which were previously manual.

An infrastructure superhero might think, “With great accessibility comes great power.” The data center of the future will be the software defined data center, where every component can be accessed and manipulated through an API. The proliferation of APIs will change the way people work. Programmers who have never formatted a hard drive will be able to provision terabytes of storage. A web application developer will be able to set up complex load balancing rules without ever logging into a router. IT organizations will start automating their most mundane tasks. Eventually, beautiful applications will be created that mirror an organization’s processes and workflows and automate infrastructure management.

IT Organizations Will Respond and Adapt Accordingly

Of course, this means the IT organization will have to adapt. The new base level of knowledge in IT will eventually include some sort of programming ability. Scripting languages like Ruby and Python will soar even higher in popularity. Network administrators will become programmers. System administrators will become programmers. During this time, DevOps (development + operations) will make serious inroads in the enterprise, and silos will be refactored, restructured or flat-out broken down.

Configuration management tools like Chef and Puppet will be the glue for the software defined data center. Done properly, this will lower the cost of delivering IT services. “Ghosts in the system” will watch all the components (compute, storage, networking, security, etc.) and adapt to changes in real time to increase utilization, performance, security and quality of service. Monitoring and analytics will be key to realizing this software defined future.
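
The model behind those tools is worth spelling out: you declare a desired state, and the tool converges the system toward it, doing nothing when reality already matches. Here is a toy sketch of that idempotent pattern in Python (not Chef’s or Puppet’s actual DSL):

    # Toy illustration of the declarative, idempotent model behind Chef and
    # Puppet: declare the desired state; converge only when reality differs.
    import os

    def ensure_file(path, content):
        """Bring a file to the desired content; do nothing if already correct."""
        current = None
        if os.path.exists(path):
            with open(path) as f:
                current = f.read()
        if current != content:
            with open(path, "w") as f:
                f.write(content)
            print("converged", path)
        else:
            print(path, "already in desired state")

    # Desired state for one small corner of the data center.
    ensure_file("/tmp/ntp.conf", "server 0.pool.ntp.org\n")

Run it twice and the second run changes nothing, which is what makes convergence safe to automate.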

Big Changes in Markets Happen With Very Simple Beginnings

All this amazing innovation comes from two very simple concepts — virtualizing the underlying components and making them accessible through an API.

The IT world might look at the software defined data center and say this is nothing new — that we’ve been doing this since the ’80s. I disagree. What’s changed is our universal thinking about accessibility. Ten years ago, we wouldn’t have blinked if a networking product came out without an API. Today, an API is part of what we consider a 1.0 release. This thinking now pervades every component of the data center. It’s Web 2.0 thinking that shaped cloud computing, and now cloud computing is bleeding into enterprise thinking. We’re no longer constrained by the need for deep, specialized knowledge of the low-level components just to get basic access to this technology.

With well-documented APIs, we have now turned the entire data center into many instruments that can be played by the IT staff (the musicians). I imagine the software defined data center as a Fantasia-like world where Mickey is the IT staff and the brooms are networking, storage, compute and security. The magic is in the coordination, cadence and rhythm of how all the pieces work together. Amazing symphonies of IT will occur in the near future, and that is why the software defined data center is not a trend to overlook. Maybe Whoopi should take a look at this market instead.

Ben Cherian is a serial entrepreneur who loves playing in the intersection of business and technology. He’s currently the Chief Strategy Officer at Midokura, a network virtualization company. Prior to Midokura, he was the GM of Emerging Technologies at DreamHost, where he ran the cloud business unit. Prior to that, Ben ran a cloud-focused managed services company.

Original author: Caleb Barlow


When it comes to mobile computing, many organizations either cringe at the security risks or rejoice in the business potential. On one hand, mobile is revolutionizing business operations — improving operational efficiency, enhancing productivity, empowering employees and delivering an engaging user experience. On the other hand, sensitive data that used to be housed in the controlled environment of a company desktop or even a laptop is now sitting in an employee’s back pocket or purse.

In today’s ultra-connected world, it can seem like threats are all around us. High-profile breaches and attacks from hacker groups have organizations of all sizes — from multinational enterprises to mom-and-pop shops — doubling down on security and making sure there aren’t any cracks in their defenses. Mobile security doesn’t have to be the Achilles’ heel that leads to a breach. New, innovative solutions for securing mobile devices at the application level are rapidly hitting the market, and the latest IBM X-Force report indicates that by 2014, mobile computing will be more secure than traditional desktops. Phones, tablets and other devices are a staple of the 21st-century workplace, and in order to fully embrace this technology, businesses must be certain they’re well protected and secure.

Do You Know Where Your Data Is?

Tackling mobile security can seem like a daunting task. The IBM X-Force report also notes a 19 percent increase in the number of publicly released exploits that can be used to target mobile devices. Making the task more challenging is the fact that — especially in the case of BYOD — the line between professional and personal data is more blurred on mobile platforms than ever before. According to Gartner, by 2014, 90 percent of organizations will support corporate applications on personal devices. This means that devices being used to connect to enterprise networks or create sensitive company data are also being used for social networking and downloading mobile apps, leaving organizations with the predicament of how to manage, secure and patrol those devices. From a hacker’s point of view, a mobile device is an ideal target: it has access to enterprise data as well as personal data that can be used to mount future attacks against your friends and colleagues.

Mobile apps are a great example of why mobile security tends to raise concerns among security professionals and business leaders. Employees install personal apps onto the same devices they use to access their enterprise data, but are not always careful or discriminating about the security of those apps — whether they are the real version or a manipulated version that will attempt to steal corporate data. According to a recent report by Arxan Technologies, more than 90 percent of the top 100 mobile apps have been hacked in some capacity. Some free mobile apps even demand access to an employee’s contact list in order to function. Just pause and think about that for a second. Would you give your entire contact list to a complete stranger? That’s effectively what you are doing when you install many of these popular applications. If an organization takes a step back and really considers what employees are agreeing to, willingly or not, the results can be troubling. So the challenge remains: getting employees to recognize and understand just how much risk their mobile devices can pose to the enterprise.

Mitigating Mobile Risks: Why It’s Easier Than You Think

Mobile app security and device management do not have to be a company’s security downfall. By employing intelligent security solutions that adapt to the requirements of a specific context, businesses can mitigate operational risk and unleash the full potential of mobility.

The key to mitigating security risks when it comes to mobile devices accessing enterprise data is access control. This may include passcode locks, data protection and malware and virus prevention. With that said, IT security priorities should focus on practices, policies and procedures, such as:

  • Risk analysis: Organizations must understand what enterprise data is on employee devices, how it could be compromised and the potential impact of a compromise (What would it cost? What happens if the device is lost? Is the data incidental or crucial to the business?).
  • Securing the application: In the pre-mobile, personal computer era, simply securing the device and the user was sufficient. When it comes to mobile devices, we also need to think about securing the application itself. Since a typical application is downloaded from a store, the end user really has no idea who built it, what it actually does with their data or how secure it is. Corporate applications with sensitive data need to be secure in their own right.
  • Secure mobile access — authentication: Since mobile devices are shared, it’s important to authenticate both the user and the device before granting access, and to look at the context of the user requesting access based on factors like time, network, location, device characteristics and role. If the context appears to be out of line with normal behavior, appropriate countermeasures can be taken.
  • Encryption: Simply put, if the data is sensitive, it needs to be encrypted both at rest and in motion on the network (see the sketch after this list).
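
On that last point, here is a minimal sketch of at-rest encryption using the Python cryptography library’s Fernet recipe. It is illustrative only: key management is the genuinely hard part, and data in motion is typically protected separately with TLS.

    # Minimal at-rest encryption sketch using the "cryptography" package's
    # Fernet recipe (authenticated symmetric encryption). Illustrative only;
    # a real deployment would fetch the key from a key-management system.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # in production: store in a key vault
    f = Fernet(key)

    token = f.encrypt(b"customer contact list")   # ciphertext, safe to store
    assert f.decrypt(token) == b"customer contact list"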

Once an enterprise has defined its security policy — establishing set policies/procedures regarding content that is allowed to be accessed on devices, how it’s accessed and how the organization will handle lost/stolen devices that may contain business data — mobile technology solutions can help ensure that no opening is left unguarded.

So if security concerns are holding you back from “going mobile,” rest assured — there are many companies that have embraced trends like “Bring Your Own Device” without sending their Chief Security Officers into a panic. As long as organizations take the right steps and continually revisit their security posture to ensure that every endpoint is secured and that the proper technology is in place, it really is possible to be confident about your mobile security strategy.

Caleb Barlow is part of the executive team in IBM’s Security division. He manages three portfolios — Application Security, Data Security and Mobile Security. In addition to his day job, Caleb also hosts a popular Internet Radio show focused on IT Security with an audience averaging over 20k listeners per show.

Original author: Arik Hesseldahl

Here’s a name I haven’t heard in a while: Anso Labs.

This was the cloud computing startup that originated at NASA, where the original ideas for OpenStack, the open source cloud computing platform, were born. Anso Labs was acquired by Rackspace a little more than two years ago.

It was a small team. But now a lot of the people who ran Anso Labs are back with a new outfit, still devoted to cloud computing, and still devoted to OpenStack. It’s called Nebula. And it builds a turnkey computer that will turn an ordinary rack of servers into a cloud-ready system, running — you guessed it — OpenStack.

Based in Mountain View, Calif., Nebula claims to have an answer for any company that has ever wanted to build its own private cloud system and not rely on outside vendors like Amazon or Hewlett-Packard or Rackspace to run it for them.

It’s called the Nebula One. And the setup is pretty simple, said Nebula CEO and founder Chris Kemp: Plug the servers into the Nebula One, then you “turn it on and it boots up cloud.” All of the provisioning and management that a service provider would normally charge you for is built into the hardware device. There are no services to buy, no consultants to pay to set it up. “Turn on the power switch, and an hour later you have a petascale cloud running on your premise,” Kemp told me.

The Nebula One sits at the top of a rack of servers; on its back are 48 Ethernet ports. It runs an operating system called Cosmos that grabs all the memory and storage and CPU capacity from every server in the rack and makes them part of the cloud. It doesn’t matter who made them — Dell, Hewlett-Packard or IBM.

Kemp named two customers: Genentech and Xerox’s research lab, PARC. There are more customer names coming, he says, and the company already boasts investments from Kleiner Perkins, Highland Capital and Comcast Ventures. Nebula is also the only startup that is a platinum member of the OpenStack Foundation; others include IBM, HP, Rackspace, Red Hat and AT&T.

If OpenStack becomes as easy to deploy as Kemp says it can be, a lot of companies — those that can afford to have their own data centers, anyway — are going to have their own clouds. And that is sort of the point.


In the business of selling stuff, there’s a lot of managing. Sales reps usually have a boss they check in with on the status of deals in the pipeline, maybe to get some advice on how to close a deal when there’s stiff competition from another company, or to go over how an important customer was reeled in, so that others can learn from it.

These check-ins are sometimes referred to as coaching, and there is data to show that coaching can boost sales performance. A study by the Sales Executive Council suggests that reps who received three or more hours of coaching per month outsold those who received two hours or less by as much as 17 percent.

Getting that coaching done can be kind of a hassle. But it’s the sort of hassle that Salesforce.com has often sought to understand intimately, and then create products within its suite of cloud software tools.

Today is one of those days. The company is announcing a trial of a new feature that closely ties its traditional Sales Cloud with its Work.com product. The point is to do a few things: Speed up the review portion that has always tended to be a big consumer of time and attention in pretty much any organization, and also to make it easier for sales managers to find ways to motivate their teams to, you know, sell more stuff, which is basically the point of sales in the first place.

Through a combination of Salesforce services, including the Sales Cloud, its social enterprise platform Chatter and Work.com (the HR software offering built on the Rypple acquisition it made last year), sales teams will see each other’s goals, learn about big deals coming in, and know about each other’s expertise.

The new tools will also give managers a way to provide instant feedback and public recognition to those sales people who are doing well. Remember “gamification”? It’s not my favorite word, but apparently it works to some extent, especially with sales people who have monthly, quarterly and annual targets to make.

There is research to back up the assertion that when people leave sales jobs they do so in part because they don’t think they’re getting enough recognition from above. Now, on those occasions when a rep lands a big customer in a competitive deal, the manager can publicly pat them on the back with a “thanks in Chatter” feature, and give them a “sales Ninja” badge, or something like it, that everyone can see in their Chatter feeds.

Think it all sounds hokey? Maybe it is, but there’s a lot of evidence that these things have a way of making sales people happier on the job. And happy sales reps are sales reps who close deals, or at least that’s the theory. We’ve come a long way since Alec Baldwin’s memorable (and profanity-laced) monologue in “Glengarry Glen Ross.”

The new features are coming in early 2013, and are available for certain Salesforce customers on a pilot basis starting today.


The advent of the Salesforce Marketing Cloud and the Adobe Marketing Cloud demonstrates the need for enterprises to develop new ways of harnessing the vast potential of big data. Yet these marketing clouds raise the question of who will help marketers, the front line of businesses, maximize marketing spending and ROI and help their brands win in the end. Simply moving software from onsite to hosted servers does not change the capabilities marketers require — real competitive advantage stems from intelligent use of big data.

Marc Benioff, who is famous for declaring that “Software Is Dead,” may face a similar fate with his recent bets on Buddy Media and Radian6. These applications provide data to people who must then analyze, prioritize and act — often at a pace much slower than the digital world. Data, content and platform insights are too massive for mere mortals to handle without costing a fortune. Solutions that leverage big data are poised to win — freeing up people to do the strategy and content creation that is best done by humans, not machines.

Big data is too big for humans to work with, at least in the all-important analytical construct of responding to opportunities in real time — formulating efficient and timely responses to opportunities generated from your marketing cloud, or pursuing the never-ending quest for perfecting search engine optimization (SEO) and search engine marketing (SEM). The volume, velocity and veracity of raw, unstructured data is overwhelming. Big data pioneers such as Facebook and eBay have moved to massive Hadoop clusters to process their petabytes of information.

In recent years, we’ve gone from analyzing megabytes of data to working with gigabytes, and then terabytes, and then petabytes and exabytes, and beyond. Two years ago, James Rogers, writing in The Street, wrote: “It’s estimated that 1 Petabyte is equal to 20 million four-door filing cabinets full of text.” We’ve become jaded to seeing such figures. But 20 million filing cabinets? If those filing cabinets were a standard 15 inches wide, you could line them up, side by side, all the way from Seattle to New York — and back again. One would need a lot of coffee to peruse so much information, one cabinet at a time. And, a lot of marketing staff.

Of course, we have computers that do the perusing for us, but as big data gets bigger, and as analysts, marketers and others seek to do more with the massive intelligence that can be pulled from big data, we risk running into a human bottleneck. Just how much can one person — or a cubicle farm of persons — accomplish in a timely manner from the dashboard of their marketing cloud? While marketing clouds do a fine job of gathering data, it still comes down to expecting analysts and marketers to interpret and act on it — often with data that has gone out of date by the time they work with it.

Hence the rise of big data solutions that leverage machine learning, language models and prediction: self-learning systems that go beyond using algorithms to harvest information from big data, and instead use algorithms to initiate actions based on that data.
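
As a hedged sketch of what “algorithms that initiate actions” could look like: an online model that learns from a stream of marketing events and acts automatically once it is confident. The synthetic data, the 0.8 threshold and the action itself are all assumptions for illustration; scikit-learn’s SGDClassifier does the incremental learning.

    # Sketch of a self-learning loop: learn incrementally from streaming events
    # and trigger an action when the model is confident. The synthetic stream,
    # the 0.8 threshold and the "action" are illustrative assumptions.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)

    def event_stream(n=1000):
        # Stand-in for a live marketing feed: two features per visitor and a
        # conversion label loosely driven by the first feature.
        for _ in range(n):
            x = rng.normal(size=2)
            yield x, int(x[0] + 0.1 * rng.normal() > 0)

    model = SGDClassifier(loss="log_loss")  # online logistic regression ("log" on older versions)
    seen = False
    for x, y in event_stream():
        X = x.reshape(1, -1)
        if seen and model.predict_proba(X)[0, 1] > 0.8:
            pass  # act automatically here, e.g. place a bid or retarget the visitor
        model.partial_fit(X, [y], classes=[0, 1])  # keep learning as data arrives
        seen = True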

Yes, this may sound a bit frightful: Removing the human from the loop. Marketers indeed need to automate some decision-making. But the human touch will still be there, doing what only people can do — creating great content that evokes emotions from consumers — and then monitoring and fine-tuning the overall performance of a system designed to take actions on the basis of big data.

This isn’t a radical idea. Programmed trading algorithms already drive significant activity across stock markets. And, of course, Amazon, eBay and Facebook have become generators of — and consummate users of — big data. Others are jumping on the bandwagon as well. RocketFuel uses big data about consumers, sites, ads and prior ad performance to optimize display advertising. Turn.com uses big data from consumer Web behavior, on-site behaviors and publisher content to create, optimize and buy advertising across the Web for display advertisers.

The big data revolution is just beginning as it moves beyond analytics. If we were building CRM again, we wouldn’t just track sales-force productivity; we’d recommend how you’re doing versus your competitors based on data across the industry. If we were building marketing automation software, we wouldn’t just capture and nurture leads generated by our clients, we’d find and attract more leads for them from across the Web. If we were building a financial application, it wouldn’t just track the financials of your company, it would compare them to public filings in your category so you could benchmark yourself and act on best practices.

Benioff is correct that there’s an undeniable trend that most marketing budgets today are betting more on social and mobile. The ability to manage social, mobile and Web analysis for better marketing has quickly become a real focus — and a big data marketing cloud is needed to do it. However, the real value and ROI comes from the ability to turn big data analysis into action, automatically. There’s clearly big value in big data, but it’s not cost-effective for any company to interpret and act on it before the trend changes or is over. Some reports find that 70 percent of marketers are concerned with making sense of the data and more than 91 percent are concerned with extracting marketing ROI from it. Incorporating big data technologies that create action means that your organization’s marketing can get smarter even while you sleep.

Raj De Datta founded BloomReach with 10 years of enterprise and entrepreneurial experience behind him. Most recently, he was an Entrepreneur-In-Residence at Mohr-Davidow Ventures. Previously, he was a Director of Product Marketing at Cisco. Raj also worked in technology investment banking at Lazard Freres. He holds a BSE in Electrical Engineering from Princeton and an MBA from Harvard Business School.

0
Your rating: None

Syria’s Internet infrastructure remains almost entirely dark today. Almost.

The folks at Renesys, who were the first to notice that something was amiss with the telecom infrastructure of the war-torn Middle Eastern nation, have been hard at work sifting through their data — and they’ve found something interesting.

At least five networks that operate outside Syria, but within Syrian-registered IP address space, are still working, and are apparently controlled by India’s Tata Communications.

These same networks, Renesys says, have some servers running on them that were implicated in an attempt to deliver Trojans and other malware to Syrian activists. The payload was a fake “Skype Encryption Tool” — which is, on its face, kind of silly, because Skype itself is already encrypted to some degree — that was actually a spying tool. The Electronic Frontier Foundation covered the attempted cyber attack at the time.

Cloudflare has also been monitoring the situation in Syria and has made a few interesting observations.

First, pretty much all Internet access in the country is funneled through one point: the state-run, state-controlled Syrian Telecommunications Establishment. PCCW and Turk Telekom are the primary providers of the capacity running into the country, with Telecom Italia and Tata providing additional capacity.

There are, Cloudflare notes, four physical cables that bring Internet connectivity into Syria. Three of them are undersea cables that land in the coastal city of Tartus. A fourth comes in from Turkey to the north. Cloudflare’s Matthew Prince says it’s unlikely that the cables were physically cut.

Cloudflare put together a video of what it looked like watching the changes in the routing tables happen live. It’s less than two minutes long.

For what it’s worth, Syria’s information minister is being quoted in various reports as blaming the opposition for the shutdown.

So the question is: Why now? Clearly, the Syrian regime is under more pressure than ever before. Previously, it tended to view the country’s Internet as a tool not only to get its own word out to the wider world, but also to try to spy on and monitor the activities of the rebels and activists.

With fighting intensifying in and around the capital and the commercial city of Aleppo, the decision to throw the kill switch might indicate a decision to try to disrupt enemy communications. Or it might mask a seriously aggressive military action that it wants to keep as secret as possible. We don’t know yet.


When it acquired Yammer over the summer for $1.2 billion, Microsoft essentially admitted that it had lost any edge it might have once had in the social enterprise and collaboration software space. SharePoint has long been the hated, entrenched collaboration platform that, along with Microsoft’s Exchange and Office, so many upstart enterprise cloud companies like Jive have sought to beat up on, mainly because it was so big: Microsoft today disclosed that SharePoint is a $2 billion business.

Now Yammer is not only part of SharePoint, but part of all the company’s mainstream business apps. At a SharePoint-oriented conference in Las Vegas, Microsoft announced today that the kind of social features Yammer provides — and which SharePoint was widely criticized for not having, or at least for not having executed well — are now just part of every business application. For openers, Yammer has been integrated into Office 365 Enterprise and SharePoint Online.

Also gone from Yammer is the four-tiered pricing model that at once made it so successful and yet ultimately was said to have doomed its long-term viability as a business. Yammer had picked up a lot of its momentum by being free for companies to use indefinitely, but it was supposedly a lot more powerful if you got one of the paid versions. The problem was that the free version was usually good enough, and few cared enough to try the paid version. The result: Converting free customers to paid customers was pretty tough.

Microsoft has sought to fix that by cutting the number of versions to two. One, called Yammer Basic, is the free, simple standalone version. For the other, Yammer Enterprise, Microsoft has slashed the price from $15 a user to $3 a user.

It’s all part of a broader “social everywhere” strategy that will in time see social features crop up everywhere you see a Microsoft logo: Office, Outlook, Skype. Everything that happens at the office that involves another person becomes an event that shows up in the social feed.

As Microsoft corporate VP Jeff Teper was quoted in the big announcement today: “We envision a world in which social is woven into the apps you use every day — where people work together using new experiences that combine the power of social with collaboration, email and unified communications.”

If you’ve got a few hours, you can watch the action in today’s keynote here.


If you haven’t heard of AnchorFree, then there’s a pretty good chance you’re just not the type of person who worries about using open Wi-Fi hotspots and the security implications that tend to arise from that.

Today, AnchorFree announced that Goldman Sachs has made a $52 million Series C investment. Prior investors include RENN Capital. The investment brings its total capital raised to $63 million.

AnchorFree makes Hotspot Shield, a simple VPN tool that’s in use by some 60 million people around the world. It’s the creation of David Gorodyansky and Eugene Malobrodsky, who started the company in 2005 just after finishing college.

During the Tahrir Square uprising in Egypt, the company recorded more than a million downloads of the product in a single night. It’s now in use in 190 countries. The point of the capital raise is simple: Push the number of users higher still. There are, after all, some 1.6 billion people using the Internet.


Not everyone wants to run their applications on the public cloud. Their reasons can vary widely. Some companies don’t want the crown jewels of their intellectual property leaving the confines of their own premises. Some just like having things run on a server they can see and touch.

But there’s no denying the attraction of services like Amazon Web Services or Joyent or Rackspace, where you can spin up and configure a new virtual machine within minutes of figuring out that you need it. So, many companies seek to approximate the experience they would get from a public cloud provider on their own internal infrastructure.

It turns out that a start-up I had never heard of before this week is the most widely deployed platform for running these “private clouds,” and it’s not a bad business. Eucalyptus Systems essentially enables the same functionality on your own servers that you would expect from a cloud provider.

Eucalyptus said today that it has raised a $30 million Series C round of venture capital funding led by Institutional Venture Partners. Steve Harrick, general partner at IVP, will join the Eucalyptus board. Existing investors, including Benchmark Capital, BV Capital and New Enterprise Associates, are also in on the round. The funding brings Eucalyptus’ total capital raised to north of $50 million.

The company has an impressive roster of customers: Sony, Intercontinental Hotels, Raytheon, and the athletic-apparel group Puma. There are also several government customers, including the U.S. Food and Drug Administration, NASA, the U.S. Department of Agriculture and the Department of Defense.

In March, Eucalyptus signed a deal with Amazon to allow customers of both to migrate their workloads between the private and public environments. The point here is to give companies the flexibility they need to run their computing workloads in a mixed environment, or move them back and forth as needed. They could also operate them in tandem.

Key to this is a provision of the deal with Amazon that gives Eucalyptus access to Amazon’s APIs. What that means is that you can run processes on your own servers that are fully compatible with Amazon’s Simple Storage Service (S3) or its Elastic Compute Cloud, known as EC2. “We’ve removed all the hurdles that might have been in the way of moving workloads,” Eucalyptus CEO Marten Mickos told me. The company has similar deals in place with Wipro Infotech in India and CETC32 in China.
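
To illustrate what that compatibility buys, here is a hedged sketch using boto3, a present-day AWS SDK for Python: the same S3 client code targets either Amazon or a private, S3-compatible endpoint, switched by a single parameter. The internal URL below is a placeholder, not a real service.

    # Same client code, public or private cloud: with compatible APIs, the
    # target is just a parameter.
    import boto3

    def make_s3(private=False):
        if private:
            # Point at an on-premises, S3-compatible service instead of Amazon.
            return boto3.client("s3", endpoint_url="https://s3.internal.example.com")
        return boto3.client("s3")  # default: Amazon's public S3 endpoint

    s3 = make_s3(private=True)
    s3.upload_file("report.csv", "analytics", "reports/report.csv")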


Editor’s note: Aaron Levie is CEO of Box. Follow him on Twitter @levie.

In 1997, Larry Ellison had a vision for a new paradigm of computing, which he called the Network Computer (NC). The idea was simple: a group of partners would build devices and services that leveraged the power of the Internet to compete against the growing Windows monopoly.

Ellison believed that the computer in the client/server era had evolved into too complex a machine for most tasks. With the NC, the ‘heavy’ computation of software and infrastructure would be abstracted from the actual device and delivered instead to thinner terminals via the web, thus radically simplifying access and enabling all new applications and mobility.

But the NC never made it mainstream. Microsoft and its allies had already amassed considerable power, and the cost of personal computers was dropping rapidly, making them even more attractive and ubiquitous. Furthermore, many of the applications were too immature to compete with the desktop software experience at the time; and few people, as it turned out, wanted to buy a device championed by Oracle.

The NC fell short on execution, but Ellison was right about the vision: “It’s the first step beyond personal computing, to universal computing.” In many ways, he was the first to glimpse a future resembling the post-PC world we are rapidly moving towards today.

15 years later, it is Apple that has brought its version of this vision to life. And Apple’s rising tide – already 172 million devices strong, sold in the last year alone – has in turn given rise to a massive, vibrant ecosystem: companies generating hundreds of millions and billions of dollars in value in under a few years, revolutionizing industries like gaming, social networking, entertainment and communications in the process. Then of course there’s Instagram. All of this proves that the value created in the mobile and post-PC world will rival traditional computing categories.

But the post-PC transformation isn’t limited to the consumer landscape. In the enterprise, we’re transitioning to a way of working that is far more fluid, boundary-less and social. And mobile pushes computing to the cloud and rewrites all applications in its wake. Those who saw it coming (Oracle) and those who initially resisted its arrival (Microsoft) have equally been taken by surprise by the power and speed of the post-PC shift within today’s enterprises, and it’s creating one of the biggest opportunities ever.

Why the change is so profound

We recently met with the IT leadership team of a fairly conservative 50,000-person organization where all the participants had iPads. No big surprise there. But the apps they were using were radically different from what you would have found in their organization only a few years back – a mix of apps from a new set of vendors that together supplant the traditional Microsoft Office stack.

Post-PC devices are driving enterprises to rethink their entire IT architecture, thanks to a wildly unpredictable and improbable chain reaction set off by a new consumer device from Apple.  For the first time in decades, CIOs have the opportunity – and necessity – to completely re-imagine and rebuild their technology strategy from the ground up. Catalyzing this change is the fact that the technology switching costs are often less than the price of maintaining existing solutions. A shipment of 1,000 new iPads requires applications to run on these devices – and choosing all-new applications and vendors is generally cheaper than the service fees, infrastructure, and operational costs of legacy software.

And thus the post-PC era drives the final nail into the coffin of the traditional enterprise software hegemony. Microsoft, in particular, built up a practical monopoly that lasted nearly twenty years and forced an entire industry to conform to its way of seeing the world. Yet this arrangement served its beneficiary far more than the ecosystem, as the Redmond giant built up leadership positions across nearly every application category.

In the Post-PC era, businesses will shift from deploying and managing end-to-end enterprise solutions from a single vendor, to consuming apps a la carte both as individuals and en masse. But which apps and vendors will help define this new world?

What’s coming won’t look like what came before

Change always begins incrementally at first. Predicting specifically what will happen in the next year or two is a far more realistic undertaking than anticipating where we’ll be in a decade. In shifting from one technology generation to the next, we minimize disruption by porting the old way of doing things to newer mediums or channels. Not until the new model settles in do we see the real results that rise from these foundational shifts.

Mobility is such a foundational shift, and it’s still very, very early. Even when the Microsofts and Oracles of the world relent and build applications for post-PC devices, these apps will carry much of the DNA of their desktop predecessors. We can imagine that each of the enterprise mainstays – ERP, HR management, supply chain, business intelligence, and office productivity – will be painstakingly moved to mobile. But that’s just the first phase.

Emerging CRM startups like Base will challenge longstanding assumptions about where and how you manage customer interactions. Data visualization software like Roambi will make business analytics more valuable by making it available everywhere. Entire industries are already being transformed: mobile healthcare apps will enable cutting-edge health outcomes, and construction sites will eventually be transformed by apps like PlanGrid. Companies like CloudOn and OnLive aim to virtualize applications that we never imagined would be available outside the office walls. Evernote’s 20+ million users already make it one of the most popular independent productivity apps of all time, and its value is dramatically amplified by this revolution. In a mobile and post-PC world, the very definition of the office suite is transformed.

And with this transformation, much of the $288B spent annually on enterprise software is up for grabs. The post-PC era is about no longer being anchored to a handful of solutions in the PC paradigm. Instead, we’re moving to a world where we mix and match best-of-breed solutions. This means more competition and choice, which means new opportunities for startups, which should mean more innovation for customers. As soon as individual workers look to the App Store for an immediate solution to their problem instead of calling IT (who in turn calls a vendor), you can tell things will never be the same.

In many ways, the enterprise software shift mirrors that of the media and cable companies fighting for relevance in a world moving to digital content (HT @hamburger). If users and enterprises can select apps that are decoupled from an entire suite, we might find they’d use a completely different set of technology, just as many consumers would only subscribe to HBO or Showtime if given the option.

Of course, every benefit brings a new and unique challenge. In a world where users bring their own devices into the workplace, connect to any network, and use a mix of apps, managing and securing business information becomes an incredibly important and incredibly challenging undertaking. Similarly, how do we get disparate companies to build apps that work together, instead of spawning more data silos?  And as we move away from large purchases of suites from a single provider, what is the new business model that connects vendors with customers (both end users and IT departments) with minimal friction?

And then there’s the inherent fragmentation of devices and platforms that defines the post-PC era. Android, iOS, and Windows 7 and 8 all have different languages and frameworks, UI patterns, and marketplaces. The fate of mobile HTML5 is still indeterminate. Fragmentation and sprawl of apps and data is now the norm. And while this fragmentation is creating headaches for businesses and vendors alike, it’s also opening a window for the next generation of enterprise software leaders to emerge and redefine markets before the industry settles into this new paradigm.

It would appear that Larry Ellison’s vision for the NC was right all along, just 15 years early. Welcome to the post-PC enterprise.
