Original author: Ben Cherian

Image copyright isak55

In every emerging technology market, hype seems to wax and wane. One day a new technology is red hot, the next day it’s old hat. Sometimes the hype pans out and concepts such as “e-commerce” become a normal way to shop. Other times the hype doesn’t meet expectations, and consumers don’t buy into paying for e-commerce using Beenz or Flooz. Apparently, Whoopi Goldberg and a slew of big-name VCs ended up making a bad bet on the e-currency market in the late 1990s. Whoopi was paid in cash and shares of Flooz. At least she wasn’t paid in Flooz alone! When investing, some bets are great and others are awful, but often one only knows the awful ones in retrospect.

What Does “Software Defined” Mean?

In the infrastructure space, there is a growing trend of companies calling themselves “software defined (x).” Often, it’s a vendor repositioning a decades-old product. On occasion, though, it’s smart, nimble startups and wise incumbents seeing a new way of delivering infrastructure. Either way, the term “software defined” is here to stay, and there is real meaning and value behind it if you look past the hype.

There are three software defined terms that seem to be bandied about quite often: software defined networking, software defined storage, and the software defined data center. I suspect new terms will soon follow, like software defined security and software defined management. What all these “software defined” concepts really boil down to is virtualization of the underlying component plus accessibility through a documented API to provision, operate, and manage it.
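To make that concrete, here is a minimal sketch of what “accessible through an API” looks like in practice: provisioning a virtual network and a server with plain HTTP calls instead of a console. The endpoint, token, and payload fields are hypothetical, not any particular vendor’s API.

    import json
    import urllib.request

    API_ROOT = "https://cloud.example.com/v1"    # hypothetical API endpoint
    TOKEN = "replace-with-a-real-token"          # hypothetical auth token

    def provision(path, payload):
        # POST a JSON description of the resource we want to the infrastructure API.
        req = urllib.request.Request(
            API_ROOT + path,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json", "X-Auth-Token": TOKEN},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # A virtual network, then a VM attached to it -- no console, no cabling.
    net = provision("/networks", {"name": "web-tier", "cidr": "10.0.1.0/24"})
    vm = provision("/servers", {"name": "web-01", "flavor": "small", "network_id": net["id"]})
    print("provisioned server", vm["id"])

The specifics vary from vendor to vendor, but the shape is always the same: the low-level component is virtualized, and a documented API sits in front of it.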

This trend started once Amazon Web Services came onto the scene and convinced the world that the data center could be abstracted into much smaller units and could be treated as disposable pieces of technology, which in turn could be priced as a utility. Vendors watched Amazon closely and saw how this could apply to the data center of the future.

Since compute was already virtualized by VMware and Xen, projects such as Eucalyptus were launched with the intention of being a “cloud controller” that would manage the virtualized servers and provision virtual machines (VMs). Virtualized storage (a.k.a. software defined storage) was a core part of the offering, and projects like OpenStack Swift and Ceph showed the world that storage could be virtualized and accessed programmatically. Today, software defined networking is the new hotness, and companies like Midokura, VMware/Nicira, Big Switch and Plexxi are changing the way networks are designed and automated.

The Software Defined Data Center

The software defined data center encompasses all the concepts of software defined networking, software defined storage, cloud computing, automation, management and security. Every low-level infrastructure component in a data center can be provisioned, operated, and managed through an API. Not only are there tenant-facing APIs, but also operator-facing APIs that help operators automate tasks that were previously manual.

An infrastructure superhero might think, “With great accessibility comes great power.” The data center of the future will be the software defined data center where every component can be accessed and manipulated through an API. The proliferation of APIs will change the way people work. Programmers who have never formatted a hard drive will now be able to provision terabytes of data. A web application developer will be able to set up complex load balancing rules without ever logging into a router. IT organizations will start automating the most mundane tasks. Eventually, beautiful applications will be created that mimic the organization’s process and workflow and will automate infrastructure management.

IT Organizations Will Respond and Adapt Accordingly

Of course, this means the IT organization will have to adapt. The new base level of knowledge in IT will eventually include some sort of programming knowledge. Scripting languages like Ruby and Python will soar even higher in popularity. The network administrators will become programmers. The system administrators will become programmers. During this time, DevOps (development + operations) will make serious inroads in the enterprise, and silos will be refactored, restructured or flat-out broken down.

Configuration management tools like Chef and Puppet will be the glue for the software defined data center. If done properly, this will lower the cost of delivering IT services. “Ghosts in the system” will watch all the components (compute, storage, networking, security, etc.) and adapt to changes in real time to increase utilization, performance, security and quality of service. Monitoring and analytics will be key to realizing this software defined future.
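The core idea those tools share is declarative, idempotent configuration: describe the state a machine should be in, and let the tool converge it. Chef and Puppet each have their own DSLs; the toy Python sketch below only illustrates that converge-toward-desired-state pattern, with made-up resources and Debian-flavored commands.

    import os
    import subprocess

    # Desired state, declared as data rather than as a sequence of manual steps.
    DESIRED_STATE = [
        {"type": "package", "name": "nginx"},
        {"type": "file", "path": "/etc/motd", "content": "managed by automation\n"},
    ]

    def ensure_package(name):
        # Idempotent: install only if the package is not already present.
        if subprocess.run(["dpkg", "-s", name], capture_output=True).returncode != 0:
            subprocess.run(["apt-get", "install", "-y", name], check=True)

    def ensure_file(path, content):
        # Idempotent: rewrite the file only if its contents have drifted.
        if not os.path.exists(path) or open(path).read() != content:
            with open(path, "w") as f:
                f.write(content)

    for resource in DESIRED_STATE:
        if resource["type"] == "package":
            ensure_package(resource["name"])
        elif resource["type"] == "file":
            ensure_file(resource["path"], resource["content"])

Run it twice and the second run does nothing, which is exactly the property that makes this kind of tooling safe to wire into the “ghosts in the system” described above.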

Big Changes in Markets Happen With Very Simple Beginnings

All this amazing innovation comes from two very simple concepts: virtualizing the underlying components and making them accessible through an API.

The IT world might look at the software defined data center and say, “This is nothing new; we’ve been doing this since the ’80s.” I disagree. What’s changed is our universal thinking about accessibility. Ten years ago, we wouldn’t have blinked if a networking product came out without an API. Today, an API is part of what we consider a 1.0 release. This thinking is pervasive throughout the data center today, in every component. It’s Web 2.0 thinking that shaped cloud computing, and now cloud computing is bleeding into enterprise thinking. We’re no longer constrained by the need for deep, specialized knowledge of the low-level components just to get basic access to this technology.

With well-documented APIs, we have now turned the entire data center into many instruments that can be played by the IT staff (the musicians). I imagine the software defined data center as a Fantasia-like world where Mickey is the IT staff and the brooms are networking, storage, compute and security. The magic is in the coordination, cadence and rhythm of how all the pieces work together. Amazing symphonies of IT will occur in the near future, and that is why the software defined data center is not a trend to overlook. Maybe Whoopi should take a look at this market instead.

Ben Cherian is a serial entrepreneur who loves playing in the intersection of business and technology. He’s currently the Chief Strategy Officer at Midokura, a network virtualization company. Prior to Midokura, he was the GM of Emerging Technologies at DreamHost, where he ran the cloud business unit. Prior to that, Ben ran a cloud-focused managed services company.


If there’s a good example of the phrase “thankless task,” it’s trying to corral all the members of It’s Nice That and our sister studio INT Works into some kind of coherent whole. And yet that’s exactly what photographer Suki Dhanda was up against when she came in to create the portrait for Eye’s feature on us. Thankfully she has as much patience as she does talent, and the end result was a cracker. But it’s no surprise that Suki smashed it out of the park because she’s amazing, with a particularly powerful portfolio of portraits ranging from a soul-searching Jude Law to a tie-wielding Nick Clegg. Anyone tempted to underplay the sheer skill that separates a great portraitist from the rest need only immerse themselves in her site.


The prevalence of free, open WiFi has made it rather easy for a WiFi eavesdropper to steal your identity cookie for the websites you visit while you're connected to that WiFi access point. This is something I talked about in Breaking the Web's Cookie Jar. It's difficult to fix without making major changes to the web's infrastructure.
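For context on what’s actually being stolen: the “identity cookie” is just a header the browser attaches to every request. The sketch below, using Python’s standard library, shows roughly how a server sets one; the cookie name and value are made up. The Secure and HttpOnly attributes are the server-side knobs that keep it from travelling over plain HTTP or being read by page scripts, but they only do their job once the site actually serves logged-in traffic over HTTPS, which is where the fixes discussed below come in.

    from http import cookies

    jar = cookies.SimpleCookie()
    jar["session_id"] = "opaque-random-token"   # illustrative value
    jar["session_id"]["path"] = "/"
    jar["session_id"]["secure"] = True          # never sent over plain HTTP
    jar["session_id"]["httponly"] = True        # not readable from page scripts

    # Emits a Set-Cookie header carrying the Path, Secure and HttpOnly attributes;
    # without Secure, this exact token rides along in cleartext on every request.
    print(jar.output())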

In the year since I wrote that, a number of major websites have "solved" the WiFi eavesdropping problem by making encrypted HTTPS web traffic either an account option or mandatory for all logged-in users.

For example, I just noticed that Twitter, transparently to me and presumably all other Twitter users, switched to an encrypted web connection by default. You can tell because most modern browsers show the address bar in green when the connection is encrypted.

[Image: Twitter's HTTPS encryption indicators in the browser address bar]

I initially resisted this as overkill, except for obvious targets like email (the skeleton key to all your online logins) and banking.

Yes, you can naively argue that every website should encrypt all their traffic all the time, but to me that's a "boil the sea" solution. I'd rather see a better, more secure identity protocol than ye olde HTTP cookies. I don't actually care if anyone sees the rest of my public activity on Stack Overflow; it's hardly a secret. But gee, I sure do care if they somehow sniff out my cookie and start running around doing stuff as me! Encrypting everything just to protect that one lousy cookie header seems like a whole lot of overkill to me.

Of course, there's no reason to encrypt traffic for anonymous, not-logged-in users, and Twitter doesn't. You get a plain old HTTP connection until you log in, at which point they automatically switch to HTTPS encryption. Makes sense.
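That switch-at-login behavior is straightforward to express server-side. Here is a small sketch of the pattern, written with Flask purely for illustration; the session key and redirect policy are made up, and a real deployment would also need to mark the session cookie Secure, as above.

    from flask import Flask, redirect, request, session

    app = Flask(__name__)
    app.secret_key = "replace-me"   # illustrative only

    @app.before_request
    def force_https_for_logged_in_users():
        # Anonymous traffic stays on plain HTTP; once a user is logged in,
        # bounce any unencrypted request to the HTTPS version of the same URL.
        if "user_id" in session and not request.is_secure:
            return redirect(request.url.replace("http://", "https://", 1), code=301)

The same check-and-redirect idea applies whatever the framework; the point is that it's a handful of lines, not an architectural rewrite, once the site can terminate TLS at all.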

It was totally painless for me, as a user, and it makes stealing my Twitter identity, or eavesdropping on my Twitter activity (as fascinating as I know that must sound), dramatically more difficult. I can't really construct a credible argument against doing this, even for something as relatively trivial as my Twitter account, and it has some definite benefits. So perhaps Twitter has the right idea here; maybe encrypted connections should be the default for all web sites. As tinfoil hat as this seemed to me a year ago, now I'm wondering if that might actually be the right thing to do for the long-term health of the overall web, too.

ENCRYPT ALL THE THINGS

Why not boil the sea, then? Let us encrypt all the things!

HTTPS isn't (that) expensive any more

Yes, in the hoary old days of the 1999 web, HTTPS was quite computationally expensive. But thanks to 13 years of Moore's Law, that's no longer the case. It's still more work to set up, yes, but consider the real-world case of Gmail:

In January this year (2010), Gmail switched to using HTTPS for everything by default. Previously it had been introduced as an option, but now all of our users use HTTPS to secure their email between their browsers and Google, all the time. In order to do this we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that.

HTTPS means The Man can't spy on your Internet

Since all the traffic between you and the websites you log in to would now be encrypted, the ability of nefarious evildoers to …

  • steal your identity cookie
  • peek at what you're doing
  • see what you've typed
  • interfere with the content you send and receive

… is, if not completely eliminated, drastically limited. Regardless of whether you're on open public WiFi or not.

Personally, I don't care too much if people see what I'm doing online since the whole point of a lot of what I do is to … let people see what I'm doing online. But I certainly don't subscribe to the dangerous idea that "only criminals have things to hide"; everyone deserves the right to personal privacy. And there are lots of repressive governments out there who wouldn't hesitate at the chance to spy on what their citizens do online, or worse. Much, much worse. Why not improve the Internet for all of them at once?

HTTPS goes faster now

Security always comes at a cost, and encrypting a web connection is no different. HTTPS is going to be inevitably slower than a regular HTTP connection. But how much slower? It used to be that encrypted content wouldn't be cached in some browsers, but that's no longer true. And Google's SPDY protocol, intended as a drop-in replacement for HTTP, even goes so far as to bake encryption in by default, and not just for better performance:

[It is a specific technical goal of SPDY to] make SSL the underlying transport protocol, for better security and compatibility with existing network infrastructure. Although SSL does introduce a latency penalty, we believe that the long-term future of the web depends on a secure network connection. In addition, the use of SSL is necessary to ensure that communication across existing proxies is not broken.
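On the caching point above, a server that wants its static, non-sensitive assets cached even when served over HTTPS can say so explicitly with ordinary Cache-Control headers. A small Flask-flavored sketch, with a made-up route and lifetime:

    from flask import Flask, send_from_directory

    app = Flask(__name__)

    @app.route("/assets/<path:filename>")
    def asset(filename):
        resp = send_from_directory("assets", filename)
        # These files carry nothing user-specific, so caches may keep them
        # even though they were delivered over an encrypted connection.
        resp.headers["Cache-Control"] = "public, max-age=31536000"
        return resp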

There's also SSL False Start, which requires a modern browser but reduces the painful latency inherent in the expensive, yet necessary, handshaking required to get encryption going. SSL encryption of HTTP will never be free, exactly, but it's certainly a lot faster than it used to be, and it's getting faster every year.

Bolting on encryption for logged-in users is by no means an easy thing to accomplish, particularly on large, established websites. You won't see me out there berating every public website for not offering encrypted connections yesterday because I know how much work it takes, and how much additional complexity it can add to an already busy team. Even though HTTPS is way easier now than it was even a few years ago, there are still plenty of tough gotchas: proxy caching, for example, becomes vastly harder when the proxies can no longer "see" what the encrypted traffic they are proxying is doing. Most sites these days are a broad mashup of content from different sources, and technically all of them need to be on HTTPS for a properly encrypted connection. Relatively underpowered and weakly connected mobile devices will pay a much steeper penalty, too.
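The mixed-content point is easy to underestimate: every script, stylesheet, image and iframe on the page has to come over HTTPS too, or the browser will warn or block. A rough way to get a feel for the problem is to scan a page's markup for sub-resources still referenced over plain http://; the sketch below is only a back-of-the-envelope check, not what a browser actually does.

    from html.parser import HTMLParser

    class MixedContentFinder(HTMLParser):
        # Flag sub-resources (src attributes, plus stylesheets pulled in via
        # <link href>) that are still referenced over unencrypted http://.
        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                is_subresource = name == "src" or (tag == "link" and name == "href")
                if is_subresource and value and value.startswith("http://"):
                    print(f"insecure reference in <{tag}>: {value}")

    page = """
    <img src="http://cdn.example.com/logo.png">
    <script src="https://cdn.example.com/app.js"></script>
    """
    MixedContentFinder().feed(page)
    # prints: insecure reference in <img>: http://cdn.example.com/logo.png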

Maybe not tomorrow, maybe not next year, but over the medium to long term, adopting encrypted web connections as the standard for logged-in users is the healthiest direction for the future of the web. We need to work toward making HTTPS easier, faster, and, most of all, the default for logged-in users.
