Abel is in the early stages of developing a closed-source financial app within a niche market. He is hiring his first employees, and he wants to take steps to ensure these new hires don't steal the code and run away. "I foresee disabling USB drives and DVD writers on my development machines," he writes. But will that be enough? Maybe a better question is: will that be too much?
The Linux Foundation has taken control of the open source Xen virtualization platform and enlisted a dozen industry giants in a quest to be the leading software for building cloud networks.
The 10-year-old Xen hypervisor was formerly a community project sponsored by Citrix, much as the Fedora operating system is a community project sponsored by Red Hat. Citrix was looking to place Xen into a vendor-neutral organization, however, and the Linux Foundation move was announced today. The list of companies that will "contribute to and guide the Xen Project" is impressive, including Amazon Web Services, AMD, Bromium, Calxeda, CA Technologies, Cisco, Citrix, Google, Intel, Oracle, Samsung, and Verizon.
Amazon is perhaps the most significant name on that list in regard to Xen. The Amazon Elastic Compute Cloud is likely the most widely used public infrastructure-as-a-service (IaaS) cloud, and it is built on Xen virtualization. Rackspace's public cloud also uses Xen. Linux Foundation Executive Director Jim Zemlin noted in his blog that Xen "is being deployed in public IaaS environments by some of the world's largest companies."
Tens of thousands of websites, some operated by The Los Angeles Times, Seagate, and other reputable companies, have recently come under the spell of "Darkleech," a mysterious exploitation toolkit that exposes visitors to potent malware attacks.
The ongoing attacks, estimated to have infected 20,000 websites in the past few weeks alone, are significant because of their success in targeting Apache, by far the Internet's most popular Web server software. Once it takes hold, Darkleech injects invisible code into webpages, which in turn surreptitiously opens a connection that exposes visitors to malicious third-party websites, researchers said. Although the attacks have been active since at least August, no one has been able to positively identify the weakness attackers are using to commandeer the Apache-based machines. Vulnerabilities in Plesk, cPanel, or other software used to administer websites are one possibility, but researchers aren't ruling out password cracking, social engineering, or attacks that exploit unknown bugs in frequently used applications and OSes.
Researchers also don't know precisely how many sites have been infected by Darkleech. The server malware employs a sophisticated array of conditions to determine when to inject malicious links into the webpages shown to end users. Visitors using IP addresses belonging to security and hosting firms are passed over, as are people who have recently been attacked or who don't access the pages from specific search queries. The ability of Darkleech to inject unique links on the fly is also hindering research into the elusive infection toolkit.
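The selective behavior described above can be sketched in code. The following is a hypothetical illustration of that kind of gating logic, not actual Darkleech code; the class name, the IP range standing in for security-firm addresses, and the search-engine list are all invented for the example:

```ruby
require 'ipaddr'
require 'set'

# Illustrative sketch of per-request injection gating as described by
# researchers: skip security/hosting-firm IPs, skip recently targeted
# visitors, and only fire for visitors arriving from search queries.
class InjectionGate
  # Example range standing in for known security-firm address space.
  SKIP_RANGES = [IPAddr.new('203.0.113.0/24')].freeze

  def initialize
    @recently_served = Set.new # visitors attacked once are passed over
  end

  def inject?(ip:, referer:)
    addr = IPAddr.new(ip)
    return false if SKIP_RANGES.any? { |range| range.include?(addr) }
    return false if @recently_served.include?(ip)
    # Only visitors who arrived via a search query (hypothetical pattern).
    return false unless referer&.match?(/\b(google|bing|yahoo)\./)
    @recently_served << ip
    true
  end
end
```

The point of gating like this is anti-analysis: a researcher revisiting a page, or scanning from a known address block, sees only clean pages, which is exactly why infection counts remain estimates.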
I've been a Microsoft developer for decades now. I weaned myself on various flavors of home computer Microsoft Basic, and I got my first paid programming gigs in Microsoft FoxPro, Microsoft Access, and Microsoft Visual Basic. I have seen the future of programming, my friends, and it is terrible CRUD apps running on Wintel boxes!
Of course, we went on to build Stack Overflow in Microsoft .NET. That's a big reason it's still as fast as it is. So one of the most frequently asked questions after we announced Discourse was:
Why didn't you build Discourse in .NET, too?
Let me be clear about something: I love .NET. One of the greatest thrills of my professional career was getting the opportunity to place a Coding Horror sticker in the hand of Anders Hejlsberg. Pardon my inner fanboy for a moment, but oh man I still get chills. There are maybe fifty world class computer language designers on the planet. Anders is the only one of them who built Turbo Pascal and Delphi. It is thanks to Anders' expert guidance that C# started out such a remarkably well designed language – literally what Java should have been on every conceivable level – and has continued to evolve in remarkably practical ways over the last 10 years, leveraging the strengths of other influential dynamically typed languages.
All that said, it's true that I intentionally chose not to use .NET for my next project. So you might expect to find an angry, righteous screed here about how much happier I am leaving the oppressive shackles of my Microsoft masters behind. Free at last, free at last, thank God almighty I'm free at last!
Like any pragmatic programmer, I pick the appropriate tool for the job at hand. And as much as I may love .NET, it would be an extraordinarily poor choice for a 100% open source project like Discourse. Why? Three reasons, mainly:
The licensing. My God, the licensing. It's not so much the money, as the infernal, mind-bending tax code level complexity involved in making sure all your software is properly licensed: determining what 'level' and 'edition' you are licensed at, who is licensed to use what, which servers are licensed... wait, what? Sorry, I passed out there for a minute when I was attacked by rabid licensing weasels.
I'm not inclined to make grand pronouncements about the future of software, but if anything kills off commercial software, let me tell you, it won't be open source software. They needn't bother. Commercial software will gleefully strangle itself to death on its own licensing terms.
The friction. If you want to build truly viable open source software, you need people to contribute to your project, so that it is a living, breathing, growing thing. And unless you can download all the software you need to hack on your project freely from all over the Internet, no strings attached, there's just … too much friction.
If Stack Overflow taught me anything, it is that we now live in a world where the next brilliant software engineer can come from anywhere on the planet. I'm talking places this ugly American programmer has never heard of, where they speak crazy nonsense moon languages I can't understand. But get this. Stand back while I blow your mind, people: these brilliant programmers still code in the same keywords we do! I know, crazy, right?
Getting up and running with a Microsoft stack is just plain too hard for a developer in, say, Argentina, or Nepal, or Bulgaria. Open source operating systems, languages, and tool chains are the great equalizer, the basis for the next great generation of programmers all over the world who are going to help us change the world.
The ecosystem. When I was at Stack Exchange we strove mightily to make as much of our infrastructure open source as we could. It was something that we made explicit in the compensation guidelines, this idea that we would all be (partially) judged by how much we could do in public, and try to leave behind as many useful, public artifacts of our work as we could. Because wasn't all of Stack Exchange itself, from the very first day, built on your Creative Commons contributions that we all share ownership of?
You can certainly build open source software in .NET. And many do. But it never feels natural. It never feels right. Nobody accepts your patch to a core .NET class library no matter how hard you try. It always feels like you're swimming upstream, in a world of small and large businesses using .NET that really aren't interested in sharing their code with the world – probably because they know it would suck if they did, anyway. It is just not a native part of the Microsoft .NET culture to make things open source, especially not the things that suck. If you are afraid the things you share will suck, that fear will render you incapable of truly and deeply giving back. The most, uh, delightful… bit of open source communities is how they aren't afraid to let it "all hang out", so to speak.
So as a result, for any given task in .NET you might have – if you're lucky – a choice of maybe two decent-ish libraries. Whereas in any popular open source language, you'll easily have a dozen choices for the same task. Yeah, maybe six of them will be broken, obsolete, useless, or downright crazy. But hey, even factoring in some natural open source spoilage, you're still ahead by a factor of three! A winner is you!
As I wrote five years ago:
I'm a pragmatist. For now, I choose to live in the Microsoft universe. But that doesn't mean I'm ignorant of how the other half lives. There's always more than one way to do it, and just because I chose one particular way doesn't make it the right way – or even a particularly good way. Choosing to be provincial and insular is a sure-fire path to ignorance. Learn how the other half lives. Get to know some developers who don't live in the exact same world you do. Find out what tools they're using, and why. If, after getting your feet wet on both sides of the fence, you decide the other half is living better and you want to join them, then I bid you a fond farewell.
I no longer live in the Microsoft universe. Right, wrong, good, evil, that's just how it turned out for the project we wanted to build.
However, I'd also be lying if I didn't mention that I truly believe the sort of project we are building in Discourse represents the future of most software. If you squint your eyes a little, I think you can see a future not too far in the distance where .NET is a specialized niche outside the mainstream.
But why Ruby? Well, the short and not very glamorous answer is that I had narrowed it down to either Python or Ruby, and my original co-founder Robin Ward has been building major Rails apps since 2006. So that clinched it.
I've always been a little intrigued by Ruby, mostly because of the absolutely gushing praise Steve Yegge had for the language way back in 2006. I've never forgotten this.
For the most part, Ruby took Perl's string processing and Unix integration as-is, meaning the syntax is identical, and so right there, before anything else happens, you already have the Best of Perl. And that's a great start, especially if you don't take the Rest of Perl.
But then Matz took the best of list processing from Lisp, and the best of OO from Smalltalk and other languages, and the best of iterators from CLU, and pretty much the best of everything from everyone.
And he somehow made it all work together so well that you don't even notice that it has all that stuff. I learned Ruby faster than any other language, out of maybe 30 or 40 total; it took me about 3 days before I was more comfortable using Ruby than I was in Perl, after eight years of Perl hacking. It's so consistent that you start being able to guess how things will work, and you're right most of the time. It's beautiful. And fun. And practical.
Steve is one of those polyglot programmers I respect so much that I basically just take whatever his opinion is, provided it's not about something wacky like gun control or feminism or T'Pau, and accept it as fact.
I apologize, Steve. I'm sorry it took me 7 years to get around to Ruby. But maybe I was better off waiting a while anyway:
Ruby is a decent performer, but you really need to throw fast hardware at it for good performance. Yeah, I know, interpreted languages are what they are, and caching, database, network, blah blah blah. Still, we obtained the absolute fastest CPUs you could buy for the Discourse servers, 4.0 GHz Ivy Bridge Xeons, and performance is just … good on today's fastest hardware. Not great. Good.
Yes, I'll admit that I am utterly spoiled by the JIT compiled performance of .NET. That's what I am used to. I do sometimes pine away for the bad old days of .NET when we could build pages that serve in well under 50 milliseconds without thinking about it too hard. Interpreted languages aren't going to be able to reach those performance levels. But I can only imagine how rough Ruby performance had to be back in the dark ages of 2006 when CPUs and servers were five times slower than they are today! I'm so very glad that I am hitting Ruby now, with the strong wind of many solid years of Moore's law at our backs.
Ruby is maturing up nicely in the 2.0 language release, which happened not more than a month after Discourse was announced. So, yes, the downside is that Ruby is slow. But the upside is there is a lot of low hanging performance fruit in Ruby-land. Like.. a lot a lot. On Discourse we got an across the board 20% performance improvement just upgrading to Ruby 2.0, and we nearly doubled our performance by increasing the default Ruby garbage collection limit. From a future performance perspective, Ruby is nothing but upside.
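The garbage collection tuning mentioned above is done in MRI 1.9/2.0 via the `RUBY_GC_MALLOC_LIMIT` environment variable, which controls how much allocation happens before a collection is forced. A quick way to see whether raising it helps is to count GC runs around an allocation-heavy workload; the 90,000,000 figure below is an illustrative value, not necessarily the limit Discourse used:

```ruby
# gc_check.rb -- count how many GC cycles an allocation-heavy workload triggers.
# Run it twice and compare the counts:
#   ruby gc_check.rb
#   RUBY_GC_MALLOC_LIMIT=90000000 ruby gc_check.rb   # raised limit (illustrative value)
before = GC.count
100_000.times { "x" * 100 } # stand-in for real per-request allocation churn
gcs = GC.count - before
puts "GC runs during workload: #{gcs}"
```

With the default limit MRI collects frequently; raising it trades a larger heap for fewer collection pauses, which is where much of that "low hanging performance fruit" lives.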
Ruby isn't cool any more. Yeah, you heard me. It's not cool to write Ruby code any more. All the cool people moved on to slinging Scala and Node.js years ago. Our project isn't cool, it's just a bunch of boring old Ruby code. Personally, I'm thrilled that Ruby is now mature enough that the community no longer needs to bother with the pretense of being the coolest kid on the block. That means the rest of us who just like to Get Shit Done can roll up our sleeves and focus on the mission of building stuff with our peers rather than frantically running around trying to suss out the next shiny thing.
And of course the Ruby community is, and always has been, amazing. We never want for great open source gems and great open source contributors. Now is a fantastic time to get into Ruby, in my opinion, whatever your background is.
Even if done in good will and for the best interests of the project, it's still a little scary to totally change your programming stripes overnight after two decades. I've always believed that great programmers learn to love more than one language and programming environment – and I hope the Discourse project is an opportunity for everyone to learn and grow, not just me. So go fork us on GitHub already!
[Image: A representation of how TLS works. Credit: Nadhem J. AlFardan and Kenneth G. Paterson]
Software developers are racing to patch a recently discovered vulnerability that allows attackers to recover the plaintext of authentication cookies and other encrypted data as they travel over the Internet and other unsecured networks.
The discovery is significant because in many cases it makes it possible for attackers to completely subvert the protection provided by the secure sockets layer and transport layer security protocols. Together, SSL, TLS, and a close TLS relative known as Datagram Transport Layer Security are the sole cryptographic means for websites to prove their authenticity and to encrypt data as it travels between end users and Web servers. The so-called "Lucky Thirteen" attacks devised by computer scientists to exploit the weaknesses work against virtually all open-source TLS implementations, and possibly implementations supported by Apple and Cisco Systems as well. (Microsoft told the researchers it has determined its software isn't susceptible.)
The attacks are extremely complex, so for the time being, average end users are probably more susceptible to attacks that use phishing e-mails or rely on fraudulently issued digital certificates to defeat the Web encryption protection. Nonetheless, the success of the cryptographers' exploits—including the full plaintext recovery of data protected by the widely used OpenSSL implementation—has clearly gotten the attention of the developers who maintain those programs. Already, the Opera browser and PolarSSL have been patched to plug the hole, and developers for OpenSSL, NSS, and CyaSSL are expected to issue updates soon.
hypnosec writes "Following news that a Java 0-day has been rolled into exploit kits, without any patch to fix the vulnerability, Mozilla and Apple have blocked the latest versions of Java on Firefox and Mac OS X respectively. To protect its user base from the yet-unpatched vulnerability, Mozilla has added Java 7 Update 10, Java 7 Update 9, Java 6 Update 38, and Java 6 Update 37 to its Firefox add-on block-list. Apple has taken similar steps; it has updated its anti-malware system to require a newer, patched version of the Java plugin, thereby automatically blocking the vulnerable versions." Here are some ways to disable Java, if you're not sure how.
We all know about the patent wars that have dominated the mobile industry over the last couple of years.
Apple and Samsung, Motorola and Microsoft, Oracle and Google — to name just a few. But there are also the patent disputes that you never hear about.
Many of these lawsuits are filed by little-known companies whose sole reason for being is to bring patent actions and collect money for their owners. Often dubbed patent trolls, such non-practicing entities now make up the bulk of patent suits.
Within the broader category of non-practicing entities are different types of firms, including defensive patent collectors and start-ups, as well as companies whose sole business is suing companies with products on the market. That last category now accounts for more than three-fifths of all patent actions, according to a study by Santa Clara University Law School professor Colleen V. Chien.
Chien, who presented her findings at a Department of Justice/Federal Trade Commission event on Monday, said that while the economics of bringing suit help keep overall patent actions in check, the economies of scale have made patent trolling into a profitable business.
First of all, while companies that make goods are typically countersued for infringing on their target’s patents, non-practicing entities don’t make anything and therefore can’t be countersued.
Secondly, while big companies like Apple, Samsung and Google rack up huge legal fees in their battles, non-practicing entities have found a more cost-effective option. Much like injury victims, the patent firms often find lawyers willing to work on a contingency basis.
That leaves the companies with only the direct expenses related to their lawsuits, which are themselves often minimized by filing multiple similar suits against different companies. That spreads out the costs and lessens the impact of losing any one case.
As a result, the incentives that may be forcing deals such as Apple’s recent settlement with HTC aren’t having the same effect on the non-practicing entities.
“The assumption is that companies will eventually tire of the smartphone wars between operating companies,” Chien told AllThingsD. “Suits invite countersuits and are expensive, disruptive, and messy. These restraints don’t apply to companies that assert patents as a business model.”
And for every suit brought, there are dozens more that get settled before a court action is filed, in large part because the targets know it is cheaper to settle in many cases than to fight things out.
While many of these non-practicing entities have names few people have ever heard of, the field has spawned some big players, perhaps most notably Nathan Myhrvold’s Intellectual Ventures. (Several spinoff businesses have come out of Myhrvold’s firm, which touts its in-house invention capabilities in addition to its collection of acquired patents.)
Even start-ups, particularly well-funded ones, are finding themselves in the crosshairs, Chien said.