What data structure is more sacred than the linked list? If we got rid of it, what silly interview questions would we use instead? But not using linked lists is exactly what Aater Suleman recommends in Should you ever use Linked-Lists?

In The Secret To 10 Million Concurrent Connections, one of the important strategies is to avoid scribbling data all over memory via pointers, because chasing pointers causes cache misses, which **reduce performance**. And nothing is more iconic of pointers than the linked list.
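
To see the access pattern Suleman is objecting to, compare the same traversal over both layouts. This is an illustrative sketch, not a benchmark — in CPython everything involves pointers anyway, so the cache effect only materializes in a language like C — but the shape of the problem is visible: the linked version chases a pointer per element, while the array version walks storage sequentially.

```python
# Illustrative sketch only: shows the *access pattern*, not real cache
# behavior. In a language like C, sum_array touches contiguous memory
# while sum_linked hops to a fresh allocation per element.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def sum_linked(head):
    total = 0
    while head is not None:   # one pointer chase per element
        total += head.value
        head = head.next
    return total

def sum_array(values):
    total = 0
    for v in values:          # sequential walk over one allocation
        total += v
    return total

# Same data, two layouts.
data = list(range(10))
head = None
for v in reversed(data):
    head = Node(v, head)

print(sum_array(data), sum_linked(head))  # → 45 45
```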

Here are Aater's reasons to be anti-linked-list:

What was Microsoft's original mission?

In 1975, Gates and Allen form a partnership called Microsoft. Like most startups, Microsoft begins small, but has a huge vision – a computer on every desktop and in every home.

The existential crisis facing Microsoft is that *they achieved their mission years ago*, at least as far as the developed world is concerned. **When was the last time you saw a desktop or a home without a computer? 2001? 2005?** We're long since past the point where Microsoft's original BHAG was met, and even exceeded. PCs are absolutely ubiquitous. When you wake up one day to discover that you've completely conquered the world … what comes next?

Apparently, the Post PC era.

Microsoft never seemed to recover from the shock of achieving their original 1975 goal. Or perhaps they thought that they hadn't quite achieved it, that there would always be some new frontier for PCs to conquer. But Steve Jobs certainly saw the Post PC era looming as far back as 1996:

The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That's over. Apple lost. The desktop market has entered the dark ages, and it's going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.

If I were running Apple, I would milk the Macintosh for all it's worth – and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.

What's more, Jobs did something about it. Apple is arguably the biggest (and in terms of financials, now *literally* the biggest) enemy of general purpose computing with the iPhone and iPad. These days, their own general purpose Mac operating system, OS X, largely plays second fiddle to the iOS juggernaut powering the iPhone and iPad.

The slope of this graph is the whole story. The complicated general purpose computers are at the bottom, and the simpler specialized computers are at the top.

I'm incredibly conflicted, because as much as I love the do-anything computer …

- I'm not sure that many people in the world truly *need* a general purpose computer that can do anything and install any kind of software. Simply meeting the core needs of browsing the web and email and maybe a few other basic things covers a lot of people.
- I believe the kitchen-sink-itis baked into the general purpose computing foundations of PCs, Macs, and Unix makes them fundamentally incompatible with our brave new Post PC world. Updates. Toolbars. Service Packs. Settings. Anti-virus. Filesystems. Control panels. All the stuff you hate when your Mom calls you for tech support? It's deeply embedded in the culture and design of every single general purpose computer. Doing potentially "anything" comes at a steep cost in complexity.
- Very, very small PCs – the kind you could fit in your pocket – are starting to have the same amount of computing grunt as a high end desktop PC of, say, 5 years ago. And that was plenty, even back then, for a relatively inefficient general purpose operating system.

But the primary wake up call, at least for me, is that **the new iPad finally delivered an innovation that general purpose computing has been waiting on for thirty years**: a truly high resolution display at a reasonable size and price. In 2007 I asked where all the high resolution displays were. Turns out, they're only on phones and tablets.

That's why I didn't just buy the iPad 3 (sorry, The New iPad). **I bought two of them.** And I reserve the right to buy more!

iPad 3 reviews that complain "all they did was improve the display" are clueless bordering on stupidity. Tablets are pretty much *by definition all display*; nothing is more fundamental to the tablet experience than the quality of the display. These are the first iPads I've ever owned (and I'd argue, the first worth owning), and the display is as sublime as I always hoped it would be. The resolution and clarity are astounding, a joy to read on, and give me hope that one day we could potentially achieve near print resolution in computing. The new iPad screen is everything I've always wanted on my desktops and laptops for the last 5 years, but I could never get.

Don't take my word for it. Consider what screen reading pioneer and inventor of ClearType, Bill Hill, has to say about it:

The 3rd Generation iPad has a display resolution of 264ppi. And still retains a ten-hour battery life (9 hours with wireless on). Make no mistake. That much resolution is stunning. To see it on a mainstream device like the iPad - rather than a $13,000 exotic monitor - is truly amazing, and something I've been waiting more than a decade to see.

It will set a bar for future resolution that every other manufacturer of devices and PCs will have to jump.

And the display calibration experts at DisplayMate have the measurements and metrics to back these claims up, too:

… the new iPad’s picture quality, color accuracy, and gray scale are not only much better than any other Tablet or Smartphone, it’s also much better than most HDTVs, laptops, and monitors. In fact with some minor calibration tweaks the new iPad would qualify as a studio reference monitor.

Granted, this is happening on tiny 4" and 10" screens first due to sheer economics. It will take time for it to trickle up. I shudder to think what a 24 or 27 inch display using the same technology as the current iPad would cost right now. But until the iPhone and iPad, near as I can tell, *nobody else was even trying* to improve resolution on computer displays – even though all the existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing.

At the point where these simple, fixed function Post-PC era computing devices are not just "enough" computer for most folks, but also **fundamentally innovating in computing as a whole** … well, all I can say is bring on the post-PC era.

First time accepted submitter osman84 writes "I've been developing web/mobile apps for some time, and have managed to build up some decent experience about usability. However, as I'm growing a team of developers now, I've noticed that most of the young ones have a very poor sense of usability. Unfortunately, since I was never really taught usability as science, I'm having trouble teaching them to develop usable apps. Are there any good books that make a good read for general usability guidelines for web/mobile apps? I have a couple from my college days, but I'd like something more recent, written in the era of mobile apps, etc."

Read more of this story at Slashdot.

Have you ever heard a software engineer refer to a problem as "NP-complete"? That's fancy computer science jargon shorthand for "incredibly hard":

The most notable characteristic of NP-complete problems is that no fast solution to them is known; that is, the time required to solve the problem using any currently known algorithm increases very quickly as the size of the problem grows. As a result, **the time required to solve even moderately large versions of many of these problems easily reaches into the billions or trillions of years**, using any amount of computing power available today. As a consequence, determining whether or not it is possible to solve these problems quickly is one of the principal unsolved problems in Computer Science today.

While a method for computing the solutions to NP-complete problems using a reasonable amount of time remains undiscovered, computer scientists and programmers still frequently encounter NP-complete problems. An expert programmer should be able to recognize an NP-complete problem so that he or she does not unknowingly waste time trying to solve a problem which so far has eluded generations of computer scientists.

You do want to be an *expert* programmer, don't you? Of course you do!

NP-complete problems are like hardcore pornography. Nobody can define what makes a problem NP-complete, exactly, but you'll know it when you see it. Just this once, I'll refrain from my usual practice of inserting images to illustrate my point.

(Update: I was shooting for a poetic allusion to the P=NP problem here but based on the comments this is confusing and arguably incorrect. So I'll redact this sentence. Instead, I point you to this P=NP poll (pdf); read the comments from CS professors (including Knuth) to get an idea of how realistic this might be.)

Instead, I'll recommend a book Anthony Scian recommended to me: Computers and Intractability: A Guide to the Theory of NP-Completeness.

Like all the software engineering books I recommend, this book has a timeless quality. It was originally published in 1979, a shining testament to smart people attacking truly difficult problems in computer science: "I can't find an efficient algorithm, but neither can all these famous people."

So how many problems are NP-complete? Lots.

Even if you're a layman, you might have experienced NP-completeness in the form of Minesweeper, as Ian Stewart explains. But for programmers, I'd argue the best known NP-complete problem is the travelling salesman problem.

Given a number of cities and the costs of travelling from any city to any other city, what is the least-cost round-trip route that visits each city exactly once and then returns to the starting city?

The brute-force solution -- trying every possible permutation of the cities -- might work for a very small network of cities, but this quickly becomes untenable. Even if we were to use theoretical CPUs our children might own, or our children's children. What's worse, every other algorithm we come up with to find an optimal path for the salesman has the same problem. That's the common characteristic of NP-complete problems: they are **exercises in heuristics and approximation**, as illustrated by this xkcd cartoon:
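
A minimal sketch of that brute force, assuming a simple cost-matrix representation (the function name and the `dist` matrix are my illustration, not library code):

```python
# Brute-force TSP: enumerate every round trip and keep the cheapest.
from itertools import permutations

def brute_force_tsp(dist):
    """dist[i][j] is the cost of travelling from city i to city j."""
    n = len(dist)
    best = None
    # Fix city 0 as the start so rotations of the same tour aren't
    # recounted; that still leaves (n-1)! candidate tours.
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if best is None or cost < best[0]:
            best = (cost, tour)
    return best

# Four cities: only 3! = 6 tours. Ten cities: 362,880. Twenty: ~1.2e17.
dist = [
    [0, 2, 9, 10],
    [1, 0, 6, 4],
    [15, 7, 0, 8],
    [6, 3, 12, 0],
]
print(brute_force_tsp(dist))  # → (21, (0, 2, 3, 1, 0))
```

The factorial in the comment is the whole story: each extra city multiplies the running time, which is exactly the "increases very quickly as the size of the problem grows" behavior described above.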

What do *expert* programmers do when faced with an intractable problem? **They cheat**. And so should you! Indeed, some of the modern approximations for the Travelling Salesman Problem are *remarkably* effective.

Various approximation algorithms, which quickly yield good solutions with high probability, have been devised. Modern methods can find solutions for extremely large problems (millions of cities) within a reasonable time, with a high probability of being just 2-3% away from the optimal solution.
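
Those sophisticated methods are elaborate, but even the crudest possible cheat shows the shape of the approach. Here's a greedy nearest-neighbour sketch (my illustration, not one of the modern methods the quote refers to): from the current city, always hop to the closest unvisited one.

```python
# Nearest-neighbour heuristic: O(n^2) instead of O(n!), at the price
# of a tour that is usually good but rarely optimal.
def nearest_neighbour_tour(dist, start=0):
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour, current = [start], start
    while unvisited:
        current = min(unvisited, key=lambda city: dist[current][city])
        tour.append(current)
        unvisited.remove(current)
    tour.append(start)  # close the loop back home
    cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
    return cost, tour

dist = [
    [0, 2, 9, 10],
    [1, 0, 6, 4],
    [15, 7, 0, 8],
    [6, 3, 12, 0],
]
print(nearest_neighbour_tour(dist))  # → (33, [0, 1, 3, 2, 0])
```

Exhaustive search finds a cost-21 tour for this little matrix, so greedy overshoots noticeably here; the serious approximation algorithms earn their 2-3% figures by refining tours like this with local improvement steps.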

Unfortunately, not all NP-complete problems have good approximations. But for those that do, I have to wonder: if we can get so close to an optimal solution by cheating, does it really matter if there's no known algorithm to produce *the* optimal solution? If I've learned nothing else from NP-complete problems, I've learned this: **sometimes coming up with clever cheats can be more interesting than searching in vain for the perfect solution**.

Consider the First Fit Decreasing algorithm for the NP-complete Bin Packing problem. It's not perfect, but it's incredibly simple and fast. The algorithm is so simple, in fact, it is regularly demonstrated at time management seminars. Oh, *and* it guarantees that you will get within 22% of the perfect solution every time. Not bad for a lousy cheat.
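
The whole cheat fits in a dozen lines (function name mine; the algorithm is exactly the sort-then-place rule): sort the items largest-first, then drop each one into the first bin that still has room, opening a new bin when none does.

```python
# First Fit Decreasing for bin packing.
def first_fit_decreasing(items, capacity):
    bins = []  # each bin is a list of item sizes
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:                      # no break: nothing had room
            bins.append([item])
    return bins

print(first_fit_decreasing([5, 7, 5, 2, 4, 2, 5], 10))
# → [[7, 2], [5, 5], [5, 4], [2]]
```

Sorting descending is what buys the guarantee: placing the big awkward items first leaves the small ones to fill the gaps.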

So what's *your* favorite NP-complete cheat?
