
display technology

Original author: 
Andrew Cunningham


I log some face-on time with Glass at Google I/O.

Florence Ion

"When you're at a concert and the band takes the stage, nowadays 50,000 phones and tablets go into the air," said Google Senior Development Advocate Timothy Jordan in the first Google Glass session of this year's Google I/O. "Which isn't all that weird, except that people seem to be looking at the tablets more than they are the folks onstage or the experience that they're having. It's crazy because we love what technology gives us, but it's a bummer when it gets in the way, when it gets between us and our lives, and that's what Glass is addressing."

The upshot of this perspective is that Glass and its software are designed for quick use. You fire it up, do what you want to do, and get back to your business without diving into your pocket for your phone, unlocking it, and so on. Whether this process is more distracting than talking to someone with Glass strapped to his or her face is another conversation, but this is the problem Google is attempting to solve.

Since Google I/O is a developer conference, the Glass sessions didn't focus on the social implications of using Glass or the privacy questions some have raised. Rather, the focus was on how to make applications for this new type of device, one designed to give you what you want at a moment's notice and then get out of the way. Here's a quick look at what that ethos does to the platform's applications.


Original author: 
Andrew Cunningham


The Galaxy S 4's display is a sizable step forward for PenTile AMOLED, according to DisplayMate's Raymond Soneira.

Florence Ion

We've already given you our subjective impressions of Samsung's Galaxy S 4 and its 1080p AMOLED display, but for those of you who hunger for quantitative data, Dr. Raymond Soneira of DisplayMate has given the phone an in-depth shakedown. Soneira compares the screen's brightness, contrast, color gamut, and power consumption to both the Galaxy S III (which also uses an AMOLED display) and the IPS panel in the iPhone 5. What he found was that Samsung's AMOLED technology is still fighting against some of its inherent weaknesses, but it has made great strides forward even since the Galaxy S III was released last year.

To recap: both the S III and S 4 use PenTile AMOLED screens, which use a slightly different pixel arrangement than traditional LCD screens. A pixel in a standard LCD panel has one red, one green, and one blue subpixel; PenTile uses an alternating red-green-blue-green arrangement, taking advantage of the eye's sensitivity to green to display the same image using fewer total subpixels. These screens cost less to manufacture but can have issues with color accuracy and text crispness. The two display types are also lit differently: white LEDs behind the iPhone's display shine through its red, green, and blue subpixels to create an image, while AMOLED subpixels emit their own light and need no backlight at all. This has implications for brightness, contrast, and power consumption.
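To put numbers on that, here's a quick back-of-the-envelope sketch in Python of how many subpixels each layout needs for a 1080p image; the two-per-pixel figure for RGBG PenTile follows from each pixel owning its own green subpixel while sharing red and blue with its neighbors:

```python
# Back-of-the-envelope subpixel counts for a 1920x1080 panel.
# RGB stripe: every pixel gets its own red, green, and blue subpixel.
# RGBG PenTile: every pixel gets a green subpixel, but red and blue
# alternate between pixels, averaging two subpixels per pixel.

width, height = 1920, 1080
pixels = width * height

rgb_stripe = pixels * 3  # 6,220,800 subpixels
pentile = pixels * 2     # 4,147,200 subpixels

print(f"RGB stripe:   {rgb_stripe:,} subpixels")
print(f"RGBG PenTile: {pentile:,} subpixels")
print(f"PenTile needs {pentile / rgb_stripe:.0%} as many")  # ~67%
```

That one-third saving in subpixels is the source of the manufacturing cost advantage mentioned above, and the sparser red and blue grid is why text crispness has historically suffered.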


A close-up shot of PenTile AMOLED in the Nexus One, when the tech was much less mature. Luke Hutchinson

We'll try to boil Soneira's findings down to their essence. One of the S 4's advantages over its predecessor is (obviously) its pixel density, which at 441 ppi is considerably higher than that of either the S III or the iPhone 5. Soneira says this helps it overcome the imbalance between PenTile's green subpixels and its less numerous red and blue ones, all but banishing PenTile's "fuzzy text" issues.
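For reference, pixel density is just the diagonal pixel count divided by the diagonal size in inches; a quick check (assuming the Galaxy S 4's 4.99-inch, 1920×1080 panel) reproduces the 441 ppi figure:

```python
import math

# ppi = diagonal resolution in pixels / diagonal size in inches.
# Assumes the Galaxy S 4's 4.99-inch, 1920x1080 panel.
width_px, height_px = 1920, 1080
diagonal_inches = 4.99

diagonal_px = math.hypot(width_px, height_px)  # ~2203 pixels
ppi = diagonal_px / diagonal_inches
print(f"{ppi:.0f} ppi")  # -> 441
```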


Located on a rather nondescript industrial estate in a suburb of Leicester, you'll find an equally nondescript warehouse unit. Nestled amongst the usual glut of logistics companies and scrap metal merchants, the building in question once housed a firm that was poised to dramatically alter the world of interactive entertainment as we know it, one that worked with such illustrious partners as Sega, Atari, Ford, and IBM.

That company was Virtuality. Founded by a dashing and charismatic PhD graduate by the name of Jonathan D. Waldern, it placed the UK at the vanguard of a Virtual Reality revolution that captured the imagination of millions before collapsing spectacularly amid unfulfilled promises and public apathy.

The genesis of VR lies a few years before Virtuality's birth in its grey and uninspiring industrial surroundings. The technology was born outside the entertainment industry, with NASA and the US Air Force cooking up what would prove to be the first VR systems, intended primarily for training and research. The late '80s and very early '90s saw much academic interest in the potential of VR, but typically, it took a slice of Hollywood hokum to really launch the concept into the global consciousness and create a new buzzword for the masses.


What was Microsoft's original mission?

In 1975, Gates and Allen form a partnership called Microsoft. Like most startups, Microsoft begins small, but has a huge vision – a computer on every desktop and in every home.

The existential crisis facing Microsoft is that they achieved their mission years ago, at least as far as the developed world is concerned. When was the last time you saw a desktop or a home without a computer? 2001? 2005? We're long since past the point where Microsoft's original BHAG (Big Hairy Audacious Goal) was met, and even exceeded. PCs are absolutely ubiquitous. When you wake up one day to discover that you've completely conquered the world … what comes next?

Apparently, the Post PC era.

Microsoft never seemed to recover from the shock of achieving their original 1975 goal. Or perhaps they thought that they hadn't quite achieved it, that there would always be some new frontier for PCs to conquer. But Steve Jobs certainly saw the Post PC era looming as far back as 1996:

The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That's over. Apple lost. The desktop market has entered the dark ages, and it's going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.

If I were running Apple, I would milk the Macintosh for all it's worth – and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.

What's more, Jobs did something about it. Apple is arguably the biggest (and in terms of financials, now literally the biggest) enemy of general purpose computing with the iPhone and iPad. These days, their own general purpose Mac operating system, OS X, largely plays second fiddle to the iOS juggernaut powering the iPhone and iPad.

Here's why:

[Chart: Apple's cumulative sales by product, with the general purpose Macs at the bottom and the simpler, specialized iPhone and iPad at the top]

The slope of this graph is the whole story. The complicated general purpose computers are at the bottom, and the simpler specialized computers are at the top.

I'm incredibly conflicted, because as much as I love the do-anything computer …

  • I'm not sure that many people in the world truly need a general purpose computer that can do anything and install any kind of software. Simply meeting the core needs of browsing the web and email and maybe a few other basic things covers a lot of people.
  • I believe the kitchen-sink-itis baked into the general purpose computing foundations of PCs, Macs, and Unix makes them fundamentally incompatible with our brave new Post PC world. Updates. Toolbars. Service Packs. Settings. Anti-virus. Filesystems. Control panels. All the stuff you hate when your Mom calls you for tech support? It's deeply embedded in the culture and design of every single general purpose computer. Doing potentially "anything" comes at a steep cost in complexity.
  • Very, very small PCs – the kind you could fit in your pocket – are starting to have the same amount of computing grunt as a high end desktop PC of, say, 5 years ago. And that was plenty, even back then, for a relatively inefficient general purpose operating system.

But the primary wake up call, at least for me, is that the new iPad finally delivered an innovation that general purpose computing has been waiting on for thirty years: a truly high resolution display at a reasonable size and price. In 2007 I asked where all the high resolution displays were. Turns out, they're only on phones and tablets.

[Close-up comparison: iPad 2 display vs. iPad 3 display]

That's why I didn't just buy the iPad 3 (sorry, The New iPad). I bought two of them. And I reserve the right to buy more!

iPad 3 reviews that complain "all they did was improve the display" are clueless bordering on stupidity. Tablets are pretty much by definition all display; nothing is more fundamental to the tablet experience than the quality of the display. These are the first iPads I've ever owned (and I'd argue, the first worth owning), and the display is as sublime as I always hoped it would be. The resolution and clarity are astounding, a joy to read on, and give me hope that one day we might achieve near-print resolution in computing. The new iPad screen is everything I've wanted on my desktops and laptops for the last 5 years but could never get.

Don't take my word for it. Consider what screen-reading pioneer and ClearType inventor Bill Hill has to say about it:

The 3rd Generation iPad has a display resolution of 264ppi. And still retains a ten-hour battery life (9 hours with wireless on). Make no mistake. That much resolution is stunning. To see it on a mainstream device like the iPad - rather than a $13,000 exotic monitor - is truly amazing, and something I've been waiting more than a decade to see.

It will set a bar for future resolution that every other manufacturer of devices and PCs will have to jump.

And the display calibration experts at DisplayMate have the measurements and metrics to back these claims up, too:

… the new iPad’s picture quality, color accuracy, and gray scale are not only much better than any other Tablet or Smartphone, it’s also much better than most HDTVs, laptops, and monitors. In fact with some minor calibration tweaks the new iPad would qualify as a studio reference monitor.

Granted, this is happening on tiny 4" and 10" screens first due to sheer economics. It will take time for it to trickle up. I shudder to think what a 24 or 27 inch display using the same technology as the current iPad would cost right now. But until the iPhone and iPad, near as I can tell, nobody else was even trying to improve resolution on computer displays – even though all the existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing.
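To put a number on that shudder, here's a rough sketch of what a 27-inch desktop panel at the iPad 3's 264 ppi would have to drive (assuming a 16:9 aspect ratio; the figures are illustrative, not from any real product):

```python
import math

# What would a 27-inch desktop panel at the iPad 3's 264 ppi require?
# Assumes a 16:9 aspect ratio purely for illustration.
diagonal_inches = 27
ppi = 264
aspect_w, aspect_h = 16, 9

unit_diagonal = math.hypot(aspect_w, aspect_h)              # ~18.36
width_inches = diagonal_inches * aspect_w / unit_diagonal   # ~23.5 in
height_inches = diagonal_inches * aspect_h / unit_diagonal  # ~13.2 in

width_px = round(width_inches * ppi)    # ~6213
height_px = round(height_inches * ppi)  # ~3495
print(f"{width_px} x {height_px} (~{width_px * height_px / 1e6:.0f} MP)")
```

That works out to roughly 22 megapixels, nearly six times the pixels of the 2560×1440 panels shipping in 27-inch monitors at the time; it's not hard to see why the economics favored small screens first.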

At the point where these simple, fixed function Post-PC era computing devices are not just "enough" computer for most folks, but also fundamentally innovating in computing as a whole … well, all I can say is bring on the post-PC era.

