
Stylus

Original author: Andrew Cunningham

Aurich Lawson / Thinkstock

Welcome back to our three-part series on touchscreen technology. Last time, Florence Ion walked you through the technology's past, from the invention of the first touchscreens in the 1960s all the way up through the mid-2000s. During this period, different versions of the technology appeared in everything from PCs to early cell phones to personal digital assistants like Apple's Newton and the Palm Pilot. But all of these gadgets proved to be little more than a tease, a prelude to the main event. In this second part in our series, we'll be talking about touchscreens in the here-and-now.

When you think about touchscreens today, you probably think about smartphones and tablets, and for good reason. The 2007 introduction of the iPhone kicked off a transformation that turned a couple of niche products—smartphones and tablets—into billion-dollar industries. Fierce competition among software platforms like Android and Windows Phone (as well as hardware makers like Samsung and a host of others) means that new products are being introduced at a frantic pace.

The screens themselves are just one of the driving forces that make these devices possible (and successful). Ever-smaller, ever-faster chips allow a phone to do things only a heavy-duty desktop could do just a decade or so ago, something we've discussed in detail elsewhere. The software that powers these devices is more important, though. Where older tablets and PDAs required a stylus or interaction with a cramped physical keyboard or trackball, mobile software has adapted to be better suited to humans' native pointing device—the larger, clumsier, but much more convenient finger.


Original author: Scott Gilbertson

Hybrids. Image: Screenshot/Webmonkey.

The advent of hybrid laptops that double as tablets or offer some sort of touch input has greatly complicated the lives of web developers.

A big part of developing for today's myriad screens is knowing when to adjust the interface, based not just on screen size but also on other details, like the input device. Fingers are far less precise than a mouse, which means bigger buttons, form fields, and other input areas.

But with hybrid devices like touchscreen Windows 8 laptops or dockable Android tablets with keyboards, how do you know whether the user is browsing with a mouse or a finger?

Over on the Mozilla Hacks blog, Patrick Lauke tackles that question in an article on detecting touch-capable devices. Lauke covers the relatively simple case of touch-only devices, like those running iOS, before diving into the far more complex problem of hybrids.

Lauke’s answer? If developing for the web hasn’t already taught you this lesson, perhaps hybrid devices will — learn to live with uncertainty and accept that you can’t control everything.

What’s the solution to this new conundrum of touch-capable devices that may also have other input methods? While some developers have started to look at complementing a touch feature detection with additional user agent sniffing, I believe that the answer – as in so many other cases in web development – is to accept that we can’t fully detect or control how our users will interact with our web sites and applications, and to be input-agnostic. Instead of making assumptions, our code should cater for all eventualities.
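In practice, being input-agnostic usually means listening for every relevant event family and normalizing them to one shape, rather than branching on a touch-detection test. A minimal sketch (the helper names here are hypothetical, not from Lauke's article):

```javascript
// Extract a {x, y} point from either a mouse event or a touch event.
// Touch events carry a list of active touches; we take the first one.
function normalizePoint(event) {
  var source = event.touches ? event.touches[0] : event;
  return { x: source.clientX, y: source.clientY };
}

// Bind the same handler to both input families on an element, so the
// rest of the code never needs to know which device the user has.
function onPress(element, handler) {
  var wrapped = function (event) {
    handler(normalizePoint(event));
  };
  element.addEventListener('touchstart', wrapped);
  element.addEventListener('mousedown', wrapped);
}
```

A hybrid device simply fires whichever event matches the input the user happens to be using at that moment, and the handler works either way.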

While learning to live with uncertainty and providing interfaces that work with any input sounds nice in theory, developers are bound to want something a bit more concrete. There’s some hope on the horizon. Microsoft has proposed the Pointer Events spec (and created a build of Webkit that supports it). And the CSS Media Queries Level 4 spec will offer a pointer query to see what sort of input device is being used (mouse, finger, stylus etc).
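As a sketch of how the Level 4 pointer query is meant to work (syntax per the draft spec; again, no shipping browser supported it at the time of writing), a stylesheet could widen tap targets only when the primary pointer is imprecise:

```css
/* Media Queries Level 4 draft: "coarse" roughly means a finger,
   "fine" a mouse or stylus. Values here are illustrative. */
@media (pointer: coarse) {
  button, input, select {
    min-height: 44px; /* larger tap targets for fingers */
    font-size: 1.2em;
  }
}
```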

Unfortunately, neither Pointer Events nor Media Queries Level 4 is supported in today's browsers. Eventually there will probably be a way to easily and reliably detect which input device is being used, but for the time being you're going to have to live with some level of uncertainty. Be sure to read through Lauke's post for more details and some sample code.

Original author: Florence Ion

Aurich Lawson / Thinkstock

It's hard to believe that just a few decades ago, touchscreen technology could only be found in science fiction books and film. These days, it's almost unfathomable how we once got through our daily tasks without a trusty tablet or smartphone nearby, but it doesn't stop there. Touchscreens really are everywhere. Homes, cars, restaurants, stores, planes, wherever—they fill our lives in spaces public and private.

It took generations and several major technological advancements for touchscreens to achieve this kind of presence. Although the underlying technology behind touchscreens can be traced back to the 1940s, there's plenty of evidence to suggest that touchscreens weren't feasible until at least 1965. Popular science fiction television shows like Star Trek didn't even refer to the technology until Star Trek: The Next Generation debuted in 1987, almost two decades after touchscreen technology was first deemed possible. But their inclusion in the series paralleled the advancements in the technology world, and by the late 1980s, touchscreens finally appeared realistic enough that consumers could actually bring the technology into their own homes.

This article is the first of a three-part series on touchscreen technology's journey from fiction to fact. The first three decades of touch are important to reflect upon in order to really appreciate the multitouch technology we're so used to having today. Today, we'll look at when these technologies first arose and who introduced them, plus we'll discuss several other pioneers who played a big role in advancing touch. Future entries in this series will study how changes in touch displays led to devices essential to our lives today and where the technology might take us in the future. But first, let's put finger to screen and travel to the 1960s.

