Internet Connections

Original author: Jon Brodkin


Can Google's QUIC be faster than Mega Man's nemesis, Quick Man? (Image credit: Josh Miller)

Google, as is its wont, is always trying to make the World Wide Web go faster. To that end, Google in 2009 unveiled SPDY, a networking protocol that reduces latency and is now being built into HTTP 2.0. SPDY is already supported by Chrome, Firefox, Opera, and the upcoming Internet Explorer 11.

But SPDY isn't enough. Yesterday, Google released a boatload of information about its next protocol, one that could reshape how the Web routes traffic. QUIC—standing for Quick UDP Internet Connections—was created to reduce the number of round trips data makes as it traverses the Internet in order to load stuff into your browser.
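QUIC itself cannot be demonstrated in a few lines, but the round-trip cost it attacks is visible at the socket level. The sketch below (plain Python sockets against a placeholder host, illustrating TCP versus UDP setup rather than QUIC itself) shows why building on UDP lets a protocol put useful data in its very first packet instead of spending a round trip on connection setup.

    # Illustration only: contrast TCP's mandatory handshake with UDP's
    # connectionless send. The host and ports below are placeholders.
    import socket

    HOST = "example.org"

    # TCP: connect() completes the SYN / SYN-ACK / ACK handshake (a full
    # round trip) before the request bytes can go out.
    tcp = socket.create_connection((HOST, 80), timeout=5)
    tcp.sendall(b"GET / HTTP/1.1\r\nHost: example.org\r\n\r\n")
    tcp.close()

    # UDP: no connection setup; this datagram is the first packet on the wire.
    # QUIC layers its own reliability, multiplexing, and encryption on top.
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(b"hello", (HOST, 443))
    udp.close()

Much of the protocol's claimed savings comes from folding connection setup and encryption negotiation into those first packets.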

Although QUIC is still in its early stages, Google is going to start testing the protocol on a "small percentage" of Chrome users who use the development or canary versions of the browser—the experimental versions that often contain features not yet stable enough for everyone. QUIC has been built into these test versions of Chrome and into Google's servers. The client and server implementations are open source, just as Chromium is.




Nvidia CEO Jen-Hsun Huang unveils the Nvidia Grid server at the company's CES presentation. (Image credit: Andrew Cunningham)

The most interesting news to come out of Nvidia's two-hour-plus press conference Sunday night was doubtless the Tegra 4 mobile processor, followed closely by its intriguing Project Shield tablet-turned-handheld game console. However, company CEO Jen-Hsun Huang also revealed a small morsel of news about the cloud gaming initiative that Nvidia talked up back in May: the Nvidia Grid, the company's own server designed specifically to serve as the back end for cloud gaming services.

Thus far, virtualized and streaming game services have not set the world on fire. OnLive probably had the highest profile of any such service, and though it continues to live on, it has been defined more by its troubled financial history than its success.

We stopped by Nvidia's CES booth to get some additional insight into just how the Grid servers do their thing and how Nvidia is looking to overcome the technical drawbacks inherent to cloud gaming services.



The Electronic Frontier Foundation's SSL Observatory is a research project that gathers and analyzes the cryptographic certificates used to secure Internet connections, systematically cataloging them and exposing their database for other scientists, researchers and cryptographers to consult.

Now Arjen Lenstra of École polytechnique fédérale de Lausanne has used the SSL Observatory dataset to show that tens of thousands of SSL certificates "offer effectively no security due to weak random number generation algorithms." Lenstra's research means that much of what we think of as gold-standard, rock-solid network security is deeply flawed, but it also means that users and website operators can detect and repair these vulnerabilities.

While we have observed and warned about vulnerabilities due to insufficient randomness in the past, Lenstra's group was able to discover more subtle RNG bugs by searching not only for keys that were unexpectedly shared by multiple certificates, but for prime factors that were unexpectedly shared by multiple publicly visible public keys. This application of the 2,400-year-old Euclidean algorithm turned out to produce spectacular results.
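To make the idea concrete, here is a minimal sketch of the shared-factor check (not the researchers' actual tooling): it runs the Euclidean algorithm pairwise over a toy list of moduli, and any GCD greater than 1 exposes a prime shared by two keys, and with it the complete factorization of both. The numbers below are tiny stand-ins for real 1024- or 2048-bit moduli, and a pairwise scan like this would not scale to millions of certificates, where more efficient batched approaches are needed.

    # Minimal sketch: detect RSA moduli that share a prime factor.
    # The moduli here are tiny, made-up stand-ins for real certificate keys.
    from math import gcd

    shared = 101                      # a prime "reused" by a weak RNG
    moduli = [shared * 103, shared * 107, 109 * 113]

    for i in range(len(moduli)):
        for j in range(i + 1, len(moduli)):
            g = gcd(moduli[i], moduli[j])
            if g > 1:
                # A common factor > 1 fully factors both moduli.
                print(f"moduli {i} and {j} share prime {g}; "
                      f"cofactors are {moduli[i] // g} and {moduli[j] // g}")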

In addition to TLS, the transport layer security mechanism underlying HTTPS, the study also investigated other types of public keys drawn from sources beyond EFF's Observatory data set, most notably PGP keys. The cryptosystems underlying the full set of public keys in the study included RSA (the most common cryptosystem behind TLS), ElGamal (the most common cryptosystem behind PGP), and several others in smaller quantities. Within each cryptosystem, various key strengths were also examined, for instance RSA 2048-bit as well as RSA 1024-bit keys. Beyond shared prime factors, other problems were discovered with the keys, all of which appear to stem from insufficient randomness during key generation. The most prominently affected keys were RSA 1024-bit moduli. This class of keys was deemed by the researchers to be only 99.8% secure, meaning that 2 out of every 1,000 of these RSA public keys are insecure. Our first priority is handling this large set of tens of thousands of keys, though the problem is not limited to this set, or even to HTTPS implementations alone.
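To see why a single shared factor translates into "effectively no security," consider the following toy sketch (with tiny made-up numbers rather than real 1024-bit key material): once one prime of an RSA modulus is known, the other prime and the private exponent follow immediately.

    # Toy illustration: with one prime of an RSA modulus recovered via a
    # shared-factor search, the private key can be rebuilt outright.
    p = 101                 # prime recovered from a GCD with another modulus
    n = 101 * 103           # the victim's public modulus
    e = 7                   # toy public exponent, coprime to (p-1)*(q-1)
    q = n // p              # the second prime falls out by simple division

    phi = (p - 1) * (q - 1)     # Euler's totient of n
    d = pow(e, -1, phi)         # private exponent (Python 3.8+ modular inverse)

    # Sanity check: encrypting and then decrypting a toy message round-trips.
    m = 42
    assert pow(pow(m, e, n), d, n) == m
    print("recovered private exponent:", d)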

We are very alarmed by this development. In addition to notifying website operators, Certificate Authorities, and browser vendors, we also hope that the full set of RNG bugs that are causing these problems can be quickly found and patched. Ensuring a secure and robust public key infrastructure is vital to the security and privacy of individuals and organizations everywhere.

Researchers Use EFF's SSL Observatory To Discover Widespread Cryptographic Vulnerabilities


Michael J. Ross writes "With more people accessing the Internet using mobile devices than computers, web designers and developers are challenged to make sites that work well on both categories of hardware — or resign themselves to the greater costs and other disadvantages of maintaining two versions of each web site (a mobile-ready version as well as one for much larger screens). Fortunately, recent advances in web technologies are making it easier to build web pages whose contents and their positioning are automatically modified to match the available screen space of the individual user. These techniques are explored in detail in a recent book, Responsive Web Design, written by Ethan Marcotte, a veteran web designer and developer." Keep reading for the rest of Michael's review.



At Book Off last week, I picked up an English translation of Tsutsumi Seiji’s Japan’s Consumer Society: A Critical Introduction 『消費社会批判』. Tsutsumi is, for those who do not know his legend, the man behind the Saison retailing group and its sophisticated retail chains Seibu Department Store, PARCO, Loft, Mujirushi Ryohin (MUJI), Wave, and Seed. He is also a former Marxist and award-winning poet/novelist who used his industrial power to support avant-garde artists such as Terayama Shuji.

The title of Tsutsumi’s book is a bit misleading: The volume is mostly abstract and theoretical, quoting Barthes, Bourdieu, and Baudrillard rather than talking about the specifics of Japanese consumer society. Writing in 1996 — just as the Bubble had popped and the consumer market was about to peak — Tsutsumi offered many critiques of the Japanese industrial system. He, however, sounded most worried about Japan’s lag in information technology. Framed within the context of mobile phones and video games, this may have seemed like a silly concern. The following facts about the state of computer usage within Japan, however, grabbed my attention:

[A] 1993 study…of the diffusion rates for personal computers in the office showed Japan at 9.9% and the United States at 41.7%. Looking at Internet-connected systems as of January 1995, Japan had only 96,632 compared to the United States’ 3,179,170, and the gap is widening year by year. (174)

This data reveals a very significant difference in the centrality of the personal computer and the Internet within the two respective societies — even when adjusted for population.

Of course, Japan eventually “caught up” and now boasts an impressive Internet diffusion rate. Thanks to highly-evolved mobile phones, even non-PC users can connect to the Internet (or its i-mode simulacrum). Yet when you look at the “cultural development” of the Net, Japan still feels stunted. The most obvious example is that a very niche site like 2ch still works as the central hub for Net cultural creation and sets the overall tone, despite the core users’ non-mainstream values such as obsession with little girls and bitter neo-right-wing tendencies.

These computer diffusion numbers from 1995 help explain what is happening: Internet culture relies not just upon the current state of usage but upon a compounded set of familiarities and expectations about the medium, forged over a broad historical period. If less than 10% of the working Japanese population used computers in the 1990s and very few families had computers at home, then most Japanese people are not likely to be comfortable with computers or with communicating through them. Even those who have embraced computers in the last decade do not have a lifetime of knowledge about them from which to draw.

Personally speaking, my father’s work on math and statistics meant we always had a PC at home — from a TRS-80 to a Mac Classic II. Part of my joy of using computers and belief in the power of the Internet comes from my good fortune of being exposed to both PCs and the Net at an early age. And I do not think my case was that rare.

Conversely, you cannot expect a population without these experiences to somehow fully embrace the medium psychologically. This is especially true for older Japanese, who likely never used computers at work nor saw their peers and neighbors use them with any regularity. And given the relative recentness of PC diffusion, we should expect that the top decision-makers in Japanese companies — who have traditionally been in their 50s and 60s — do not have a deep-seated familiarity with the computer.

In this sense, I would argue that while Japan has caught up in terms of infrastructure, the idea of using computers as a social and communicative tool is still very young within a great majority of the population.
