
Cryptography


A group of researchers from MIT and the National University of Ireland at Maynooth has presented a paper (PDF) showing that one of the most important assumptions behind cryptographic security is wrong. As a result, certain encryption-breaking methods will work better than previously thought.
"The problem, Médard explains, is that information-theoretic analyses of secure systems have generally used the wrong notion of entropy. They relied on so-called Shannon entropy, named after the founder of information theory, Claude Shannon, who taught at MIT from 1956 to 1978. Shannon entropy is based on the average probability that a given string of bits will occur in a particular type of digital file. In a general-purpose communications system, that’s the right type of entropy to use, because the characteristics of the data traffic will quickly converge to the statistical averages. ... But in cryptography, the real concern isn't with the average case but with the worst case. A codebreaker needs only one reliable correlation between the encrypted and unencrypted versions of a file in order to begin to deduce further correlations. ... In the years since Shannon’s paper, information theorists have developed other notions of entropy, some of which give greater weight to improbable outcomes. Those, it turns out, offer a more accurate picture of the problem of codebreaking. When Médard, Duffy and their students used these alternate measures of entropy, they found that slight deviations from perfect uniformity in source files, which seemed trivial in the light of Shannon entropy, suddenly loomed much larger. The upshot is that a computer turned loose to simply guess correlations between the encrypted and unencrypted versions of a file would make headway much faster than previously expected. 'It’s still exponentially hard, but it’s exponentially easier than we thought,' Duffy says."


An anonymous reader writes "Ralph Langner, the security expert who deciphered how Stuxnet targeted the Siemens PLCs in Iran's Natanz nuclear facility, has come up with a cybersecurity framework for industrial control systems (ICS) that he says is a better fit than the U.S. government's Cyber Security Framework. Langner's Robust ICS Planning and Evaluation, or RIPE, framework takes a different approach to locking down ICS/SCADA plants than the NIST-led one, focusing on security capabilities rather than risk. He hopes it will help influence the final version of the U.S. government's framework."


The National Security Agency and its UK counterpart have made repeated and determined attempts to identify people using the Tor anonymity service, but Tor's fundamental security remains intact, top-secret documents published on Friday revealed.

The classified memos and training manuals, which were leaked by former NSA contractor Edward Snowden and reported by The Guardian, show that the NSA and the UK-based Government Communications Headquarters (GCHQ) are able to bypass Tor protections, but only against select targets and often with considerable effort. Indeed, one presentation slide grudgingly hailed Tor as "the king of high-secure, low-latency Internet anonymity." Another, titled "Tor Stinks," lamented: "We will never be able to de-anonymize all Tor users all the time."

An article published separately by The Washington Post, also based on documents provided by Snowden, concurred.

"There is no evidence that the NSA is capable of unmasking Tor traffic routinely on a global scale," the report said. "But for almost seven years, it has been trying."


benrothke writes "Narrating a compelling and interesting story about cryptography is not an easy endeavor. Many authors have tried and failed miserably, attempting to create better anecdotes about the adventures of Alice and Bob. David Kahn probably did the best job of it when he wrote The Codebreakers: The Story of Secret Writing in 1967 and set the gold standard for the information security narrative. Kahn's book was so provocative and groundbreaking that the US Government originally censored many parts of it. While Secret History: The Story of Cryptology is not as groundbreaking, it also has no government censorship. With that, the book is a fascinating read that provides a combination of cryptographic history and the underlying mathematics behind it." Keep reading for the rest of Ben's review.


Fnord666 writes with this excerpt from TechCrunch: "Twitter has enabled Perfect Forward Secrecy across its mobile site, website and API feeds in order to protect against future cracking of the service's encryption. The PFS method ensures that, if the encryption key Twitter uses is cracked in the future, all of the past data transported through the network does not become an open book right away. 'If an adversary is currently recording all Twitter users' encrypted traffic, and they later crack or steal Twitter's private keys, they should not be able to use those keys to decrypt the recorded traffic,' says Twitter's Jacob Hoffman-Andrews. 'As the Electronic Frontier Foundation points out, this type of protection is increasingly important on today's Internet.'"

Of course, they are also using elliptic curve ciphers.
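As a rough illustration of what enabling forward secrecy involves on the server side, the sketch below (a minimal example using Python's standard ssl module, not Twitter's actual configuration) restricts a TLS server context to ephemeral elliptic-curve Diffie-Hellman (ECDHE) cipher suites. Because the key exchange uses throwaway keys, recorded session traffic cannot later be decrypted with the server's long-term certificate key alone.

    import ssl

    # Minimal sketch, not Twitter's actual setup: forward secrecy comes from
    # ephemeral key exchange, so offer only ECDHE suites for TLS 1.2.
    # (TLS 1.3 suites are always forward secret and are configured separately.)
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.set_ciphers("ECDHE+AESGCM")

    # Show which cipher suites the context will actually offer.
    for suite in ctx.get_ciphers():
        print(suite["name"])

    # A real server would also load its certificate before accepting connections:
    # ctx.load_cert_chain("server.crt", "server.key")  # hypothetical file names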

Original author: Casey Johnston

Few Internet frustrations are so familiar as the password restriction. After creating a few (dozen) logins for all our Web presences, the use of symbols, mixed cases, and numbers seems less like a security measure and more like a torture device when it comes to remembering a complex password on a little-used site. But at least that variety of characters keeps you safe, right? As it turns out, there is research that both confirms how frustrating these restrictions are and suggests that the positive effect of complexity rules on security may not be as great as that of simple length requirements.

Let's preface this with a reminder: the conventional wisdom is that complexity trumps length every time, and this notion is overwhelmingly true. Every security expert will tell you that “Supercalifragilistic” is less secure than “gj7B!!!bhrdc.” Few password creation schemes will render any password uncrackable, but in general, length does less to guard against crackability than complexity.

A password is not immune from cracking simply by virtue of being long: 44,991 passwords recovered from a dump of LinkedIn hashes last year were 16 characters or more. The research we describe below deals specifically with how the password-composition restrictions that administrators impose affect crackability. By no means does it suggest that a long password is, by default, more secure than a complex one.
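For a back-of-the-envelope sense of the numbers involved, the sketch below (an illustration added here, not taken from the article) computes the blind brute-force search space for a few hypothetical password shapes. Length dominates when an attacker must search exhaustively, which is exactly why the caveat above matters: a long password that is a single dictionary word, like "Supercalifragilistic", falls to dictionary attacks long before exhaustive search becomes relevant.

    import math

    def keyspace_bits(alphabet_size, length):
        # log2 of the number of candidates an exhaustive-search attacker must try.
        return length * math.log2(alphabet_size)

    # Hypothetical password shapes, assuming a purely blind brute-force attacker.
    shapes = [
        ("20 mixed-case letters", 52, 20),
        ("12 chars from full printable ASCII", 95, 12),
        ("16 lowercase letters", 26, 16),
    ]

    for label, alphabet, length in shapes:
        print(f"{label}: ~{keyspace_bits(alphabet, length):.0f} bits of keyspace")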


Original author: Joshua Kopstein


Demand for encryption apps has increased dramatically ever since the exposure of massive internet surveillance programs run by US and UK intelligence agencies. Now Facebook is reportedly moving to implement a strong, decades-old encryption technique that's been largely avoided by the online services that need it most.

Forward secrecy (sometimes called "perfect forward secrecy") is a way of encrypting internet traffic (the connection between a website and your browser) so that it's harder for a third party to intercept the pages being viewed, even if the server's key becomes compromised. It's been lauded by cryptography experts since its creation in the early 1990s, yet most "secure" online services like banks and webmail still...
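For the curious, here is a minimal conceptual sketch of the idea, using the third-party Python "cryptography" package. It illustrates ephemeral key agreement in general, not the actual TLS handshake a site like Facebook would deploy: each side generates a fresh key pair for the session, derives a shared session key, and then discards the ephemeral private keys, so a later compromise of the server's long-term certificate key reveals nothing about traffic recorded today.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each side generates a brand-new (ephemeral) key pair for this one session...
    client_eph = X25519PrivateKey.generate()
    server_eph = X25519PrivateKey.generate()

    # ...exchanges only the public halves, and computes the same shared secret.
    client_shared = client_eph.exchange(server_eph.public_key())
    server_shared = server_eph.exchange(client_eph.public_key())
    assert client_shared == server_shared

    # Derive the session key, then throw the ephemeral private keys away.
    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo session",
    ).derive(client_shared)
    print(session_key.hex())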

