Encryption is less secure than we thought August 2013
A group of researchers at MIT just released a paper (http://www.mit.edu/newsoffice/2013/encryption-is-less-secure-than-we-thought-0814.html) reconsidering a common mathematical assumption in cryptography. This means, as the title implies, that most encryption systems are less secure than we thought. But not to worry: nowhere does the word "insecure" appear, and the effect may well be negligible.
The problem here seems to be the definition of entropy being used.
In computing, entropy is the randomness collected by an operating system or application for use in cryptography or other uses that require random data. This randomness is often collected from hardware sources, either pre-existing ones such as mouse movements or specially provided randomness generators. The Famous Wikipedia
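The OS-collected entropy described above is what an application taps when it asks the system for random bytes. As a minimal sketch (using Python's standard os.urandom, which reads from the operating system's entropy pool, e.g. /dev/urandom on Linux):

```python
import os

# os.urandom draws from the OS entropy pool, which is seeded from
# hardware events such as the mouse movements mentioned above.
key_material = os.urandom(16)  # 16 random bytes, enough for a 128-bit key

print(len(key_material))   # 16
print(key_material.hex())  # different on every run
```

Cryptographic code relies on this pool being unpredictable; the paper's point is about how we measure that unpredictability.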
In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content. Shannon entropy provides an absolute limit on the best possible lossless encoding or compression of any communication, assuming that the communication may be represented as a sequence of independent and identically distributed random variables. The Famous Wikipedia
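The Shannon entropy quoted above has a short closed form, H = -Σ p·log2(p), measured in bits. A minimal sketch of computing it over a probability distribution:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally unpredictable: one full bit per flip.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits
```

The paper's concern is roughly that Shannon entropy describes the *average* case, while an attacker guessing a key cares about the most likely outcomes, so the average can overstate how hard guessing actually is.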