Cryptography @ Infosec.pub

509 readers
2 users here now

Questions, answers, discussions, and literature on the theory and practice of cryptography

Rules (longer version here)

Related resources

founded 2 years ago
MODERATORS
How to Hold KEMs (durumcrustulum.com)
submitted 3 months ago by Natanael to c/crypto

From: https://mastodon.social/@fj/114171907451597856

Interesting paper co-authored by Airbus cryptographer Erik-Oliver Blass on using zero-knowledge proofs in flight control systems.

Sensors would authenticate their measurements, the control unit would provide control outputs in each iteration together with a proof of output correctness (reducing the need for redundant computations in some cases), and actuators would verify that the outputs were computed correctly.
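
A minimal sketch of that message flow, using Ed25519 (via the `cryptography` package) to stand in for sensor authentication; `zk_prove`/`zk_verify` are hypothetical placeholders for whatever proof system the paper actually uses, not a real scheme:

```python
from cryptography.hazmat.primitives.asymmetric import ed25519

def zk_prove(statement: bytes, witness: bytes) -> bytes:
    # Placeholder: a real system would emit a succinct proof that the
    # control output in `statement` was correctly computed from `witness`.
    return b"proof-of:" + statement  # NOT a real proof

def zk_verify(statement: bytes, proof: bytes) -> bool:
    # Placeholder verifier matching zk_prove above.
    return proof == b"proof-of:" + statement

# Sensor: signs each measurement so the control unit can authenticate it
sensor_key = ed25519.Ed25519PrivateKey.generate()
measurement = b"altitude=10000"
signature = sensor_key.sign(measurement)

# Control unit: checks the sensor signature, computes the control output,
# and attaches a proof of correctness instead of redundant re-computation
sensor_key.public_key().verify(signature, measurement)  # raises if forged
output = b"elevator_trim=+0.5"  # result of the control law
proof = zk_prove(statement=output, witness=measurement)

# Actuator: applies the output only if the proof verifies
assert zk_verify(output, proof), "rejecting unproven control output"
```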

submitted 3 months ago* (last edited 3 months ago) by Natanael to c/crypto

"The GSM Association announced that the latest RCS standard includes E2EE based on the Messaging Layer Security (MLS) protocol, enabling interoperable encryption between different platform providers for the first time"


HQC gets standardized as an addition to ML-KEM (Kyber). McEliece is out of the NIST process for two reasons: NIST considers it unlikely to be widely used, and ISO is considering standardizing it, so NIST doesn't want to create an incompatible standard. If ISO does standardize it and it does see use, NIST is considering mirroring that standard (since many US agencies are bound to using NIST standards).

submitted 3 months ago* (last edited 3 months ago) by Natanael to c/crypto

The UK wanted global access to decrypt any and all Apple users' iCloud data on request. Instead, Apple pulled iCloud end-to-end encryption (the Advanced Data Protection program) from the UK.

Seems like their idea is to ensure that encrypted data outside the UK stays out of UK jurisdiction, because the affected feature is no longer available there. But this prevents UK residents from using iCloud end-to-end encryption in ADP to keep, for example, backups of photos and iMessage logs protected, so journalists in particular are far more exposed to secret warrants and potential insider threats.

submitted 4 months ago* (last edited 4 months ago) by Natanael to c/crypto

Here's a copy of my own comment from the reddit thread:

Randomness is a property of a source, not of a number. Numbers are not random. Randomness is a distribution of possibilities and a chance-based selection of one option from among them.

What we use in cryptography to describe numbers coming from an RNG is entropy, expressed in bits: roughly the base-2 log of the number of equally likely possible values, a measure of how difficult the output is to predict.
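
To make the unit concrete: the entropy of a uniformly chosen value is just the base-2 log of how many equally likely values there are. A quick illustration in Python:

```python
import math

# Entropy of a uniform choice = log2(number of equally likely possibilities).
# E.g. an 8-character password drawn uniformly from 62 alphanumerics:
entropy_bits = 8 * math.log2(62)
print(f"{entropy_bits:.1f} bits")  # ~47.6 bits, whichever password was drawn
```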

It's also extremely important to keep in mind that RNG algorithms are deterministic: their behavior repeats exactly given the same seed value. Because of this, you cannot increase entropy with any kind of RNG algorithm; the entropy is defined exactly by the inputs to the algorithm.
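
This is easy to see with any seeded PRNG (Python's non-cryptographic `random` shown here purely to illustrate determinism):

```python
import random

# Same seed in, same stream out - the output can never contain more
# entropy than the seed that produced it.
a = random.Random(42)
b = random.Random(42)
assert [a.random() for _ in range(5)] == [b.random() for _ in range(5)]
```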

Given this, the entropy of random numbers generated using a password as a seed is equivalent to the entropy of the password itself, and the entropy of an encrypted message is the entropy of the key plus the entropy of the message. Encrypting a gigabyte of zeroes with a key has the total entropy of the key + "all zeroes" + the length in bits, which is far less than the gigabyte's worth of bits it produces: instead of 8 billion bits of entropy, it's 128 + ~1 + 33 bits.
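
The arithmetic, spelled out (assuming a 128-bit key):

```python
import math

key_bits = 128                                 # entropy of the key
message_bits = 1                               # "all zeroes" is ~1 bit to describe
length_bits = math.ceil(math.log2(8 * 10**9))  # 33 bits to encode the length
print(key_bits + message_bits + length_bits)   # 162 bits of entropy, total...
print(8 * 10**9)                               # ...spread across 8 billion output bits
```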

Then we get to Kolmogorov complexity and computational complexity, in other words the shortest way to describe a number. This is also related to compression. The vast majority of numbers have high complexity: they cannot be described in full by anything shorter, so they cannot be compressed. Because of this, a typical statistical test for randomness will pass them with a certain probability - the tests themselves can be encoded as shorter numbers, and even the highest-complexity test has too little complexity to have a high chance of describing the tested number.
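
You can see both halves of this in a few lines: a short program (low Kolmogorov complexity) that expands into a megabyte which a general-purpose compressor can't shrink at all:

```python
import hashlib
import zlib

def short_program(n_blocks: int) -> bytes:
    # A few lines of code expanding a fixed seed into output that
    # statistical tests and compressors treat as random.
    return b"".join(hashlib.sha256(b"seed" + i.to_bytes(8, "big")).digest()
                    for i in range(n_blocks))

data = short_program(32768)  # 1 MiB described by a ~5-line program
ratio = len(zlib.compress(data)) / len(data)
print(f"{ratio:.4f}")        # ~1.0: zlib finds nothing to squeeze out
```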

(sidenote 1: The security of encryption depends on mixing the key into the message thoroughly enough that you can't derive the message without knowing the key - the complexity is high - and on the key being too big to bruteforce.)
(sidenote 2: The Kolmogorov complexity of a securely encrypted message is roughly the entropy plus the algorithm's complexity, but for a weak algorithm it's less, because leaked patterns let you circumvent bruteforcing the key's entropy. We also generally discount the algorithm itself, as it's expected to be known. Computational complexity is essentially defined by the expected runtime of attacks.)

And test suites are bounded. They all have an expected running time, and can fit maybe 20-30 bits of complexity, because that's how much compute you can put into a standardized test suite. This means any number with a pattern that requires more bits to describe will pass with high probability.

... And this is why standard tests are easy to fool!

All you have to do is create an algorithm with one more bit of complexity than the limit of the test, and your statistical tests will pass: while an algorithm with 15 bits of complexity will generally fail, another bad algorithm with ~35 bits of complexity (above a hypothetical test threshold of 30) will frequently pass despite being insecure.
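
A toy version of that failure mode, assuming a hypothetical cipher whose entire secret is a ~20-bit seed: a simple monobit test is perfectly happy with the output, while a brute-force loop recovers the seed almost instantly:

```python
import hashlib

def weak_stream(seed: int, n: int) -> bytes:
    # Hypothetical toy "cipher": all of its security is a ~20-bit seed.
    blocks = [hashlib.sha256(seed.to_bytes(4, "big") +
                             counter.to_bytes(8, "big")).digest()
              for counter in range((n + 31) // 32)]
    return b"".join(blocks)[:n]

stream = weak_stream(seed=0xBEEF, n=4096)

# A monobit test passes: roughly half of all bits are ones...
ones = sum(bin(byte).count("1") for byte in stream)
print(f"{ones} ones out of {len(stream) * 8} bits")  # ~16384 of 32768

# ...but 2^20 guesses is nothing, so the seed falls out immediately:
for guess in range(1 << 20):
    if hashlib.sha256(guess.to_bytes(4, "big") +
                      (0).to_bytes(8, "big")).digest() == stream[:32]:
        print(f"recovered seed {guess:#x}; the whole stream is now predictable")
        break
```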

So if your encryption algorithm doesn't reach beyond the minimum cryptographic thresholds (roughly 100 bits of computational complexity, roughly equivalent to the same number of bits of Kolmogorov complexity*), and maybe just hits 35 bits, then your encrypted messages aren't complex enough to resist dedicated cryptanalysis - especially not if the adversary already knows the algorithm - even though they pass all standard tests.

What's worse, the attack might even be incredibly efficient once known (nothing says the 35-bit-complexity attack has to be slow; it might simply be a 35-bit derived constant that folds the rest of the algorithm down to nothing)!

* Kolmogorov complexity doesn't account for the different costs of memory usage versus processing power, nor for memory latency, so memory is often more expensive in practice.
