ETH Series in Information Security and Cryptography
Vol. 1
One of the most important properties of a cryptographic system is that its security can be proved. In the present work, information-theoretic methods are used to prove the security of unconditionally secure cryptosystems, whose security does not depend on unproven intractability assumptions.
A survey of entropy measures and their applications in cryptography is presented. A new information measure, smooth entropy, is introduced to quantify the number of almost uniform random bits that can be extracted from a source by probabilistic algorithms. Smooth entropy unifies previous work on privacy amplification in cryptography and on entropy smoothing in theoretical computer science. It enables a systematic investigation of the spoiling knowledge proof technique to obtain lower bounds on smooth entropy.
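To make the extraction step concrete, the following minimal sketch illustrates privacy amplification by universal hashing, the standard method for converting a weakly random string into almost uniform bits. The particular hash family, the parameter names, and the sizes used here are illustrative choices, not the constructions analyzed in the book.

    import secrets

    # Illustrative parameters: an n-bit source sample is compressed to r output
    # bits; P is a prime larger than 2^n, so that h_{a,b}(x) = ((a*x + b) mod P)
    # mod 2^r is a member of a standard universal hash family.
    N_BITS, R_BITS = 64, 16
    P = (1 << 89) - 1                     # Mersenne prime 2^89 - 1 > 2^64

    def sample_hash():
        """Draw a random member (a, b) of the universal family."""
        a = 1 + secrets.randbelow(P - 1)  # a in {1, ..., P-1}
        b = secrets.randbelow(P)          # b in {0, ..., P-1}
        return a, b

    def extract(x, a, b):
        """Compress the source sample x to R_BITS almost uniform bits."""
        return ((a * x + b) % P) % (1 << R_BITS)

    x = secrets.randbits(N_BITS)          # stand-in for the output of a weak source
    a, b = sample_hash()
    key = extract(x, a, b)                # close to uniform whenever R_BITS is
                                          # sufficiently below the source's
                                          # (smooth or Renyi) entropy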
The Rényi entropy of order at least 2 of a random variable is a lower bound for its smooth entropy, whereas an assumption about the Rényi entropy of order 1, which equals the Shannon entropy, is too weak to guarantee any non-trivial amount of smooth entropy. The gap between Rényi entropy of orders 1 and 2 is closed by proving that Rényi entropy of order $\alpha$ between 1 and 2 is a lower bound for smooth entropy, up to a small parameter depending on $\alpha$, the alphabet size, and the failure probability.
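For reference, the Rényi entropy of order $\alpha$ of a random variable $X$ with distribution $P_X$ over a finite alphabet $\mathcal{X}$ is defined as (standard definitions, included here for orientation)
\[
  H_\alpha(X) \;=\; \frac{1}{1-\alpha} \log_2 \sum_{x \in \mathcal{X}} P_X(x)^\alpha
  \qquad (\alpha \ge 0,\ \alpha \neq 1),
\]
with the Shannon entropy recovered as the limit
\[
  H_1(X) \;=\; \lim_{\alpha \to 1} H_\alpha(X) \;=\; -\sum_{x \in \mathcal{X}} P_X(x) \log_2 P_X(x),
\]
and the collision entropy given by $H_2(X) = -\log_2 \sum_{x} P_X(x)^2$. Since $H_\alpha(X)$ is non-increasing in $\alpha$, one has $H_2(X) \le H_\alpha(X) \le H_1(X)$ for $1 \le \alpha \le 2$; this is exactly the range bridged by the bound described above.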
The operation of many unconditionally secure cryptosystems can be divided into three phases: advantage distillation, information reconciliation, and privacy amplification. The relation between information reconciliation and privacy amplification is investigated, in particular the effect that side information, obtained by an adversary through an initial reconciliation step, has on the size of the secret key that can be distilled safely by subsequent privacy amplification. It is shown that each bit of side information reduces the size of the key that can be safely generated by at most one bit, except with negligible probability.
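As a simple illustrative special case (not the general theorem proved in the book), suppose $X$ is uniformly distributed over $\{0,1\}^n$ and the reconciliation step reveals $Y = (X_1, \dots, X_t)$, the first $t$ physical bits of $X$. Then for every observed value $y$ and every order $\alpha$,
\[
  H_\alpha(X \mid Y = y) \;=\; n - t ,
\]
so the $t$ bits of side information reduce the adversary's uncertainty, and hence the length of the key that privacy amplification can safely extract, by exactly $t$ bits. The general result quoted above says that, up to a negligible failure probability, side information can never cost more than this one-bit-per-bit rate.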
A private-key cryptosystem and a protocol for key agreement by public discussion are proposed that are unconditionally secure under the sole assumption that an adversary's memory capacity is limited. Both systems make use of a random bit string, slightly longer than the adversary's memory capacity, that can be received by all parties.
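The following schematic sketch, written for the private-key variant, conveys the idea only; the subset size, the use of a seeded pseudorandom sampler, and the final hash are illustrative stand-ins and not the constructions or parameters of the book (in the information-theoretic analysis the final step would be privacy amplification with universal hashing, as sketched earlier, rather than a cryptographic hash).

    import hashlib
    import random
    import secrets

    def broadcast_randomness(length_bits):
        """The long public random string R; in the model its length slightly
        exceeds the adversary's memory capacity, so R cannot be stored fully."""
        return [secrets.randbits(1) for _ in range(length_bits)]

    def select_positions(shared_secret, total, count):
        """Both legitimate parties derive the same small subset of positions
        of R from their short shared secret key."""
        rng = random.Random(shared_secret)   # deterministic, identical on both sides
        return rng.sample(range(total), count)

    def derive_key(bits):
        """Stand-in for the final distillation step (a universal hash in the
        information-theoretic setting); compresses the read bits to a short key."""
        return hashlib.sha256(bytes(bits)).digest()[:16]

    R = broadcast_randomness(10_000)         # toy length; think "longer than the
                                             # adversary's memory"
    shared_secret = secrets.token_bytes(16)  # short key known to both parties
    positions = select_positions(shared_secret, len(R), 256)
    session_key = derive_key([R[i] for i in positions])

Roughly speaking, the public-discussion variant dispenses with the pre-shared secret: each party stores bits of R at independently chosen random positions and the positions (not the bits) are compared afterwards over a public channel, so that the commonly stored bits play the role of the selected subset above.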
To order directly: Hartung.Gorre@t-online.de
Contents

1  Introduction  1
2  Preliminaries  5
   2.1  Discrete Probability Theory  5
   2.2  Some Inequalities  8
        2.2.1  The Jensen Inequality  8
        2.2.2  The Moment, Markov, Chebyshev, and Other Inequalities  9
        2.2.3  Chernoff-Hoeffding Bounds  9
   2.3  Entropy and Information Theory  10
   2.4  Rényi Entropy  13
   2.5  The Asymptotic Equipartition Property  17
   2.6  Universal Hashing and Privacy Amplification  20
3  Information Measures in Cryptography  23
   3.1  Introduction  23
   3.2  Scenarios and Information Measures  25
        3.2.1  Perfect Secrecy: Shannon Entropy  26
               Secret-Key Cryptosystems  26
               Secret Sharing  27
               Key Distribution for Dynamic Conferences  28
        3.2.2  Authentication: Relative Entropy  30
        3.2.3  Privacy Amplification: Rényi Entropy  33
        3.2.4  Guessing Keys: Min-Entropy and "Guessing Entropy"  35
        3.2.5  Hash Functions: Collision Probability  37
        3.2.6  Probabilistic Bounds: Variational Distance  39
   3.3  Some Relations Between Information Measures  42
   3.4  Shannon Entropy and Almost Uniform Distributions  45
4  Smooth Entropy  47
   4.1  Introduction  48
   4.2  A General Formulation  51
   4.3  Previous Work and Related Concepts  55
        4.3.1  Privacy Amplification in Cryptography  56
        4.3.2  Entropy Smoothing in Pseudorandom Generation  57
        4.3.3  Relation to Entropy  58
        4.3.4  Relation to Intrinsic Randomness  61
        4.3.5  An Application in Learning Theory  62
        4.3.6  Extractors, Weak Random Sources, and Derandomization  63
   4.4  Spoiling Knowledge  65
        4.4.1  Introduction  65
        4.4.2  Spoiling Knowledge for Increasing Smooth Entropy with Probability 1  68
        4.4.3  Spoiling Knowledge for Increasing Smooth Entropy with Probabilistic Bounds  74
   4.5  Bounds Using Spoiling Knowledge  79
        4.5.1  A Bound Using Rényi Entropy of Order $\alpha > 1$  80
        4.5.2  A Tighter Bound Using the Profile of the Distribution  85
   4.6  Smoothing an Unknown Distribution  89
   4.7  Conclusions  92
5  Unconditional Security in Cryptography  95
   5.1  Introduction  95
   5.2  Unconditionally Secure Key Agreement Protocols  98
   5.3  Linking Reconciliation and Privacy Amplification  102
        5.3.1  Introduction  102
        5.3.2  The Effect of Side Information on Rényi Entropy  103
        5.3.3  Almost Uniform Distributions  107
        5.3.4  Independent Repetition of a Random Experiment  109
        5.3.5  Conclusions  111
   5.4  Memory-Bounded Adversaries  112
        5.4.1  Introduction  112
        5.4.2  Pairwise Independence and Entropy Smoothing  114
        5.4.3  Extracting a Secret Key from a Randomly Selected Subset  115
        5.4.4  A Private-Key System  121
        5.4.5  Key Agreement by Public Discussion  122
        5.4.6  Discussion  125
6  Concluding Remarks  127
Bibliography  129
Index  141