# Secret Key Agreement

One of the only methods of encrypting a message from Alice to Bob such that no third party (Eve) can possibly decrypt it is the one-time pad. This technique requires that Alice and Bob share a secret sequence of bits, $$S$$. Alice encrypts by computing the exclusive-or of $$S$$ with the plaintext, $$P$$, to produce the ciphertext, $$C$$: $$C = S \oplus P$$. Bob can then decrypt by XORing again: $$P = S \oplus C$$.
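
As a quick illustration, the one-time pad is just a byte-wise exclusive-or. This is a minimal plain-Python sketch, not part of `dit`:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Byte-wise exclusive-or of two equal-length byte strings."""
    assert len(a) == len(b)
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"attack at dawn"
# S: the shared secret key, as long as the message and never reused.
key = secrets.token_bytes(len(plaintext))
ciphertext = xor_bytes(key, plaintext)  # C = S xor P
recovered = xor_bytes(key, ciphertext)  # P = S xor C
assert recovered == plaintext
```

Note that XOR is its own inverse, so encryption and decryption are the same operation.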

In order to pull this off, Alice and Bob need to construct $$S$$ out of some sort of joint randomness, $$p(x, y, z)$$, and public communication, $$V$$, which is assumed to have perfect fidelity. The maximum rate at which $$S$$ can be constructed is the secret key agreement rate.

## Background

Given $$N$$ IID copies of a joint distribution governed by $$p(x, y, z)$$, let $$X^N$$ denote the random variables observed by Alice, $$Y^N$$ those observed by Bob, and $$Z^N$$ those observed by Eve. Furthermore, let $$S[X : Y || Z]$$ be the maximum rate $$R$$ such that, for any $$\epsilon > 0$$, there exist an $$N > 0$$, public communication $$V$$, and functions $$f$$ and $$g$$ satisfying:

$\begin{split}S_X &= f(X^N, V) \\ S_Y &= g(Y^N, V) \\ p(S_X = S_Y = S) &\geq 1 - \epsilon \\ \I{S : V Z^N} &\leq \epsilon \\ \frac{1}{N} \H{S} &\geq R - \epsilon\end{split}$

Intuitively, this means there exists some procedure such that, for every $$N$$ observations, Alice and Bob can publicly converse and then construct $$S$$ bits which agree almost surely, and are almost surely independent of everything Eve has access to. $$S$$ is then known as a secret key.

## Lower Bounds

### Lower Intrinsic Mutual Information

The first lower bound on the secret key agreement rate is known in `dit` as the `lower_intrinsic_mutual_information()`, and is given by:

$\I{X : Y \uparrow Z} = \max\{ \I{X : Y} - \I{X : Z}, \I{X : Y} - \I{Y : Z}, 0 \}$
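
As a concrete check, this bound can be evaluated directly from a joint distribution. The following plain-Python sketch hand-transcribes `dit`'s `intrinsic_1` distribution from the examples below:

```python
from collections import defaultdict
from math import log2

# p(x, y, z) for dit's intrinsic_1 distribution, transcribed by hand
# from the examples below.
p_xyz = {
    ("0", "0", "0"): 1/8, ("0", "1", "1"): 1/8,
    ("1", "0", "1"): 1/8, ("1", "1", "0"): 1/8,
    ("2", "2", "2"): 1/4, ("3", "3", "3"): 1/4,
}

def entropy(p, idxs):
    """Shannon entropy (in bits) of the marginal over the given coordinates."""
    marg = defaultdict(float)
    for outcome, prob in p.items():
        marg[tuple(outcome[i] for i in idxs)] += prob
    return -sum(q * log2(q) for q in marg.values() if q > 0)

def mi(p, a, b):
    """Mutual information between coordinate groups a and b."""
    return entropy(p, a) + entropy(p, b) - entropy(p, a + b)

i_xy = mi(p_xyz, (0,), (1,))  # 1.5 bits
i_xz = mi(p_xyz, (0,), (2,))  # 1.5 bits
i_yz = mi(p_xyz, (1,), (2,))  # 1.5 bits
lower = max(i_xy - i_xz, i_xy - i_yz, 0)  # 0.0
```

For this distribution every pairwise mutual information equals $$1.5$$ bits, so the bound degenerates to zero.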

### Secrecy Capacity

Next is the secrecy capacity:

$\begin{split}\I{X : Y \uparrow\uparrow Z} = \max \begin{cases} \displaystyle \max_{U - X - YZ} \I{U : Y} - \I{U : Z} \\ \displaystyle \max_{U - Y - XZ} \I{U : X} - \I{U : Z} \end{cases}\end{split}$

This gives the secret key agreement rate when communication is not allowed.

### Necessary Intrinsic Mutual Information

A tighter bound is given by the `necessary_intrinsic_mutual_information()` [GGunluK17]:

$\begin{split}\I{X : Y \uparrow\uparrow\uparrow Z} = \max \begin{cases} \displaystyle \max_{V - U - X - YZ} \I{U : Y | V} - \I{U : Z | V} \\ \displaystyle \max_{V - U - Y - XZ} \I{U : X | V} - \I{U : Z | V} \end{cases}\end{split}$

This quantity is actually equal to the secret key agreement rate when communication is limited to being unidirectional.

## Upper Bounds

### Upper Intrinsic Mutual Information

The secret key agreement rate is trivially upper bounded by:

$\min\{ \I{X : Y}, \I{X : Y | Z} \}$
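
As a plain-Python sketch (again hand-transcribing `dit`'s `intrinsic_1` distribution from the examples below), both terms follow from the standard entropy identities $$\I{X : Y} = \H{X} + \H{Y} - \H{XY}$$ and $$\I{X : Y | Z} = \H{XZ} + \H{YZ} - \H{XYZ} - \H{Z}$$:

```python
from collections import defaultdict
from math import log2

# p(x, y, z) for dit's intrinsic_1 distribution, transcribed by hand
# from the examples below.
p_xyz = {
    ("0", "0", "0"): 1/8, ("0", "1", "1"): 1/8,
    ("1", "0", "1"): 1/8, ("1", "1", "0"): 1/8,
    ("2", "2", "2"): 1/4, ("3", "3", "3"): 1/4,
}

def entropy(p, idxs):
    """Shannon entropy (in bits) of the marginal over the given coordinates."""
    marg = defaultdict(float)
    for outcome, prob in p.items():
        marg[tuple(outcome[i] for i in idxs)] += prob
    return -sum(q * log2(q) for q in marg.values() if q > 0)

i_xy = entropy(p_xyz, (0,)) + entropy(p_xyz, (1,)) - entropy(p_xyz, (0, 1))
i_xy_given_z = (entropy(p_xyz, (0, 2)) + entropy(p_xyz, (1, 2))
                - entropy(p_xyz, (0, 1, 2)) - entropy(p_xyz, (2,)))
upper = min(i_xy, i_xy_given_z)  # min{1.5, 0.5} = 0.5
```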

### Intrinsic Mutual Information

The `intrinsic_mutual_information()` [MW97] is defined as:

$\I{X : Y \downarrow Z} = \min_{p(\overline{z} | z)} \I{X : Y | \overline{Z}}$

It is straightforward to see that taking $$p(\overline{z} | z)$$ to be constant (so that $$\overline{Z}$$ is independent of $$Z$$) achieves $$\I{X : Y}$$, while taking it to be the identity achieves $$\I{X : Y | Z}$$; the intrinsic mutual information therefore improves upon both trivial bounds.
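
The minimization ranges over all channels $$p(\overline{z} | z)$$; as a rough plain-Python sketch, one can search only the deterministic channels (a restriction that is not guaranteed to attain the true minimum in general). For `dit`'s `intrinsic_1` distribution, transcribed by hand from the examples below, merging $$Z$$'s symbols $$0$$ and $$1$$ while keeping $$2$$ and $$3$$ distinct already drives the conditional mutual information to zero:

```python
from collections import defaultdict
from itertools import product
from math import log2

# p(x, y, z) for dit's intrinsic_1 distribution, transcribed by hand
# from the examples below.
p_xyz = {
    ("0", "0", "0"): 1/8, ("0", "1", "1"): 1/8,
    ("1", "0", "1"): 1/8, ("1", "1", "0"): 1/8,
    ("2", "2", "2"): 1/4, ("3", "3", "3"): 1/4,
}

def entropy(m):
    """Shannon entropy (in bits) of a dict mapping outcomes to probabilities."""
    return -sum(q * log2(q) for q in m.values() if q > 0)

def cond_mi(p):
    """I(X : Y | Z) for a dict {(x, y, z): prob}."""
    xz, yz, z = defaultdict(float), defaultdict(float), defaultdict(float)
    for (a, b, c), pr in p.items():
        xz[a, c] += pr
        yz[b, c] += pr
        z[c] += pr
    return entropy(xz) + entropy(yz) - entropy(p) - entropy(z)

# Search every deterministic map f : Z -> Z-bar.
zs = sorted({z for (_, _, z) in p_xyz})
best = float("inf")
for images in product(range(len(zs)), repeat=len(zs)):
    f = dict(zip(zs, images))
    q = defaultdict(float)
    for (x, y, z), pr in p_xyz.items():
        q[x, y, f[z]] += pr
    best = min(best, cond_mi(q))
# best is (numerically) zero, achieved by merging z = 0 and z = 1.
```

The minimizing channel erases the parity information $$Z$$ holds about the $$01$$ regime while still revealing the $$23$$ regime, rendering $$X$$ and $$Y$$ conditionally independent.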

### Reduced Intrinsic Mutual Information

This bound can be improved, producing the `reduced_intrinsic_mutual_information()` [RSW03]:

$\I{X : Y \downarrow\downarrow Z} = \min_{U} \I{X : Y \downarrow ZU} + \H{U}$

This bound improves upon the Intrinsic Mutual Information when a small amount of information, $$U$$, can result in a larger decrease in the amount of information shared between $$X$$ and $$Y$$ given $$Z$$ and $$U$$.

### Minimal Intrinsic Mutual Information

The Reduced Intrinsic Mutual Information can be further reduced into the `minimal_intrinsic_total_correlation()` [GA17]:

$\I{X : Y \downarrow\downarrow\downarrow Z} = \min_{U} \I{X : Y | U} + \I{XY : U | Z}$

## All Together Now

Taken together, we see the following structure:

$\begin{split}\min\{ \I{X : Y}, \I{X : Y | Z} \} &\geq \I{X : Y \downarrow Z} \\ &\geq \I{X : Y \downarrow\downarrow Z} \\ &\geq \I{X : Y \downarrow\downarrow\downarrow Z} \\ &\geq S[X : Y || Z] \\ &\geq \I{X : Y \uparrow\uparrow\uparrow Z} \\ &\geq \I{X : Y \uparrow\uparrow Z} \\ &\geq \I{X : Y \uparrow Z} \\ &\geq 0\end{split}$

## Generalizations

Most of the above bounds have straightforward multivariate generalizations. These are not necessarily bounds on the multiparty secret key agreement rate. For example, one could compute the `minimal_intrinsic_dual_total_correlation()`:

$\B{X_0 : \ldots : X_n \downarrow\downarrow\downarrow Z} = \min_{U} \B{X_0 : \ldots : X_n | U} + \I{X_0, \ldots, X_n : U | Z}$

## Examples

Let us consider a few examples:

```
In : from dit.multivariate.secret_key_agreement import *

In : from dit.example_dists.intrinsic import intrinsic_1, intrinsic_2, intrinsic_3
```


First, we consider the distribution `intrinsic_1`:

```
In : print(intrinsic_1)
Class:          Distribution
Alphabet:       ('0', '1', '2', '3') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       None

x     p(x)
000   1/8
011   1/8
101   1/8
110   1/8
222   1/4
333   1/4
```


With upper bounds:

```
In : upper_intrinsic_mutual_information(intrinsic_1, [[0], [1]], [2])
Out: 0.5
```


We see that the trivial upper bound is 0.5. Without conditioning on $$Z$$, $$X$$ and $$Y$$ can agree whenever they observe either a $$2$$ or a $$3$$; this agreement contributes half a bit to $$\I{X : Y}$$. Given $$Z$$, however, that information is no longer private. But conditioning on $$Z$$ also induces a dependence between $$X$$ and $$Y$$: $$Z$$ knows that if she observes a $$0$$ then $$X$$ and $$Y$$ agree, and if she observes a $$1$$ they disagree. This results in $$\I{X : Y | Z} = 0.5$$. In either case, however, $$X$$ and $$Y$$ cannot agree upon a secret key: in the first case the eavesdropper knows their correlation, while in the second they are actually independent.

The `intrinsic_mutual_information()`, however, can detect this:

```
In : intrinsic_mutual_information(intrinsic_1, [[0], [1]], [2])
Out: -4.440892098500626e-16
```


Next, let’s consider the distribution `intrinsic_2`:

```
In : print(intrinsic_2)
Class:          Distribution
Alphabet:       (('0', '1', '2', '3'), ('0', '1', '2', '3'), ('0', '1'))
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       None

x     p(x)
000   1/8
011   1/8
101   1/8
110   1/8
220   1/4
331   1/4
```


In this case, $$Z$$ can no longer distinguish the case where $$X$$ and $$Y$$ can agree on a secret bit from the case where they cannot, because she cannot determine whether they are in the $$01$$ regime or the $$23$$ regime:

```
In : intrinsic_mutual_information(intrinsic_2, [[0], [1]], [2])
Out: 1.4999999999999998
```


This seems to imply that $$X$$ and $$Y$$ can adopt a scheme such as: if they observe either a $$0$$ or a $$1$$, write down $$0$$; if they observe either a $$2$$ or a $$3$$, write down the observed value. This scheme has a weakness, however: what if $$Z$$ were able to distinguish the two regimes? Distinguishing them costs her $$1$$ bit, but reduces the secrecy of $$X$$ and $$Y$$ to nil. Thus, the secret key agreement rate is actually only $$1$$ bit:

```
In : minimal_intrinsic_mutual_information(intrinsic_2, [[0], [1]], [2], bounds=(3,))
Out: 1.0
```