
# The Large-Key Brainstorm

If one hearkens back to the schemes of previous chapters, such as the Hagelin machine, or the Vernam two-tape system, and allows the use of more key than is needed to achieve a given level of security, one can use the constructs we have met in this chapter to achieve, I believe, a fairly high level of security.

The diagram above illustrates the concept involved.

• Plaintext is encrypted, or ciphertext decrypted, using a block cipher with independent subkeys.
• A separate list of subkey values is used to supply each subkey.
• With each block enciphered, one advances through each of these lists, sometimes by one step, sometimes by two, three, or four steps.
• Another block cipher with a fixed key, operating in a combination of output feedback and counter modes (the output is fed back, but XORed with the value in a counter) supplies the bits which determine stepping through the list of subkeys.
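The stepping mechanism described above might be sketched as follows. Here a hash-based toy function stands in for the fixed-key block cipher, and the class name, key values, and list contents are illustrative placeholders of my own, not part of the design:

```python
import hashlib

def toy_prf(key: bytes, block: bytes) -> bytes:
    # Hypothetical stand-in for the fixed-key block cipher that supplies
    # the stepping bits; a real implementation would use an actual cipher.
    return hashlib.sha256(key + block).digest()[:8]

class SteppingKeySchedule:
    def __init__(self, subkey_lists, prf_key, iv):
        self.lists = subkey_lists        # one list of subkey values per round
        self.positions = [0] * len(subkey_lists)
        self.prf_key = prf_key
        self.state = iv                  # output-feedback register (8 bytes)
        self.counter = 0

    def next_subkeys(self):
        # Combined OFB/counter mode: the previous output is fed back,
        # XORed with the value in a counter.
        ctr = self.counter.to_bytes(8, 'big')
        mixed = bytes(a ^ b for a, b in zip(self.state, ctr))
        self.state = toy_prf(self.prf_key, mixed)
        self.counter += 1
        bits = int.from_bytes(self.state, 'big')
        subkeys = []
        for i, lst in enumerate(self.lists):
            # Two bits per list: advance by one, two, three, or four steps.
            step = ((bits >> (2 * i)) & 0b11) + 1
            self.positions[i] = (self.positions[i] + step) % len(lst)
            subkeys.append(lst[self.positions[i]])
        return subkeys
```

Each call yields one subkey per list, with the advance through every list driven by two bits of the generator's output.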

The diagram shows, however, a scaled-down version of the cipher. A more specific description rests on the following principle: the number of bits used to control the stepping of all the lists must equal or exceed the number of bits in a block. In this way, the stepping alone is enough, or at least nearly enough, to cause any plaintext block to become any possible ciphertext block, making it hard to obtain information about the subkeys in the lists. (In turn, the randomness and secrecy of the list contents make it hard to obtain information about the stepping sequence.)

A specific construction would run like this:

• The main block cipher is DES, modified to have 32 rounds as well as independent subkeys. There are thus 32 lists of subkey values, the stepping of each controlled by two bits, so that 64 bits control the stepping for each 64-bit block, meeting the important criterion noted above.
• A block cipher with a 128-bit block is used to generate the stepping bits; it operates once for every two 64-bit blocks enciphered, its output being used a half at a time. This ensures a very long period for the stepping sequence. As that cipher is not exposed to view, even one with weaknesses with respect to differential cryptanalysis could be used, such as the original LUCIFER cipher.
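The half-at-a-time consumption of the 128-bit generator's output could be outlined like this; the generator itself is a hypothetical hash-based stand-in for the actual 128-bit cipher (such as LUCIFER), and all names are illustrative:

```python
import hashlib
import itertools

def make_toy_prf128(key: bytes):
    # Hypothetical stand-in for the 128-bit block cipher used as a
    # keystream generator; returns 16 bytes per invocation.
    counter = itertools.count()
    def prf128():
        return hashlib.sha256(key + next(counter).to_bytes(8, 'big')).digest()[:16]
    return prf128

def stepping_bits(prf128):
    # 64 stepping bits per 64-bit plaintext block: the 128-bit generator
    # runs once for every two blocks, its output used a half at a time.
    while True:
        out = prf128()
        yield out[:8]   # stepping bits for one block (32 lists x 2 bits)
        yield out[8:]   # stepping bits for the next block
```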

With a large key, key management is a problem. The key comes in four main parts:

• The bytes in all the lists of subkey values. Also, the lengths of these lists can be part of the key.
• The key for the block cipher used to generate stepping values.
• The initial value, and the counter value, for the generation of stepping bits.
• The starting positions within each of the lists of subkey values.

The last item on the list is only needed as part of the key if one is going to start up the encipherment operation several times with the same lists of subkeys, but it is reasonable to do so, since that part of the key is very large and difficult to set up.

The first two items can be regarded as semi-permanent, while the last two should be changed with every message. However, the last two must be kept secret rather than sent in the clear; otherwise, attacks on the contents of the lists of subkey values can become possible if there is enough traffic for the same values of the first two items in the key.

With a key too large to send inside a public-key block, and such a high level of security as to be wasted by any practical method of key distribution, perhaps I have only solved the problem of cryptanalysis for the case when another solution to that problem already exists! However, one could use a one-time-pad to encipher the key distribution keys, and use this cipher with those keys for key distribution. That would avoid too great a loss of security.

The long-term portions of the key need to be couriered, of course, combined with encrypting them by some other, weaker, method. Sending two different couriers, by two different routes, with an XOR-split key may be appropriate here.
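Splitting a key by XOR between two couriers is simple to state precisely. In this sketch, one share is uniformly random, so either share alone reveals nothing about the key:

```python
import secrets

def xor_split(key: bytes):
    # Split a key into two shares for two couriers travelling by
    # different routes; each share alone is uniformly random.
    share1 = secrets.token_bytes(len(key))
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def xor_join(share1: bytes, share2: bytes) -> bytes:
    # Recombining the shares recovers the original key.
    return bytes(a ^ b for a, b in zip(share1, share2))
```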

Although the design above already provides far more security than required for any practical purpose against the kind of computing power that exists today, the possibility that a practical quantum computer might someday be constructed may be felt to be worthy of consideration. If the obstacles to constructing a quantum computer can be overcome, such a computer could essentially try all the possible keys to a block cipher at once, in a circuit the size of the conventional circuit used to apply a single key, and output the value of the key for the one case where some known plaintext is matched.

To protect against a threat like that, one would like a design that requires an inordinate amount of known plaintext before any facts about the key can be derived with certainty. A design like the one we have just seen is perhaps a step towards that kind of cipher. It can certainly be elaborated further, with yet another level of indirection, in which a second block cipher produces the 64 bits that control the subkeys of the block cipher used to encipher the plaintext. Further on, a diagram will show that type of design; there, however, instead of enciphering plaintext, the final block cipher will encipher previous ciphertext to produce a 64-bit output used to encipher a single byte of plaintext.

Perhaps the second block cipher might be 32-round Blowfish, or even a modified Blowfish changed to use 48-bit subkeys and an expansion permutation along these lines:

```
23 32  9 18  1  2  3  4  5  6  7  8   31  8 17 26  9 10 11 12 13 14 15 16
 7 16 25  2 17 18 19 20 21 22 23 24   15 24  1 10 25 26 27 28 29 30 31 32
```

so that the four S-boxes can each have 4,096 entries. To keep the time required to generate the S-boxes within some semblance of reason, though, I propose to use just 4-round Blowfish, rather than 32-round or 16-round Blowfish, for filling them.
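Assuming the DES convention of numbering bits from 1 at the most significant end of the half-block (an assumption on my part, not stated in the text), applying this expansion to obtain four 12-bit S-box indices might look like:

```python
# The modified-Blowfish expansion permutation from the text: 48 entries,
# each naming a bit position (1..32) of the 32-bit half-block.
E = [23, 32,  9, 18,  1,  2,  3,  4,  5,  6,  7,  8,
     31,  8, 17, 26,  9, 10, 11, 12, 13, 14, 15, 16,
      7, 16, 25,  2, 17, 18, 19, 20, 21, 22, 23, 24,
     15, 24,  1, 10, 25, 26, 27, 28, 29, 30, 31, 32]

def expand(half: int) -> list[int]:
    # Select the 48 bits named by E (bit 1 = most significant, as in DES)
    # and group them into four 12-bit indices, one per 4,096-entry S-box.
    bits = [(half >> (32 - pos)) & 1 for pos in E]
    return [int(''.join(map(str, bits[i:i + 12])), 2)
            for i in range(0, 48, 12)]
```

Each index addresses one of the four enlarged S-boxes of 4,096 entries.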

Since I am going to extremes, instead of LUCIFER, let's let the 128-bit block cipher be the 40-round variation of Quadibloc II, using one of the more complicated variant round types. The stream cipher can consist of the XOR of the output of an expanded version of Gifford's cipher with the output of an elaborate MacLaren-Marsaglia construct.

Another preventive measure would be to use an enciphered random initialization vector to modify the key in use for each block of the message, with message blocks always kept shorter than the size of the key, so that there is never enough known plaintext to attack any one key. Of course, the relationship between different message blocks and their initialization vectors can still be attacked; this device has been tried with simple ciphers and has not made them invincible. Applied to a sufficiently complex cipher, however, the gain in difficulty from that trick may, just possibly, be enough to provide some resistance to quantum computer attack.

This can be taken still further in the direction of wretched excess, in order to obtain increased security on the principle of an elephant giving birth to a gnat, by making use of the cipher-feedback principle of stream cipher design:

Here, we do not gain the advantages of limited error-propagation that a pure cipher-feedback design can offer; in fact, we not only have a large internal state, but previous ciphertext even influences that state, for the messy worst case of the autokey.

To encipher a single byte of the plaintext, we use the preceding twenty-four bytes of the ciphertext as input to the process. Sixteen of those bytes are XORed with the output of the initial stream cipher, which might as well be something elaborate, such as Panama or my modification of it; the result is then passed through a 128-bit block cipher.

Half of the result controls the subkeys for a 64-bit block cipher which enciphers the other half of the result.

The output of that cipher then controls the subkeys for another 64-bit block cipher which operates on the remaining eight bytes of preceding ciphertext.

And finally, that eight-byte output is applied to the one byte of plaintext being enciphered, being alternately added and XORed to it a byte at a time through eight layers of substitution.
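The byte-at-a-time process just described can be outlined as follows. Toy hash-based functions stand in for the actual block ciphers, and the fixed keys and substitution table are illustrative placeholders only (a real design would use eight independent secret S-boxes):

```python
import hashlib

def toy_block(key: bytes, block: bytes) -> bytes:
    # Hypothetical stand-in for a keyed block cipher over |block| bytes.
    return hashlib.sha256(key + block).digest()[:len(block)]

# Placeholder substitution layer; 167 is odd, so this is a bijection on bytes.
SBOX = [(i * 167 + 13) % 256 for i in range(256)]

def encipher_byte(prev_ct: bytes, keystream16: bytes, pt_byte: int) -> int:
    assert len(prev_ct) == 24 and len(keystream16) == 16
    # Sixteen bytes of preceding ciphertext, XORed with the stream cipher
    # output, go through a 128-bit block cipher.
    mixed = bytes(a ^ b for a, b in zip(prev_ct[:16], keystream16))
    out128 = toy_block(b'K1', mixed)        # b'K1': hypothetical fixed key
    # Half of the result keys a 64-bit cipher enciphering the other half.
    out64 = toy_block(out128[:8], out128[8:])
    # That output keys another 64-bit cipher applied to the remaining
    # eight bytes of preceding ciphertext.
    final8 = toy_block(out64, prev_ct[16:])
    # The eight bytes are alternately added and XORed to the plaintext
    # byte, one at a time, through eight layers of substitution.
    b = pt_byte
    for i, k in enumerate(final8):
        b = (b + k) % 256 if i % 2 == 0 else b ^ k
        b = SBOX[b]
    return b
```

Since every layer is a bijection on the byte, distinct plaintext bytes always give distinct ciphertext bytes for the same preceding ciphertext.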

Here we are: a truly secure symmetric-key cipher! And so it is, but it is outrageously excessive and wasteful. But perhaps the schematic diagram above will brighten a cubicle at the NSA and give some of the people there a chuckle.

Unless, of course, somebody whose traffic they would have liked to decrypt actually implements it.

But, as noted, even the fact that this design is likely to produce (once the details are filled in) a genuinely secure cipher is not, in itself, sufficient to mean it is suitable for practical use: a rough estimate is that enciphering a message by this method would take 64 times as long as enciphering it with DES. Presumably, genuine security can be obtained at a somewhat lesser cost in computational resources.

Although the design is wildly impractical in the specific form shown, some things should be noted before abandoning it completely:

• A scaled-down version of this type of design not only could be practical, but may even have actually been used. Such a version would not use full-scale block ciphers as its components, and would omit the somewhat gratuitous use of previous ciphertext (sixteen bytes, in the diagram above) to affect the very beginning of the encipherment process, and thus large portions of the internal state, with the attendant consequences for error propagation.
• Also, assuming that a cipher of this kind did not evade the scrutiny of a quantum computer due to its sheer size, some initial states, and consequently some keys, would trivially produce the same ciphertext from the same plaintext, since some portions of the subkey pools for the various block cipher stages might go unused. However, such trivial duplicate keys will not foil a quantum computer attack, since the program could be modified to say "don't think you're the right answer and collapse the wave function unless all unused parts of the key are zero", thereby restoring uniqueness. By enciphering only a single byte at a time, however, some possibility of nontrivial duplicate keys is created. And certainly the difficulty of obtaining analytical insights that would allow a reduction of effort over that of a brute-force search is increased.
• Finally, it should also be noted that despite the complexity of this kind of cipher, one limitation was retained. Although each block cipher stage is supplied with a subkey from a pool, at each step only four of the elements in that pool can be used. Ideally, one would like to use any possible subkey, and to prevent using the same one twice, use the principle behind the MacLaren-Marsaglia random number generator, and produce a replacement for each subkey after it is used once. However, that means that if the final block cipher has 16 rounds and 16 subkeys in each pool (each one, say, 32 bits wide) then the previous stage, instead of producing 64 bits each time to support a 32-round block cipher, must produce 64 bits to select subkeys, and 512 bits for use in replacement subkeys.
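The replacement principle mentioned in the last point is easy to sketch in isolation; the class name and values here are illustrative only:

```python
class ReplacingSubkeyPool:
    # A pool of subkeys in which, on the MacLaren-Marsaglia principle,
    # each subkey drawn is immediately replaced by fresh key material,
    # so any possible subkey can appear and none need be used twice.
    def __init__(self, pool):
        self.pool = list(pool)

    def draw(self, index, replacement):
        # 'index' comes from the selection bits produced by the previous
        # stage; 'replacement' is the fresh material that stage must also
        # produce, which is the extra cost noted in the text.
        chosen = self.pool[index]
        self.pool[index] = replacement
        return chosen
```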

### The Aryabharata Cipher, and Two-Timing Pads

An article in Cryptologia entitled The Aryabharata Cipher discussed the following idea for a cipher:

To encipher a message, generate a random series of letters as long as the message. Encipher the message with that series of letters by means of the Vigenère cipher.

Then, send the random series of letters in enciphered form, and also send the enciphered message, itself enciphered a second time.

Since both the pad and the message are sent, the absolute security of the one-time-pad is not obtained. But the pad and the message can each be enciphered in different ways, and because both are random, the cryptanalyst can only make progress by working on both together.
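The basic scheme might be expressed as follows, with the two conventional ciphers left as placeholder functions to be supplied by the user:

```python
import secrets
import string

A = string.ascii_uppercase

def vigenere(text: str, pad: str, sign: int) -> str:
    # Vigenère with sign +1 to encipher, -1 to decipher.
    return ''.join(A[(A.index(t) + sign * A.index(p)) % 26]
                   for t, p in zip(text, pad))

def aryabharata_encipher(message: str, cipher1, cipher2):
    # Generate a random pad as long as the message, encipher the message
    # with it by Vigenère, then send both pieces, each further enciphered
    # by a different conventional cipher (placeholders here).
    pad = ''.join(secrets.choice(A) for _ in message)
    enciphered = vigenere(message, pad, +1)
    return cipher1(pad), cipher2(enciphered)
```

The receiver inverts cipher1 and cipher2, then deciphers with Vigenère using the recovered pad.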

Perhaps attacking this would be more difficult than cracking a simple double encryption in which the two ciphers, instead of being applied to the two pieces separately, were applied in sequence to the message itself. And perhaps not; it is hard for me to say for certain.

But it does seem as though forcing the cryptanalyst to relate different messages in order to crack a cipher creates a problem, although this happens anyway with modern ciphers of any difficulty. Thus, the scheme is of interest even if there is some question about whether this is the method referred to in the Indian classic.

Thus, I came up with the following scheme, which, in a different way, prevents progress from being made by attacking a single message on its own, and which also shows an alternative way of making effective use of a large key.

The following is a schematic diagram of how the scheme operates:

The steps involved in the encryption are described below:

• Two parties wishing to communicate share a secret key which is 12,000 bytes long.
• When sending a message, they use a public-key block to establish 2,048 bits of key, which provides up to four session keys of 512 bits each.
• Messages consist of message segments, which are limited to a maximum of 4,096 bytes in length.
• To encipher a message segment:
  • First, one operates on the 12,000 byte secret key to obtain key material for use with that message segment:
    • The 12,000 byte secret key is subjected to a transposition, governed by 128 bits of the session key.
    • It is then enciphered in some fashion that results in propagation of the encryption, such as a block cipher in CBC mode, using another 128 bits of the session key.
    • The last few bytes of the result could then be used as a key to perform additional encryption. One possibility is another transposition; since quite a bit of key is potentially available, one could even perform a transposition, then a propagating encryption, then another transposition. Ending with a transposition seems to have nice properties. A specific possibility is as follows:
      • The enciphered secret key is divided into two parts: 10,240 bytes to be further scrambled, and 1,760 bytes to be used as the key for that scrambling.
      • First, all 10,240 bytes of the first part are transposed.
      • Then, they are subjected to a block cipher in CBC mode.
      • Then, the first 9,216 bytes of the first part are transposed, leaving the last 1,024 bytes unaffected. (This ensures that the keys later used to encipher the 128-bit session-key portions are produced by an XOR that includes the most thoroughly propagated part of the scrambled secret key.)
    • The output of that encryption is then divided into two halves, which are XORed together. This ensures that it is difficult to derive any part of the long secret key from the bits to be used later.
    • The result of the XOR is divided into a part that is 4,096 bits long, and two small parts that will be used as keys.
    • The two small parts of the scrambled long secret key are each used to encrypt one 128 bit portion of the session key, producing two encrypted keys to be used in encrypting the message segment.
  • Then, one encrypts the message segment itself as follows:
    • First, the message segment is encrypted in ECB mode with a block cipher, using the first encrypted 128 bit part of the session key.
    • Then, the message is XORed with the 4,096 bit part of the scrambled long secret key.
    • Finally, the message segment is encrypted in CFB mode with a block cipher, using the second encrypted 128 bit part of the session key.
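The data flow of these steps can be sketched in outline. Here a single hash-based keystream stands in for the transpositions, the CBC scrambling, and the ECB and CFB passes, and the encryption of the session-key portions is modelled by XOR, so only the structure of the scheme, not its error-propagation behaviour, is captured; all function names are my own:

```python
import hashlib

def toy_stream(key: bytes, n: int) -> bytes:
    # Stand-in keystream; a real implementation would use the block
    # cipher modes named in the text.
    out, c = b'', 0
    while len(out) < n:
        out += hashlib.sha256(key + c.to_bytes(4, 'big')).digest()
        c += 1
    return out[:n]

def xorb(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def derive_segment_keys(secret_12000: bytes, session_key: bytes):
    # Scramble the 12,000-byte secret key under the session key (the
    # transposition and CBC stages are modelled by one keyed pass), then
    # fold the result in half with XOR so that no part of the long key
    # can be recovered from the material used later.
    assert len(secret_12000) == 12000 and len(session_key) >= 64
    scrambled = xorb(secret_12000, toy_stream(session_key[:32], 12000))
    folded = xorb(scrambled[:6000], scrambled[6000:])
    pad = folded[:512]                               # 4,096 bits to XOR in
    # Two small parts "encrypt" 128-bit session-key portions (modelled by XOR).
    k1 = xorb(folded[512:528], session_key[32:48])   # keys the ECB pass
    k2 = xorb(folded[528:544], session_key[48:64])   # keys the CFB pass
    return pad, k1, k2

def encipher_segment(segment: bytes, pad: bytes, k1: bytes, k2: bytes) -> bytes:
    # ECB pass, XOR with the folded key material, then CFB pass, each
    # block-cipher pass modelled here by a toy keyed stream.
    assert len(segment) <= len(pad)  # sketch keeps the segment within the pad
    step1 = xorb(segment, toy_stream(k1, len(segment)))
    step2 = xorb(step1, pad[:len(segment)])
    return xorb(step2, toy_stream(k2, len(segment)))
```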

The specific possibility for the additional encryption noted in the steps above is shown explicitly in this expanded version of the diagram:

Although it is based on a different principle, this scheme uses the same concept as the Aryabharata cipher: that it is more difficult for a cryptanalyst to correlate multiple messages than to solve a single one directly. Of course, the possibility of performing such correlations is exactly what allows modern ciphers to be attacked, through such things as differential cryptanalysis. As I propose using block ciphers currently considered secure here, I hope this scheme at least compounds the problem such correlations pose for the attacker.

Note, too, that the message segment itself is encrypted first in ECB mode, to ensure that it goes through the block cipher; then by XOR with bits that are a function of part of the session key and of the long secret key; and then in CFB mode. Good error-propagation characteristics are thus retained for the message segment.

In addition to error-propagation, the modes used have another benefit. Since the encryption of the large shared secret key can be performed ahead of time, given a time lag between receiving session keys and using them, and sufficient processing power, this method is not limited to off-line uses such as E-mail. It could be used on a continuous basis, even for a digital voice transmission or a similar application, since the plaintext need only be delayed by the simple operations directly performed on it.