
"Flavors of Security Through Obscurity"

This was posted not too long ago on sci.crypt... Enjoy... I think the most
relevant information is near the top, but it's all quite good... :-)


There is no intrinsic difference between algorithm and data: the
same information can be viewed as data in one context and as
algorithm in another. Why then do so many people claim that
encryption algorithms should be made public and only the key
should be kept secret? (This is the famous and derisive mantra
about "security through obscurity".) There are several answers:

a) Some people, with little understanding of computer
technology, try to keep secret what their programs do, even
though the programs themselves are public. A program *is* a
representation of the algorithm, even though it happens to be
more difficult for humans to read than, say, a detailed
description in English. Actually it is a very good idea to keep
the algorithm secret (in all its representations), as long as you
can afford to do so. That is why major governments do exactly
that.

b) One can memorize a key and keep it secret in one's head.
Normally, encryption algorithms are too complicated to be
memorized. Therefore it is easier to keep secret a key than an
algorithm.

c) Most people and organizations do not have sufficient expertise
to create a new and good encryption algorithm and then try to
keep it secret. A bad encryption algorithm, in this context, is
an algorithm that can be broken by a sophisticated attacker even
without knowledge about the algorithm itself.

As you see, the reasons are of a practical nature, and are not
derived from any fundamentals in cryptology. If we could find a
practical way to keep secret both the key (that is the data the
encryption method operates on) and also the method itself (or at
least part of the method), then security would be greatly
enhanced because the attacker would have less knowledge to work
with.

I believe there are several ways to overcome these practical
obstacles:

a) Machine generated secret ciphers.

Today there are only a few encryption algorithms that are
generally accepted as good. But suppose there existed a generator
program which could construct a new encryption algorithm
depending on some random input. Actually, the generator program
would produce another program which would then be used as the
encryption software. In some important cases, it is feasible to
keep secret the resulting program: International organizations
could distribute disks containing the program using trusted
persons, the program could be loaded in centralized servers which
actually operate from within a safe, or maybe the program (in
encrypted form) would be run only from a floppy disk which would
be handled with the same care as the key to a safe. We all know
that absolute security is impossible. What I am suggesting here
is that in many cases this system of security is better than one
using a standardized and public algorithm which attracts a lot of
cryptanalytic work and may be broken in the near future or may
have already been broken in secret.
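
The generator idea can be sketched in a few lines. The following toy
Python generator is my own illustration, not a real proposal: it turns
a random seed into a fresh byte-substitution cipher. A real generator
would of course emit something far stronger, but the structure
(randomness in, secret algorithm out) is the same:

```python
import random

def generate_cipher(seed):
    """Turn random input into a (toy) secret cipher: a byte
    substitution table derived from the seed. Illustration only;
    a real generator would emit a much stronger algorithm."""
    rng = random.Random(seed)
    sbox = list(range(256))
    rng.shuffle(sbox)              # the generated "algorithm"
    inv = [0] * 256
    for i, v in enumerate(sbox):
        inv[v] = i                 # inverse table for decryption

    def encrypt(data: bytes) -> bytes:
        return bytes(sbox[b] for b in data)

    def decrypt(data: bytes) -> bytes:
        return bytes(inv[b] for b in data)

    return encrypt, decrypt

enc, dec = generate_cipher(123456789)
ct = enc(b"attack at dawn")
assert dec(ct) == b"attack at dawn"
```

The generated tables, not the generator itself, are what must be kept
secret; anyone holding the seed or the tables holds the "algorithm".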

b) Intrinsically secret ciphers.

Extend secrecy to parts of the encryption method. In his book,
Schneier very briefly describes a variant of DES where the S-boxes
(which most people would consider part of the algorithm) are
variable and depend on the key. Another very interesting
possibility would have the key express the encryption method. In
other words consider the key as the program, and the cipher
simply as an interpreter, that follows the key's instructions to
scramble the plaintext or unscramble the ciphertext. This would
call for large keys, but not larger than the keys used in public-key
cryptography.

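The key-as-program idea can be sketched with a toy Python interpreter
(my own construction, not something from the post): each pair of key
bytes is one instruction, an opcode selecting one of three scrambling
operations plus an operand. Decryption simply runs the key-program
backwards with each operation inverted. The opcodes and the block
representation are arbitrary choices of this sketch:

```python
def interpret(key: bytes, block: list, decrypt: bool = False) -> list:
    """Treat the key as a program: each (opcode, operand) byte pair is
    one instruction that scrambles the block."""
    steps = [(key[i], key[i + 1]) for i in range(0, len(key) - 1, 2)]
    if decrypt:
        steps.reverse()                    # run the program backwards
    out = list(block)
    for op, arg in steps:
        if op % 3 == 0:                    # XOR every byte with operand
            out = [b ^ arg for b in out]
        elif op % 3 == 1:                  # add (or subtract) operand
            d = -arg if decrypt else arg
            out = [(b + d) % 256 for b in out]
        else:                              # rotate the whole block
            r = arg % len(out)
            if decrypt:
                r = -r
            out = out[r:] + out[:r]
    return out

key = bytes([7, 0x5A, 4, 17, 2, 3])        # a 6-byte "program"
pt = [1, 2, 3, 4, 5, 6, 7, 8]
ct = interpret(key, pt)
assert interpret(key, ct, decrypt=True) == pt
```

Here the interpreter is fixed and public; all the secrecy, including
the choice and order of operations, lives in the key.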
c) "Variable" ciphers.

The idea here is to implement a cipher that incorporates a huge
number of different encryption functions. The objective is to
overwhelm the analytic capability of an attacker. (At the end of
this post you will find the outline of a proof about why a cipher
of this type is intrinsically more secure.)

My GodSave encryption algorithm belongs to this class of
"variable" ciphers. It makes extensive use of key data to change
the control flow of the program execution. In other words,
whereas most algorithms just operate on the key, GodSave uses
information in the key to decide how to operate. Even better: It
is constructed in such a way that large sections of its code can
be modified by a programmer without affecting the security of the
algorithm, offering some of the advantages described under a) above.

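The key-driven control flow described above can be illustrated with a
small sketch (mine, not GodSave's actual code): each key byte decides
which scrambling routine runs next. Note that any routine body could
be swapped by a programmer for another permutation of the block
without touching the dispatch structure, which is the property just
described:

```python
def scramble(key: bytes, data: list) -> list:
    """Key bytes steer control flow: each byte selects *which* routine
    runs next. The three routines are placeholders; each could be
    replaced by any other permutation of the block without changing
    the dispatch structure."""
    def swap_halves(d):
        h = len(d) // 2
        return d[h:] + d[:h]

    def reverse(d):
        return d[::-1]

    def rotate_left(d):
        return d[1:] + d[:1]

    routines = [swap_halves, reverse, rotate_left]
    for b in key:
        data = routines[b % len(routines)](data)   # the key decides *how*
    return data

out = scramble(bytes([0, 1, 2]), [1, 2, 3, 4])
assert out == [1, 4, 3, 2]
```
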
Here is the definition of another cipher of this type (let us
call it "heavy DES"): Start by randomly defining 16K DES keys;
you need less than 1 MB of space on your hard disk to save them.
Suppose that this large key set is public (either by choice or
because an attacker gained access to your computer and stole it).
So now you have a large set of DES ciphers with *known* keys; the
effort to break any one of them is 1. Now define a secret key of
140 bits. Use 14 bits at a time to index one of the 16K DES
functions. Encrypt a 64 bit block by sequentially chaining the 10
indexed DES functions. DES is not a group, therefore each
instance of the 140 bits long key results in a different mapping
of the plaintext space into the ciphertext space. If we choose
from 2^N DES functions and chain P of them together (in the
example above N=14 and P=10) then there are 2^(N*P) different
instances. With 140 bits of entropy alone, a brute-force attack
is out of the question no matter how many hardware DES machines
you have. Suppose you have perfect cryptanalytic
knowledge of DES - trapdoor and all - even then, can you see a way
to attack this variable version?
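
A runnable sketch of "heavy DES" follows. Since no DES implementation
is assumed to be available here, a toy 4-round Feistel cipher stands
in for DES; everything else (the public table of 16K keys, the
140-bit secret selecting and chaining 10 instances) follows the
construction above:

```python
import hashlib
import random

def feistel(key: bytes, block: bytes, decrypt: bool = False) -> bytes:
    """Toy 4-round Feistel cipher on 64-bit blocks, standing in for
    DES (which the construction actually calls for)."""
    subkeys = [hashlib.sha256(key + bytes([i])).digest()[:4]
               for i in range(4)]
    if decrypt:
        subkeys.reverse()
    L, R = block[:4], block[4:]
    if decrypt:
        L, R = R, L
    for k in subkeys:
        F = hashlib.sha256(R + k).digest()[:4]
        L, R = R, bytes(a ^ b for a, b in zip(L, F))
    if decrypt:
        L, R = R, L
    return L + R

def heavy(secret: int, table, block: bytes, decrypt: bool = False) -> bytes:
    """Chain 10 table ciphers; 14 bits of the 140-bit secret pick each."""
    order = reversed(range(10)) if decrypt else range(10)
    for i in order:
        idx = (secret >> (14 * i)) & 0x3FFF   # 14-bit index into 16K keys
        block = feistel(table[idx], block, decrypt)
    return block

rng = random.Random(0)
table = [rng.randbytes(8) for _ in range(1 << 14)]  # 16K *public* keys
secret = rng.getrandbits(140)                       # the 140-bit secret
ct = heavy(secret, table, b"8 bytes.")
assert heavy(secret, table, ct, decrypt=True) == b"8 bytes."
```

Even with the whole table public, an attacker who cannot break the
underlying block cipher analytically is left guessing which of the
2^140 chains was used.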

Finally, let me try to demonstrate why a "variable" cipher is
more difficult to break:

Take two ciphers A and B with keys of N bits. These ciphers must
be independent in the sense that physically breaking one does not
help in breaking the other. (By "physical break" I mean the work
necessary to recover the key when the cipher is already
cryptanalyzed; a "logical break" would be the work necessary to
successfully cryptanalyze a cipher.) Let us suppose that these
ciphers are not perfect; and therefore there exists a method
(known or unknown) to physically break them that is more
efficient than brute force, i.e. trying all possible keys.
(Observe that no publicly known cipher with a key that is used
repeatedly has been proved to be perfect in this sense.) For
ciphers A and B there exists then a method to break them with as
few as 2^(N*k) operations where k<1 (2^N corresponds to brute
forcing them). If you increase the key length by 1 bit, then you
would need 2^((N+1)*k)=2^N * 2^k operations to break A or B. But
if you create a new cipher C with a key of N+1 bits where the
last bit is used to choose either A or B as the working cipher
with a key of N bits, then you must break A, and with 50%
probability B also, with an effort comparable to
2^(N*k)+0.5*2^(N*k)=3/2 * 2^(N*k). Therefore you need more effort
to break C with a key of N+1 bits, than either A or B with a key
of N+1 bits, as long as k is less than ln(3/2)/ln(2) ≈ 0.58. If
instead of two ciphers, you started with 2^M different ciphers,
then the results are more dramatic. The effort required for
breaking the resulting cipher is now 2^(N*k-1)*(2^M+1) and it
will be stronger as long as k < 1/M*(ln(2^M+1)/ln(2) -1) or for
large M: k < 1 - 1/M. This works like a security amplifier: if you
can construct 1024 independent ciphers then by this method you
can produce a cipher that has a 10 bit longer key and is provably
512 times more secure than any one of them (in the sense that an
attacker must invest 512 times more effort to break it).
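
The arithmetic above can be checked directly. The values N=64, k=0.5
and M=10 below are illustrative assumptions of mine, not figures from
the post:

```python
from math import log2

N, k = 64, 0.5                             # key length, cryptanalytic exponent
effort = lambda n: 2 ** (n * k)            # effort to break an n-bit-key cipher

# Cipher C: one selector bit chooses A or B, so break A, then B half the time.
effort_C = 1.5 * effort(N)                 # 3/2 * 2^(N*k)
# Versus simply adding one bit to A's key:
effort_longer = effort(N + 1)              # 2^k * 2^(N*k)
assert (effort_C > effort_longer) == (k < log2(1.5))   # threshold ~0.58

# With 2^M independent ciphers the expected effort is 2^(N*k-1) * (2^M + 1).
M = 10
effort_amp = 2 ** (N * k - 1) * (2 ** M + 1)
gain = effort_amp / effort(N)
assert gain == (2 ** M + 1) / 2            # = 512.5, the "512 times" claim
assert k < (log2(2 ** M + 1) - 1) / M      # amplifier condition, ~1 - 1/M
```
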



Do not meddle in the affairs of Dragons     |  Brandon Matthews - SysAdmin
 For you are crunchy and good with mustard. |  <bmatt devils eng fsu edu>
           -- Anonymous                     |  (850) 668-2677
 finger bmatt devils eng fsu edu for PGP public key
