From: tim@maths.tcd.ie (Timothy Murphy)
Newsgroups: sci.math
Subject: Re: Kolmogorov Complexity (again)
Date: 11 Jan 1999 17:52:16 -0000
Keywords: Kolmogorov Complexity as measure of information content
graham_fyffe@hotmail.com writes:
>Can someone explain why many people use Kolmogorov Complexity as an absolute
>measure of information content?
>Usually, people say "Kolmogorov Complexity is an absolute intrinsic quantity in
>the sense that for any two languages, the KCs of one object differ by at most
>a finite constant".
>Would somebody please give pointers to a _proof_ that the latter implies the
>former? Or, exactly what is meant by "in the sense" and what use it is? It
>seems to me that this statement is analogous to "I like apples in the sense
>that I like oranges".
The statement seems pretty clear to me,
which is more than can be said for your question.
What exactly is "former" and "latter"?
As I understand it, Kolmogorov complexity (or entropy)
is defined relative to a universal Turing machine U.
If you choose a different universal machine V
then there is a constant C = C(U,V)
such that the two values for the complexity differ by less than C:
| H_U(s) - H_V(s) | \le C for all s.
This is all that is meant by the statement you quote.
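For concreteness, here is a toy illustration (in Python) of the bookkeeping
behind that inequality. Neither U nor V below is a universal machine -- they
are tiny made-up "machines" invented purely for illustration -- but U can
simulate V at the cost of a fixed 2-bit prefix, which is exactly the mechanism
that produces the constant C(U,V) in the real theorem.

```python
from itertools import product

def V(p):
    # toy machine V: a program is a bit string, and V outputs it doubled
    return p + p

def U(p):
    # toy machine U: programs starting with the 2-bit prefix "11" are
    # handed to V (U "simulates" V); all other programs output themselves
    if p.startswith("11"):
        return V(p[2:])
    return p

def K(M, s, max_len=12):
    # brute-force shortest-program search: the complexity of s wrt M
    for n in range(max_len + 1):
        for bits in product("01", repeat=n):
            if M("".join(bits)) == s:
                return n
    return None  # no program of length <= max_len produces s

# U can do anything V does at the cost of its 2-bit prefix, so
# K_U(s) <= K_V(s) + 2 whenever K_V(s) is defined:
for s in ["0101", "111111"]:
    assert K(U, s) <= K(V, s) + 2
```

The brute-force search is only feasible because these machines are trivial;
for a genuine universal machine the same search runs into the halting problem.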
Incidentally, has it ever been proved that there does not exist
an absolute Kolmogorov complexity (up to an absolute constant C),
ie a universal machine U such that
H_V(s) \ge H_U(s) - C
for all s and all universal machines V?
In other words, has it been shown that one cannot find a "best"
universal computer U?
--
Timothy Murphy
e-mail: tim@maths.tcd.ie
tel: +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
==============================================================================
From: Chris Hillman
Subject: Re: Kolmogorov Complexity (again)
Date: Sat, 16 Jan 1999 16:55:51 -0800
Newsgroups: sci.math
On Tue, 12 Jan 1999 graham_fyffe@hotmail.com wrote:
> Okay, what I'm up in arms about is this: The KC of an object can
> occur as any positive value, and hasn't even got a discernible
> expected value. I certainly wouldn't use the word "quantity" to
> describe something like that.
I think there might be some confusion here, since it is, strictly
speaking, incorrect to speak of "the" Kolmogorov complexity of an
"object". Presumably you mean the algorithmic complexity (or
Kolmogorov-Chaitin-Solomonoff) complexity of a finite string, taken with
respect to a fixed universal Turing machine U.
The complexity (wrt U) of a given finite string x is certainly well
defined; you can usually even prove upper and lower bounds for the
complexity of a particular x. What you -cannot- do is to provide an
algorithm which -always- computes the precise value of the complexity of x
(wrt U), since this would be tantamount to solving the halting problem.
The article by Martin Davis in Mathematics Today, ed. by Lynn Arthur
Steen, gives a particularly clear account of this connection.
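One half of that asymmetry is easy to see concretely: *upper* bounds on the
complexity are computable, since any fixed compressor is a particular
description method. A small Python sketch, using zlib purely as a stand-in
for "some fixed decoding program the machine U can simulate":

```python
import os
import zlib

def complexity_upper_bound(s: bytes) -> int:
    # The length of any compressed encoding of s is an upper bound on
    # the complexity of s, up to the constant cost of a fixed decompressor.
    return len(zlib.compress(s, 9))

regular = b"ab" * 1000     # highly patterned, 2000 bytes
noisy = os.urandom(2000)   # incompressible with high probability

print(complexity_upper_bound(regular))  # far below 2000
print(complexity_upper_bound(noisy))    # roughly 2000
```

Lower bounds are the hard direction: no algorithm can certify that *no*
shorter program exists, which is the halting-problem connection.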
Also, when you say "expected value", I have a hunch you -don't- mean what
that what that phrase would usually connote in mathematics, namely the
expected value of some quantity taken wrt to a probability measure on some
measure space (in this case, some space of finite strings). Indeed, as I
pointed out in a previous post, it is most certainly possible to prove
statistical statements about the complexity; see for instance the very
readable textbook Elements of Information Theory, by Cover & Thomas, which
provides some very simple connections between Shannonian information
theory (which involves probability measures) and algorithmic complexity.
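To make the Shannon connection concrete, here is a small Python computation.
The empirical first-order entropy of a string is a statistical quantity; a
compressor's output length is (up to a constant) an upper bound on its
algorithmic complexity. For a periodic string the two pull apart, which is
exactly the point: complexity sees structure that symbol frequencies don't.

```python
import math
import zlib
from collections import Counter

def empirical_entropy_bits(s: bytes) -> float:
    # Shannon entropy, in bits per symbol, of the empirical
    # symbol distribution of s
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

s = b"aab" * 500   # 1500 symbols: 'a' with frequency 2/3, 'b' with 1/3

print(empirical_entropy_bits(s))               # log2(3) - 2/3, about 0.918
print(8 * len(zlib.compress(s, 9)) / len(s))   # far below 0.918: s is periodic
```

For strings whose symbols really are drawn i.i.d., the expected complexity
per symbol approaches the source entropy -- that is the theorem the textbook
proves; the code above only illustrates the one-sided inequality.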
> If there is a good explanation, please let me know. This has been bugging me
> for quite some time.
I'm not sure what is bugging you, but hopefully my comments above help.
Chris Hillman
==============================================================================
From: Chris Hillman
Subject: Re: Kolmogorov Complexity (again)
Date: Mon, 18 Jan 1999 11:32:04 -0800
Newsgroups: sci.math
On 18 Jan 1999, Tal Kubo wrote:
> Chris Hillman wrote:
> >
> >The complexity (wrt U) of a given finite string x is certainly well
> >defined; you can usually even proof upper and lower bounds for the
> >complexity of a particular x.
>
> Lower bounds above some fixed ceiling are never provable, for any
> particular x. Roughly, if ZFC has a description of length 500,
> it cannot prove that War And Peace has complexity > 800. The
> extra 300 is just the complexity of some fixed program converting
> finite program runs to ZFC proofs.
Whoops! I know better than that :-( Indeed, this is Chaitin's version of
Goedel's theorem.
Thanks for the correction!
Chris Hillman
==============================================================================
From: kubo@adams.math.brown.edu (Tal Kubo)
Subject: Re: Kolmogorov Complexity (again)
Date: 18 Jan 1999 17:03:04 -0500
Newsgroups: sci.math
Chris Hillman wrote:
[as above -- djr]
Actually I was waffling a bit myself. Here is a way to see why
the theorem should be true:
Consider a program P that searches for inconsistencies in ZFC,
and if it finds one at the N'th proof, prints out the f(N)'th possible
string and stops. (Say, f(N)=number of primes dividing N, maybe plus some
random noise from the decimal expansion of Pi. What matters is that
f(N) assumes all values infinitely often, and is sufficiently chaotic.)
ZFC will not be able to disprove, for any given string x,
that P describes x. It can't prove that *no* x is an output (Goedel),
and as long as f(N) is uncorrelated with the contents of the N'th
proof, it is no easier to prove that some particular x isn't an
output. So for any particular string x, ZFC can't prove that its
complexity exceeds |P|.
This is not a proof, but assuming that ZFC is not related to primes
or Pi in some mysterious way, it is clear that this type of construction
will work.
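The number-theoretic ingredient in the construction is easy to pin down.
Reading "number of primes dividing N" as the number of distinct prime
divisors, a minimal Python version (the surrounding proof-searching loop is,
of course, not something one can actually run):

```python
def f(N):
    # number of distinct prime divisors of N -- takes each value k >= 1
    # infinitely often (e.g. at products of k distinct primes)
    count, d, n = 0, 2, N
    while d * d <= n:
        if n % d == 0:
            count += 1
            while n % d == 0:
                n //= d
        d += 1
    if n > 1:
        count += 1
    return count

# the program P would run: for N = 1, 2, 3, ...: check the N'th proof;
# on finding an inconsistency, print the f(N)'th string and stop.
```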
==============================================================================
From: kramsay@aol.com (KRamsay)
Subject: Re: Kolmogorov Complexity (again)
Date: 19 Jan 1999 05:37:42 GMT
Newsgroups: sci.math
Alternatively, an idea which can be made into a rigorous proof goes
like this. Write a program which searches for a proof in ZFC whose
conclusion is that some given string 's' cannot be generated by any
program of length less than n, where n is larger than the length of
the searching program plus a fixed constant. If the search succeeds,
the program prints s and halts -- but then s has been generated by a
short program, contradicting the very statement that was proved. So,
assuming ZFC proves only true statements, the search never succeeds,
and ZFC proves no such lower bound.
==============================================================================
From: Olivier Bousquet <obousque@email.enst.fr>
Subject: Kolmogorov Complexity Resources
Date: Wed, 10 Feb 1999 11:55:54 +0000
Newsgroups: comp.theory,comp.ai,sci.math,comp.compression,sci.crypt
Kolmogorov Complexity Resources
http://www.stud.enst.fr/~obousque/kolmogorov.html
I am building a new page with resources about Kolmogorov Complexity
(a.k.a. Chaitin Complexity, Algorithmic Complexity...), including
tutorials, homepages, conferences, on-line papers....
Please visit and send any comments/additions to:
obousque@email.enst.fr
Enjoy !
Olivier Bousquet
http://www.stud.enst.fr/~obousque/
[HTML deleted -- djr]
==============================================================================
From: rcktexas@ix.netcom.com (R. Knauer)
Subject: Re: Kolmogorov Complexity Resources
Date: Wed, 10 Feb 1999 14:11:32 GMT
Newsgroups: comp.theory,comp.ai,sci.math,comp.compression,sci.crypt
On Wed, 10 Feb 1999 11:55:54 +0000, Olivier Bousquet
wrote:
[above post -- djr]
Greg Chaitin has a second web site that you did not list:
http://www.umcs.maine.edu/~chaitin/
Bob Knauer
"It is not a matter of what is true that counts, but a matter of
what is perceived to be true."
--Henry Kissinger