Saturday, January 28, 2006
Gitt Info Theory -- A Preliminary Response to T.O
Rich Baldwin's FAQ Concerning Gitt and Information Theory
I am still reading Gitt, but I think I know enough to respond, at least preliminarily, to Baldwin's page on T.O about Gitt. His first argument is this:
A striking contradiction is readily apparent in Gitt's thinking- he holds that his view of information is an extension of Shannon, even while he rejects the underpinnings of Shannon's work. Contrast Gitt's words
(4) No information can exist in purely statistical processes.
and
Theorem 3: Since Shannon's definition of information relates exclusively to the statistical relationship of chains of symbols and completely ignores their semantic aspect, this concept of information is wholly unsuitable for the evaluation of chains of symbols conveying a meaning.
with Shannon's statement in his key 1948 paper, "A Mathematical Theory of Communication"
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.
It becomes very difficult to see how he has provided an extension to Shannon, who purposely modeled information sources as producing random sequences of symbols (see the article Classical Information Theory for further information). It would be more proper to state that Gitt offers at best a restriction of Shannon, and at worst, an outright contradiction.
This is a fundamental misunderstanding of what Gitt is trying to do. Gitt does not disagree with Shannon about what the measure or definition of information is at the statistical level; Gitt is talking about the higher levels of information. In fact, if you read the Shannon quote carefully, you will see that it confirms Gitt's view -- specifically, "These semantic aspects of communication are irrelevant to the engineering problem." Gitt is absolutely correct: the "engineering problem," as Shannon calls it, is wholly inadequate for describing meaning, and Gitt and Shannon are in complete agreement on this point. Gitt's aim is to extend the concept of information so that it can be viewed at more than just the statistical level.
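To make the "statistical level" concrete, here is a small illustration of my own (not from Gitt or Baldwin): Shannon's measure, H = -sum(p_i * log2(p_i)), depends only on symbol frequencies, so a meaningful sentence and a meaningless anagram of it carry exactly the same Shannon information.

```python
# Illustration (mine): Shannon entropy sees only symbol frequencies,
# not meaning -- an anagram scores identically to the original text.
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Bits per symbol, computed from observed symbol frequencies alone."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "to be or not to be"
scrambled  = "ee oo  bnr  ottobt"   # same letters, meaning destroyed
print(shannon_entropy(meaningful) == shannon_entropy(scrambled))  # → True
```

This is exactly the "engineering problem" Shannon limited himself to, and exactly the limitation Gitt wants to move beyond.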
It is strange that Baldwin uses an issue on which Gitt and Shannon agree as grounds for claiming that Gitt contradicts Shannon.
The next point is about randomness:
Gitt allows himself to make guesses about the intelligence and purpose behind a source of a series of symbols, even though he doesn't know whether the source of the symbols is random. Gitt is trying to have it both ways here. He wants to assert that the genome fits his strictly non-random definition of information, even after acknowledging that randomness cannot be proven.
I think the point Gitt was making is that a string of symbols following a strict semantics is evidence that it came from an intelligent source. While he did not prove this, I do not know of any counter-examples, nor did Baldwin even attempt to give any.
Now, using the algorithmic information theory definition of randomness, it is very easy to show that a given sequence of symbols is not random. Baldwin completely misses this point, and seems to assume that because randomness is an undecidable problem, non-randomness must be as well. This is simply false; the two questions are not symmetric. In algorithmic information theory, if you can compress a string of symbols in any fashion, then it is non-random -- so while no finite test can prove a string random, a single successful compression proves it non-random.
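As a sketch of this asymmetry (my illustration, not Gitt's), any off-the-shelf compressor gives a one-sided test: a compressed representation shorter than the original is a witness of non-randomness, while a failure to compress proves nothing. The function name is mine.

```python
# One-sided randomness test (illustrative): a shorter compressed form
# witnesses non-randomness; failure to compress is inconclusive.
import zlib

def provably_nonrandom(data: bytes) -> bool:
    """Return True only when compression witnesses non-randomness."""
    return len(zlib.compress(data, 9)) < len(data)

print(provably_nonrandom(b"ab" * 1000))  # highly patterned → True
```

A truly random byte string would almost certainly fail this test, but that failure would not prove it random -- which is precisely the asymmetry Baldwin overlooks.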
In addition, I think it is fairly obvious that the genome fits a rather strict semantics, and in fact the existence of a discernible code is evidence of this. The fact that scientists can isolate genes means that the genes follow a specific semantics. Other structures, such as regulatory regions, likewise follow semantic rules for their operation. I view semantics as direct evidence of apobetics, and if someone wants to provide a counter-example, I would love to hear it.
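To illustrate what I mean by a discernible code (a sketch of my own, not from Gitt): the genetic code's fixed syntax -- an ATG start codon, a triplet reading frame, and one of three stop codons -- is what lets software mechanically locate gene-like regions at all.

```python
# Illustration (mine): the fixed start/triplet/stop syntax of the
# genetic code lets a simple scan locate an open reading frame.
STOPS = {"TAA", "TAG", "TGA"}

def first_orf(seq: str) -> str:
    """Return the first ATG-to-stop open reading frame, or '' if none."""
    start = seq.find("ATG")
    while start != -1:
        for i in range(start + 3, len(seq) - 2, 3):  # walk in triplets
            if seq[i:i + 3] in STOPS:
                return seq[start:i + 3]
        start = seq.find("ATG", start + 1)
    return ""

print(first_orf("CCATGGCTTGATAA"))  # → ATGGCTTGA
```

Of course this toy scanner vastly simplifies real gene-finding, but the point stands: without fixed syntactic rules there would be nothing for such a procedure to find.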
Gitt describes his principles as "empirical", yet the data is not provided to back this up. Similarly, he proposes fourteen "theorems", yet fails to demonstrate them.
I'm still reading the book, but this is the most accurate claim in the article. However, Baldwin does not provide a counter-example to even a single one of the theorems, or even mention which ones he disagrees with and why. He accuses Gitt of arm-waving, yet he himself does not demonstrate the falsity of any of Gitt's statements, apart from the arguments against theorem 3 already discussed.
I also agree with Baldwin that these results are not empirical. Perhaps this is Gitt misusing the term (his native language is German, I believe); these are logical deductions, not experimental results. I don't think Gitt properly proved his theorems, but I suspect this is largely because the area has previously eluded examination. If Gitt's definition of semantic information is inadequate, what is a better one? The only substantive thing I can draw from Baldwin is that research in this field isn't finished, not that there is anything necessarily incorrect about Gitt's work. This isn't to say his work is completely correct, either -- but it is much more interesting than simply declaring it wrong without providing counter-examples or logical reasons for it to be wrong.
Neither do we see a working measure for meaning (a yet-unsolved problem Shannon wisely avoided). Since Gitt can't define what meaning is sufficiently to measure it, his ideas don't amount to much more than arm-waving.
Gitt agrees that this is a qualitative, not a quantitative, measurement in its current form.
If we use a semantic definition for information, we cannot assume that data found in nature is information. We cannot know a priori that it had an intelligent source. We cannot make the data have semantic meaning or intelligent purpose by simply defining it so.
Perhaps I am missing something, but I perceive the semantic aspects of the genome to be self-evident.
Comments:
"Perhaps I am missing something, but I perceive the semantic aspects of the genome to be self-evident."
Yes, you are missing something.
I saw a talk by Gitt last month. I plan on writing it up soon. He was a decent fellow, but the only people he convinced were those that already agreed with him. He could not/would not answer any questions. It was a fairly pathetic showing, and he is your 'best'...