Saturday, 24 March 2012

Zadeh response to Klir Panel on Uncertainty

From: Ronald R. Yager <yager@panix.com>


Hi Lotfi:
  In your response to George Klir you raise some interesting issues. However, I would like to ask what you mean by "how much" information is contained in a proposition, p, drawn from a natural language, and about your example concerning the amount of information in the statement "Blood pressure is high."
How is what you have in mind different from the concept of specificity?

 I feel that the measure of information should be context dependent. Perhaps, rather than asking for a measure of information, the more relevant question is how useful the knowledge I have is in the context in which I am trying to use it. Here, in the spirit of GTU, the concept of a "usefulness measure" of the knowledge should be more closely related to the idea of containment. Thus, if I know that some variable is restricted by V is F, and if I have a task that requires knowing whether it is restricted by V is E, then it appears to me that the amount of information I have is related to my ability to distinguish between the truth of V is E and V is not E.
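A minimal sketch (in Python, purely illustrative) of the kind of comparison just described, assuming finite domains and possibilistic restrictions. The specificity function follows the linear specificity measure, and possibility/certainty are the standard sup-min indices; the sets E and F below are made-up examples, not taken from the correspondence.

def specificity(mu):
    # Linear specificity of a fuzzy restriction given as membership grades
    # over a finite domain: largest grade minus the average of the rest.
    grades = sorted(mu, reverse=True)
    rest = grades[1:]
    return grades[0] - (sum(rest) / len(rest) if rest else 0.0)

def possibility(E, F):
    # Poss(V is E | V is F) = max over x of min(E(x), F(x)).
    return max(min(e, f) for e, f in zip(E, F))

def certainty(E, F):
    # Cert(V is E | V is F) = 1 - Poss(V is not E | V is F).
    return 1.0 - max(min(1.0 - e, f) for e, f in zip(E, F))

F = [0.0, 0.3, 1.0, 0.7, 0.2]   # what is known: V is F
E = [0.0, 0.0, 1.0, 1.0, 0.5]   # what the task asks about: V is E

print("Sp(F) =", specificity(F))
print("Poss(V is E | V is F) =", possibility(E, F))
print("Cert(V is E | V is F) =", certainty(E, F))

The gap between the possibility and certainty values is one way to quantify how well the restriction F lets one distinguish V is E from V is not E.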



Regards
r r yager





*******************************


Dear George:

    Many thanks for your scholarly summary of your Generalized Information Theory (GIT). In your summary, you touch upon the relationship between GIT and my Generalized Theory of Uncertainty (GTU). I should like to comment on this relationship.

    As I see it, in large measure GIT and GTU are complementary. The principal difference between GIT and GTU is the following. GIT is concerned, in the main, with measures of information, as is classical information theory. On the other hand, GTU is concerned, in the main, with the meaning of information. To amplify this difference, a bit of history is in order. I was fortunate to hear Shannon's first lecture on information theory in 1946, in New York. I was a student at Columbia University at the time. I was profoundly impressed by Shannon's exposition of his work. In his talk, Shannon opened the door to a new and fascinating world -- the digital world of data, entropy, channels and coding. Shannon was careful to point out that his main concern was communication of information, and not its meaning. In this context, it is clear that what matters is the measure of information and not its content. Subsequently, some attempts were made, notably by Minsky and Bar-Hillel, to develop theories of semantic information. These attempts were not successful.
    GIT is a significant generalization of classical information theory, but it stops short of consideration of semantic issues. In contrast, GTU stops short of exploration of issues which relate to measures of information. In the world of human cognition, information is described in a natural language. A basic issue which, to my knowledge, has not been addressed so far is: How much information is contained in a proposition, p, drawn from a natural language? Example. How much information is contained in the proposition: Blood pressure is high? A less simple example. How much information is contained in the proposition: It is probable that blood pressure is high? A significantly more complex example is the proposition: It is very unlikely that there will be a significant decrease in the price of oil in the near future. Clearly, to answer questions of this kind it is necessary to have a means of precisiating the meaning of a proposition drawn from a natural language. GTU provides this means but does not address the issue of measuring the information conveyed by propositions drawn from a natural language. This issue falls within the province of GIT.
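A hedged sketch of the bridge in question: precisiate "Blood pressure is high" as a possibilistic restriction in the GTU manner, then apply a Hartley-like nonspecificity (U-uncertainty) measure of the kind studied in GIT. The membership function, domain, and discretization below are illustrative assumptions, not a standard.

import math

def mu_high(x):
    # Illustrative membership grade of "high" for systolic pressure in mmHg.
    if x <= 120:
        return 0.0
    if x >= 140:
        return 1.0
    return (x - 120) / 20.0

domain = list(range(90, 201))          # discretized systolic range
A = [mu_high(x) for x in domain]       # the possibilistic restriction

def nonspecificity(A, levels=100):
    # Discrete approximation of the U-uncertainty of a fuzzy set:
    # U(A) = (1/h) * integral over alpha in (0, h] of log2 |A_alpha| d alpha,
    # where h is the height of A and A_alpha its alpha-cut.
    h = max(A)
    total = 0.0
    for i in range(1, levels + 1):
        alpha = h * i / levels
        cut = sum(1 for a in A if a >= alpha)
        total += math.log2(cut)
    return total / levels

print("U('Blood pressure is high') ~", round(nonspecificity(A), 3), "bits")

On this reading, a crisper restriction (a smaller alpha-cut at every level) carries more information in the Hartley-like sense; how to extend such measures to propositions with probability qualifiers, as in the later examples, remains the open question raised here.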

    In GTU, information is equated to a restriction. In classical information theory, as well as in GIT, information is assumed to be statistical in nature. The definition of information in GTU breaks away from this assumption.
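A minimal illustration of this contrast, with made-up numbers: the same variable restricted statistically by a probability distribution, whose information content is measured by Shannon entropy, versus possibilistically by a fuzzy restriction of the kind GTU works with, for which no comparably settled measure is assumed here.

import math

outcomes = ["low", "normal", "high"]

# Statistical restriction: a probability distribution over the outcomes.
p = [0.1, 0.3, 0.6]
entropy = -sum(pi * math.log2(pi) for pi in p if pi > 0)

# GTU-style restriction: a possibility distribution induced by the
# proposition "Blood pressure is high" (grades are illustrative).
poss = [0.0, 0.4, 1.0]

print("Shannon entropy of p:", round(entropy, 3), "bits")
print("Possibilistic restriction:", dict(zip(outcomes, poss)))

The second restriction conveys information without being statistical; attaching a measure to it is what the preceding paragraph asks of GIT.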

    GIT is an important generalization of classical information theory. It is a real challenge to extend GIT in a way that would enable GIT to deal with information which is conveyed by propositions drawn from a natural language. Do you have any thoughts relating to this issue?

                              Sincerely,

                              Lotfi
--
Lotfi A. Zadeh
Professor in the Graduate School
Director, Berkeley Initiative in Soft Computing (BISC)

Address:
729 Soda Hall #1776
Computer Science Division
Department of Electrical Engineering and Computer Sciences
University of California
Berkeley, CA 94720-1776
zadeh@eecs.berkeley.edu
Tel.(office): (510) 642-4959
Fax (office): (510) 642-1712
Tel.(home): (510) 526-2569
Fax (home): (510) 526-2433
URL:
http://www.cs.berkeley.edu/~zadeh/

BISC Homepage URLs
URL:
http://zadeh.cs.berkeley.edu/

