Thank you for organizing such an interesting Fuzzy Engineering Course.
Attached herewith are the photos of the group as you requested.
Prof. Noor Ainy Harish
Each month in 2012, Taylor & Francis Engineering will be announcing journal and book news and offers from a different subject area. This month, we focus on our Computer Science titles.
Highlights of this Computer Science newsletter include:
Would you like to receive more offers like this?
Do you fancy winning a great selection of books from CRC Press?
This month, we are giving away a selection of CRC Press books.
New to Taylor & Francis for 2012
Taylor & Francis is co-publishing International Journal of Computational Intelligence Systems with Atlantis Press from 2012.
International Journal of Computational Intelligence Systems aims at covering state-of-the-art research and development in all fields where computational intelligence is applied. The journal publishes original papers on foundations and new developments of computational intelligence with an emphasis on applications, including current and potential applications of CI methods and techniques.
The journal seeks original contributions in the area of applied computational intelligence research in general, with a focus on applications using new and emerging technologies. Applications may range from information technology and energy supply to environmental, societal and security related topics.
20% Off Books from CRC Press
For more newsletters such as this, and to be in with a chance of winning great prizes every month in 2012, subscribe here!
To keep up to date with all of our latest Computer Science news, visit our Computer Science Subject News Page.
Browse our full Computer Science journals portfolio online and visit our journals' homepages.
On the journal homepages, you can register for Table of Contents alerts or recommend a title to your library.
You will be able to update your details or unsubscribe at any time. We respect your privacy and will not disclose, rent or sell your email address to any outside organisations.
Copyright © 2012 Taylor & Francis, an Informa Business
Informa plc ("Informa") Registered Office: Mortimer House, 37-41 Mortimer Street, London, W1T 3JH.
Registered in England and Wales - Number 3099067.
Many thanks for your scholarly summary of your Generalized Information Theory (GIT). In your summary, you touch upon the relationship between GIT and my Generalized Theory of Uncertainty (GTU). I should like to comment on this relationship.
As I see it, in large measure GIT and GTU are complementary. The principal difference between GIT and GTU is the following. GIT is concerned, in the main, with measures of information, as is classical information theory. GTU, on the other hand, is concerned, in the main, with the meaning of information. To amplify this difference, a bit of history is in order. I was fortunate to hear Shannon's first lecture on information theory in 1946, in New York. I was a student at Columbia University at the time. I was profoundly impressed by Shannon's exposition of his work. In his talk, Shannon opened the door to a new and fascinating world: the digital world of data, entropy, channels and coding. Shannon was careful to point out that his main concern was communication of information, and not its meaning. In this context, it is clear that what matters is the measure of information and not its content. Subsequently, some attempts were made, notably by Minsky and Bar-Hillel, to develop theories of semantic information. These attempts were not successful.
GIT is a significant generalization of classical information theory, but it stops short of consideration of semantic issues. In contrast, GTU stops short of exploration of issues which relate to measures of information. In the world of human cognition, information is described in a natural language. A basic issue which, to my knowledge, has not been addressed so far is: How much information is contained in a proposition, p, drawn from a natural language? Example. How much information is contained in the proposition: Blood pressure is high? A less simple example. How much information is contained in the proposition: It is probable that blood pressure is high? A significantly more complex example is the proposition: It is very unlikely that there will be a significant decrease in the price of oil in the near future. Clearly, to answer questions of this kind it is necessary to have a means of precisiating the meaning of a proposition drawn from a natural language. GTU provides this means but does not address the issue of measuring the information conveyed by propositions drawn from a natural language. This issue falls within the province of GIT.
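To make the question concrete: one GIT-style measure that could, in principle, be applied to the fuzzy restriction underlying a proposition such as "blood pressure is high" is the U-uncertainty (nonspecificity) of Higashi and Klir, a Hartley-like measure for normal fuzzy sets on a finite universe. The sketch below is illustrative only; the membership vectors fed to it are invented for the example and are not drawn from GTU or GIT.

```python
import math

def u_uncertainty(memberships):
    """U-uncertainty (nonspecificity) of a normal fuzzy set, given its
    membership degrees over a finite universe of discourse.

    With the degrees sorted in descending order as mu_1 >= ... >= mu_n
    and mu_{n+1} = 0, the measure is
        U(A) = sum_i (mu_i - mu_{i+1}) * log2(i).
    For a crisp set of k elements this reduces to the Hartley measure
    log2(k). Assumes the fuzzy set is normal (max membership = 1).
    """
    mu = sorted(memberships, reverse=True) + [0.0]
    return sum((mu[i] - mu[i + 1]) * math.log2(i + 1)
               for i in range(len(mu) - 1))
```

For instance, a crisp restriction to four equally possible values yields log2(4) = 2 bits of nonspecificity, while a sharper fuzzy restriction yields less; in this sense, a more informative proposition induces a more specific restriction.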
In GTU, information is equated to a restriction. In classical information theory, as well as in GIT, information is assumed to be statistical in nature. The definition of information in GTU breaks away from this assumption.
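A minimal sketch of "information as a restriction": the proposition "blood pressure is high" can be precisiated as a fuzzy (possibilistic) restriction on the variable blood pressure, represented by a membership function. The breakpoints 130 and 150 mmHg below are assumptions invented for the example, not values taken from GTU.

```python
def high_bp(x):
    """Membership degree of systolic pressure x (mmHg) in the fuzzy set
    'high', acting as a possibilistic restriction on the variable
    'blood pressure'. The breakpoints 130 and 150 are illustrative
    assumptions only."""
    if x <= 130.0:
        return 0.0
    if x >= 150.0:
        return 1.0
    return (x - 130.0) / 20.0  # linear rise between the breakpoints
```

Under these assumed breakpoints, asserting the proposition restricts the possible values of the variable: 120 mmHg is excluded (degree 0), 140 mmHg is "high" to degree 0.5, and anything at or above 150 mmHg is fully compatible (degree 1).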
GIT is an important generalization of classical information theory. It is a real challenge to extend GIT in a way that would enable GIT to deal with information which is conveyed by propositions drawn from a natural language. Do you have any thoughts relating to this issue?
Lotfi A. Zadeh
Professor in the Graduate School
Director, Berkeley Initiative in Soft Computing (BISC)
729 Soda Hall #1776
Computer Science Division
Department of Electrical Engineering and Computer Sciences
University of California
Berkeley, CA 94720-1776
Tel.(office): (510) 642-4959
Fax (office): (510) 642-1712
Tel.(home): (510) 526-2569
Fax (home): (510) 526-2433
BISC Homepage URLs