From Citizendium
Revision as of 18:56, 3 November 2007 by imported>Subpagination Bot (Add {{subpages}} and remove any categories (details))

The objective of information theory is to determine the informative content of a given object or event and to obtain a better measure of that content. The word “information” derives from the Latin informare, which means “to give form to”, that is, to impose a structure on an indefinite set of facts. Since information is neither matter nor energy, it should not be treated as an absolute quantity; rather, like entropy, it must be considered in relative terms. The expression “information quantity” is a metaphor without any physical or numerical property. Information isolated from a system and taken out of context has no meaning; it is contextualization that brings information to life. Information is transformed into knowledge only when it has a practical use; without this, it remains mere abstract data, physical or otherwise.

According to Shannon and Weaver (1949), information can be measured as negative entropy, that is, as a measure of the order in a system. Information can thus be seen as a form of data processing concerning objects, events or persons, which has meaning for a receiver only if the resulting increase in knowledge reduces uncertainty in decision-making processes. Langford’s equation defines information according to the following formula:

I = i (D, S, t)

In this formula, I represents the information obtained by a given interpretative mechanism i, which in turn acts on a data set D, relative to a previous knowledge state S, during a period of time t.
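Langford’s equation is definitional rather than computational, but its shape can be sketched as a function: an interpretive mechanism i takes a data set D, a prior knowledge state S, and a time window t, and yields the information extracted. The sets and the choice of set difference below are illustrative assumptions, not part of Langford’s formulation.

```python
# Hypothetical sketch of I = i(D, S, t): the interpreter i extracts
# information from data D relative to prior knowledge S over time t.
def i(D: set, S: set, t: float) -> set:
    """Toy interpreter: 'information' is whatever in D is new relative to S."""
    # Data already present in the knowledge state contributes nothing new.
    return D - S

prior_knowledge = {"it rains rarely", "penguins live in Antarctica"}
data = {"it rains rarely", "the ground is moist"}

print(i(data, prior_knowledge, t=1.0))  # only the novel datum remains
```

The point of the sketch is that I depends on S: the same data set D yields different information for receivers with different prior knowledge states.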

Intuitively, the amount of information gained by observing a single datum is a function of the distribution of values in D: if you live in the desert and it seldom rains, then observing that the sun is shining provides you with less information than would observing that the ground is moist. This is sometimes thought of as the "surprise factor": observations which are surprising provide more information than observations that are not surprising. To make this more concrete, consider the implications of discovering indigenous penguins in a temperate climate. We already know that penguins can be found in Antarctica and in the region of the Southern Ocean, so observing additional penguins (of any species) in that area doesn't tell us nearly so much as would the (not so theoretical) discovery of a penguin in a temperate region.
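The "surprise factor" above corresponds to self-information (surprisal), −log₂ p, measured in bits: the rarer an event, the more bits its observation conveys. The desert probabilities below are illustrative figures, not data from the source.

```python
import math

def self_information(p: float) -> float:
    """Self-information (surprisal) of an event with probability p, in bits."""
    return -math.log2(p)

# In a desert, sunshine is common and moist ground is rare (assumed figures).
p_sun = 0.95
p_moist = 0.05

print(self_information(p_sun))    # ~0.07 bits: unsurprising, little information
print(self_information(p_moist))  # ~4.32 bits: surprising, much information
```

A certain event (p = 1) carries zero bits, and the information grows without bound as p approaches zero, matching the penguin example: a sighting where penguins are already expected is nearly worthless, while one in a temperate region would be highly informative.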

Whatever the definition, information is an invisible agent that acts as a binding element in all processes of decision-making, regulation and control. In an economically and technically dynamic situation, information plays the fundamental role of supporting efficient decision making (Pinheiro, 1986). Information must be screened, condensed, stored, transmitted, received, aggregated and integrated. All these actions rest on the fact that, despite being unlimited, information for human consumption can be organized in only a limited number of forms (Senn, 1989; Blethyn, 1990).

Organization is the result of the interaction of information with matter and energy. When information is added to matter, the matter always exhibits some kind of structure or organization; in other words, organization can be viewed as stored information. Beyond organizing matter and energy, information is also able to structure itself. The more organized a process or structure is, the less information is needed to describe it fully. Conversely, disorganization is always associated with an increase in the entropy of a system (Stonier, 1990). The more disorganized a system is, the more information can potentially be extracted from it, since there is no certainty that it does not contain hidden patterns (Rodrigues, 1991).
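The link between organization and description length can be illustrated with Shannon entropy, H = −Σ p log₂ p: an organized system concentrated in one state needs few bits to describe, while a maximally disorganized (uniform) system needs the most. The two distributions below are illustrative assumptions, not examples from the cited authors.

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p), in bits; 0*log(0) is taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A highly organized system: one state dominates, a short description suffices.
organized = [0.97, 0.01, 0.01, 0.01]
# A disorganized system: all four states equally likely, maximum entropy.
disorganized = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(organized))     # ~0.24 bits
print(shannon_entropy(disorganized))  # 2.0 bits (the maximum for 4 states)
```

This makes the text's claim concrete in both directions: the organized distribution is almost fully described by naming its dominant state, while the uniform one leaves the most residual uncertainty, and hence the most potential information to extract.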