ETE49E - Information systems
Ing. Tomáš Rain, Ph.D.
Outline
head2right Information
head2right Theory of information
head2right History of information theory
head2right Mathematical theory of information
head2right Mutual information (transinformation)
head2right Continuous equivalents of entropy
head2right Channel capacity
head2right Source theory
head2right Classification of information
head2right Characteristics of information
head2right Information resources
Information
head2right Information is the fundamental unit of communication in every information and management system.
head2right A message can be regarded as information if it reduces the uncertainty of the recipient's knowledge.
head2right The property of "being information" is relative – it depends on the particular system and the particular person.
Defining information 1/2
head2right No universal definition of "information" itself has proved possible yet.
head2right Depending on the context, different phenomena get called "information." Three kinds of phenomena are commonly referred to by the word:
checkbld information as a cognitive process;
checkbld information as knowledge imparted;
checkbld signifying objects (data, documents, and the like).
head2right Further, the word "information" is commonly used so metaphorically or so abstractly that the meaning is unclear.
Defining information 2/2
head2right Rather than being important in itself, information becomes
so because of its relationship to knowledge.
head2right As Francis Bacon observed in 1597: "Nam et ipsa scientia potestas est" – knowledge itself is power.
head2right He did not say "information is power." Knowledge is power because "Scientia et potentia humana in idem coincidunt, quia ignoratio causae destituit effectum." (Human knowledge and human power meet in one, because where the cause is not known the effect cannot be produced.)
head2right Knowledge is empowering. Information, then, can be
indirectly empowering to the extent to which knowledge is
derived from it.
Theory of information
head2right Information theory is a field of mathematics concerning the
storage and transmission of data and includes the fundamental
concepts of source coding and channel coding.
head2right These topics are rigorously addressed using mathematics
introduced by Claude Shannon in 1948.
head2right His papers spawned the field of information theory, which goes
beyond the above questions to extended and combined problems
such as those in network information theory and related problems
including portfolio theory and cryptography.
head2right The impact of information theory has been crucial to the success
of the
checkbld Voyager missions to deep space
checkbld the invention of the CD
checkbld the feasibility of mobile phones
checkbld the development of the Internet and broadband Internet access
checkbld the analysis of DNA, and numerous other fields.
Theory of information
head2right The central paradigm of classic information theory is the
engineering problem of the transmission of information
over a noisy channel.
head2right The most fundamental results of this theory are Shannon's
source coding theorem, which establishes that on average
the number of bits needed to represent the result of an
uncertain event is given by the entropy.
head2right Shannon's noisy-channel coding theorem, which states that
reliable communication is possible over noisy channels
provided that the rate of communication is below a certain
threshold called the channel capacity.
head2right The channel capacity is achieved with appropriate
encoding and decoding systems.
History of information theory
head2right The decisive event which established the subject of
information theory, and brought it to immediate worldwide
attention, was the publication of Claude E. Shannon's
(1916–2001) classic paper "A Mathematical Theory of
Communication".
head2right Main ideas:
checkbld qualitative and quantitative model of communication as a
statistical process
checkbld information entropy and redundancy
checkbld relevance through the source coding theorem
checkbld the mutual information
checkbld channel capacity
Before 1948
head2right The most direct antecedents of Shannon's work were
checkbld Harry Nyquist and
checkbld Ralph Hartley;
head2right both were still very much research leaders at Bell Labs when Shannon arrived there in the early 1940s.
Nyquist’s 1924 paper
head2right Nyquist's paper "Certain Factors Affecting Telegraph Speed" is mostly concerned with some detailed engineering aspects of telegraph signals.
head2right But a more theoretical section discusses quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m, where
head2right W is the speed of transmission of intelligence,
head2right m is the number of different voltage levels to choose from at each time step, and
head2right K is a constant.
Hartley's 1928 paper called
„Transmission of Information“
head2right Main ideas
checkbld introducing the word information
checkbld making explicitly clear the idea that information in this
context was a measurable quantity
checkbld reflecting only that the receiver was able to distinguish that
one sequence of symbols had been sent rather than any other
quite regardless of any associated meaning or other
psychological or semantic aspect the symbols might
represent
head2right Hartley quantified the information in a transmission as H = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission.
head2right The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit, scale, or measure of information.
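As a quick illustration of Hartley's measure (a minimal sketch; the decimal base reflects his choice of the decimal digit as the unit, and base 2 gives bits):

```python
import math

def hartley_information(num_symbols: int, message_length: int, base: float = 10.0) -> float:
    """Hartley's measure H = n * log(S): information in a message of
    `message_length` symbols drawn from an alphabet of `num_symbols`."""
    return message_length * math.log(num_symbols, base)

# A 5-digit decimal number carries 5 hartleys; in bits use base 2.
print(hartley_information(10, 5))           # 5.0 hartleys
print(hartley_information(10, 5, base=2))   # ~16.6 bits
```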
Entropy in statistical mechanics
head2right Ludwig Boltzmann had, in the context of his H-theorem of 1872, first introduced the H-function as a measure of the breadth of the spread of states available to a single particle in a gas of like particles, where f represented the relative frequency distribution of each possible state.
head2right Boltzmann argued mathematically that the effect of collisions
between the particles would cause the H-function to inevitably
increase from any initial configuration until equilibrium was
reached; and identified it as an underlying microscopic rationale
for the macroscopic thermodynamic entropy of Clausius.
head2right The theorem relies on a hidden assumption, that useful information
is destroyed by the collisions, which can be questioned; also, it
relies on a non-equilibrium state being singled out as the initial
state (not the final state), which breaks time symmetry.
Entropy in statistical mechanics
head2right Boltzmann's definition was soon reworked by the American mathematical physicist J. Willard Gibbs into a general formula for the statistical-mechanical entropy, no longer requiring identical and non-interacting particles, but instead based on the probability distribution p_i for the complete microstate i of the total system: S = -k_B Σ_i p_i ln p_i.
head2right This (Gibbs) entropy from statistical mechanics can be found to correspond directly to Clausius's classical thermodynamic definition, as explored further in the article "Thermodynamic entropy."
Development since 1948
head2right The publication of Shannon's 1948 paper, „A
Mathematical Theory of Communication“, in the
Bell System Technical Journal was the founding
of information theory as we know it today.
head2right Many developments and applications of the
theory have taken place since then, which have
made many modern devices for data
communication and storage such as CD-ROMs
and mobile phones possible.
Mathematical theory of information
head2right Shannon defined a measure of information content called the self-information or surprisal of a message m: I(m) = -log p(m) = log(1/p(m)),
head2right where p(m) = Pr(M = m) is the probability that message m is chosen from all possible choices in the message space M.
head2right This equation causes messages with lower probabilities to
contribute more to the overall value of I(m).
head2right In other words, infrequently occurring messages are more
valuable.
head2right For example, if John says "See you later, honey" to his wife every
morning before leaving for the office, that information holds little
"content" or "value". But, if he shouts "Get lost" at his wife one
morning, then that message holds more value or content (because,
supposedly, the probability of him choosing that message is very
low).
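A minimal sketch in Python of the self-information calculation (the probabilities are made-up illustrative values):

```python
import math

def self_information(p: float) -> float:
    """Self-information I(m) = -log2 p(m), in bits."""
    return -math.log2(p)

# Routine greeting vs. an unlikely outburst (illustrative probabilities).
print(self_information(0.99))   # ~0.014 bits: almost no surprise
print(self_information(0.001))  # ~9.97 bits: highly surprising
```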
Entropy
head2right The entropy of a discrete message space M is a measure of the amount of uncertainty one has about which message will be chosen. It is defined as the average self-information of a message m from that message space: H(M) = E[I(m)] = -Σ_m p(m) log p(m).
head2right The logarithm in the formula is usually taken to base 2, and
entropy is measured in bits.
head2right An important property of entropy is that it is maximized when all the messages in the message space are equiprobable. In this case H(M) = log |M|.
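A short sketch computing the entropy of a discrete distribution and checking that it peaks when the messages are equiprobable:

```python
import math

def entropy(probs):
    """H(M) = -sum p * log2 p over messages with nonzero probability."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits = log2(4), the maximum
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits, less uncertain
```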
Joint entropy
head2right The joint entropy of two discrete random variables X and Y is defined as the entropy of the joint distribution of X and Y: H(X,Y) = -Σ_{x,y} p(x,y) log p(x,y).
head2right If X and Y are independent, then the joint entropy is simply
the sum of their individual entropies.
head2right Note: The joint entropy is not to be confused with the cross
entropy, despite similar notation.
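An illustrative sketch (made-up marginals) showing that for independent X and Y the joint entropy is the sum of the individual entropies:

```python
import math

def entropy_from_probs(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [0.5, 0.5]          # fair coin X
py = [0.25, 0.25, 0.5]   # three-valued Y
joint = [a * b for a in px for b in py]  # independence: p(x,y) = p(x) * p(y)

print(entropy_from_probs(joint))                         # 2.5 bits
print(entropy_from_probs(px) + entropy_from_probs(py))   # 2.5 bits
```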
Conditional entropy (equivocation)
head2right Given a particular value of a random variable Y, the conditional entropy of X given Y = y is defined as H(X|Y=y) = -Σ_x p(x|y) log p(x|y),
head2right where p(x|y) is the conditional probability of x given y.
head2right The conditional entropy of X given Y, also called the equivocation of X about Y, is then given by H(X|Y) = Σ_y p(y) H(X|Y=y) = -Σ_{x,y} p(x,y) log p(x|y).
head2right A basic property of the conditional entropy is that H(X|Y) = H(X,Y) - H(Y).
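A small sketch (a toy joint distribution chosen only for illustration) verifying the identity H(X|Y) = H(X,Y) - H(Y):

```python
import math

# Toy joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

H = lambda probs: -sum(p * math.log2(p) for p in probs if p > 0)

H_xy = H(pxy.values())
H_y = H([pxy[(0, 0)] + pxy[(1, 0)], pxy[(0, 1)] + pxy[(1, 1)]])
H_x_given_y = -sum(
    p * math.log2(p / (pxy[(0, y)] + pxy[(1, y)])) for (x, y), p in pxy.items()
)

print(round(H_x_given_y, 6) == round(H_xy - H_y, 6))  # True
```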
Mutual information
head2right It turns out that one of the most useful and important
measures of information is the mutual information, or
transinformation.
head2right This is a measure of how much information can be
obtained about one random variable by observing another.
head2right The mutual information of X relative to Y (which represents conceptually the average amount of information about X that can be gained by observing Y) is given by I(X;Y) = Σ_{x,y} p(x,y) log [ p(x,y) / (p(x) p(y)) ] = H(X) - H(X|Y).
head2right Mutual information is closely related to the log-likelihood
ratio test in the context of contingency tables and the
Multinomial distribution and to Pearson's χ2 test.
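A hedged sketch computing the transinformation of the same toy joint distribution used above, via the equivalent identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [pxy[(0, 0)] + pxy[(0, 1)], pxy[(1, 0)] + pxy[(1, 1)]]
py = [pxy[(0, 0)] + pxy[(1, 0)], pxy[(0, 1)] + pxy[(1, 1)]]
mutual_information = H(px) + H(py) - H(pxy.values())

print(mutual_information)  # ~0.125 bits of information about X gained from Y
```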
Continuous equivalents of entropy
head2right Shannon information is appropriate for measuring uncertainty over
a discrete space.
head2right Its basic measures have been extended by analogy to continuous
spaces.
head2right The sums can be replaced with integrals and densities are used in
place of probability mass functions.
head2right By analogy with the discrete case, entropy, joint entropy, conditional entropy, and mutual information can be defined as follows:
checkbld h(X) = -∫ f(x) log f(x) dx
checkbld h(X,Y) = -∫∫ f(x,y) log f(x,y) dx dy
checkbld h(X|Y) = -∫∫ f(x,y) log f(x|y) dx dy
checkbld I(X;Y) = ∫∫ f(x,y) log [ f(x,y) / (f(x) f(y)) ] dx dy
head2right where f(x,y) is the joint density function, f(x) and f(y) are the marginal distributions, and f(x|y) is the conditional distribution.
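As a numerical illustration (a sketch only; the Gaussian density and the grid are chosen for the example), the differential entropy of a normal distribution computed by discretizing the integral matches the closed form 0.5·log2(2πeσ²):

```python
import math
import numpy as np

sigma = 2.0
x = np.linspace(-10 * sigma, 10 * sigma, 200_001)
dx = x[1] - x[0]
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

h_numeric = -np.sum(f * np.log2(f)) * dx                   # -∫ f log2 f dx
h_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma**2)

print(h_numeric, h_closed)  # both ~3.047 bits
```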
Channel capacity
head2right Consider the communications process over a discrete channel. It is helpful to have a simple model of the process: transmitted message X → noisy channel p(y|x) → received message Y.
head2right Here X represents the space of messages transmitted, and Y the
space of messages received during a unit time over our channel.
head2right Let p(y | x) be the conditional probability distribution function
of Y given X. We will consider p(y | x) to be an inherent fixed
property of our communications channel (representing the
nature of the noise of our channel).
head2right Then the joint distribution of X and Y is completely determined
by our channel and by our choice of f(x), the marginal
distribution of messages we choose to send over the channel.
Channel capacity
head2right Under these constraints, we would like to maximize the
amount of information, or the signal, we can communicate
over the channel.
head2right The appropriate measure for this is the transinformation, and this maximum transinformation is called the channel capacity and is given by C = max_{f(x)} I(X;Y).
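A minimal sketch computing the capacity of a binary symmetric channel by maximizing the mutual information over input distributions (brute-force grid search; the crossover probability is illustrative):

```python
import math
import numpy as np

def mutual_information(p_input: float, crossover: float) -> float:
    """I(X;Y) for a binary symmetric channel with P(X=1) = p_input."""
    px = np.array([1 - p_input, p_input])
    p_y_given_x = np.array([[1 - crossover, crossover],
                            [crossover, 1 - crossover]])
    pxy = px[:, None] * p_y_given_x              # joint p(x,y)
    py = pxy.sum(axis=0)
    return float(sum(p * math.log2(p / (px[i] * py[j]))
                     for (i, j), p in np.ndenumerate(pxy) if p > 0))

crossover = 0.1
capacity = max(mutual_information(p, crossover) for p in np.linspace(0.01, 0.99, 99))
print(capacity)                                             # ~0.531 bits per channel use
print(1 - (-0.1 * math.log2(0.1) - 0.9 * math.log2(0.9)))   # closed form 1 - H2(0.1)
```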
Source theory
head2right Any process that generates successive messages can be
considered a source of information.
head2right Sources can be classified in order of increasing generality as
checkbld memoryless,
checkbld ergodic,
checkbld stationary,
checkbld and stochastic
checkbld (with each class strictly containing the previous one).
head2right The term „memoryless“ as used here has a slightly
different meaning than it normally does in probability
theory.
head2right Here a memoryless source is defined as one that generates
successive messages independently of one another and
with a fixed probability distribution.
Source theory
head2right However, the position of the first occurrence of a particular
message or symbol in a sequence generated by a
memoryless source is actually a memoryless random
variable.
head2right The other terms have fairly standard definitions and are
actually well studied in their own right outside information
theory.
head2right The rate of a source of information is (in the most general case) the expected, or average, conditional entropy per message (i.e. per unit time) given all the previous messages generated: r = lim_{n→∞} H(X_n | X_{n-1}, ..., X_1).
head2right It is common in information theory to speak of the "rate"
or "entropy" of a language. This is appropriate, for
example, when the source of information is English prose.
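A rough sketch estimating the per-symbol entropy of a sample of English prose (this treats the text as a memoryless source of characters, which is an assumption, not the full conditional-entropy rate):

```python
import math
from collections import Counter

def per_symbol_entropy(text: str) -> float:
    """Entropy in bits per character, using empirical symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = "it is common in information theory to speak of the rate or entropy of a language"
print(per_symbol_entropy(sample))  # roughly 4 bits/character for English-like text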
Fundamental theorem
Statement (noisy-channel coding theorem)
head2right 1. For every discrete memoryless channel, the channel capacity C = max_{p(x)} I(X;Y) has the following property: for any ε > 0 and any rate R < C, there exists a coding and decoding scheme of sufficiently large block length whose maximal probability of block error is at most ε.
head2right 2. If a probability of bit error pb is acceptable, rates up to R(pb) are achievable, where R(pb) = C / (1 - H2(pb)) and H2 is the binary entropy function.
head2right 3. For any pb, rates greater than R(pb) are not achievable.
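A small sketch (illustrative capacity and bit-error values) computing the achievable rate R(pb) = C / (1 - H2(pb)) from statement 2:

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def achievable_rate(capacity: float, bit_error: float) -> float:
    """R(pb) = C / (1 - H2(pb)): rate achievable if bit-error pb is tolerated."""
    return capacity / (1 - binary_entropy(bit_error))

print(achievable_rate(0.5, 0.0))    # 0.5: no errors tolerated, rate = capacity
print(achievable_rate(0.5, 0.05))   # ~0.70: tolerating 5% bit errors buys extra rate
```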
Channel capacity of particular model channel
head2right A continuous-time analog communications channel subject to Gaussian noise has capacity C = B log2(1 + S/N) bits per second, where B is the bandwidth in hertz and S/N is the signal-to-noise ratio (the Shannon–Hartley theorem).
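A one-line sketch applying the Shannon–Hartley formula (the bandwidth and SNR values are made up for illustration):

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz voice channel at 30 dB SNR (S/N = 1000).
print(shannon_hartley_capacity(3000, 1000))  # ~29,900 bits per second
```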
Classification of information
head2right Chronological viewpoint
checkbld Identification (cognitive) information
checkbld Model information
checkbld Normative information
head2right Functional viewpoint
checkbld Economic information
checkbld Scientific information
checkbld Cultural information
checkbld Political information
head2right Stage of processing viewpoint
checkbld Primary information
checkbld Secondary (resulting) information
Characteristics of information
head2right Information must be
checkbld Complete
checkbld Timely
checkbld Correct
checkbld Intelligible
Information resources
head2right Server Wikipedia: http://en.wikipedia.org/wiki/Information_theory#Overview, 29. 4. 2006
head2right R.V.L. Hartley, "Transmission of Information," Bell System Technical Journal, July 1928
head2right J. L. Kelly, Jr., "New Interpretation of Information Rate," Bell System Technical Journal, Vol. 35, July 1956, pp. 917-
26
head2right R. Landauer, "Information is Physical" Proc. Workshop on Physics and Computation PhysComp'92 (IEEE Comp.
Sci.Press, Los Alamitos, 1993) pp. 1-4.
head2right R. Landauer, "Irreversibility and Heat Generation in the Computing Process" IBM J. Res. Develop. Vol. 5, No. 3,
1961
Textbooks on information theory
head2right Claude E. Shannon, Warren Weaver. The Mathematical Theory of Communication. Univ of Illinois Press, 1963.
ISBN 0252725484
head2right Robert B. Ash. Information Theory. New York: Dover 1990. ISBN 0486665216
head2right Thomas M. Cover, Joy A. Thomas. Elements of information theory, 2nd Edition. New York: Wiley-Interscience,
2006. ISBN 0471241954 (forthcoming, to be released 2006.).
head2right Stanford Goldman. Information Theory. Mineola, N.Y.: Dover 2005 ISBN 0486442713
head2right Fazlollah M. Reza. An Introduction to Information Theory. New York: Dover 1994. ISBN 048668210
head2right David J. C. MacKay. Information Theory, Inference, and Learning Algorithms Cambridge: Cambridge University
Press, 2003. ISBN 0521642981
Other books
head2right James Bamford, The Puzzle Palace, Penguin Books, 1983. ISBN 0140067485
head2right Leon Brillouin, Science and Information Theory, Mineola, N.Y.: Dover, [1956, 1962] 2004. ISBN 0486439186
head2right W. B. Johnson and J. Lindenstrauss, editors, Handbook of the Geometry of Banach Spaces, Vol. 1. Amsterdam:
Elsevier 2001. ISBN 0444828427
head2right A. I. Khinchin, Mathematical Foundations of Information Theory, New York: Dover, 1957. ISBN 0486604349
head2right H. S. Leff and A. F. Rex, Editors, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press,
Princeton, NJ (1990). ISBN 069108727X
Ing. Tomáš Rain Ph.D.
Outline
head2right System and information system
head2right Information Systems vs. Computer Science
head2right IS History
head2right What is an Information System?
head2right Terminology
head2right Types of information systems
System
head2right A system (from the Latin systēma, and this from the Greek σύστηµα, sustēma) is an assemblage of elements comprising a whole in which each element is related to the other elements.
head2right Any element which has no relationship with any other element of the system cannot be a part of that system.
head2right The elements (or components) of a system interface in order to facilitate the 'flow' of information, matter or energy.
head2right A subsystem is then a set of elements which is a proper subset of the whole system.
RAW DATA + PROCESS = MEANINGFUL
INFORMATION.
head2right RAW DATA + PROCESS = MEANINGFUL INFORMATION. This concept, where raw data that has hardly any meaning on its own is processed and the outcome is meaningful information, is simply what is understood as an information system.
head2right For example, adding two numbers:
checkbld The two numbers are just raw data that mean nothing on their own, but the addition process, when applied to the two numbers, results in a meaningful answer.
checkbld Thus information systems are born where systems are created and put into place to filter raw data to produce sensible information.
Classical IS definition
head2right Any telecommunications and/or computer
related equipment or interconnected system or
subsystems of equipment that is used in the
acquisition, storage, manipulation, management,
movement, control, display, switching,
interchange, transmission, or reception of voice
and/or data, and includes software, firmware,
and hardware
Source: from Federal Standard 1037C and from
MIL-STD-188 and from the National
Information Systems Security Glossary
IS Structure and Behaviour
The simplest model that describes the Structure and Behaviour of
an Information System takes five objects:
For Structure:
head2right Repositories hold data permanently or temporarily, such as buffers, RAM, hard disks, caches, etc.
head2right Interfaces exchange information with the non-digital world, such as keyboards, speakers, scanners, printers, etc.
head2right Channels connect repositories, such as buses, cables, wireless links, etc. A network is a set of logical or physical channels.
For Behaviour:
head2right Services provide value to users or to other services via message interchange.
head2right Messages carry meaning to users or services.
head2right Source: the book "Seguridad de la Informacion", 2004, ISBN 84-933336-7-0
IPO Model
Definition
head2right I: Input - External material/stimuli that enters the
system
head2right P: Processing - Actions taken upon/using input or
stored material
head2right O: Output - Results of the processing that then exit the
system
head2right S: Storage - Location(s) where material inside the system is placed for possible use at a later time (optional)
IPO Model
head2right There are three very common terms that are linked to each other. They describe particular stages in information handling. They are:
checkbld Input, processing and output.
checkbld You are often asked for a definition of these terms in the exam. Make sure you can explain them.
checkbld Along with a definition, you will often be asked to draw a diagram to show the stages. It is essential that you learn the Input → Processing → Output diagram, with Storage attached to the processing stage, along with the direction of information flows. You need to be able to reproduce it exactly (see the sketch after this list).
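A minimal code sketch of the IPO stages (the function names, the doubling "process" step, and the list used as storage are all illustrative assumptions, not part of any standard):

```python
# Illustrative IPO pipeline: Input -> Processing -> Output, with optional Storage.
storage: list[int] = []            # S: optional storage inside the system

def take_input() -> int:           # I: external stimulus entering the system
    return 21

def process(raw: int) -> int:      # P: action taken upon the input
    storage.append(raw)            # material kept for possible later use
    return raw * 2

def output(result: int) -> None:   # O: result of processing leaving the system
    print(f"meaningful information: {result}")

output(process(take_input()))      # prints: meaningful information: 42
```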
Information Systems
vs. Computer Science
head2right Computer Science has its concentration in the study of
algorithms, computation, software, and data structures. Its
roots are in mathematics and engineering. Programming is
only one aspect of computer science.
head2right Information Systems is an extension of management and
organization theory that applies technical capabilities and
solutions initially developed by computer science, to tasks
in organizations. It involves the study of information – its
structure, representation, and utilization.
head2right It focuses on the information needs of organizations for a
wide variety of business processes, management, decision-
making, and planning purposes.
IS History with Other Disciplines
head2right Managerial Accounting - looks at relevant costs
and performance analysis for managerial control
and decision making.
head2right Operations Research - systematic approach to
problem solving; use of models.
head2right Management and Organization Theory -
behavioral theory; individual and group decision
making; leadership; organization design and
development.
head2right Computer Science - algorithms, computation,
software, data structures.