Humans receive the great majority of information about their environment through
sight, and at least 50% of the human brain is dedicated to vision. Vision is also a key
component for building artificial systems that can perceive and understand their environment.
Computer vision is likely to change society in many ways; for example,
it...
The twentieth century witnessed the birth of revolutionary ideas in the physical
sciences. These ideas began to shake the traditional view of the universe
dating back to the days of Newton, even to the days of Galileo. Albert Einstein
is usually identified as the creator of the relativity theory, a theory that
is used to model the...
The present volume is dedicated to aspects of algorithmic work in bioinformatics
and computational biology with an emphasis on string algorithms that play
a central role in the analysis of biological sequences. The papers included are
a selection of articles corresponding to talks given at one of two meetings
sponsored by The Royal...
The second, revised edition of this book covers all aspects of non-uniform rational B-splines necessary to design geometry in a computer-aided environment. Basic B-spline features, curve and surface algorithms, and state-of-the-art geometry tools are all discussed. Detailed code for design algorithms and computational tricks are covered, too,...
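As a concrete illustration of the basic B-spline machinery mentioned above, the following is a minimal Python sketch of the Cox-de Boor recursion for evaluating one basis function; the function name, argument order, and knot-vector convention are assumptions for this example, not code reproduced from the book.

    def bspline_basis(i, p, u, knots):
        # Cox-de Boor recursion (illustrative sketch): value of the i-th
        # B-spline basis function of degree p at parameter u, for a
        # non-decreasing knot vector "knots" with at least i + p + 2 entries.
        if p == 0:
            return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
        left = right = 0.0
        if knots[i + p] != knots[i]:
            left = (u - knots[i]) / (knots[i + p] - knots[i]) \
                   * bspline_basis(i, p - 1, u, knots)
        if knots[i + p + 1] != knots[i + 1]:
            right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                    * bspline_basis(i + 1, p - 1, u, knots)
        return left + right

A B-spline curve point is a sum of control points weighted by these basis values; the rational (NURBS) case additionally divides by the weighted sum of the basis values.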
What do people learn when they do not know that they are learning? Until recently all of the work in the area of implicit learning focused on empirical questions and methods. In this book, Axel Cleeremans explores unintentional learning from an information-processing perspective. He introduces a theoretical framework that unifies existing...
Computer vision is the science and technology of making machines that see.
It is concerned with the theory, design and implementation of algorithms that
can automatically process visual data to recognize objects, track them, and
recover their shape and spatial layout.
The International Computer Vision Summer School (ICVSS) was...
The main purpose of statistical theory is to derive from observations of a
random phenomenon an inference about the probability distribution underlying
this phenomenon. That is, it provides either an analysis (description)
of a past phenomenon, or some predictions about a future phenomenon of
a similar nature. In this book, we insist...
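As a purely illustrative sketch of this idea (an assumed example, not one taken from the book): if $x_1, \dots, x_n$ are independent observations of a coin toss modelled as $\mathrm{Bernoulli}(\theta)$, then an inference about the underlying distribution is an inference about $\theta$, for instance through the likelihood

    $L(\theta) = \prod_{i=1}^{n} \theta^{x_i} (1 - \theta)^{1 - x_i}$,

which is maximized at $\hat{\theta} = \frac{1}{n} \sum_{i=1}^{n} x_i$; a description of the past tosses is given by $\hat{\theta}$, and a prediction about a future toss of a similar nature can be based on it.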
Crystallization from solution is a core technology in major sectors of the
chemical process and allied industries. Crystals are produced in varying sizes
ranging from as small as a few tens of nanometres to several millimetres
or more, both as discrete particles and as structured agglomerates.
Well-established examples include bulk...
Linear Programming deals with the problem of minimizing or maximizing a
linear function in the presence of linear inequalities. Since the development of
the simplex method by George B. Dantzig in 1947, linear programming has been
extensively used in the military, industrial, governmental, and urban planning
fields, among others. The...
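Stated only as a generic illustration (the notation below is an assumption, not a formulation quoted from this book), such a problem can be written in the standard form

    $\min_{x \in \mathbb{R}^n} \; c^{\top} x \quad \text{subject to} \quad A x \le b, \; x \ge 0,$

where the objective $c^{\top} x$ is the linear function being minimized and $A x \le b$, $x \ge 0$ are the linear inequalities; a maximization problem reduces to this form by negating $c$. The simplex method searches among the vertices of the polyhedron defined by these constraints.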
Bioinformatics is an emerging field in which statistical and computational techniques
are used extensively to analyze and interpret biological data obtained
from high-throughput genomic technologies. Genomic technologies allow us
to monitor thousands of biological processes going on inside living organisms
in one snapshot, and are...
With knowledge representation we face more or less the same problem as Augustine
(354–430) when thinking about time: if nobody asks what it is, it seems
clear enough, but being asked it proves to be very difficult to provide an answer.
At the beginning of our research we thought that a solution for the problem
of...
Ten years ago the authors undertook to produce a book covering the known material on
formal languages, automata theory, and computational complexity. In retrospect, only a
few significant results were overlooked in the 237 pages. In writing a new book on the
subject, we find the field has expanded in so many new directions that a...