Computer Science is a very young discipline compared to most others. Alan Turing
published the seminal paper of the field in 1936. Around the same time, the militaries in
Germany, UK, and US commissioned the first digital electronic computer projects. One of
these, the Colossus at Bletchley Park in the UK, was used to break the German Lorenz cipher
and help turn the tide of World War II; its existence was not made public until the 1970s. The
other projects reached completion after the war: the ENIAC at University of Pennsylvania in
1946 and EDSAC at University of Cambridge in 1949 are prominent examples. The first
academic degree in computing was University of Pennsylvania's program, established in
1959. The first academic computer science departments were founded at Purdue and Stanford in 1962.
For many years, people in other fields were not sure what to make of computer
science. Depending on their background, they saw computer science as an outgrowth of
mathematics, science, or electrical engineering. Although many people were delighted with
the advances enabled in many fields by computing technologies, they did not see anything
fundamental about computing to warrant a permanent seat at the table of science. It became a
standing joke that any field with "science" in its title could not be a real science
based on deep, universal principles. In reaction, many people in the field today call it the
"computing field", a shorthand for "computer and information science and engineering".
Through the mid 1980s, when most computer scientists were focused on building
computer systems and networks, an engineering perspective dominated the field. Many
observers believed that in due course, once the infatuation with the newness of computing
wore off, computer science would be absorbed back into electrical engineering. Other
observers believed that the only part of computer science that worked with fundamental
principles was the mathematics part; they believed that in due course, computer science
would be absorbed back into mathematics.
Scientists from different fields continue to search for precise definitions of the digital and
the analogue, but the technologies that have transformed mankind's lifestyle and the history
of the world still resist the scrutiny of thinkers. Rocchi, a scientist at IBM, argues that
understanding analogue versus digital is not as simple as contrasting adjective pairs such as
digital/continuous or natural/artificial. He proposes a new key for interpreting digital and
analogue machines and conducts a careful analysis of the core of such systems.