Monday, March 28, 2011

The Cold War – Risks and Responsibilities


In his article, Paul Edwards stresses that US defence needs were the single most important driver of the development of advanced computers. The arms race and the technological competition between the US and the Soviet Union during the Cold War resulted in extensive scientific research, which gave birth to computers. But there is a certain risk involved in this justification.


Risk, in this context, has a more subtle meaning. Take, for example, the EVMs (Electronic Voting Machines) in India. A software engineer discovered that these EVMs could be hacked into and the poll results viewed and changed. In fact, the hacking was possible in two ways: one, by using a Bluetooth connection and a hacking code, and two, by tampering with an easily detachable memory chip that recorded the votes of hundreds of voters. The Election Commission, by neglecting this huge flaw, has put millions of Indian citizens at risk. The single most important aspect of Indian governance – democracy – could potentially mean nothing if the poll results did not reflect the choice of voters. Therefore it is crucial for the Election Commission to take responsibility for its slackness. After all, the Election Commission is the only link, although a weak one, between Indian citizens and their government.
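
To make the second attack vector concrete, here is a minimal Python sketch of the underlying weakness (purely illustrative: the actual EVM firmware and storage formats are not public, and every name and value below is hypothetical). If the removable chip stores raw, unauthenticated counts, they can be rewritten without a trace; even a simple keyed seal, with the key held off-chip, would make such tampering detectable.

    import hmac, hashlib

    # Hypothetical record; real EVM storage formats are not public.
    votes = {"candidate_A": 412, "candidate_B": 388}

    def tamper(record):
        # Swap the tallies: with raw, unauthenticated counts, nothing
        # in the data itself reveals that it was rewritten.
        record["candidate_A"], record["candidate_B"] = (
            record["candidate_B"], record["candidate_A"])

    # A keyed seal computed at poll close makes rewrites detectable,
    # provided the key never sits on the chip (an assumption here).
    KEY = b"held-by-the-election-commission"

    def seal(record):
        msg = repr(sorted(record.items())).encode()
        return hmac.new(KEY, msg, hashlib.sha256).hexdigest()

    signature = seal(votes)  # recorded at poll close
    tamper(votes)            # chip rewritten in storage or transit
    print(seal(votes) == signature)  # -> False: the seal exposes the change

The point is not this particular scheme but the asymmetry it illustrates: an unauthenticated chip places the full burden of trust on physical custody, which is exactly the burden the Election Commission neglected.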




Coming back to the Cold War context, the risk in such a justification is the assumption that computers would not have been developed if not for the US's perceived need for national security. It is a heavy thought for all of us that our PCs are a result of military research and that they were developed from the ENIAC – the machine that computed ballistics tables for the artillery that killed thousands of people.

By Pranav R Kamat

References:
1. The Risks Digest
2. From “Impact” to Social Process: Computers in Society and Culture by Paul Edwards
3. An Unforeseen Revolution: Computers and Expectations, 1935–1985 by Paul Ceruzzi

Sunday, March 27, 2011

Risks and Responsibilities: Cold War and Now


Introduction
Though computers were used for limited activities during the Second World War, their potential to decisively change the course of wars was realized, as evidenced in the cracking of Nazi ciphers by Britain's Colossus computer. This led to sustained funding for scientific research of military interest during the Cold War. This research was mainly conducted by civilian and industrial laboratories in association with the military. The main reason for this technophilic trend essentially boils down to the economic and political conflicts between the Communist nations and the Western countries. These led to fears of another war like WWII, to the amassing of weapons of mass destruction, and to the drive to further the development of the computer. This kind of justification for scientific research based on the possibility of a war, i.e. a threat to national security, though fruitful, had its own risks and a set of associated responsibilities needed to avert accidental disasters or oversights. I would like to discuss this aspect with a relevant example from the present time.

A Modern Day Example
Electronic voting was introduced as a means to counter the corruption and manipulation of votes possible under the ballot system in India. For the success of a democratic nation, a safe and secure method of conducting voting is essential, and technological research aimed at solving these problems, in light of its importance to national security, has some possible risks and inherent responsibilities associated with it. For instance, the system has to be foolproof and backed up so that no power outage can affect it – which was realized and done. It should also ideally be tamperproof, but as can be seen with the current EVMs in India, they are far from tamperproof. These machines can be manipulated in various ways, as shown by Hari Prasad and his group. While the normal ballot system involved criminals cheating by manipulating the votes afterwards, during the storage phase, no safeguard was developed in the electronic system against a similar attack by more tech-savvy criminals. Why, then, have EVMs been claimed to be better than the normal system?

The justification from officials is that it is "safer" than the human-controlled system, which can be tampered with easily. This view has remained legitimate and largely unquestioned because common folk (including officials not professionally trained in these technologies), while understanding the concept of human error very well, are not familiar with the concept of computer error; this has been the key, then and now, to the perceived safety of these systems. For example, the SAGE defense system built in America had one goal in mind – to show the citizens of the US that they were ready to react to the threat of nuclear weapons. In hindsight, however, it has been shown that the SAGE system, while inspiring and advancing technology, was not capable of handling the threat – it was little more than a stunt. In fact, in the past, a major nuclear incident was averted only by the correct handling of a human operator overseeing the system, who, on seeing that SAGE showed the US being attacked by a multitude of missiles, realized that a glitch in the system had caused this and did not report it to his higher officers, thus saving the world from M.A.D.

In Conclusion
While safeguards were taken in both cases, they were not enough, in the sense that they did not anticipate certain key ways the system would fail. The sense of threat to national security, which was perceived to be greater before the use of electronic systems, led to these machines not being tested adequately before going live. Thus, a key responsibility of the developers of such systems is to test their reliability, as reliability is the very reason they are replacing the existing system. The systems developed have to be made accountable to professionally capable people who can assess their real value and whose judgment is not distorted by one-sided knowledge based on an unreal idea of the safety of electronic systems against a perceived threat to the security of the nation.

Bibliography

1) From “Impact” to Social Process: Computers in Society and Culture, Paul Edwards
2) The Risks Digest, http://catless.ncl.ac.uk/risks
3) http://indiaevm.org/
Barath A

Risk and Responsibility – EVMs

As we saw in our previous blogpost, where we studied the vision of science and technology since the 1930s, the major advances in technology were directed towards projecting the nation's sovereignty and military supremacy. But after the atomic bombing of Japan, the view of science changed dramatically, and one could no longer avoid the moral implications of the technology one meant to create. As we saw in our earliest blogpost, predicting the full extent of the social impact of a technological invention is impossible. The Cold War didn't help in that respect, as it was just another excuse for each country to continue its research on establishing its own supremacy. But, as engineers, we must take it upon ourselves to set a standard of ethics and model a society (as far as possible) with the inclusion of the technological enhancement, and see what pops up. Every piece of technology has its risks, and we engineers must take the responsibility to alter the invention in mind appropriately so that society can cope with the demons it brings with it.

Let us take the example of Electronic Voting Machines. The US came up with ENIAC, the supercomputer of its day, which was the precursor to the computer chips responsible for the drive to computerize every aspect of society. The Election Commission of India has officially declared that elections, from now on, shall be conducted electronically. But there are many risks in introducing this technology into the election system. The chip which has the counter coded within it is unreadable, so it is impossible to tell whether the manufacturer has other plans in mind. It has been proven possible to rig the display, immaterial of which candidate has won, and to rig the results using a Bluetooth device slipped into the machine. On the whole, the machine has a lot of openings. Sources say that there might be as many as a million people in India with the technical know-how to rig these EVMs.

But now we need to ask ourselves: how do we alter the machines to prevent rigging? What is the alternative to EVMs, and would it tackle the issues at hand? Consider the paper ballot system. Its functioning is better understood than that of the EVMs, and the set of people with the technical know-how to alter election results is much larger than in the case of EVMs. As for solutions to the problems posed by the EVMs, a few ideas come to mind; but they make the system more complex, and if the elections were then rigged it would be impossible to tell.

But on weighing the options, I personally feel the EVMs would make a better system for elections than the paper ballot system. We engineers must, however, continue to strive to make the system foolproof.

-Amit M Warrier
EE09B004

References:-
1) The Risks Digest

Risks, Responsibilities and the Cold War

Introduction:
The Cold War was a period of military tension, mistrust and general paranoia in both the USA and the USSR. An arms race resulted in the stockpiling of nuclear warheads, and the threat of "mutually assured destruction" was guaranteed by both sides. As a result, the American government funded Project Whirlwind at MIT and created the SAGE perimeter defense system. According to Paul Edwards in his essay on 'Computers and Society', the value of the SAGE project was almost entirely imaginary and ideological. Though its military potential was minimal, it helped create a sense of active defense to assuage some of the helpless passivity of nuclear fear. Such justifications were also common during this period in collaborations between scientists at various educational institutions and the military. In a process of 'mutual orientation', engineers constructed visions of military uses for the computers they wanted to build in order to justify grant applications. We notice that in both the above examples, the entire truth was not let out. Does something 'smell' here?

Through a glimpse into the process of possible electoral fraud, I will try to explain some of the risks and responsibilities associated with such justifications.

Electronic Voting - A Look at the Risks:
Most people consider electronic voting to be simply an automated process where votes can easily be processed and counted to produce a result. In this naive view, electronic voting, or e-voting, is simply a faster and more accurate version of traditional voting. This view, however, is flawed: electronic voting machines can be hacked into and tampered with. According to some, e-voting unnecessarily introduces a third party into the picture, a separation where the voter cannot directly see whether the vote he has cast has been recorded. Several voters consider this practice of handing over their ballot to an intermediary unacceptable, since it violates the very nature of the 'secret ballot'. An electronic voting system can be involved in any one of a number of steps in the setup, distribution, voting, collection and counting of ballots. Electronic or algorithmic errors at any of these stages may be a weakness for a hacker to exploit. What's worse, under a secret ballot system there is no known input, nor any expected output, with which to compare electoral results. Hence electronic electoral results – and thus the accuracy, honesty and security of the entire electronic system – cannot be verified by humans.
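
One research direction makes this verification gap concrete (sketched here only as an illustration, not as a description of any deployed system): publish a cryptographic commitment for each ballot. The digest reveals nothing about the vote without the voter's secret nonce, yet lets the voter later check that the recorded ballot is the one cast. The function names and parameters below are hypothetical.

    import hashlib, secrets

    def commit(vote):
        # The digest is published; the voter privately keeps the nonce.
        nonce = secrets.token_hex(16)
        digest = hashlib.sha256(f"{nonce}:{vote}".encode()).hexdigest()
        return digest, nonce

    def verify(digest, nonce, vote):
        # Anyone holding the nonce can check what the digest commits to.
        return hashlib.sha256(f"{nonce}:{vote}".encode()).hexdigest() == digest

    digest, nonce = commit("candidate_B")
    print(verify(digest, nonce, "candidate_B"))  # True: recorded as cast
    print(verify(digest, nonce, "candidate_A"))  # False: alteration exposed

Real end-to-end verifiable schemes are far more involved – a receipt that proves your vote to you also proves it to a coercer, so they must resist vote selling too – which is precisely why the naive 'faster and more accurate' view understates the problem.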

Responsibilities and Professional Ethics:
If there is a risk associated with the use of a particular technology, there is a corresponding responsibility associated with its engineer or manufacturer. Producers of electronic voting machines in the US, however, continue to make defective machines, resulting in several electoral frauds in the USA – most notably in the 2000 presidential elections. Here, both the producers and the government are guilty of violating their professional ethics in popularizing such voting machines. In the case of the SAGE defense system, the government did not let out the whole truth and misled its citizens. Obviously, this was done for their 'greater good', since revealing the objective would have spoiled the effectiveness of the program. Here, we see that there are shades of grey separating the ethical from the unethical.

Conclusion:
The process of computerized warfare developed as a result of a perceived threat to security during the Cold War period. Several new technologies emerged, such as digital computers, hydrogen bombs and automated defense systems. With great power, however, comes great responsibility, and the creators of such technology have responsibilities to its users. A professional must have his or her own self-enforced code of ethics to follow at all times. As a rule of thumb, we may use some of the tests suggested in class – to not do anything that makes us uncomfortable or 'smells'.

References:
[1] http://en.wikipedia.org/wiki/Electronic_voting
[2] http://catless.ncl.ac.uk/risks
[3] Article by Paul Edwards titled 'Computers and Society'

Saturday, March 26, 2011

Role of technology in the Cold War, and similar risks and responsibilities involved in present-day processes (electronic voting in particular)

Introduction
     The Cold War was a period in which the world saw many advanced technologies emerging. The point to note about these technological marvels is that the amount of money involved in them was enormous (sometimes outrageous).

     Take the SAGE perimeter defense system, for example. The value of the SAGE project was almost entirely imaginary and ideological. Though its military potential was minimal, it helped create a sense of active defense to assuage some of the helpless passivity of nuclear fear. Such justifications were also common during this period in collaborations between scientists at various educational institutions and the military. In a process of 'mutual orientation', engineers constructed visions of military uses for the computers they wanted to build in order to justify grant applications. We notice that in both the above examples, the entire truth was not let out.


Risks involved in Electronic Voting
Can we trust these?
      Electronic voting machines can be hacked into and tampered with. The biggest flaw of EVMs is that they introduce a third party into the picture, and there is no way in which a voter can verify that his vote has really been cast. This practice of handing over the ballot to an intermediary is simply unacceptable, since it violates the very nature of the 'secret ballot'. An electronic voting system can be involved in any one of a number of steps in the setup, distribution, voting, collection and counting of ballots. Electronic or algorithmic errors at any of these stages may be a weakness for a hacker to exploit. What's worse, under a secret ballot system there is no known input, nor any expected output, with which to compare electoral results. Hence electronic electoral results – and thus the accuracy, honesty and security of the entire electronic system – cannot be verified by humans.

Responsibilities
       Where there is a risk, there is a corresponding responsibility associated with the maker of the technology. Producers of electronic voting machines in the US, however, continue to make defective machines, resulting in several electoral frauds in the USA – most notably in the 2000 presidential elections. Here, both the producers and the government are guilty of violating their professional ethics in popularizing such voting machines.
       Coming back to the context of the Cold War, in the case of the SAGE defense system, the government did not let out the whole truth and misled its citizens. Obviously, this was done for their 'greater good', since revealing the objective would have spoiled the effectiveness of the program. Here, the question of the boundary between the ethical and the unethical comes into the picture.

Conclusion
        The process of computerized warfare developed as a result of a perceived threat to security during the Cold War period. Several new technologies emerged, such as digital computers, hydrogen bombs and automated defense systems. With great power, however, comes great responsibility, and the creators of such technology have responsibilities to its users. A professional must have his or her own self-enforced code of ethics to follow at all times. As a rule of thumb, we may use some of the tests suggested in class – to not do anything that makes us uncomfortable or 'smells'.

References
  1. The Risks Digest
  2. From “Impact” to Social Process: Computers in Society and Culture by Paul Edwards
  3. An Unforeseen Revolution: Computers and Expectations, 1935–1985 by Paul Ceruzzi

Friday, March 18, 2011

THE VISION FOR SCIENCE AND WAR IN THE 1930s AND THE CURRENT SITUATION


Science and technology came to the fore during the Renaissance, a period which saw massive attempts at colonization across the world. Even though emphasis was still given to knowledge, to understanding the world around us and to making people's lives more comfortable, technological advances were also made to exercise the sovereignty of nation-states over the rest of the world. With the onset of World War I, it became necessary to gain a technological upper hand over the enemy. The period from the early 1930s until the end of the Cold War saw massive strides in both science and technology with regard to the research approach: scientists were encouraged to conduct research in any field, but the applications of these scientific discoveries were conceived and created with the sole and supreme aim of projecting the country's sovereignty and military might.

One such approach to war was 'cybernetics'. The term was coined by Norbert Wiener, a professor at MIT, who also worked on the design of radar-guided anti-aircraft fire control for the US during World War II. Cybernetics is based on the concept of feedback. The result of applying cybernetics to technology is a tool that is empowered with senses, or ports for information input, and takes a course of action accordingly. For example, the anti-aircraft predictor designed by Wiener used a model of enemy flight patterns, combined with senses that observe the target, to predict the motion of aircraft and gun them down. Cybernetics derives its roots from nature, wherein all organisms function on a feedback-reward system. The concept of artificial intelligence was also born from Wiener's idea. The core of cybernetics lies in the accuracy with which the objects of interest – in the case of Wiener's predictor, the enemy aircraft – can be modeled.
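
The concept of feedback can be conveyed in a few lines of toy code (purely illustrative; the 'plant' here is just a number being nudged, not a model of any real system): the loop senses the output, compares it with a goal, and feeds the error back into the next action, converging on the goal – the pattern Wiener saw alike in machines and living organisms.

    def regulate(measure, actuate, setpoint, gain, steps):
        # Negative feedback: sense the output, compare with the goal,
        # and feed the error back into the next action.
        for _ in range(steps):
            error = setpoint - measure()
            actuate(gain * error)

    # Toy plant: a value we can nudge, standing in for any controlled system.
    state = {"value": 0.0}
    regulate(measure=lambda: state["value"],
             actuate=lambda u: state.__setitem__("value", state["value"] + u),
             setpoint=10.0, gain=0.5, steps=20)
    print(round(state["value"], 3))  # -> 10.0: the loop settles on the goal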

But the outlook towards science and technology changed dramatically following the atomic bombing of Japan, the event that marked the end of the World War and the start of the Cold War. Even though the arms race still existed, it was now impossible to view technological development without studying its moral implications. Weapons that could cause far more damage than was caused in Japan also exist in this world, but the intention is peaceful, as these weapons act as a deterrent to war. The technology used to fight wars was adapted appropriately for commercial use. Once again the focus of science and technology shifted back to what it was before World War I: making people's lives more comfortable.

-Amit M Warrier
EE09B004

References:-
  1. The Class ppt.

The vision for war – from the 1930s to now

The vision for science and war seems to have undergone a radical change from the 1930s to the present. This change, pioneered by the MIT applied mathematician Norbert Wiener, essentially has to do with the introduction of cybernetics and its application to war, thereby securing the crucial marriage between science and war that was previously absent. The battle came to be between machines acting like humans rather than between human soldiers themselves. Let me explain.





Norbert Wiener, regarded by many as the father of cybernetics, introduced the idea of cybernetics, or the "science of control and regulation," during the time he worked for US defence on developing the radar-guided AA (anti-aircraft) gun. He studied the predictable behaviour of the human mind in stressful situations to help predict the path of the enemy aircraft, so that it could be gunned down efficiently. The most important aspect of this was the feedback loop running between the enemy pilot, the aircraft, the AA predictor and the AA operator. The actions of each member of the loop regulated the actions of the next member of the loop. This allowed the replacement of man by machine and machine by man: the actions of the pilot, for example, could be predicted by a machine; hence the pilot was being viewed not as a human being with emotions and senses, but as a machine whose actions were predictable and somewhat pre-determined.
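The flavour of such a predictor can be given in a few lines. The sketch below is only illustrative – Wiener's actual treatment was statistical and far more sophisticated – but it shows the essential move: estimate the target's velocity from recent observations and aim where the aircraft will be when the shell arrives, not where it is now. Each new observation corrects the estimate, which is the feedback loop described above.

    def predict_intercept(track, flight_time):
        # track: list of (t, x, y) observations of the aircraft.
        # flight_time: seconds the shell needs to reach the target.
        (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
        dt = t1 - t0
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        # Linear extrapolation: aim at the future position.
        return x1 + vx * flight_time, y1 + vy * flight_time

    # Aircraft observed at one-second intervals, flying steadily.
    track = [(0.0, 0.0, 1000.0), (1.0, 120.0, 1010.0)]
    print(predict_intercept(track, flight_time=10.0))  # (1320.0, 1110.0)
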
Thus, the use of cybernetics in warfare had one important result that would determine the vision for war in the decades to come: scientific research would be used extensively in warfare, and technology would become the single most crucial factor in determining victory in war. This was realised early on by Vannevar Bush, a visionary who united six thousand leading American scientists and coordinated their research for warfare. As head of the wartime research organization whose work led to the atomic bomb, he knew the importance of technology in warfare, and this was what made him seek government funding for scientific research in warfare, which, when adopted by nations worldwide, changed the war scenario. A battle between skilled human soldiers (the 1930s) changed into a battle between intelligent machines.

Therefore, the most important change in warfare that we see from the 1930s to now is the advent of technology in the realm of warfare, with the criterion for victory changing from skill, strength and size to technological advancement.






By Pranav R Kamat


References:
1. The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision by Peter Galison
2. The History and Development of Cybernetics - Presented by The George Washington University in Cooperation with The American Society for Cybernetics

Thursday, March 17, 2011

What was the vision for science and war in the 1930s? Have things changed since then?

The "Enlightenment" was the period, spanning the 17th and 18th centuries, when philosophers celebrated the power of human reason. New techniques of science and mathematics started to play a role in making human activities much easier and more productive. Science and technology were looked at as the highest order of learning and civilization, and working with science was looked upon as a constructive and noble job. Today, science and technology are also put to use for mass destruction. In this blog post I will try to show the changing face of science over the years.

Though science was meant for human development, it has been adapted and modified for destructive purposes like war since ancient times. Archimedes, the ancient Greek mathematician and philosopher famous for his various contributions to early mathematics, formalized a method of setting enemy ships on fire using sunlight. Centuries later, the French military under Napoleon Bonaparte started applying mathematical techniques to everything from designing fortresses to measuring the size of cannonballs. Though these examples show early adaptations of science, they had little impact on determining the outcome of a war. This was the situation until the beginning of the 20th century.

It was in World War I that countries realized the potential of science in war. World War I is often referred to as 'the chemists' war' because of the extensive use of chemicals like nitrates and poison gas. Germany drew chlorine from its powerful dye industry. Realizing the potential of these chemicals, chemists in other countries were pushed to develop more harmful chemicals to counter the chemicals of the enemy. Physicists contributed by developing wireless communication technologies and sound-based methods of detecting U-boats. This marked the beginning of scientific research dedicated to war.

At the end of the First World War, nations realized the significance of science and technology in war. Until this time there had only been adaptations of existing science and technology to war, and scientific research was aimed at development. The interwar years mark the change in the face of science: its purpose started transforming from 'human development' to 'human destruction'. Governments started funding research which had potential in war. In the US, for example, academic research in science and engineering had not until then been considered a federal responsibility; almost all support came from private contributions and charitable foundations, and military research was heavily disorganized. It was then reorganized and heavily funded by the government, eventually producing efforts such as the Manhattan Project.

The view of war changed from a mode of 'settlement of issues' to a mad race of destruction. The value of a human life faded among the scientists developing these technologies. Some technologies, like the atom bomb, could wipe out an entire city in seconds. This can be compared to the chemicals developed to mass-kill mosquitoes: we made such chemicals because we do not value the life of a mosquito. The atom bomb was no different. Many new mathematical and computational tools, like 'game theory' and 'operations research', were developed, all for the purpose of enhancing defense systems. British and American work on radar influenced the course of the war.

After the Second World War, the advent of the Cold War solidified the links between military institutions and academic science. Whole new fields like digital computing and networking were developed under military patronage.

The extensive military patronage since the 1930s changed the face and vision of science. As the American historian Paul Forman said in his 1987 article "Behind quantum electronics: National security as a basis for physical research in the United States, 1940–1960," military funding initiated "a qualitative change in its [science's] purposes and character."

References:
1. Paul Forman, "Behind quantum electronics: National security as a basis for physical research in the United States, 1940–1960" (1987)

- Sujan

The vision for science and war in the 1930s, and the current situation.

      In this essay, I explore the relation of science to war in the 30s, and the vision of science as conceived by visionaries like Vannevar Bush.

Science before the war
      The very meaning of the word science is "to know". Scientists were driven by the desire to explore the laws of nature and to deepen human knowledge and understanding. All scientific research was geared towards one goal – the betterment of humankind. Most of the research was done for the intellectual satisfaction gained from it.

Impact of the war
       In the First World War, the very notion of science changed, and the roadmap of science and technology was given a sharp turn in the 30s. Knowledge no longer remained the only motivation for science. Technologies were developed to win the war and kill the enemy, not for the greater good of mankind. The very foundations on which science was based were shaken.
       WWI was considered to be a "war of chemists", in which chemicals were employed for mass destruction. The 1930s were a period when a lot of new technologies emerged, not from the usual drive for the betterment of man, but from the insane drive to vanquish the enemy and the brutal urge to wipe them out.


Cybernetics
       Cybernetics is an example of one such branch of science. It was literally born out of war, and it had many fundamentally revolutionary ideas at its root. The main concept behind it was that under extreme conditions – very high altitude, high mental tension, high-speed aircraft manipulation – the human mind behaves mechanically rather than fully processing the situation before taking a decision. This understanding paved the way to model the human mind mathematically (at least in extreme situations). Extensive research followed, and the result was high-precision anti-aircraft guns!
        Unlike many other technologies which were invented without war in mind but proved very useful in war, cybernetics was invented with the war in mind, and only after the war did it find applications in other fields.


Bush's vision for science
        Bush rightly envisioned that the sheer amount of knowledge produced by rapid development would change the methods of its acquisition and storage. He predicted that traditional methods of acquisition and storage would not be enough for this explosion of information, and accordingly he developed, or at least paved the way to develop, new technologies.


Where do we stand now
       After the end of the Cold War, we are once again back to a time when science is done for development's sake rather than for destruction.
       Many technologies which were war-specific have been repurposed for the greater good. For example, the US used technology captured from German rocket scientists to send men to the moon and, in general, to take man's first steps towards space exploration. Satellite technology, which originated from the need to spy on the enemy without being easily spotted, is now used for communication and networking.
       Bush's vision has come true, and we have been able to cope with the "information overload" he predicted. In the form of the World Wide Web, we have achieved the goal of an efficient mechanism for the storage and acquisition of knowledge.

References
  1. The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision, Peter Galison
  2. The History and Development of Cybernetics, The George Washington University
  3. Wikipedia

Wednesday, March 16, 2011

Visions for War: Past meets Present?


Introduction
The sciences and their systematic methods gained new recognition in the face of WWII. Previously, research in the military was very restricted and inefficient, but with the precedent Bush set during the Nazi onslaught, the scenario changed. The marriage of science and the military was completed during the Second World War, and the momentum generated by this event has carried through to this day, albeit in modified forms. The visions of science and war from the 1930s, primarily guided by Vannevar Bush's vision and his outline of post-war research directives, have remained very much the same today while undergoing some modifications due to the changed world scenario. My objective is to discuss these changes and similarities with some examples.

The Race for Information
One of the things military technologists in the US concentrated on during and after World War II was how best to store, and ease access to, the piles of information and data generated by scientific research and military offices all across the US. Bush suggested some analog techniques to overcome these problems, but they were largely overshadowed by the development of digitisation by Shannon and the invention of networking, i.e. the internet. In light of these revolutionary events, which Bush failed to foresee, one might be led to expect that the problems of information access, communication, storage, cross-linking and referencing – that is, the problem of information processing – are largely solved. Nothing could be farther from the truth.
How can this be possible? While different reasons come to mind, some have had a wider and bigger impact than others. One of them is the evolution of global collaborations – between military agencies in light of terrorist threats, and between scientists in general due to the worldwide explosion in post-war scientific research. Terrorists employ methods of hidden warfare, striking when the moment is ripe and disappearing without trace before retaliation is possible. This has made the power to process information, and derive new information from it with high speed and accuracy, even more valuable, in order to assess these threats as soon as they appear and counter them. Moreover, the representation and visualization of information has become important in scientific and industrial research, where a lot of raw data has become easily accessible due to improvements in methods of measurement. Thus the race to obtain informational advantage is still on, although for somewhat different reasons than those of the 1930s.

Bulletproof Ballistics
A major focus of scientists involved in military research during WWII was to develop countermeasures against enemy aircraft and missiles. This can be seen from the celebrated example of Wiener and feedback-based anti-aircraft technology. The systems Wiener helped conceive were meant to accurately predict the dynamics of manned aircraft using feedback and control mechanisms – a revolutionary breakthrough for the Allies in countering the blitz bombings. This idea has become deeply embedded in the ballistic systems of today and has come far beyond its rudimentary beginnings, both in its application to tackling ballistic weapons and in the counter-strategies for overcoming the enemy's defenses. The main reason such weapons still survive as part of a nation's offensive capability, even though every nation is scrambling to develop countermeasures against them, is the development of various fields of science – in particular, fundamental advances in chemistry, biology, and control and feedback theory – which have given new methods of inflicting precise and devastating harm through these missiles, raising the stakes for error-free systems on both defense and offense. The motivation for these developments can be plainly seen in the M.A.D. (Mutually Assured Destruction) policy adopted by most superpowers today. These objectives are not very different from the aims of wartime military research on countermeasures against aircraft, even though the nuclear bomb and the concept of M.A.D. were not fully formed back then.


In Conclusion
Thus, we can see from these examples that even though the scientific body of knowledge has changed and exploded post-WWII, the combined visions of the military and science have not changed much. Though the enemy of world peace has changed, the policies of states regarding war have undergone changes, and the methods of science have changed, the military research needed to tackle enemy threats is still largely guided by aims similar to those of 1930s military research.

Bibliography:
1) On Bush's "As We May Think", notes by Elyon Caspi, AJ Shankar, Jingtao Wang
2) The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision, Peter Galison
3) Wikipedia pages on WWII and science policy in the US

Barath

Science and war in the '30s

To understand the correlation between science and war in and around the 1930s, one must first note that at this point in history entire nations were at war, and the whole world was consumed in assisting the war effort in one way or another. Scientists at this point did not see the purpose in pursuing research in any area independent of the war, because when a person is faced with the threat of extinction or capture, that person will do absolutely anything in their power, and use their entire capabilities, to prevent such a situation. Apart from this, the entire world saw and heard about the atrocities of war and the various war crimes, and scientists actually felt it was their duty to help in whatever manner possible. What is essential to understand about this period is that science and war were progressing hand in hand, and science was being utilised to its full extent to assist the wartime effort.
Today this picture has changed to a very large extent: research goes on in a wide plethora of topics and is not restricted to war as it was earlier. An important facet of this war, i.e. the Second World War, was the introduction of machines into war. It was now as though the machine and its human operator were one single entity, and this was how the basic fighting block of warfare was seen. This led to the emergence of a new scientific understanding of war in which man and machine were seen as one. The enemy, now known as the "Manichean devil", was an integrated form of man and machine making calculated, rational moves during war. This was how the enemy was modeled and simulated in the various labs and universities where war research was done. A similar picture of the enemy carries over even today, where extremist elements are modeled on the same "Manichean devil" concept: they have no intrinsic value for life and will go to any length to achieve their goals. Norbert Wiener's research was also based on models of the pilot and the AA operator as such entities. This paved a new wave of thinking, because now humans were modelled as machines – the essence of simulating human behavior using machines. Hence, using the concept of feedback, the human nervous system was modeled along the same lines using a "black box" approach, a model where the functionality of the system is more important than its constitution. Wiener was trying to prove that humans worked on the same feedback mechanisms as machines.
So, in essence, the convergence of science and war in this period led to a complete blurring of the man-machine boundary, an ideology so basic that it carries over even today.
By
Vivek Subramaniam

Vision for science and war in the 30s

Introduction :
The beginning of the twentieth century was a period of rapid scientific advancement. Science was being put to various technological uses, and people had already begun to wonder about its future prospects and limitations. Prominent among them was Vannevar Bush, who considered science an essential tool for interacting with the world, as well as for organizing and disseminating knowledge. Bush's idea of an information workstation, the memex, was revolutionary and served as a kernel for the World Wide Web and the information age. Alongside, there were also those sciences which were developed only as a result of military requirements arising from the political context of the time. The World Wars, especially World War II, changed the way science was done, and during this period several new directions of scientific research emerged.

War and the Cybernetic Vision:
In his paper titled 'The Ontology of the Enemy', Peter Galison describes three different pictures of the enemy emergent in WW2: the subhuman Japanese, the anonymous enemy separated by physical and moral distance, and a cold-blooded, machine-like calculating enemy. In order to tackle this third kind of enemy, three closely related sciences were developed – game theory, operations research and cybernetics. Each one of them had its own vision of the enemy and focussed on solving a particular class of problems. The field of cybernetics, led by Norbert Wiener, was designed to predict the actions of, and defeat, the so-called 'Manichean devil', a logical and cunning opponent capable of bluffing in order to maximize its chance of victory. Wiener designed an anti-aircraft radar system capable of using feedback to track and shoot down enemy planes effectively. In Wiener's analysis, the machine was indistinguishable from the man who controlled it, and both could be treated as a servomechanism which could then be subjected to statistical study. We notice that the war blurred the boundaries between man and machine, a concept that has gained more popularity and acceptance in recent years with the advancement of artificial intelligence and prosthetics.

The Postwar Period:
Did things change after the war? Yes and no. With the devastation caused by the atomic bombs in Hiroshima and Nagasaki, the world came to realise that scientific discoveries could be put to destructive uses. The physical and moral separation of the anonymous enemy vanished with the huge death toll and the widespread suffering evident as a result of the war. Several scientists, including Wiener, became disillusioned and resolved not to pursue their research any further, fearing it could be misused. However, with the buildup of tensions between the USA and the USSR in the Cold War, several new and more destructive weapons were developed. Apparently, the lessons both countries had learned during the World War were quickly forgotten in the rush for military supremacy.

The Current Scenario:
The World War ushered in a new age of cybernetics in a way that voltage regulators and thermostats could not. Vannevar Bush's belief that science could better the life of man in times of peace reveals a balance and optimism towards both science and the military that is much rarer today. Concerns have been raised that technology is developing so rapidly that it is outpacing our ability to control it. Cybernetics has evolved to the point that we now envision cyborgs and other kinds of artificial intelligence. A third world war at this time would be catastrophic for human civilization. Thus, I feel that since the 1930s our visions of science and war have changed in many ways – perhaps the most important difference being a more mature and holistic outlook on the impact of new scientific developments on society.

Thursday, March 3, 2011

Babbage located Intelligence in the Mind not the attentive crafting body


In the late 18th century, the invention of Watt's steam engine began a slow transition in parts of Europe from a manual-labour, draft-animal-based economy towards machine-based manufacturing. With the complete mechanization of the textile industries began the so-called 'Industrial Revolution'. In this period of history European society started transforming, and a new labour class emerged. The understanding of life became more mechanistic, with the mind viewed as separate from the body.
In this environment Charles Babbage – an English mathematician, philosopher, inventor and mechanical engineer, popularly known as the 'father of computers' – came up with a 'calculating engine' which he advertised as 'mechanized intelligence'. His second engine, the 'Analytical Engine', was also an attempt to simulate intelligence. The calculating engine was initially built to replace the human computers who calculated astronomical tables; Babbage thought that the introduction of the machine would increase accuracy. His high estimation of the potential intelligence of machines rested on his idea of a mechanical universe. Referring to Zimmerman's article, we understand that Babbage had a mechanistic idea of the whole universe: according to him, everything is governed by a set of "laws assigned by the Almighty for the government of matter and of mind".
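The repetitive work the calculating engine took over can be shown concretely. The Difference Engine mechanized the method of finite differences: once the initial differences of a polynomial are set up (the one-time work of the 'mind'), every further table entry follows by additions alone, the mindless turning of a crank. A minimal Python sketch follows; the polynomial is an arbitrary example.

    def tabulate(poly, n):
        # Return poly(0), poly(1), ..., poly(n-1) using only additions,
        # the way the Difference Engine did.
        f = lambda x: sum(c * x**k for k, c in enumerate(poly))
        d = len(poly) - 1                 # degree of the polynomial
        # One-time setup: difference columns from the first d+1 values.
        rows = [[f(x) for x in range(d + 1)]]
        for _ in range(d):
            prev = rows[-1]
            rows.append([b - a for a, b in zip(prev, prev[1:])])
        col = [row[0] for row in rows]    # [f, delta f, ..., delta^d f]
        values = []
        for _ in range(n):
            values.append(col[0])
            for k in range(d):            # one turn of the crank:
                col[k] += col[k + 1]      # each column absorbs the next
        return values

    # f(x) = x^2 + x + 41, tabulated with no multiplication at all:
    print(tabulate([41, 1, 1], 8))  # [41, 43, 47, 53, 61, 71, 83, 97]

The seed values are computed with multiplication inside the sketch for convenience; on the engine this setup was done once by the human planner, after which the machine only added. This division between the intelligence that plans and the mechanism that repeats is exactly the separation Babbage read back into the factory.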
Babbage's definition of intelligence is the combination of memory and foresight. According to Babbage, the owner of an article is the person who designs it rather than the person who crafts it. This can be seen when Babbage laid claim to owning the means of production, while his engineer thought he could make more calculating engines if they went into production. In Babbage's own words on the calculating engine:
'My right to dispose, as I will, of such inventions cannot be contested; it is more sacred in its nature than any hereditary or acquired property, for they are the absolute creations of my own mind'
Babbage understood intelligence to be of the mind, not of the body; in other words, the owner is the mind of the inventor and not the craftspeople. This extended to the way he understood the 'factory': systematize the unintelligent work to make the product of the intelligent.
His idea of the separation of mind and body, and of the superiority of the mind over the body, can be clearly seen in his text 'On the Economy of Machinery and Manufactures'. Babbage described what is now called the Babbage principle, which describes certain advantages of the division of labour. If the labour process can be divided among several workers, it is possible to assign only high-skill tasks to high-skill, high-cost (mind) workers and leave the other tasks to less-skilled, lower-paid (body) workers, thereby cutting labour costs. According to Babbage, the machines in a factory also help to keep a check on the workers and increase their productivity. In Babbage's words:
"One great advantage which we may derive from machinery is from the check which it affords against the inattention, the idleness, or the dishonesty of human agents"
By this he makes the worker in a factory a 'slave of the machine', while the factory represents 'admirable adaptations of human skill and intelligence' where we see 'the triumph of mind over matter'. Babbage puts machines between the mind and the body: the workers (body) are the machine's slaves, while the mind triumphs over the machine.
Babbage's understanding of the universe and of intelligence is reflected in all his works (the calculating engine, the Analytical Engine). For Babbage, intelligence was located in the mind of the inventor and not in the body of the craftsman.
References:
1. 'The Ideology of the Machine and the Spirit of the Factory' – Andrew Zimmerman
2. 'Babbage's Intelligence' – Simon Schaffer
-Sujan 

Class and gender in computation


In order to address this question it is important to first know what period in history is being talked about. This is the mid-19th century, a time when a central change was being brought about in the way people think, in the sense that there was a new viewpoint on life: the mechanistic viewpoint, in which actions are replicated and simulated. Understanding the social structure at this point in history is also important to the essence of the argument, because it is the primary reason why people's thinking was changing. This was an era of over-dependency on the skill of artisans, whose skills, though highly valued, were not shared, as they were a close-knit group. This is the sort of transient society that Babbage and other innovators of the time were living in.

The central idea that drove Babbage to the point of obsession was mechanization. In a society that was artisan-dependent, he wanted to bring in the idea of mass production: to replicate the unique skills of these artisans in machines, and to make these machines 'intelligent' in the sense of rendering the human input going into the machine invisible. This is the factory-production ideology Babbage was obsessed with, to the extent that he planned on mechanizing the functionality of the human mind itself – only in this case it was the people who did computations who were being replaced.

What is important here is to see the common link between computation, intelligence and manufacturing. In both cases there are distinct classes of people. There is the designer or factory manager, who has authority and complete surveillance over the actions of the people under him – or, in the case of a machine designer, a panopticon view of all the individual components that go into his final product. Then there is the working class, which is not really 'intelligent', because those functions are taken over by the machine; they take care of an infinitesimal part of the entire manufacturing/computing process and as a result have no clue about the big picture or what their actions amount to on the large scale. It is the designers who wield the true intelligence and power: they monitor the working class, keep each individual component of the assembly process under constant surveillance, and believe that the workforce and individual components must be kept under strict discipline if any work is to be accomplished. And finally there is the substituted human component, be it the artisans or the computers, who have their skills simulated to such an extent that individuals with a fraction of their skill can perform the tasks with better accuracy and in far greater quantity – a class full of people holding great resentment towards the idea of industrialization.
To see the mark of gender, it is important to realize that computation and intelligence had their similarities to weaving, in the sense that the earliest punch-card technologies took their inspiration from the mechanics of weaving; and just as weaving became more and more mechanized, so did computation and the process of intelligent thought. Weaving was initially viewed as a form of female occupation, an essentially feminine activity, and this is the link that gets established between computation and femininity. Just as the notions of feminism were initially obscure and came to light only in the twentieth century, so the notions of mechanized thought, computational science and intelligence came to light only in the latter half of the twentieth century. Due to their similarity to the weaving machines, which were rooted in feminine ideals, the initial computational machines, seen as an embodiment of this ideology, were also seen as feminine. So computation has its roots in a feminine embodiment.
BY 
Vivek Subramaniam
AE09b031
References
1. Simon Schaffer on Babbage
2. Sadie Plant on Lovelace

Babbage’s Location of Intelligence


In Babbage's Victorian England, the word "computer" had a meaning totally different from its current one. A computer was a worker who earned a living by performing calculations for mathematically involved tasks such as the compilation of navigation charts, mathematical tables and astronomical tables. After the invention of the Difference Engine and other calculating machines, the task of calculation underwent mechanization, and thus was born today's term "computer", an anthropomorphism of the earlier (human) computer.

This was a time when England was going through the Industrial Revolution – a process by which human (physical) labour was replaced by machine labour. But what was unexpected was the possibility of a machine replacing mental labour. The idea of a machine being able to "think" was not readily accepted.

On substituting mental work with a mechanism for the same, the terms "intelligence", "labour" and "factory" had to be redefined. According to Schaffer, the intelligence of machines comes from the invisibility of the very labour force that drives them; this makes it seem that the machine is "thinking by itself", and portrays "labour" as simply a means to an end – the product or, in this case, the result of the mathematical calculation. In turn, a "factory" had to be a network of such bodies and minds working together to execute a specific task. Quoting Ure's definition of the factory from Schaffer,

…he defined the factory both as 'a vast automaton, composed of various mechanical and intellectual organs,…, all of them being subordinated to a self-regulated moving force' and as 'the combined operation of many orders of work-people ...in tending with assiduous skill a series of productive machines'.

It is interesting to note that Babbage placed more importance on the outcome of labour than on labour itself. He argued that the end result would be the same whether produced by a machine or by a human. He drew a parallel between the work done by a machine and the power it consumed, and the wages paid to an equivalent number of skilled labourers. This analogy quantified human labour – something previously thought of as immeasurable. Hitherto, the attentive crafting body had been described in terms of the skill it possessed. To place human labour and mechanised labour on the same scale and establish them as comparable entities was, to say the least, a novel idea. This made it possible to describe artisan skill in terms of its wage equivalent.

With the advent of industrialization and capitalism, individuals who owned small enterprises such as a weaving loom were forced to give up their business and instead work in a factory on the capitalist’s loom. With this, constant supervision and control was imposed on the labourers to ensure production. The factory was divided into different units, each performing a particular task in accordance with an organized and efficient plan, much like the process in the Analytical Engine where different parts of a bigger calculation were fed into different components of the machine and executed simultaneously for greater calculation speed. This process broke down huge, seemingly impossible tasks into smaller, manageable tasks that were completed within a short span of time.

Here, the craftsmen did their work as dictated by a plan, and the result was fast and accurate. Hence, Babbage concluded that intelligence existed in the mind and not the body; in the minds of the inventor and the industrialist, and not in the bodies or the skills of the artisans. Thus, great efficiencies in manufacture or, for that matter, calculation could be achieved if one worked with a factory structure – an organization in which the intelligent inventor decides the plan of action and the body of craftspeople executes it. This separation of the mind from the body, or of the "thinking" entity from the "working" entity, was what Babbage's industrialist idea was all about.

By Pranav R Kamat

References:

1. Schaffer, Simon - "Babbage's Intelligence: Calculating Engines and the Factory System." Critical Inquiry 21, no. 1 (1994): 203-227.

2. Zimmerman, Andrew - “The Ideology of the Machine and the Spirit of the Factory: Remarx on Babbage and Ure.” Cultural Critique No. 37 (Autumn, 1997), pp. 5-29

Computation and intelligence bear marks of class and gender

Schaffer's article introduces a computer as a worker who undertook calculations in connection with compiling navigation charts, astronomical tables, etc. He also leads us to understand how 'intelligence' emerged out of a very particular set of social and labour relations back in Babbage's era. It was Babbage's understanding that the intelligence of a machine lies in the invisibility of its source of power, i.e. the labour force that surrounds and runs it. That is to say, machines with higher intelligence could appear to run even if the labour force running them were to disappear. So, now that we have a definition of intelligence, let's go on to how it bears marks of class and gender.

From this definition, it is easy to see that as machines become more intelligent, more of the labour force required for manufacturing work disappears. Mechanists argue that the intelligence of the machines involved in active manufacturing in any era divides the workers of the system into the low-class artisans and the higher-class management. This is because machines are a one-time investment and normally require very low maintenance compared to human labour. The artisans began to lose their status starting with the copying of nature's movement in the defecating duck, as discussed in the previous blogpost. Here, Schaffer gives specific importance to the word 'computation', as Babbage's Difference and Analytical Engines breached a new level, redefining intelligence – the line between processes and work that can be automated and those that cannot – and thereby bearing a mark of class. Though Babbage's influence was mostly political, the manufacture of the Difference Engine involved a large initial investment, and human computers were not done away with in his day. But it laid the foundations for further research, which went on to separate functionality from mechanical design (software from hardware), and the effect of Babbage's Difference Engine on the line separating the low-class artisans from the intelligent head was seen through the 20th century. The importance of the Difference Engine was that 'computation' now fell below the line. The earlier mechanistic belief that all systems consisted of an independent mind controlling a mechanical body was hit, as 'computation', a thing of the mind, could now be mechanized.

Now that we have understood how computation and intelligence bear marks of class, we shall look at how they do so in gender. For this we shall assume (rather than discuss) that intelligence is a trait of the mind, and that the body symbolizes everything mechanical and non-intelligent. To understand the gender bias, we shall look at Plant's study of Ada Lovelace's life and achievements, and of how society received them. Though it was Babbage who invented the difference engine, it was Ada's idea of combining traits of the Jacquard loom with the engine that gave the programmable analytical engine. Here programs could be fed in on punched cards, just as in a Jacquard loom, where each card represents the weaving of one square inch of fabric (a toy sketch of this idea follows below). Yet in excerpts Ada is described as hysterically in love with mathematics. A woman doing mathematics was considered hysteria, and a solution was seen in getting her pregnant, while the same devotion was seen as perfectly normal in men like Babbage, for whom the body was of no significance.
Ada was considered fortunate to have mathematics tutors and the like, yet she was not easily permitted to participate as a full member of the community of scientists. Society pressed Ada to believe that she herself was doing too much mathematics, driving her to opium. Over time these ideas were overturned: being imaginative and eccentric became the mark of genius, while women came to be considered too 'practical' to be pathbreaking scientists.
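
As a toy illustration of the punched-card point above: a card is just data that selects an operation, and a machine that reads a deck of such cards is thereby programmable. A small Python sketch (the card format is invented for the example; it is not Ada's or Babbage's actual notation):

# Each 'card' selects an operation and names where its operands live and
# where the result goes; the machine blindly executes the deck in order.
OPS = {
    "ADD": lambda a, b: a + b,
    "SUB": lambda a, b: a - b,
    "MUL": lambda a, b: a * b,
}

def run(cards, store):
    # Execute a deck of cards against a 'store' (the engine's memory).
    for op, src1, src2, dest in cards:
        store[dest] = OPS[op](store[src1], store[src2])
    return store

# Compute (3 + 4) * 5 as a deck of two cards.
deck = [("ADD", "v0", "v1", "v2"), ("MUL", "v2", "v3", "v4")]
print(run(deck, {"v0": 3, "v1": 4, "v3": 5})["v4"])  # 35

The machine itself stays fixed; only the deck changes, and that is what 'programmable' means.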

Thus, a struggle with the body became a kind of struggle with gender, where the body itself became feminine while the mind became masculine. Then, of course, came Freudian thinking, which took the same ideology to radical heights.

-Amit M Warrier
-EE09B004

References: Schaffer and Plant

Wednesday, March 2, 2011

Babbage located Intelligence in the Mind not the attentive crafting body

Introduction:
Following Vaucanson's creation of the defecating duck, a wide assortment of mechanical devices and contraptions was developed over the following century, designed as substitutes for human labour and time. These machines tried to mimic the behaviour of humans and could usually perform only one specific task. However, the Analytical Engine planned by Charles Babbage was different: though only a small part of it was completed, Babbage designed it to perform tasks which were then considered intellectual in nature.

Mechanisation of Intelligence:
The calculating devices designed by Babbage and others were called 'computers', a term originally used for those human workers whose job it was to perform tedious numerical calculations. This anthropomorphism, equating machines to humans, is evidence that Babbage and others subscribed to the mechanistic worldview prevalent at the time. Indeed, in his campaigns for the mechanisation of intelligence, the terms Babbage used to describe the operation of his Analytical Engine hinted that it possessed intelligence of some kind: he claimed that his machine was capable of 'memory' and 'foresight', and that the mechanical means by which he realized these operations bore analogy to the actual workings of our minds. It is no surprise, then, that Babbage, along with other notable natural philosophers of his time, located intelligence in the mind and not in the body. As Simon Schaffer aptly puts it, 'To make machines look intelligent it was necessary that the sources of their power, the labour force which surrounded and ran them, be rendered invisible'. The machinery of the factory and the calculating engines precisely embodied the intelligence of theory and abolished the individual intelligence of the worker.
Further, Babbage believed that he had unconditional rights over his creations and their subsequent production. Quoting Babbage,
'My right to dispose, as I will, of such inventions cannot be contested; it is more sacred in its nature than any hereditary or acquired property, for they are the absolute creations of my own mind'.

Such declarations demonstrated his control over the engine and 'camouflaged the work force' on which it depended.

Hierarchy of intelligence:
Babbage intended to use machines as a check on, and a means of disciplining, human labour. In his own words, 'One great advantage which we may derive from machinery is from the check which it affords against the inattention, the idleness, or the dishonesty of human agents'. However, the worker was not guaranteed any extra leisure time as a benefit of the machine, in accordance with the economic principles Babbage followed, whereby the price of a good depended directly on the amount of human labour that went into it. Thus, machines in no way helped those at the lowest rung of the new hierarchy of intelligence. On the contrary, in this new system of classification the body was a mere slave of the machine, while the mind reigned supreme.

Conclusion:
In my opinion, Babbage's take on intelligence reflects his primarily mechanistic worldview. Today we know that intelligence exists in several forms: analytic, synthetic, and perhaps even emotional. Though it may reside in the mind, such intelligence is severely limited unless expressed by the body or by some automaton capable of capturing it. At this stage in the development of AI, we still have a long way to go before we can claim that our machines truly embody a genuine form of intelligence.

Babbage's geography of Intelligence

  Introduction
Babbage, a brilliant Cambridge mathematician, is often credited as the "father of the computer", and rightly so for his designs of the difference and analytical engines. But beyond these, his main contribution to Britain and to the world was arguably his outlook on machinofacture and industrialisation, and his unending effort to push these ideas into the mainstream industry of his day. Although one might claim this is the obvious stance for someone of Babbage's analytical brilliance to take, it is undeniable today that it was also intrinsically shaped by Babbage's view of human intelligence and its associated qualities.

The Economics of Skill
             Babbage supported Adam Smith's idea of the division of labour. As can be seen in Schaffer's article, this stemmed from his belief that the labour of production should be divided into small portions, each based on the level of 'skill' required to execute it, and he demarcated one task from another as being 'more' or 'less' demanding of skill. Thus Babbage not only acknowledged that different tasks require different skills, but also ranked tasks by the amount of skill they required. He went on to compare tasks requiring entirely different sets of skills, those in different sectors, so to speak.
             For example, he believed that through standardization and accurate mechanisms the task of an artisan could be mechanized and hence reduced to its wage equivalent. What was once a very precious skill, resting on mastery of one's hands and motor control, was thereby trivialized by factory machines and hence, according to Babbage, less demanding of skill. By contrast, Babbage held the ability to make such a machine, a design-oriented task like his own designs of the analytical and difference engines, to be an "admirable adaptation of human skill and intelligence". He believed that the machine-oriented factory production system, as stated in his celebrated text 'On the Economy of Machinery and Manufactures', gave "to the present age its peculiar and wonderful characteristic, namely the triumph of mind over matter". It is clear that Babbage laid stress upon the designer (the intelligent mind) of an invention rather than the manufacturer (the crafting body) of the object, even though both might require considerable skill in different realms, one more cerebral, the other more an application of the body, demanding coordination and sound control of motor skills.
Conclusion
                Babbage went even further, saying of his inventions: "My right to dispose, as I will, of such inventions cannot be contested; it is more sacred in its nature than any hereditary or acquired property, for they are the absolute creations of my own mind." We can see that, for Babbage, the owner of an article is the person who designs it rather than the person who crafts it, as one might expect from his weighting and assessment of the 'level of skill' required to perform a task. Thus, on analysis, we find that many of the views Babbage held, and helped to shape, about manufacture and economy were based on his working philosophy of the superiority of the mind over the body, a belief I feel he made evident in his outlook towards machines, manufacture and, above all, factory-based production, which has in large part shaped the world as we know it today.
References:

1. Schaffer, Simon - "Babbage's Intelligence: Calculating Engines and the Factory System." Critical Inquiry 21, no. 1 (1994): 203-227.

2. Zimmerman, Andrew - “The Ideology of the Machine and the Spirit of the Factory: Remarx on Babbage and Ure.” Cultural Critique No. 37 (Autumn, 1997), pp. 5-29

Barath

A discussion on "Babbage located Intelligence in the Mind not the attentive crafting body."

The relative importance of the processes of thinking versus crafting has long been a fundamental question: does any such hierarchy exist at all, and on what basis can we say so if it does? Here I discuss Babbage's idea of intelligence being located in the mind, and not in the attentive crafting body, with reference to Simon Schaffer's article "Babbage's Intelligence".


Introduction : 
   To understand the scope and radicality of Babbage's idea, we must first understand the socio-economic situation of his time. In Babbage's Victorian England, the traditional thinking was that skill was a property inherent in the workers themselves. Skill was reckoned to be scarcely communicable outside the restricted sphere of the workmen, a sphere designed to remain opaque to the surveillance of managers and inspectors. In this light, I would say that Babbage's idea was revolutionary, and hence quite controversial.

Babbage's own struggle :
  Babbage's strong belief in this idea is clearly demonstrated by his quote regarding the intellectual rights over the difference engine:

 "My right to dispose, as I will, of such inventions cannot be contested; it is more sacred in its nature than any hereditary or acquired property, for they are the absolute creations of my own mind."

  Also, the deep-rooted opposing belief in society comes to light in the fact that, although the design of the engine was entirely Babbage's, he had to fight a legal battle for almost a decade to gain rights over it.

The difference engine :
   In Babbage's time, the word "computer" had a totally different meaning from the one it has now. Computers were workers employed to do mathematical calculations for observatories and the like; workers who, instead of doing physical labour, were paid for the intellectual work they did. Having done the job of a computer himself, Babbage knew the process and envisioned a machine that would do the calculations instead of humans. Thus the difference engine was born, and "computer" became an anthropomorphism of the (human) computer.
  Then came the real difficulty: getting acceptance. England was going through the industrial revolution and physical labour was being mechanized, but the challenge was accepting that even intellectual labour, like calculating, could be mechanized. That is precisely what the difference engine was going to do.
   The difference engine can be looked at as the materialization of the idea which Babbage so dearly nurtured: "Intelligence is not in the body, but in the mind." It is the very idea on which the whole concept of the engine is based: in order to create a thinking (calculating) machine, there is no need to simulate how the body functions; rather, the process of thinking itself can be simulated. This would not have been possible if intelligence and body were not separable.

Consequences :
  A very important concept that finds its roots in the idea of intelligence-body separation is that of an industrial "Panopticon". Once we separate intelligence from the body, it immediately follows that, in order to get maximum productivity from a factory, the 'unintelligent' crafting bodies of the workers must be supervised. To put it in Babbage's words,
"One great advantage which we may derive from machinery is from the check which it affords against the inattention, the idleness, or the dishonesty of human agents."

  Babbage also emphasized concealing the labour and work force in order for the machines' intelligence to be noticed. As Schaffer puts it,
"To make machines look intelligent it was necessary that the sources of their power, the labor force which surrounded and ran them, be rendered invisible."
Conclusion : 
   Babbage's idea shows his mechanistic way of looking at things. It emphasizes the belief that even intelligence can be mechanized. The question is: where are we now? Have we mechanized intelligence? It is true that we have come a long way from Vaucanson's defecating duck, through Babbage's engines, to today's "intelligent" robots like Kismet. But we still have a long way to go before we capture all forms of intelligence in a machine; only then will the idea of the separation of body and intelligence stop being an 'idea' and become 'fact'.