ABACUS TO COMPUTER – GEETHA CHART


Man first used his fingers and then his toes to count things. Gradually, he needed to count more things than his fingers and toes could keep track of. Early man used pebbles to count his cattle. He graduated to the first calculating device with the Chinese development of the abacus, a rectangular wooden frame with beads strung on horizontal rods. Today, it is used by nursery children plodding through elementary arithmetic lessons.

From the abacus, man moved on to the odometer, a forerunner of today’s speedometer, which led to the development of mechanical adders and multipliers. The next stage was John Napier’s logarithm tables (1614), which simplified computing techniques and led to the development of the slide rule. The slide rule used Napier’s logarithmic scales on two rules, one sliding against the other in a groove, facilitating rapid calculation. By about 1680, the slide rule began to give way to mechanical calculators. The need to do large amounts of complex computing at increasingly faster rates forced man to develop machines to do the counting for him.

FIRST MACHINE

The first mechanical calculating machine was made by the French mathematician Blaise Pascal. The device consisted of gears, wheels, and dials; the wheels had ten segments each and worked on the principle that when one wheel completed a full rotation, the next wheel moved one step. His calculator could perform addition and subtraction by dialling these wheels, which bore the numbers 0 to 9 around them. A German, Gottfried Leibniz, later modified Pascal’s calculator. In 1822, Charles Babbage designed a machine called the ‘Difference Engine’, which could accurately evaluate algebraic expressions and mathematical tables up to 20 decimal places. Babbage’s device could be called a Neanderthal computer compared with today’s supercomputers. Later, automatic computing machines were developed that could perform 80 additions per minute and had a memory.

Another genius in the field of computing was Dr Herman Hollerith of the USA, who added to the arithmetic capability of the existing calculator the ability to store intermediate results for subsequent calculations, thus eliminating the copying and re-entering of data. Data were entered, together with the sequence of operations to be performed (which we now call a program), automatically from sets of punched cards. Cards are even today a common medium for feeding information into computers.

EARLY COMPUTERS

The late ’30s of this century saw the development of various types of computers, but all were mechanical machines. In 1944, an electrically operated computer was set up at Harvard University. After World War II, significant improvements in computer making were made. The rapid development of computers after World War II can be traced through five stages, commonly called generations.

FIRST GENERATION COMPUTER

These computers were made using thermionic valves like the ones used in radio sets. The Electronic Numerical Integrator and Calculator, popularly known as ENIAC, was the first electronic calculator of this generation. It could perform 5,000 additions or 350 multiplications in a second. The first-generation computers were voluminous and had slow operating speeds.

SECOND GENERATION

The invention of the transistor heralded the second-generation computer. Transistors are small electronic devices made of semiconductors, often used in place of thermionic valves in radio sets. Only a very low voltage is required to operate these transistors, which can amplify small currents. Using transistors instead of valves reduced the size and manufacturing cost of computers.

THIRD GENERATION

The third generation saw the birth of the integrated circuit, or chip, in which several transistors were integrated with other components and sealed into a small package. The use of integrated circuits (ICs) further reduced the size of computers. Computers of this type were called minicomputers.

FOURTH GENERATION

In third-generation computers, the chips or ICs used were large and expensive. Scientists were struck by the novel idea of placing all the components of an entire computer on a single semiconductor chip. It was a daring conceptual move, and after much wrestling with the design the results bore fruit. The entire computer circuitry on a single semiconductor chip is called a microprocessor, and a computer built around such a chip is called a microcomputer. The Japanese succeeded in producing pocket calculators based on it; present-day pocket calculators are the progeny of the microprocessor.

FIFTH GENERATION

A fifth-generation computer is a new super breed of computer. These will hold vast stores of information and will be able to think and make decisions. The human brain is the most extraordinary computer, assembled over thousands of years. Questions like “Can a machine think?” and “What will be the nature of its intelligence?” have always intrigued scientists.

MICROPROCESSOR

The invention of the microprocessor has taken man closer to producing artificial intelligence. An electronic chip can do everything from guiding a missile to roasting bread. Yet most scientists regard such chips as ‘dumb brutes’, since they do only what they are told to do, without the kind of perception that takes delight in the effectiveness of a poem or the incongruity of a response; without such perception, one cannot say that intelligence is present.

Mere routine repetition of steps, which might involve adding numbers or solving equations, relies more on mechanics than on intellect. There is an ongoing race to develop an ‘intelligent computer’, a field popularly known as artificial intelligence. Two kinds of behaviour are reasonable grounds for classifying behaviour as intelligent and can be elicited from a computer: learning and reasoning. Scientists have been able to teach computers to play chess and checkers. They have developed programs that create the will to win and chalk out moves in advance. Thus, a computer can learn to play.
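
Broadly, such programs ‘chalk out moves in advance’ by looking ahead through possible moves and counter-moves and choosing the line that scores best. The short Python sketch below illustrates that look-ahead idea on a made-up game tree; the function and the scores are invented for illustration and are not from the source.

    def best_score(position, our_turn=True):
        # A leaf is a number scoring the position; an inner node is a list
        # of positions reachable by one move. We pick the best move for us,
        # assuming the opponent then picks the worst one for us, and so on.
        if isinstance(position, (int, float)):
            return position
        scores = [best_score(p, not our_turn) for p in position]
        return max(scores) if our_turn else min(scores)

    # Two moves of look-ahead on an invented tree of position scores.
    print(best_score([[3, 5], [2, 9]]))   # 3: the best outcome we can guarantee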

A computer can learn by watching others, by reading, by being told, and by trial and error, but it starts with pre-programmed knowledge that has been fed into it. A computer’s thought or decision-making process proceeds through ‘if-then’ constructions: if this is true (or false), or if something else is true, then do this. To take an example from day-to-day life, if a person is 25 years old and has earned an M.A. in Political Science or an M.A. in Sociology, then the computer prints that he is eligible for such a job. This thinking or decision-making process is done with the help of Boolean algebra, which works with a numbering system that has only 0 and 1 as digits.
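
A minimal sketch of such an if-then rule, written here in Python purely for illustration (the function name, the age test, and the degree labels are assumptions invented for this example, not part of any system described above):

    def is_eligible(age, degree):
        # The rule from the example above: a candidate of 25 with an M.A. in
        # Political Science or Sociology is eligible; anyone else is not.
        if age == 25 and degree in ("M.A. Political Science", "M.A. Sociology"):
            return True    # Boolean 1: the condition holds
        return False       # Boolean 0: the condition fails

    print(is_eligible(25, "M.A. Sociology"))          # True
    print(is_eligible(30, "M.A. Political Science"))  # False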

Claude Shannon established the relationship between Boolean algebra and the flow of current through switching circuits. ‘0’ represents a switch turned off, and ‘1’ represents a switch turned on. This has been effectively put to use to make computers reach decisions. Present-day computers are serial processors in that they proceed from point to point, one step at a time, with the next step determined by the result of the previous one. Human beings, in contrast, use not only serial processing but also parallel processing, in which several trains of thought, some conscious and others not, are underway together.
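
A small sketch of Shannon’s idea, again in Python for illustration only: with 0 for a switch turned off and 1 for a switch turned on, two switches in series behave like the logical AND, and two switches in parallel behave like the logical OR (the helper names are invented for this example).

    # 0 = switch off (no current), 1 = switch on (current flows)
    def series(a, b):
        # Two switches in series: current flows only if both are on (AND).
        return a & b

    def parallel(a, b):
        # Two switches in parallel: current flows if either is on (OR).
        return a | b

    print(series(1, 0))    # 0 - one open switch breaks a series circuit
    print(parallel(1, 0))  # 1 - a parallel branch still carries current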

COMPUTER INTELLIGENCE

The capabilities of computers are increasing day by day at a fantastic rate, while raw human intelligence is changing slowly, if at all. Artificial intelligence has only recently developed a sufficient track record of accomplishment to attract industrial interest. The most notable achievements are in complex functions generally associated with human expertise. Progress in artificial intelligence has also been noteworthy in fields such as natural language understanding and vision. Scientists have long dreamed of a machine that recognises and reacts to the human voice.

Scientists on a speech recognition research programme in Britain are making that dream seem possible. Simple systems exist, but they do not offer accurate voice recognition. They range from toys that respond to any sound – a handclap is as good as a command – to machines that recognise only a limited vocabulary in ideal conditions. Several factors make it hard to improve matters; accents differ between individuals and even for the same person. An IBM computer system has demonstrated accurate recognition of a 5,000-word vocabulary. The commercial and military sectors, and significant parts of the health and education sectors, are now exploring techniques developed by artificial intelligence researchers.

Trials are underway on a new technique called FACES (facial analysis, comparison, and elimination system) for locating photographs of offenders by computer using witnesses’ descriptions. By 1990, intelligent machines will be working alongside our best minds. The product of man’s brain will become his salvation in a world of crushing complexities.

A computerized library in Britain is attracting interest in several countries. It can order new books and calculate costs from many foreign currency rates. It even chases up an order if it becomes overdue and keeps track of the movement of books on and off the shelves, into a student’s briefcase or back to a bookbinder.

Source: Press Release
PIB
PRESS INFORMATION BUREAU
GOVERNMENT OF INDIA
Date: February 1, 1988