The replacement of vacuum tubes by transistors marked the advent of the second generation of computing. Transistors were a huge improvement over the vacuum tube: although they still subjected computers to damaging levels of heat, they made computers smaller, faster, cheaper and far less power-hungry. They also moved programming from raw binary machine code toward symbolic languages, which meant programmers could write instructions in words.
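The shift from binary machine code to symbolic instructions can be illustrated with a toy assembler. The three-instruction "machine" below is entirely hypothetical, invented for this sketch; real second-generation assemblers were far richer.

```python
# A toy illustration of the shift from raw binary to symbolic coding.
# The opcodes and instruction format below are invented for this sketch.

OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(line):
    """Translate one symbolic instruction, e.g. 'ADD 5', into binary."""
    mnemonic, operand = line.split()
    return OPCODES[mnemonic] + format(int(operand), "04b")

program = ["LOAD 2", "ADD 5", "STORE 9"]
machine_code = [assemble(line) for line in program]
print(machine_code)  # ['00010010', '00100101', '00111001']
```

The point is only that a word like LOAD is easier to write and check than the bit pattern it stands for; the translation itself is mechanical.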
The early versions of these machines were developed for the atomic energy industry. The first computer to use transistors was the TX-0, introduced in 1956.

The third generation replaced individual transistors with integrated circuits (ICs). Using ICs helped reduce the size of computers even further compared with second-generation machines, and made them faster as well. By this phase, transistors were being miniaturized and placed on silicon chips, called semiconductors.
This led to a massive increase in the speed and efficiency of these machines. These were the first computers whose users interacted through keyboards and monitors connected to an operating system, a significant leap from the punched cards and printouts of earlier generations.
This enabled these machines to run several applications at once, using a central program that monitored memory. Nearly all computers since the mid-to-late 1960s have used ICs. While the third generation is considered by many to have spanned from 1964 to 1971, ICs are still used in computers today. Over 45 years later, today's computers have deep roots going back to the third generation. Microprocessors, along with integrated circuits, made it possible for computers to fit easily on a desk and led to the introduction of the laptop.
Some of the earliest computers to use a microprocessor include the Altair 8800, the IBM 5100, and the Micral. Today's computers still use microprocessors, even though many consider the fourth generation to have already ended. This revolution can be summed up in one word: Intel.
What filled a room in the 1940s now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, housed thousands of integrated circuits. In 1981, IBM introduced its first computer designed specifically for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors even moved beyond the realm of computers and into an increasing number of everyday products.
The increased power of these small computers meant they could be linked together to create networks, which ultimately led to the development and rapid evolution of the Internet. Other major advances during this period include the graphical user interface (GUI), the mouse and, more recently, astounding advances in laptop capability and hand-held devices.
Leaps have been made in AI technology and computers, but there is still much room for improvement. Computing devices with artificial intelligence are still in development, though some of these technologies, such as voice recognition, are beginning to emerge and see real use. The Google search engine also uses AI to process user searches. AI is being made practical by parallel processing and superconductors.
Looking to the future, computers will be radically transformed again by quantum computation, molecular computing and nanotechnology. The essence of the fifth generation will be using these technologies to create machines that can process and respond to natural language, and that have the capability to learn and organise themselves.
The Ten Commandments of Computer Ethics

1. Thou shalt not use a computer to harm other people.
2. Thou shalt not interfere with other people's computer work.
3. Thou shalt not snoop around in other people's computer files.
4. Thou shalt not use a computer to steal.
5. Thou shalt not use a computer to bear false witness.
6. Thou shalt not copy or use proprietary software for which you have not paid.
7. Thou shalt not use other people's computer resources without authorization or proper compensation.
8. Thou shalt not appropriate other people's intellectual output.
9. Thou shalt think about the social consequences of the program you are writing or the system you are designing.
10. Thou shalt always use a computer in ways that ensure consideration and respect for your fellow humans.

Like human language, there are many different computer languages. Essentially, computer software can be divided into three main groups depending on use and application.
These are system software (the operating system, referred to simply as the OS), application software, and programming languages. Most of us interact with a computer through application software.
System software: System software, or the operating system, is the software the computer uses to translate inputs from various sources into a language the machine can understand. Basically, the OS coordinates the different hardware components of a computer. There are many operating systems on the market; the most popular come from the stable of Microsoft.
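The OS-as-intermediary role described above is visible even from a scripting language. In the sketch below, a program never drives the disk hardware itself; it asks the operating system through system calls, which Python's standard `os` module exposes almost directly. The file name is invented for the example.

```python
import os
import tempfile

# An application asks the OS to do I/O via system calls; it never
# touches the disk controller itself.
path = os.path.join(tempfile.gettempdir(), "os_demo.txt")

fd = os.open(path, os.O_CREAT | os.O_WRONLY | os.O_TRUNC)  # open() syscall
os.write(fd, b"hello from user space")  # write(): the OS schedules the I/O
os.close(fd)                            # close(): OS reclaims the descriptor

with open(path, "rb") as f:             # the same dance via a high-level API
    print(f.read())  # b'hello from user space'
```

Whether the program uses the low-level descriptor calls or the friendlier `open()`, every byte still passes through the operating system.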
We have all heard of, used and wondered at Windows, which is an OS. Starting with Windows, Microsoft has since migrated to Vista, its latest offering on the market. It may come as a surprise to some that there are other operating systems in use as well. Among these, UNIX is used for large office setups with extensive networking.
XENIX is software which has now become redundant. Apache is quite popular on web servers, though strictly speaking it is a web server rather than an operating system. IBM still uses proprietary operating systems for its mainframes; such proprietary systems are generally built as variants of the UNIX operating system. Application software: A normal user rarely gets to see the operating system or work with it directly. But all of us are familiar with application software, which we must use to interact with a computer.
Popular examples of application software include the Microsoft Office suite, which comprises Word, Excel and PowerPoint.
We have used these applications extensively. Internet Explorer and Mozilla Firefox are two applications used to access the internet. E-mail software like Outlook Express is used to manage e-mail. All software used for working on a computer is classified as application software; in fact, every user interface is an application.

The ENIAC, the first general-purpose electronic computer, was about 1,000 times faster than the electromechanical computers that preceded it, but was a little slow when it came to re-programming.
Among many other things, the ENIAC was used to study the feasibility of thermonuclear weaponry, the firing of ballistic artillery and engine thermal ignition, and, elsewhere, weather prediction.
These systems were enormous in size, occupied entire rooms and consumed large amounts of electric power, which made them generate unbearable heat. The UNIVAC, designed by John Mauchly and J. Presper Eckert, was the first computer of the same era to be designed for commercial rather than military use. It handled both alphabetic and numeric data fairly well and was used by the U.S. Census Bureau to enumerate the general population. It was later used to process payrolls, records and company sales, and it even predicted the results of the 1952 presidential election. It was also half the size of its predecessor, and 46 units were sold.
These were computers which used transistors instead of vacuum tubes. They were better than their predecessors in many ways: smaller, faster and cheaper. Transistors are more or less the building blocks of any microchip, and they are also more reliable, more energy efficient and able to conduct electricity faster and better than vacuum tubes. Just like vacuum tubes, transistors are switches, or electronic gates, used to amplify or control current, or to switch electric signals on and off.
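The switch behaviour just described is what lets transistors implement logic. As a minimal sketch (a software model, not electronics): two switches in series pass current only when both are on, behaving like an AND gate, while two in parallel behave like an OR gate.

```python
# Transistors modelled as on/off switches (1 = conducting, 0 = not).
# Two in series form an AND gate; two in parallel form an OR gate.

def series(a, b):
    """Current flows only if both switches are on."""
    return a and b

def parallel(a, b):
    """Current flows if either switch is on."""
    return a or b

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", series(a, b), "OR:", parallel(a, b))
```

Chaining thousands of such gates is, in caricature, how a processor computes; the real physics of amplification is of course far richer than this truth table.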
Transistors are called semiconductors because they contain elements which lie between conductors and insulators. The transistor was invented at Bell Laboratories in 1947 by the scientists William Shockley, John Bardeen and Walter Brattain, but did not see the light of day in computers until the mid-1950s. Second-generation computers also saw advances in data input and output procedures.
Initially, these processes were similar to those of the last first-generation models. They were tedious because they involved multiple personnel carrying punched cards from room to room. To speed things up, the batch system was devised and implemented.
It involved collecting multiple data jobs on punched cards and feeding them onto a single magnetic tape using a fairly small and inexpensive system, such as the IBM 1401. Processing, on the other hand, was done using a more powerful system, such as the IBM 7094. When data manipulation was complete, the results were transferred back to magnetic tape.
These were the harbingers of operating system software to come. Using a smaller system again, the data was printed out to punched cards as output. Besides the development of operating system software, other commercial applications were also hitting the 'shelves'. This was probably due to the overall upgrade from restrictive binary-based machine code to languages that fully supported symbolic and alphanumeric coding.
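The batch workflow above can be sketched in a few lines. The machine names and job contents are invented for the illustration; the point is the structure: a cheap peripheral machine spools jobs onto one tape, the expensive main system consumes the whole tape in a single pass, and results are spooled back out.

```python
from collections import deque

def spool_to_tape(card_jobs):
    """Peripheral computer: gather punched-card jobs onto one 'tape'."""
    return deque(card_jobs)

def run_batch(tape, process):
    """Main computer: run every job on the tape in one uninterrupted pass."""
    results = []
    while tape:
        results.append(process(tape.popleft()))
    return results

tape = spool_to_tape(["job-A", "job-B", "job-C"])
output = run_batch(tape, process=str.upper)
print(output)  # ['JOB-A', 'JOB-B', 'JOB-C']
```

Keeping the expensive processor busy with one long tape, instead of idling between hand-fed card decks, was the entire economic argument for batching.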
The early mainframes and supercomputers were just some of the machines that took advantage of transistors. The semiconductor IC packed a huge number of transistors, capacitors, diodes and rectifiers onto a single germanium or silicon chip, and these chips were then mounted on a printed circuit board. The implementation of these computers was also in line with Moore's Law, which observed that transistor sizes were shrinking so fast that twice as many would fit on a new microchip every two years for at least the next decade.
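Moore's observation reduces to simple arithmetic: one doubling per two-year period. The starting transistor count below is illustrative (roughly the scale of an early-1970s microprocessor), not a figure from the text.

```python
# Moore's Law as arithmetic: transistor counts double every two years.
start = 2_300  # illustrative starting count

for years in range(0, 11, 2):
    doublings = years // 2
    count = start * 2 ** doublings
    print(f"after {years:2d} years: {count:,} transistors")
# after 10 years (5 doublings): 2,300 * 32 = 73,600
```

Five doublings in a decade means a 32-fold increase, which is why the trend compounded so dramatically over successive generations.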
The IC sought to solve the cumbersome procedures that went into designing transistor circuitry. Manually interconnecting the capacitors, diodes and rectifiers in transistor circuits was time-consuming and not completely reliable. Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Corporation independently discovered the benefits of integrated circuits, in 1958 and 1959 respectively.
Kilby built his IC on germanium, whereas Noyce built his on a silicon chip. One of the first systems to use the IC was the IBM 360, which was packed with the muscle to handle both commercial and scientific assignments. Besides the reduction in cost, the speed and performance of any one computer increased tremendously once multiple transistors were placed on a single chip.
Since its invention, IC speed has doubled roughly every two years, shrinking both the size and cost of computers even further. Almost all electronic devices today use some form of integrated circuit mounted on a printed circuit board. The IC circuitry aside, interaction with computers improved: instead of punched cards and printouts, keyboards and better input peripherals were used to enter data, which was displayed for output on visual display units.
Computers now used operating system software to manage hardware and resources. This allowed systems to run several applications at a time, coordinated by a central program that monitored memory distribution.
This generation also ushered in the concept of the 'computer family', which challenged manufacturers to develop components that were compatible with other systems. The next generation of mainframes and supercomputers took further advantage of integrated circuits, and the birth of the microprocessor was at the same time the birth of the microcomputer.