
Sunday, August 14, 2011

haaaiiisssttt................ I wish there were no more exams!   REVIEWING IS SO HARD!

Monday, August 8, 2011

summary of chapter 3 lesson 3 (ICT)

HOW TO TAKE CARE OF YOUR PC

  1. Computers need a good working temperature to work properly. They need to be placed in a cool and dry place.
  2. All the cables and connectors must be tied together to keep them away from walkways to avoid accidents.
  3. Refrain from eating or drinking in front of a computer.
  4. Always use an AVR to regulate the electricity. Too much voltage might cause the unit to short. Also, turn off computers during lightning storms, as lightning can cause electrical surges that damage the computer.
  5. Do not bump or drop the computer peripherals or components as any damage may cause them to malfunction.
  6. Avoid clutter around your computer. Use soft cloth in cleaning your computer to avoid scratches.
  7. Always scan for computer viruses.
USER'S HEALTH RISKS AND PREVENTION
GOOD WORKING HABITS:
  • Tap on the keys and mouse buttons gently
  • Avoid long uninterrupted periods of typing
  • Avoid staring at the monitor for long periods
PROPER WORKSTATION DESIGN:
  • Position in a well-ventilated, comfortable room
  • Use an adjustable workstation and an ergonomic computer chair
  • Place the monitor 16 to 24 inches away, at eye level or slightly below. Tilt the monitor or adjust the light source to avoid glare
  • Use the keyboard's extendable/retractable legs
  • Place mouse where it is easily accessible by your dominant hand
  • Use a document holder to minimize vertical head movements
PROPER POSTURE:
  • Sit up straight and keep your feet flat on the floor
  • Position your lower arms parallel with the floor and level to your keyboard, with elbows at your side
  • Keep wrists straight
  • Do not lean into the monitor, but sit close enough to the keyboard and the mouse to stay relaxed
 




??????

when exactly are the periodicals??

Tuesday, July 26, 2011

Types of Computers

          A computer is one of the most brilliant inventions of mankind. Thanks to computer technology, we can store and process data efficiently, resting our own memories by entrusting information to computer storage. Owing to computers, we have been able to speed up daily work, carry out critical transactions, and achieve accuracy and precision in work output. The computers of earlier years were the size of a large room and consumed huge amounts of electric power. With advancing technology, however, computers have shrunk to the size of a small watch. Depending on their processing power and size, computers have been classified into various types. Let us look at the classification of computers.


Different Types of Computers

Based on their operating principle, computers are categorized as analog, digital, and hybrid computers.

Analog Computers: These are almost extinct today. They differ from digital computers in that an analog computer can perform several mathematical operations simultaneously. It uses continuous variables for mathematical operations and utilizes mechanical or electrical energy.

Hybrid Computers: These computers are a combination of both digital and analog computers. In this type of computer, the digital segment performs process control by converting analog signals to digital ones.




Microcomputers

A microcomputer is a computer that has a microprocessor chip as its CPU. They are often called personal computers because they are designed to be used by one person at a time. Personal computers are typically used at home, at school, or at a business. Popular uses for microcomputers include word processing, surfing the web, sending and receiving e-mail, spreadsheet calculations, database management, editing photographs, creating graphics, and playing music or games.
Personal computers come in two major varieties, desktop computers and laptop computers:
 


Desktop computers are larger and not meant to be portable. They usually sit in one place on a desk or table and are plugged into a wall outlet for power. The case of the computer holds the motherboard, drives, power supply, and expansion cards. This case may lie flat on the desk, or it may be a tower that stands vertically (on the desk or under it). The computer usually has a separate monitor (either a CRT or LCD), although some designs have a display built into the case. A separate keyboard and mouse allow the user to input data and commands.
 

Laptop or notebook computers are small and lightweight enough to be carried around with the user. They run on battery power, but can also be plugged into a wall outlet. They typically have a built-in LCD display that folds down to protect the display when the computer is carried around. They also feature a built-in keyboard and some kind of built-in pointing device (such as a touch pad).
While some laptops are less powerful than typical desktop machines, this is not true in all cases. Laptops, however, cost more than desktop units of equivalent processing power because the smaller components needed to build laptops are more expensive.

PDAs and Palmtop Computers

 

A Personal Digital Assistant (PDA) is a handheld microcomputer that trades off power for small size and greater portability. They typically use a touch-sensitive LCD screen for both output and input (the user draws characters and presses icons on the screen with a stylus). PDAs communicate with desktop computers and with each other either by cable connection, infrared (IR) beam, or radio waves. PDAs are normally used to keep track of appointment calendars, to-do lists, address books, and for taking notes.
A palmtop or handheld PC is a very small microcomputer that also sacrifices power for small size and portability. These devices typically look more like a tiny laptop than a PDA, with a flip-up screen and small keyboard. They may use Windows CE or a similar operating system for handheld devices.
Some PDAs and palmtops contain wireless networking or cell phone devices so that users can check e-mail or surf the web on the move.

Workstations/Servers

 

A workstation is a powerful, high-end microcomputer. Workstations contain one or more microprocessor CPUs and may be used by a single user for applications requiring more power than a typical PC offers (rendering complex graphics, or performing intensive scientific calculations).
Alternately, workstation-class microcomputers may be used as server computers that supply files to client computers over a network. This class of powerful microcomputers can also be used to handle the processing for many users simultaneously who are connected via terminals; in this respect, high-end workstations have essentially supplanted the role of minicomputers (see below).
Note! The term “workstation” also has an alternate meaning: in networking, any client computer connected to the network that accesses server resources may be called a workstation. Such a network client workstation could be a personal computer or even a “workstation” as defined at the top of this section. Note: Dumb terminals are not considered to be network workstations (client workstations on the network are capable of running programs independently of the server, but a terminal is not capable of independent processing).
There are classes of computers that are not microcomputers. These include supercomputers, mainframes, and minicomputers.

Minicomputers

 

A minicomputer is a multi-user computer that is less powerful than a mainframe. This class of computers became available in the 1960’s when large scale integrated circuits made it possible to build a computer much cheaper than the then existing mainframes (minicomputers cost around $100,000 instead of the $1,000,000 cost of a mainframe).
The niche previously filled by the minicomputer has been largely taken over by high-end microcomputer workstations serving multiple users (see above).

Mainframes

   
A mainframe computer is a large, powerful computer that handles the processing for many users simultaneously (up to several hundred users). The name mainframe originated after minicomputers appeared in the 1960’s, to distinguish the larger systems from the smaller minicomputers.
Users connect to the mainframe using terminals and submit their tasks for processing by the mainframe. A terminal is a device that has a screen and keyboard for input and output, but it does not do its own processing (terminals are also called dumb terminals since they can’t process data on their own). The processing power of the mainframe is time-shared between all of the users. (Note that a personal computer may be used to “emulate” a dumb terminal to connect to a mainframe or minicomputer: you run a program on the PC that pretends to be a dumb terminal.)
Mainframes typically cost several hundred thousand dollars. They are used in situations where a company wants the processing power and information storage in a centralized location. Mainframes are also now being used as high-capacity server computers for networks with many client workstations.

Supercomputers

 

A supercomputer is a mainframe computer that has been optimized for speed and processing power. The most famous series of supercomputers were designed by the company founded and named after Seymour Cray. The Cray-1 was built in 1976 and installed at Los Alamos National Laboratory. Supercomputers are used for extremely calculation-intensive tasks such as simulating nuclear bomb detonations, aerodynamic flows, and global weather patterns. A supercomputer typically costs several million dollars.
Recently, some supercomputers have been constructed by connecting together large numbers of individual processing units.

Information appliances

Information appliances are computers specially designed to perform a specific user-friendly function, such as playing music, photography, or editing text. The term is most commonly applied to mobile devices, though there are also portable and desktop devices of this class.


Embedded computers

Embedded computers are computers that are a part of a machine or device. Embedded computers generally execute a program that is stored in non-volatile memory and is only intended to operate a specific machine or device. Embedded computers are very common. They are typically required to operate continuously without being reset or rebooted, and once employed in their task the software usually cannot be modified. An automobile may contain a number of embedded computers; a washing machine or a DVD player, however, would typically contain only one. The central processing units (CPUs) used in embedded computers are often sufficient only for the computational requirements of the specific application and may be slower and cheaper than CPUs found in a personal computer.
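To make this concrete, here is a minimal sketch in C of the kind of never-ending control loop an embedded computer might run. The register addresses and the temperature threshold are hypothetical, chosen only for illustration; real firmware would read and write the actual hardware registers of its device.

    /* Hypothetical embedded control loop for a heater controller. */
    #include <stdint.h>

    /* Hypothetical memory-mapped device registers (illustrative addresses). */
    #define TEMP_SENSOR (*(volatile uint8_t *)0x40000000)
    #define HEATER_CTRL (*(volatile uint8_t *)0x40000004)

    int main(void)
    {
        /* Embedded programs typically run forever and are never rebooted. */
        for (;;) {
            uint8_t temp = TEMP_SENSOR;          /* read the sensor */
            HEATER_CTRL = (temp < 60) ? 1 : 0;   /* heater on below 60 degrees */
        }
        return 0; /* never reached */
    }

Note how little the program does: as described above, the hardware and software are sized to one specific task and nothing more.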

Lunes, Hulyo 25, 2011

HISTORY OF COMPUTERS
         The abacus was an early aid for mathematical computations. Its only value is that it aids the memory of the human performing the calculation. A skilled abacus operator can work on addition and subtraction problems at the speed of a person equipped with a hand calculator (multiplication and division are slower). The abacus is often wrongly attributed to China. In fact, the oldest surviving abacus was used in 300 B.C. by the Babylonians. The abacus is still in use today, principally in the Far East. A modern abacus consists of rings that slide over rods, but older versions used loose pebbles for counting.

          In 1614 an eccentric Scotsman named John Napier invented logarithms, a technique that allows multiplication to be performed via addition. The magic ingredient is the logarithm of each operand, which was originally obtained from a printed table. But Napier also invented an alternative to tables, where the logarithm values were carved on ivory sticks which are now called Napier's Bones (published in 1617).
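The trick rests on the basic identity of logarithms, which turns multiplication into addition. In modern notation (standard mathematics, not taken from Napier's own tables):

    \log(x \cdot y) = \log x + \log y

    \text{e.g. } \log_{10}(8 \times 125) = \log_{10} 8 + \log_{10} 125 \approx 0.9031 + 2.0969 = 3,
    \qquad 10^{3} = 1000 = 8 \times 125.

A calculator using tables would look up the two logarithms, add them, and then look up which number has that sum as its logarithm.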

           Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960's by the NASA engineers of the Mercury, Gemini, and Apollo programs which landed men on the moon.

             The first gear-driven calculating machine to actually be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. This device got little publicity because Schickard died soon afterward of the bubonic plague.

             In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add) but couldn't sell many because of their exorbitant cost and because they really weren't that accurate. Up until the present age when car dashboards went digital, the odometer portion of a car's speedometer used the very same mechanism as the Pascaline to increment the next wheel after each full revolution of the prior wheel. Pascal was a child prodigy: at the age of 12, he was discovered doing his version of Euclid's thirty-second proposition on the kitchen floor. Pascal went on to invent probability theory, the hydraulic press, and the syringe.

              Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor with Newton of calculus) managed to build a four-function calculator that he called the stepped reckoner because, instead of gears, it employed fluted drums having ten flutes arranged around their circumference in a stair-step fashion. Although the stepped reckoner employed the decimal number system (each drum had 10 flutes), Leibniz was the first to advocate use of the binary number system which is fundamental to the operation of modern computers. Leibniz is considered one of the greatest of the philosophers but he died poor and alone.
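To see why binary suits machines, note that any whole number can be written with only the digits 0 and 1, each digit standing for a power of two; two digits map naturally onto two-state devices such as a switch that is open or closed. A standard worked example:

    13 = 1 \cdot 2^{3} + 1 \cdot 2^{2} + 0 \cdot 2^{1} + 1 \cdot 2^{0} = 1101_{2}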

              In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could base its weave upon a pattern automatically read from punched wooden cards, held together in a long row by rope. Descendants of these punched cards have been in use ever since.


              By 1822 the English mathematician Charles Babbage was proposing a steam-driven calculating machine the size of a room, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. He obtained government funding for this project due to the importance of numeric tables in ocean navigation; by promoting their commercial and military navies, the British had managed to build the earth's greatest empire. But at that time the British government was publishing a seven-volume set of navigation tables which came with a companion volume of corrections, showing that the set had over 1000 numerical errors. It was hoped that Babbage's machine could eliminate errors in these types of tables. But construction of Babbage's Difference Engine proved exceedingly difficult and the project soon became the most expensive government-funded project up to that point in English history. Ten years later the device was still nowhere near complete, acrimony abounded between all involved, and funding dried up. The device was never finished.

               In 1890 Herman Hollerith built a punched-card tabulating machine to speed up the United States census. Hollerith's invention, known as the Hollerith desk, consisted of a card reader which sensed the holes in the cards, a gear-driven mechanism which could count, and a large wall of dial indicators to display the results of the count.

                Hollerith built a company, the Tabulating Machine Company, which after a few buyouts eventually became International Business Machines, known today as IBM. IBM grew rapidly and punched cards became ubiquitous. IBM continued to develop mechanical calculators for sale to businesses to help with financial accounting and inventory accounting.

               One early success was the Harvard Mark I computer, which was built as a partnership between Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S., but it was not a purely electronic computer. Instead the Mark I was constructed out of switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50-foot rotating shaft running its length, turned by a 5-horsepower electric motor. The Mark I ran non-stop for 15 years, sounding like a roomful of ladies knitting.



                In 1953 Grace Hopper invented the first high-level language, "Flow-matic". This language eventually became COBOL which was the language most affected by the infamous Y2K problem. A high-level language is designed to be more understandable by humans than is the binary language understood by the computing machinery. A high-level language is worthless without a program -- known as a compiler -- to translate it into the binary language of the computer and hence Grace Hopper also constructed the world's first compiler. Grace remained active as a Rear Admiral in the Navy Reserves until she was 79.
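As a rough modern illustration of what a compiler does (using C rather than Flow-matic or COBOL, simply because a C compiler is easy to find today), the human-readable program below is something no CPU can execute directly; a compiler translates it into the processor's binary instructions:

    /* A human-readable, high-level program... */
    #include <stdio.h>

    int main(void)
    {
        double gross_pay = 40.0 * 15.50;   /* hours worked times hourly rate */
        printf("Gross pay: %.2f\n", gross_pay);
        return 0;
    }

    /* ...which a compiler turns into binary machine code, e.g.:
           cc pay.c -o pay
       The CPU then executes the resulting binary, not the C text. */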
 
                The Mark I operated on numbers that were 23 digits wide. It could add or subtract two of these numbers in three-tenths of a second, multiply them in four seconds, and divide them in ten seconds. Forty-five years later computers could perform an addition in a billionth of a second! Even though the Mark I had three quarters of a million components, it could only store 72 numbers! Today, home computers can store 30 million numbers in RAM and another 10 billion numbers on their hard disk. Today, a number can be pulled from RAM after a delay of only a few billionths of a second, and from a hard disk after a delay of only a few thousandths of a second. This kind of speed is obviously impossible for a machine which must move a rotating shaft and that is why electronic computers killed off their mechanical predecessors.

                  On a humorous note, the principal designer of the Mark I, Howard Aiken of Harvard, estimated in 1947 that six electronic digital computers would be sufficient to satisfy the computing needs of the entire United States. IBM had commissioned this study to determine whether it should bother developing this new invention into one of its standard products. Aiken's prediction wasn't actually so bad, as there were very few institutions that could afford the cost of what was called a computer in 1947. He just didn't foresee the micro-electronics revolution, which would allow something like the IBM Stretch computer of 1959 to eventually shrink onto a single chip.

                  The microelectronics revolution is what allowed the enormous amount of hand-crafted wiring in these early machines to be mass-produced as an integrated circuit, which is a small sliver of silicon the size of your thumbnail.

                  The primary advantage of an integrated circuit is not that the transistors are minuscule, but rather that millions of transistors can be created and interconnected in a mass-production process. All the elements on the integrated circuit are fabricated simultaneously via a small number of optical masks that define the geometry of each layer. This speeds up the process of fabricating the computer -- and hence reduces its cost -- just as Gutenberg's printing press sped up the fabrication of books and thereby made them affordable to all.

                 The IBM Stretch computer of 1959 needed its 33-foot length to hold the 150,000 transistors it contained. These transistors were tremendously smaller than the vacuum tubes they replaced, but they were still individual elements requiring individual assembly. By the early 1980s this many transistors could be simultaneously fabricated on an integrated circuit. Today's Pentium 4 microprocessor contains 42,000,000 transistors in a thumbnail-sized piece of silicon.

                It's humorous to remember that in between the Stretch machine (which would be called a mainframe today) and the Apple I (a desktop computer) there was an entire industry segment referred to as mini-computers, such as the PDP-12 computer of 1969.

                One of the earliest attempts to build an all-electronic digital computer was made in 1937 by J. V. Atanasoff, a professor of physics and mathematics at Iowa State University. By 1941 he and his graduate student, Clifford Berry, had succeeded in building a machine that could solve 29 simultaneous equations with 29 unknowns. This machine was the first to store data as a charge on a capacitor, which is how today's computers store information in their main memory (DRAM, or dynamic RAM). As far as its inventors were aware, it was also the first to employ binary arithmetic. However, the machine was not programmable, it lacked a conditional branch, its design was appropriate for only one type of mathematical problem, and it was not further pursued after World War II. Its inventors didn't even bother to preserve the machine, and it was dismantled by those who moved into the room where it lay abandoned.

               Colossus was built during World War II by Britain for the purpose of breaking the cryptographic codes used by Germany. Britain led the world in designing and building electronic machines dedicated to code breaking, and was routinely able to read coded German radio transmissions. But Colossus was definitely not a general-purpose, reprogrammable machine.

               The Harvard Mark I, the Atanasoff-Berry computer, and the British Colossus all made important contributions. American and British computer pioneers were still arguing over who was first to do what, when in 1965 the work of the German Konrad Zuse was published for the first time in English. Scooped! Zuse had built a sequence of general-purpose computers in Nazi Germany. The first, the Z1, was built between 1936 and 1938 in the parlor of his parents' home.

              Zuse's third machine, the Z3, built in 1941, was probably the first operational, general-purpose, programmable digital computer. Without knowledge of any calculating machine inventors since Leibniz, Zuse reinvented Babbage's concept of programming and decided on his own to employ binary representation for numbers. The Z3 was destroyed by an Allied bombing raid. The Z1 and Z2 met the same fate, and the Z4 survived only because Zuse hauled it in a wagon up into the mountains. Zuse's accomplishments are all the more incredible given the context of the material and manpower shortages in Germany during World War II. Zuse couldn't even obtain paper tape, so he had to make his own by punching holes in discarded movie film. Because these machines were unknown outside Germany, they did not influence the path of computing in America. But their architecture is identical to that still in use today: an arithmetic unit to do the calculations, a memory for storing numbers, a control system to supervise operations, and input and output devices to connect to the external world. Zuse also invented what might be the first high-level computer language, "Plankalkul", though it too was unknown outside Germany.

             ENIAC stood for Electronic Numerical Integrator and Calculator. ENIAC was built at the University of Pennsylvania between 1943 and 1945 by two professors, John Mauchly and the 24-year-old J. Presper Eckert, who got funding from the war department after promising they could build a machine that would replace all the "computers", meaning the women who were employed calculating the firing tables for the army's artillery guns. The day that Mauchly and Eckert saw the first small piece of ENIAC work, the people they ran to bring to their lab to show off their progress were some of these female computers.

            ENIAC filled a 20-by-40-foot room, weighed 30 tons, and used more than 18,000 vacuum tubes. Like the Mark I, ENIAC employed paper card readers obtained from IBM. When operating, the ENIAC was silent, but you knew it was on: the 18,000 vacuum tubes each generated waste heat like a light bulb, and all this heat meant that the computer could only be operated in a specially designed room with its own heavy-duty air conditioning system.

            Eckert and Mauchly next teamed up with the mathematician John von Neumann to design EDVAC, which pioneered the stored-program concept.

         By the end of the 1950's computers were no longer one-of-a-kind hand-built devices owned only by universities and government research labs. Eckert and Mauchly left the University of Pennsylvania over a dispute about who owned the patents for their invention. They decided to set up their own company. Their first product was the famous UNIVAC computer, the first commercial computer. UNIVAC was also the first computer to employ magnetic tape.


Use of transistors for internal operations: tiny solid-state transistors replaced vacuum tubes in computers. The heat problem was then minimized and computers could be made smaller and faster.

Magnetic core as primary internal-storage medium: Electric currents pass through wires which magnetize the cores to represent on and off states. Data in the cores can be found and retrieved for processing in a few millionths of a second.

Increased main-storage capacity: The internal or main storage was supplemented by the use of magnetic tapes for external storage. These tapes substituted for punched cards or paper tape. Magnetic disks were also developed that stored information on circular tracks that looked like phonograph records. The disks provided direct or random access to records in a file.

Faster input/output; tape orientation: Devices could be connected directly to the computer and were considered "on-line". This allowed for faster printing and for the detection and correction of errors.

High-level programming languages (COBOL, FORTRAN): These languages resembled English. FORTRAN (FORmula TRANslator) was the first high-level language to be accepted widely. It was used mostly for scientific applications. COBOL (Common Business-Oriented Language) was developed in 1961 for business data processing. Its main features include file-processing, editing, and input/output capabilities.

Increased speed and reliability: Modular hardware was developed through the design of electronic circuits. Complete modules called "breadboards" could be replaced if malfunctions occurred or the machine "crashed". This decreased lost time, and new modules could be added for extra features such as file-processing, editing, and input/output.

Batch-oriented applications (billing, payroll processing, updating inventory files): Batch processing allowed data to be collected over a period of time and then processed in one computer run. The results were then stored on magnetic tapes.

Use of integrated circuits: The use of integrated circuits (ICs) replaced the transistors of the second-generation machines. The circuits are etched and printed, and hundreds of electronic components could be put on silicon circuit chips less than one-eighth of an inch square.




Smaller size and better performance and reliability: Advances in solid-state technology allowed for the design and building of smaller and faster computers. Breadboards could easily be replaced on the fly.
Extensive use of high-level programming languages: The software industry evolved during this time. Many users found that it was more cost-effective to buy pre-programmed packages than to write the programs themselves. Programs from the second generation had to be rewritten, however, since many of them were tied to second-generation architecture.


Emergence of minicomputers: Minicomputers offered many of the same features as the mainframe computers, only on a smaller scale. These machines filled the needs of the small business owner.


Remote processing and time-sharing through communication: Computers were now able to perform several operations at the same time. Remote terminals were developed to communicate with a central computer from distant geographic locations. Time-sharing environments were established.
Availability of operating systems (software) to control I/O and do tasks handled by human operators: Software was developed to take care of the routine tasks required of the computer, freeing up the human operator.


Applications such as airline reservation systems, market forecasting, and credit card billing: The applications also included inventory control and the scheduling of labor and materials. Multitasking was also accomplished. Both scientific and business applications could be run on the same machine.


  1. Control Data Corporation: the STAR 100 computer, which has a vector-based design. Information is processed as vectors instead of individual numbers, which allows for faster speeds when problems are processed in vector form (see the short sketch after this list). Charles A. Burrus develops the light-emitting diode (LED). RCA develops MOS (metal-oxide semiconductor) technology for the making of integrated circuits, making them cheaper and faster to produce. The circuits can also be made smaller.
  2. Texas Instruments introduces the first pocket calculator, the Pocketronic. It can add, subtract, multiply, and divide. It costs around $150.
  3. Magnavox develops the Odyssey, the first video game console. Intel develops the first 8-bit microprocessor chip, the 8008 (used in the Mark-8 personal mini-computer). Nolan Bushnell invents the video game Pong and founds Atari.
  4. Using LSI (large scale integration) ten thousand components are placed on a chip of 1 square inch.
  5. Hewlett Packard introduces the programmable pocket calculator. David Ahl develops a microcomputer consisting of a video display, keyboard, and central processing unit. DRAM (dynamic random-access memory) becomes commercially available and will be used in the first personal computers.
  6. Edward Roberts introduces the first personal computer, called the Altair 8800, in kit form. It has 256 bytes of memory.
  7. A computer chip with 16 kilobits (16,384 bits) of memory becomes commercially available. It will be used in the first IBM personal computer.
  8. Steve P. Jobs and Stephen Wozniak introduce the Apple II, the first personal computer sold in assembled form. Xerox introduces the Star 8010, an office computer based on the Alto developed a few years earlier. The first linked automatic teller machines (ATMs) are introduced in Denver.
  9. DEC introduces a 32-bit computer with a virtual address extension (VAX). It runs large programs and becomes an industry standard for scientific and technical systems. Its operating system is called VMS. Intel introduces its first 16-bit processor, the 8086. The related 8088 is used as the central processing unit in IBM's first PC.
  10. Control Data introduces the Cyber 203 supercomputer. Motorola introduces the 68000 microprocessor chip. It has a 24-bit address bus for reading memory and can address 16 megabytes of memory. It will be the basis for the Macintosh computer developed by Apple. Steven Hofstein invents the field-effect transistor using metal-oxide technology (MOSFET). In 1981 the IBM Personal Computer uses the industry-standard disk operating system (DOS).
  11. IBM introduces the 5120 microcomputer. It is not successful.
  12. Osborne builds the first portable computer, in which disk drives, monitor, and processor are mounted in a single box. It is the size of a suitcase. Clive Sinclair develops the ZX81, which connects to a television receiver. The Japanese produce 64-kilobit memory chips (65,536 bits) which capture the world market.
  13. Columbia Data Products announces the first computer based on the IBM PC that runs programs designed for the IBM machine; such machines get the name "clones". Compaq introduces its first IBM-PC clone that is portable. Japan starts a nationally funded project to develop a fifth-generation computer based on artificial intelligence using the Prolog language.
  14. IBM’s PC-XT is introduced. It is the first personal computer with a hard drive built into the computer. It can store 10 megabytes of information even when the machine is turned off, replacing many floppy diskettes. The machine uses the updated DOS 2.0. IBM introduces the PC-JR, a scaled-down version of the IBM-PC. It is unsuccessful. Inmos (a British company) develops the transputer, in which several processors are contained in one computer and work simultaneously on the same problem. Intel introduces the 8080, an 8-bit microprocessor that replaces the 8008.
  15. Philips and Sony introduce the CD-ROM (compact disk read-only memory), an optical disk that can store large amounts of information. Apple introduces the Macintosh, a graphics-based computer that uses icons, a mouse, and an intuitive interface derived from the Lisa computer. IBM's PC AT (advanced technology) computer, designed around the 16-bit Intel 80286 processor chip and running at 6 MHz, becomes the first personal computer to use a new chip to expand speed and memory. Motorola introduces the 68020 version of the 68000 series of microprocessors. It has 32-bit processing and reading capacity. NEC manufactures computer chips in Japan with 256 kilobits (262,144 bits) of computer memory. IBM introduces a megabit RAM (random access memory) chip with four times the memory of earlier chips.
  16. Microsoft develops Windows for the IBM-PC. Intel introduces the 80386, a 32-bit microprocessor. Masaki Togai and Hiroyuki Watanabe develop a logic chip that operates on fuzzy logic at Bell Labs.
  17. Compaq leaps past IBM by introducing the DeskPro, a computer that uses an advanced 32-bit microprocessor, the Intel 80386. It can run software faster than the quickest 16-bit computer. Terry Sejnowski at Johns Hopkins in Baltimore develops a neural network computer that can read text out loud without knowing any pronunciation rules. The first DAT (digital audio tape) recorders are developed in Japan.
  18. The Macintosh II and Macintosh SE made by Apple become the most powerful personal computers. Sega Electronics introduces a video game whose images appear three-dimensional. Telephones become available on commercial airplanes. Computer chips are manufactured with 1 megabit (1,048,576 bits) of computer memory. Japan also introduces experimental 4- and 16-megabit chips.
  19. Motorola introduces its 32-bit 88000 series of RISC (reduced instruction set computing) microprocessors. They can operate much faster than conventional chips. Compaq and Tandy develop the EISA (Extended Industry Standard Architecture). Steven Jobs introduces the NeXT Computer System. It is a graphical-based system that includes a 256-megabyte optical storage disk and 8 megabytes of RAM. Robert Morris releases a self-replicating program, planted in the Internet, that causes the whole system to go down for two days. Scriptel introduces a method for inputting data into a computer by writing on a screen.
  20. Japan initiates daily broadcasts of its analog version of high-definition television (HDTV). Philips and Sony bring the videodisk to the open market. Seymour Cray founds the Cray Computer Corporation.
  21. Bell Laboratories' Alan Huang demonstrates the first all-optical processor. Hewlett Packard announces a computer with a RISC processor. IBM later introduces the RS/6000 family of RISC workstations. Computer chips with 4 megabits of memory are introduced. Intel introduces the i486 processor chip, which can operate at 33 MHz. Intel also launches the iPSC/860 microprocessor, which is designed for multiprocessor computers. Motorola introduces the 68040 version of its 68000 series of microprocessors. The chip has 1.2 million transistors. IBM develops a transistor that can operate at 75 billion cycles per second.
  22. The 64-megabit dynamic random-access memory (DRAM) chip is invented.
  23. IBM develops the silicon-on-insulator (SOI) bipolar transistor. It can operate at 20 GHz.
  24. Harry Jordan and Vincent Heuring develop the first general-purpose, all-optical computer capable of being programmed and manipulating instructions internally. Intel ships the Pentium processor to computer manufacturers. It is the fifth generation of the chip that powers the PC. The chip contains 3.1 million transistors and is twice as fast as the fourth-generation 486DX2. Fujitsu in Japan announces a 256-megabit memory chip.
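To illustrate the vector idea from item 1 of this list: a conventional (scalar) machine issues one instruction per pair of numbers, while a vector machine such as the STAR 100 applies a single operation to whole arrays at once. Here is a rough sketch in C of the scalar form; a vector machine would perform the element-wise addition below as one hardware operation rather than N separate ones.

    /* Adding two arrays: what a scalar machine does one element at a time. */
    #include <stdio.h>

    #define N 8

    int main(void)
    {
        double a[N] = {1, 2, 3, 4, 5, 6, 7, 8};
        double b[N] = {8, 7, 6, 5, 4, 3, 2, 1};
        double c[N];

        /* A vector processor would treat a, b, and c as whole vectors
           and perform all N additions with a single instruction. */
        for (int i = 0; i < N; i++)
            c[i] = a[i] + b[i];

        for (int i = 0; i < N; i++)
            printf("%g ", c[i]);   /* prints 9 eight times */
        printf("\n");
        return 0;
    }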
1994 to the present. The world is changing rapidly and so i