Computing History





1842: Babbage’s Difference Engine and the Analytical Engine. Convinced his machine would benefit England, Babbage applied for and received one of the first government grants to build the Difference Engine. Hampered by nineteenth-century machine technology, cost overruns, and the possibility that his chief engineer was padding the bills, Babbage completed only a portion of the Difference Engine before the government withdrew its support in 1842, deeming the project “worthless to science”. Meanwhile, Babbage had conceived the idea of a more advanced “analytical engine”. In essence, this was a general-purpose computer that could add, subtract, multiply, and divide in automatic sequence at a rate of 60 additions per second. His 1833 design, which called for thousands of gears and drives, would cover the area of a football field and be powered by a locomotive engine. Babbage worked on this project until his death. In 1991, London’s Science Museum spent $600,000 to build a working model of the Difference Engine, using Babbage’s original plans. The result stands 6 feet high, is 10 feet long, contains 4,000 parts, and weighs 3 tons.

The Honeywell 400 and the Second Generation of Computers. The invention of the transistor signaled the start of the second generation of computers (1959-1964). Transistorized computers were more powerful, more reliable, less expensive, and cooler to operate than their vacuum-tubed predecessors. Honeywell established itself as a major player in the second generation of computers. Burroughs, Univac, NCR, CDC, and Honeywell, IBM’s biggest competitors during the 1960s and early 1970s, became known as the BUNCH.


The IBM System 360 and the Third Generation of Computers. The third generation was characterized by computers built around integrated circuits. Of these, some historians consider IBM’s System 360 line of computers, introduced in 1964, the single most important innovation in the history of computers. System 360 was conceived as a family of computers with upward compatibility; when a company outgrew one model, it could move up to the next model without worrying about converting its data. System 360 and the other lines built around integrated circuits made all previous computers obsolete, but the advantages were so great that most users wrote off the costs of conversion as the price of progress.

In the early 1960s, Dr. Thomas Kurtz and Dr. John Kemeny of Dartmouth College began developing a programming language that a beginner could learn and use quickly. Their work culminated in 1964 with BASIC. Over the years, BASIC gained widespread popularity and evolved from a teaching language into a versatile and powerful language for both business and scientific applications. From micros to mainframes, BASIC is supported on more computers than any other language.

Although most computer vendors would classify their computers as fourth generation, most people pinpoint 1971 as the generation’s beginning. That was the year large-scale integration of circuitry (more circuits per unit of space) was introduced. The base technology, though, is still the integrated circuit. This is not to say that two decades have passed without significant innovations. In truth, the computer industry has experienced a mind-boggling succession of advances in further miniaturization of circuitry, data communications, and the design of computer hardware and software.


In 1968, seventh grader Bill Gates and ninth grader Paul Allen were teaching the computer to play Monopoly and commanding it to play millions of games to discover gaming strategies. Seven years later, in 1975, they were to set a course that would revolutionize the computer industry. While at Harvard, Gates and Allen developed a BASIC programming language for the first commercially available microcomputer, the MITS Altair. After successful completion of the project, the two formed Microsoft Corporation, now the largest and most influential software company in the world. Microsoft was given an enormous boost when its operating system software, MS-DOS, was selected for use by the IBM PC. Gates, now the richest man in the world, provides the company’s vision on new product ideas and technologies.

Not until 1975 and the introduction of the Altair 8800 personal computer was computing made available to individuals and very small companies. This event has forever changed how society perceives computers. One prominent entrepreneurial venture during the early years of personal computers was the Apple II computer. Two young computer enthusiasts, Steven Jobs and Steve Wozniak (then 21 and 26 years of age, respectively), collaborated to create and build their Apple II computer on a makeshift production line in Jobs’s garage. Seven years later, Apple Computer earned a spot on the Fortune 500, a list of the 500 largest corporations in the United States.

In 1981, IBM tossed its hat into the personal computer ring with its announcement of the IBM Personal Computer, or IBM PC. By the end of 1982, 835,000 had been sold. When software vendors began to orient their products to the IBM PC, many companies began offering IBM-PC compatibles, or clones. Today the IBM PC and its clones have become a powerful standard for the microcomputer industry.

Mitchell Kapor is one of the major forces behind the microcomputer boom of the 1980s. In 1982, Kapor founded Lotus Development Company, now one of the largest applications software companies in the world. Kapor and the company introduced an electronic spreadsheet product that gave IBM’s recently introduced IBM PC (1981) credibility in the business marketplace. Sales of the IBM PC and the electronic spreadsheet, Lotus 1-2-3, soared.


Microsoft introduced Windows, a GUI for IBM PC-compatible computers, in 1985; however, Windows did not enjoy widespread acceptance until 1990, with the release of Windows 3.0. Windows 3.0 gave a huge boost to the software industry because larger, more complex programs could now be run on IBM PC compatibles. Subsequent releases, including Windows 95, Windows NT, and Windows 98, made personal computers even easier to use, fueling the PC explosion of the 1990s.

Biography of Tim Berners-Lee

Tim Berners-Lee graduated from the Queen’s College at Oxford University, England, in 1976. Whilst there he built his first computer with a soldering iron, TTL gates, an M6800 processor, and an old television.

He spent two years with Plessey Telecommunications Ltd (Poole, Dorset, UK), a major UK telecom equipment manufacturer, working on distributed transaction systems, message relays, and bar code technology.

In 1978, Tim left Plessey to join D.G. Nash Ltd (Ferndown, Dorset, UK), where he wrote, among other things, typesetting software for intelligent printers and a multitasking operating system.

A year and a half spent as an independent consultant included a six-month stint (June-December 1980) as consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. Whilst there, he wrote, for his own private use, his first program for storing information using random associations. Named “Enquire”, and never published, this program formed the conceptual basis for the future development of the World Wide Web.

From 1981 until 1984, Tim worked at John Poole’s Image Computer Systems Ltd, with technical design responsibility. Work here included real time control firmware, graphics and communications software, and a generic macro language. In 1984, he took up a fellowship at CERN, to work on distributed real-time systems for scientific data acquisition and system control. Among other things, he worked on FASTBUS system software and designed a heterogeneous remote procedure call system.

In 1989, he proposed a global hypertext project, to be known as the World Wide Web. Based on the earlier “Enquire” work, it was designed to allow people to work together by combining their knowledge in a web of hypertext documents. He wrote the first World Wide Web server, “httpd”, and the first client, “WorldWideWeb”, a what-you-see-is-what-you-get hypertext browser/editor that ran in the NeXTStep environment. This work was started in October 1990, and the program “WorldWideWeb” was first made available within CERN in December, and on the Internet at large in the summer of 1991.

From 1991 through 1993, Tim continued working on the design of the Web, coordinating feedback from users across the Internet. His initial specifications of URIs, HTTP, and HTML were refined and discussed in larger circles as the Web technology spread.
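Those early specifications still govern how Web addresses are taken apart today. As a small illustration (not part of the original essay), the sketch below splits a URI into the scheme, host, and path components Berners-Lee’s specifications defined, using Python’s standard library; the example address is the CERN page that hosted the first website.

```python
# Split a URI into the components defined by Berners-Lee's early
# specifications: the scheme (how to fetch the document), the network
# location (which host serves it), and the path (where it lives there).
from urllib.parse import urlsplit

uri = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlsplit(uri)

print(parts.scheme)   # protocol: "http"
print(parts.netloc)   # host: "info.cern.ch"
print(parts.path)     # document path: "/hypertext/WWW/TheProject.html"
```

The same three-part structure (scheme, authority, path) was later formalized in the generic URI syntax and is what every browser still parses before it can issue an HTTP request.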

In 1994, Tim joined the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology (MIT) as Director of the W3 Consortium, which coordinates W3 development worldwide, with teams at MIT, at INRIA in France, and at Keio University in Japan. The Consortium takes as its goal to lead the Web to its full potential, ensuring its stability through rapid evolution and revolutionary transformations of its usage.

Biography of Marc Andreessen

Marc Andreessen did not invent the Web browser. Nor did he create the first browser with a graphical user interface. Andreessen’s browser wasn’t even the first to use pictures. Nevertheless, the Web wouldn’t be where it is today without him.

So what exactly did Andreessen do? He made the Web accessible to the masses. In 1993, as a 22-year-old undergraduate working for the National Center for Supercomputing Applications (NCSA), Andreessen cowrote Mosaic with Eric Bina. As the first Web browser to combine pictures and text in the same window, Mosaic made Web pages look a lot more like printed material. Suddenly, this formerly obscure part of the Internet could appeal to a wide audience.

Andreessen and Bina also made Mosaic much easier to install and use than other Web browsers, again appealing to a nontechnical audience. And Andreessen made himself available for 24-hour technical support, learning how to improve the product and ensuring user loyalty. Finally, he marketed the Mosaic name heavily, training users to associate it with the Web. His efforts paid off: a million copies of Mosaic were downloaded in its first year, and another million in the next six months.

In early 1994, Andreessen joined up with Jim Clark, the founder of Silicon Graphics, to form the company that would become Netscape Communications. Andreessen brought most of his NCSA colleagues with him, and within months they released Mosaic NetScape (later renamed Netscape Navigator), a faster, slicker, more secure browser than the original Mosaic. Navigator was a runaway success, quickly snapping up more than 70 percent of the browser market share. Time magazine even named it one of the ten best products of 1994, right up there with the Chrysler Neon, the Wonderbra, and Fruitopia. Okay, so maybe that wasn’t such an honor.

But the mythos surrounding the company was almost more important to the evolution of the Web than the browser itself. Along with Yahoo and other Web startups, Netscape embodied Web culture–young, hip, smart, irreverent–and the baby-faced, often barefooted Andreessen served as its poster child. The company generated so much buzz that its summer 1995 initial public offering stock, which had been valued at $28 a share, instead opened at $71–unbelievable for a company that had never turned a profit and gave most of its software away. Once again, Netscape was a leader.

An Outline of Computer History


O 1842 Babbage’s Difference Engine and the Analytical Engine

O 1959 The Honeywell 400 and the second generation of computers

O 1964 The IBM System 360 and the third generation of computers

O 1964 The BASIC language

O 1971 Integrated circuits and the fourth generation of computers

O 1975 Microsoft and Bill Gates

O 1977 The Apple II

O 1981 The IBM PC

O 1982 Mitchell Kapor designs Lotus 1-2-3

O 1985 to present: Microsoft Windows


Tim Berners-Lee

Marc Andreessen


