Introduction to Operating Systems: History
The 1980s
The 1980s was the decade of the personal computer and the workstation.49 Microprocessor technology evolved to the point where high-end desktop computers called workstations could be built that were as powerful as the mainframes of a decade earlier. The IBM Personal Computer, released in 1981, and the Apple Macintosh personal computer, released in 1984, made it possible for individuals and small businesses to have their own dedicated computers. Communication facilities could be used to transmit data quickly and economically between systems. Rather than bringing data to a central, large-scale computer installation for processing, computing was distributed to the sites at which it was needed. Software such as spreadsheet programs, word processors, database packages and graphics packages helped drive the personal computing revolution by creating demand from businesses that could use these products to increase their productivity.
Personal computers proved to be relatively easy to learn and use, partially because of graphical user interfaces (GUIs) that used graphical symbols such as windows, icons and menus to facilitate user interaction with programs. Xerox's Palo Alto Research Center (PARC) pioneered the GUI (for more on the origins of the mouse, see the Biographical Note, Doug Engelbart); Apple's release of the Macintosh personal computer in 1984 popularized their use. In Macintosh computers, the GUI was embedded in the operating system so that all applications would have a similar look and feel.50 Once familiar with the Macintosh GUI, the user could learn to use new applications faster.
As technology costs declined, transferring information between computers in computer networks became more economical and practical. Electronic mail, file transfer and remote database access applications proliferated. Distributed computing (i.e., using multiple independent computers to perform a common task) became widespread under the client/server model. Clients are user computers that request various services; servers are computers that perform the requested services. Servers often are dedicated to one type of task, such as rendering graphics, managing databases or serving Web pages.
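The client/server interaction described above can be sketched in a few lines of Python. This is a minimal illustration of the model, not any particular historical system; the echo-style "service" (uppercasing text), the loopback address and the message are illustrative assumptions.

```python
import socket
import threading

def run_server(server_sock):
    """Serve one client: receive a request and perform a trivial 'service'
    (uppercasing the text) before replying."""
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024)       # wait for the client's request
        conn.sendall(request.upper())   # perform the service and reply

# The server listens on an OS-assigned port on the local machine.
server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))
server_sock.listen(1)
port = server_sock.getsockname()[1]
threading.Thread(target=run_server, args=(server_sock,), daemon=True).start()

# The client connects to the server and requests the service.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"render this page")
    reply = client.recv(1024)

print(reply.decode())  # RENDER THIS PAGE
```

Note that the roles are defined by behavior, not by hardware: the same machine runs both programs here, with the client requesting and the server responding.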
The software engineering field continued to evolve, a major thrust coming from the United States government aimed at providing tighter control of Department of Defense software projects.51 Some goals of the initiative included realizing code reusability and the early construction of prototypes so developers and users could suggest modifications early in the software design process.52
Self Review
1. What aspect of personal computers, popularized by the Apple Macintosh, made them especially easy to learn and use?
2. (T/F) A server cannot be a client.
Ans: 1) Graphical user interfaces (GUIs) facilitated personal computer use by providing an easy-to-use, uniform interface to every application. This enabled users to learn new applications faster. 2) False. A computer can act as both a client and a server. For example, when users request a Web page, a Web server acts as a server; if that server then requests information from a database system, it becomes a client of the database system.
History of the Internet and World Wide Web
In the late 1960s ARPA (the Advanced Research Projects Agency of the Department of Defense) rolled out the blueprints for networking the main computer systems of about a dozen ARPA-funded universities and research institutions. They were to be connected with communications lines operating at a then-stunning 56 kilobits per second (Kbps), where 1 Kbps equals 1,000 bits per second, at a time when most people (of the few who could connect at all) were reaching computers over telephone lines at 110 bits per second. HMD vividly recalls the excitement at that conference. Researchers at Harvard talked about communicating with the Univac 1108 “supercomputer” across the country at the University of Utah to handle the massive computations related to their computer graphics research. Academic research was about to take a giant leap forward. Shortly after this conference, ARPA proceeded to implement what quickly became called the ARPAnet—the grandparent of today’s Internet.

Biographical Note

Doug Engelbart

Doug Engelbart invented the computer mouse and was one of the primary designers of the original graphical displays and windows.

Engelbart’s background was in electronics. During World War II he worked as an electronics technician on a variety of systems, including radar and sonar.53 After leaving the military, he went back to Oregon State to complete a degree in Electrical Engineering in 1948.54 He went on to receive his Ph.D. from the University of California at Berkeley, then took a job at the Stanford Research Institute (SRI), where he gained his first experience with computers.55 In 1968, at the Joint Computer Conference in San Francisco, Engelbart and his coworkers demonstrated their computer system, NLS (oNLine System), which featured Engelbart’s computer mouse and a graphical interface with windows.56 This original mouse, called an X-Y Position Indicator for a Display System, had only one button.57 The mouse had two wheels on the bottom, one horizontal and one vertical, to detect movement.58 The mouse and the graphical windows were interdependent: the mouse made it significantly easier to switch between windows, and without windows the mouse was not as useful.

Engelbart has dedicated his life to augmenting human intellect. His original idea behind the NLS system was to create a system that could help people solve problems faster and enhance intelligence. Engelbart founded the Bootstrap Institute to foster worldwide awareness of his mission. Bootstrapping, according to Engelbart, is the idea of improving one’s methods of improvement; he believes this is the best way to improve human intelligence.59

Today, Engelbart is still working with the Bootstrap Institute. He has received recognition for his work, including the Lemelson-MIT Prize, the National Medal of Technology and induction into the National Inventors Hall of Fame.60
Although the ARPAnet did enable researchers to network their computers, its chief benefit proved to be its capability for quick and easy communication via what came to be known as electronic mail (e-mail). This remains true on the Internet today, with e-mail, instant messaging and file transfer facilitating communications among hundreds of millions of people worldwide, a number that continues to grow rapidly.
The ARPAnet was designed to operate without centralized control. This meant that if a portion of the network should fail, the remaining working portions would still be able to route data packets from senders to receivers over alternative paths.
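The resilience described above, finding an alternative path when part of the network fails, can be illustrated with a toy routing sketch. The network, node names and use of breadth-first search here are illustrative assumptions for teaching purposes, not the ARPAnet's actual routing algorithm.

```python
from collections import deque

def find_route(links, src, dst, down=frozenset()):
    """Breadth-first search for any working path from src to dst,
    skipping nodes that are currently down."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links.get(node, []):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# A toy network with redundant paths between A and D.
links = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}

print(find_route(links, "A", "D"))              # ['A', 'B', 'D']
print(find_route(links, "A", "D", down={"B"}))  # ['A', 'C', 'D']
```

When node B fails, the search simply routes around it through C, which is the essence of a network with no centralized point of failure.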
The protocols (i.e., sets of rules) for communicating over the ARPAnet became known as the Transmission Control Protocol/Internet Protocol (TCP/IP). TCP/IP was used to manage communication between applications. The protocols ensured that messages were routed properly from sender to receiver and that those messages arrived intact. The advent of TCP/IP promoted worldwide computing growth. Initially, Internet use was limited to universities and research institutions; later, the military adopted the technology.
Eventually, the government decided to allow access to the Internet for commercial purposes. This decision led to some concern among the research and military communities—it was felt that response times would suffer as “the Net” became saturated with users. In fact, the opposite occurred. Businesses rapidly realized that they could use the Internet to tune their operations and to offer new and better ser- vices to their clients. Companies spent vast amounts of money to develop and enhance their Internet presence. This generated intense competition among communications carriers, hardware suppliers and software suppliers to meet the increased infrastructure demand. The result is that bandwidth (i.e., the information- carrying capacity of communications lines) on the Internet has increased tremendously, and hardware and communications costs have plummeted.
The World Wide Web (WWW) allows computer users to locate and view multimedia-based documents (i.e., documents with text, graphics, animation, audio or video) on almost any subject. Although the Internet was developed more than three decades ago, the introduction of the World Wide Web was a relatively recent event. In 1989, Tim Berners-Lee of CERN (the European Center for Nuclear Research) began to develop a technology for sharing information via hyperlinked text documents (see the Biographical Note, Tim Berners-Lee). To implement this new technology, Berners-Lee created the HyperText Markup Language (HTML). Berners-Lee also implemented the Hypertext Transfer Protocol (HTTP) to form the communications backbone of his new hypertext information system, which he called the World Wide Web.
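The request/response exchange that HTTP defines can be sketched with Python's standard library. The page content, handler and loopback host here are hypothetical, and this modern API merely illustrates the shape of the protocol, not Berners-Lee's original implementation.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# A hypothetical HTML document the server will deliver.
PAGE = b"<html><body><h1>Hello, Web</h1></body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # An HTTP response: status line, headers, blank line, then the body.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # suppress per-request logging

# Serve on an OS-assigned local port in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The browser's role: send a GET request and read the HTML reply.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/index.html")
response = conn.getresponse()
body = response.read()
print(response.status, body.decode())
server.shutdown()
```

The division of labor mirrors the Web's design: HTTP carries the conversation between client and server, while HTML describes the document being delivered.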
Surely, historians will list the Internet and the World Wide Web among the most important and profound creations of humankind. In the past, most computer applications ran on “stand-alone” computers (computers that were not connected to one another). Today’s applications can be written to communicate among the world’s hundreds of millions of computers. The Internet and World Wide Web merge computing and communications technologies, expediting and simplifying our work. They make information instantly and conveniently accessible to large numbers of people. They enable individuals and small businesses to achieve worldwide exposure. They are changing the way we do business and conduct our personal lives. And they are changing the way we think of building operating systems. Today’s operating systems provide GUIs that enable users to “access the world” over the Internet and the Web as seamlessly as accessing the local system. The operating systems of the 1980s were concerned primarily with managing resources on the local computer. Today’s distributed operating systems may utilize resources on computers worldwide. This creates many interesting challenges that we discuss throughout the book, especially in Chapters 16–19, which examine networking, distributed computing and security.
Biographical Note
Tim Berners-Lee
The World Wide Web was invented by Tim Berners-Lee in 1990. The Web allows computer users to locate and view multimedia-based documents (i.e., documents with text, graphics, animation, audio or video) on almost any subject.
Berners-Lee graduated from Queen’s College at Oxford University with a degree in Physics in 1976. In 1980 he wrote a program called Enquire, which used hypertext links to help him quickly navigate the numerous documents in a large project. He entered into a fellowship at the European Center for Nuclear Research (CERN) in 1984, where he gained experience in communication software for real-time networked systems.61, 62, 63
Berners-Lee invented HTTP (the Hyper Text Transfer Protocol), HTML (Hypertext Markup Language) and the first World Wide Web server and browser in 1989, while working at CERN.64, 65 He intended the Web to be a mechanism for open, available access to all shared knowledge and experience.66
Until 1993, Berners-Lee individually managed changes and suggestions for HTTP and HTML, sent from the early Web users. By 1994 the Web community had grown large enough that he started the World Wide Web Consortium (W3C; www.w3.org) to monitor and establish Web technology standards.67 As director of the organization, he actively promotes the principle of freely avail- able information accessed by open technologies.68
Self Review
1. How did the ARPAnet differ from traditional computer networks? What was its primary benefit?
2. What creations did Berners-Lee develop to facilitate data sharing over the Internet?
Ans: 1) The ARPAnet was decentralized, so the network continued to be able to pass information even if portions of the network failed. The primary benefit of the ARPAnet was its capability for quick and easy communication via e-mail. 2) Berners-Lee developed the HyperText Markup Language (HTML) and the Hypertext Transfer Protocol (HTTP), making possible the World Wide Web.