A Brief History of Computer Science: 1950-2000

What have been the major innovations in computer science in America? We explore the field from 1950 to 2000.

The story of computer science from 1950 to 2000 is, in fact, the story of the launching and evolution of the field itself. In 1950, British mathematician and codebreaker Alan Turing wrote the seminal paper on Artificial Intelligence (though the term wouldn’t be coined for a few more years). With “Computing Machinery and Intelligence,” published in the journal Mind, Turing first introduced in writing his idea of the Imitation Game for computers and people, now known eponymously as the Turing Test.

John Mauchly, J. Presper Eckert, and others had built ENIAC, the world’s first general-purpose electronic computer, in the 1940s, while John von Neumann’s stored-program design shaped its successors. However, the field of “computer science” did not yet exist in the way that we think of it. Computer programming techniques and algorithms exploded in the 1950s, and with Turing’s landmark article at the beginning of the new decade, computer science quickly became synonymous with the ambition of programming machines to do intelligent things.


Computer science (and soon Artificial Intelligence) grew rapidly as a field distinct from electrical engineering or pure mathematics, focused instead on building, studying, and programming computers. Computer scientists had a new tool and sought to enhance it with more powerful hardware and software in the decades that followed. Perhaps more than any other field in recent history, Computer Science would soon transform the modern world.

Computers in the 1950s were massive, refrigerator-sized (or larger) devices that ran programs fed in on punched cards, relying at first on vacuum tubes or (even slower) electromechanical switches to perform their operations.

They were slow, noisy, and required massive amounts of manual preparation and work to maintain and use. New hardware was needed. Fortunately, a group of physicists led by the mercurial and temperamental William Shockley developed the transistor, a semiconductor switch (made first of germanium, then silicon, which is still used today) that replaced the vacuum tube. Transistors were later combined onto single chips of silicon as integrated circuits, or microchips, which drastically reduced the size of computers while increasing their memory and processing power.

By the 1960s, computers built with these new components were running early full-featured languages like FORTRAN and COBOL, the latter shaped in large part by Grace Hopper, who had programmed the Harvard Mark I during World War II and later worked on the UNIVAC. Computers went into the noses of rockets in the Cold War, transformed business and government by facilitating rapid calculation of large numerical datasets (like census counts, taxes, and sales projections), and made possible the navigation of a manned vehicle to the moon.

By the 1970s, “Silicon Valley” in Northern California was replete with big chipmakers like Intel as well as a growing number of startups, while entrepreneurs like Bill Gates saw the future of computer science not just as the use of computing machinery by large companies and government but by individuals, using what came to be called Personal Computers, or PCs.

Also by the 1970s, computer science departments were ubiquitous in colleges and universities, not just in the United States but around the world. Data structures and algorithms (the study of how information is stored and manipulated by digital computers) remained the core curriculum, but the tidal wave of innovation stemming from the rise of computers in the mid to late 20th century meant that new classes appeared seemingly overnight. Computer security became an issue, as networks linked computers together with analog and digital lines, making remote break-ins and hacks a possibility (and then a real threat).

The Internet’s precursor, the ARPANET, came online in the late 1960s, linking a handful of university and research sites (including the University of Utah), and expanded through the 1970s to include more “hubs,” or centers through which traffic could pass. Digital networking thus became a core part of computer science. Artificial Intelligence also continued its exploration of intelligent computing, specifically in the context of game programming: first checkers (in the 1950s), then chess (finally dominated by computers in the 1990s). In truth, AI research in computer science ran into many problems, but the study of computing remained a juggernaut that kept transforming how we live, work, and interact.

If the 1980s was the era of the PC, with Microsoft and Apple competing to power personal computers with proprietary operating systems and software, the 1990s was the era of the World Wide Web. British computer scientist Tim Berners-Lee, working at CERN, invented a markup language called “HyperText Markup Language,” or HTML, and built the world’s first Web browser; the Mosaic browser, developed a few years later at the University of Illinois, brought the Web to a mass audience. The Web began simply enough with Berners-Lee’s innovations, but by 1995 it had become a hotbed for commercial speculation, with Netscape and soon Yahoo! launching commercial websites intended to capitalize on the promise of new innovations in computer science. The rest, as they say, is history.

By the turn of the century, computer science was among the most powerful and popular academic subjects around the world, and students flocked to computer science departments to learn the theories and tools of a new age. At the same time, the explosion of web pages on the World Wide Web produced massive datasets, which meant that computer scientists, entrepreneurs, and indeed everyone would need strategies for storing, analyzing, and securing truly gargantuan quantities of digital information. This led the way to yet more innovations (and challenges) in the decades after 2000.

***

Find out which influencers have most contributed to advancing the field of computer science over the last two decades with a look at The Most Influential People in Computer Science, for the years 2000 – 2020.

And to find out which schools are driving the computer science field forward today, check out The Most Influential Schools in Computer Science, for the years 2000 – 2020.

Get more study tips, learning tools, and study starters with a look at our Complete Library of Study Guides.

Or jump to our student resource library for tips on everything from studying to starting on your career path.
