A Brief History of Computer Science: 2000-2020


What have been the major innovations in the field of computer science in the United States? We explore the discipline's development from 2000 to 2020.

The twenty-year period beginning at the turn of the century ushered enormous changes into the now ubiquitous field of Computer Science. Ironically, the year 2000 itself arrived on a wave of anxiety about our trust in computer systems. The so-called “Y2K” problem emerged when developers and scientists realized that computer systems worldwide stored years as only two digits, leaving them unable to distinguish the year 2000 from 1900.

Computer scientists couldn’t say exactly what would happen to these interconnected systems at the stroke of midnight on December 31st, 1999. Because these computer systems were, by then, driving the world economy, there was considerable alarm about what a systemwide breakdown might mean. Leading world governments raced to find a solution. The remedy, quite simply, was to reformat date storage from two-digit to four-digit years. Calamity was avoided. Computational systems in banks, businesses, educational institutions, and government all continued to behave in the expected and predictable manner. The much-feared Y2K bug never came to pass.
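To see why the two-digit format caused trouble, here is a minimal Python sketch, purely for illustration and not drawn from any real legacy system (the function names are hypothetical), showing how a hard-wired “19” prefix breaks at the year 2000:

```python
# A minimal sketch of the Y2K problem: storing only the last two
# digits of a year makes "00" indistinguishable from 1900.

def legacy_parse_year(yy: int) -> int:
    """Legacy two-digit convention: assume every year belongs to the 1900s."""
    return 1900 + yy

def remediated_parse_year(yyyy: int) -> int:
    """Post-Y2K convention: store and read the full four-digit year."""
    return yyyy

print(legacy_parse_year(99))        # 1999 -- correct
print(legacy_parse_year(0))         # 1900 -- the Y2K bug: should be 2000
print(remediated_parse_year(2000))  # 2000 -- unambiguous
```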


However, real changes in the science and practice of computation were soon to come. In Artificial Intelligence research, for instance, classic approaches built on explicitly represented symbols gave way to statistical methods as the World Wide Web went mainstream and massive datasets suddenly became available. Machine learning, a subfield of AI, exploded in university classrooms and in industry as researchers realized that Web data could feed such algorithms and put them on steroids. By 2005, many of the top university Computer Science departments were focused on training, testing, and deploying ever more advanced machine learning techniques.


“Search” (once called “Information Retrieval”) became a hot topic, and tasks such as image recognition and autonomous navigation (as with self-driving cars) also gained a foothold in the new data-rich climate. By the end of the 2000s, “Big Data” entered the lexicon as a buzzword describing the power unleashed by having orders of magnitude more structured and unstructured data available for analysis. A broad coalition between university research and technology companies emerged, and many of the top computer scientists in academia began consulting or working for Big Tech companies like Google, Facebook, or Amazon.

In the next decade, a major innovation swept through computer science called “Deep Learning,” which ironically is based on a decades-old machine learning approach known to computer scientists even in the 1950s as neural networks (Artificial Neural Networks, or ANNs). Neural networks had been regarded as just one tool among many in the machine learning toolkit until 2012, when, at the ImageNet competition, a type of deep neural network known as a convolutional neural network blew away the competition at recognizing images culled from online photo-sharing websites like Flickr.

AlexNet, the winning system, had an error rate of only about 15% when matching photos with their correct labels (like “African elephant” for a photo of an African elephant), fully 10 percentage points better than the world’s best approach at that time. The difference? The Deep Learning algorithm used.

The success of Deep Learning at the ImageNet competition in 2012 quickly led to its widespread adoption for tasks ranging from image recognition to autonomous driving to question-answering systems and natural language processing. Voice-activated personal assistants like Alexa quickly adopted a Deep Learning approach to learning relevant (and correct) answers to user questions. Many have called the 2010s the decade of AI. Computer science had grown to exert enormous influence in the modern digital economy. “Data science,” really a combination of big data and Deep Learning (and other machine learning algorithms), rapidly came to dominate active research and development in computer science.


These developments in AI, machine learning, and Deep Learning all transpired alongside a revolution in computing hardware, one that placed ever greater computing power at the fingertips of everyday users. According to a report by the U.S. Census Bureau, in 2000 just over 41% of U.S. households had internet access. In April of 2021, the Census Bureau placed that number at 85%. Moreover, the vast majority of U.S. households now possess at least one desktop computer, laptop, or mobile device such as a smartphone or tablet.

In fact, as broadband internet access achieved greater penetration in public, private, and commercial life the world over, companies like Apple and Google introduced powerful new operating systems underlying groundbreaking products like the iPhone, iPad, and Android. These mobile telecommunication devices rapidly expanded individual computing capabilities, providing us with rapid online access anywhere, at any time, while simultaneously facilitating an ever-growing universe of applications for commercial, recreational, educational, medical, fitness, organizational, and countless other purposes.

The world entered the 21st Century with cautious ambition about the opportunities represented by both machine learning and web technology. Two decades later, these areas of innovation have fully penetrated our lives and changed the world. More than anything else, the first two decades of the 21st Century may be remembered as the time when computer science graduated from a secondary science into a field of singular importance.

***

Find out which influencers have most contributed to advancing the field of computer science over the last two decades with a look at The Most Influential People in Computer Science, for the years 2000 – 2020.

And to find out which schools are driving the computer science field forward today, check out The Most Influential Schools in Computer Science, for the years 2000 – 2020.

Or, take a look back at the early history of the computer science discipline.

Get more study tips, learning tools, and study starters with a look at our Complete Library of Study Guides.

Or jump to our student resource library for tips on everything from studying to starting on your career path.
