Science and Technology in a Changing World
The Discovery of Penicillin
In September 1928, Alexander Fleming discovered penicillin, the world's first antibiotic. The discovery was a turning point in history. Doctors worldwide finally had a way to cure certain deadly infectious diseases.
The First General-Purpose Computer
The first computer made for general use was built in 1946. It was called the Electronic Numerical Integrator and Computer (ENIAC).
The Launch of the First Orbiting Satellite
The Soviet Union launched the first artificial Earth satellite, Sputnik I, into orbit on October 4, 1957. The launch of Sputnik I marked the beginning of the space age.
The Development of the Internet
The development of the Internet took years of research. By 1985, an early form of the Internet had begun serving a growing community of users.
The Human Genome Project Begins
The Human Genome Project was an international scientific research project that began in 1990. The scientists working on this project sought to discover and decode all the genes that make up human DNA.
Continuity and Change
The world is changing faster than anyone ever could have imagined. Change has occurred throughout history, but technological advances and scientific discoveries have greatly increased the pace of change in modern times.
Humans' natural curiosity has helped bring about changes in the world since prehistoric times. The earliest human tools were developed more than two million years ago. With the passage of time, these simple tools helped humans build great civilizations. Let's take a look at the discoveries and inventions that helped shape human civilization from the earliest times.
The Discovery of Fire
One of the earliest discoveries made by humans was fire. Fire helped humans cook, clear forests, and heat stones to make stone tools. Over time, humans found that they could use fire to help create metal tools. Fire also allowed humans to make bricks for houses, as well as household items such as pottery.
The Invention of Tools and Weapons
Prehistoric people used metals to build tools and equipment such as plows. These inventions helped in the advancement of agriculture. The invention of metal weapons such as knives, catapults, bows, and arrows soon followed.
The Invention of the Wheel
The wheel was invented around 3500 BC. It was first used by potters to make earthen pots. But soon wheels started being used in new forms of transportation, such as the cart and the wheelbarrow. The wheel made it much easier to transport humans and goods from one place to another.
The Domestication of Animals
Dogs were the first animals to be domesticated by humans. Historians believe that dogs probably accompanied humans on hunting trips. They also helped with herding other animals and guarding humans. Sheep and goats were domesticated later. Around 4000 BC, when trade routes developed, humans began using oxen, donkeys, horses, and camels to transport goods. Oxen were also used to prepare soil for crops by tilling the land.
The tools that were invented by early humans worked well for thousands of years, though they underwent small changes. For instance, the first wheels, invented around 3500 BC, were made of stone. Gradually, they were replaced with wooden wheels, which were lighter but equally effective.
Like prehistoric humans, people of the ancient world also made important advancements. The empires of the Roman and Mongolian civilizations established efficient postal systems. The ancient Romans also developed an impressive network of roads. Modern civilizations borrowed these innovations.
Human invention and innovation took a major leap forward with the Industrial Revolution of the eighteenth and nineteenth centuries. During this period, new inventions allowed products to be produced quickly and for less money. Many people moved from the countryside to towns so they could work in factories. Many tools and equipment of the past gave way to modern machinery and methods.
The world certainly changed during the thousands of years between prehistoric times and the Industrial Revolution. But it has changed even more rapidly in the last 300 years than in all the previous centuries. And it will likely continue to do so.
In modern times, technology moves at a fast pace, and new electronic gadgets are constantly being invented. It can be difficult for some people to keep up with all the changes. Think of a time when an adult you know had trouble using a device or performing a task and asked for your help. Open the Notebook tool and answer each of the following questions in four to five sentences.
-What task did you help the adult with?
-Where did you learn how to do this task?
-How did you teach the adult to do this task?
The Evolution of Computers
No modern industry better illustrates the fast pace of change today than the computer industry. And no industry has had such far-reaching effects. These days, almost everything we do involves computer technology.
Before the creation of modern electronic computers came the invention of punched cards. These stiff paper cards had holes punched in different places. The holes worked somewhat like computer code: when a card was fed into a machine, the machine translated the holes into data, or information. The earliest punched card systems were able to speed up some tasks. However, the machines were so huge that they took up several large rooms.
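The hole-to-data translation described above can be sketched in a few lines of Python. This is a toy illustration, not a historical card code: the hole patterns and the characters they map to are invented for the example.

```python
# Toy punched-card reader: 1 = hole punched, 0 = no hole.
# The patterns below are invented for illustration; real card codes
# (such as IBM's) used 12-row columns with standardized patterns.
CODE = {
    (1, 0, 0, 0, 1): "A",
    (0, 1, 1, 0, 0): "B",
    (1, 1, 0, 1, 0): "C",
}

def read_card(columns):
    """Translate each column's hole pattern into a character, like a card reader."""
    return "".join(CODE[tuple(col)] for col in columns)

card = [[1, 0, 0, 0, 1], [0, 1, 1, 0, 0], [1, 1, 0, 1, 0]]
print(read_card(card))  # prints "ABC"
```

Each column of holes stands for one character, so a whole card encodes a short record, which is exactly how these machines turned paper into data.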
In 1946, Americans John Mauchly and John Presper Eckert developed the first modern computer. It was called the Electronic Numerical Integrator and Computer, or ENIAC. It took a year to design the machine and about a year and a half to build it. It cost about $50,000 (about $650,000 today).
ENIAC was 8 feet tall and 80 feet long, and it weighed 30 tons. To operate, it used more than 17,000 heat-generating vacuum tubes. Vacuum tubes were a common part of most electronics in the twentieth century, including radios, televisions, and telephone networks. But several of ENIAC's tubes burned out almost every day, which affected its reliability. Critics wrote the machine off. They said it was a waste of money and time.
However, the ENIAC was faster than any machine that came before it. In one second, it could complete 5,000 addition problems. The US military put it to use for calculations, weather prediction, wind-tunnel design, and even the development of hydrogen bombs.
The next evolution in computers was the transistor computer. These second-generation computers were built in the 1950s and 1960s. Engineers replaced vacuum tubes with newer technology called transistors. The transistor computers were much more powerful and reliable than vacuum tube computers. They were also much smaller in size. Transistor computers paved the way for personal computers.
The third-generation computers used integrated circuits. An American inventor named Jack Kilby developed the first integrated circuit in 1958. By 1969, integrated circuits were a crucial part of computers.
In integrated circuits, transistors were more compact and integrated into silicon chips. These chips increased the speed and efficiency of computers. These computers also had an operating system. Users interacted with the computers using monitors and keyboards. Many applications could also be run at one time. As computers became user-friendly, smaller, and more affordable, they became popular with a variety of new users.
Another invention that has become an essential part of our lives is the mobile phone. The mobile phone was introduced to the world in 1973. Motorola employee and inventor Martin Cooper made the historic first call on April 3 from New York City. Today's mobile phones, or cell phones, do much more than make phone calls. They feature functions that were previously available only on computers.
Three years later, on April 1, 1976, three friends formed what would become a world-famous technology company. Steve Jobs, Steve Wozniak, and Ronald Wayne founded Apple Computer to develop and sell personal computers. The company revolutionized the computer industry with its unique inventions and products.
Another advancement of the twentieth century was computer networking. A computer network connects two or more computers in the same location or in different locations. One early network was the Advanced Research Projects Agency Network (ARPANET). By 1983, ARPANET was being used by the US military for several research and development projects and other operational activities. An early protocol called the Network Control Protocol (NCP) allowed the computers on the network to exchange information. These innovations soon gave rise to the modern Internet and the development of electronic mail, or email.
By 1985, the Internet was an established technology. It supported a broad community of university researchers and developers. Other communities were also beginning to use it for daily computer communications. In addition, they were making greater use of email. However, different communities often used different computer and networking systems, which made it difficult to share information from one community to the next.
The growing use of the Internet and email showed the need for a single system that could connect many more people. The solution to this problem was the World Wide Web. The year 1989 is generally considered the year the World Wide Web was created. Tim Berners-Lee has been credited with its invention.
The World Wide Web helped connect the world digitally and helped people access information. Watch the video on the next screen to learn more about the beginnings of the World Wide Web.
The computer industry also took giant strides in the 1980s. By the start of that decade, there were approximately one million computers in use. International Business Machines (IBM) introduced personal computers in 1981. This innovation led to an increase in the use of computers in offices and homes.
Another pioneering computing company, Microsoft, was established by Bill Gates and Paul Allen. From its humble beginnings, Microsoft rose to become a revolutionary computer technology corporation. In 1985, Microsoft introduced its computer operating system, Microsoft Windows, and in the 1990s its user-friendly applications transformed how people used computers.
The invention of cell phones, the Internet, and personal computing led to an explosion of technology. By 1993, there were 50 World Wide Web servers across the world. In the same year, the first smartphone was introduced. By 2013, there were 6.8 billion cell phones and about 2 billion computers in use around the globe. With innovations coming quickly, each generation of computers and cell phones became smaller and more powerful.
We may talk to each other less because we spend so much time looking at content on the Internet. People are also concerned that thieves may be able to access personal information shared on the Internet.
The Internet is a great place to find information. It also allows people to connect with each other.
The Effects of Technology on Knowledge
The growth of computer technology resulted in the rapid development of new ideas and knowledge. In his 1981 book, Critical Path, American architect and inventor Buckminster Fuller introduced the concept of the "knowledge doubling curve." Fuller observed that human knowledge had doubled about every century until around 1900. By the end of World War II in 1945, knowledge was doubling at an even faster rate: the doubling time had shrunk to about 25 years.
However, not all knowledge levels have been advancing at an even pace. For example, the knowledge to produce new medical technology is doubling every 18 months. Nanotechnology knowledge is doubling every two years. According to IBM, the Internet and advances in communication might allow human knowledge to someday double every 12 hours.
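The doubling times above describe exponential growth, which can be checked with simple arithmetic. A minimal sketch, assuming a constant doubling period:

```python
def knowledge_multiple(doubling_period_years, elapsed_years):
    """How many times over knowledge grows in elapsed_years,
    if it doubles once every doubling_period_years."""
    return 2 ** (elapsed_years / doubling_period_years)

# Doubling every 25 years: a century holds four doublings, a 16-fold increase.
print(knowledge_multiple(25, 100))  # prints 16.0
# Doubling every 2 years (the nanotechnology figure): a decade gives 32x.
print(knowledge_multiple(2, 10))    # prints 32.0
```

The shorter the doubling period, the more dramatic the effect: at IBM's projected 12-hour doubling, knowledge would multiply over a thousandfold in just five days.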
With increased knowledge in the fields of science and technology, there have been improvements in areas such as human hygiene, medicine, literacy, agricultural production, and manufacturing.
The lives of people are constantly changing and will continue to change in the future. New jobs will be created to cater to new technology and knowledge. For instance, experts believe that 60 percent of the jobs that people will hold 10 years from now haven't been invented yet.
Here are five career paths that I am interested in pursuing when I enter the workforce:
-computer systems analyst
None of the jobs that I was interested in pursuing are mentioned in the given link. So, I'll need to learn to develop new skills for new jobs, probably for the rest of my life.
Here are three careers I might be interested in and the skills I would probably need:
-Smart car interior designer: Advanced spatial thinking skills would likely be required. To be successful, I would need to be creative and develop critical thinking and problem-solving skills.
-Roboticist: Advanced technological skills would be required to be successful in this profession.
-Astro teacher: Teaching skills would be required to be successful in this profession.
The period between the fifteenth and the seventeenth centuries is known as the age of exploration. During this era, Europeans explored much of the Americas, Africa, and Asia. One of the goals of these explorations was to gather information about places and people beyond Europe. These explorations resulted in a dramatic increase in human knowledge of the world.
For several more centuries, there continued to be unexplored territories that lay beyond human settlements. These territories were called the frontier. By the middle of the twentieth century, however, most of the land on the planet had been explored. Without an unexplored frontier on Earth, people started thinking of outer space as "the last frontier." Space exploration is an example of a rapidly changing field of science. The lessons learned from outer space have also changed the way we live on Earth.
To explore the last frontier of space, scientists started experimenting with rocket technology. During the early twentieth century, three scientists began to work separately on rocket engines for the purpose of space exploration. These scientists were Robert Goddard from the United States, Konstantin Tsiolkovsky from Russia, and Hermann Oberth from Germany. Robert Goddard in particular wanted to use rockets for peaceful purposes. He believed rockets could help send people to the moon. He built the world's first liquid-fueled rocket.
However, during World War II, Nazi Germany realized the potential for using long-distance rockets as weapons systems. German scientist Wernher von Braun developed a weapon called the V-2. This rocket could travel at a speed of 3,500 miles per hour. The Germans used the V-2 to conduct aerial attacks on London, England, in the closing months of the war.
The end of World War II signaled the beginning of another conflict known as the Cold War. The Cold War pitted two superpowers against each other—the United States and the Soviet Union. The two nations engaged in an unspoken competition to be the first to explore outer space. This rivalry was known as the space race. The two rivals also engaged in an arms race to build more weapons than the other.
Both countries created their own rocket building and testing programs. The programs served two purposes. Some rockets were built for peaceful space missions focused on scientific discovery. Others were used as missiles that could drop bombs from a long distance.
The liquid-fueled rockets that were invented early in the twentieth century were used to launch artificial satellites. On October 4, 1957, the Soviet Union launched the first artificial satellite into space. It was called Sputnik 1. The United States followed with the launch of its first artificial satellite, Explorer 1, on January 31, 1958. Immediately, world leaders realized it was possible to launch a satellite that could be used to spy on enemies from space.
Early artificial satellites were used to spy on enemies. Today, artificial satellites serve a number of other purposes:
--Observation satellites: Scientists use these satellites to study and forecast weather conditions. The satellites also help scientists monitor natural and human-made events, such as floods, forest fires, changes in ice cover, rising sea levels, and air pollution.
--Communication satellites: These satellites relay television and telephone signals and data transmissions back to Earth. This advancement has made possible the almost instant sharing of information across the globe.
--Navigation satellites: These satellites enable aircraft, vehicles, and ships to determine their exact geographic positions.
After launching artificial satellites in the 1950s, the next step was to send humans into space. With that goal in mind, the United States and the Soviet Union established human spaceflight programs. On April 12, 1961, Soviet cosmonaut Yuri Gagarin became the first human to orbit Earth. On May 5 of the same year, astronaut Alan Shepard became the first American to fly into space, though he did not orbit Earth. On February 20, 1962, John Glenn became the first American to orbit Earth. Another American, Neil Armstrong, became the first human to set foot on the moon. This historic event occurred on July 20, 1969.
Space exploration is taking giant strides in the twenty-first century. Unmanned spacecraft have been launched to planets such as Venus and Mars. Countries from all over the world are joining forces to explore space.
One such effort is the International Space Station. Its purpose is to support the study of space science. First launched in 1998, the space station demonstrates the coming together of science, technology, and the human mind. It has been continuously occupied since November 2000. More than 200 people from 15 countries have visited the station.
The earliest successful attempts to explore outer space involved _____.
Advances in Medical Technology
In the last century, the human population has risen dramatically. The most important factor that contributed to this growth is the advancement in medicine and medical technology. With better medical facilities, life expectancy for humans has risen to an all-time high.
These advances in medical science actually began in the eighteenth century. Prior to that time, life was quite different. In England, the average person lived only until about the age of 35. Many children died before they reached the age of five. Epidemics such as the plague, smallpox, diphtheria, and typhus were common. People knew little about nutrition or sanitation. They also didn't yet understand the concept of germs and how they spread disease.
Six important inventions in the medical field changed the world completely. Let's take a look at them.
In 1798, Edward Jenner demonstrated the possibility of vaccination. He introduced a vaccine for the deadly smallpox disease. During the late eighteenth century, many children in England were dying from smallpox. Jenner proved that patients could become immune to smallpox if he exposed their bodies to a small dose of a less serious disease known as cowpox. Jenner coined the word vaccine from the Latin word vacca, meaning cow. This vaccination provided safer and more reliable protection than previous methods.
An antiseptic is a substance that kills bacteria and therefore prevents infection. Antiseptics were first introduced as part of surgeries by a nineteenth-century surgeon named Joseph Lister. Lister realized that many patients died from infections after surgery. In the 1860s, he learned about antiseptics and tried simple steps to reduce infections in his patients. For example, he began spraying the operating room with an antiseptic called carbolic acid. He also used this antiseptic to protect open wounds on his patients. In addition, he started washing his hands before surgery and killing bacteria on surgical instruments by sterilizing them. These techniques reduced infection to a great extent. Many surgeons adopted his principles, which made modern surgery safer.
Robert Koch was a German doctor and pioneering microbiologist. In the 1870s, he developed experimental methods to identify which bacteria caused a particular disease. This achievement opened the door to identifying the germs behind other infectious diseases.
Louis Pasteur was a French chemist and another pioneering microbiologist. He is best known for his invention of pasteurization in the 1860s. This process helps kill bacteria in liquid foods, such as milk. Pasteur also developed vaccines against diseases caused by bacteria, such as anthrax. Like Koch, he discovered that bacteria and other germs cause diseases.
German physicist Wilhelm Roentgen discovered X-rays in 1895. X-rays are invisible rays of high energy that can pass through objects that light cannot pass through. This discovery was a boon for doctors, who began using X-rays to detect various injuries and abnormalities inside their patients' bodies. During World War I and World War II, X-rays were used to find and remove bullets and metal shrapnel from wounded soldiers.
Scottish biologist Alexander Fleming identified penicillin in 1928. It was the first chemical compound with antibiotic properties. Antibiotics help treat infections caused by bacteria. With this discovery, deadly bacterial infections such as gangrene, pneumonia, and many others could finally be treated.
Pasteurization helped kill bacteria in food. Vaccines helped control diseases such as smallpox. Penicillin helped treat deadly bacterial infections.
discovery of penicillin
helped treat deadly bacterial infections
use of vaccine
stopped the smallpox epidemic
use of pasteurization
increased the safety of food
Medicine Since World War II
Since World War II, scientists have continued to make great strides in the field of medical technology. One important field of medical technology is biotechnology. Let's take a look at how this new area of medicine fights and prevents diseases.
Biotechnology relies on recent advances in science and technology. It uses living microorganisms, such as cells or bacteria, as tools to produce new medicines and other useful products.
The popularity of biotechnology is growing in our changing world. Biotechnology is used across industries to make products such as vaccinations, medicines, and industrial bacteria. Biotechnology also includes research on how to increase crop yields by manipulating the DNA of food crops. Similarly, biotechnology research has been used to improve the health of livestock.
Biotechnology is also related to genetics and genetic engineering. Genetics is the study of heredity and the different variations in the inherited traits of organisms. Genetic engineering involves the manipulation of an organism's DNA. It makes changes in the organism's genome using the science of biotechnology. Developments in the field of genetics have benefited several industries, including medicine.
An important aspect of genetics is cloning. Clones are organisms that share identical DNA. Thus, they are exact genetic copies of one another. Cloning is helping scientists grow plants that are genetically identical and possess certain desired characteristics such as resistance to diseases and insects.
In medicine, biotechnology has helped with the development of new ways to diagnose and treat illness. For example, biotechnology helps make genetic testing possible. Genetic testing is a procedure that allows for the diagnosis of genetic, or inherited, diseases. Biotechnology has also allowed pharmaceutical companies to create medicines related to specific genes and diseases. It has even allowed medical researchers to analyze a person's response to certain medicines.
Along with new testing procedures and medicines, biotechnology has also helped in developing better treatment procedures. One example is gene therapy. This procedure uses genes as medicine to treat a person whose own genes have a flaw in them. Another modern procedure uses stem cells. Stem cells are unique because they can multiply again and again. They can also become other types of cells. Stem cell therapy uses stem cells to treat certain diseases. Transplanting bone marrow is one of the most widely used stem cell therapies today.
The Human Genome Project
One of the most far-reaching applications of biotechnology is the Human Genome Project. The project's goal was to discover, decode, and analyze all the genes that make up human DNA. The project was successfully completed in April 2003. By the end, its researchers understood the basic building blocks of a human being. Furthermore, researchers were able to identify the genetic basis of more than 1,800 diseases. This achievement further helped in developing treatments.
Researchers are now able to find a gene suspected of causing a disease in a matter of days. And currently, there are more than 2,000 genetic tests available for human conditions. These tools have empowered physicians, who are now able to assess patients' risks for getting a disease.
Watch the following video about genome research at the Sanger Institute. Researchers there are working on one of the deadliest modern diseases: cancer.
According to the video, what do scientists look for when comparing the sequence of a genome with cancer to the sequence of a normal genome?