How We Got to Now: Six Innovations That Made the Modern World


Oludare Padilla, Jun 13, 2024

Thomas Edison invented the phonograph to send audio letters, and Alexander Graham Bell intended for people to use the telephone to listen to live orchestra music. What does this say about innovation and unintended consequences?






It says that part of the process of innovation comes from the consumer side of the equation. You can invent the telephone and put it out in the world and say, "This would be fantastic for you playing cello on one end and someone else listening to you playing cello on the other end," but it gets out into the world and people start using it. They say, "That would be a terrible way of using the telephone. But it is really great for calling my grandmother." That is always the case with technology when it gets unleashed into the world. People end up pushing it in directions that the inventors never dreamed of.

The "adjacent possible" is a term originally coined by Stuart Kauffman, a brilliant complexity theorist. Basically, when someone comes up with a new idea, technology, or platform of some kind, it makes a whole other set of new ideas imaginable for the first time.

In this illustrated volume, Steven Johnson explores the history of innovation over centuries, tracing facets of modern life (refrigeration, clocks, and eyeglass lenses, to name a few) from their creation by hobbyists, amateurs, and entrepreneurs to their unintended historical consequences.

If you believe that invention comes from single, solitary geniuses working on their own, trying to invent something that will make them fantastically rich, then you have a set of policies and prescriptions as a society that encourage that kind of invention. You have really strong patent protection, so that when someone comes up with a brilliant idea, no one can steal it, and the inventor will be able to maximize the value that he or she gets from the invention.

Yeah. Think about Darwin. Think about Ben Franklin. These are people that had a thousand hobbies. They would focus on their primary projects at various different points in their lives. Darwin had the theory of evolution, but he also had a beetle collection, and his beetle collection shaped his interest in evolution in all these subtle ways. Focus is overrated.

There is going to be artificial intelligence of some sort, not necessarily computers becoming self-aware or anything like the science fiction versions, but there is going to be much more human-like intelligence in our machines 10 years from now.

When they [IBM employees] trained [the supercomputer] Watson, they trained it by having it read the entirety of Wikipedia. The teacher for this new machine was basically all of us. Millions of people have collectively authored this global encyclopedia. We took all of that intelligence and put it into a computer, and the computer somehow became smart on a level that no computer had been smart before. There is something kind of lovely in that.

Ancient Rome had a large influence on the modern world. Though it has been thousands of years since the Roman Empire flourished, we can still see evidence of it in our art, architecture, technology, literature, language, and law. From bridges and stadiums to books and the words we hear every day, the ancient Romans have left their mark on our world.

Although the Romans were heavily influenced by ancient Greece, they were able to make improvements to certain borrowed Greek designs and inventions. For example, they continued the use of columns, but the form became more decorative and less structural in Roman buildings. Ancient Romans created curved roofs and large-scale arches, which were able to support more weight than the post-and-beam construction the Greeks used. These arches served as the foundation for the massive bridges and aqueducts the Romans created. The game-loving ancients also built large amphitheaters, including the Colosseum. The sports stadiums we see today, with their oval shapes and tiered seating, derive from the basic idea the Romans developed.

The arches of the Colosseum are made out of concrete, a remarkably strong building material the Romans made with what they had at hand: volcanic ash and volcanic rock. Modern scientists believe that the use of this ash is the reason that structures like the Colosseum still stand today. Roman underwater structures proved to be even sturdier. Seawater reacting with the volcanic ash created crystals that filled in the cracks in the concrete. To make a concrete this durable, modern builders must reinforce it with steel. So today, scientists study Roman concrete, hoping to match the success of the ancient master builders.

Sculptural art of the period has proven to be fairly durable, too. Romans made their statues out of marble, fashioning monuments to great human achievements and achievers. You can still see thousands of Roman artifacts today in museums all over the world.

Along with large-scale engineering projects, the Romans also developed tools and methods for use in agriculture. The Romans became successful farmers due to their knowledge of climate, soil, and other planting-related subjects. They developed or refined ways to effectively plant crops and to irrigate and drain fields. Many of their techniques, such as crop rotation, pruning, grafting, seed selection, and manuring, are still used by modern farmers. The Romans also used mills to process the grains they farmed, which improved their efficiency and employed many people.

The ancient Romans helped lay the groundwork for many aspects of the modern world. It is no surprise that a once-booming empire was able to impact the world in so many ways and leave a lasting legacy behind.


Percy Spencer, an American engineer and expert in radar tube design who helped develop radar for combat, looked for ways to apply that technology to commercial use after the end of the war. The story commonly told claims that Spencer took note when a candy bar in his pocket melted as he stood in front of an active radar set. Spencer began to experiment with different kinds of food, such as popcorn, opening the door to commercial microwave production. Putting this wartime technology to use, commercial microwave ovens became increasingly available in the 1970s and 1980s, changing the way Americans prepared food in a way that persists to this day. The ease of heating food in a microwave has made this technology an expected feature of the twenty-first-century American home.

Beyond changing the way Americans warm their food, radar became an essential component of meteorology. The development and application of radar to the study of weather began shortly after the end of World War II. Using radar technology, meteorologists advanced knowledge of weather patterns and improved the accuracy of their forecasts. By the 1950s, radar had become a key way for meteorologists to track rainfall as well as storm systems, advancing the way Americans followed and planned for daily changes in the weather.

Like radar technology, computers had been in development well before the start of World War II. However, the war demanded rapid progression of such technology, resulting in the production of new computers of unprecedented power. One such example was the Electronic Numerical Integrator and Computer (ENIAC), one of the first general-purpose computers. Capable of performing thousands of calculations in a second, ENIAC was originally designed for military purposes, but it was not completed until 1945. Building on wartime developments in computer technology, the US government unveiled ENIAC to the general public early in 1946, presenting the computer as a tool that would revolutionize the field of mathematics. Taking up 1,500 square feet with 40 cabinets that stood nine feet in height, ENIAC came with a $400,000 price tag. The public availability of ENIAC distinguished it from other computers and marked a significant moment in the history of computing technology. By the 1970s, the patents on ENIAC's computing technology had entered the public domain, lifting restrictions on modifying these technological designs. Continued development over the following decades made computers progressively smaller, more powerful, and more affordable.

Of all the scientific and technological advances made during World War II, few receive as much attention as the atomic bomb. Developed in the midst of a race between the Axis and Allied powers during the war, the atomic bombs dropped on Hiroshima and Nagasaki serve as notable markers of the end of fighting in the Pacific. While debates over the decision to use atomic weapons on civilian populations persist, there is little dispute over the extensive ways the atomic age came to shape the twentieth century and the standing of the United States on the global stage. Competition for dominance propelled both the United States and the Soviet Union to manufacture and stockpile as many nuclear weapons as possible. From that arms race came a new era of science and technology that forever changed the nature of diplomacy, the size and power of military forces, and the development of technology that ultimately put American astronauts on the surface of the moon.

The arms race in nuclear weapons that followed World War II sparked fears that one power would gain superiority not only on earth, but in space itself. During the mid-twentieth century, the Space Race prompted the creation of a new federally run program in aeronautics. In the wake of the successful launch of the Soviet satellite Sputnik 1 in 1957, the United States responded by launching its own satellite, Explorer 1, four months later. In 1958, the US Congress passed the National Aeronautics and Space Act, creating the National Aeronautics and Space Administration (NASA) to oversee the effort to send humans into space. The Space Race between the United States and the USSR ultimately peaked with the landing of the Apollo 11 crew on the surface of the moon on July 20, 1969. The Cold War between the United States and the USSR changed aspects of life in almost every way, but both the nuclear arms race and the Space Race remain significant legacies of the science behind World War II.
