The United States came into being during the Age of Enlightenment (circa 1680 to 1800), a period in which writers and thinkers rejected the superstitions of the past. Instead, they emphasized the powers of reason and unbiased inquiry, especially inquiry into the workings of the natural world. Enlightenment philosophers envisioned a "republic of science," where ideas would be exchanged freely and useful knowledge would improve the lot of all citizens.

From its emergence as an independent nation, the United States has encouraged science and invention. It has done this by promoting a free flow of ideas, by encouraging the growth of "useful knowledge," and by welcoming creative people from all over the world.

The United States Constitution itself reflects the desire to encourage scientific creativity. It gives Congress the power "to promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries." This clause formed the basis for the U.S. patent and copyright systems, which ensured that inventions and other creative works could not be copied or used without the creator's receiving some kind of compensation.

Table of contents
1 A Good Climate for Science
2 Science Immigration
3 American Applied Science
4 The Atomic Age
5 The Space Age
6 Medicine and Health Care
7 Emphasis on Prevention
8 Troubling Signs

A Good Climate for Science

In the early decades of its history, the United States was relatively isolated from Europe and also rather poor. Nonetheless, it was a good place for science. American science was closely linked with the needs of the people, and it was free from European preconceptions.

Two of America's founding fathers were scientists of some repute. Benjamin Franklin conducted a series of experiments that deepened human understanding of electricity. Among other things, he proved what had been suspected but never before shown: that lightning is a form of electricity. Franklin also invented such conveniences as bifocal eyeglasses and a stove that bears his name. (The Franklin stove fits into a fireplace and circulates heat into the adjacent room.)

Thomas Jefferson was a student of agriculture who introduced various types of rice, olive trees, and grasses into the New World. He stressed the scientific aspect of the Lewis and Clark expedition (1804-06), which explored the Pacific Northwest, and detailed, systematic information on the region's plants and animals was one of that expedition's legacies.

Like Franklin and Jefferson, most American scientists of the late 18th century were involved in the struggle to win American independence and forge a new nation. These scientists included the astronomer David Rittenhouse, the medical scientist Benjamin Rush, and the natural historian Charles Willson Peale.

During the American Revolution, Rittenhouse helped design the defenses of Philadelphia and built telescopes and navigation instruments for the United States' military services. After the war, Rittenhouse designed road and canal systems for the state of Pennsylvania. He later returned to studying the stars and planets and gained a worldwide reputation in that field.

As a surgeon general in the Continental Army, Benjamin Rush saved countless soldiers' lives during the Revolutionary War by promoting hygiene and public health practices. By introducing new medical treatments, he made the Pennsylvania Hospital in Philadelphia an example of medical enlightenment, and after his military service he established the first free clinic in the United States.

Charles Willson Peale is best remembered as an artist, but he also was a natural historian, inventor, educator, and politician. He created the first major museum in the United States, the Peale Museum in Philadelphia, which housed the young nation's only collection of North American natural history specimens. Peale excavated the bones of an ancient mastodon near West Point, New York; he spent three months assembling the skeleton, and then displayed it in his museum. The Peale Museum started an American tradition of making the knowledge of science interesting and available to the general public.

Science Immigration

American political leaders' enthusiasm for knowledge also helped ensure a warm welcome for scientists from other countries. A notable early immigrant was the British chemist Joseph Priestley, who was driven from his homeland because of his dissenting politics. Priestley, who came to the United States in 1794, was the first of thousands of talented scientists who emigrated in search of a free, creative environment.

Other scientists came to the United States to take part in the nation's rapid growth. Alexander Graham Bell, who arrived from Scotland by way of Canada in 1872, developed and patented the telephone and related inventions. Charles Steinmetz, who came from Germany in 1889, developed new alternating-current electrical systems at the General Electric Company, and Vladimir Zworykin, who left Russia in 1919, later invented a television camera.

Into the early 1900s Europe remained the center of scientific research, notably in England and Germany. With the rise of the Nazi party in Germany, however, a large number of scientists, many of them of Jewish descent, left the country for the United States. Among the first was Albert Einstein in 1933. At his urging, and often with his support, a good percentage of Germany's theoretical physics community, previously the best in the world, followed. Another notable immigrant was Enrico Fermi, who came from Italy in 1938 and went on to produce the world's first self-sustaining nuclear chain reaction.

In the postwar era the United States was left in a position of unchallenged scientific leadership. The public embraced science, and political leaders were eager to fund it. By the mid-1950s the research facilities in the United States were second to none, and scientists were drawn to the country for that reason alone. The changing pattern can be seen in the winners of the Nobel Prizes in physics and chemistry. During the first half-century of Nobel Prizes, from 1901 to 1950, American winners were a distinct minority in the science categories. Since 1950, Americans have won approximately half of the Nobel Prizes awarded in the sciences.

American Applied Science

During the 19th century, Britain, France, and Germany were at the forefront of new ideas in science and mathematics. But if the United States lagged behind in the formulation of theory, it excelled in using theory to solve problems: applied science. This tradition had been born of necessity. Because Americans lived so far from the well-springs of Western science and manufacturing, they often had to figure out their own ways of doing things. When Americans combined theoretical knowledge with "Yankee ingenuity," the result was a flow of important inventions. The great American inventors include Robert Fulton (the steamboat); Samuel Morse (the telegraph); Eli Whitney (the cotton gin); Cyrus McCormick (the reaper); and Thomas Alva Edison, the most fertile of them all, with more than a thousand inventions credited to his name.

Edison was not always the first to devise a scientific application, but he was frequently the one to bring an idea to a practical finish. For example, the British engineer Joseph Swan built an incandescent electric lamp in 1860, almost 20 years before Edison. But Edison's light bulbs lasted much longer than Swan's, and they could be turned on and off individually, while Swan's bulbs could be used only in a system where several lights were turned on or off at the same time. Edison followed up his improvement of the light bulb with the development of electrical generating systems. Within 30 years, his inventions had introduced electric lighting into millions of homes.

Another landmark application of scientific ideas to practical uses was the innovation of the brothers Wilbur and Orville Wright. In the 1890s they became fascinated with accounts of German glider experiments and began their own investigation into the principles of flight. Combining scientific knowledge and mechanical skills, the Wright brothers built and flew several gliders. Then, on December 17, 1903, they successfully flew the first heavier-than-air, mechanically propelled airplane.

An American invention that was barely noticed in 1947 went on to usher in the Information Age. In that year John Bardeen, William Shockley, and Walter Brattain of Bell Laboratories drew upon highly sophisticated principles of quantum physics to invent the transistor, a small substitute for the bulky vacuum tube. This, and a device invented 10 years later, the integrated circuit, made it possible to package enormous amounts of electronics into tiny containers. As a result, book-sized computers of today can outperform room-sized computers of the 1960s, and there has been a revolution in the way people live – in how they work, study, conduct business, and engage in research.

The Atomic Age

One of the most spectacular – and controversial – accomplishments of US technology has been the harnessing of nuclear energy. The concepts that led to the splitting of the atom were developed by the scientists of many countries, but the conversion of these ideas into the reality of nuclear fission was the achievement of US scientists in the early 1940s.

After German physicists split a uranium nucleus in 1938, Albert Einstein, Enrico Fermi, and Leo Szilard concluded that a nuclear chain reaction was feasible. In a letter to President Franklin Roosevelt, Einstein warned that this breakthrough would permit the construction of "extremely powerful bombs." His warning inspired the Manhattan Project, the U.S. effort to be the first to build an atomic bomb. The project bore fruit when the first such bomb was exploded in New Mexico on July 16, 1945.

The development of the bomb and its use against Japan in August of 1945 initiated the Atomic Age, a time of anxiety over weapons of mass destruction that has lasted through the Cold War and down to the antiproliferation efforts of today. But the Atomic Age has also been characterized by peaceful uses of atomic energy, as in nuclear power and nuclear medicine.

The first US commercial nuclear power plant started operation in Illinois in 1956. At the time, the future for nuclear energy in the United States looked bright. But opponents criticized the safety of power plants and questioned whether safe disposal of nuclear waste could be assured. A 1979 accident at Three Mile Island in Pennsylvania turned many Americans against nuclear power. The cost of building a nuclear power plant escalated, and other, more economical sources of power began to look more appealing. During the 1970s and 1980s, plans for several nuclear plants were cancelled, and the future of nuclear power remains in a state of uncertainty in the United States.

Meanwhile, American scientists have been experimenting with other renewable energy sources, including solar power. Although solar power generation is still not economical in much of the United States, recent developments might make it more affordable.

In 1994 Subhendu Guha, executive vice president of United Solar Systems in Troy, Michigan, was lecturing on the benefits of solar energy and showing a picture of solar cells arrayed on the roof of a house. An architect in the audience said, "But it's so ugly. Who would want that on their house?" That remark got Guha thinking about how to make the photovoltaics look more like the roof, instead of mounting the solar cells on frames that jut skyward.

Two years later, Guha's innovation came off the assembly line – solar shingles that can be nailed directly onto the roof. The shingles are made from stainless steel sheeting, coated with nine layers of silicon, a semiconducting film, and protective plastic. Roofers install the shingles just as they do normal ones, but they must drill a hole in the roof for electrical leads from each shingle. On average, one-third of a home's roof covered with solar shingles should provide enough power to meet all electrical needs when the sun is shining. Guha believes that his shingles will be economical in some parts of the United States and that they will be even more promising in Japan, where energy prices are high and the government subsidizes solar energy.

The Space Age

Running almost in tandem with the Atomic Age has been the Space Age. The American Robert Goddard was one of the first scientists to experiment with rocket propulsion systems. In his small laboratory in Worcester, Massachusetts, Goddard worked with liquid oxygen and gasoline to propel rockets into the atmosphere, and in 1926 he successfully fired the world's first liquid-fuel rocket, which reached a height of 12.5 meters. Over the next 10 years, Goddard's rockets achieved modest altitudes of nearly two kilometers, and interest in rocketry increased in the United States, Great Britain, Germany, and the Soviet Union.

Expendable rockets provided the means for launching artificial satellites, as well as manned spacecraft. In 1957 the Soviet Union launched the first satellite, Sputnik I, and the United States followed with Explorer I in 1958. The first manned space flights were made in the spring of 1961, first by Soviet cosmonaut Yuri Gagarin and then by American astronaut Alan Shepard.

From those first tentative steps, to the 1969 Apollo program landing on the Moon, to today's reusable Space Shuttle, the American space program has brought forth a breathtaking display of applied science. Communications satellites transmit computer data, telephone calls, and radio and television broadcasts. Weather satellites furnish the data necessary to provide early warnings of severe storms.

Medicine and Health Care

As in physics and chemistry, Americans have dominated the Nobel Prize for physiology or medicine since World War II. The National Institutes of Health, the focal point for biomedical research in the United States, has played a key role in this achievement. Consisting of 24 separate institutes, the NIH occupies 75 buildings on more than 120 hectares in Bethesda, Maryland.

The goal of NIH research is knowledge that helps prevent, detect, diagnose, and treat disease and disability -- everything from the rarest genetic disorder to the common cold. At any given time, grants from the NIH support the research of about 35,000 principal investigators, working in every US state and several foreign countries. Among these grantees have been 91 Nobel Prize-winners. Five Nobelists have made their prize-winning discoveries in NIH laboratories.

NIH research has helped make possible numerous medical achievements. For example, mortality from heart disease, the number-one killer in the United States, dropped 41 percent between 1971 and 1991. The death rate for strokes decreased by 59 percent during the same period. Between 1991 and 1995, the cancer death rate fell by nearly 3 percent, the first sustained decline since national record-keeping began in the 1930s. And today more than 70 percent of children who get cancer are cured.

With the help of the NIH, molecular genetics and genomics research have revolutionized biomedical science. In the 1980s and 1990s, researchers performed the first trial of gene therapy in humans and are now able to locate, identify, and describe the function of many genes in the human genome. Scientists predict that this new knowledge will lead to genetic tests for susceptibility to diseases such as colon, breast, and other cancers and to the eventual development of preventive drug treatments for persons in families known to be at risk.

Perhaps the most exciting scientific development under way in the United States is the NIH's Human Genome Project. This is an attempt to construct a genetic map of humans by analyzing the chemical composition of each of the 50,000 to 100,000 genes estimated to make up the human genome. The project is expected to take 15 years to complete, at a cost of at least $3 billion.

Research conducted by universities, hospitals, and corporations also contributes to improvement in diagnosis and treatment of disease. NIH funded the basic research on Acquired Immune Deficiency Syndrome (AIDS), for example, but many of the drugs used to treat the disease have emerged from the laboratories of the American pharmaceutical industry; those drugs are being tested in research centers across the country.

One type of drug that has shown promise in treating HIV, the virus that causes AIDS, is the protease inhibitor. After several years of laboratory testing, protease inhibitors were first given to patients in the United States in 1994. One of the first tests (on a group of 20 volunteers) showed that the drug not only made the amount of virus in the patients' blood almost disappear, but also allowed their immune systems to rebound faster than anyone had thought possible.

Doctors have combined protease inhibitors with other drugs in "combination therapy." While the results are encouraging, combination therapy is not a cure, and, so far, it works only in the blood; it does not reach into the other parts of the body -- the brain, lymph nodes, spinal fluid, and testes -- where the virus hides. Scientists continue to experiment with combination therapy and other ways to treat the disease, while they search for the ultimate solution, a vaccine against it.

Emphasis on Prevention

While the American medical community has been making strides in the diagnosis and treatment of disease, the American public also has become more aware of the relationship between disease and personal behavior. Since the US surgeon general first warned Americans about the dangers of smoking in 1964, the percentage of Americans who smoke has declined from almost 50 percent to approximately 25 percent. Smoking is no longer permitted in most public buildings or on trains, buses, and airplanes traveling within the United States, and most American restaurants are divided into areas where smoking is permitted and those where it is not. Studies have linked a significant drop in the rate of lung cancer to a nationwide decline in cigarette smoking.

The federal government also encourages Americans to exercise regularly and to eat healthful diets, including large quantities of fruits and vegetables. More than 40 percent of Americans today exercise or play a sport as part of their regular routine. The per capita consumption of fruits and vegetables has increased by about 20 percent since 1970.

Donna Shalala, Secretary of Health and Human Services in the Clinton administration, frequently speaks out in support of scientific research and preventive medicine. Addressing a conference of medical and public health professionals in 1996 she said:

We must continue to unlock the incremental mysteries in basic science that culminate in blockbuster discoveries over time. But, we must cast our net wider than that. It must encompass behavioral research, occupational research, health services and outcomes research, and environmental research -- all of which hold the potential to prevent disease – and help Americans live healthier lives.

Troubling Signs

In recent years the once-clear split between basic research and commercial activity has become increasingly blurry. Many purely theoretical advances were funded at industrial research centers such as Xerox's PARC and AT&T's Bell Labs, with little or no regard for the commercial applications of the work being carried out. Research centers like these are increasingly seen as a thing of the past, as companies demand a better return on their investment.

Meanwhile, the laws governing the use of public funds, notably federal research grants, have increasingly allowed commercial interests to take the resulting research and patent it immediately. Changes to these laws had long been demanded, notably by the pharmaceutical industry, in order to get products into the marketplace faster. The industry argues that, since it will be providing the drugs to the public anyway, there is no reason that developments made with public funds should not be commercialized as soon as possible.

However, this has also led to a number of abuses that have raised the ire of consumers. One case in particular, the development of AZT, deserves mention. AZT was originally developed with public funding in the 1960s as a treatment for cancer, at a time when cancer was thought to be caused by a retrovirus. When the drug's efficacy against AIDS was demonstrated in the 1980s, Glaxo-Wellcome immediately patented it, generating huge profits from an invention it had spent essentially none of its own money to develop.

Meanwhile, the patent process itself is increasingly seen as being abused. After years of complaints about what could and could not be patented, the United States Patent and Trademark Office relented and allowed mathematical formulas and computer programs to be patented. The result has been a rush of dubious patents that have strained the system, filling it with patents that many consider invalid.

These developments are often seen as a troubling pattern, in which commercial interests are given increasing power over the direction of basic research and over the applications that come out of it. It is not yet clear what the effects of these changes will be.