By the turn of the nineteenth century, research into solar energy and the development of solar-powered technology had focused on concentrating light with lenses and mirrors. This line of research continued through the 1800s with the development of solar-powered steam engines and water pumps. Alongside this older path, however, a new understanding of solar energy was emerging with the discovery of the photovoltaic effect: the generation of electrical current in certain materials upon their exposure to light. It was this discovery that ultimately paved the way for a much larger-scale use of solar power.
In essence, the photovoltaic effect occurs when sunlight strikes certain materials known as semiconductors: substances that are poor conductors of electricity in their natural state but become excellent conductors under particular conditions, such as exposure to heat or light. When sunlight strikes such a material (silicon, for example, the most common semiconductor used in solar technology today), electrons are freed and travel through the material, creating an electric current. The photovoltaic effect was first observed in 1839 by a young French scientist, Edmond Becquerel, but it would be decades before the process was well understood and fully developed. The key turning point came in the 1870s, when scientists discovered that selenium was a semiconductor and would generate electricity when exposed to sunlight. This paved the way for the invention of the first selenium solar cell by the American scientist Charles Fritts in 1883. The problem, however, was its utter lack of efficiency: Fritts estimated that his cell converted no more than 1% of the light's energy into electricity. Thus, while his invention represented an important scientific milestone, it did not generate enough electricity to be practical or cost-effective. Scientists, including Albert Einstein, continued to study the photovoltaic effect, but the breakthrough that would translate this understanding into a viable source of electricity remained elusive.
Several major developments after the Second World War changed the course of solar energy research. The first was the discovery that silicon, one of the most abundant elements on Earth, was an excellent semiconductor. The first silicon solar panel, built in 1954, converted 6% of the sunlight it received into electricity: far more efficient than the earlier selenium model, and efficient enough to make it a workable source of energy. The second development was a tense mixture of science and international politics: the so-called Space Race between the United States and the Soviet Union, which began in earnest with the launch of the Soviet satellite Sputnik in 1957. Satellites and manned spacecraft required a source of power, and solar energy, available in abundance in outer space, made the most sense (a trend that continues to this day with the International Space Station). This development was crucial because it ensured that research and development of solar technology would continue through the 1950s and 1960s, no matter the financial cost: in the highly charged political atmosphere of the Cold War, both sides viewed the cost of falling behind in science and technology as unacceptable.
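To put the two efficiency figures above in perspective, a rough back-of-the-envelope sketch can show the practical gap between Fritts's 1% selenium cell and the 6% silicon panel of 1954. The irradiance value of 1,000 W per square metre is an assumption (a common modern reference figure for full midday sun, not something stated in the passage); only the 1% and 6% efficiencies come from the text.

```python
# Rough comparison of early solar cell power outputs.
# Assumption: a reference irradiance of 1000 W/m^2 (approximate full sun);
# the 1% and 6% efficiencies are the historical figures cited in the passage.

IRRADIANCE_W_PER_M2 = 1000  # assumed full-sun irradiance, W per square metre


def panel_output_watts(area_m2: float, efficiency: float) -> float:
    """Electrical power produced by a panel of the given area and efficiency."""
    return IRRADIANCE_W_PER_M2 * area_m2 * efficiency


selenium_1883 = panel_output_watts(area_m2=1.0, efficiency=0.01)  # Fritts's cell
silicon_1954 = panel_output_watts(area_m2=1.0, efficiency=0.06)   # first silicon panel

print(f"Selenium (~1%): {selenium_1883:.0f} W per square metre")
print(f"Silicon  (~6%): {silicon_1954:.0f} W per square metre")
```

Under these assumptions a square metre of the 1883 cell would yield only about 10 W, while the 1954 silicon panel would yield about 60 W, which helps explain why the sixfold efficiency gain made silicon the first workable option.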