Electric industry history

The word electricity is derived from the Latin word electricus, which means "produced from amber by friction." As long ago as 600 B.C., the Greeks knew that amber could be charged by rubbing, and it is Thales of Miletus whom many credit with the first recorded observations of electricity. He noticed that amber rubbed with a material such as cat fur acquired a force that could attract light objects like feathers.

While man has long known that the phenomenon we now call electricity existed, it was not until much later that it was studied and ultimately put to use. In 1600, the English scientist William Gilbert described the electrification of many different substances and coined the word electricity. By 1660, scientists had invented a machine for producing static electricity by rotating and rubbing a ball of sulfur.

One of Gilbert’s Experiments (Source: NASA)

As society entered the 1700s, scientists began the work of harnessing electricity. In 1729, the conduction of electricity was discovered, and subsequent work identified substances that would act as conductors. And in 1745, the Leyden jar was invented. This device contained a glass vial partially filled with water and a thick wire that could conduct an electrical charge. The Leyden jar was notable in that it was the first device that allowed electric charge to be stored and later discharged all at once (leading ultimately to the concept of electric current). It was also during this time that Benjamin Franklin demonstrated that lightning was a form of electricity, an insight that led to his invention of the lightning rod.

The Leyden jar (Source: commons.wikimedia.org)

In 1799, copper and zinc plates separated by cardboard soaked in salt water were used to create the first continuous and controlled source of electricity, the forerunner of our modern-day battery. Having a steady source of electricity pushed researchers’ capabilities into a new realm, and by the mid-1800s we had laws that described the basic behavior of electricity (Ohm’s law and Kirchhoff’s laws) as well as various electric technologies such as the electromagnet, the electric motor, the electric generator, and electric arc lights. We also saw one of the first practical uses of electricity — the telegraph, invented by Samuel Morse around 1840.

As the world entered the industrial age, scientists and engineers pressed to create additional practical uses of electricity. The development of the self-excited dynamo, a generator that uses part of its own output to energize its field magnets, made it feasible to create small-scale generating stations, and by the mid-1870s electric arcs were lighting the streets of Paris, London, and New York. Arc lights, however, were too powerful for indoor use, and widespread use of electricity awaited Thomas Edison’s development of a practical incandescent bulb in 1879. Soon after this breakthrough, Edison patented his design for an electrical distribution system. On September 4, 1882, the Edison Electric Illuminating Company opened its first central generating station at Pearl Street in Manhattan, and the era of the electric utility was born. Soon thereafter, a number of small distribution systems were created, and Benjamin Harrison, elected in 1888, became the first president to have electricity in the White House (although, as the story goes, he and his wife were afraid of being shocked by the light switches, so they opted to use the gas lights instead). Use of electricity to power street railways also became common, and by 1889 there were 154 such systems in the U.S.

Pearl Street power station in New York
(Source: Museum of Innovation and Science Schenectady)

Broader development of electric utilities was hamstrung by the fact that initial systems were direct current (DC). DC systems at voltages safe for home use cannot efficiently distribute power for more than about a half mile due to losses of electricity along the distribution wires. At that time, raising voltages to transmit power and then lowering them again for household use was too expensive to be feasible.

The breakthrough that led to today’s modern utility came in 1888 when Nikola Tesla created workable alternating current (AC) generators and motors. Unlike the DC systems of the day, AC power can be stepped up by transformers to high voltages for efficient transmission over long distances and then stepped back down to voltages safe for end use. Using Tesla’s patents, George Westinghouse received a contract to construct a power plant at Niagara Falls. The plant opened in August 1895 with two 3.7 MW generators. Initially the power was used locally for the manufacture of aluminum and carborundum, but in 1896 a 20-mile transmission line was constructed to Buffalo, where the power was used for lighting and street cars.
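
To see why transmission voltage matters, consider a simplified, illustrative calculation (not drawn from the original account; R, P, V, and I are simply standard circuit quantities). For a line with resistance R delivering power P at voltage V, the current drawn and the resistive loss in the wires are roughly:

    I = P / V
    P_loss = I² × R = (P / V)² × R

Raising the transmission voltage by a factor of ten therefore cuts line losses by a factor of one hundred for the same delivered power. Because transformers made such voltage changes cheap and efficient only for AC at the time, alternating current opened the door to long-distance transmission.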

Soon the electrical giants Westinghouse and General Electric came to dominate electric power technology, and in the years between 1900 and the First World War new electrical appliances such as the refrigerator, washing machine, vacuum cleaner, and radiant heaters as well as improved light bulbs led to increasing demand for electricity. The concept of the vertically integrated utility that owned and operated electric generation, transmission, and distribution soon dominated the industry, leading to the need for regulation and, in some areas, municipalization of the utility function. Meanwhile, as part of the Depression-era New Deal, the federal government began construction of numerous federal hydro power projects.

Hoover Dam under construction (Source: Bureau of Reclamation)

Between 1945 and 1965 the utility industry continued to grow and investors became familiar with the steady, if unspectacular, returns provided by utilities. The technology for central generating units powered by coal, fuel oil, and natural gas matured, providing for increased efficiency and lower costs. Customers remained generally satisfied as technological innovations boosted transmission and distribution reliability at the same time that rates were falling.

The year 1965 marked the start of a new era. The Northeast blackout in November 1965 left over 30 million customers — including all of New York City — without power. This was perhaps our first realization that we had become dependent on an interconnected system that was less reliable than assumed. As we entered the 1970s, many utilities became enamored with the potential of nuclear energy, and construction began on a number of large nuclear generating units. Shocks soon to follow included the Arab Oil Embargo of 1973-1974, the Three Mile Island nuclear accident in March 1979, and subsequent rapidly increasing generation costs for many utilities.

Cooling towers at the Rancho Seco nuclear plant in California, which was shut down in 1989 after a public vote

By the 1980s, many utilities were burdened with high debt levels and interest rates, incomplete power plant projects, slowing growth in the demand for electricity, the need for substantial electric rate increases, and increasing environmental concerns. Suddenly utilities found themselves portrayed in a negative light. In the 1990s and 2000s, the turmoil continued. Encouraged by deregulation in the natural gas, airline, transportation, and some foreign electric industries, free market advocates began pushing for competition in the electric industry. This led to the breakup of vertically integrated utilities in some regions, the spectacular rise and fall of marketing companies such as Enron and Dynegy, financial difficulties driven by expansion of utilities into unregulated activities, and a U.S. marketplace split into varying market structures. Other regions of the world became equally fragmented, with some countries pushing competitive markets and others sticking with monopoly utilities.

California ISO control room (Source: caiso.com)

In the mid-2010s, rapid industry evolution continued. Driven by technological and manufacturing advances, the cost of renewables fell significantly, and wind and solar became the cheapest sources of new generation in many regions of the world. The cost of batteries for grid storage also began dropping quickly. In some cases, combinations of renewable generation and batteries became competitive with the costs of operating existing nuclear or fossil fuel power plants. In the U.S., falling renewable costs combined with low natural gas prices resulted in a dramatic decrease in coal-fired generation and contributed to the closure of several nuclear power plants.

Growth in distributed energy resources (DERs) including rooftop solar, combined heat and power (CHP), demand-side management (DSM), electric vehicles, behind-the-meter storage, and smart distribution systems resulted in utilities and their regulators rethinking traditional business models. And in most parts of the world, the industry acknowledged the need to transition away from fossil fuel generation in response to concerns about global climate change. Today it seems certain that the pace of change for the industry has accelerated and that energy companies will continue to rapidly evolve.

Wind farm in Aragon, Spain

Up next …