11 Reasons Why We Still Need Earth Day

US Senator Edmund Muskie, author of the 1970 Clean Air Act, addressing an estimated 40,000-60,000 people as keynote speaker for Earth Day in Fairmount Park, Philadelphia on April 22, 1970.

Can you believe it? We’ve just observed the 45th Earth Day. American society as a whole has not always given much thought to what people do to the environment. But you do now, and you probably wouldn’t if Earth Day had never captured the popular imagination.

In 1969 Senator Gaylord Nelson conceived the idea of holding a national teach-in on environmental issues and picked the date of April 22, 1970.

The event, named Earth Day by one of the college students who helped coordinate events, succeeded beyond anyone’s imagination.

Earth Day has been observed on April 22 every year since then. Why was Earth Day necessary in 1970, and why is it still necessary now?

1. The economy depends on wasteful spending

Have you ever heard economists and politicians complain that consumers don’t spend enough? Probably every time there is general discussion about economic troubles. It’s the number one complaint and has been since the Eisenhower administration.

The American economy suddenly faced a new problem in the prosperity that followed the Second World War. Always before, humans had to face problems of scarcity.

Postwar America had factories churning out products faster than people could buy them. They couldn’t export much, because no one else in the world had any money left over after the costs of recovering from the devastation of war.

Couldn’t factories scale back production to match demand? Business and government didn’t think so. By 1960 the amount of goods a single worker could produce was growing about 3% annually. Cutting production, it seemed, would only put people out of work and bring back Depression-era conditions.

No surprise that advertisers found ways to induce people to buy more and more. The government encouraged people to buy more and more. Recessions occurred because people weren’t spending enough. By 1960 the average American consumed twice as many goods as before the war.

Nearly all Americans already owned cars and refrigerators and so on. How could business sell to those customers? They could sell replacements by making products that broke down more readily than older ones. They could induce consumers to buy more than one of each item. They could introduce technological improvements, or at least noticeable changes, to make what people already owned obsolete.

They did all that and more. They made disposable products. They made faddish junk, from lava lamps and mood rings in the 1960s to plastic singing fish several decades later.

Franklin Roosevelt urged austerity and restrained spending during the war years. The economy depended on it. After 9/11, George W. Bush became at least the tenth consecutive President to urge people to go shopping. The economy depended on it.

Of course that means using up a lot of natural resources and raw materials to make stuff that we’ll throw out sooner rather than later.

2. The government encourages wasteful spending

F-15C fires AIM-7 Sparrow to contain the Russians and boost factory production.

The military gladly paid hundreds of dollars for parts available in stores for pocket change.

If anything needed painting, the services ordered more paint than necessary for the project and put the surplus in a warehouse, where no one would ever look when more paint of the same color was wanted.

The whole concept of economic growth as a national goal dates from the postwar years. Both Democrats and Republicans called for more growth. They differed only on how to achieve it.

Everyone cited the supposedly rapid growth of the Soviet Union as the reason the U.S. needed to grow, despite the fact, acknowledged by at least some economists, that Russia, with only about 5% of American production of consumer goods, had no chance of catching up with the American economy even if it was growing a little faster.

Defense spending amounted to 10% of the total output of goods and services. The stock market went into a tailspin at any hint of a reduction in defense spending. Politicians could hardly advocate cutting back on it. Their rhetoric pointed to the Russian threat, but in fact, any reduction in the arms industry would put some of their constituents out of work.

The federal government encouraged overproduction in agriculture. It spent billions of dollars every year enabling farmers to grow much more food than Americans could eat. Then it spent millions more on metal storage bins to hold it all.

New agricultural technology starting in the 1930s had greatly increased agricultural yields, the beginning of the so-called green revolution. Healthier plants required more water and fertilizer.

Manure would have been the best fertilizer, but fertilizer made from petroleum was less expensive and had the advantage of an industrial marketing campaign behind it. Farmers poured truckloads of synthetic fertilizer into fields that should have been allowed to lie fallow.

No one in positions of political power and influence attempted to explain how production of pinball machines or cleaners designed to clean only one thing (windows or floors or countertops, but nothing else) would add to our military preparedness.

No one attempted to distinguish between desirable and undesirable growth. No one questioned that it was a good idea to put up thousands of houses on prime farmland. Everyone assumed that growth, any growth, was inherently a good thing.

Conditions have changed. Most important, we are no longer the only successfully operating economy in the world. Yet probably most of the subsidies granted in those years are still in effect long after any justification for them has vanished.

For example, subsidizing ethanol from corn started as a way to keep the agricultural economy growing by dealing with annual surpluses. Now that corn could be put to better use as food, but no farm-state politician can afford not to fight for the ethanol subsidies.

3. America became a nation of suburbs

Suburban neighborhood: not sustainable


Lots of people got married and started families after the war. Returning soldiers married, at older ages than most previous generations had married for the first time. Men too young to serve in the war also came of age and married soon afterward. It didn’t take long for the baby boom to start.

Housing starts in the last half of the 20th century were 350% higher than in the first half. There were a record 1.9 million housing starts in 1950, of which 1.7 million (88%) were single-family houses.

Where was there land for all of those houses? Certainly not in established cities. Developers plowed up farmland and bulldozed forests in an expanding ring of suburbs surrounding established cities.

Many of them had names like Oak Forest or Elmwood or Walnut Cove or Pineville or Farmington. In other words, many suburbs took their names from the landscape that had to be obliterated to make room for them.

It wasn’t exactly rural living, because too many people were crowded together for that. But it shared one disadvantage of rural living in that everywhere people wanted to go was a long way away.

Those were good years for the auto industry and the road construction industry, not just the housing industry. New houses in 1950 averaged 1,065 square feet, of which 983 square feet was finished space.

They were small partly because the economy was still recovering from Depression and wartime conditions, partly because they were built mostly for first-time homebuyers. Once the baby boom got rolling, people started needing bigger houses.

4. No one could walk in new neighborhoods

The typical city had a major downtown and then smaller shopping areas scattered throughout the neighborhoods. People could walk to the stores, to schools, churches, parks, and to neighborhood movie houses. Many could walk to work. Anyone could walk through the neighborhoods just for the sake of walking.

The new suburbs, and new neighborhoods built in the cities, had no shopping areas or much of anything else within walking distance of the homes. Often the streets had no sidewalks. Even taking a walk for enjoyment meant getting in the car and driving somewhere, even short distances.

Traffic congestion resulted from so many people driving. Stoplights relieved the congestion by making cars stop and idle for a while.

Unlike older neighborhood stores, both big box stores and shopping malls (even little strip malls) require large parking lots. Those lots at least make us walk a little, but they are impervious to rain.

When rain lands on soil, it soaks in, at least until the soil becomes saturated. When it lands on roofs, streets, or parking lots the size of our ancestors’ farms, it runs off and contributes greatly to urban flooding.

5. Smaller households live in larger houses

In 1972, housing starts reached a new record, 2.4 million units, but 44% of that construction was for multi-family housing: apartments or condominiums. The single-family homes built that year had an average finished area of 1,634 square feet, a 66% increase over the 1950 average of 983 square feet of finished space. At least to some extent the newer homes had more rooms: 65% had three or more bedrooms; 23% had four or more bedrooms.

Half of them had at least one and a half bathrooms. I have not found statistics about lot sizes, but I suspect that these larger houses were built on larger lots.

And yet the average household size, the number of people living together in a house, had begun to decrease. Much of the multi-family housing built in the early 1970s was essentially student housing, rented by baby boomers, that is, the very students who provided so much of the energy for the first Earth Day.

Multi-family housing was built mostly in already developed areas. The number of new single-family homes was actually lower in 1972 than in 1950. Conversion of open space to housing probably remained fairly steady during the interval.

The students of the 1970s were the baby boomers. Students traditionally have little money of their own, so it was easy for them to recommend that other people spend less. But the recommendation faced two problems. White people made it, and black people felt that it was just another way of holding them back. And then all the students grew up to start their own careers and families.

If they grew up in houses built in the 1950s, the average home had only 983 square feet of finished space. The average finished space in new single-family homes of the early 1970s had grown to 1,634 square feet, which was still less than the average size of new houses in the prosperous 1920s.

Did the size of houses begin to level off? Not until the housing crisis that began in 2007. By that time, average single-family house size exceeded 2,200 square feet. After dropping slightly for about two years, the average size of a home started to increase again, and it now tops 2,400 square feet.

Meanwhile, the size of the average household had already decreased between 1950 and 1970 and has continued to decrease. Larger houses occupy still larger lots with fewer people living in them. Within living memory, reel mowers could adequately care for the average yard. As yards grew, power mowers proliferated. At first, it was still necessary to push the mower across the lawn. More and more homeowners now require riding mowers just to tend to grass in reasonable time.

Stuff multiplies to take up available space. With fewer and fewer people living in bigger and bigger houses, some people have accumulated so much stuff that they have more than their house can hold. So they rent storage space for the overflow, and likely as not have only a hazy notion of what they have put there. People still value personal convenience above adopting eco-friendly lifestyles.

6. We misunderstand convenience

I remember a Windex commercial from my childhood that showed a housewife preparing to wash windows. She mixed ammonia and water in a bucket, wrinkled her nose at the smell, and carried the bucket, the stinky ammonia and water mixture sloshing out with every step.

Windex was so much more convenient. Just spray and wipe! Anyone who has ever washed windows with ammonia and a squeegee knows that it gets windows cleaner faster. Spraying and wiping leaves streaks. If you wash both sides of a window, which side are the streaks on?

Several years after the last time I saw that commercial, Windex advertised a new and improved formula: now with ammonia!

American companies made more than just dedicated window cleaners. They made a wide array of cleaners that each cleaned only a few kinds of surfaces. All of them were much more convenient than the old-fashioned products your grandmother had to use.

They take up a lot more room in the kitchen and bathroom cabinets, too. And when you’ve used them up the empties take up a lot more room in the landfills.

I mentioned earlier that neighborhood design made it necessary for people to drive everywhere, but having a car to drive at all seemed like a marvelous convenience. TV dinners and fast food restaurants also seemed like marvelous conveniences. The busy housewife no longer had to cook.

The first TV dinners tasted awful. The first fast food restaurants had limited selection. But they were so convenient! Manufacturers learned how to make tastier foods.

They also added the convenience of such a soft texture that it’s hardly necessary to chew it. The layers of fat, sugar, and salt that they concocted in their experimental kitchens turn out to be addictive.

To top it all off, the fast food industry has given us the convenience of not even having to get out of the car to get food. All we have to do any more is speak our order into a microphone and then drive to a window or two where we pay and get handed a sack of food.

How convenient is it really if you happen to be the fifth or tenth car lined up for a turn at the microphone, getting zero miles to the gallon as you sit and idle, breathing everyone else’s exhaust?

Among other things, our cars are making us fat: we get no exercise while driving to pick up high-calorie convenience foods. So to stay in shape, we need gym memberships and/or equipment and DVDs at home.

We have never learned to count the cost of convenience or ask if what seems like convenience really is convenient. http://sustainingourworld.com/?s=cost+of+convenience&x=17&y=11

7. Americans stopped separating garbage

Gulls at an open dump. Just imagine the rodents and insects!

Before about 1960, people routinely separated wet garbage from trash. It required two different collections on two different schedules.

They burned waste paper in a back yard incinerator, and likely as not the wet garbage got burned there, too. During the war people could turn in scrap materials, which factories recycled to make new products.

In Los Angeles, people loved their back yard incinerators despite the city’s notoriously dirty air, but in 1955, a county ordinance limited burning to specific hours of the day. By the end of the decade, the county banned the practice entirely. Now the people of Los Angeles had to deal with three collection schedules.

Sam Yorty became mayor in 1960 on the promise of ending source separation. He promised that the poor, inconvenienced housewives of Los Angeles could mix all their refuse in a single container and only have to deal with one collection.

Los Angeles would adopt a more modern waste management practice, the landfill. After all, it had plenty of canyons that weren’t good for anything else. The environmental hazards of landfills didn’t become objectionable until the mid 1970s.

In preparation for Earth Day, a group in California walked from Modesto to Sacramento and installed recycling drop-off centers at every stop. By the end of 1970 about 3,000 such centers were installed nationwide, but after a decade of not separating trash, they seemed too inconvenient.

Eventually municipalities began to collect recyclables at the curb. People only had to separate paper, glass, and metal. (Plastic recycling started only much later.) Curbside recycling was more convenient than drop-off centers, but not enough to encourage much preparation.

Now we can commingle recyclables in one container as we commingle trash in another. Food waste slopping out of inadequately rinsed containers contaminates high-quality recyclable paper so that it can’t be used to make recycled paper. But it’s more convenient than separating recyclables.

More people participate now, but the vast majority of recyclable material still winds up in the trash. Surely more people, perhaps everyone, would participate in recycling programs if Yorty had never eliminated source separation.

Unfortunately, some people have always found trash cans too inconvenient, so they just drop whatever they don’t want any more wherever they decide they don’t want it. Litter is more than just unsightly. It causes multiple environmental problems.

8. We don’t think through our use of technology

Coal smoke stack

Too often, we see a problem, devise a solution, and then find that the solution simply creates another problem. So we find a patch for that one. Here is but a single example:

  1. Benjamin Franklin, one of America’s earliest environmentalists, advocated burning coal instead of firewood to preserve forests.
  2. Burning coal leaves two kinds of ash: bottom ash, which stays at the bottom of the chimney, and fly ash, which floats out of the top of the smoke stack. Larger pieces fall as soot, dirtying whatever they land on, say laundry that has just been hung out to dry. Smaller pieces remain in the air, where they can contribute to smog. Perhaps it wouldn’t have been so bad if confined to households, but the entire Industrial Revolution depended on burning coal.
  3. People only started objecting to soot some time in the 1940s. Federal clean air legislation passed in the wake of the first Earth Day mandated scrubbers to eliminate fly ash.
  4. Coal-fired plants still generated as much ash as ever, but now the owners had to deal with the captured fly ash as well. So they dug pits, dumped ash into them, and mixed it with water so that it wouldn’t get into the atmosphere.
  5. Ash pits, like any other kind of landfill, leach potentially toxic chemicals into the ground and ultimately the ground water. We have discovered that fairly recently.
  6. It seems that all ash pits have been built near rivers or other bodies of water. We have known for a long time that dams can fail. We didn’t know what would happen when lots of coal ash got into a river until a dam near Kingston, Tennessee owned by the Tennessee Valley Authority burst in 2008. The collapse sent 5.4 million cubic yards of ash slurry into two rivers and inundated 300 acres of land.
  7. Someone noticed a potential problem at the Dan River Steam Plant in the 1940s and took steps to solve it. The problem was that the ash pit lay between the river and a depression where rainwater collected. A storm could cause rainwater to enter the pit, overtop the dam, and send coal ash into the river. The solution? Tunnel under the pit to install a drainage pipe—made of corrugated metal. That bit of foresight and cleverness worked very well—until the pipe rusted out and collapsed in 2014. That coal ash spill was much smaller than the one in Kingston, but still very serious. http://bit.ly/RiverRanGray
  8. Engineers are trying to figure out how to get all that coal ash out of all of those pits. Will they do their homework? Or will their plan lead to a new kind of manmade environmental disaster in another half century or so?

9. The air made people sick

Smog arrived in the Los Angeles Basin very early in the 20th century. The Native American name for the area is something like “the land where the campfire smoke never goes away.” In 1903, residents thought the thick haze was a solar eclipse. The later vogue for trash incineration, not to mention the city’s eventual car culture, only made a bad situation worse.

For some reason, no one made a connection between all the burning and the foul air. July 26, 1943 became known as “Black Monday” in Los Angeles. The smog engulfed downtown and reduced visibility to three blocks. The noxious fumes choked pedestrians. A chemical plant bore the brunt of the blame, but air quality did not improve when it was shut down.

Los Angeles may have had the worst overall air quality in the country, but it by no means had a monopoly. Killer smog descended on Donora, Pennsylvania for five days, October 27-31, 1948. The town of 14,000 people on the Monongahela River sits in a valley. The smog sickened thousands and killed 20.

Pesticides caused a different kind of air-quality problem. In 1939, a scientist who later won a Nobel Prize for his work discovered that a substance called DDT could kill insects. During World War II soldiers applied it to themselves in powder form to control lice. After the war, DDT was sprayed from airplanes onto large land areas, without the owners’ permission, to control fire ants and mosquitoes.

The scientific community began to realize that the chemicals killed not only insects but also animals, birds, and fish. DDT weakened birds’ eggshells, and some species, such as the osprey, began to die out from inability to reproduce.

Housewives had their own concerns, as they witnessed squirrels and birds dying in their backyards after aerial applications of DDT and found their own children getting sick. Dairy farmers in upstate New York whose farms were sprayed to eradicate gypsy moths found their milk banned from the market.

Rachel Carson became interested in DDT and other synthetic pesticides when wildlife biologists in the U.S. Fish and Wildlife Service, where Carson worked as science editor, began to express concerns about their effects on birds and plants. Carson’s 1962 book Silent Spring helped spark the public interest in the environment that led to the first Earth Day.

10. The water made people sick

Algae bloom with dead fish

Towns and industry dumped raw sewage and industrial waste into rivers, lakes, and the ocean for generations. Then someone got the bright idea to treat the sewage first.

Decades of raw sewage, industrial waste, agricultural runoff, and household laundry created a vast dead zone in Lake Erie, the shallowest of the Great Lakes.

By the 1960s, though, raw sewage was less of a problem. Ironically, treated sewage might have caused even more damage.

Sewage technology at the time took advantage of part of the aquatic cycle by converting noxious organic human wastes into inorganic materials. Then it failed to take the other half of the aquatic cycle into account by discharging them into rivers and lakes instead of returning them to the soil. (That method, fortunately, is long obsolete.)

Algae fed on the inorganic material, and when presented with such an abundance of food, proliferated at the expense of the ecosystem as a whole.

That explains at least in part how modern technology nearly killed Lake Erie. Large algae blooms in the middle of the lake depleted the oxygen. Nothing could live under their surface. It was declared dead by 1970. Long before that, dead fish washed up on shore all around the lake.

Households contributed to water pollution every time anyone did a load of laundry. Detergents relied on phosphates, the very substance that killed Lake Erie. Wastewater treatment plants could not deal with phosphates. Fish died in rivers nationwide. Phosphates did not break down in backyard septic tanks or cesspools, either, and eventually seeped into the ground water.

Whether from rivers or wells, much of the nation’s tap water came with suds. It smelled and tasted bad. No one knew for sure if ingesting phosphates in drinking water posed a human health hazard, but a survey by the Department of Health, Education, and Welfare of 939 water systems in 1969 found that many (but by no means most) had dangerous levels of fecal bacteria and heavy metals.

It documented cases of illnesses caused by using tap water. Most water systems tested proved adequate, but that was not because of well-trained water system operators or rigorous inspections.

More than three quarters of plant operators lacked adequate training. Nearly half did not understand chemistry related to water treatment. Almost 80% of plants were not inspected, and another 10% did not meet whatever government standards existed for frequency of inspection.

11. Chemicals made people sick

I have already mentioned DDT and its effect on air quality. Public concern about food safety started to grow in the 1950s, which led to two amendments to the federal Food, Drug, and Cosmetic Act. The Miller Amendment empowered the FDA to establish zero tolerance for certain chemicals and forbid the sale of any food that contained any detectable quantity of them.

The Delaney Clause regulated food additives by forbidding the use in food of any substance known to cause cancer in either humans or lab animals. Careless interpretation of that clause led to the great cranberry scare of 1959.

In November, Secretary of Health, Education, and Welfare Arthur Flemming warned consumers that traces of aminotriazole, a weed killer and not an additive, had been found in cranberries grown in Oregon and Washington.

He urged Americans to avoid eating cranberries until the new crop could be tested, just to be safe. That crop had set records for abundance, and cranberry growers anticipated record sales. It was, of course, impossible to complete the tests between then and Thanksgiving. Flemming’s warning resulted in record low sales of cranberries.

Flemming based his warning on three serious errors in judgment.

  • The herbicide was found only in those two states, which produced only a small portion of the nation’s cranberry crop.
  • Later research determined that only a minuscule amount had been found.
  • Flemming completely misread the clause, which prohibited the sale of food that contained dangerous chemicals. Aminotriazole had been detected in raw cranberries, not in any finished cranberry products available for sale.

The landmark legislation that appeared in the 1970s as a result of the first Earth Day largely took care of all of the environmental symptoms that had come to public attention over the previous 25 years. The root problems, unfortunately, seem to be hardwired into human nature.

Earth Day has since become more of a grassroots effort to take care of the environment locally than an occasion for speeches and mass gatherings. For example, it gives you a chance to join with others in cleaning litter from highways and streams.

Every Earth Day can remind you of your own commitment to living sustainably. It’s a chance to talk to your family, friends, and neighbors about it. And then roll up your sleeves and do something special.

As long as people care more about their own personal convenience than about the living conditions in their community, or introduce new chemicals and technologies without adequate understanding of the consequences, we will need the annual opportunity to clean up some of the damage and think of more ways to live sustainably as a part of nature.

If you enjoyed this article, be sure to get my new book, Before and After the First Earth Day, 1970.

Photo credits:
Senator Muskie Earth Day speech. Public domain from Wikimedia Commons.
F-15C fires AIM-7 Sparrow. Public domain from Wikimedia Commons.
Suburbia. Public domain from Wikimedia Commons.
Cleaning product aisle. Some rights reserved by David 23.
Garbage dump. Source unknown
Coal smoke stack. Some rights reserved by Señor Codo.
Algae bloom with dead fish. Some rights reserved by Greenpeace China.
