Global warming used to be the defining term for the increase in the earth's average temperature over the past 100 years. Recently, the more politically popular term, climate change, has replaced global warming. Why? One main reason is that the earth is currently cooling.
“Global warming” obviously entails an increase in global average temperature, whereas “climate change” is about much more than just temperature. “Climate change” can represent just about anything, which is handy when the earth doesn’t happen to cooperate with climate models predicting future climate catastrophe. If the earth gets too cold, if it gets too hot, or if there happens to be a slight increase or decrease in storm, drought, or precipitation frequency or intensity, all of these events can be blamed on “climate change.”
It is hard to advocate for overbearing regulations that attempt to reduce energy use and greenhouse gas emissions when global temperatures have been stable or declining. In fact, over at least the last seven years, global temperatures have declined despite increases in atmospheric carbon dioxide. As it turns out, the term “global warming” is a little inconvenient for doomsayers predicting runaway global temperatures. Using the term “climate change” supports the modern-day witch hunt that allows any weather anomaly to be blamed on human activity.
The truth is that climate always changes. Every year, decade, and century is different from the last. If climate policies are honestly aimed at reducing global temperatures, then governments should stick to the term “global warming” instead of using the ambiguous term “climate change” to regulate every sector of the economy, regardless of the actual temperature of the earth.