watt-hour itself is already a somewhat silly unit, since the joule already exists and is part of the definition of the watt. but people can easily multiply an appliance's watt rating by the amount of time they use it to estimate its monthly energy use, which makes it a convenient unit for billing. I would guess kWh/yr is an extension of this idea. you could use J/yr, but most people don't have an intuitive sense of how much power that is.
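the arithmetic is easy to sketch. a minimal example, with made-up wattage, hours, and price (not real data):

```python
# Estimate monthly energy use and cost from an appliance's watt rating.
# All input values are illustrative, not real figures.
watts = 1500          # e.g. a space heater's rated power
hours_per_day = 4     # how long it runs each day
price_per_kwh = 0.15  # example electricity price in $/kWh

kwh_per_month = watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh

print(f"{kwh_per_month:.0f} kWh/month, ${cost_per_month:.2f}/month")
# -> 180 kWh/month, $27.00/month
```

the same estimate in J/yr would be about 2.3 GJ, which is exactly the kind of number nobody has a feel for.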
The USA more or less fully adopted the metric system in the 90s. All food is labeled in metric units for example. We just never abandoned imperial weights and measures. This leads to some amusing things like large bottles of soda and wine being measured in metric, while cans are measured in ounces. Then again we're not alone. The UK is considered officially metric, but they still pour beer in pints and weigh people in stone.
There are good arguments to be made for everyday use of Fahrenheit over Celsius. The 0-100 range in F covers pretty much all of the temperatures one is likely to experience in the USA. It also has finer resolution than degrees Celsius in everyday weather temperature ranges.
The SI units are designed for scientific and engineering use, and they excel at the purpose for which they were designed.
> There are good arguments to be made for everyday use of Fahrenheit over Celsius. The 0-100 range in F cover pretty much all of the temperatures one is likely to experience in the USA.
You're mostly right, but there are plenty of places in the US where the daily low temperature drops well below zero for significant parts of the year.
I guess the inconvenience with Celsius is that you're more likely to go into negative numbers? Is there a problem other than having to say "minus ten degrees" vs "fourteen degrees"?
> It also has a finer resolution than degrees Celsius in everyday weather temperature ranges.
I'd make the opposite argument about everyday use. Most people I know would barely differentiate 25C from 26C, so there's not really a need for finer gradations. And the 10 degree increments are surprisingly convenient:
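a quick sketch of where those round 10 °C steps land in Fahrenheit, using the standard conversion F = C × 9/5 + 32 (the original table isn't reproduced here):

```python
# Map round 10 degree Celsius increments to Fahrenheit: F = C * 9/5 + 32.
for c in range(-10, 41, 10):
    f = c * 9 / 5 + 32
    print(f"{c:>4} C = {f:>5.0f} F")
# -10 C = 14 F, 0 C = 32 F, 10 C = 50 F,
#  20 C = 68 F, 30 C = 86 F, 40 C = 104 F
```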
I like that table, and when I’ve been in Europe it’s pretty much what I’ve used. Oddly though while 30 degrees in southern Germany feels unbearably hot, 86 degrees in southern California is warmer than ideal, but not what I’d call scorching. I wonder to what extent it’s physical factors besides temperature and if there is a psychological effect from the temperature scales.
And height as well. In metres, almost everyone is 1.x metres tall (rarely 2 m), so the leading digit tells you almost nothing. In feet, the first digit actually varies between people. And a foot is, after all, about the length of a foot. More natural.
I'd imagine it has something to do with the timeframe that terms describing units of electricity came into common usage.
The U.S. is a signatory of the Metre Convention (1875). All of our customary units are defined as linear conversions of corresponding SI units. The watt and joule (also the ohm) were not standardized until 1893. Funnily enough, the international conference where these were officially decided happened in Chicago.
Because a home that uses 11000 kWh/yr doesn't use 1.25 kW. During low-usage times (the middle of the night) it uses substantially less, and at peak times it uses substantially more. And even those values shift with time of year and weather. kWh/yr implies you are averaging energy use over a substantial period of time, whereas kW is ambiguous on its own. MJ/yr would work as well and have a similar order of magnitude, but it doesn't offer any real advantage.
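the averaging is simple to check. a minimal sketch, using the 11000 kWh/yr figure above:

```python
# Convert an annual energy total into an average power draw.
kwh_per_year = 11000
hours_per_year = 365 * 24  # 8760 hours

avg_kw = kwh_per_year / hours_per_year
print(f"{avg_kw:.2f} kW average")
# -> 1.26 kW average (close to the 1.25 kW quoted above)
```

the average is well defined, but it hides the daily and seasonal swings that the annual total deliberately rolls up.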
Seems to make sense for capacity planning. The considerable daily and seasonal variation in use argues in favor of using the aggregate rather than expressing it as an average.
But electricity consumption was never measured in anything else. The aggregate unit, kWh, is already the one printed on the bill, so the derivation of kWh/year is obvious.