
I never understood why kWh/yr is so commonly used.


The watt-hour itself is already a somewhat silly unit, since the joule already exists and is part of the definition of the watt. But people can easily multiply an appliance's watt rating by the hours they use it to estimate its monthly energy use, which makes it a convenient unit for billing. I would guess kWh/yr is an extension of this idea. You could use J/yr, but most people don't have an intuitive sense of how much power that is.
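The watts-times-hours arithmetic described above, as a quick sketch (the 60 W appliance and 5 h/day figures are made up for illustration):

```python
# Hypothetical example: a 60 W appliance running 5 hours a day
# for a 30-day month.
watts = 60
hours_per_day = 5
days = 30

kwh = watts * hours_per_day * days / 1000  # Wh -> kWh
joules = kwh * 3.6e6                       # 1 kWh = 3.6 MJ exactly

print(f"{kwh} kWh per month ({joules:.2e} J)")  # 9.0 kWh per month (3.24e+07 J)
```

The kWh figure is easy to eyeball from the appliance label; the joule figure, as the comment says, is not.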


I'm just surprised that the U.S. uses metric units like watts and joules, rather than foot-pounds or calories


The USA more or less fully adopted the metric system in the 90s. All food is labeled in metric units, for example. We just never abandoned our customary weights and measures. This leads to some amusing things, like large bottles of soda and wine being measured in metric while cans are measured in ounces. Then again, we're not alone. The UK is considered officially metric, but they still pour beer in pints and weigh people in stone.

There are good arguments to be made for everyday use of Fahrenheit over Celsius. The 0-100 range in F covers pretty much all of the temperatures one is likely to experience in the USA. It also has a finer resolution than degrees Celsius in everyday weather temperature ranges.

The SI units are designed for scientific and engineering use, and they excel at the purpose for which they were designed.


> There are good arguments to be made for everyday use of Fahrenheit over Celsius. The 0-100 range in F cover pretty much all of the temperatures one is likely to experience in the USA.

You're mostly right, but there are plenty of places in the US where the daily low temperature drops well below zero for significant parts of the year.

I guess the inconvenience with Celsius is that you're more likely to go to negative numbers? Is there a problem other than having to say "minus ten degrees" in Celsius vs. "fourteen degrees" for the same temperature in Fahrenheit?
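The "minus ten vs. fourteen" comparison is just the standard conversion formula; a one-liner to check it:

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

print(c_to_f(-10))             # 14.0: the -10 C / 14 F pair above
print(c_to_f(0), c_to_f(100))  # 32.0 212.0: water's freezing and boiling points
```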

> It also has a finer resolution than degrees Celsius in everyday weather temperature ranges.

I'd make the opposite argument about everyday use. Most people I know would barely differentiate 25C from 26C, so there's not really a need for finer gradations. And the 10-degree increments are surprisingly convenient:

   -20 or less   dangerously cold
   -20 to -10    damn cold
   -10 to   0    winter
     0 to  10    chilly
    10 to  20    brisk
    20 to  30    hot
    30 to  40    sweltering
    40 to  50    damn hot
    50 or more   dangerously hot
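The table maps naturally onto a lookup function. A minimal sketch (the table leaves band boundaries ambiguous, so each band is treated as exclusive at its upper edge here):

```python
def describe(temp_c):
    """Return the table's label for a temperature in degrees Celsius."""
    bands = [(-20, "dangerously cold"), (-10, "damn cold"), (0, "winter"),
             (10, "chilly"), (20, "brisk"), (30, "hot"),
             (40, "sweltering"), (50, "damn hot")]
    for upper, label in bands:
        if temp_c < upper:
            return label
    return "dangerously hot"

print(describe(25))   # hot
print(describe(-25))  # dangerously cold
```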


I like that table, and when I've been in Europe it's pretty much what I've used. Oddly, though, while 30 degrees in southern Germany feels unbearably hot, 86 degrees (the same temperature) in southern California is warmer than ideal, but not what I'd call scorching. I wonder to what extent it's physical factors besides temperature, and whether there is a psychological effect from the temperature scales.


Humidity is a major factor. We cool our bodies below ambient temperature through evaporative cooling (sweat), which works much better in dry air.

From experience I'd say a sauna at 45°C and 80% humidity feels about comparable to one at 80°C and 10% humidity.


Most beer pints are not a full pint. And they keep getting smaller!


And height as well. In metres, nearly everyone is 1.x m tall (rarely 2 m), so the leading digit barely changes and the numbers are hard to remember. In feet, the first unit actually varies from person to person. And the foot is, after all, based on a foot, which feels more natural.


The vast majority of people are between 5 feet and 6 feet tall, so I'm not sure what your point is.


Calories are metric as well: 1 kcal is the amount of energy required to heat 1 kilogram of water by 1 degree Celsius.
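The definition above in numbers, using the conventional ≈4184 J per kcal (the 2000 kcal daily diet is an illustrative figure, not from the thread):

```python
KCAL_TO_J = 4184  # energy to heat 1 kg of water by 1 degree C (approximate)

daily_diet_kcal = 2000
daily_j = daily_diet_kcal * KCAL_TO_J
daily_kwh = daily_j / 3.6e6  # tie back to the thread's favorite unit

print(f"{daily_j:.3e} J = {daily_kwh:.2f} kWh")  # 8.368e+06 J = 2.32 kWh
```

So a day's food energy is a couple of kWh, which is a nice sanity check on both units.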


I'd imagine it has something to do with the timeframe that terms describing units of electricity came into common usage.

The U.S. is a signatory of the Metre Convention (1875). All of our customary units are defined as linear conversions of the corresponding SI units. The watt and joule (and the ohm) were not standardized until 1893. Funnily enough, the international conference where these were officially decided was held in Chicago.


Don't jinx us! haha


>I would guess kWh/yr is an extension of this idea. you could use J/yr, but most people don't have an intuitive sense of how much power that is.

Just using kW for your average power usage over a year seems more intuitive to me.


Because a home that uses 11000 kWh/yr doesn't draw a steady ~1.26 kW. During low-usage times (the middle of the night) it draws substantially less, and at peak times substantially more, and even those values shift with the time of year and the weather. kWh/yr makes it explicit that you are averaging energy use over a substantial period of time, whereas kW on its own is ambiguous. MJ/yr would work as well and has a similar order of magnitude, but it has no real advantage in its favor.
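The averaging being discussed, spelled out (11000 kWh/yr is the figure from the comment):

```python
HOURS_PER_YEAR = 365 * 24  # 8760, ignoring leap years

annual_kwh = 11000
avg_kw = annual_kwh / HOURS_PER_YEAR  # average draw only; actual draw varies
annual_mj = annual_kwh * 3.6          # the same energy expressed in MJ/yr

print(f"{avg_kw:.2f} kW average, {annual_mj:.0f} MJ/yr")  # 1.26 kW average, 39600 MJ/yr
```

The 1.26 kW figure is purely an average; it says nothing about the peak draw that matters for sizing wiring or generation.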


Seems to make sense for capacity planning. The considerable daily and seasonal variation in use argues in favor of using the aggregate rather than expressing it as an average.


If you're doing capacity planning, wouldn't you have to plan for the peaks? (Kind of like highways and rush hour.)


For the moon, I'd plan on batteries being used to meet demand peaks (like we are starting to do here).


Weird units like that are usually inertia from pre-SI days.


But electricity consumption was never measured in anything else. Here, it's because the aggregate unit, kWh, already appears on the bill, so the step to kWh/yr is easy to make.



