"On CoreWeave, renting an Nvidia A100 40GB — one popular choice for model training and inferencing — costs $2.39 per hour, which works out to $1,200 per month. On Azure, the same GPU costs $3.40 per hour, or $2,482 per month; on Google Cloud, it’s $3.67 per hour, or $2,682 per month."
Am I missing something? I may be a bit rusty at math, but I can still handle a calculator. At roughly 720 hours in a month, CoreWeave would cost $1,720.80 per month, Azure $2,448 per month, and Google Cloud $2,642.40 per month.
Why are all of the numbers reported in the article off? Some only slightly: Azure and Google Cloud are close, but CoreWeave is off by about 30%. I won't speculate further on how the author arrived at these results, but I do wonder whether this article was written by AI, which would explain the incorrect basic multiplication.
The whole thing is nuts. They have the wrong costs multiplied by the wrong time period to get the wrong answers. https://coreweave.com/gpu-cloud-pricing says an A100 40GB NVLink is $2.06 per hour, whereas the article says $2.39.
That's $1,483.20 a month at 720 hours, whereas the article says $1,200; and even at the article's own $2.39 rate it should say $1,720.80 if they'd got the maths right.
The CoreWeave number is completely wrong, but Azure's figure is exactly correct at 730 hours per month, the number of hours in an average month (365 × 24 / 12 = 730), and Google Cloud's is within a few dollars of that.
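The arithmetic is easy to sanity-check yourself. A minimal sketch, using the article's quoted hourly rates plus CoreWeave's own $2.06 listed rate, at both plausible month lengths (720 h for a 30-day month, 730 h for the 365 × 24 / 12 average):

```python
# Recompute monthly GPU rental cost at two common "hours per month" conventions.
# Rates: the article's figures, plus the $2.06/h listed on CoreWeave's pricing page.
rates = {
    "CoreWeave (article)": 2.39,
    "CoreWeave (site)": 2.06,
    "Azure": 3.40,
    "Google Cloud": 3.67,
}

HOURS_30_DAY = 720   # 30 days * 24 h
HOURS_AVG = 730      # 365 * 24 / 12, the average month

for name, rate in rates.items():
    print(f"{name}: ${rate * HOURS_30_DAY:,.2f} (720 h), "
          f"${rate * HOURS_AVG:,.2f} (730 h)")
```

Running this shows Azure's $2,482 matching exactly at 730 hours, while none of the conventions produce the article's $1,200 CoreWeave figure.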
My favorite quote about journalists goes something like "Never trust a journalist's math. If they could do math, they wouldn't have become journalists."
Wonderful that we have evolved large linear algebra models running on expensive computers to the point that they can no longer do basic arithmetic correctly.