
> an integerless programming language

Technically true-ish, but that deserves an important qualifier. The JavaScript number format has a huge "safe space" of integers between

  Number.MIN_SAFE_INTEGER => -9007199254740991 (-(2^53 - 1))
  Number.MAX_SAFE_INTEGER => 9007199254740991 (2^53 - 1)
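To illustrate what "safe" means here: within that range, integer arithmetic on JS numbers is exact; just beyond it, adjacent integers can no longer be told apart:

```javascript
// Within the safe range, integer values stored as doubles are exact.
console.log(Number.MAX_SAFE_INTEGER);           // 9007199254740991
console.log(Number.isSafeInteger(2 ** 53 - 1)); // true

// One past the safe range, 2^53 and 2^53 + 1 round to the same double:
console.log(9007199254740992 === 9007199254740993); // true
```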
Also, the number format is a standard (IEEE 754 double precision), not something used only by JS, and given that JS was supposed to be a minimal scripting language, it is hard to argue against the initial design choice of picking one all-encompassing standard rather than burdening the language with a complete set of numeric types. Since the criticism was aimed at the initial design:

> ultimately to be replaced by a guy who threw together an integerless programming language

I would refute it by pointing out that the criticism ignores the initial use case, as well as the actual existence of integers within that larger number format standard. And later, when enough people (and companies) demanded it, a big integer type (BigInt) was added after all.
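For reference, the BigInt type added later (ES2020) gives arbitrary-precision integers with its own literal syntax, at the cost of not mixing implicitly with regular numbers:

```javascript
// BigInt literals use an 'n' suffix and never lose precision.
const big = 9007199254740993n;  // not exactly representable as a Number
console.log(big + 1n);          // 9007199254740994n
console.log(typeof big);        // "bigint"

// BigInt and Number don't mix implicitly; conversion must be explicit.
console.log(BigInt(Number.MAX_SAFE_INTEGER) + 2n); // 9007199254740993n
```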

Internally, runtimes use different code paths depending on what kind of number a value actually holds (V8, for instance, represents small integers differently from heap-allocated doubles).

For many uses of integers, especially internal ones like array indexing and counting, those values behave as exact integers anyway, so a separate integer type would have added purity without solving much. For other uses, e.g. finance (counting cents instead of dollars), it does sting that you have to pay close attention to which calculations you perform; not having had a real integer type as an aid (until BigInt) indeed made integer arithmetic less pleasant.


