JS implementations may use integers as an optimisation. The problem is when you need to be sure something actually is an integer.
You can also apply bitwise operations to doubles: the engine takes part of the number, tosses away the rest, and pretends it's a 32-bit integer. But occasionally, and unexpectedly, that truncation can cause a rounding issue, depending on how it is done.
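A minimal sketch of that truncation in plain Node/JS (the specific values are just illustrative):

```javascript
// Bitwise operators first convert a double to a 32-bit integer
// (the spec's ToInt32): the fractional part is dropped, and anything
// outside the 32-bit range wraps around modulo 2^32.
const frac = 1.9 | 0;            // fraction silently discarded -> 1
const wrap = (2 ** 32 + 5) | 0;  // wraps modulo 2^32 -> 5
const sign = (2 ** 31) | 0;      // overflows into the sign bit -> -2147483648

console.log(frac, wrap, sign);
```

The wrap and sign-bit cases are the "unexpected" ones: the expression looks like ordinary integer arithmetic right up until a value crosses 2^31.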
However, looking more into it, this library relies heavily on the Buffer [0] API from Node (which does have proper integers; Buffer is a Uint8Array). It handles 7bit data by treating it as latin1 (ISO 8859-1) and tossing away one bit of every byte, which might not be a bad way of handling it.
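For illustration, a small sketch of how Node's latin1 encoding behaves, as I read the docs (the byte values here are just examples): latin1 maps each byte 1:1 to code points 0–255, so 7bit (ASCII-range) data round-trips untouched, while anything above 0xFF loses its high bits on encode.

```javascript
// latin1 (ISO 8859-1) is a 1:1 byte <-> code point mapping for 0-255,
// so pure 7bit/ASCII data survives a round trip unchanged.
const ascii = Buffer.from('Hello', 'latin1');
console.log(ascii.toString('latin1'));  // 'Hello'

// Code points above 0xFF don't fit in one byte; Node keeps only the
// low 8 bits when encoding as latin1 (U+0141 & 0xFF = 0x41 = 'A').
const lossy = Buffer.from('\u0141', 'latin1');
console.log(lossy[0].toString(16));     // '41'
```

So for genuinely 7bit input the encoding is lossless; the lossy case only kicks in if non-latin1 text sneaks into a path that assumes it.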
I don't know enough about Node's particular implementation to judge whether or not there are problems here. I do know that 7bit encodings can break a whole lot of parsers in unexpected ways.
That's not exactly proof that JS has integers. I mean you can still do this:
> 1 === 1.00000000000000000000000000000000000000000000001
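Right — that literal is parsed as the nearest representable double, which is exactly 1, so the comparison is really `1 === 1`. When you do need to be sure a value is an integer, the standard checks are `Number.isInteger` and `Number.isSafeInteger`:

```javascript
// The long literal rounds to 1.0 at parse time, so this is 1 === 1.
console.log(1 === 1.00000000000000000000000000000000000000000000001); // true

// Number.isInteger only says the double has no fractional part...
console.log(Number.isInteger(2 ** 53 + 1)); // true (it rounded to 2^53)

// ...while Number.isSafeInteger additionally guarantees the value is
// exactly representable, i.e. |n| <= 2^53 - 1.
console.log(Number.isSafeInteger(2 ** 53 - 1)); // true
console.log(Number.isSafeInteger(2 ** 53));     // false
```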
[0] https://nodejs.org/api/buffer.html