09-14-2018 20:25 - edited 09-15-2018 10:41
I'm using a TypedArray to append binary data to a file on the device. The data comes from the accelerometer, and I want to store each reading with only 3 decimal places, which I do like this:
accel.x ? accel.x.toFixed(3) : 0;
I tried both Float32Array and Float64Array for my TypedArray. When I read back the bytes I just wrote (to make sure everything works properly, and it does), I see that with Float64Array I get the 3-digit precision I asked for in the code above, but with Float32Array, for some reason, I don't get 3 decimal digits but rather 14.
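The same behaviour shows up in isolation, without any file I/O involved (a minimal sketch, plain TypedArray semantics, nothing Fitbit-specific):

const f32 = new Float32Array(1);
const f64 = new Float64Array(1);
const v = Number((1.234).toFixed(3)); // same rounding my accel handler does

f64[0] = v;
f32[0] = v;
console.log(f64[0]); // 1.234 – round-trips cleanly as a double
console.log(f32[0]); // 1.2339999675750732 – nearest 32-bit float, widened back to a double when printed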
It's not a big deal, because my code still works, but I'm using a TypedArray to optimize storage and I'm wondering if there might be a bug. I shouldn't be getting 14 digits of precision; it should be about 7, like any C-style float. I'm afraid my program might end up using more memory than it should.
A little update after further investigation: by monitoring memory usage through the MemoryUsage interface and its "used" property, I found that Float32Array still uses less memory than Float64Array. At this point, my suspicion that there is a bug either in JerryScript or in JavaScript becomes more realistic: there is no way to get 14 digits of precision out of a 32-bit float.
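For reference, the check looks roughly like this (a sketch; I'm assuming the Device API's memory object, which is where the MemoryUsage values come from):

import { memory } from "system";
// memory.js is a MemoryUsage: peak, total, used
console.log("JS heap used: " + memory.js.used + " of " + memory.js.total + " bytes");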

09-17-2018 12:55
Are you able to provide a small sample project? I can get some eyes on it.
Thanks

09-17-2018 14:04
Here are two samples of the same code; I only changed the part related to the TypedArrays:
- 32-bit:
- 64-bit:
If I console.log those bytes, I get 16 decimal digits in the first case (a 32-bit float, which is supposed to have a maximum of 7 decimal digits, ignoring my .toFixed(3) call) and 3 decimal digits in the second case (a 64-bit float, which can have up to 14 decimal digits, which I cut down to 3).
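In short, the only difference between the two is the constructor (a sketch; appendToFile here is a hypothetical stand-in for my actual file-writing code):

// 32-bit variant: 4 bytes per reading
const sample32 = new Float32Array([
  accel.x ? accel.x.toFixed(3) : 0, // toFixed returns a string; the
  accel.y ? accel.y.toFixed(3) : 0, // constructor coerces it back to a
  accel.z ? accel.z.toFixed(3) : 0, // number and rounds it to 32 bits
]);
appendToFile(sample32.buffer);

// 64-bit variant: same code, 8 bytes per reading
const sample64 = new Float64Array([
  accel.x ? accel.x.toFixed(3) : 0,
  accel.y ? accel.y.toFixed(3) : 0,
  accel.z ? accel.z.toFixed(3) : 0,
]);
appendToFile(sample64.buffer);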

09-17-2018 14:17
Is there a chance that the problem is with toFixed()? It's been reported as buggy elsewhere.
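A quick way to rule it out would be to log the string it returns before it goes anywhere near the array (just a sketch):

const x = 0.123456789;
const s = x.toFixed(3); // toFixed returns a string
console.log(s);         // if toFixed were broken, this would already be wrong
console.log(Number(s)); // should print 0.123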
Gondwana Software

09-17-2018 14:19
Well, if it were, I don't think it makes sense that it would act weird only when using Float32Array. But who knows, it could be.

09-17-2018 14:30
Moreover, if I console.log my data before putting it into the Float32Array, it correctly shows 3 decimal digits.
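So the extra digits only appear at the 32-bit store itself. Math.fround applies the same round-to-32-bits step, so it shows the same jump (assuming the runtime supports it; it's ES6):

const v = Number((1.234).toFixed(3));
console.log(v);              // 1.234 – still a clean double at this point
console.log(Math.fround(v)); // 1.2339999675750732 – after rounding to 32 bits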

