0.1 + 0.2 = 0.30000000000000004 is a well-known floating point problem in JavaScript. Beyond it there is also the large-number problem: precision is lost when handling integers outside the safe range, and it comes up just as often. This article sorts out the causes behind it and the solutions.

About the author: May Jun, Node.js developer, MOOC certified author, a post-90s youth who loves technology and loves to share. Welcome to follow the Node.js technology stack and the GitHub open source project www.nodejs.red

JavaScript maximum safe integer

Before starting this section, you should have some understanding of JavaScript floating point numbers. The previous article, "The JavaScript Floating Point Puzzle: Why isn't 0.1 + 0.2 equal to 0.3?", is a good introduction to how floating point numbers are stored and why precision is lost (recommended reading in advance).

In the IEEE 754 double-precision (64-bit) format, the mantissa is the part used to store the significant digits of a value. The mantissa is 52 bits; together with the implicit leading 1 that is omitted from storage, it can represent integers up to 2^53 exactly.

Math.pow(2, 53) // 9007199254740992

Number.MAX_SAFE_INTEGER // Maximum safe integer 9007199254740991
Number.MIN_SAFE_INTEGER // Minimum safe integer -9007199254740991

Integer arithmetic is safe as long as the values stay within JavaScript's maximum and minimum safe integer range.
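
As a quick sanity check, Number.isSafeInteger() tells you whether a value can be represented exactly (a small illustrative sketch, not from the original article):

Number.isSafeInteger(9007199254740991) // true
Number.isSafeInteger(9007199254740992) // false
9007199254740992 === 9007199254740993  // true — beyond the safe range, distinct integers collapse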

Reproducing the precision loss when handling large numbers

Example 1

What happens when you execute the following code in the Chrome console or in the Node.js runtime? Why is 200000436035958034 turned into 200000436035958050? After learning how JavaScript floating point numbers are stored, you should realize that the value has exceeded JavaScript's maximum safe integer range.

const num = 200000436035958034;
console.log(num); // 200000436035958050

Example 2

The following example reads the data passed through the request stream and accumulates it into the string data. Because the payload is application/json, we need to deserialize data into an object for business processing.

const http = require('http');

http.createServer((req, res) => {
    if (req.method === 'POST') {
        let data = '';
        req.on('data', chunk => {
            data += chunk;
        });

        req.on('end', () => {
            console.log('Before JSON deserialization:', data);

            try {
                // Deserialize to an object for business processing
                const obj = JSON.parse(data);
                console.log('After JSON deserialization:', obj);

                res.setHeader("Content-Type", "application/json");
                res.end(data);
            } catch (e) {
                console.error(e);

                res.statusCode = 400;
                res.end("Invalid JSON");
            }
        });
    } else {
        res.end('OK');
    }
}).listen(3000);

After starting the program, call it with POSTMAN, passing the large number 200000436035958034 in the request body.
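
If you prefer the command line, an equivalent request with curl might look like this (a sketch; the id field name simply matches the output below):

curl -X POST http://localhost:3000 \
     -H 'Content-Type: application/json' \
     -d '{"id": 200000436035958034}'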

Comparing the log output before and after JSON.parse(), the value is correct before parsing but wrong afterwards. What is going on between JSON conversion and large-number precision?

Before JSON deserialization: {"id": 200000436035958034}
After JSON deserialization: { id: 200000436035958050 }

This problem was actually encountered in practice: calling a third-party interface returned a large numeric parameter, and JSON parsing produced exactly this kind of issue. The analysis follows.

What goes wrong when JSON parses large numbers?

JSON, as defined in IETF RFC 7159, is a lightweight, text-based, language-independent data interchange format. It is derived from the ECMAScript programming language standard.

www.rfc-editor.org/rfc/rfc7159… Visit this address to read the specification.

What we need to focus on in this section is what a JSON value can be. According to the specification, a value must be an object, array, number, or string, or one of the literals false, null, and true.

When JSON is parsed, each of these values is converted to the corresponding JavaScript type. The large value in our example is decoded as a number, i.e. an IEEE 754 double, by default, and that is the real cause of the precision loss.
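
The whole problem can be reduced to a one-liner (illustrative only; the id field matches the earlier example):

JSON.parse('{"id": 200000436035958034}').id // 200000436035958050 — the double cannot represent the original value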

Solutions for handling large numbers

1. The common approach: convert to a string

This is a common scenario in front-end/back-end interaction. For example, an order number is stored in a numeric type on the back end; a Java long is 64 bits wide, far beyond JavaScript's Number.MAX_SAFE_INTEGER (Math.pow(2, 53) - 1). If the order number exceeds that safe value, the back end should return it as a string instead. This was a pit we stepped into while integrating with a supplier.
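
A minimal sketch of the idea on the Node.js side (the id value is simply the one from the earlier example):

// Serialize the large order number as a string instead of a number
const body = JSON.stringify({ id: '200000436035958034' });
console.log(body);             // {"id":"200000436035958034"}
console.log(JSON.parse(body)); // { id: '200000436035958034' } — no precision loss on the client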

2. The new hope: BigInt

BigInt is a new primitive data type in JavaScript that can represent integers beyond Number's maximum safe range.

Creating a BigInt with a literal

One way is to append n to the end of an integer literal.

200000436035958034n; // 200000436035958034n

Creating a BigInt with the constructor

The other way is to use the BigInt() constructor. Note that it is best to pass a string; otherwise the precision is already lost before the BigInt is created (see the proposal at github.com/tc39/propos…), which is exactly the kind of intractable problem we are trying to avoid.

BigInt('200000436035958034') // 200000436035958034n

// Be careful to pass a string, otherwise the value is already rounded
BigInt(200000436035958034) // 200000436035958048n — not the correct result

Type detection

BigInt is a new data type, so it is not strictly equal to a Number; for example, 1n === 1 is false (though loose equality, 1n == 1, is true).

typeof 200000436035958034n // bigint

1n === 1 // false

Operations

BigInt supports the common arithmetic operators, but it must never be mixed with Number in the same expression; keep the operand types consistent.

// Correct
200000436035958034n + 1n // 200000436035958035n

// Wrong
200000436035958034n + 1
                        ^

TypeError: Cannot mix BigInt and other types, use explicit conversions

Converting a BigInt to a string

String(200000436035958034n) // 200000436035958034

// Or the following
(200000436035958034n).toString() // 200000436035958034

Conflict with JSON

If a payload contains a BigInt literal (the n suffix), calling JSON.parse() on it throws an error.

JSON.parse('{"id": 200000436035958034n}');

Running the above code produces SyntaxError: Unexpected token n in JSON at position 25.

github.com/tc39/proposal-bi… As of now, this proposal has not added BigInt support to JSON, because it would change the JSON format and likely make payloads unparsable by existing parsers.
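
Until that changes, a common workaround when you need to produce JSON containing BigInt values is to convert them to strings yourself, for example with a JSON.stringify replacer (a sketch, not part of the original article):

const payload = { id: 200000436035958034n };

// Plain JSON.stringify(payload) throws a TypeError because BigInt cannot be serialized directly;
// the replacer converts each BigInt to a string before serialization
const json = JSON.stringify(payload, (key, value) =>
    typeof value === 'bigint' ? value.toString() : value
);
console.log(json); // {"id":"200000436035958034"}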

BigInt support

BigInt has reached Stage 4 and is supported in Chrome, Firefox, Babel, and Node.js 12+.

BigInt summary

We can use BigInt for calculations without any problem, but when we interact with a third-party interface and have to serialize or parse JSON, we still run into the large-number problem because of the conflict with JSON described above. That brings us to a third solution.

3. Third-party libraries

Third-party libraries can also solve this. You may be wondering why it has to be so convoluted; converting to a string would be simpler, but sometimes you have to integrate with a third-party interface whose data contains such large numbers and whose owner refuses to change anything, and the business still has to be delivered. Hence the third approach.

Let's take Example 2, which reproduced the precision loss, and solve it with the json-bigint library.

Instead of JSON.parse(), use the json-bigint library to parse the accumulated string. Values beyond 2^53 are automatically converted to BigInt; setting storeAsString: true makes the library store them as strings instead.

const http = require('http');
const JSONbig = require('json-bigint')({ 'storeAsString': true });

http.createServer((req, res) => {
    if (req.method === 'POST') {
        let data = '';
        req.on('data', chunk => {
            data += chunk;
        });

        req.on('end', () => {
            try {
                // Use the third-party library for JSON deserialization
                const obj = JSONbig.parse(data);
                console.log('JSON deserialization:', obj);

                res.setHeader("Content-Type", "application/json");
                res.end(data);
            } catch (e) {
                console.error(e);

                res.statusCode = 400;
                res.end("Invalid JSON");
            }
        });
    } else {
        res.end('OK');
    }
}).listen(3000);

Test it again and you will see the following result. This time the value is correct and the problem is solved.

JSON deserialization: { id: '200000436035958034' }

Conclusion

This article explained why large numbers lose precision and offered several solutions, for reference if you run into a similar problem. In system design it is best to stay within what double-precision floating point can represent in the first place. While investigating, I saw some solutions that use regular expressions to rewrite large numbers in the raw JSON text. I do not recommend this: regex matching is itself a time-consuming operation, it is easy to get the matching rule wrong, and a careless rule may end up converting every numeric value in the response to a string, which is not acceptable either.

Reference

v8.dev/features/bi…
github.com/tc39/propos…
en.wikipedia.org/wiki/Double…