Consider the following code:
var min = Math.min();
var max = Math.max();
console.log(min < max);
Intuitively, this code should print true; after all, the minimum should be less than the maximum. But when we run it, it surprisingly prints false.
Why is that?
Let's check the MDN documentation.
The Math.min() function returns the smallest of zero or more numbers.
Math.min accepts zero or more arguments. With multiple arguments the behavior is easy to understand: it returns the smallest of them. But what if it is called with no arguments at all? The documentation reads:
If no arguments are given, the result is Infinity.
If at least one of the arguments cannot be converted to a number, the result is NaN.
So with no arguments, Math.min() returns Infinity. What is Infinity? Infinity is a property of the global object in JavaScript (or of the window object in a browser environment), representing positive infinity.
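We can verify both documented behaviors directly in a Node.js or browser console:

```javascript
// With no arguments, Math.min() returns Infinity:
console.log(Math.min());       // Infinity

// With an argument that cannot be converted to a number, it returns NaN:
console.log(Math.min(1, "a")); // NaN

// Infinity itself is a property of the global object:
console.log(globalThis.Infinity === Infinity); // true
```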
Meanwhile, Math.max() returns -Infinity when no arguments are passed.
So Math.min() returns Infinity, which is larger than the -Infinity returned by Math.max(). These defaults are not arbitrary: Infinity is the identity element for min (any number is smaller than it), and -Infinity is the identity for max (any number is larger), which is exactly what you want when reducing a list of numbers one argument at a time.
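Putting it together, the original comparison and the normal, intended usage (passing actual arguments, for example by spreading an array) look like this:

```javascript
// The surprising case: Infinity > -Infinity
console.log(Math.min() > Math.max()); // true

// The intended usage: pass the numbers as arguments.
// Spreading an array works well for this.
const nums = [3, 1, 4];
console.log(Math.min(...nums)); // 1
console.log(Math.max(...nums)); // 4
```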