I had a script where I was multiplying a few numbers, 25 x 2048 x 1.5, and putting the result into a shell script, and the shell script was erroring out.
After some display dialog debugging, it turns out my script was turning 76800 into 7.68E+4. What the hell? That’s not even a big number? I’m already dealing with sizes in megabytes because there’s no chance this language can handle numbers big enough to do disk size math in bytes.
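Here’s a minimal repro of what I was seeing (the variable name and the echo command are just mine, for illustration):

set sizeInMB to 25 * 2048 * 1.5 -- 76800.0, a real, because 1.5 is a real
display dialog (sizeInMB as text) -- shows "7.68E+4"
do shell script "echo " & sizeInMB -- the shell gets 7.68E+4 instead of 76800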
But so that I don’t have to walk on eggshells whenever I’m doing math, what is the rule?
The rule is explained in the AppleScript Language Guide (here) as follows:
Real numbers that are greater than or equal to 10,000.0 or less than or equal to 0.0001 are converted to exponential notation when scripts are compiled. The largest value that can be evaluated (positive or negative) is 1.797693e+308.
As mentioned, AppleScript converts integers larger than ±536870911 to reals, and displays reals >= 10,000.0 or <= 0.0001 in exponential notation. In addition, a result will be real if any operand in the expression is real or if the / operator is used.
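Concretely, here are a few one-liners you can paste into Script Editor to see the rule in action (my results are in the comments):

25 * 2048 -- 51200, still an integer
25 * 2048 * 1.5 -- 76800.0, real because one operand is real
51200 / 2 -- 25600.0, real because / always returns a real
51200 div 2 -- 25600, div keeps it an integer
536870911 + 1 -- 5.36870912E+8, past the integer limit, so it becomes a real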
If the result is real but is otherwise an integer value (within the integer range), you can just coerce it back to integer. There is also a quirk when coercing a number to text (handy for shell scripts), which can handle up to 16 digits, for example:
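For a number like mine that fits in the integer range, the coerce-back-to-integer part is all you need, roughly (variable name is mine):

set theSize to 25 * 2048 * 1.5 -- 76800.0
theSize as text -- "7.68E+4", which is what was breaking my shell command
set theSize to theSize as integer -- 76800, back inside the ±536870911 range
do shell script "echo " & theSize -- the shell now gets plain old 76800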