I have a script that asks you to type in a number. If you type in a big number, it puts it in scientific notation or something. It looks like this:
1.8975126541345E+13
In my script I have to get how many digits are in the number and then go through a loop that adds the digits together, like this:
-- theNumber equals 1.8975126541345E+13
set theLength to length of (theNumber as string)
set tempNumber to 0
repeat with i from 1 to theLength
    set tempNumber to (character i of (theNumber as string) as number) + tempNumber
end repeat
This doesn’t work because it has that decimal point and the E+13 thing going on. How do you get AppleScript to stop doing this so my code will work?
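As far as I can tell, the loop dies on the first non-digit character. This quick check (just my own test, using the same value as theNumber above) shows where:

set theNumber to 1.8975126541345E+13
set theText to theNumber as string --> "1.8975126541345E+13"
(character 2 of theText) as number --> errors here, because "." can't be coerced to a number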
According to the documentation, the largest value that can be expressed as an integer is 2^29 - 3, although on my computer it is 2^29 - 1, as you can see in the following script:
set n to (2 ^ 29 - 3) as integer
repeat
    try
        set n to (n + 1) as integer
    on error
        beep 2
        exit repeat
    end try
end repeat
{n as integer, n + 1}
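That limit is also, I believe, why the number in your post displays the way it does: anything beyond the integer range gets stored as a real, and large reals are shown in scientific notation. A quick check along those lines, using the value from your post:

set bigNum to 18975126541345 -- too large for an AppleScript integer, so it's stored as a real
class of bigNum --> real
bigNum as string --> "1.8975126541345E+13"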
So, if the integer is greater than 2^29 - 1 and you want to display it as an integer, then you need to convert it to text. You can use text manipulation or math to do the conversion. I don’t know which is faster, but check this out:
set t to "18975126541345"
display dialog "Enter an integer:" default answer t
set n to (text returned of result) as number
return {n div 10, n mod 10}
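The text-manipulation route would be something along these lines (just a sketch; it walks the characters of the typed text instead of doing arithmetic, and it assumes the user typed nothing but digits):

set t to "18975126541345"
display dialog "Enter an integer:" default answer t
set typedText to text returned of result
set int_text to {}
repeat with i from 1 to length of typedText
    -- each character is a one-digit string, so coercing it to an integer is safe here
    set end of int_text to (character i of typedText) as integer
end repeat
return int_text as string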
Here’s what I got using the div and mod operators:
set t to "18975126541345"
display dialog "Enter an integer:" default answer t
set n to (text returned of result) as number
set temp_n to n
set int_text to {}
repeat until temp_n = 0
    set beginning of int_text to (temp_n mod 10) as integer
    set temp_n to (temp_n div 10)
end repeat
return int_text as string
You’ll still need to error-check the text returned from the dialog (make sure it’s all digits before coercing it).
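And to get back to what your original script was after, the sum of the digits, the same div/mod loop can accumulate the total directly. A minimal sketch, assuming the typed value is small enough that a real still represents it exactly:

set t to "18975126541345"
display dialog "Enter an integer:" default answer t
set n to (text returned of result) as number
set digitSum to 0
repeat until n = 0
    set digitSum to digitSum + (n mod 10) -- peel off the last digit and add it to the total
    set n to n div 10 -- drop that digit
end repeat
return digitSum as integer

For the example value 18975126541345 that comes out to 61.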