I’m super confused by this one, especially because I believe it used to work until recently.
use AppleScript version "2.4" -- Yosemite (10.10) or later
use scripting additions
set errorMessage to ""
repeat 1 times
	try
	on error errorMessage
	end try
end repeat
if errorMessage is not "" then error errorMessage
Result: Error: The variable errorMessage is not defined.
Script Debugger shows the variable “errorMessage” as defined, with its value being the empty string “”, in the sidebar when it errors on the last line, "if errorMessage is not “” then error errorMessage".
It’s definitely the “on error errorMessage” that’s causing the script to think “errorMessage” is undefined. If I change that variable to “someOtherVariableName” it doesn’t error anymore.
Can anybody clarify what’s going on here? I find it very counterintuitive. The “on error” handler isn’t even being called; why is it making my previously defined variable become undefined?
It looks similar to what happens when a command returns no value. The errorMessage variable is being redeclared as the error message for the try statement’s error handler, even though the handler is never called.
use AppleScript version "2.4" -- Yosemite (10.10) or later
use scripting additions
set errorMessage to ""
repeat 1 times
	try
	on error e number n
		set errorMessage to e
	end try
end repeat
if errorMessage is not "" then error errorMessage
The original script has stuff happening in the “try”: it tries an API call 5 times in a loop (sometimes the server is too busy and the calls fail) and exits the repeat loop on success. Then some other stuff happens, and then it needs to know whether or not it errored earlier. The construction above, where you just use a different variable name for the “on error (VARIABLE)”, works fine.
I just found it surprising that the original construction doesn’t work. I’d have thought the original definition of “errorMessage” to be an empty string would be maintained unless there was an error inside the “try,” and then if there was, the value would be updated with the error message. I’m surprised the value of that variable is being touched at all when there is no error, and I’m surprised it’s turning into an undefined variable.
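For reference, the working version of my real loop looks roughly like this (sketched from memory; fetchFromAPI is a hypothetical placeholder for the actual API call):

set errorMessage to ""
repeat 5 times
	try
		set theResult to fetchFromAPI() -- hypothetical stand-in for the real call
		set errorMessage to "" -- clear any failure from an earlier attempt
		exit repeat -- success, stop retrying
	on error e -- a different variable name, so errorMessage survives
		set errorMessage to e
	end try
end repeat
-- ...other stuff happens here...
if errorMessage is not "" then error errorMessage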
I previously replied with an example that showed how to avoid this error. That response did not address the underlying oddity.
The OP’s original code, with errorMessage declared a local variable…
use AppleScript version "2.4" -- Yosemite (10.10) or later
use scripting additions
local errorMessage
set errorMessage to ""
repeat 1 times
	try
	on error errorMessage
	end try
end repeat
if errorMessage is not "" then error errorMessage
…works as intended.
Why does this resolve the issue? The try statement is obviously creating a conflict in which it does not see errorMessage as defined, even though it clearly was defined and its value is correct in Script Debugger. It appears that the try is defining a new local variable.
Because the variable was not defined within that “context”. This is where explicitly declaring variables as local, global, or property comes in. For example:
property errorMessage : ""
Try putting your code inside a handler. Then define errorMessage within the handler and see what happens:
use AppleScript version "2.4" -- Yosemite (10.10) or later
use scripting additions
on testFunction()
	set errorMessage to ""
	repeat 1 times
		try
		on error errorMessage
		end try
	end repeat
	if errorMessage is not "" then error errorMessage
end testFunction
my testFunction()
Thanks for the reply, but what context are you suggesting errorMessage should be defined in?
--script context
set errorMessage to ""
repeat 1 times
	--repeat loop context
	set errorMessage to ""
	try
		--try block context
		set errorMessage to ""
	on error errorMessage
	end try
end repeat
if errorMessage is not "" then error errorMessage
-->Error: The variable errorMessage is not defined.
The errorMessage variable is defined; it just seems to be forced to become an explicitly local variable by its usurpation in the error handler.
set errorMessage to "no error"
repeat 1 times
	try
		1 / 0
	on error errorMessage
	end try
end repeat
return {my errorMessage, errorMessage}
-->{"no error", "Can’t divide 1.0 by zero."}
Note: The reserved word my in the statement if my HowManyTimes > 10 in this example is required to indicate that HowManyTimes is a property of the script object. Without the word my, AppleScript assumes that HowManyTimes is an undefined local variable.
I appreciate the explanations. I just can’t make it make sense given the explanations.
Can anyone explain the following?
set l to {}
{l, my l}
--> {{}, {}}
while in this script
set l to {}
{l, my l}
-->no result
try
on error l
end try
The existence of the try block and its use of the variable ‘l’ forces l to become a local variable even before the try block is encountered. Clearly a compile-time coercion. I don’t have an issue with this, but it is not explained by order of operations or by references to script objects.
more spice…
set l to {"a"}
l -->{"a"}
my l -->{"a"}
(l & my l) as text -->"aa"
{l, my l} as text -->Error: Can’t make {{"a"}, l of «script»} into type text.
{l, get my l} as text -->no result
try
on error l
end try