Launching Concurrent Processes from a Script

Hi, I’m having a devil of a time trying to get processes launched from my script in non-blocking mode. Here’s what I’m trying to do:

– Loop through a list of files (received when a folder action is triggered).

– Periodically check the size of each file to determine when it is complete (done copying); see the sketch after this list.

– When a file is determined to be complete, upload it to an FTP site using curl in a shell process. This needs to happen while the script continues independently checking the other files.

– The files need to be uploaded concurrently as soon as they are ready, not in sequence (in other words, in separate processes).
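
For reference, the completion check I have in mind is roughly this (a simplified sketch; the handler name and the 2-second poll are just illustrative):

on fileIsComplete(theFile)
	-- a stable size across two polls suggests the copy has finished
	set sizeA to size of (info for theFile)
	delay 2
	set sizeB to size of (info for theFile)
	return (sizeA = sizeB)
end fileIsComplete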

When I use “do shell script” like so:

set myCommand to "cd " & quoted form of uploadDirectory & "; curl --upload-file "
set myCommand to myCommand & quoted form of uploadFileName
set myCommand to myCommand & " " & remoteURL
set myCommand to myCommand & " --user " & userName & ":" & userPasswd
set myCommand to myCommand & " &" -- hoping the trailing ampersand backgrounds curl
do shell script myCommand

the problem is my script blocks until each upload is complete. I was hoping that putting “&” at the end would launch it in non-blocking mode. That works when the command is entered manually in Terminal, but not with “do shell script”. Seems like a bug to me, but in any case, it doesn’t do what I want.

The next thing I tried was telling Terminal to “do script” with the same parameters. Initially everything works as I want: separate curl processes launch without blocking and I get multiple, simultaneous uploads. Nifty. However, it activates Terminal, opens a window for each uploaded file, and doesn’t clean up after itself when done (Terminal and all the windows remain open). There may be dozens, hundreds, or thousands of files to upload. I also sometimes need to use Terminal manually on this machine while this is happening, so killing Terminal outright when done is not an option, and ideally the script would avoid any interaction that might conflict with a human user. That’s why I’m hoping for a solution that does not involve opening applications and windows.
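
In case it’s relevant, the Terminal version was essentially just:

tell application "Terminal"
	do script myCommand -- non-blocking, but opens a new window every time
end tell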

Any ideas how I could do this? If I run an external AppleScript from another script, will it run in a separate thread? Would it be better to have the folder action script pass the file names off to a Python script to handle them? I’m using Leopard on an eight-core Intel Mac, though I’d like the script to work on a quad-core G5 running Tiger as well. Thanks,

It sounds like you need to redirect the stdout and stderr of the process that you are trying to run in the background.

When do shell script starts a shell to interpret the shell command/“script”, it opens up pipes to collect the stdout and stderr outputs from the shell. The stdout pipe supplies do shell script with its string return value and the stderr pipe supplies do shell script with part of its error text if the shell signals an error. Both of these are important for the specified operation of do shell script (well at least stdout is).
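
You can see both pipes at work with a trivial pair of examples (the results shown are what I’d expect):

do shell script "echo hello" --> "hello" (collected through the stdout pipe)
do shell script "echo oops 1>&2; exit 1"
--> error "oops" number 1 (the stderr text and exit status become the error)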

When the shell spawns children (starts subprocesses), they inherit the shell’s stdout and stderr (unless the shell is instructed to make other arrangements for stdout and/or stderr; see below). This is true even if the shell launches processes “in the background”.

Your problem comes from the fact that do shell script always waits for all processes to close the stdout and stderr pipes that it initially created for the shell (even after the shell itself has exited). In your case, the background curl process still has those pipes open. So do shell script is dutifully waiting until curl closes them (indicating that no more output can be collected for the return value or error message), even though curl was started in the background (with respect to the shell).
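
You can demonstrate this without curl at all. Assuming /bin/sh is bash on your systems (it is on Tiger and Leopard), the first line blocks for about five seconds while the second returns immediately:

do shell script "sleep 5 &" -- blocks: the backgrounded sleep still holds the inherited pipes
do shell script "sleep 5 &> /dev/null &" -- returns at once: the pipes are redirected first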

If you do not care to save the output from curl, make your command look something like “… &> /dev/null &”. The “&> /dev/null” tells the shell to send all of the process’s output (both stdout and stderr) to /dev/null, which discards it. The last ampersand is the normal “run it in the background” shell instruction. If you want to keep the output of your command, replace /dev/null with the (POSIX) path to a suitable log file.
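
Applied to your command (untested, keeping your variable names), that would be something like:

set myCommand to "cd " & quoted form of uploadDirectory & "; curl --upload-file "
set myCommand to myCommand & quoted form of uploadFileName
set myCommand to myCommand & " " & remoteURL
set myCommand to myCommand & " --user " & userName & ":" & userPasswd
-- discard curl's output, then background it; do shell script now returns immediately
set myCommand to myCommand & " &> /dev/null &"
do shell script myCommand

If you ever need to run it under a shell other than bash, “> /dev/null 2>&1 &” is the portable spelling of the same thing.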

Wow, thanks for that great explanation, Chris, that does make sense. And it works perfectly! :cool: