Another problem with Folder Actions… multiple instances?


I made a Folder Actions script that, when files are dropped in, checks whether they are pictures, gets some info, creates a folder named with the date and time (something like 20040617143458), puts the file in it, creates a text file with the info, sends the folder to a remote share, and deletes the local folder. When I test it manually, dropping a bunch of files in the folder, the script works perfectly and processes each file, one after the other, in order.

The goal of this script was to process files sent automatically by a PC application. The problem is that the software sends more than one file per second, and it seems that the script starts over for each file, as if it were skipping some files. I need to process the files in the order they arrive, but it isn't working 'first in, first out'; it behaves more like 'last in, first out'. And after some time (90 minutes later), some folders are still there (half processed?), and if I delete them I start getting errors. Since I try to handle all errors, I generally put the name of the folder in the error message, which gives me a good idea of when the folder was created, and a lot of them are more than an hour old… strange.

And since I'm using the date and time for the folder's name, sometimes the folder already exists and I have to wait.
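One way around the name collision is to append a counter when the timestamped name is already taken. A minimal sketch, assuming the destination is a folder called "Outbox" on the startup disk (that path is just an example):

```applescript
-- Sketch: make the timestamped folder name unique by appending a counter.
-- "Outbox" is an example destination; substitute your real folder.
on uniqueFolderName(baseName)
	tell application "Finder"
		set candidate to baseName
		set suffix to 1
		repeat while (exists folder candidate of folder "Outbox" of startup disk)
			set candidate to baseName & "-" & suffix
			set suffix to suffix + 1
		end repeat
	end tell
	return candidate -- e.g. "20040617143458" or "20040617143458-1"
end uniqueFolderName
```

That way the script never has to sit and wait for the clock to tick over to the next second.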


1- Is there a way to make the Folder Actions script check every 5 minutes or so for new items? Or should I forget the Folder Action and just schedule the script to check the folder every 5 minutes?
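One alternative to a Folder Action is to save the script as a stay-open application with an idle handler; the value the handler returns is the number of seconds until it runs again. A rough sketch, with the watched folder path as an assumption:

```applescript
-- Sketch: save this as a stay-open application.
-- The "idle" handler fires repeatedly while the applet is running.
on idle
	tell application "Finder"
		-- "Incoming" is an example path; use your real watched folder
		set pendingFiles to every file of folder "Incoming" of startup disk
		repeat with aFile in pendingFiles
			-- process aFile here, one at a time
		end repeat
	end tell
	return 300 -- run again in 5 minutes (300 seconds)
end idle
```

Because a single applet does the polling, you only ever have one copy of the script running, which sidesteps the multiple-instance problem entirely.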

2- Is there a way to remember a value between runs? For example, the first time the script runs the value is 001, the second time it's 002, the third time it's 003, and so on.
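AppleScript properties keep their value between runs of a saved script (though they reset whenever the script is recompiled), so a counter can be kept like this sketch:

```applescript
-- Sketch: a property persists between runs of the saved script.
property runCount : 0

on run
	set runCount to runCount + 1
	-- pad to three digits, e.g. 1 becomes "001", 12 becomes "012"
	set paddedCount to text -3 thru -1 of ("00" & runCount)
	return paddedCount
end run
```

Appending that counter to the timestamped folder name would also solve the "folder already exists" problem from above.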

3- Is there a way to get the milliseconds of the current date?
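AppleScript's `current date` only resolves to whole seconds. A common workaround is to shell out to Perl, whose Time::HiRes module (included with Mac OS X's Perl) reports fractional seconds:

```applescript
-- Sketch: current time in milliseconds since the epoch, via the shell.
-- Needed because AppleScript's "current date" has one-second resolution.
set msSinceEpoch to do shell script ¬
	"perl -MTime::HiRes -e 'printf \"%.0f\", Time::HiRes::time() * 1000'"
```

The result comes back as a string of digits you can append to the folder name.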

For now, the PC software is configured to send the files to another folder, and when it's done, a person drags all the files from that folder and drops them into the folder with the Folder Action attached. That works fine, but I need it to be automatic.

Any suggestions will be appreciated.



I ran into much the same problem when I was making some scripts a while back. It seems that if a script attached to a folder is running and another file is added to that folder, the first script stops and starts over for the most recent file. Something like that. I just know things get really messy when the script doesn't finish before another file is added to the folder.

What I ended up doing was creating a sort of buffer for the files. There is a folder that holds all the incoming files. The script attached to that folder checks the "process" folder to see if any files are being processed by the script I want to run on all the files. If there are no files in the "process" folder, the script moves one file from the "buffer" folder to the "process" folder, which sets off the real script I want to run. At the end of all the processing, I just deleted the file that was processed, since I didn't need it anymore; if you still need the file, have the script move it to a desired location instead.

Then you can have a Folder Action script that runs when a file is removed from the "process" folder. This script does essentially the same thing as the one attached to the "buffer" folder: it checks the "process" folder (which should now be empty) and moves another file into it. This effectively runs the script on all the files, one at a time.
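The two Folder Actions described above might look something like this sketch (the "buffer" and "process" folder paths are examples, and error handling is omitted):

```applescript
-- Attach to the "buffer" folder: fires when new files arrive.
on adding folder items to bufferFolder after receiving addedItems
	tell application "Finder"
		set processFolder to folder "process" of startup disk -- example path
		if (count of files of processFolder) is 0 then
			-- the process folder is idle, so feed it one waiting file
			move (first file of bufferFolder) to processFolder
		end if
	end tell
end adding folder items to

-- Attach to the "process" folder: fires when the real script
-- finishes with a file and deletes (or moves) it.
on removing folder items from processFolder after losing removedItems
	tell application "Finder"
		set bufferFolder to folder "buffer" of startup disk -- example path
		if (count of files of processFolder) is 0 and ¬
			(count of files of bufferFolder) > 0 then
			move (first file of bufferFolder) to processFolder
		end if
	end tell
end removing folder items from
```

The key point is that the "process" folder never holds more than one file at a time, so the real processing script can never be triggered for a second file while it is still busy with the first.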

Sorry if that was all a little confusing, but once I got it working right, it was pretty slick. I hope this helps, or at least gives you some ideas. Ask me anything about what I did if you don't understand.

Well, it kind of confirms what I suspected and gives me some ideas.

But is there a way to know whether the "process" folder script is running, instead of checking whether there are files in the "process" folder?



Well, the way my setup finds out whether the "process" folder script is running is by checking whether there are files in the folder. The very last command the script executes is to delete the file that started it; once that file is gone, the script is done executing. If you don't want to delete the file, you can achieve the same effect by moving it as the very last command.

So, won’t the problem just change place?

At first, I thought I should do something like:

when a new item is received in the current folder, check my other folder; if it's empty, move all files from the current folder to the other folder; otherwise, stop.

But then the last couple of files received in the current folder might never be moved to the other folder if the script isn't finished processing…

And if, instead of quitting when my other folder isn't empty, I just wait, wouldn't that cause the same problem?… having a couple of scripts waiting.

or am I missing something?



Ok, I missed something…

I should now be able to make something work…

Thanks again.