I am duplicating a running Helix collection file and its log file (not a Library file) to a local directory. It fails intermittently, about one time in five, telling me that the log file is in use (on the duplicate with replacing step). In a handler before this one I save the collection file, which leaves a tiny log file and the collection and its log fully in sync.
CollectionName and collectionFile are properties. It all works great, except when the Finder thinks the log file (always the log file) is “in use”. This is deployed on OS X 10.8.5…
Any help is appreciated. Thanks. Lenny
tell application "Finder"
	set dupArray to {}
	-- move the collection into the array
	set CollectionName to name of collectionFile
	copy collectionFile to the end of dupArray
	-- move the log file into the array
	set theCollectionLogfile to ((mainLoc as text) & "Log_" & CollectionName & ".hlog")
	if (exists file theCollectionLogfile) then
		copy file theCollectionLogfile to the end of dupArray
	end if
	-- do the duplicate
	duplicate dupArray to backupFolder with replacing
end tell
If you change your scheme from:
	send a list of paths to duplicate
to:
	use a loop sending one path at a time to duplicate
you will be able to check that each duplication is finished before starting another one. Alternatively, you can encapsulate the duplicate instruction in a try … end try block, allowing you to retry until the process no longer errors.
Yvan KOENIG (VALLAURIS, France) vendredi 17 janvier 2014 18:05:07
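For what it's worth, the retry-until-it-works idea also translates to shell (callable from AppleScript via `do shell script`). A minimal sketch, where the temp file and folder are stand-ins for the real log file and backup folder, and the cap of 10 attempts is an arbitrary choice:

```shell
#!/bin/sh
# Retry a copy until it succeeds, one file at a time. The mktemp paths
# below are hypothetical stand-ins for the real collection/log paths.
SRC=$(mktemp /tmp/Log_Demo.XXXXXX)
DEST=$(mktemp -d /tmp/backup.XXXXXX)

tries=0
until cp -p "$SRC" "$DEST/"; do
    tries=$((tries + 1))
    if [ "$tries" -ge 10 ]; then
        echo "giving up after $tries failed attempts" >&2
        exit 1
    fi
    sleep 1   # give whoever holds the file a moment to let go
done
echo "copied after $((tries + 1)) attempt(s)"
```

Because each iteration waits for `cp` to finish before retrying, the "is the previous duplication done?" question answers itself.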
Thanks everyone for replying.
McUsril, unfortunately, this isn’t that kind of log file. Those do exist, but this is an external file that saves every keystroke so that it can be applied in the event of a system crash. It works extremely well. Helix is a great database platform.
Yvan, the reason I was doing it this way was because the program is running, and I figured that if the database and the log file were copied at the same time there would NOT be an opportunity for them to get out of sync. The database collection file in this case is about a gigabyte and the log file is tiny. It takes about 5 seconds to copy on an SSD drive. If I ran a repeat loop, then in the time it took to copy the collection file there would likely be new entries in the log file… They have about a dozen concurrent users.
I wonder if there is a Terminal command I can use to see what program is using this file. I think I am just going to rm the prior file; that seems to work, but I’d rather get it right. The file is still “in use” after hours of waiting, which makes me think it might not be a valid backup anyway…
Thanks.
I just tried this. Nothing is using the file, and it launches just fine. One wrinkle that might make things more clear: I was wrong - it wasn’t the second duplicate command that was failing, it was the delete of the files just before it. This little bit:
with timeout of 240 seconds
	tell application "Finder"
		delete every file of backupFolder
	end tell
end timeout
seems simple enough… I don’t know why it would fail on two files, which is all that’s ever in there. An obvious conclusion, of course, is that it isn’t the deletion that’s the problem, but that the duplicate leaves the files in some indeterminate state, probably because I duplicate an array instead of the files themselves in a repeat loop.
I figure the Finder is going to lock the two files in some way while they’re being copied, and I would like it to lock them both. That might not be the right term, but I’m sure you get the idea. Perhaps I should resort to a do shell script with a copy, listing both files at once, if it will do that.
Maybe Finder uses several threads to do it, and it may still be copying your file in the background on one thread while it tries to delete it on another; then the FileObserver would possibly kick in and say, “Hey, you can’t do that.”
I think you should resort to either unix tools or NSFileManager (ASOC) for copying and deleting anyway.
But try using System Events as a first stepping stone. (Finder may have its own ideas about files with a .log extension.)
Thanks, I came to the same conclusion. I have all kinds of adding and deleting in this script and they’re all tested and working. Still, I decided to use rm -f on this one… I think I will add a unix copying step as well…
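The rm -f plus unix-copy approach might look something like the sketch below: clear the old backup, then copy the collection file and its log in a single cp invocation so the two copies run back to back. The temp files are stand-ins for the real collection, log, and backup-folder paths; from AppleScript this would go through `do shell script`, with each path passed through `quoted form of`.

```shell
#!/bin/sh
# Stand-in paths; substitute the real collection, log, and backup folder.
BACKUP=$(mktemp -d /tmp/backup.XXXXXX)
COLLECTION=$(mktemp /tmp/MyCollection.XXXXXX)
LOGFILE=$(mktemp /tmp/Log_MyCollection.XXXXXX)

rm -f "$BACKUP"/*                           # replaces Finder's "delete every file"
cp -p "$COLLECTION" "$LOGFILE" "$BACKUP/"   # both files in one cp call
ls "$BACKUP"
```

One cp call is not atomic across the two files, but it does start the second copy the instant the first finishes, which is about as close to "lock them both" as the shell gets.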