After too many hours I've finally figured out a way to download backups from my sites. I have a page on my site that checks my IP and dumps the whole database, but only when I browse to it. As it is now I have to set the folder and site every time I run the script; it would be nice if it remembered them…
What I'd like now is for it to download from several different sites. In ASP I would save the URLs and folders in an array (or a database, but I'd rather not) and then loop through that array, downloading and saving to different folders. Maybe I need to write them to a text file?
Anyone?
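One way to try (a sketch, not tested against your sites): AppleScript's equivalent of the ASP array-and-loop approach is a list of records, one per site, walked with `repeat with`. The URLs and folder paths below are placeholders you'd replace with your own, and `write_to_file` is the handler from the script further down:

	-- list of sites; each record pairs a URL with a destination folder path
	property pSites : {¬
		{siteURL:"http://www.example.com/backup/", folderPath:"Macintosh HD:Users:me:Backups:site1:"}, ¬
		{siteURL:"http://www.example.org/backup/", folderPath:"Macintosh HD:Users:me:Backups:site2:"}}

	to backupAllSites()
		repeat with aSite in pSites
			-- timestamped file name, e.g. 20240101-120000.sql
			set dateTime to do shell script "date +%Y%m%d-%H%M%S"
			set sqlDump to do shell script "curl " & quoted form of (siteURL of aSite)
			my write_to_file(sqlDump, (folderPath of aSite) & dateTime & ".sql", false)
		end repeat
	end backupAllSites

Because `pSites` is a `property` in a saved applet, edits to the list survive between runs, so no text file is needed unless you want to edit the site list outside the script.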
	global pParentFolder, full_url

	on idle theObject
		doBackup()
		-- return 24 * hours
		return 10 -- run every 10 seconds just for now
	end idle

	on run
		set pParentFolder to choose folder with prompt "Choose a folder where to save the backup files"
		set full_url to text returned of (display dialog "URL: " default answer "http://www.5clubs.com/backup/")
		doBackup()
	end run

	to doBackup()
		-- timestamped file name, e.g. 20240101-120000.sql
		-- (the original format string had a stray % before the hyphen)
		set dateTime to do shell script "date +%Y%m%d-%H%M%S"
		set new_file to dateTime & ".sql"
		-- fetch the dump; write_to_file creates the file itself,
		-- so the Finder "make new file" call isn't needed
		set sHtml to do shell script "curl " & quoted form of full_url
		set this_file to (pParentFolder as text) & new_file
		my write_to_file(sHtml, this_file, false)
	end doBackup
	on write_to_file(this_data, target_file, append_data)
		try
			set the target_file to the target_file as text
			set the open_target_file to ¬
				open for access file target_file with write permission
			if append_data is false then ¬
				set eof of the open_target_file to 0
			write this_data to the open_target_file starting at eof
			close access the open_target_file
			return true
		on error
			try
				close access file target_file
			end try
			return false
		end try
	end write_to_file
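As for remembering the folder and site between runs: in an AppleScript applet, top-level `property` values are written back into the saved script when it quits, so declaring them as properties instead of globals and only prompting when they're unset should do it. A sketch along those lines (assumes the script is saved as a stay-open application; code-signed or locked applets can't save properties back):

	property pParentFolder : missing value
	property full_url : missing value

	on run
		-- prompt only on first launch; afterwards the saved values are reused
		if pParentFolder is missing value then
			set pParentFolder to choose folder with prompt "Choose a folder where to save the backup files"
		end if
		if full_url is missing value then
			set full_url to text returned of (display dialog "URL: " default answer "http://www.5clubs.com/backup/")
		end if
		doBackup()
	end run

Reset them by setting both back to `missing value` (or holding a modifier and re-prompting, if you add that).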