do shell script "curl http://www.google.com/ -o `date '+/Users/username/Desktop/backups/db%y%m%d/site_com%y%m%d.sql'`"
Type “man curl” in a Terminal window for the full documentation.
And there are lots of options to schedule it… You could even create a cron task with these shell statements, without needing AppleScript at all…
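As a sketch of what such a cron task could look like (the schedule, path, and URL here are only placeholders — edit your own with `crontab -e`): note that cron treats a literal `%` in the command as a newline, so any `date` format strings like the one in the first post would need each `%` escaped as `\%` inside the crontab.

```
# Hypothetical crontab entry: fetch the backup every day at 03:15.
15 3 * * * /usr/bin/curl http://www.example.com/backup/ -o /Users/username/Desktop/backups/site_com.sql
```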
on get_url(this_url)
	set browser_string to "'Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en-us) AppleWebKit/125.4 (KHTML, like Gecko) Safari/125.9'"
	return do shell script "curl " & (quoted form of this_url) & " -A " & browser_string
end get_url
The man curl was nice, but it gives me a strange problem: it only prints as much text as fits in the window, and I get no scrolling. When I make the window bigger I get more info.
As you can tell, this is about the first time I've opened the Terminal…
You “scroll” man pages using the space bar. And you exit them using the key “q”.
Can you post your non-working code based on my first example? (Anyway, your code is nearly the same as mine; both are valid.)
do shell script "curl http://www.5clubs.com/backup/ | bbedit"
which opens the source in BBEdit, and from there I could tell BBEdit to save it where I like — but I don't have BBEdit on the server I have in mind (though I do have TextEdit). So what I need now is how to create a file named (domain)050615.sql as a plain text file
set fileName to do shell script "date +%y%m%d"
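That date value can be combined with curl's `-o` option so the shell writes the dated file directly, with no editor involved at all. A minimal sketch — the URL and folder are placeholders, and the curl line is commented out so nothing is actually fetched here:

```shell
#!/bin/sh
# Build a dated name like 050615.sql from today's date (yy mm dd).
fileName=$(date +%y%m%d)
target="$HOME/Desktop/backups/${fileName}.sql"
echo "would save to: $target"
# Placeholder URL -- uncomment for the real download:
# curl "http://www.example.com/backup/" -o "$target"
```

Because curl writes the raw response bytes itself, the result is already a plain text file; no TextEdit step is needed.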
In asp I would use an array for the urls since this will become a loop
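In AppleScript the equivalent of that array is a list you can `repeat` over. A rough sketch, with made-up placeholder URLs:

```
-- Hypothetical list of backup URLs; loop over it like an ASP array.
set urlList to {"http://www.example.com/backup/", "http://www.example.org/backup/"}
repeat with this_url in urlList
	set sHtml to do shell script "curl " & quoted form of (contents of this_url)
	-- write sHtml to a dated file per site here
end repeat
```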
global pParentFolder, full_url

on idle theObject
	doBackup()
	-- return 24 * hours
	return 10 -- run every 10 seconds just for now
end idle

on run
	set pParentFolder to choose folder with prompt "Choose a folder where to save the backup files"
	set full_url to text returned of (display dialog "URL: " default answer "http://www.5clubs.com/backup/")
	doBackup()
end run

to doBackup()
	set dateTime to do shell script "date +%Y%m%d-%H%M%S"
	set new_file to dateTime & ".sql"
	tell application "Finder" to make new file at pParentFolder with properties {name:new_file}
	set sHtml to do shell script "curl " & (quoted form of full_url)
	set this_file to (pParentFolder as text) & new_file
	my write_to_file(sHtml, this_file, false)
end doBackup
on write_to_file(this_data, target_file, append_data)
	try
		set the target_file to the target_file as text
		set the open_target_file to ¬
			open for access file target_file with write permission
		if append_data is false then ¬
			set eof of the open_target_file to 0
		write this_data to the open_target_file starting at eof
		close access the open_target_file
		return true
	on error
		try
			close access file target_file
		end try
		return false
	end try
end write_to_file
The next step is making it work for more than one site, and also making it download only the tables of my choice. I was thinking of saving the URLs in a text file and then storing all the table names tab-separated after each URL. If I build a page that spits out only the table names, and have a function where the script gets that info and stores it on a row, maybe I can just edit the list in the text file. Something like this:
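One hypothetical layout for that text file — this is only my guess at what you described, with made-up file names and table names — is one site per line, URL first, then its tables tab-separated. A shell sketch of reading it back:

```shell
#!/bin/sh
# Hypothetical sites.txt format, one line per site:
#   http://www.example.com/backup/<TAB>members<TAB>posts
printf 'http://www.example.com/backup/\tmembers\tposts\n' > /tmp/sites.txt

# Split each line on tabs: first field is the URL,
# the rest of the line (the table names) lands in $tables.
while IFS="$(printf '\t')" read -r url tables; do
	echo "site: $url  tables: $tables"
done < /tmp/sites.txt
```

The same split could be done in AppleScript by setting text item delimiters to tab before reading each paragraph of the file.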