PaulW
Member
Registered: 26th Jan 03
Location: Atherton, Greater Manchester
User status: Offline
|
code:
wget http://blah/backup.php
wget ftp://blah/poo/ --ftp-user=weee --ftp-password=donk -m -nH
Basically... the first wget calls a backup script which does a database backup, then the second one pulls a complete mirror of the poo folder from the host...
I'm using -nH so it doesn't create the ftp.blah.com folder, BUT it's still making a poo folder...
How can I make a cron job do both of these tasks as one job, in the order above (so the FTP mirror only starts AFTER the first wget has run)???
cheers
|
PaulW
Member
Registered: 26th Jan 03
Location: Atherton, Greater Manchester
User status: Offline
|
also... would this overwrite all existing files with any newer ones & leave all unaltered files intact, or would it not overwrite any files?? I would have assumed the -m option for wget literally will 'mirror' the site's contents...
|
James_DT
Member
Registered: 9th Apr 04
Location: Cambridgeshire
User status: Offline
|
Write a script that executes those commands, and set the Cron to run the script?
|
PaulW
Member
Registered: 26th Jan 03
Location: Atherton, Greater Manchester
User status: Offline
|
I've added the commands to /etc/cron_dome.sh
tried adding this to cron...
00 03 * * 0 root sh /etc/cron_dome.sh
but it doesn't work
tested by changing the cron entry to
50 * * * root sh /etc/cron_dome.sh
so it should have run at ten to six, but nothing... I also restarted cron after updating the crontab
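(Side note: the entry format depends on which crontab it goes into; /etc/crontab wants five time fields, then a user, then the command, while a per-user crontab from crontab -e has no user field. A rough sketch of each, assuming the script lives at /etc/cron_dome.sh:)
code:
# /etc/crontab - minute hour day month weekday user command
00 03 * * 0 root /etc/cron_dome.sh
# per-user crontab (crontab -e) - no user field
00 03 * * 0 /etc/cron_dome.sh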
|
willay
Moderator Organiser: South East, National Events Premium Member
Registered: 10th Nov 02
Location: Roydon, Essex
User status: Offline
|
try:
code:
#!/bin/sh
#script to do paulz shit
wget http://blah/backup.php
wget ftp://blah/poo/ --ftp-user=weee --ftp-password=donk -m -nH
then save it somewhere, chmod +x the file (make it executable), then add the cron entry as /path/to/script.sh (don't put sh at the start)
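For example (a quick sketch, the path and times are just placeholders):
code:
chmod +x /path/to/script.sh
crontab -e
# then add a line like this to run it every Sunday at 3am:
00 03 * * 0 /path/to/script.sh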
|
willay
Moderator Organiser: South East, National Events Premium Member
Registered: 10th Nov 02
Location: Roydon, Essex
User status: Offline
|
also do a whereis wget and get the full path to wget, then put this into the script, as crontab may not be using the same environment variables as your shell.
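Something like this (the /usr/bin location is just the usual spot, use whatever whereis reports on your box):
code:
$ whereis wget
wget: /usr/bin/wget /usr/share/man/man1/wget.1.gz
# then call it by full path in the script:
/usr/bin/wget http://blah/backup.php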
|
willay
Moderator Organiser: South East, National Events Premium Member
Registered: 10th Nov 02
Location: Roydon, Essex
User status: Offline
|
btw - have you considered doing this on the server where the website is hosted? You could do a dirty script to call up tar and make the archive with the date in the filename, then use ftp/scp to transfer the file over to your backup location.
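Rough sketch of that sort of script (paths, hostname and backup user are all made up):
code:
#!/bin/sh
# tar up the site with the date in the filename, then copy it off the host
DATE=`date +%Y-%m-%d`
tar czf /tmp/site-backup-$DATE.tar.gz /home/blah/poo
scp /tmp/site-backup-$DATE.tar.gz backupuser@backup.host:/backups/
rm /tmp/site-backup-$DATE.tar.gz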
|
PaulW
Member
Registered: 26th Jan 03
Location: Atherton, Greater Manchester
User status: Offline
|
I can't do any cron jobs on the server, as it's not enabled on the hosting plan
think it didn't work as I was forgetting to chmod +x the file...
testing now
|
PaulW
Member
Registered: 26th Jan 03
Location: Atherton, Greater Manchester
User status: Offline
|
cheers willay, it really was a case of chmod +x the file!
|
willay
Moderator Organiser: South East, National Events Premium Member
Registered: 10th Nov 02
Location: Roydon, Essex
User status: Offline
|
elite, though your plan for backups sounds pretty smacky, good luck!
|
PaulW
Member
Registered: 26th Jan 03
Location: Atherton, Greater Manchester
User status: Offline
|
lol basically gonna back up the entire thing every Sunday at 3am... the script creates a .SQL.GZ backup of the database on the host, then all the contents are backed up...
need to do it so it actually compares the size of each file before downloading, otherwise it'll overwrite everything each time; tbh I only want it to fetch modified / updated files...
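(For what it's worth, wget's -m is documented as shorthand for -r -N -l inf --no-remove-listing, so timestamping is already on; with -N it only re-downloads files whose remote timestamp/size don't match the local copy. The same mirror written out long-hand, same host details as above:)
code:
wget ftp://blah/poo/ --ftp-user=weee --ftp-password=donk -r -N -l inf --no-remove-listing -nH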
|