Subject: Hosting backup software
Category: Computers > Internet
Asked by: topbanana-ga
List Price: $10.00
Posted: 30 Mar 2004 06:12 PST
Expires: 24 Apr 2004 03:16 PDT
Question ID: 322311
I'm looking for some software. The requirements: preferably freeware, although having no idea what's on the market I accept that a freeware solution may not be available. Something that would run on a machine with only some initial user input, trawl through a site at regular intervals, and back up everything locally (kind of like an offline mirror, but one that would only be made available to the web if the primary server went down). The aim is to be able to simply switch the IP of the domain across to the backup and continue as normal. I realise some companies incorporate this kind of redundancy as part of some hosting packages, but I was wondering whether a solution is already available for doing this on my own.
There is no answer at this time.
Subject: Re: Hosting backup software
From: s3com-ga on 30 Mar 2004 06:47 PST
If you want something free, Unix/Linux is the best choice. An example of Linux backup software is rsync, though Linux is a more complicated OS than MS Windows.
http://www.linuxforum.com/shell/rsync/92-8.php
http://www.webmasterworld.com/forum40/907.htm
Regards, s3com
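For reference, a minimal rsync sketch, assuming you have SSH access to the host; user@example.com and both paths below are placeholders, not details from the question:

  # Mirror the remote document root into a local directory over SSH,
  # deleting local files that no longer exist on the source.
  rsync -az --delete -e ssh user@example.com:/var/www/site/ /backup/site/

Run it from cron to repeat the copy at regular intervals; note that this approach does require a login on the server being backed up.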
Subject: Re: Hosting backup software
From: topbanana-ga on 30 Mar 2004 06:55 PST
I realise this might not be technically possible, but I was thinking of a solution more along the lines of entering a top-level domain and having something trawl the pages for links, images, etc., saving them in the appropriate directory structure. This obviously wouldn't work for PHP and the like and would severely limit the content that could be backed up, but it wouldn't require the username/password of the server currently hosting the site, which is what I'm initially after.
Subject: Re: Hosting backup software
From: 99of9-ga on 30 Mar 2004 07:21 PST
If you're familiar with Linux, I believe the recursive "wget" command can do such a trawl. But I'm no expert.
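A minimal sketch of such a trawl over plain HTTP, with example.com standing in for the site to be mirrored:

  # Recursively fetch the site, pulling in the images/CSS needed to display
  # each page and rewriting links so the local copy works offline.
  wget --recursive --page-requisites --convert-links --no-parent http://example.com/

Because this fetches over HTTP like an ordinary visitor, it needs no login on the hosting server, which matches the follow-up above.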
Subject: Re: Hosting backup software
From: purplepenguin-ga on 15 Apr 2004 16:36 PDT
Wget sounds like a perfect candidate for your requirements. It can do recursive crawls of linked files (which can be restricted to all files on a given top-level domain, or other criteria you specify). Wget can also do HTTP authentication, FTP, and local hyperlink conversion, and it has support for cookies.

The main site for it is: http://www.gnu.org/software/wget/wget.html
You can download it for Windows or several other OS's.

To do what you are describing, you would add a scheduled task that checks the site at an interval. The -N switch causes it to download only files for which the site has a newer copy.

A note on installation under Windows: there is no setup program or MSI package, but all you need to do is download wget-complete-stable.zip from the above site and read the readme.txt contained within. I have been using it on several Windows systems for years now without any problems. See the wget.hlp in wget-complete-stable.zip if you need any more details on this program.
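Putting the above together, a sketch of a repeatable mirror job; example.com, /backup/site, and the nightly schedule are placeholder choices, not values from the thread:

  # --mirror turns on recursion plus -N timestamping, so unchanged files are
  # not downloaded again; --convert-links makes the copy browsable offline.
  wget --mirror --page-requisites --convert-links --no-parent \
       -P /backup/site http://example.com/

  # Example cron entry to refresh the mirror every night at 02:00.
  0 2 * * * wget --mirror --convert-links --no-parent -P /backup/site http://example.com/ >/dev/null 2>&1

On Windows, the same command line can be run from a Scheduled Task instead of cron.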