SLUG Mailing List Archives
[SLUG] Spider a website
- To: SLUG <slug@xxxxxxxxxxx>
- Subject: [SLUG] Spider a website
- From: Peter Rundle <slug@xxxxxxxxxxxxxxxxxx>
- Date: Tue, 03 Jun 2008 14:20:08 +1000
- User-agent: Thunderbird (X11/20071115)
I'm looking for recommendations for a *simple* Linux-based tool to spider a web site and pull the content back down as
plain HTML files, images, JS, CSS, etc.
I have a site written in PHP which needs to be hosted temporarily on a server that can't run PHP (read: it only serves
static content). This isn't a problem from a temporary presentation point of view, as the default output of each page will suffice.
So I'm just looking for a tool which will quickly pull the real site (on my home PHP-capable server) into a directory
that I can zip and send to the internet-addressable server.
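One candidate I've been eyeing is wget's mirror mode; a rough sketch of what I have in mind is below (the URL is just a
placeholder for my home server), but I'd like to hear what others actually use:

    # Mirror the site, grabbing images/CSS/JS as well, rewriting links so
    # the copy works as local static files, and adding .html extensions
    # to the pages PHP generates:
    wget --mirror --page-requisites --convert-links \
         --html-extension --no-parent http://homeserver.example/mysite/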
I know there's a lot of code out there; I'm asking for recommendations.