Saturday, October 17, 2009

Selective Site Restoration

Recently I had to find, copy, and recreate specific files from a directory tree. A web site that I work on had every file with the word index in its filename overwritten with some hacker's private agenda. It wasn't a security issue with the site itself; apparently they hacked their way into the host's server and did it to everyone. Fortunately, I had a copy of the site code locally, because I develop the site locally and then upload whatever I change. Unfortunately, there are differences between the two copies, because the local one is configured to run locally.

To get the site back up, I just needed to replace everything that had index in its filename. To achieve this, I did the following from a terminal (the whole sequence is collected into one sketch after the list).
  1. I went to the top of the directory tree that needed to be restored.
  2. Typed in find . -iname 'index*' > ~/Web/tmp_files_to_copy to create a file with the names and locations of all the files I needed to upload (or find . -mtime -60 to find files modified within the last 60 days).
  3. Next I typed in tar cf ~/Web/tmp_files_to_copy.tar -T ~/Web/tmp_files_to_copy, which created a tarball of the files that would eventually be uploaded.
  4. Then I went to a temporary directory and typed in tar xf ~/Web/tmp_files_to_copy.tar to extract the desired files.
  5. Finally, I uploaded the entire directory structure to the web site.
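
For reference, here is roughly what those steps look like strung together in one shell session. The ~/Web/site and ~/Web/tmp_extract directory names are just placeholders for this sketch, not anything from the actual site.

    # Step 1: start at the top of the local copy of the site (placeholder path)
    cd ~/Web/site
    # Step 2: list every file whose name starts with "index" (case-insensitive)
    find . -iname 'index*' > ~/Web/tmp_files_to_copy
    # Step 3: tar up exactly the files named in that list
    tar cf ~/Web/tmp_files_to_copy.tar -T ~/Web/tmp_files_to_copy
    # Step 4: extract them into a scratch directory, preserving the paths
    mkdir -p ~/Web/tmp_extract
    cd ~/Web/tmp_extract
    tar xf ~/Web/tmp_files_to_copy.tar
    # Step 5: upload the contents of ~/Web/tmp_extract to the web host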
Fixing the site didn't take long. It took significantly longer to figure out what had been broken on the site.

Someday I may modify this into a script that automatically finds the latest script changes and uploads them to the correct locations on a web site, but that would be a task for a different day.
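
A first pass at that script might look something like the sketch below. It swaps the manual upload for rsync (my choice, not something I've actually set up), and the host name, remote path, and 60-day window are all placeholders.

    #!/bin/sh
    # Sketch: find recently changed files in the local copy of the site and
    # push them to the matching locations on the host.
    # user@example.com, /var/www/site, and the 60-day window are placeholders.
    SITE_DIR="$HOME/Web/site"
    REMOTE="user@example.com:/var/www/site"
    LIST="$HOME/Web/tmp_files_to_copy"

    cd "$SITE_DIR" || exit 1
    # Collect everything modified in the last 60 days, relative to the site root.
    find . -type f -mtime -60 > "$LIST"
    # rsync re-creates the directory structure listed in the file on the remote side.
    rsync -av --files-from="$LIST" . "$REMOTE"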
