The command line, in short…
    `wget -k -K -E -r -l 10 -p -N -F --restrict-file-names=windows -nH http://website.com/`

    …and the options explained
* `-k` : convert links in the downloaded documents to relative, so they work locally
* `-K` : keep the original version of every file wget converts (saved with a `.orig` suffix)
* `-E` : rename HTML files to `.html` (if they don't already have an `.htm(l)` extension)
* `-r` : recursive; of course we want to make a recursive copy
* `-l 10` : the maximum level of recursion; a really big website may need a higher number, but 10 levels should be enough
* `-p` : download all files needed to render each page (CSS, JS, images)
* `-N` : turn on time-stamping, so files are only re-downloaded when the server copy is newer
* `-F` : when input is read from a file, force it to be treated as an HTML file
* `-nH` : by default, wget puts files in a directory named after the site's hostname; this disables those hostname directories and puts everything in the current directory
* `--restrict-file-names=windows` : escape characters that are invalid in Windows filenames; useful if you want to copy the files to a Windows PC
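Spelled out with wget's long option names, the same command reads more clearly in scripts. This is a sketch; `website.com` is a placeholder for the site you want to mirror:

```shell
# Long-form equivalent of: wget -k -K -E -r -l 10 -p -N -F --restrict-file-names=windows -nH
wget --convert-links --backup-converted --adjust-extension \
     --recursive --level=10 --page-requisites --timestamping \
     --force-html --no-host-directories \
     --restrict-file-names=windows \
     http://website.com/
```

Each long option maps one-to-one onto the short flags explained above, so the list doubles as documentation for this form.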

    source: http://blog.jphoude.qc.ca/2007/10/16/creating-static-copy-of-a-dynamic-website/
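To check the result, any static file server will do; for example, Python's built-in one (a sketch, assuming the mirror landed in the current directory because of `-nH`):

```shell
# Serve the mirrored copy at http://localhost:8000 for a quick visual check
python3 -m http.server 8000
```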