Managing a static HTML website with 100 or more pages, and no CMS
seanie | Dec 12 2010, 11:13 PM
If you want 'static HTML' because you can't do server-side scripting, then you can use script elements to generate the page boilerplate on the client, but you'll have your work cut out satisfying every user-agent (what if a browser has javascript turned off, or incomplete support?), and visitors may not be able to save a meaningful copy of the document for offline viewing.
If it's only 100 pages or so, could you re-generate the content offline and then replace the previous content on the server? It sounds a bit clunky, but it's basically what a cache does when you update the server-side scripts on a dynamic site. How often are you likely to change the boilerplate? The great thing about doing it that way is that you write your site using your normal server-side techniques, then just use wget or similar to 'download' a static version of it, which you upload to the static web server (if you can't do it in one shot by running wget on the static server itself).
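A minimal sketch of that round trip. The host and path names are hypothetical placeholders; the commands are echoed rather than executed so you can inspect them first, then run them once the URLs point somewhere real.

```shell
#!/bin/sh
# Sketch of the offline-regenerate-and-upload cycle.  All host and
# path names here are hypothetical placeholders.
SRC_URL="http://dynamicsite.localnet/"    # the dynamic, script-driven site
DEST="user@staticserver:/var/www/html/"   # the static web server
OUT_DIR="static-copy"

# --mirror is shorthand for -r -N -l inf --no-remove-listing;
# --convert-links rewrites URLs so the copy browses on its own;
# --page-requisites also fetches the CSS and images each page needs.
MIRROR_CMD="wget --mirror --convert-links --page-requisites \
  --directory-prefix=$OUT_DIR $SRC_URL"

# rsync -a preserves timestamps (so wget's -N only refetches changed
# pages next time); --delete drops pages removed from the source site.
UPLOAD_CMD="rsync -a --delete $OUT_DIR/ $DEST"

# Printed for inspection rather than run.
echo "$MIRROR_CMD"
echo "$UPLOAD_CMD"
```

The timestamping matters: on the second and later runs, wget's `-N` only refetches pages whose server copy is newer, so regenerating 100 pages is quick when only a few changed.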
seanie | Dec 16 2010, 09:44 AM
QUOTE
wget is too command prompt base, hard to use.

CODE
wget --mirror http://staticsite1.localnet/

You won't need to do anything else with wget; that one command creates a static copy of the original site on your hard disk. Before you take it for a test drive, make sure the target site is a very small one that won't mind being scraped - wget's recursive mode honours robots.txt Disallow rules by default, but it doesn't understand Crawl-delay, so any pause between requests is up to you.

Added on December 16, 2010, 9:46 am:
Oh, and make sure your site doesn't use sessions - there's no counterpart in a static site. And no URLs created in javascript - wget won't see them (but then, nothing will for some kinds of client-side generated URLs).

This post has been edited by seanie: Dec 16 2010, 09:46 AM
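Since wget won't pace itself, a sketch of a more considerate invocation, using the test-site URL from the command above (echoed rather than executed so the flags can be checked before pointing it at a real site):

```shell
#!/bin/sh
# Politeness sketch: throttle a recursive crawl so the origin server
# isn't hammered.  The URL is the example test site from the post.
# --wait=1       pause one second between requests
# --random-wait  vary that pause (roughly 0.5x to 1.5x of --wait)
# --limit-rate   cap the download bandwidth
CRAWL_CMD="wget --mirror --convert-links --page-requisites \
  --wait=1 --random-wait --limit-rate=200k \
  http://staticsite1.localnet/"

echo "$CRAWL_CMD"   # printed for inspection; run it when you're ready
```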
seanie | Dec 16 2010, 09:29 PM
QUOTE(happy4ever @ Dec 16 2010, 09:12 PM)
so it could be problematic.

Well, that's your job. Try out a few and see how it goes for you. The one that exports a static site most reliably is the one to use. wget shouldn't give you any trouble with CSS - it reads stylesheets to hunt for background images and the like. Javascript is only really a problem if it is generating URLs for server-side scripts. If it's generating URLs for content that already exists on the server, then wkkay's tip to copy the static content separately from the wget run will handle that.
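A runnable sketch of that tip, with stand-in directory names: content that only javascript ever links to is invisible to wget's HTML and CSS parsing, so it gets copied straight from the source tree into the mirror by hand.

```shell
#!/bin/sh
# Stand-in directories; in real use, source-site/ would be the original
# site's document root and static-copy/ the wget output tree.
set -e
mkdir -p source-site/js-only static-copy
echo '{"demo": true}' > source-site/js-only/data.json  # only ever fetched by js

# wget never sees a link to js-only/, so merge it into the mirror manually.
cp -R source-site/js-only static-copy/

ls static-copy/js-only
```

Run after the wget pass, the merge step gives the static server the same files the javascript expects to fetch, at the same relative paths.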