The first tool you need for developing web pages is a good editor. Many people have these glitzy WYSIWYG HTML editors, but I prefer really knowing what HTML codes are in my pages. Hey, I'm an engineer, I like knowing what's under the hood.
Standards are a Good Thing
One of the goals of the Web is enabling as many browser clients as possible to accurately render web pages. To accomplish this the World Wide Web Consortium (W3C) has written standards defining what is correct HTML and what is not.
Another useful standard is Dublin Core Metadata for Resource Discovery, which describes a common set of META tags. The Dublin Core Metadata Element Set is an update to the definitions in that RFC.
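As a sketch of what these META tags look like in practice (the `DC.` prefix is the Dublin Core convention; the values here are made-up examples, not from a real page):

```html
<head>
  <!-- Dublin Core metadata describing the page itself -->
  <meta name="DC.title" content="Web Page Creation Tools">
  <meta name="DC.creator" content="Jane Doe">
  <meta name="DC.date" content="1999-04-01">
  <meta name="DC.language" content="en">
</head>
```

Search engines and cataloguing tools that understand Dublin Core can then discover the page's title, author, and date without guessing at them from the body text.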
There are some pitfalls with current browsers when it comes to getting your web pages to render correctly with other character sets. A tutorial on character code issues is a great article on getting things working.
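The first step is usually to declare the character encoding explicitly rather than letting the browser guess. A minimal sketch using the HTTP-EQUIV META tag:

```html
<head>
  <!-- Tell the browser which character encoding the page uses -->
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
```

Without a declaration like this, browsers fall back on their own defaults, which is where many of the rendering pitfalls come from.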
A great demonstration for seeing multiple languages on one page is presented as an advert for the Tenth International Unicode Conference.
There are many tools available for helping in the creation of web pages. These are the ones I use to create my web pages.
I used to use a web page templating setup I rolled myself (see [m4 Macro Processor] below). Now I am using ikiwiki to create all the pages on this site. ikiwiki produces static web pages like my home-brew solution did, but it also has many desirable features that I would never have added.
Here is a list of ikiwiki features that I am using on my site:
I used to write all my web pages by hand in HTML so that I had the greatest amount of control over the content of my web pages (I just hate it when fancy HTML editors cram lots of junk in my web pages). I have softened a bit in my old age and now allow MultiMarkdown or, if not available, the original Markdown to do most of the heavy lifting.
What I like about the Markdown syntax is that you can maintain very readable e-mail-like text that gets converted into HTML. And for the tasks Markdown is not up to, Markdown will directly pass through inline HTML so you retain fine control of the output. The best of both worlds.
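A small sketch of what I mean (the text is made up; the HTML table passes through Markdown untouched while the rest is converted):

```markdown
Plain *emphasis* and [links](http://example.com/) read fine as e-mail-like text.

<table>
  <tr><th>Tool</th><th>Use</th></tr>
  <tr><td>m4</td><td>old templating setup</td></tr>
  <tr><td>ikiwiki</td><td>current site generator</td></tr>
</table>
```

Markdown has no table syntax of its own (MultiMarkdown does), so dropping to raw HTML for the table is exactly the kind of fine control the passthrough gives you.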
Link Checker is a simple tool for checking links on pages. This is a convenient tool for me because it comes packaged in Fedora already. Hence, this is the tool I use for validating links on my web pages.
The following command is what I use to validate my pages:
    linkchecker --ignore-url=^mailto: --ignore-url=^ftp: \
        --ignore-url=/tags/ --ignore-url=/blog/ \
        --ignore-url=linkchecker-out.html -Fhtml/utf8/ \
        http://moria.greycastle.net/
Webcheck is another tool for validating that all the internal and external links on a web page are still good (i.e., not dead links).
Webcheck is based on linbot which was bullied out of existence. The original author of linbot appears to have passed the torch on to a new maintainer who changed the name of the program to avoid the same treatment as linbot.
m4 Macro Processor
While it may take a bit to get a handle on the syntax for this package, it can be a very powerful tool. I used to use m4 as the template engine for all my web pages. Now I use [ikiwiki].
The old m4 setup defines a skeletal HTML file that will be the basis for a page, a template if you will, that m4 will parse and include the body part from another file. It took me a bit of time to get this working the way I like (actually most of the time was getting it into shape so that my wife could also use it) but it actually worked well. Well enough to generate all my wife's pages and mine.
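A minimal sketch of the idea (the file and macro names here are hypothetical, not my actual setup): the skeleton holds the boilerplate, and m4 splices each page's body file into it.

```m4
dnl skeleton.m4 -- page template; TITLE and BODY are defined on the command line
<html>
<head><title>TITLE</title></head>
<body>
include(BODY)dnl
</body>
</html>
```

Each page is then generated with something like `m4 -DTITLE='About Me' -DBODY=about.body skeleton.m4 > about.html`, so changing the site-wide layout means editing one skeleton instead of every page.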
- The m4 Macro Package
- Writing HTML with m4
- An Introduction to m4
- m4 Part 1: Macro Magic and Part 2: The Unknown Power Tool
One problem the web has is that its data is too transient. Your favorite site that you read every day can one day just drop off the face of the earth, and you kick yourself for not mirroring all the pages you use for reference. Fear not, the Internet Archive probably mirrored that site for you.