Current Topic: Web programming is, and has always been, an art. Lately it has become a careful balance between media and content (copy). Because there are so many operating systems, browsers, and search engines, simple mistakes in programming (presenting content) can disenfranchise a large population of potential viewers.
Test Your HTML On A Compliant Browser...
There are several differences between one particular browser and all the others. The one browser implements some elements incorrectly and additionally ignores certain attributes, none of which is 'Standards Compliant'. The others all render a web page closely enough to one another that the differences can be ignored. The one will render a huge mess for certain common elements because it interprets the standard differently. This presents a huge problem for programmers, but over time the best have learned that it is much easier to write compliant code and then adjust it for the exception rather than the other way around.
Me... I prefer 'Firefox'. Not only is it well maintained and multi-platform, but it has additional developer features which I depend on. My experience has also shown that if a page displays properly on Firefox then it will on all the others. Except for the 'one', which I occasionally have to program exceptions for.
Presenting Media Correctly...
Modern media is not only a bandwidth concern; it also needs to be sensitive to the user and whatever system they may be using. Nearly every failure of a web page to load quickly and behave responsively traces directly back to sloppy (lazy) code. There is almost always an alternative method which promotes better browser performance for the simple cost of better preparation. Sadly, I suppose, since so much public media is transient, nobody cares anymore.
Reduce thumbnail image quality and size. A 'presentation' page should not contain high definition media. These images should be teasers that encourage an audience to click through (follow, explore); if they aren't, the presentation has failed. I go out of my way to compress my 'teaser' thumbnails to the minimum size that still preserves clarity, while following the thumbnail link leads to the final high quality presentation. The user gets what they expected and the overall 'user experience' is improved. This applies, of course, to banners as well. I am still amazed at how often I see a banner image that originally had low color depth (few colors) compressed as a high quality JPEG in an attempt to preserve the flat colors and high contrast edges (between colors). JPEGs fail miserably at this. PNG or GIF compression, however, allows indexed (reduced) color formats which produce very small files with 'no' loss of image quality. I have commonly seen this one detail represent a 50 to 1 saving on image size and reduce page load by 95 percent.
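As a minimal sketch (the file names here are hypothetical), the teaser pattern is nothing more than a small indexed-color thumbnail wrapped in a link to the full quality media:

<!-- a few-kilobyte PNG teaser linking through to the full quality JPEG -->
<a href="gallery/sunset-full.jpg">
  <img src="gallery/sunset-thumb.png" width="160" height="120"
       alt="Sunset over the bay (click for full size)">
</a>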
Understand What You Are Embedding. The minimalist rule applies especially to embedded objects. The time it takes some systems to load and initialize numerous embedded instances of the same object can be horrible. Not everyone has the latest, greatest computer and connection that the author had. It is a tragic mistake not to recognize this. It is always a better practice to instead have an image or animated image as a tease, where the click-through links to a page that loads the embedded object and presents the media. As an example, many embedded ads use Adobe Flash to present the media. Many websites display many such ads without knowing or caring how they might spoil the 'user experience' by causing slow load times. Especially on Linux systems, where the Flash Player is poorly optimized. Some viewers may be so turned off by this that they navigate away to another source for the information they seek. Don't make this mistake.
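The same click-through idea, sketched in HTML (file names hypothetical): the presentation page carries only a lightweight teaser image, and only the linked page pays the cost of loading the embedded object:

<!-- presentation page: no embedded object, just an animated GIF teaser -->
<a href="media/demo-player.html">
  <img src="media/demo-teaser.gif" width="320" height="240"
       alt="Product demo (click to play)">
</a>
<!-- media/demo-player.html contains the actual embed and loads the plugin -->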
Program To The Least Common Denominator...
Blah blah!
Use Original Content...
Original content is very important to search engines. Don't think you can just create a web site based on content from, and/or links to, other sites, put a bunch of advertisements on it, and retire. Search engines are very sensitive to this kind of trickery and will judge your site to be unoriginal and boring. In the worst case they might 'Blacklist' your site, ensuring it never places in their search results and effectively becomes invisible.
Make Sure External Links Stay In Context...
For a short time search results were partially ranked on how many external websites linked to yours, and additionally on social media suggestions. This led to a whole cottage industry of phony-link (likes) sites. For a small fee you could have, instantly, hundreds of links to your site. Artificially. And search engines recognized this as 'Unnatural' and 'Non Organic'. Search engines know 'everything'. All that's necessary are algorithms that put the data together, and it's obvious who's been naughty. And the algorithms keep getting better and better.
That said... Links are still important for search engine ranking. But now additional emphasis is placed on the context of the link. In other words, do the source and destination 'content' have similarities to each other, and might the user genuinely be interested in the content provided by the link? In either direction. Yeah! Search engines are getting clever.
The 'Moral' of this story is that an honest website is a good website. After that, all websites are equal in the algorithms of the 'search engine'.
Using CSS To Define Coherent Web Sites...
Understanding HTML means understanding its elements and how HTML represents them. For each element there is a set of rules (methods, attributes, behaviors) that defines how it will react to user input, how it will display itself, and how much space (area) it will consume on the page. Additionally, elements can be stacked (nested). When this occurs the descendant (stacked element) initially 'inherits' these rules from its 'ancestor' (container element). HTML defines a method of overriding the rules of every element, and these override statements are what make HTML a blank canvas suitable for anything imaginable. Without this ability web pages would render as if they came from a typewriter. No style.
Originally, in order to modify the 'style' of an element, it was required to code the style into the element's definition (tag). Sometimes many attributes needed to be changed. On each element. For hundreds of elements. And thus there was an inherent, huge redundancy in the coding. This led to an alternative method which improved efficiency and reduced HTML file size, and thus parsing time (bandwidth), while preserving, actually enhancing, individual element presentation.
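To see the redundancy, here is a contrived sketch of the old inline approach, repeating the same attributes on element after element:

<p style="font-family: Georgia, serif; color: #333333; margin-bottom: 1em;">First paragraph...</p>
<p style="font-family: Georgia, serif; color: #333333; margin-bottom: 1em;">Second paragraph...</p>
<p style="font-family: Georgia, serif; color: #333333; margin-bottom: 1em;">And so on. For hundreds of elements.</p>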
The alternative is CSS, or 'Cascading Style Sheets': a body of code that defines the style for elements, and additionally their descendants, presented (to the programmer) in an object oriented fashion. Using this approach, an element's definition needs only a reference to a style definition. The elegance of this method removes all the extraneous inline style code, returning HTML to a purer form. At a deeper level this ability lifts a burden from the HTML parser on several levels, making for better (faster) responsiveness on the browser's part.
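The same styling, expressed once (a sketch; the class name 'copy' is arbitrary):

<style>
  /* defined once; every matching element, and its descendants, inherits it */
  p.copy {
    font-family: Georgia, serif;
    color: #333333;
    margin-bottom: 1em;
  }
</style>

<p class="copy">First paragraph...</p>
<p class="copy">Second paragraph...</p>

Change the rule once and every page that includes the style sheet follows along.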
I Hate Social Media (my problem)...
Just saying!
Avoid External Tools Which Will Eventually Break...
It is usually OK for an amateur, when creating their first website, to use a web 'site' development tool just to get started. After that point it should become a priority to learn how to properly program HTML (and everything it brings with it), or to hire a real website developer to do it for you. Most of these tools generate low quality HTML and reference external methods (gadgets) like page counters, various statistics, and media presentation objects. And the code that most of them generate is not only sub-standard, it is also full of mistakes (programming errors). All of which will eventually break a website.
Validate Your HTML...
I program all my web pages by hand. And I make a lot of simple mistakes. Surprisingly though... I make fewer mistakes than some of the web authoring tools I used to use. It seems that most tools do not really produce valid HTML. They make lots of mistakes. And unfortunately most browsers are very tolerant of them, so they go unnoticed. At many levels. Sadly... Most of these mistakes are trivial and require very little to correct. But how does one find them? That's where a 'validator' comes in. I use the one provided by the 'w3c.org' group, for my HTML as well as my CSS, as a kind of sanity check.
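To give a feel for how trivial these mistakes usually are, here is a contrived fragment with the kinds of errors a validator flags immediately:

<img src="logo.png">                     <!-- missing 'alt' attribute -->
<b><i>Closed in the wrong order.</b></i>
<div id="menu">First menu</div>
<div id="menu">Second menu</div>         <!-- duplicate id -->

A tolerant browser renders all of these without complaint, and each takes seconds to fix once the validator points it out.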
Using Script To Generate HTML...
My websites turn out to be database driven, media heavy, or both. In any case I find it useful to break a page down into sections. Typically...
Watermark. The first few lines. Defines the document type, opens the 'head' directive, and declares any initial meta data that is common to all web pages.
Body. Closes the head and opens the body directive. Defines any last meta data that is common to all web pages.
Footer. Closes the body. Defines navigational and statistical information for and about the page.
Typically I define these as code fragments using script, usually PHP. Additionally I create a body fragment in pure HTML. Using the fragments, and the script's ability to 'include' them, I can build a set of web pages that all have a consistent look and operation. With the addition of CSS to control the presentation, changing the overall look of the site takes very little effort. As an example... The actual web page (script) your browser requested most likely read...
<?php include 'watermark.php';
      // ...some additional meta information local to this page...
      include 'body.php';
      // ...some additional programming local to this page...
      include 'menu.php';
      include 'content.php';
      include 'footer.php'; ?>
With this simple recipe the only thing that stops me from creating endless pages is 'Good Original Content'. Exactly as it should be.
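For completeness, a minimal sketch of what the 'watermark' fragment itself might contain (the details here are hypothetical, not my actual code):

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<link rel="stylesheet" href="site.css">
<!-- meta data common to every page goes here; the 'body' fragment closes the head -->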