Since deciding to follow a more programmatic route and to automate menial tasks, I have found that WordPress is still my favourite for site creation. With this in mind, can useful microniches be created that focus just on the browser and escape the tyranny of strict keyword matching?
Focusing on just the browser experience, the first step of the plan is to create the tools. The tools must be able to gather all the data for all the products or information that belongs within a microniche and store it in a database in such a way that meaningful content can be created from it.
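The gathering step can be sketched roughly as follows. This is only an illustration of the idea: the table layout, column names and the `Example Widget` data are all my own stand-ins, not any real product feed or WordPress schema. A stable product identifier keys each row so that repeated runs refresh existing items rather than duplicate them.

```python
import sqlite3

def init_db(path):
    """Create the local microniche database if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS products (
            product_id TEXT PRIMARY KEY,
            name       TEXT NOT NULL,
            price      REAL,
            blurb      TEXT
        )
    """)
    return conn

def store_products(conn, products):
    # INSERT OR REPLACE keeps exactly one row per product_id across
    # repeated gathering runs, so re-scraping refreshes rather than
    # duplicates the data.
    conn.executemany(
        "INSERT OR REPLACE INTO products "
        "VALUES (:product_id, :name, :price, :blurb)",
        products,
    )
    conn.commit()

if __name__ == "__main__":
    conn = init_db("microniche.db")
    store_products(conn, [
        {"product_id": "w-001", "name": "Example Widget", "price": 9.99,
         "blurb": "A widget for the example microniche."},
    ])
    print(conn.execute("SELECT COUNT(*) FROM products").fetchone()[0])
```

From a database like this, templated post bodies can later be generated per product or per grouping within the niche.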
The next step is to automate pushing the resulting information online. This would be really easy if the result were static pages, but I want to take advantage of the community power behind WordPress and its development. The real advantage of WordPress is the theme: the look and feel of the site is generally abstracted from the information presented in posts and pages, and new additions are made all the time in the form of plugins. I am focused on WordPress 3.0, which at this moment is in its last release candidate stages. This version offers flexibility in the form of custom taxonomies and pages. A lot of this desired functionality actually exists in version 2.9.2 of WordPress, but it has been brought up to a new level in 3.0 that makes these functions immediately usable.
There is a choice of how to make automatic updates. The information can be pushed to the site either by emulating the posting process, which is safer but slower, or by dumping the data straight into the WordPress database, which is dangerous but very fast.
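The safer "emulate posting" route can go through the XML-RPC interface that WordPress exposes at `/xmlrpc.php` (the metaWeblog API). A minimal sketch, assuming a standard install; the site URL, username and password below are placeholders, and `build_post` is just my own helper for shaping the content struct:

```python
import xmlrpc.client

def build_post(title, body, categories=()):
    # The struct shape expected by metaWeblog.newPost: the post body
    # goes in "description", category names in "categories".
    return {
        "title": title,
        "description": body,
        "categories": list(categories),
    }

def publish(site_url, username, password, post, publish_now=True):
    server = xmlrpc.client.ServerProxy(site_url + "/xmlrpc.php")
    # The blog id argument is effectively ignored on a single-blog install.
    return server.metaWeblog.newPost(0, username, password, post, publish_now)

if __name__ == "__main__":
    post = build_post("Example Widget", "<p>Generated copy.</p>", ["widgets"])
    # Placeholder credentials; uncomment against a real site:
    # publish("http://example-microniche.com", "bot", "secret", post)
```

The direct-database alternative would write rows into `wp_posts` itself, which skips all the validation and hooks that make the XML-RPC route safer.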
There is also the need to take care of updating the site. The tools must only update, not replace, the information that exists on the microniche. It will do the site no good if the content gets completely refreshed on every update. It would confuse the browser and the search engines, whose little data-collecting robots could conclude the site has been infected with a virus.
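One way to enforce the update-only rule is to keep a hash of each item's generated content between runs and only push items whose content has actually changed. A minimal sketch; the dict shapes and the idea of persisting hashes between runs are my own illustration, not an existing tool:

```python
import hashlib

def content_hash(text):
    """Stable fingerprint of a piece of generated content."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def plan_update(existing_hashes, generated):
    """Decide what to push on this run.

    existing_hashes: {item_id: hash recorded on the last run}
    generated:       {item_id: freshly generated content}
    Returns (to_create, to_update, unchanged) lists of item ids.
    """
    to_create, to_update, unchanged = [], [], []
    for item_id, text in generated.items():
        if item_id not in existing_hashes:
            to_create.append(item_id)
        elif existing_hashes[item_id] != content_hash(text):
            to_update.append(item_id)
        else:
            unchanged.append(item_id)
    return to_create, to_update, unchanged
```

Anything in the `unchanged` list is simply left alone, so an update run touches only what is new or genuinely different instead of refreshing the whole site.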
In a way I want to escape the tyranny of the search engines for a while and produce only sites that are useful for people. Obviously some very basic SEO needs to be done, but I am going to be very flexible about all the keyword chasing that goes on. The downside to this approach is that the traffic opportunities from focusing on busy keyphrases will be missed, and the sites may not do so well in the search engine results pages. If there is a serious absence of organic search traffic then the sites will have to be fed from the social web, but I hope they contain enough material for search engine robots to chew on without the usual paranoia I have become accustomed to in the search engine world.
Google is an entity that most web sites cannot afford to ignore, and these steps come close to ignoring it, which may be dangerous. For me, however, Google is just another entity that can do harm. I know their original motto was 'don't be evil', but I am seeing its negative effects all over the Internet because it cannot be matched, and I have seen people's lives both made and broken by this machine. This will be a plan that really attempts to ignore its existence.