Protecting Your Search Engine Rankings

Your website’s ranking on search engines is a vital element of your overall marketing campaign, and there are ways to improve your link popularity through legitimate methods. Unfortunately, the Internet is populated by bands of dishonest webmasters seeking to improve their link popularity by faking out search engines.

The good news is that search engines have figured this out, and are now on guard for “spam” pages and sites that have increased their rankings by artificial methods. When a search engine tracks down such a site, that site is demoted in ranking or completely removed from the search engine’s index.

The bad news is that some high quality, completely above-board sites are being mistaken for these web page criminals. Your page may be in danger of being caught up in the “spam” net and tossed from a search engine’s index, even though you have done nothing to deserve such harsh treatment. But there are things you can do – and things you should be sure NOT to do – which will prevent this kind of misperception.

Link popularity is mostly based on the quality of the sites you are linked to. Google pioneered this criterion for assigning website ranking, and virtually all search engines on the Internet now use it. There are legitimate ways to go about increasing your link popularity, but at the same time, you must be scrupulously careful about which sites you choose to link to. Google frequently imposes penalties on sites that have linked to other sites solely for the purpose of artificially boosting their link popularity. It has actually labelled such sites “bad neighbourhoods.”

You can raise a toast to the fact that you cannot be penalized when a bad neighbourhood links to your site; a penalty applies only when you are the one sending out the link to a bad neighbourhood. But you must check, and double-check, all the links that are active on your links page to make sure you haven’t linked to a bad neighbourhood.

The first thing to check is whether or not the pages you have linked to have been penalized. The most direct way to do this is to download the Google toolbar at http://toolbar.google.com. You will then see that most pages are given a “PageRank,” which is represented by a sliding green scale on the Google toolbar.

Do not link to any site that shows no green at all on the scale. This is especially important when the scale is completely gray. It is more than likely that these pages have been penalized. If you are linked to these pages, you may catch their penalty, and like the flu, it may be difficult to recover from the infection.

There is no need to be afraid of linking to sites whose scale shows only a tiny sliver of green. These sites have not been penalized, and their links may grow in value and popularity. However, do make sure that you closely monitor these kinds of links to confirm that they do not sustain a penalty at some point after you have linked to them from your links page.

Another evil trick that illicit webmasters use to artificially boost their link popularity is hidden text. Search engines usually use the words on web pages as a factor in forming their rankings, which means that if the text on your page contains your keywords, you have more of an opportunity to increase your search engine ranking than a page that does not contain text inclusive of keywords.

Some webmasters have gotten around this formula by hiding their keywords in such a way that they are invisible to any visitors to their site. For example, they have used the keywords but made them the same colour as the background colour of the page, such as a plethora of white keywords on a white background. You cannot see these words with the human eye – but the eye of a search engine spider can spot them easily! A spider is the program search engines use to index web pages, and when it sees these invisible words, it goes back and boosts that page’s ranking.

Webmasters may be brilliant and sometimes devious, but search engines have figured these tricks out. As soon as a search engine perceives the use of hidden text – splat! – the page is penalized.

The downside of this is that sometimes the spider is a bit overzealous and will penalize a page by mistake. For example, if the background colour of your page is gray, and you have placed gray text inside a black box, the spider will compare the gray text with the gray page background and assume you are employing hidden text. To avoid any risk of a false penalty, simply direct your webmaster never to assign text the same colour as the background colour of the page!
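If you want to audit your own pages for this kind of accidental clash, a short script can compare text colours against the page background. The sketch below is only illustrative – it assumes old-style HTML colour attributes (bgcolor and <font color>), so a page styled with CSS stylesheets would need a fuller parser.

import re

# Minimal sketch: flag text whose colour matches the page background colour.
# Assumes old-style HTML attributes (bgcolor / <font color=...>); CSS-styled
# pages would need a proper stylesheet-aware parser.
def find_suspect_text(html):
    body = re.search(r'<body[^>]*\bbgcolor=["\']?(#?\w+)', html, re.I)
    if not body:
        return []
    background = body.group(1).lower()
    suspects = []
    for match in re.finditer(r'<font[^>]*\bcolor=["\']?(#?\w+)', html, re.I):
        if match.group(1).lower() == background:
            suspects.append(match.group(1))
    return suspects

page = '<body bgcolor="#FFFFFF"><font color="#FFFFFF">hidden keywords</font></body>'
print(find_suspect_text(page))   # ['#FFFFFF'] – text colour equals background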

Another potential problem that can result in a penalty is called “keyword stuffing.” It is important to have your keywords appear in the text on your page, but sometimes you can go a little overboard in your enthusiasm to please those spiders. A search engine uses what is called “key phrase density” to determine whether a site is trying to artificially boost its ranking. This is the ratio of keywords to the rest of the words on the page. Search engines set a limit on the number of times you can use a keyword before deciding you have overdone it and penalizing your site.

This ratio is quite high, so it is difficult to exceed without sounding as if you are stuttering – unless your keyword is part of your company name. If that is the case, it is easy for keyword density to soar. So, if your keyword is “renters insurance,” be sure you don’t use this phrase in every sentence. Carefully edit the text on your site so that the copy flows naturally and the keyword is not repeated incessantly. A good rule of thumb is that your keyword should never appear in more than half the sentences on the page.
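As a sanity check while editing, you can measure key phrase density yourself. This is a minimal sketch – the search engines do not publish their thresholds, so there is no magic number to aim for, and the rule of thumb above remains the practical guide.

import re

def keyword_density(text, keyword):
    # Ratio of words belonging to the key phrase to total words on the page.
    words = re.findall(r"[a-z']+", text.lower())
    occurrences = text.lower().count(keyword.lower())
    return occurrences * len(keyword.split()) / max(len(words), 1)

copy = ("Renters insurance protects your belongings. "
        "Compare quotes before you buy renters insurance.")
print(f"Density: {keyword_density(copy, 'renters insurance'):.0%}")   # Density: 33%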

The final potential risk factor is known as “cloaking.” To those of you who are diligent Trekkies, this concept should be easy to understand. For the rest of you, cloaking is when the server directs a visitor to one page and a search engine spider to a different page. The page the spider sees is “cloaked” because it is invisible to regular traffic, and deliberately set up to raise the site’s search engine ranking. A cloaked page tries to feed the spider everything it needs to rocket that page’s ranking to the top of the list.

It is natural that search engines have responded to this act of deception with extreme enmity, imposing steep penalties on these sites. The problem on your end is that sometimes pages are cloaked for legitimate reasons, such as preventing the theft of page code, often referred to as “page jacking.” This kind of shielding is unnecessary these days thanks to “off page” elements, such as link popularity, that cannot be stolen.

To be on the safe side, be sure that your webmaster is aware that absolutely no cloaking is acceptable. Make sure the webmaster understands that cloaking of any kind will put your website at great risk.
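One practical way to satisfy yourself that nothing on your server is cloaking, whether deliberately or through a misconfigured script, is to request the same page as a browser and as a crawler and compare what comes back. The sketch below is an illustration only: the URL is a placeholder, the user-agent strings are just examples, and pages with dynamic content (rotating ads, timestamps) will differ legitimately.

import hashlib
import urllib.request

def fetch_digest(url, user_agent):
    # Fetch the page with the given User-Agent and return a hash of the body.
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return hashlib.md5(response.read()).hexdigest()

url = "http://www.example.com/"   # placeholder – use one of your own pages
browser = fetch_digest(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
spider = fetch_digest(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")

# Dynamic pages will differ slightly even without cloaking, so treat a
# mismatch as a prompt to investigate, not proof of wrongdoing.
print("Same content" if browser == spider else "Content differs - investigate")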

Just as you must be diligent in increasing your link popularity and your ranking, you must be equally diligent to avoid being unfairly penalized. So be sure to monitor your site closely and avoid any appearance of artificially boosting your rankings.

The Importance of Referrer Logs

Referrer logging allows web servers and websites to identify where their visitors are coming from, for either promotional or security purposes. You can find out which search engine a visitor used to find your site and whether a customer has come from a ‘linked site’. The referrer is basically the URL of the previous webpage from which your link was followed.

By default, most hosting accounts don’t include referrer logs, but they can often be added for an extra monthly fee. If your web host does not provide a graphic report of your log files, you can still view the referrer logs for your website by logging into the host server using free or low-cost FTP software, like these:
(links have been checked and are OK)

FTP Explorer: http://www.ftpx.com/ (costs around $40 but free trial available)
LogMeIn: https://secure.logmein.com/dmcq/103/support.asp (free trial available)
SmartFTP: http://www.smartftp.com/
FTP Voyager: http://www.ftpvoyager.com/
Filezilla: http://filezilla-project.org/ (the free one I use, which is very good)
Ipswitch: http://www.ipswitch.com/ (a professional-standard FTP solution which I can also recommend)

The log file sits on your web server and can be downloaded to your computer later. You can use a log analysis tool, like those mentioned below, to create a graphic report from your log files so that they are easier to understand.
(links have been checked and are OK)
Abacre Advanced Log Analyzer http://www.abacre.com/ala/
Referrer Soft http://www.softplatz.com/software/referrer/
Log Analyzer http://www.loganalyzer.net/

Even if you don’t have a dedicated tool, you can view the files in Word, WordPerfect, WordPad or any plain-text editor. This information is crucial to your business and marketing plans, so it is not advisable to neglect it.
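If you would rather not install anything, a few lines of script can pull a quick summary straight out of the raw file. The sketch below assumes the log is in the common Apache/NCSA “combined” format and is saved locally as access.log – both are assumptions, so check what your own host actually produces.

import re
from collections import Counter
from urllib.parse import urlparse

referrers = Counter()
with open("access.log") as log:                     # assumed local file name
    for line in log:
        # Combined format has three quoted fields: request, referrer, user-agent.
        quoted = re.findall(r'"([^"]*)"', line)
        if len(quoted) >= 3 and quoted[-2] not in ("", "-"):
            referrers[urlparse(quoted[-2]).netloc] += 1

for host, hits in referrers.most_common(10):        # top ten referring sites
    print(f"{hits:6d}  {host}")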

In addition to identifying the search engine or linked site from where your visitor arrived, referrer logs can also tell you what keywords or keyword phrases your client used for searching.
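The keywords sit in the referrer URL’s query string, so they are easy to pull out once you know which parameter the engine uses. The sketch below covers a few common parameter names (q, p, query) as an assumption – individual engines vary, so extend the list to match what you see in your own logs.

from urllib.parse import urlparse, parse_qs

def search_phrase(referrer):
    # Return the search phrase embedded in a search-engine referrer URL, if any.
    params = parse_qs(urlparse(referrer).query)
    for name in ("q", "p", "query"):                # common, not exhaustive
        if name in params:
            return params[name][0]
    return None

print(search_phrase("http://www.google.com/search?q=renters+insurance"))
# -> renters insurance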

As referrer information can sometimes violate privacy, some browsers allow the user to disable the sending of referrer information. Proxy and firewall software can also filter out referrer information, to avoid leaking the location of private websites. This can result in other problems, as some servers block parts of their site to browsers that don’t send the right referrer information, in an attempt to prevent deep linking or unauthorized use of bandwidth. Some proxy software gives the top-level address of the target site itself as the referrer, which prevents these problems while still not divulging the user’s last visited site.

Since the referrer can easily be spoofed or faked, however, it is of limited use in this regard except on a casual basis.

How Do Search Engines Work – Web Crawlers

It is the search engines that finally bring your website to the notice of prospective customers, so it is worth knowing how these search engines actually work and how they present information to the customer initiating a search.

There are basically two types of search engines. The first type relies on robots called crawlers or spiders.

Search engines use spiders to index websites. When you submit your website pages to a search engine by completing its required submission page, the search engine spider will index your entire site. A ‘spider’ is an automated program run by the search engine system. The spider visits a website, reads the content on the actual site and the site’s meta tags, and also follows the links that the site connects to. The spider then returns all that information to a central depository, where the data is indexed. It will visit each link you have on your website and index those sites as well. Some spiders will only index a certain number of pages on your site, so don’t create a site with 500 pages!

The spider will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the moderators of the search engine.

A spider’s index is almost like a book: it contains a table of contents, the actual content, and the links and references for all the websites it finds during its search, and a spider may index up to a million pages a day.

Examples: Excite, Lycos, AltaVista and Google.
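To make the process above concrete, here is a minimal sketch of a spider: it fetches a page, stores its text for indexing, and queues the links it finds. It is deliberately bare-bones – a real crawler also respects robots.txt, throttles its requests and stores what it finds in a proper database.

import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkAndTextParser(HTMLParser):
    # Collects the href of every <a> tag and the visible text of the page.
    def __init__(self):
        super().__init__()
        self.links, self.text = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def crawl(start_url, max_pages=10):
    queue, seen, pages = [start_url], set(), {}
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url) as response:
                html = response.read().decode("utf-8", "ignore")
        except Exception:
            continue                                 # skip pages that fail to load
        parser = LinkAndTextParser()
        parser.feed(html)
        pages[url] = " ".join(parser.text)           # content handed to the indexer
        queue.extend(urljoin(url, link) for link in parser.links)
    return pages

# pages = crawl("http://www.example.com/")           # placeholder starting URL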

When you ask a search engine to locate information, it is actually searching through the index which it has created and not actually searching the Web. Different search engines produce different rankings because not every search engine uses the same algorithm to search through the indices.

One of the things that a search engine algorithm scans for is the frequency and location of keywords on a web page, but it can also detect artificial keyword stuffing or spamdexing. The algorithms then analyze the way that pages link to other pages on the Web. By checking how pages link to each other, an engine can determine both what a page is about and whether the keywords of the linked pages are similar to the keywords on the original page.
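The “index” the engine searches is essentially an inverted index: a map from each word to the pages containing it, with counts. The sketch below shows the idea with plain term frequency as the score – real engines layer link analysis and many other signals on top, so this is only the skeleton of the process described above.

import re
from collections import defaultdict

def build_index(pages):
    # pages: {url: page text} -> {word: {url: number of occurrences}}
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for word in re.findall(r"[a-z']+", text.lower()):
            index[word][url] += 1
    return index

def search(index, query):
    # Score each page by how often the query words appear on it.
    scores = defaultdict(int)
    for word in query.lower().split():
        for url, count in index.get(word, {}).items():
            scores[url] += count
    return sorted(scores, key=scores.get, reverse=True)

pages = {"a.html": "cheap renters insurance quotes",
         "b.html": "insurance news and quotes for drivers"}
index = build_index(pages)
print(search(index, "renters insurance"))   # ['a.html', 'b.html']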

Three Essential Keyword Research Tools

A keyword is a word or phrase that a web user types into a search engine in order to find information about a topic she is interested in. The search engine takes this word or phrase, searches its indexed catalogue, and returns to the user’s browser those items (websites) that contain relevant content. There are sophisticated algorithms behind the search that eliminate junk, but that’s basically it – a search engine carries out a rapid search through an indexed database. The trick for us as website owners is to get our websites into the presented lists, as high up as possible, for the web user to browse.

There is a lot of evidence from general search usage that most users will only skim through at most one to two pages (twenty presented items) to find the information they need – they satisfice and use the first presented result that will do. The same argument applies to paid click adverts: since users will only browse one or two pages at most, they will only see the top ten ads on the right of the screen on the first page. As the ads are placed based on the keywords the user originally typed in, there is stiff competition to get your ad onto the first presented page (an average rank of less than ten).

So you need to choose keywords that are frequently searched and in high demand – but not so high that they are already being used by countless other websites and competitors, as these will be very expensive to bid for and can use up your budget at a rate of knots! This is actually a difficult task and will take some effort, but fortunately there are a number of keyword research tools that can help you find them.
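One simple way to weigh the trade-off is to score each candidate by demand relative to competition. The sketch below uses searches squared divided by competing pages, similar in spirit to Wordtracker’s KEI figure; every number in it is made up for illustration, so substitute the counts your own research tools give you.

candidates = {
    # keyword: (monthly searches, competing pages) – illustrative figures only
    "computer": (400_000, 90_000_000),
    "computer game": (309_850, 12_000_000),
    "cheap computer game": (8_200, 150_000),
}

def effectiveness(searches, competition):
    # Higher is better: strong demand with relatively little competition.
    return searches ** 2 / max(competition, 1)

ranked = sorted(candidates.items(),
                key=lambda item: effectiveness(*item[1]), reverse=True)
for keyword, (searches, competition) in ranked:
    print(f"{keyword:22s} score = {effectiveness(searches, competition):12.1f}")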

Apart from Wordtracker, which is a tool I use, there are other important research tools such as Overture, Google AdWords and SEO Book, which I briefly review below.

Link to Wordtracker: http://freekeywords.wordtracker.com/

Overture’s keyword suggestion tool (http://inventory.overture.com/d/searchinventory/suggestion/) is free and much quicker to use than Wordtracker – though it seems you now need an account to use it.

Although you have to get an account first – which, as usual with the Yahoo small business website, is a nightmare to work out – it works something like Wordtracker, but without telling you how many websites are targeting each keyword phrase. For example, if you type ‘computer’, the Overture suggestion tool will tell you that during the last month the word ‘computer’ was searched, say, 400,000 times at Overture. Similarly, ‘computer game’ was searched 309,850 times (I made the numbers up for illustration). Also, given one word, it will generate all relevant combinations of that word based on actual searches done by people, just as the other keyword tools do. If the word you keyed in is not a common search term, you will not get any results – it means that very few people have actually searched Overture for that word during the last month.

The Google Keyword Tool generates potential keywords and reports their Google statistics, including search performance and seasonal trends. It is the major tool out there, so it is well worth getting used to.

Features of this tool include:

• Sorting the results of your desired keyword search by popularity, past performance history within the AdWords system, cost, and predicted ad position (based on your bid).
• Easy keyword manipulation where you can select a few keywords here and there or add them all at once.
• Searches for keywords present in any webpage URL specified by your search.
• More keyword results are generated from a regularly updated usage-statistics database, helping you find new keywords or phrases.

The external link to the keyword tool:

https://adwords.google.co.uk/select/KeywordToolExternal

Adding the right keywords to your AdWords campaign is a snap; you just have to have an account already in place – in the new user interface you can adapt and tune your campaign with a surprising degree of sophistication.

Another tool I occasionally use is the ‘SEO Book’ keyword research tool, a very sophisticated resource that, if you can see past the website’s aggressive marketing, is actually very good.

http://tools.seobook.com/keyword-tools/seobook/

Type in a phrase or keyword and it will suggest a large number of related searches across all the main search engines. You can also link off into (say) Google and get the full information on a particular word – overall a great resource.

These software tools are useful for researching how people search the web and then optimizing your own web pages so that more people find your website – they are essential parts of the webmaster’s toolkit, and you need to learn them.
