SEO Quirks You Must Know

They say the devil is in the details. When it comes to search engine optimization, those details hide some important quirks you should know about for:

  • Confirming how the search engines actually behave
  • Keeping up with changes in search engine behavior
  • Playing well with other websites to protect your SEO
  • Avoiding common mistakes that block SEO

Here are nine SEO quirks that come to mind. See how many of them you already know.

  1. Use Hyphens, Not Underscores, In File & Subfolder Names

Bad: newfolder/file_name.htm

Good: newfolder/file-name.htm


Many developers prefer underscores (_) over hyphens (-) for separating words in file names. They are trained into this habit because the hyphen is reserved in many programming languages, e.g. as the subtraction operator.

Google was written by geeks, for geeks, so the search engine is inclined to treat underscores as joiners, i.e. concatenation, because technical terms like FTP_BINARY do occur on search results pages.

If you are building a new website, use hyphens.

If you have a small site without many inbound links, change the current URLs to hyphens and 301-redirect the old URLs to the new ones.

If you have an established site that uses underscores, keep the old URLs and CMS rules, but switch from underscores to hyphens as the naming convention for all new file names.
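As a sketch of such a migration, the snippet below maps legacy underscore paths to their hyphenated equivalents; the helper name and example paths are hypothetical, and your web server or CMS would serve each mapping as a 301 redirect:

```python
# Hypothetical helper: build old-path -> new-path mappings so each
# legacy underscore URL can be 301-redirected to its hyphenated form.
def hyphenate_path(path: str) -> str:
    """Replace underscores with hyphens in a URL path."""
    return path.replace("_", "-")

old_paths = ["/newfolder/file_name.htm", "/blog/my_first_post.htm"]
redirects = {old: hyphenate_path(old) for old in old_paths}

for old, new in redirects.items():
    # Serve as: HTTP/1.1 301 Moved Permanently, Location: <new>
    print(f"{old} -> {new}")
```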

A word of caution: keep the hyphens and words to a sensible number. Short and sweet: I suggest one or two hyphens for category names and topic-level pages. You have more flexibility when creating file names for articles, but don't go to town stuffing keywords.

  2. Avoid Dashes In Domain Names

Choose a domain that identifies your brand or business concisely and professionally, without dashes.

You may want to read up on how to purchase expired domains.

Don't worry about the keywords. Although I don't know of any technical reason why dashes shouldn't be used in domain names, from a practical standpoint they look cheap and compromising. That may raise a red flag when you are short on credentials and links.

One of the main reasons people choose hyphenated domains is to insert keywords. Google updated its algorithms last year to reduce the exact-match-domain advantage. But long before that, the success of countless brand-name domains, some of them bordering on the comical, proved that you don't need a keyword-rich domain to succeed.

  3. In Subfolder & File Names, Use Only Lower-Case Letters, Numbers & Hyphens

Google and Bing are both good at handling difficult, non-standard URLs with dashes or encoded characters. The problem arises when other websites link to your documents. If special characters are not encoded, the content management systems of the sites linking to you may encode them on your behalf.

For example, spaces become %20. If those sites use a different character set than yours, your special characters may not be translated correctly. The safest way to keep things simple is to use only hyphens, the letters a to z, and the digits 0 to 9. According to the technical standards, URLs are case sensitive. Many content management systems handle mixed-case addresses by rewriting them to lower case, but check yours rather than assume.
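The space-to-%20 behavior can be seen with Python's standard library, which percent-encodes the way a standards-compliant CMS would:

```python
from urllib.parse import quote, unquote

# A space in a path segment is percent-encoded as %20.
encoded = quote("my file.htm")
print(encoded)           # my%20file.htm
print(unquote(encoded))  # my file.htm
```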

Also, some analytics and SEO tools are case sensitive and will report different-case versions of the same URL separately. The safest path is to make sure all of your internal links are lower case and to make that the style standard for all coders and copywriters.
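A normalizer enforcing this standard can be sketched in a few lines; the function name below is hypothetical, and it keeps only the lower-case letters, digits, and hyphens recommended above:

```python
import re

# Hypothetical normalizer: reduce a path segment to lower-case
# letters, digits, and hyphens only.
def slugify(segment: str) -> str:
    segment = segment.lower()
    segment = re.sub(r"[^a-z0-9]+", "-", segment)  # anything else -> hyphen
    return segment.strip("-")

print(slugify("My_First Post!"))  # my-first-post
```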

  4. The Great Subfolders vs. Subdomains Debate

Search engines used to treat subdomains rather like separate websites. Today the two are roughly equivalent; in fact, it has been that way for quite a while. That's just as well, because many third-party applications, such as hosted shopping carts, effectively have to live on subdomains.

Search engines are quite capable of telling whether subdomains are related or not. For example, they know that the blogs hosted on Tumblr and Blogspot subdomains are unrelated sites, whereas a single company's subdomains are related.

If you use subdomains, don't orphan them. Make sure the navigation between your subdomains and your main domain is well linked. I have seen applications inside subdomains that link only to the main site's home page, or that use nofollow links.

  5. Be Careful With Parameters

Parameters are variables in URLs. The standard way of creating them is to end the address with a question mark, followed by a list of parameter names and values.


For example, a parameterized address ends with a query string of name=value pairs. Your content management system may rewrite this into an SEO-friendly, user-friendly format: a plain path.

Both forms of URL are fine. The rewritten form is my preference, since it is easier to read and drops unnecessary characters and words.
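As a sketch of the two forms, here is a hypothetical example (example.com and the parameter names are placeholders, not from the original article):

```python
from urllib.parse import urlsplit, parse_qs

# A parameterized address: a query string of name=value pairs.
url = "https://example.com/products?category=widgets&color=blue"
query = parse_qs(urlsplit(url).query)
print(query)  # {'category': ['widgets'], 'color': ['blue']}

# The CMS-rewritten, reader-friendly form of the same address.
path = "/" + "/".join(values[0] for values in query.values())
print(path)   # /widgets/blue
```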

You will certainly want to avoid non-standard or missing delimiters.

I have also seen several similarly silly delimiter schemes.

Watch out for user-identification parameters like uid=1432567, where every visitor gets a unique number, and for tracking parameters like source=abc, where abc differs for each referring document. These create duplicate-content issues.

Your choices include:

  • Get rid of them; use cookies instead.
  • Use the rel="canonical" tag to tell the search engines which URL to index and credit with links.
  • Tell the search engines to ignore the parameters using webmaster tools, e.g. Google's and Bing's.

Another trick is to put parameters that do not affect the page's content after a #. Search engines ignore almost everything after the # character in a URL; the exception is the AJAX hash-bang.
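The split a crawler effectively sees can be reproduced with the standard library's urldefrag, which separates a URL from its fragment:

```python
from urllib.parse import urldefrag

# Everything after "#" is the fragment; crawlers generally ignore it.
url, fragment = urldefrag("https://example.com/page#section-2")
print(url)       # https://example.com/page
print(fragment)  # section-2
```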

  6. Use Flash Or Silverlight To Embed Multimedia Elements, Not For All Content

Though the search engines advertise their ability to crawl Flash and other rich media, they still do a poor job of it. Flash is better used to embed a multimedia element in an HTML page, e.g. a presentation, video, sound file, or animation.

Don't build an all-Flash or all-Silverlight website. All-Flash sites are especially common among photographers, artists, and musicians, which is a shame, because these are exactly the people who could benefit from organic search. Remember, too, that Apple's iDevices do not support Flash, so its popularity is fading anyway.

  7. Pick Only One Per Page: HTTP Or HTTPS

Google does not care whether you use https: or http:.

Though Google welcomes both https: and http: on a URL-by-URL basis, pick one and stick with it. Say you have a shopping cart with an https: checkout. You can be in for trouble if you have a crawler-findable set of pages that resolve under both the https: and http: versions.

I have seen websites where all the offsite links point to http: addresses and Google indexes the http: URLs. Then, unexpectedly, the addresses in Google's index switch to https: for no obvious reason, and the site's rankings disappear.

The easy way to avoid this is to use canonical tags that force whichever version, https: or http:, you want indexed.
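A canonical tag forcing the https: version can be generated with a few lines of Python; the helper name and example URL below are hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical helper: emit a canonical <link> tag that forces the
# https: version of the given page.
def canonical_tag(url: str) -> str:
    parts = urlsplit(url)
    https_url = urlunsplit(("https",) + tuple(parts[1:]))
    return f'<link rel="canonical" href="{https_url}">'

print(canonical_tag("http://example.com/cart"))
# <link rel="canonical" href="https://example.com/cart">
```

The tag goes in the page's &lt;head&gt;; crawlers then credit links to either version to the https: address.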


  8. Make Sure The Markup & Visible Text Match

Search engines call it cloaking when the text in the HTML markup does not match what users see. Cloaking is sometimes unintentional. One example I've seen occurred in a shopping cart where links to every product category and subcategory were included in every page's markup, but visitors saw only the links for the subcategories of the category they were viewing. The content management system hid the other subcategory links with CSS.

I don't want to wade into the white-hat vs. black-hat cloaking argument here, mainly because Google's engineers seem to dislike discussing specific cloaking techniques, possibly because they don't want to give people ideas. The two exceptions, which they do cite for demonstration purposes, are serving different content based on the user agent and using CSS to position text off screen (e.g. with a large negative offset such as -999 pixels).

They are always quick to say there are no good reasons for cloaking and that they have special detection algorithms that ring the red-alert phone. The takeaway for this quirk: avoid accidental cloaking.


  9. Using The Vertical Bar In Title Tags

Search for long winding road, long – winding road, long — winding road, and long | winding road. Notice how Google ignores the dashes but not the vertical bar? The bar separates long from winding, not just visually but in Google's algorithm as well. If your site uses the vertical bar in title tags, experiment by replacing it with a dash and see what happens.

If you already knew all nine of these SEO quirks, good for you. It isn't easy to stay on top of everything in Google and Bing SEO. If you know of other quirks, share them in the comments.

You may also want to have a look at how to find out whether a domain has fake backlinks.