A glossary of SEO terms


Within any professional industry there are in-house terms and phrases that need definition and clarity for the uninformed. Search engine optimization is no different. I’ve written some posts in the past that mention some of these terms without elaborating on them, so this post will hopefully serve as a reference as time goes on for those of you who aren’t familiar with the terminology. It’s very easy to drop these phrases into my writing without realising that some people may not have a clue what I’m on about - so for completeness, here’s my SEO glossary.

Blackhats / Whitehats

Blackhat generally refers to a search engine optimiser who doesn’t play by the rules and is constantly pushing the boundaries of what the major search engines will allow. If it is well known, for example, that links are important to rank well, a blackhat might spam or create automated programs to scatter links around the web. The terms come from hacker culture - black hats being the villains and white hats the good guys, an image borrowed in turn from old Western films. Whitehats, on the other hand, are search engine optimisation professionals who play by the rules and follow things like Google’s guidelines. A whitehat will err on the side of caution and take care not to trip any Google penalties; a blackhat is normally involved with naughty tactics on throwaway domains, and may use the knowledge gained there to bolster his whitehat efforts.

Sitewide
A sitewide link is one which appears throughout your site, for example in the footer or in a menu which exists on every page of your website. A sitewide link will commonly not pass as much link juice as other links.
Nofollow

A link which carries the rel="nofollow" attribute. This was introduced as a measure to combat spam, and is implemented within common platforms such as WordPress to discourage comment spam. A link with the rel="nofollow" attribute doesn’t carry any weight and doesn’t count towards your overall PageRank.
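For illustration, a nofollowed link in plain HTML looks like this (example.com is just a placeholder):

    <a href="http://example.com/" rel="nofollow">some anchor text</a>

Strip the rel="nofollow" attribute and the very same link becomes a normal, weight-passing link.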

Dofollow

A standard link which hasn’t been nofollowed. This type of link will pass PageRank to another website or to another page on the same website.
Link juice
Closely tied to the term PageRank - link juice can be defined as the amount of influence or PageRank being passed between webpages via links.

Pagerank
PageRank is Google’s algorithmic way of deciding a page’s importance. It is determined by the number and importance of the inbound links pointing at the page, and is used (as one of many factors) in determining the results in the SERPs.
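Google’s live algorithm has long since grown beyond it, but the originally published formula (from Brin and Page’s 1998 paper) gives a flavour of how it works:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1...Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, usually set to 0.85. In other words, a page’s PageRank is a share of the PageRank of every page linking to it.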

On Page techniques
Google weighs both on-page and off-page factors when determining the results in the SERPs - on-page techniques refer to changes made to a page’s own code and content to improve its relevance for a particular search phrase or term.

Off Page techniques
Off-page techniques are used to enhance the relevance and authority of a webpage by gaining links from elsewhere. Off-page work may include traditional link building or link baiting.

Traditional link building
Traditional link building refers to the practice of obtaining links from third party websites. This is commonly performed by a human - asking other webmasters for link exchanges, submitting websites to relevant directories, or commenting on blogs which are dofollow.

Link Exchanges
Link exchanges or link swaps are arranged between two webmasters to share traffic and link juice between two websites. They are generally frowned upon, as the two reciprocal links largely cancel each other out and can look spammy. If in doubt don’t exchange links - you may end up in a bad neighbourhood.

Search phrase
A search phrase is the set of keywords someone types into Google to find your website. Some search phrases sit in more competitive niches than others, and are harder to rank for.

Link graph
A link graph represents the network of links that connect sites together - the overall picture of how a site is linked to and links out.

Link bait
Link bait is content written solely for the purpose of attracting new links and a high influx of new traffic, purely on the strength of the content itself. I’ve written a post on link bait over here for further reading.

List bait
List bait is the same as link bait but takes the format of a list, e.g. "10 amazing widgets" or "25 top tips for x, y or z".

Competitive Niche
A hard-to-enter online market which caters for a particular niche and is tough to rank for - for example pharmaceuticals or mortgages. As the price paid per click is high for AdSense adverts, many people chase terms such as mortgages.

Keyword Research
Keyword research is the practice of working out how much potential traffic a keyword gets. See more over here. It may also cover AdSense keyword research (figuring out how much AdSense income a given keyword may bring a website owner).

Keyword Density
Used within on-page techniques, keyword density refers to how frequently a keyword occurs within a web page relative to its total word count. Each search engine favours its own keyword density. To figure it out, take the word count of your webpage (WC), count the occurrences of a particular keyword (OCC) within it, divide OCC by WC and multiply by 100. Want to know what the major search engines favour? Have a peek at some of this keyword research.
(OCC / WC) * 100 = Keyword Density for a phrase
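If you’d rather calculate it programmatically, here is a minimal Python sketch of the same formula (the whitespace tokenisation is deliberately naive and only handles single-word keywords - real tools strip markup and punctuation first):

    def keyword_density(text, keyword):
        # WC: total word count of the page copy
        words = text.lower().split()
        wc = len(words)
        # OCC: occurrences of the keyword
        occ = words.count(keyword.lower())
        return (occ / wc) * 100 if wc else 0.0

    print(keyword_density("SEO tips and more SEO tricks", "seo"))  # 33.33...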

Internal Links
Internal links are links which point (internally) to other pages on the same website. A good internal linking structure is imperative for good SEO results. Linking back and forth with good link text can help Google determine what your website is about, and thus increase the likelihood of you being found for particular terms. Internal links also help to improve the overall number of page views your website receives, as the pages are well linked together.

Outbound Links
Simply put - outbound links are links which point to other websites, and are sometimes known as external links.

Trust Rank
TrustRank is a link analysis technique for identifying spam, based on research conducted (PDF link) at Yahoo! and Stanford University, which many SEOs believe is present somewhere within Google’s ranking algorithm.

Link Text or Anchor Text
Link text or anchor text is the visible, clickable wording of a link - traditionally the underlined words. For example Web Design Ireland - this aids Google in identifying what the linked site is about. SEOs commonly use the link text of a link to increase relevance for keywords or phrases.
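In HTML terms, the anchor text is simply whatever sits between the opening and closing tags of the link (the URL here is a placeholder):

    <a href="http://example.com/">Web Design Ireland</a>

"Web Design Ireland" is the anchor text, and it is that phrase the link helps the target page rank for.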

Spider
A spider, in the context of SEO, is an automated program which is used to collect and/or mine data. In order for Google to find out about your website, it has to spider (or crawl) the web, following link after link until it reaches your site. The more links to your site, and the more frequently you update your content, the more frequently you will get crawled. The information Google picks up from your webpage content is collated in Google’s databases and, once your ranking has been decided, shown in the SERPs. A search engine spider is also sometimes known as a robot.
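To make the idea concrete, here is a toy Python sketch of a crawler - a drastically simplified version of what Googlebot does at massive scale (the regex link extraction is deliberately crude):

    import re
    import urllib.request

    def crawl(url, seen=None, depth=2):
        # Follow links recursively up to a small depth - a toy version
        # of how a spider discovers pages by moving from link to link.
        seen = seen if seen is not None else set()
        if depth == 0 or url in seen:
            return seen
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            return seen  # unreachable pages simply drop out of the crawl
        for link in re.findall(r'href="(https?://[^"]+)"', html):
            crawl(link, seen, depth - 1)
        return seen

    # e.g. crawl("http://example.com/") returns the set of URLs discovered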

Robots.txt
A robots.txt file contains instructions for spiders or robots on which pages they are allowed to crawl. If you wish to keep spiders away from certain parts of your website, and avoid those pages being listed in Google, you need to have a robots.txt file in place. This is a good resource on the usage of a robots.txt file.
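As an example, a minimal robots.txt that asks every robot to stay out of a /private/ directory (a made-up path) while leaving the rest of the site crawlable would look like this:

    User-agent: *
    Disallow: /private/

The file must live at the root of the domain (e.g. example.com/robots.txt), and well-behaved spiders fetch it before crawling anything else.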

Backlink
A backlink is a link obtained from a third party website to a page on your website. The more of these that exist, the more traffic you are likely to receive. In addition, multiple backlinks have the added advantage of boosting your PageRank and increasing your relevance in Google. Backlinks are sometimes referred to as inbound links.

301 Redirect
There are a variety of HTTP status codes that optimisers need to be aware of, but one of the most important is the 301 redirect. A 301 tells search engines that a page has moved permanently - from one part of a site to another, from a subdomain to a main domain, or indeed to a completely different domain. You can check the headers a particular webpage returns yourself, but if you are migrating a whole site, ask an expert.
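If you want to check a page’s headers yourself, one quick way is to request only the headers with curl (the domain and paths here are placeholders). A correctly redirected page answers something like this:

    curl -I http://example.com/old-page

    HTTP/1.1 301 Moved Permanently
    Location: http://example.com/new-page

The 301 status tells the search engine the move is permanent, and the Location header tells it where the page now lives, so the old page’s link juice can be passed across.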

Bad Neighbourhood
Bad neighbourhoods are areas of the web that have been penalised by Google in the past for engaging in dubious linking practices or cloaking. Gaining an inbound link from, or more importantly adding an outbound link to, a bad neighbourhood can hamper your SEO efforts at best, or at worst get you completely banned from Google.

Cloaking
Cloaking refers to the practice of sending human visitors one copy of a page and search engine spiders another. It is commonly performed by detecting the User Agent of the browser or spider and sending alternative content to Google. The perceived advantage is that keyword densities, link structures and other ranking factors can be manipulated without worrying about the readability of the page, or the navigation, for humans. It is a serious faux pas to engage in cloaking of any kind, and it is well known as a blackhat technique.
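Conceptually, the crudest form of cloaking is nothing more than a user agent check. A minimal Python sketch of the idea (purely illustrative - this is exactly the sort of thing that gets sites banned):

    def serve_page(user_agent):
        # Crude cloaking: spiders get keyword-stuffed copy,
        # humans get the readable page. Don't do this.
        if "Googlebot" in user_agent:
            return "<p>widgets widgets cheap widgets best widgets</p>"
        return "<p>Welcome! Have a browse of our widget range.</p>"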

Deep Linking
Deep linking refers to obtaining inbound links to content which is buried (deep) inside your site. Generally the majority of links to a website hit the homepage, so it is better still to attract links to content beyond the homepage, which improves the PageRank distribution across multiple pages. These are known as deep links.

Black Hole
A black hole site is created when a large, tier 1 authoritative site stops providing outbound links - or, if it does provide them, makes them nofollow. Where a citation is needed, another page on the same site is provided instead, and as a result all inbound link juice is retained within the black hole. A great example of this is Wikipedia, which tends to dominate the SERPs for some keywords. A few newspaper sites have started to create black holes to try and retain their link juice.

Rank
Rank describes whereabouts a particular site appears within the SERPs for particular keywords. Sometimes you will hear SEO professionals talking about outranking someone else - this simply means that they have overtaken them in the SERPs.

SERPs
SERP stands for Search Engine Results Page. It is the first page you see after you hit search on any major search engine, and it lists the results for your particular search query.

Google Sandbox
The Google Sandbox is, conceptually, a place where new domains sit for a while after launch with an algorithmically lowered ranking. No one knows for sure whether the sandbox exists, and many dispute its existence. Matt Cutts has stated in an interview, however, that “there are some things in the algorithm that may be perceived as a sandbox that doesn’t apply to all industries”.

Domain Age
Domain age refers to how long a particular domain has been registered. It is thought to be a ranking factor inside the Google algorithm, with older domains thought to be more likely to be relevant than new ones. Again, this is something that is fraught with hearsay.

User Agent
The user agent is the identifier a client application sends when it accesses a web page. For example, if you access a web page using Internet Explorer, the user agent will contain a string that pertains to it - e.g. MSIE 9.0 - while a search engine spider browsing a web page will likely identify itself as a bot of some sort. Google, for example, identifies its search engine spider as Googlebot. User agents are used by web analytics software such as Google Analytics.
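To make that concrete, here are two illustrative user agent strings - one from an Internet Explorer 9 browser and one from Google’s spider (exact strings vary by version and platform):

    Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)
    Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)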

Bait and Switch
The practice of attracting links for a term and then, once ranking has been achieved for it, switching the content on the page. It may be used to create a landing page for a service or product.


