When you read this site or other sites on the web, you will come across many terms and acronyms and might wonder what a key term or acronym actually means. If you are new to the world of search engine optimization, this is one of the first hurdles you will need to clear – understanding the meaning of key terms and concepts before absorbing the message of an article.
For example, in one article you may see the phrase “Anchor text” repeated over and over. If you don’t know what “Anchor text” is, you will have difficulty understanding the entire point of the article and what the content is all about. Hence, it is very important to understand and learn the meaning of common SEO terms.
To fulfill that need, we created this SEO glossary: a comprehensive list of common SEO terms and concepts, each explained in the most layman-friendly terms possible.
301 Moved Permanently
“301 Moved Permanently” is an HTTP status code which indicates that the resource being requested has moved permanently to a new location, the new location being given in the “Location” header of the response. This response is generally sent by a web server to a “client”, which is often the web browser on a person’s computer requesting a page on a given website.
If a page on the website has moved to a new URL and the web server is configured to return a “301 Moved Permanently” response for that page, then the web server will send this response whenever that specific resource is requested. A typical example of a “301 Moved Permanently” exchange is given below.
Client (Web browser)
GET /index.php HTTP/1.1
Server (Web server)
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page.php
301 Redirect
A “301 redirect” is a permanent redirect from one webpage to another. The redirect can point to another resource on the same domain or to a resource on an entirely different domain. A 301 redirect simply assigns a new address to an existing resource.
Browsers and search engines automatically follow a 301 redirect and point to the new resource. The two terms are closely related: “301 Moved Permanently” is the HTTP status code itself, while a “301 redirect” is the practice of using that status code, together with the “Location” header, to send browsers and search engines to the new URL.
302 Found
“302 Found” is an HTTP status code which indicates that the requested resource has moved to a new URL, but only temporarily. The visitor’s browser is redirected to the temporary page for the time being, but search engines are instructed not to replace the original URL in their index, because the original URL is supposed to return in due course.
A “302 Found” status code is typically used when the web server is undergoing maintenance and a certain part of the website has been moved elsewhere for specific reasons, but only temporarily. After the maintenance is complete, the 302 status code is lifted from those pages and things are back to normal.
302 Redirect
A “302 redirect” is a temporary redirect to a new resource, and the directive is given both to web browsers and search engines. What this means is that the web server tells search engines that the redirect is only temporary: while the original URL is not available at this moment, it is most likely to return in the near future. However, the web server gives no directive to search engines on whether to crawl and index the new URL or keep the old URL in the index, thereby giving search engines the freedom to choose which URL to keep in their index and show to users.
In general, when you are sure that the old resource will no longer be available and the content will only be available in the new resource, you should use a 301 redirect. However, if your redirect is only temporary for maintenance or other temporary issues, a 302 redirect makes more sense.
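As a minimal sketch, the raw HTTP response for a temporary redirect looks much like the 301 example above, only with a different status line; the URL shown here is a hypothetical example.

```http
HTTP/1.1 302 Found
Location: https://www.example.com/maintenance.html
```

A browser receiving this response fetches the URL given in the “Location” header, but search engines keep the original URL in their index.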
404 Not Found
“404 Not Found” is an HTTP status code which indicates that the requested resource was not found on the server: the requested resource has either been removed or never existed on the server in the first place. This response is given to both search engines and web browsers and, in general, it results in an error page with no useful content on it. A 404 Not Found is a dead end, and the visitor has to go back and navigate through the website to find content. Please note that there is no redirect associated with a 404 Not Found; it is just an error message which communicates that the requested resource does not exist.
410 Gone
“410 Gone” is an HTTP status code which indicates that the requested resource is no longer available on the server and that the condition is most likely permanent. The difference between 410 Gone and 404 Not Found is that “404 Not Found” does not acknowledge whether the resource ever existed on the server. It simply says that the resource was not found. The status code “410 Gone”, however, acknowledges that the resource did exist on the server at some point but, for some reason, is no longer available and will not be available in the near future either.
The directive is passed both to web browsers and to search engines.
500 Internal Server Error
“500 Internal Server Error” is an HTTP status code which indicates that something has gone wrong with the web server, without giving any details on what the specific problem might be. The details of the specific problem are not shared with search engines or regular visitors, and since the error page is public in nature, the details are not shown there even to the owner of the website, who has to consult the server’s error logs instead.
A 500 Internal Server Error status code can appear on a variety of occasions, such as when a misconfiguration has taken place or a setting has malfunctioned. It is often resolved by performing a thorough check of all the systems running on the web server.
Alt Tag
The term “Alt tag” is common shorthand for the “alt” attribute – the “alternative text” associated with images on a webpage. The Alt tag is used to communicate what an image is all about, since search engines cannot read images and understand their content. The alternative text is also used by screen readers and by text-only browsers that do not support images, including browsers for visually impaired people. An Alt tag has good SEO value since it also conveys information about the content of the page where the image is used.
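As a minimal sketch, the alt attribute is added directly to the image element; the file name and description below are hypothetical examples.

```html
<!-- The alt text describes the image for search engines and screen readers;
     the file name and wording are hypothetical examples -->
<img src="/images/red-running-shoes.jpg"
     alt="A pair of red running shoes for men">
```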
Above the fold
Above the fold refers to the portion of a webpage which is visible without having to scroll. Different devices have different screen sizes and resolutions, hence there is no set number of pixels that defines the above-the-fold section on any given device.
It is sometimes considered a bad practice to place ads, banners, and other “non-content” elements above the fold, since they are a major distraction for the user and do not lead to a good user experience. At the same time, it is worth noting that ads, banners, or elements placed above the fold tend to get more clicks and better conversions, translating into more sales, subscriptions, and other outcomes which a website owner wants from his visitors.
Accelerated Mobile Pages (AMP)
Accelerated Mobile Pages is a project initiated by Google to give website owners, webmasters, developers, designers, and publishers an open standard for making web pages load quickly on mobile devices. Basically, it is a standard for creating a very fast, stripped-down version of a webpage to ensure that its content loads quickly when viewed on a mobile device with a slow internet connection.
AdWords
Google AdWords is an online advertising service developed by Google, where advertisers pay to display brief advertising copy, product listings, and video content within the Google ad network to web users. Google AdWords’ system is based partly on cookies and partly on keywords determined by advertisers. Google uses these characteristics to place advertising copy on pages where it thinks the copy might be relevant. Advertisers pay when users divert their browsing to click on the advertising copy. Partner websites receive a portion of the generated income.
Add-on Domains
An Add-on domain is an “extra” domain which has been added to your web hosting account. Typically, when you create your web hosting account with a web hosting company, you have a main domain associated with your account. However, if you have enough disk space on your web hosting package, most web hosting companies allow you the flexibility to add multiple domains to the same web hosting account, without having to create a separate web hosting account for each domain.
When you add secondary domains to the same web hosting package, the secondary domains are called “Add-on Domains”.
Algorithm
An algorithm is a sequence of formulas or procedures designed to achieve a given objective. An algorithm considers all the use cases, scenarios, and other parameters before arriving at its objective. In general, an algorithm is built to solve a complex, recurring problem.
In SEO, “algorithm” generally refers to search engine algorithms which determine the relevance of a webpage to a visitor’s query. Then there are algorithms which determine the quality of web pages, the authority of domains, and lots of other things on a recurring basis. These algorithms are often updated to include newer parameters and often result in big shake-ups, bringing traffic gains for some websites and traffic losses for others.
Most people are generally afraid of “algorithmic updates” because they can result in extreme swings for businesses that depend on organic traffic from search engines (especially Google).
Anchor Text
Anchor text is the clickable text which is used to hyperlink to an internal or external page from a webpage. Anchor text carries good “SEO value” because it is one way search engines figure out what the target page is all about. Learn how to create SEO-friendly anchor texts.
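As a minimal sketch, the anchor text is the text between the opening and closing link tags; the URL and wording below are hypothetical examples.

```html
<!-- “best running shoes for men” is the anchor text of this link;
     the URL is a hypothetical example -->
<a href="https://www.example.com/running-shoes">best running shoes for men</a>
```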
Authority
Authority generally refers to the “authority of a webpage”, the “authority of a website”, or the “authority of a domain”. Though this is not an official term, it is commonly used within the webmaster community to refer to websites that enjoy a high amount of trust, popularity, and fan following and are considered among the mainstream websites in their niche.
Search engines do segregate websites according to “authority” (Google’s internal measure of this was historically known as “PageRank”) and, in general, websites with higher authority rank higher in search results than websites with low or no authority.
Backlink
Backlink refers to an incoming link to a webpage, either from an internal page on the same website or from a totally different website altogether. Although internal links are technically backlinks, the term usually refers to a link obtained from another website to a page on your website.
Backlinks have very high value in the search engine optimization of a website and, in general, the more high-quality backlinks you acquire for your website, the better your rankings will become over time.
Bad Neighborhood
Also referred to as a “bad neighborhood”, a “bad neighbor” is a website which has been penalized by Google. Typically, a “bad neighbor” refers to a website with content related to gambling, casinos, adult pornography, or any other content which violates Google’s webmaster quality guidelines.
If your website gets a backlink from one of these sites, it is sometimes said that your website just got a “bad neighbor”. There is no need to panic over a few links from bad neighbors, but one should pay attention if that number increases too much for no reason and take preventive measures to ensure that your website does not itself get classified as a bad neighbor.
Baidu
Baidu is a Chinese internet search engine which is most popular in China and for Chinese-language queries. It is the equivalent of Google in China, although Baidu is not owned by Google or its parent company Alphabet Inc.
Bing
Bing is a web search engine owned and operated by Microsoft. With Google as its chief rival, Bing was Microsoft’s attempt to offer some competition in the search market. After Google, Bing is the most popular search engine, though it is used by only a small fraction of users across the globe.
Blackhat
Blackhat generally means “something which is not legitimate or ethical to do”. In this context it refers to “blackhat SEO”, which means violating the directives of search engines to establish search rankings for a website by brute force: churning out shallow content and low-quality pages, building links from tons of spam sites, and trying to game the system.
The opposite of blackhat SEO is whitehat SEO, which refers to legitimate measures to gain search rankings on Google and other search engines. Learn the differences between Blackhat and Whitehat SEO.
Blogger Outreach
Blogger outreach typically means reaching out to bloggers in a specific niche and asking them to write about your product or service on their websites. This is considered a promotional activity to gain more eyeballs and audience for your own website, and a “blogger outreach campaign” is often run to attract a critical mass of customers, followers, or audiences in a small amount of time.
Blogger outreach has become really popular, especially with hi-tech companies that sell gadgets, computers, and smartphones. These companies reach out to popular blogs and often give away some goodies for free to entice people and spread the brand deep into the consumer’s heart.
Boolean Operator
A Boolean operator is a term from computer science which refers to operators that can be combined in a statement. The standard operators are “AND”, “OR”, and “NOT”.
In SEO, “Boolean operator” generally refers to Google search operators which let a user refine his search results to find more specific sites with specific attributes and parameters. For example, adding “OR” between two phrases returns pages matching either phrase, while prefixing a word with “-” excludes it from the results. This is not a mainstream feature but an advanced mechanism to fine-tune the results of a Google search. Learn more about Google’s Boolean operators.
Bot
A “bot” is short for “robot”: an algorithm or software application which is used to run automated tasks and achieve a given result. The tasks performed by a bot are recurring and repetitive and are mostly beyond the scope of human labor. The most common example of a bot is a “search bot”, which routinely crawls and indexes content from all the websites available on the world wide web. Learn more about Google’s search bot – Googlebot.
Bounce Rate
“Bounce rate” is the percentage of visitors to a website who leave after viewing only one page. In other words, the bounce rate measures the people who “bounce” from a website soon after they arrive at it.
For example, if 100 people come to your website and out of those 100, 97 people visit only one page and go away from your website, the bounce rate of your website is 97%.
Bounce rate is considered an important SEO signal and, usually, the lower the bounce rate of a web page, the higher its rankings in search results.
Branded Link
A “branded link” is a short URL built on a domain name which contains the name of the brand in it. Branded links are kept as short as possible and are used as short links to the main pages of the brand’s main website, so that the brand is visible right from the URL.
Breadcrumb
A “breadcrumb” is a type of navigation unit on a webpage which reveals the location of the user within the website and allows him to quickly navigate up to other sections, depending on how the breadcrumb navigation is structured. A breadcrumb simply puts the hierarchy of a website in front of the user and lets the user decide how he wants to navigate the website.
Breadcrumbs have a very high value from the SEO perspective, although not all sites need to use them. If a site is a simple blog with only posts and pages and no definitive hierarchy, there is little use in adding breadcrumbs. However, if the website in question is an eCommerce store or a product listing website with many products, categories, and other choices, it helps to add breadcrumbs so that the user can easily navigate the website and arrive at the point which interests him.
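As a minimal sketch, a breadcrumb trail can be marked up as a series of links from the top of the hierarchy down to the current page; the page names and URLs below are hypothetical examples.

```html
<!-- A breadcrumb trail: Home > Shoes > Men’s Shoes;
     page names and URLs are hypothetical examples -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/shoes/">Shoes</a> &gt;
  <span>Men’s Shoes</span>
</nav>
```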
Broad Match
“Broad match” refers to the common theme behind the searches a user performs on search engines such as Google, Yahoo, Bing, and others. It is used by advertisers to serve ads to people who perform searches around a theme of keywords.
For example, the keywords – “shoes for men”, “best summer footwear for men”, “casual shoes for men” all form a “Broad match” keyword set for “Men’s shoes”. Broad match keywords contain all possible keyword searches for a phrase, related search terms, related sentences or phrases, all possible keyword combinations, similar words, plurals, and misspellings.
CTR (Click Through Rate)
CTR is an acronym which stands for “Click-through rate”. It refers to the rate of clicks received on a hyperlink or an advertisement which contains a link. For example, if your advertisement is shown to 100 people and only 13 people click the advertisement, the click-through rate or CTR is 13%.
Click-through rate is measured to observe the conversion rate of an advertisement, hyperlink, or marketing campaign. It is generally a measure of success or failure for a campaign and its objectives.
Canonical Tag
A canonical tag, or canonical meta tag, is a directive which points search engines to the original version of the content that a webpage contains. If two or more web pages on the same domain contain exactly the same content for some specific reason, the canonical tag is the standard way to tell search engines which of those pages is the main page that should be shown in search engine result pages.
A canonical tag is very useful for resolving duplicate content issues and for avoiding search engine penalties for duplicate content. Please note that canonical tags can also point across multiple domains.
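As a minimal sketch, the canonical tag is a link element placed in the head of the duplicate page, pointing at the preferred URL; the URL below is a hypothetical example.

```html
<!-- Placed inside the <head> of a duplicate page; the href should point to
     the original (canonical) page. The URL is a hypothetical example. -->
<link rel="canonical" href="https://www.example.com/original-page/">
```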
Cloaking
Cloaking refers to the practice of serving one version of content to search engines and a different version to human readers, solely for the purpose of ranking in search engines for specific keywords. If your webpage is about “shoes” and you are trying to rank for the term “best wallets for men”, you may be tempted to create a doorway page or some other mechanism that presents a page stuffed with keywords related to “best wallets for men” only to search engines. The pages your human visitors see do not contain any keywords related to “best wallets for men”, because your website is all about shoes; however, when a search bot crawls your website, it finds those specific words because you have been practicing cloaking.
Cloaking is a serious offense and is a violation of Google’s search quality guidelines. Practicing Cloaking will get your website de-listed from Google search. Learn more about Cloaking.
Content Delivery Network (CDN)
A content delivery network is a system of distributed web servers which helps accelerate the delivery of web pages depending on the location of the visitor who is accessing the website. In general, a content delivery network is a set of data centers geographically spread across the world; depending on the location of the visitor’s computer, his IP address, and his internet service provider, the CDN decides which data center should serve the data.
If your website uses a content delivery network which has data centers across all the major countries, it will greatly improve the page loading time of your website. A user in Brazil will get the page fetched from a data center in Brazil while a user in Japan will get the page fetched from a data center in Japan or Korea or Russia.
Large websites use content delivery networks to provide a better user experience to their users across the globe.
Content Marketing
Content marketing is a marketing approach in which the business creates high-quality content which is exclusive, useful, solves a problem, or helps in the awareness of a product, service, or brand. Content marketing usually means creating high-quality content, attracting “organic traffic” from search engines, and converting that organic traffic into loyal users, sales, or customers.
If you are serious about the success of your website and your online brand, you just cannot afford to ignore content marketing. In fact, it is the most cost-effective way to attract customers naturally to your website. Content marketing is one of the fundamental pillars of SEO (Search Engine Optimization).
Conversion Rate Optimisation (CRO)
Conversion rate optimization is a procedure to improve the conversions around a business process. It could mean improving signups on a website, improving signups for an email newsletter, improving the number of clicks made on a specific button or improving page views of a particular page on the website.
Basically, CRO means optimizing whatever the website owner defines as a “conversion” and improving that procedure as far as possible to yield maximum business output. Conversion rate optimization is at the core of any business process and is not limited to “internet assets”; it applies to “real world” assets as well.
Crawl Errors
Crawl errors are the errors which a search bot encounters while trying to “crawl”, or discover, the pages of your website, blog, or domain. To see the crawl errors, one needs to verify his website in Google Search Console, after which Google will tell the website owner what difficulties it is facing in finding the content and pages of the website. The website owner is then supposed to fix those errors so that the Google search bot can easily navigate from one page to another and discover more and more content without any hindrance.
Crawler
A “crawler” is a computer program, software application, or script which finds websites on its own and discovers all the content on each website. The crawler then passes that information back to the search engine, which stores it in an “index” so that it can fetch results from that index when people search for terms.
A Crawler, also known as a search spider is often a specialized bot whose job is to discover as much content as possible from a website on a recurring basis.
“Crawling” is the process followed by a search engine bot to discover content on a website or domain.
De-indexing
“De-indexing” is the process of removing specific web pages from the “index” which a search engine maintains to serve search results to its users. A search engine uses a “crawler” to find as much content as possible from different websites. Over time, the index of a search engine becomes really big, so the search engine figures out a way to store only the good results and ignore those which are irrelevant and not useful.
In short, de-indexing is the removal of specific web pages from the index of search engines. Just as they love accumulating content, search engines also love to clean up things which are not useful.
Deep Links
Deep links are hyperlinks which point to internal pages of your website from the home page and other “main” pages. They create a natural flow of links from the most popular sections of your website to the least popular ones, thereby allowing people and search engines to discover great content which may otherwise never be found.
It is good practice to deep link your content as much as possible to surface relevant and useful information that people and search engines would otherwise miss. However, overdoing deep linking is bad and results in a very poor user experience; for example, linking each and every word in an article to internal pages of your website may be classified as “spamming”.
Dofollow Link
A “dofollow link” is a normal hyperlink which does not carry the “nofollow” attribute. You can use the rel=”nofollow” attribute in a link to tell search engines not to pass any value to the outgoing link or webpage. If you do not use the rel=”nofollow” attribute in a link, then that link is a dofollow link.
Dofollow links can be internal links, external links, or both. Note that it is good practice to keep most links dofollow and to mark sponsored links, advertisements, and affiliate links as “nofollow”.
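As a minimal sketch, the only difference between the two kinds of link is the rel attribute; the URLs below are hypothetical examples.

```html
<!-- A normal (dofollow) link: SEO value flows to the target page -->
<a href="https://www.example.com/guide">A dofollow link</a>

<!-- A sponsored link marked nofollow: search engines are told not to pass
     value to the target. Both URLs are hypothetical examples. -->
<a href="https://www.example.com/ad" rel="nofollow">A nofollow link</a>
```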
Domain Age
The age of a domain name is referred to as domain age. If you registered your domain 10 years ago, the domain age is 10 years. Note that domain age is not the same as website age: if you registered your domain 10 years ago but built the website only yesterday, the domain age would still be 10 years while the website age would be only 1 day.
In general, domain age has no effect on SEO, while website age has a considerable effect on the SEO of a website. Domain age used to be a ranking factor in the early days, but not anymore, since search engines have become smart enough to figure out the patterns spammers use to game them for rankings.
Doorway Page
A “doorway page” is a common black hat technique in which website owners create pages stuffed with specific keywords in order to obtain search rankings. Once the rankings are obtained, the doorway page usually redirects to another page on the website. In short, the objective of a doorway page is to catch a specific type of traffic and channel that traffic to another page on the same website or, if needed, to another website.
For example, if you have a page on your website – men-shoes.html, and you are not getting enough traffic to this page, you can create a similar doorway page on the same website with the name – boys-shoes.html. Next, you can build tons of links towards this page to ensure it ranks higher on search engines. Once that is done, you can do a 301 redirect to the men-shoes.html page and your objective of driving traffic to the original page through the doorway page has been achieved.
A doorway page is considered a violation of Google webmaster quality guidelines and is best avoided. Building too many doorway pages may cause your website to be de-listed from Google search altogether. Learn more about Doorway pages here.
Duplicate Content
Duplicate content is content which is accessible from more than one URL, either on the same domain or across domains owned by the same or different individuals. If exactly the same or very similar content is available on multiple web pages, then one of those pages is considered the “original” or “main source”, while everything else is considered “duplicate content” or “plagiarized content”.
Duplicate content is considered a serious offense and having duplicate content on your website is considered a violation of Google Search quality guidelines.
Dynamic URL
A URL which contains query parameters, or which results from a dynamic query against a database, is often referred to as a “dynamic URL” – for example, www.example.com/products?category=shoes&color=red. A static page on a website is not supposed to change its URL, but dynamic pages do not really exist as files on the website; they are generated on the fly, depending on the query the user makes against the website’s database. Typically, search result pages, comparison pages, and product catalogs on e-commerce sites have dynamic URLs.
As far as practicable, it is considered good practice to use static URLs and avoid dynamic ones, since static URLs are considered more friendly from the search engine perspective.
Exact Match
Exact match can refer to an “exact match domain” or an “exact match keyword”. An exact match domain is a domain name which exactly matches the keyword for which the website wants to rank in search engines.
So if you have a website about shoes and you want to rank for the term “men’s shoes”, you might buy the domain name www.menshoes.com, thinking that it is easier to rank for particular keywords if you have the exact keyword in the domain name. That used to be the case in the early days of the internet, but it no longer is: search engines give no preference to exact match domains for keyword rankings (after the EMD update by Google).
An “exact match keyword” is often a long phrase which exactly matches a word or phrase. So, for example, “Summer shoe collection for men” is an exact match keyword which is used in the copy of a webpage just to ensure the webpage ranks for that exact phrase.
External Link
An external link is a hyperlink on a website which links to a page outside the website. An external link does not necessarily point to another webpage; it can point to an image, a social profile, a PDF document, a Word document, a search query, or even an email address.
One has to be careful with external links and link only to legitimate sites that do not post spam content and are considered authority websites. Careless external linking can severely harm the SEO profile of a website.
Fetch as Google
“Fetch as Google” is a tool provided by Google search quality team through Google webmaster console which allows you to see how Google crawls different pages on your site and what is rendered once Googlebot starts crawling different pages of your site. This is a very useful way to figure out if a particular page is accessible by search engines or not and how they render it at their end.
You can do this to ensure that the content of the page is rendered exactly the same way as it is rendered at your end, thereby ending any confusion on how search engines see your content. Learn more about Fetch as Google in Google Webmaster help.
Flash
Adobe Flash is a software platform which is used to create animated content such as images, videos, presentations, and other rich media. Browsers that want to display Flash content must have the Adobe Flash plugin installed. In the initial days of the internet, YouTube videos were encoded in Flash, but YouTube later moved to HTML5.
Search engines cannot read Flash content, just as they cannot read images and videos, and there is no added advantage to using Flash over standard video formats. As far as practicable, one should avoid Flash content, because not every browser has the Flash plugin installed; it is usually better to use universally supported rich media content types instead. Flash content also increases the loading time of web pages, which often makes for a poor user experience.
iFrame
An iFrame (inline frame) is an HTML document embedded inside another HTML document on a website. The iframe HTML element is often used to insert content from another source – such as an advertisement, a widget, or an infographic – into a web page.
Using an iFrame will not cause duplicate content issues, because search engines know that the content within the iFrame comes from another source and that you are not simply copy-pasting the content into your own website. However, iFrames are often discouraged for design reasons, because they tend to render poorly on devices with smaller screens. As far as possible, stay away from iFrames and use them only when you have no other way of embedding some content in your webpage.
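As a minimal sketch, an iFrame embeds a second document at a fixed spot in the page; the URL and dimensions below are hypothetical examples.

```html
<!-- Embeds a document from another source inside the current page;
     the URL and dimensions are hypothetical examples -->
<iframe src="https://www.example.com/widget.html"
        width="300" height="250" title="Example widget"></iframe>
```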
Gateway Page
A gateway page is another name for a “doorway page”, which has been explained above: a common black hat technique in which website owners create pages with specific keywords in them to obtain search rankings. Once the rankings are obtained, the gateway page usually redirects to another page on the same website or, if needed, to another website, channeling the specific type of traffic it catches to wherever the owner wants it.
A gateway page is considered a violation of Google webmaster quality guidelines and is best avoided.
Geo-targeting
Geo-targeting refers to the practice of delivering different content or advertisements to a website user based on his or her geographic location. Geo-targeting can be used, for example, to target local customers through paid search campaigns.
Geo-targeting can be applied in various ways. You can use it to serve different versions of the same page to users in different parts of the globe: users from Spain see the Spanish version while users from Japan see the Japanese version of the same content. You can also use geo-targeting to ensure an advertisement is shown only to users in a particular country, or to serve different ads to users in different countries – it all depends on your business requirement and what you want to achieve.
Google is a search engine founded by Stanford students Larry Page and Sergey Brin. As of now, Google is the most popular search engine, used by millions of people across the globe, and it drives most of the traffic to different websites through its search results.
Google Analytics is a free website analytics solution which helps website owners track and measure key metrics of their websites or online properties. As of now, it is the most popular website analytics solution available, and also the most comprehensive and detailed one for tracking the performance and metrics of a website at no cost whatsoever.
Google News Sitemap
A Google News sitemap is a special sitemap file on your website which tells Google about your latest content and helps your website rank in the “News” section of Google.com. It is very similar to a normal XML sitemap; the only difference is that it is read by the Google News crawler rather than the regular Googlebot crawler.
Google News sitemaps are typically used by large websites which post a lot of content on a daily basis; smaller sites which update once or twice a month do not actually need one.
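A minimal sketch of a Google News sitemap is shown below; the URL, publication name, date and headline are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/breaking-story.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2018-05-01T09:00:00+00:00</news:publication_date>
      <news:title>Breaking Story Headline</news:title>
    </news:news>
  </url>
</urlset>
```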
Google Webmaster Guidelines
Google Webmaster Guidelines are a set of directives and best practices shared by Google's search quality and webspam teams. They lay out the principles and best practices which website owners should follow to create a high-quality website and thus rank higher in search results for key phrases and terms. Read more about Google webmaster guidelines here.
Googlebot is Google’s web crawling bot (sometimes also called a “spider”). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. It is nothing but a “software application” which goes to each website it finds through links on the web, crawls the content and then stores the content in Google’s index.
“Grey hat” refers to practices which are neither clearly legitimate nor clearly illegitimate – practices which have not been explicitly defined yet, so nobody knows for certain whether they are right or wrong. While a lot is known about black hat SEO and white hat SEO, very little is known about grey hat SEO, and techniques which were grey hat at some point in time may not remain so: with each algorithmic update, grey hat concepts get absorbed into either white hat or black hat SEO.
Growth hacking is a process of rapid experimentation across marketing channels and product development to identify the most efficient ways to grow a business. It involves finding the right market at the right stage of the business, figuring out what kind of consumers the business should go after, working out the costs and doing whatever it takes to run and accelerate the business while maintaining a lean setup.
Simply put, Growth hacking is a concept of conducting marketing experiments to help accelerate the business in the right direction.
An HTML sitemap is a simple HTML page placed at the root of the website which contains links to all the pages of the website, neatly organized by sections, hierarchies, and other differentiators. It is like an open index which human visitors can use to navigate all the content available on a website.
Heading tags are HTML tags which are used to create headings in a webpage. The most common heading tags are <h1>, <h2>, <h3> and <h4>. The most important of these is the <h1> tag, which contains the title of the page or the “Main heading”.
Heading tags do carry good SEO value and one should learn how to create SEO optimized heading tags that are attractive and compelling enough to human visitors as well.
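A typical heading hierarchy might look like this (the heading texts are placeholders):

```html
<h1>Main heading: the title of the page</h1>
<h2>A section heading</h2>
<h3>A sub-section heading within that section</h3>
```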
Homepage often refers to the “Front page” or the “Main page” of a website. It is usually placed at the root of a website but some websites have the homepage placed in a subfolder. For example, this website’s homepage is at www.cultofweb.com/blog/ and not www.cultofweb.com
The hreflang tag is a directive which tells the Google search engine what language a given webpage uses and which language-speaking audience the page is intended for. For example, you may have a page on your website which has three versions: one for people who speak German, one for people who speak French and one for people who speak English.
You can use the hreflang tag to tell search engines which version of the page is to be served to people who speak French, German and English respectively. You can also tell search engines the location of these users, and you can combine both language and location to fine-tune which version of the page is shown to users speaking different languages across the globe. Learn more on hreflang for language and regional URLs.
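For example, the alternate language versions could be declared in the <head> section like this (the URLs are placeholders):

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page.html" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page.html" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page.html" />
<!-- Fallback for users whose language is not listed above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page.html" />
```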
Hummingbird is the codename given to a significant algorithm change in Google Search in 2013. Unlike previous search algorithms, which would focus on each individual word in the search query, “Hummingbird” considers the context of the different words together, with the goal that pages matching the meaning do better, rather than pages matching just a few words.
A hyperlink is a text, graphic, image or document object which links to another webpage. The link can lead to a page on the same website or a page on a different website. A link or “Hyperlink” is generally a clickable text on a webpage, clicking which takes the user to the target page.
Similar to an XML sitemap for content pages, an image sitemap helps Googlebot discover images more quickly and in a better way. It is simply an XML file containing all the URL of images that are used in your website. You can use Google image extensions for Sitemaps to give Google more information about the images available on your pages. Image sitemap information helps Google discover images that they might not otherwise find.
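A minimal image sitemap sketch, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/sample-page.html</loc>
    <!-- One image:image entry per image used on the page -->
    <image:image>
      <image:loc>https://www.example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```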
An Inbound link is a hyperlink from another website to your own website. It is also sometimes referred to as a “Backlink” or an “Incoming link”.
An index could refer to different things. The most common usage of “index” in the SEO world is “Google’s search index”, which is a huge database of URLs that Googlebot has crawled and “indexed”. It contains billions of URLs which Google has found by crawling the web.
Index could also refer to a “list”, i.e. an index of pages on a website could simply mean a list of pages on that website. Index could also refer to the homepage or the “index” page of a website.
Indexing is the procedure used by search engines to find content on different web pages and save references to it in their own database, so that when a user types words into a search engine, the system can tally those words against its index, find the best possible match and show it to the user in search result pages.
Internal links are hyperlinks to other internal pages of a website. If you link to an existing page on your website from another page on your website, that link is called an internal link. On the contrary, if you link to a page outside your website, that link will be an external link.
In the context of Search engine optimization, a keyword is defined as a single word or a group of words which describe “The intent of search”. It clearly explains what the person might be looking for.
For example, if someone searches for “Gifts for Christmas”, this is a keyword phrase which clearly explains the intent of the search. The person could be looking to buy gifts for Christmas, browsing gift choices before deciding, or maybe doing some research or looking for ideas.
A keyword is the central building block of search engine optimization and everything derives from “keywords and their usage”.
Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page. In the context of search engine optimization, keyword density can be used to determine whether a web page is relevant to a specified keyword or keyword phrase.
Keyword density plays a major role in the SEO of a website or webpage but it is a double-edged sword. If you overdo it, it may do more harm than good.
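A quick worked example with hypothetical numbers:

```text
Keyword density = (keyword occurrences / total words) × 100
Example: a keyword appearing 6 times in a 300-word page
         → (6 / 300) × 100 = 2% keyword density
```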
Keyword Planner (earlier known as the Google AdWords keyword tool) is a free tool provided by the Google AdWords team to advertisers who want to show advertisements on Google search and Google’s display network. It is like a palette where advertisers can search for keywords, see the amount required to purchase an advertisement, figure out how much competition there is in the space and do lots of other interesting things related to search engine marketing. Learn more about the Google AdWords keyword planner.
Keyword research is a practice which a website owner or a professional uses to find out popular words and phrases which people type in search engines around a particular topic. Keyword research is used to figure out what people are searching for around a given topic. Once keyword research is done, the website owner or the blogger figures out what type of content needs to be created to get more traffic, customers, leads or sales of products and services.
“Keyword stuffing” refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site’s ranking in Google search results. Often these keywords appear in a list or group, or out of context (not as natural prose). Filling pages with keywords or numbers results in a negative user experience and can harm your site’s ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.
Also known as Google Knowledge Graph, The Knowledge Graph is a knowledge base used by Google to enhance its search engine’s results with information gathered from a variety of sources. This information is presented to users in a box to the right of search results. Knowledge Graph boxes were added to Google’s search engine in May 2012.
A Landing page is a page on a website which acts as an entry point for a new visitor or “potential customer” or “sales lead”. By definition, any page on a website can be a landing page because every page is a potential entry point. However, some pages are built with specific focus and objectives to acquire a specific type of user, and that is what separates landing pages from other generic web pages. A Landing page has a focus and purpose – to acquire users of a specific segment or type.
A “linkbait” is a specific article or page on a website which attracts links and citations from other sites naturally, without you having to ask them to link to it. Most people will not link to your website unless the resource in question is very exclusive in nature. Hence, a linkbait article is carefully crafted and deployed to attract links from other sites and improve the search engine optimization of a website.
Link building is the process of building backlinks to your website (both the home page and internal pages). Link building is considered a good practice for search engine optimization as it helps in improving the authority of your website and increases the rankings in search results, which results in more traffic and hence more business.
Link building has to be done carefully and objectively, by adhering to the Google webmaster quality guidelines. Failure to abide by the quality guidelines may lead to adverse results.
A Link exchange is a “deal” between two website owners wherein one website links to the other and vice versa. It is very hard to attract links naturally towards your website. Hence, some website owners participate in link exchange programs in order to gain links by linking to each other’s websites. There are portals and forums where website owners discuss and share links to each other’s websites. Please note that link exchange is purely a blackhat technique and should be avoided if you want to stay compliant with Google webmaster quality guidelines.
A Link farm is a website which creates a set of pages on one section of the website, solely for the purpose of linking to a key page or section of the same website or another website. This is done in order to improve search rankings for a specific landing page on the same website or on some other website. A typical link farm website generates hundreds of thousands of pages with autogenerated content and links to a specific page or section over and over again with a variety of anchor texts to achieve search rankings. This is a blackhat technique and is best avoided if you want to adhere to Google’s webmaster quality guidelines.
Link spam is the most common way website owners try to create backlinks towards their websites. This includes posting irrelevant links in portals, forums, blog comments, discussion boards and every possible place to ensure the backlink helps in search engine optimization of the website in question. Link spam is not a good way to improve the SEO of a website and in some extreme cases, the website may get penalized in Google Search.
Local SEO is the procedure to optimize your website to show up for local searches. If you have a business which has some local presence and you want your business to get discovered in local searches such as Google Maps, Google Local search and other places with an emphasis on local listings, then you should optimize your website for Local SEO.
Local search refers to the concept of searching for content, businesses, and listings confined to a particular location. For example, the search term “restaurants” is not a local search keyword, but someone who is searching for “best restaurants in New York” is probably looking for a restaurant to dine out in New York. He is only interested in content and businesses that are in New York, hence this search term is clearly a “local search term”.
Google understands which keyword is a local search term and which one is not and depending on the type of query, local search results are sometimes shown in Google Search, Google Maps and other places where the search is performed.
Long tail keywords
A Long tail keyword is a long search term which contains at least 3 words. For example, the keyword – “barbershop” is not a long tail keyword but the phrase “Barbershops in New Jersey with Spa” is a long tail keyword.
Long tail keywords are a good indicator of what people are looking for on your website. Your website analytics logs will show you a lot of data with respect to long tail keywords, and it is considered a goldmine as far as optimizing your website for the long tail of search is concerned.
Long tail keywords also help you find new content ideas and create more useful and engaging content on your website.
Meta description is a meta tag included in the <head> section of a webpage. It is basically a text string of about 165 characters which explains, in brief, the content of a webpage. Sometimes, Google may choose to show a different meta description than the one specified by the owner of a webpage. This happens when Google thinks another snippet of content is more relevant to the query of the user.
Recently, for selected search results, the length of the meta description has been increased to include up to 300 characters. Meta descriptions are important elements of SEO: good meta descriptions make it a little easier for websites to rank in search result pages and, at the same time, attract more click-throughs from users. Learn how to write a good meta description.
Similar to the meta description, the meta keywords tag is a meta tag included in the <head> section of a webpage. It is basically a list of comma-separated words which signals to search engines what the most important words on a given page are. Meta keywords carry relatively little value compared to the meta title and meta description, and Google gives little to no importance to meta keywords when evaluating the relevance of a webpage to the user’s query.
Meta tags are snippets of text that describe a page’s content and other attributes of the page but do not contain the content of the page itself. Also, meta tags are only meant for search engines, bots, and spiders, not for the user browsing the page. Meta tags don’t appear in the body of the page and are placed in the source code within the <head> section.
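A sketch of common meta tags inside the <head> section (all values are placeholders):

```html
<head>
  <title>Example Page Title</title>
  <!-- Shown as the snippet in search result pages -->
  <meta name="description" content="A brief summary of what this page is about." />
  <!-- Largely ignored by Google, but still valid markup -->
  <meta name="keywords" content="example, sample, placeholder" />
</head>
```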
Mobile first indexing
Mobile first indexing is a concept introduced by Google which puts greater emphasis on pages that are mobile friendly. Until 2016, Google’s index of web pages consisted of pages that were rendered from the desktop point of view. Post-2016, Googlebot has become more intelligent and now prefers to render pages from a mobile device point of view. Henceforth, preference is given to websites that are mobile friendly and whose content can be read even on a mobile device with a small screen.
Mobile usability report
The Mobile Usability report is a report generated by Google Search Console which shows you the issues and errors Google has found while trying to render your website’s pages on a mobile device. It focuses on the problems you need to fix to ensure your website loads correctly on all possible mobile devices used for accessing your content.
Mod_rewrite is a way to rewrite URLs at the server level, serving the output to the user without any intervention from the browser. It is a technical feature which comes with your website’s web hosting package. One has to use mod_rewrite carefully on the web server so as not to mess up the URL structure of the website and its internal pages. Mod_rewrite is not to be used if you don’t know what you are doing.
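As an illustration, a minimal mod_rewrite sketch for an Apache .htaccess file; the file and parameter names are hypothetical:

```apache
# Enable the rewrite engine (Apache, inside .htaccess)
RewriteEngine On

# Serve a pretty URL from an underlying dynamic script,
# e.g. /products/shoes is handled by product.php?category=shoes
RewriteRule ^products/([a-z-]+)/?$ product.php?category=$1 [L,QSA]
```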
Navigation refers to that design element or design unit of a website which lets the visitor navigate to other parts of a website. In general, the navigation consists of a menu with links to important pages and sections of the website which allows the user to hop from one page to another page without having to look for individual pages all the time.
In general, the navigation menu is placed at the top of a website in the above-the-fold section.
Negative SEO refers to the practice of using black hat and unethical techniques to sabotage a competitor’s rankings in search engines. It is typically used by spammers and lazy people who do not want to put the effort into building a good and useful website themselves, but instead want to harm other people’s websites by doing all the wrong things on their behalf.
For example, if I am not able to build good links towards my website and improve its rankings, I might build bad links towards my competitor’s website to get it de-listed from Google search. Negative SEO rarely works and there is no need to worry about it, since there is little you can do to stop it. All you should be focusing on is working on your own website and improving it day by day with useful content people love to read.
The noarchive meta tag is a directive to search engines not to store a cache of the webpage in their index or server. When a webpage is crawled by a search bot or crawler, a cached copy of the page is stored in its index for faster retrieval. If you are not happy about this and want to stop it, you can use the noarchive meta tag in the source code of your website’s pages.
The noindex meta tag is a directive to search engines not to store the content of the webpage in their index. A webpage with the noindex meta tag is most likely not to be seen in search result pages because the website owner has explicitly told search engines not to index the content of the page at all.
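Both directives are placed in the <head> section via the robots meta tag; for example:

```html
<!-- Keep this page out of the search index and prevent cached copies -->
<meta name="robots" content="noindex, noarchive" />
```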
Rel="noreferrer" tells the browser not to send referrer information to the server when requesting a web page. Browser support was limited at the time of writing, and it has no effect on SEO or link value. More information about the noreferrer attribute.
Open Graph Protocol
The Open Graph protocol enables any web page to become a rich object in a social graph. While many different technologies and schemas exist and could be combined together, there isn’t a single technology which provides enough information to richly represent any web page within the social graph. The Open Graph protocol builds on these existing technologies and gives developers one thing to implement. Developer simplicity is a key goal of the Open Graph protocol.
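A minimal set of Open Graph tags might look like this (the values are placeholders):

```html
<!-- The four basic Open Graph properties, placed in the <head> section -->
<meta property="og:title" content="Example Article Title" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://www.example.com/article.html" />
<meta property="og:image" content="https://www.example.com/images/cover.jpg" />
```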
Organic traffic is the traffic a website receives from search engines through natural searches made by users across the globe. Organic traffic is “free”, which means website owners do not have to pay for it and can get visitors and customers at no cost. By contrast, direct traffic is the portion of traffic which comes to a website directly rather than from a search engine.
Referral traffic is the traffic a website receives from other referring sites, and paid traffic is the traffic a website receives from sponsored advertisements.
Organic search results are listings on search engine results pages that appear because of their relevance to the search terms, as opposed to their being advertisements. In contrast, non-organic search results may include pay per click advertising.
Outbound links are external links from a website to a different website. Outbound links are sometimes also referred as “External links”.
PageRank (PR) is an algorithm used by Google Search to rank websites in their search engine results. PageRank is a way of measuring the importance of website pages.
PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.
Please note that this does not necessarily mean that pages with higher PageRank will always be on the first page of Google and pages with lower PageRank will never show up in Google search results. That said, PageRank is Google’s measuring scale for weighing the importance of web pages.
Paid Links are “Sponsored links” which an advertiser buys on a website, blog, forum, portal or any other domain. Since links are a differentiating factor in determining search rankings, a lot of advertisers won’t mind paying some websites money in order to create a backlink from that website to theirs. Sometimes, it also means that an advertiser wants to insert a “Text link advertisement” in one of your posts not for the SEO benefit but only to get referring traffic from your page.
Paid links are not bad but should be dealt with carefully.
Google Panda is a change to Google’s search results ranking algorithm that was first released in February 2011. The change aimed to lower the rank of “low-quality sites” or “thin sites”, in particular, “content farms”, and return higher-quality sites near the top of the search results. Google published an official guideline on creating high-quality sites.
Google launched the Penguin Update in April 2012 to better catch sites deemed to be spamming its search results, in particular, those doing so by buying links or obtaining them through link networks designed primarily to boost Google rankings. The changes in Google Penguin algorithm were later incorporated in Google’s main algorithm.
Poison words, or forbidden words, are words or phrases that trigger suspicion, mistrust and loss of respect, or are of inappropriate character for a given website in its consideration by a search engine.
The preferred domain is the one that you would like used to index your site’s pages (sometimes this is referred to as the canonical domain). Links may point to your site using both the www and non-www versions of the URL (for instance, http://www.example.com and http://example.com). The preferred domain is the version that you want to be used for your site in the search results. Learn more about using Preferred domains.
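As a sketch, a preferred-domain redirect on an Apache server could look like this (assuming example.com and a .htaccess setup; the domain is a placeholder):

```apache
RewriteEngine On
# Permanently redirect the non-www version to the preferred www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```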
Keyword proximity refers to the distance between the individual keywords of a search term. Keywords on the target webpage need not appear in the same order as in the search query. For example, if the query is “best summer collection for men”, then a webpage may appear in search results which contains the phrase “Here is a list of summer clothes for men which we hope you will enjoy”.
Keyword proximity is a very complex concept; one need not worry too much about it – let the content flow naturally in the webpage to achieve the desired results.
A query is the full search phrase used by a user to search for something in a search engine. A query may or may not contain keywords, but in general, 99% of queries contain keywords because, without keywords, it does not make much sense to search for something. For example, “Buy Classic footwear offer discount code” is a query while “Classic footwear” is the keyword inside the query.
Note that all keywords are queries but not all queries are keywords.
Rankings refer to the “search rankings” of a webpage in Google search results for specific keywords, phrases or queries. When someone says they have a search ranking of 1 for the keyword “ABC”, they mean that if anyone searches for the keyword “ABC” in Google Search, their website will show up as the first link (not counting links from ads).
Rankings mean traffic and traffic means business. A loss in rankings means a loss in business value, and hence website owners are very careful to maintain their website rankings at all costs.
When two websites exchange links with each other, the links are called “reciprocal links”. It is a typical link exchange which has been in practice since the beginning of the internet and search engines. It is not considered a good practice to have too many reciprocal links, especially from low-quality websites. That said, if the two websites are genuine and legitimate, it won’t hurt to have some reciprocal links on some pages, but one must be careful and not overdo it across all pages.
A redirect is a URL forwarding technique which is used to forward a webpage to a new address. That means, the existing old address of the webpage automatically takes the visitor to its new address. URL redirection is achieved on the server side through various mechanisms depending on the technology used to deploy the website and the type of web server used.
When you see the term “Redirect” or “301 redirects”, it usually means that the page is forwarded to another URL and is accessible through both the URL’s, the older URL being forwarded to the newer one automatically.
“Re-inclusion” or a “reconsideration request” is the process of asking search engines to re-include a website, or specific pages of a website, in search results after the website has been penalized for not adhering to the Google Webmaster quality guidelines. Google and other search engines continuously filter out websites with low-quality content and those which practice link spam and violate content quality guidelines. Websites that are de-listed from Google get no traffic from it and are pretty much considered “dead”.
However, if a website corrects all the wrong things at its end, cleans up all the technical issues, and removes bad links and low-quality content, its owner can file a reconsideration request through Google Search Console.
Rich Cards are a display option for specific content in the mobile Google search. A webmaster can choose this type of presentation by marking up content such as recipes and movies with the JSON-LD format. The content will then be displayed in a visual format which improves the mobile user experience and interaction. Rich cards can be viewed as an evolution of rich snippets and build on the vocabulary of Schema.org. Learn more about Rich Cards
A “snippet” is a search result displayed on a Google search result page. A “rich snippet” is a search result which contains more than just the title and the meta description of the page. A rich snippet can contain a small thumbnail image from the page, rating information, author information and much more. A rich snippet is shown when the webpage in question uses structured data in its source code. Learn more on rich snippets and structured data.
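A sketch of structured data using the JSON-LD format, with hypothetical values, looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example Chocolate Cake",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>
```

With markup like this, search engines may show the rating as part of a rich snippet for the page.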
SEO stands for “Search engine optimization”. It is the process of optimizing the content and structure of a website for ranking in search engines for specific keywords and phrases so that the website can attract organic traffic, the traffic that is free and large in number.
“SERP” stands for “Search engine result pages”. When a user performs a search on Google.com, he is shown results on a page and this page is called “SERP” or (Search engine result pages). Each SERP contains 10 links and the goal of a website is to rank on the 1st SERP, preferably on the 1st position to attract maximum traffic organically.
A Sandbox is a typical testing environment which is used to debug or test something. When people refer to a “Sandbox”, it generally means that there is a specific “testing tool” that is being used to test or debug something.
Search console refers to the free tool Google provides to webmasters and website owners to fix issues with their website and improve the structure of their website that makes it easy for search engines to crawl and index the content of the website on a regular basis. If you have a website and you want to make it friendlier to Google, it is strongly recommended to create a Google search console account and verify your website.
Search operators are special words and symbols used in the Google search engine to create an advanced search query. For example, you can put $ in front of a number to search for a price. A typical example is “soccer price $50” – this will refine the search results to those which contain prices of soccer balls. Search operators are used to refine search results and restrict them to a particular type that precisely matches your search query and the refinements you have provided with it. Learn more about Google search operators.
A search engine is a website that runs a “crawler” program which “crawls” the internet to find content and websites through links. Once a website is found, the crawler tries to crawl all the available content inside it and then stores that content in its “index”, so that when a relevant search occurs, it can show that page to a potential user who would benefit from that content.
A sitemap is a list of all the pages that a website has. A sitemap can be in XML or HTML format. XML sitemaps are meant only for search engines while HTML sitemaps are meant for both users and search engines. It is recommended to have both the XML and HTML sitemaps on a website, since they showcase all the content of the website in one place and allow the user to discover content and pages which they may not otherwise find.
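A minimal XML sitemap sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
  </url>
</urlset>
```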
A Spider is another name of a “Search bot” or a “Web crawler” which “Crawls” or discovers content from different websites and stores them in its index so that it can fetch the same page for a relevant search query made by someone else at a later point in time.
A splash page is the page of a website that the user sees first, before being given the option to continue to the main content of the site. Splash pages are used to promote a company, service or product, or to inform the user of what kind of software or browser is necessary in order to view the rest of the site’s pages.
A splash page is often used as a marketing tool, deployed to entice users into buying something, subscribe them to a newsletter campaign, or achieve other marketing objectives.
Title tag refers to the content placed inside the <title> </title> tags in the HTML source code of a webpage. Title tags are extremely important for Search engine optimization, since the content of the title tag is shown on search result pages and search engines give extra weight to it. Learn more about creating SEO-friendly Title Tags.
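A minimal sketch of where the title tag sits in a page’s HTML (the title text itself is just an example):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- The title tag content appears in search results and browser tabs -->
  <title>SEO Glossary – Common SEO Terms Explained</title>
</head>
<body>
  <p>Page content goes here.</p>
</body>
</html>
```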
URL rewriting is the process of “rewriting” or “changing” the URLs of web pages by the web server where the website is hosted. If you have a website with dynamic URLs generated by its content management system, you should consider using URL rewriting to create shorter, prettier URLs that are friendly to search engines.
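On an Apache server, a rewrite like this is typically done with mod_rewrite. The rule below is a hypothetical sketch (it assumes mod_rewrite is enabled and that a script named index.php handles posts):

```apache
# Map the pretty URL /blog/seo-glossary to the dynamic URL behind it
RewriteEngine On
RewriteRule ^blog/([a-z0-9-]+)/?$ index.php?post=$1 [L,QSA]
```

With this rule in place, a visitor requesting /blog/seo-glossary is internally served index.php?post=seo-glossary, while the clean URL is what appears in the address bar and in search results.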
Unique content refers to content that is unique to a website and available nowhere else. Unique content is carefully written by an editorial team and often takes a lot of time to brainstorm and develop. Google and other search engines love unique content, and the more unique the content is, the better it is for the long-term success of your website.
Unique content does not mean that the topic or subject of a page has to be unique. It means that the idea must be presented in original words and not taken from somewhere else. 100 bloggers can present the same idea in their own words, and all of the blog posts will be unique content, since each post presents the facts and theories about the idea from an individual author’s perspective.
A user agent is software that acts on behalf of a user. One common use of the term refers to a web browser telling a website information about the browser and operating system. This allows the website to customize content for the capabilities of a particular device, but it also raises privacy issues. Learn more.
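For instance, a browser identifies itself through the User-Agent header of an HTTP request. The example below is a typical (illustrative) header a Chrome browser on Windows might send:

```text
GET / HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36
```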
A vertical search engine, as distinct from a general web search engine, focuses on a specific segment of online content. Vertical search engines are also called specialty or topical search engines. The vertical content area may be based on topicality, media type, or genre of content. Common verticals include shopping, the automotive industry, legal information, medical information, scholarly literature, job search, and travel. Learn more.
Web hosting refers to using a service to host your website so that its content is accessible to anyone. In general, the term refers to web hosting companies, which sell hosting plans to customers and provide all the necessary tools and services to host and develop their websites. Common web hosting providers include Hostgator, Bluehost, Dreamhost, WP Engine, and so on.
There are free web hosting providers who will let you create a basic website at no cost at all. Premium web hosting is used by people who want to go the extra mile. If you are really serious about building a quality website, you can go premium at some point and choose a reliable and trustworthy web hosting company to host your website.
White hat refers to the practice of applying legitimate SEO principles and strategies to naturally grow the traffic and presence of a website in Google search results. White hat generally means that the website owner has not broken any of Google’s webmaster guidelines and has not tried to game the system by doing whatever it takes to improve the website’s traffic.
A Widget is a small element on a website which performs a specific action or displays specific content, either from the same website or from a third-party website. Widgets are used for a variety of purposes: displaying ads and banners, plain text, a list of posts from another website, tweets from Twitter.com, or a Facebook fan page box – all of these are good examples of widgets.
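A minimal sketch of what a third-party widget can look like in a page’s sidebar (the iframe URL is a placeholder, not a real widget endpoint):

```html
<!-- A sidebar widget embedding third-party content via an iframe -->
<div class="sidebar-widget">
  <h4>Latest Tweets</h4>
  <iframe src="https://example.com/widget/tweets"
          width="300" height="250" title="Tweets widget"></iframe>
</div>
```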
An XML Sitemap is an XML file placed at the root of a website which lists all the URLs of the website, so that search engines can use the file as a reference to find all the content available on the site. A search bot can discover content on its own, but an XML sitemap provides an easy roadmap for the search engine to discover content quickly and efficiently, one URL at a time, without having to discover those URLs the hard way.
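A small XML sitemap following the sitemaps.org protocol looks like this (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/seo-glossary/</loc>
  </url>
</urlset>
```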
It is strongly recommended to have an XML sitemap on your website and to add that sitemap to your Google Search Console account, so you can see statistics and reports about the “crawl status” of all the pages on your website.
Yahoo is a search engine which was one of the earliest ways to discover content on the internet. Before Google became popular, Yahoo was used by almost everyone for everything – search, email, web chat, IRC, and other internet activities.
Nofollow is an attribute used in the source code of a hyperlink to tell search engines that the hyperlink is not necessarily an endorsement of the target website and should not influence the target website’s ranking in search engine result pages. The nofollow attribute is used to control the flow of link juice from your site to other sites, and even to other internal pages of your own website. It is also used to link to sites you do not necessarily endorse but still need to link to. Learn more on using Rel Nofollow for Search engine optimization of your website.
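In HTML, the difference is a single rel attribute on the anchor tag (the URLs below are placeholders):

```html
<!-- A normal link, which search engines may treat as an endorsement -->
<a href="https://example.com/">Endorsed link</a>

<!-- A nofollowed link, which asks search engines not to count it as one -->
<a href="https://example.com/untrusted-page/" rel="nofollow">Nofollowed link</a>
```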
A Robots.txt file is a simple text file placed at the root of a website which instructs search engine bots on which pages they are allowed to crawl and which pages they are not. It is a way to restrict search engines from crawling specific pages or sections of your website, so that unimportant pages do not end up in search engine indexes. See an example of a Robots.txt file. Learn more on using a Robots.txt file for search engine optimization of your website.
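A minimal robots.txt sketch (the paths and domain are placeholders for illustration):

```text
# Apply to all crawlers; block two sections and advertise the sitemap
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```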
We hope you have liked this detailed SEO glossary. If you feel we have missed something, feel free to write to us at firstname.lastname@example.org and we will incorporate your suggestions into the glossary. You can download this SEO glossary as a PDF document here – [ Download SEO Glossary (PDF) ]. Be sure to check out our Blogging glossary and read our SEO Guide, which contains useful information about SEO and discusses key SEO concepts in detail with examples.