
Sunday, December 25, 2016

More About On-Page SEO Optimization


In my last post on the One Stop SEO Guide, I discussed a few major on-page factors. Now we will discuss a few more on-page factors that will help you become an SEO expert. Earlier we covered Keyword Research, Meta Tags, the Title Tag, Image Alt Tags, URL restructuring and redirection, and internal and external linking (inbound and outbound links). Read our previous post, Learning On-Page SEO via One Stop SEO Guide, to learn about all of these.

Now we will discuss keyword density, HTML header tags, the sitemap, and the robots.txt file. Let's start.

  • Keyword Density: This is the percentage of times a keyword or phrase appears in a webpage's content compared with the total number of words on the page. In search engine optimization, keyword density is one of the factors used to judge whether a webpage is relevant to a specific keyword or keyword phrase, and it is an important part of on-page SEO. For example, if a keyword appears 4 times in a 100-word text, the keyword density is 4%. From a search engine's point of view, a very high keyword density is an indicator of search engine spam: if a keyword appears too often on a page, search engines will downgrade that page and it will appear lower in search results. Keeping the density of every keyword you target between roughly 2% and 4.5% is considered good SEO practice (a minimal density calculation appears as a sketch after this list).
  • HTML Header Tags: HTML header tags are used to distinguish the headings and sub-headings of a webpage from the rest of its content. They are also known as heading tags or simply header tags. There are six of them, H1 through H6: the most important is the H1 tag and the least important is H6. Using a sensible mix of header tags is good SEO practice and improves the readability of a webpage's content.
  • Sitemap (.xml and .html file): A sitemap is a list of a website's pages, made accessible to crawlers or users. It can be either an XML file that lists the site's URLs along with additional metadata about each URL, so that search engines can crawl the site more intelligently, or an HTML file meant for end users. Both matter in SEO, but we usually focus on sitemap.xml, because that is the file we submit to search engines through their webmaster tools to help them crawl our website or blog. To generate the .xml or .html file for your website, simply visit www.xml-sitemaps.com, enter your website's home page (e.g. "http://www.example.com"), and set the change frequency of your choice (monthly or weekly is recommended, depending on how often you update your site). Leave last modification as "Use Server's Response" and priority as "Automatically Set Priority". Click the Start button and within a few seconds your sitemap.xml and sitemap.html files will be generated for download.
  • Robots.txt File: The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. It specifies how to tell a robot which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard; email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website they have been told to stay out of. The standard is different from, but can be used together with, sitemaps, a robot inclusion standard for websites. List every webpage or URL that you don't want spiders, bots or crawlers to crawl under a Disallow rule. To learn how to create a robots.txt file, visit here. A small sketch for checking a site's robots.txt and sitemap.xml follows below.
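To make the keyword density arithmetic above concrete, here is a minimal Python sketch. The page text and the keyword "seo guide" are hypothetical placeholders; in a real audit you would extract the visible text of the page first.

```python
import re

def keyword_density(text, keyword):
    """Return the keyword (or phrase) density as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    if not words or not phrase:
        return 0.0
    n = len(phrase)
    # Count occurrences of the (possibly multi-word) phrase with a sliding window.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits / len(words)

# Hypothetical 100-word page where the phrase appears 4 times -> 4.0%
page_text = " ".join(["seo guide"] * 4 + ["filler"] * 92)
print(round(keyword_density(page_text, "seo guide"), 2))
```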
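And here is a small sketch, using only the Python standard library, for checking whether a site exposes robots.txt and sitemap.xml and whether a given page may be crawled. The domain www.example.com is a placeholder; swap in the site you are working on.

```python
import urllib.error
import urllib.request
import urllib.robotparser

site = "https://www.example.com"  # placeholder domain for illustration

# 1. Parse robots.txt and test whether a crawler may fetch the home page.
rp = urllib.robotparser.RobotFileParser()
rp.set_url(site + "/robots.txt")
rp.read()
print("Googlebot may crawl /:", rp.can_fetch("Googlebot", site + "/"))

# 2. Check that sitemap.xml responds with HTTP 200.
try:
    with urllib.request.urlopen(site + "/sitemap.xml", timeout=10) as resp:
        print("sitemap.xml status:", resp.status)
except urllib.error.HTTPError as err:
    print("sitemap.xml missing or blocked:", err.code)
```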
In my next post, I will explain Webmaster Tools and Analytics integration and use, and how you can apply URL redirection to your website or webpages. Keep reading and do share the blog with your friends.

Sunday, December 11, 2016

First Step Towards Becoming An SEO Freelancer

This post continues my earlier post "How To Get Started With Search Engine Optimization", in which I explained the various online marketing techniques. Here I will teach you more about White Hat SEO (Search Engine Optimization). We will discuss the other techniques, such as SMO, SEM and SMM, in future posts.

Search Engine Optimization (White Hat SEO) is a powerful technique in the hands of IT professionals, helping them influence the search results of major search engines like Google, Yahoo and Bing according to their requirements.

With White Hat SEO, you can help your website rank well on SERPs (Search Engine Result Pages) for many keywords. The first and foremost step in getting started with SEO is bidding for an SEO project on websites like Upwork and Freelancer. After bidding, you have to present the client with a detailed website analysis report. We will explain how to prepare a Website Analysis Report and what you should include in it.

WAR, short for Website Analysis Report, is a detailed report on the website that needs SEO work. To analyze a website you can either use various website analysis tools or do it manually. You can analyze your website with www.site-analyzer.com, www.seoptimer.com or www.woorank.com. You should include details of the following in the report:

  1. Website Score on Woorank (visit https://www.woorank.com/en/www/google.com): simply replace google.com in the URL with your website's domain and open that URL in the address bar for the results. With Woorank, you can easily check the website score and other issues with the site.
  2. Google Analytics: On the website's home page, press "Ctrl+U" and a new window with the source code will open. Now search for "UA-" using the find box. If it is present, Google Analytics is most likely installed; if not, it probably is not (see the detection sketch after this list).
  3. Google Webmasters (now Google Search Console)
  4. Meta Title and Description: Check the source code of the home page and subpages (press "Ctrl+U") for the Title, Meta Description and Meta Keywords tags.
  5. HTML Header Tags
  6. Mobile Friendliness
  7. Website Speed for Desktop and Mobile (To check visit: https://developers.google.com/speed/pagespeed/insights/ )
  8. Check for W3 errors (check here: https://validator.w3.org/)
  9. Check for the robots.txt file. Type website_name/robots.txt (e.g. example.com/robots.txt). If it is missing, you can simply create a robots.txt file yourself.
  10. Check for the sitemap.xml file. Type website_name/sitemap.xml (e.g. example.com/sitemap.xml). You can generate sitemap.xml or sitemap.html files here: https://www.xml-sitemaps.com/
  11. Check for social media page plugins, and whether the website has Facebook, Twitter, Google+, LinkedIn and Pinterest accounts or not.
  12. Check if the website has its own blog.
  13. WWW Resolve: Check whether the website has a 301 or 302 redirect, i.e. whether typing example.com and www.example.com both end up at the same www.example.com address (see the redirect-check sketch after this list).
  14. Check in Woorank whether the website has broken links. If it does, mention those links in the report.
  15. Check for website backlinks.
  16. Check Alexa Rank for the website at Alexa.com
  17. Check all images for Image Alt tags.
  18. Check the Text to HTML ratio (a rough calculation sketch follows this list).
  19. Mention the Moz PA (Page Authority) and DA (Domain Authority) for the website.
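As a sketch of the Google Analytics check in item 2, the snippet below fetches a page's HTML and looks for a "UA-" tracking ID in the source. The URL is a placeholder, and newer sites may use a GA4 "G-" measurement ID instead, so treat a miss as a hint rather than proof.

```python
import re
import urllib.request

url = "https://www.example.com"  # placeholder: the client site being audited

html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")

# Classic Universal Analytics IDs look like UA-12345678-1; GA4 uses G-XXXXXXXXXX.
ids = re.findall(r"\bUA-\d{4,10}-\d+\b|\bG-[A-Z0-9]{6,12}\b", html)
print("Analytics IDs found:", ids if ids else "none, Analytics is probably not installed")
```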
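For the WWW-resolve check in item 13, here is a quick sketch: request both the bare and the www variant of the domain without following redirects, and note the status code and Location header. The example.com hosts are placeholders; a 301 is a permanent redirect and a 302 a temporary one.

```python
import http.client

def first_response(host, path="/"):
    """Return the first status code and Location header, without following redirects."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

for host in ("example.com", "www.example.com"):  # placeholder domains
    status, location = first_response(host)
    print(f"{host}: {status} -> {location}")
```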
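And for the Text to HTML ratio in item 18, a rough sketch: download the page, strip the scripts, styles and tags, and compare the length of the remaining text with the length of the full HTML. The URL is a placeholder and the tag stripping is deliberately crude; dedicated tools will give a more precise figure.

```python
import re
import urllib.request

url = "https://www.example.com"  # placeholder: the client site being audited
html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")

# Strip scripts, styles and tags to approximate the visible text.
text = re.sub(r"(?s)<(script|style)\b.*?</\1>", " ", html, flags=re.IGNORECASE)
text = re.sub(r"(?s)<[^>]+>", " ", text)
text = re.sub(r"\s+", " ", text).strip()

ratio = 100.0 * len(text) / max(len(html), 1)
print(f"Text to HTML ratio: {ratio:.1f}%")
```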
These are some of the most important things you should include in the Website Analysis Report for a client's website. In my upcoming posts, I will tell you about On-Page SEO, URL structuring, Meta Titles, Meta Descriptions, Meta Keywords analysis and many other On-Page and Off-Page SEO techniques to make your website search engine friendly and help it perform well in search results. Stay tuned and share this blog with your friends to support my efforts. Thank you.

Also Read: Learning On-Page SEO via One Stop SEO Guide