
Saturday, 16 April 2016


How To Optimize Your Blog for Search Engines Using the Search Console Tool

Google Webmaster is an amazing tool, and if you are a serious blogger you should never ignore it. From an SEO perspective it is a life saver, as it gives you complete details about how your site appears in the Google search engine.

Webmaster tool offers many features that let you analyze your blog thoroughly, and you can act on that data to improve your blog's ranking in the search engine. Remember, search engine optimization is a gradual process and doesn't happen overnight; sometimes it takes weeks or months to earn a proper search engine ranking. I will not be talking about back-links and other ranking strategies here. Instead, in this post I will share how you can use the best free SEO tool, the Google Webmaster tool, to optimize your blog for search engines.

How to Make Your Blog Search Friendly Using the Search Console Tool:


This is the first thing you need to do. Generate a sitemap of your blog and submit it to the Google Webmaster tool. This helps Google crawl your site effectively, and you can keep track of how many links Google has indexed. If you have not submitted your sitemap yet, go ahead and submit it to the Google Webmaster tool.
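If you prefer to generate a sitemap yourself rather than rely on an online generator or plugin, the sketch below shows the minimal structure the sitemap protocol expects. The URLs are hypothetical placeholders; it uses only Python's standard library.

```python
# Sketch: build a minimal sitemap.xml for a few blog URLs.
# The URLs are placeholders -- substitute your own post links.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # The sitemap protocol requires a <urlset> root in this namespace,
    # with one <url>/<loc> pair per page.
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/2016/04/some-post.html",
])
print(sitemap_xml)
```

Save the output as sitemap.xml in your site's root directory and submit that URL under the Sitemaps section of the Webmaster tool.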

Webmaster settings let you configure some important things like the geographical target. If your blog or service is limited to one particular region, configure it for that region. For example, one of my clients' WordPress blogs is for UK web development services, so I configured her blog's geographical target as the UK. The same goes for your blog or service; if your blog targets a global audience, leave it as it is. WordPress takes care of the WWW vs. non-WWW preference, though I suggest you configure this setting in Google Webmaster tools as well. Crawl rate depends on how often you update your blog. I always leave it to Google, though if you update your blog many times a day, or only once a week, you might like to play with this setting too.

Sitelinks are the pages from your site which Google finds most useful, and in my experience the pages you link to the most define your sitelinks. Sometimes Google adds less useful pages, like your disclaimer or privacy policy, as sitelinks. In that case you can use this feature of the Webmaster tool to manage them. You can't choose which links are included in your sitelinks from here, but you can always block unwanted ones.


When it comes to maintaining and improving your search engine ranking, finding your top-ranking keywords is one of the crucial parts. The search queries feature of the Google Webmaster tool shows you which keywords you rank for, your position in Google, and a link to the post. Google recently revamped this feature and added the post link and search placement data. You can use a WordPress plugin like SEO Smart Links to auto-link such keywords to maintain your ranking, and also try to push posts on the second page of search results onto the first. This is so far one of the top features you should be using.

Ranking for the right keyword matters more than ranking for the wrong one. If your blog is about shoes but you rank on the first page for keywords like shirts and trousers, that traffic is useless. The Webmaster keyword tool shows which keywords Google found while crawling your site. If you are ranking for the wrong keywords, your strategy should be to find the cause and surface related, useful keywords to the search engine.

There are many other tools, like site speed, internal links, and crawl errors, that are equally important. I suggest you spend some time inside your Google Webmaster tool today and come back with your queries. Let's make a good discussion thread about SEO using the Google Webmaster tool.


Also, let us know which feature of the Google Search Console tool you find most useful.


Monday, 29 February 2016


Robots.txt for SEO Ranking


What is "robots.txt"?

For those who don’t know, robots.txt is a text file that tells search engine robots whether or not to crawl a page. Any CMS (content management system) based site has an admin module online that should not be crawled by search engines; using robots.txt you can block that part from being crawled.
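For instance, a hypothetical WordPress-style site could keep its admin area out of the crawl with two lines; /wp-admin/ here stands in for whatever your CMS's actual admin path is:

```
User-agent: *
Disallow: /wp-admin/
```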

About /robots.txt

Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. It works like this: a robot wants to visit a Web site URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
There are two important considerations when using /robots.txt:
  • Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention to it.
  • The /robots.txt file is a publicly available file. Anyone can see what sections of your server you don't want robots to use.
So don't try to use /robots.txt to hide information.

How to create a /robots.txt file

  • Where to put it - In the top-level directory of your web server.
  • When a robot looks for the "/robots.txt" file for a URL, it strips the path component from the URL (everything from the first single slash) and puts "/robots.txt" in its place.
  • For example, for "http://www.example.com/shop/index.html", it will remove "/shop/index.html", replace it with "/robots.txt", and end up with "http://www.example.com/robots.txt".
  • So, as a web site owner you need to put it in the right place on your web server for that resulting URL to work. Usually that is the same place where you put your web site's main "index.html" welcome page. Where exactly that is, and how to put the file there, depends on your web server software.
  • Remember to use all lower case for the filename: "robots.txt", not "Robots.TXT".
What to put in it
  • The "/robots.txt" file is a text file, with one or more records. It usually contains a single record looking like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /~joe/
In this example, three directories are excluded.
Note that you need a separate "Disallow" line for every URL prefix you want to exclude -- you cannot say "Disallow: /cgi-bin/ /tmp/" on a single line. 

Also, you may not have blank lines in a record, as they are used to delimit multiple records.

Note also that globbing and regular expressions are not supported in either the User-agent or Disallow lines. The '*' in the User-agent field is a special value meaning "any robot". Specifically, you cannot have lines like "User-agent: *bot*", "Disallow: /tmp/*" or "Disallow: *.gif".
What you want to exclude depends on your server. Everything not explicitly disallowed is considered fair game to retrieve. Here follow some examples:

To exclude all robots from the entire server

User-agent: *
Disallow: /

To allow all robots complete access

User-agent: *
Disallow:
(Or just create an empty "/robots.txt" file, or don't use one at all)

To exclude all robots from part of the server

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

To exclude a single robot

User-agent: BadBot
Disallow: /

To allow a single robot

User-agent: Googlebot
Disallow:
 
User-agent: *
Disallow: /

To exclude all files except one

This is currently a bit awkward, as the original standard has no "Allow" field (though some major crawlers, such as Googlebot, do support one). The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
Alternatively you can explicitly disallow all disallowed pages:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html

/robots.txt checker

There are third-party tools that let you check your /robots.txt file.
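In addition to online checkers, you can test rules locally with Python's built-in urllib.robotparser. As a sketch, the hypothetical check below uses the /cgi-bin/, /tmp/ and /junk/ example from earlier in this post:

```python
# Sketch: check robots.txt rules locally with Python's standard-library
# parser. The rules mirror the "/cgi-bin/, /tmp/, /junk/" example above.
import urllib.robotparser

rules = """User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(useragent, url) answers: may this robot crawl this URL?
print(parser.can_fetch("*", "http://www.example.com/index.html"))  # True
print(parser.can_fetch("*", "http://www.example.com/tmp/x.html"))  # False
```

RobotFileParser can also fetch a live file, via set_url() followed by read(), instead of parsing a string.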


Monday, 25 January 2016


TOP SEO CHECKLIST FOR A NEW WEBSITE


Search Engine Optimization

After creating a website, you will want to optimize it for search engines so that it is visible on the Internet and draws traffic. The Internet is already flooded with millions of websites, and there is stiff competition for people's attention. In this situation, SEO is a must-do activity to establish your website on the Internet and earn a good reputation. This blog lists the important SEO checklist items and offers a guide on how to do them.
Submission to major search engines and directories
Nobody will notice your website until you submit it to Google, as it is the most popular search engine on the Internet. Apart from Google, Bing, AOL, and Ask.com are a few other important search engines. You should also submit your website to major internet directories for more exposure and traffic. Dmoz, Wikiweb, Linkbook, Wikidweb, etc. are some of the good web directories.
Google Webmaster is a set of tools that lets you manage the online presence of your website and provides valuable data. It shows search queries, crawl status, links, sitemap data, and much other important information through which you can improve the appearance and performance of your website in Google search. Webmaster also alerts you with a message when something goes wrong on your website. Similarly, Bing Webmaster is also an important tool.
Obviously you would want to capture every single detail about your visitors to optimize your website and services accordingly. Google Analytics is an important tool for visitor analysis. It provides detailed reports on the performance of your website, including in-depth analysis of your visitors. It is a free service from Google, so create an account.
A sitemap is an essential SEO technique that optimizes a website for search engines. A sitemap helps both search engines and visitors understand the navigation and content of your website. It increases the visibility of a website, leading to more exposure and traffic. Keep a sitemap page on your website for visitors and submit an XML sitemap to the webmaster tools for crawling and indexing. You can create a sitemap using online tools and generators.
A robots.txt file communicates with search engine robots and instructs them how to use a website. For example, there may be some sections of your website you do not want crawled by Google, so you create a robots.txt file. This is a very critical file, and if used incorrectly it may restrict crawling, leading to loss of traffic and ranking.

Everyone hates slow-loading sites. If your website takes more than 3 seconds to load, you will lose a significant amount of traffic and your bounce rate will always be high. Most web users are impatient and have endless options, so they will not wait for your website to load; they will simply go to your competitors. Heavy images and videos, bulky code, JavaScript errors, and massive use of Flash often slow down a website.
Search engines are the main source of traffic. People search for information through various keywords and visit the websites that come out on top, so there is always huge competition for primary keywords. One of the most important SEO tasks is to choose a set of relevant keywords and apply them wisely on your website. You can use Google Keyword Planner to analyze the keywords in your industry. Once you have chosen your keywords, apply them in the title tag, meta description, and h1 tag of your content for better visibility.
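As a sketch, if your chosen keyword were the hypothetical phrase "running shoes", those three placements would look like this (the titles and description text are made-up examples):

```html
<head>
  <!-- Keyword in the title tag -->
  <title>Running Shoes: Reviews and Buying Guide</title>
  <!-- Keyword in the meta description shown under your search listing -->
  <meta name="description"
        content="Hands-on running shoes reviews and a simple buying guide.">
</head>
<body>
  <!-- Keyword in the page's single h1 heading -->
  <h1>Best Running Shoes for Beginners</h1>
</body>
```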
Since a large number of people use smartphones and tablets to surf the internet, it is wise to create a mobile-responsive website. Mobiles are easy to carry and convenient to use, so people prefer mobile internet. Modern smartphones let users access most desktop features on small screens; for instance, one can pay bills, book tickets, transfer money, and do many other things on a smartphone.
Social media is the biggest source of traffic for your website after search engines. Millions of people use social media sites like Facebook, Twitter, and LinkedIn in their daily lives. You can convince them to visit your website and convert them into customers through the right social media strategy. Social media has massive potential to multiply the revenue of your website. Create accounts on the major social media websites, post useful content, and engage with other people to increase the profitability of your website.