
Monday, 29 February 2016


Robots.txt for SEO Ranking


What is "robots.txt"?

For those who don’t know what robots.txt is: it is a text file that tells search engine robots whether or not to crawl a page. Any CMS (content management system) based site has an online admin module that should not be crawled by search engines; using robots.txt you can block that part from being crawled.
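For example, a CMS site could keep crawlers out of its admin area with a file like the sketch below ("/admin/" is a placeholder path; substitute whatever admin path your CMS actually uses):

```
User-agent: *
Disallow: /admin/
```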

About /robots.txt

Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. It works like this: before a robot visits a page on a Web site, it first checks the site's /robots.txt file. Suppose it finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
There are two important considerations when using /robots.txt:
  • Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention.
  • The /robots.txt file is a publicly available file. Anyone can see what sections of your server you don't want robots to use.
So don't try to use /robots.txt to hide information.

How to create a /robots.txt file

  • Where to put it - In the top-level directory of your web server.
  • When a robot looks for the "/robots.txt" file for a URL, it strips the path component from the URL (everything from the first single slash) and puts "/robots.txt" in its place.
  • For example, for a URL whose path is "/shop/index.html", the robot removes "/shop/index.html", replaces it with "/robots.txt", and requests that file from the same host.
  • So, as a web site owner, you need to put the file in the right place on your web server for that resulting URL to work. Usually that is the same place where you put your web site's main "index.html" welcome page. Where exactly that is, and how to put the file there, depends on your web server software.
  • Remember to use all lower case for the filename: "robots.txt", not "Robots.TXT".
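The path-stripping rule above can be sketched in a few lines of Python (`robots_txt_url` is a made-up helper name, not a standard function):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_txt_url(url: str) -> str:
    """Strip the path, query and fragment from a URL and
    replace them with /robots.txt on the same host."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_txt_url("https://www.example.com/shop/index.html"))
# https://www.example.com/robots.txt
```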
What to put in it
  • The "/robots.txt" file is a text file, with one or more records. It usually contains a single record looking like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /~joe/
In this example, three directories are excluded.
Note that you need a separate "Disallow" line for every URL prefix you want to exclude -- you cannot say "Disallow: /cgi-bin/ /tmp/" on a single line. 

Also, you may not have blank lines in a record, as they are used to delimit multiple records.

Note also that globbing and regular expressions are not supported in either the User-agent or Disallow lines. The '*' in the User-agent field is a special value meaning "any robot". Specifically, you cannot have lines like "User-agent: *bot*", "Disallow: /tmp/*" or "Disallow: *.gif".
What you want to exclude depends on your server. Everything not explicitly disallowed is considered fair game to retrieve. Here follow some examples:

To exclude all robots from the entire server

User-agent: *
Disallow: /

To allow all robots complete access

User-agent: *
Disallow:

(Or just create an empty "/robots.txt" file, or don't use one at all.)

To exclude all robots from part of the server

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

To exclude a single robot

User-agent: BadBot
Disallow: /

To allow a single robot

User-agent: Google
Disallow:

User-agent: *
Disallow: /

To exclude all files except one

This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
Alternatively you can explicitly disallow all disallowed pages:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html

/robots.txt checker

There are third-party tools that let you validate your /robots.txt file and check which URLs it blocks.
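You can also check the rules locally with Python's standard-library urllib.robotparser, which parses a robots.txt body and answers "may this agent fetch this URL?" — for example:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly (no network call needed).
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "/cgi-bin/script.py"))  # False
print(parser.can_fetch("*", "/index.html"))         # True
```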


Wednesday, 10 February 2016


IFTTT - Powerful Social Media Connection Platform

IFTTT – What it is & how to use it.

It can be difficult to regularly update each social media channel you have an account with. A new and popular way to make this more manageable is a web service platform named ‘IFTTT’ (like lift with no ‘l’). For many users this might seem like just another complicated website to get to grips with. However, using IFTTT for both your business and personal life can make updating social media channels much simpler. Take a look at our straightforward explanation of IFTTT as well as our list of 7 ways people are using it. 

IFTTT is a platform which ‘lets you create powerful connections with one simple statement: If this then that’. IFTTT explains that this is a trigger and that is the resulting action. The website calls these combinations ‘recipes’. An example of a recipe is: if ‘someone retweets your tweet’ automatically ‘tweet to thank them for retweeting’.

7 great ways people are using IFTTT

#1 – Retweets or mentions

If a user retweets, mentions or includes you in a #FF then you should ideally reply to thank them. Doing this for each tweet can take up a lot of time, so why not set up an automatic tweet using IFTTT? Conversations with users don’t have to end here, but at least their activity has been acknowledged.

#2 – Follow a user

If you are working to grow your Twitter following, you might want to follow back each user that follows you. Make this easier by creating a recipe which automatically adds new followers to a list. This means that you can view all of your new followers together, and can then follow those that you want to without having to look through your full followers list.

#3 – Sending items in Google reader

If you have set up a Google reader for a client, you might notice that not every post is useful to them. To avoid your client having to trawl through lots of useless articles you could instead set up a recipe to forward items to their inbox as you star them in Google reader.

#4 – Schedule posts or tweets

As an alternative to HootSuite and other Twitter managers, the ‘Date & Time’ channel on IFTTT can be used to plan posts for your social media accounts. Unlike HootSuite this allows you to set up a single tweet to be posted each day/week/month/year rather than planning each separately. This could be especially useful for promoting an event or special offer.

#5 - Receive updates when a specific person tweets

IFTTT can be used to notify you each time a specific user posts a tweet. Notifications can be sent to your phone, email, Evernote or Google calendar. So, if you’re following a key industry influencer, IFTTT gives you the power to react quickly to their tweets.

#6 – Weather updates

This recipe is particularly useful with the changeable weather we have been having recently in England. Planning weather updates means that you can be better prepared. Simply set up ‘this’ to be the weather and ‘that’ to be the way in which you want to receive the information such as email or text.

#7 – BBC news in pictures

Another fun idea is to have the BBC ‘news in pictures’ emailed to you each day. Simply set up a recipe with ‘this’ as feed- new feed item and then set ‘that’ to be an email straight to your inbox each morning!
There are so many other useful (and fun!) ways of using IFTTT for both your business and personal social media accounts. The best way to explore the platform is to create an account and try out different ways of mixing and matching each channel. Plus, a lot of useful ideas can be found in the browsing section, displaying all the great combinations put together by other users.

Stay connected with this blog to build your digital marketing knowledge.
Subscribe to Digital Marketing by Email

Monday, 25 January 2016



Digital Marketing

Search Engine Optimization

After creating a website, you will want to optimize it for search engines so that it is visible on the Internet and attracts significant traffic. The Internet is already flooded with millions of websites, and there is stiff competition for people's attention. In this situation, SEO is a must-do activity to establish your website on the Internet and earn a good reputation. This blog lists the important SEO checklist items and a guide on how to do them.
Submission to major search Engines and directories
Nobody will notice your website until you submit it to Google, as it is the most popular search engine on the Internet. Apart from Google, a few other important search engines are Bing, AOL, and Ask.com. You should also submit your website to major Internet directories for more exposure and traffic. Dmoz, Wikiweb, Linkbook, Wikidweb, etc. are some of the good web directories.
Google Webmaster Tools is a set of tools that lets you manage the online presence of your website and provides valuable data. It shows search queries, crawl status, links, sitemaps, and much other important data through which you can improve the look and performance of your website in Google search. It also alerts you with a message when something goes wrong on your website. Similarly, Bing Webmaster Tools is an important tool.
Obviously, you will want to capture every detail about your visitors so you can optimize your website and services accordingly. Google Analytics is an important tool for visitor analysis. It provides detailed reports on the performance of your website, including in-depth analysis of your visitors. It is a free service from Google, so create a free account to use it.
A sitemap is an essential SEO element that optimizes a website for search engines. A sitemap helps both search engines and visitors understand the navigation and content of your website. It increases the visibility of a website, leading to more exposure and traffic. Keep a sitemap on your website for visitors and submit one to the webmaster tools for crawling and indexing. You can create a sitemap using online tools and resources.
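For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks like the sketch below (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-25</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
  </url>
</urlset>
```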
The robots.txt file communicates with search engine robots and instructs them how to use a website. For example, there may be some sections of your website that you do not want crawled by Google, so you create a robots.txt file. This is a very critical file; if used incorrectly, it may block crawling entirely, leading to loss of traffic and ranking.

Everyone hates slow-loading sites. If your website takes more than 3 seconds to load, you will lose a significant amount of traffic and your bounce rate will always be high. Most web users are impatient and have endless options, so they will not wait for your website to load; they will simply go to your competitors. Heavy images and videos, bulky code, JavaScript errors, and massive use of Flash often slow down a website.
Search engines are the main source of traffic. People search for information through various keywords and visit the websites that come out on top, so there is always huge competition for primary keywords. One of the most important SEO tasks is to choose a set of relevant keywords and apply them wisely on your website. You can use Google Keyword Planner to analyze the keywords of your industry. Once you have chosen keywords, apply them in the title tag, meta description, and h1 tag of your content for better visibility.
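As a sketch, here is how a hypothetical target keyword (here, "handmade leather bags") might be placed in the title tag, meta description, and h1:

```html
<!-- "handmade leather bags" is a hypothetical example keyword -->
<head>
  <title>Handmade Leather Bags | Example Store</title>
  <meta name="description"
        content="Shop durable handmade leather bags, crafted in small batches.">
</head>
<body>
  <h1>Handmade Leather Bags</h1>
</body>
```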
Since a large number of people use smartphones and tablets to surf the Internet, it is wise to create a mobile-responsive website. Mobile devices are easy to carry and convenient to use, so people prefer mobile Internet. Modern smartphones let users access most desktop features on small screens; for instance, one can pay bills, book tickets, transfer money, and do many other things through a smartphone.
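A responsive page typically starts with the standard viewport meta tag plus CSS media queries; the 600px breakpoint and class name below are just illustrative values:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* 600px is an arbitrary example breakpoint */
  @media (max-width: 600px) {
    .sidebar { display: none; }  /* hide secondary content on small screens */
  }
</style>
```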
Social media is the biggest source of traffic for your website after search engines. Millions of people use social media sites like Facebook, Twitter, and LinkedIn in their daily lives. You can convince them to visit your website and convert them into customers through the right social media strategy. Social media has massive potential to multiply the revenue of your website. Create an account on the major social media websites, post useful content, and engage with other people to increase the profitability of your website.