Friday, February 20, 2015

How Google treats "allow" field in robots.txt

What is robots.txt: robots.txt is a set of instructions for blocking particular pages, folders, images, etc. from crawlers. We can also block particular search engines, like Google, Bing, Yahoo, etc., by targeting their user agents in robots.txt.

Allowing all search engines to access all files:

User-agent: *
Disallow:

Disallowing all search engines from accessing any files:
User-agent: *
Disallow: /

If we need to exclude all files except a few, what can we do?

We have to use a Disallow line for every file or folder that needs to be excluded; all other files remain accessible to search engines. Check the advice here.

To exclude all files except one

This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/

As per Google, to exclude all files except one:

User-agent: *
Disallow: /
Allow: /2009/03/aarya-sarvam.html

Because of the Allow line, Google disallows all files except /2009/03/aarya-sarvam.html. The Allow field is not part of the original robots.txt standard, but Google supports it and follows both the Disallow and Allow rules when crawling.
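A quick way to sanity-check allow/disallow rules locally is Python's built-in urllib.robotparser. One caveat, as a sketch: this parser applies rules in the order they appear (first match wins), unlike Google's longest-match behavior, so the Allow line has to come before "Disallow: /" for it to take effect here:

```python
# Check allow/disallow rules locally with Python's standard-library parser.
# Note: urllib.robotparser matches rules in file order (first match wins),
# so the Allow line is placed before "Disallow: /" in this sketch.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /2009/03/aarya-sarvam.html
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/2009/03/aarya-sarvam.html"))  # True: explicitly allowed
print(rp.can_fetch("*", "/2009/03/another-post.html"))  # False: blocked by Disallow: /
```

This only approximates Google's evaluation, so the robots.txt Tester tool described below is still the authoritative check before going live.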

Check the screenshot here: the excepted file shows as Allowed in the robots.txt Tester tool:

Checking another URL in the robots.txt Tester tool: it shows as Disallowed.

Robots.txt changes: before making any change to robots.txt, we need to test the change with the robots.txt Tester tool.

Robots.txt Tester tool: it is an online editor, so we can edit the text in the tester and check the changes before uploading the robots.txt file to the live server.

Test your robots.txt with the robots.txt Tester:

Tuesday, February 3, 2015

Is PageRank still relevant?

What is PageRank: the PageRank equation is based on a site's backlinks (authority). If one site links to another, like

website A -> links -> website B

some of website A's rank juice (authority) flows to website B. Many people used this methodology to gain rankings in a short period.
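The "rank juice" idea can be sketched as a tiny power-iteration PageRank. This is an illustrative toy, not Google's actual implementation; the 0.85 damping factor is just the commonly cited default:

```python
# Toy PageRank sketch: each page splits its rank evenly among the pages
# it links to, with a damping factor; dangling pages spread rank evenly.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += damping * share
            else:
                # dangling page: distribute its rank evenly to all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# "website A links to website B": B inherits part of A's authority
ranks = pagerank({"A": ["B"], "B": []})
print(ranks)  # B ends up with a higher rank than A
```

The example shows why backlinks were so attractive: a single link is enough to transfer measurable authority, which is exactly what the link schemes below tried to exploit.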

PageRank negatives: yes, it led to link farms, link exchanges, and paid links. People still believe in PageRank and use bad techniques, known as link schemes, to earn it. But Google is now alert to PageRank manipulation and has changed its algorithm to clean spam websites out of the search results. To prevent this kind of manipulation, Google introduced the "Penguin" algorithm.

Penguin helps to prevent:

1. Low quality links
2. Over optimized anchor texts
3. Keywords stuffing

PageRank is dead: yes, Google's John Mueller finally announced it on 06 Oct 2014.


But many people still believe in PageRank and keep posting about PR in SEO forums. In any case, the PR toolbar is dead and Google won't update it in the future. So what should we look at hereafter instead of PageRank?

Search Queries feature in Google Webmaster Tools: yes, you should examine this section to improve your site and rankings. It provides a lot of information for webmasters.

View Search Queries:

  1. On the Webmaster Tools home page, click the site you want.
  2. On the left-hand menu, click Search Traffic, and then click Search Queries.

More information about PageRank:

PageRank: one of just 200 signals

Beyond PageRank: Graduating to actionable metrics

Pagerank officially dead

My site's PageRank has gone up / gone down / not changed in months!

Also take a look at the links above. PageRank is just one of some 200 ranking signals from Google, so don't worry too much about one signal. Better to build the other 199 signals for the website.
