Tuesday, July 11, 2017

Alternatives to Google Site Search

Google plans to discontinue Site Search:

On April 1, 2017, Google will discontinue sales of the Google Site Search. All new purchases and renewals must take place before this date. Google Site Search will be completely shut down by April 1, 2018.

So what next?

No more Google Site Search. Only Google Custom Search remains: a free version with ads and many restrictions, such as no image search and no e-mail support. But one possible solution has been suggested by some experts in the Google forums.

Try Google Custom Search with the XML or JSON API:

I haven't tried this method myself. Please leave a comment if you have tried it and it works.

Check this reply from Google about implementing GSS using the XML or JSON API:

If you have implemented GSS using XML or JSON API provided in GSS control panel->Business->XML & JSON page then note these features are specific to GSS only and not available in CSE. Once your GSS gets downgraded to free CSE these XML and JSON API implementation will stop working. However if you want to implement CSE using JSON API, you can consider taking the API key from developer console. By default this API key provides 100 queries per day for free and beyond that you need to enable separate billing on this API key to increase the daily limit. Additional requests cost $5 per 1000 queries, up to 10k queries per day.
Using this implementation you can see ads free search results.

You can read more about this API at https://developers.google.com/custom-search/json-api/v1/overview

You can read more about billing and payments at https://support.google.com/cloud#topic=3340599
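To make the JSON API option above concrete, here is a minimal sketch of building a Custom Search JSON API request, assuming you have already created an API key in the developer console and have a search engine ID; the placeholder values are hypothetical and must be replaced with your own:

```python
from urllib.parse import urlencode

# Hypothetical placeholders - substitute the API key from your developer
# console and your own Custom Search Engine ID.
API_KEY = "YOUR_API_KEY"
CX = "YOUR_SEARCH_ENGINE_ID"

def build_search_url(query, start=1):
    """Build a Custom Search JSON API request URL.

    The API returns at most 10 results per request; use `start`
    (1, 11, 21, ...) to page through further results.
    """
    params = {"key": API_KEY, "cx": CX, "q": query, "start": start}
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

url = build_search_url("site search")
# Fetch `url` with urllib.request.urlopen() and parse the JSON response;
# the "items" list holds the ads-free results. Each request counts
# against the 100-queries-per-day free quota mentioned above.
```

Each call you make this way is billed against the quota on the API key, so cache results where you can.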

Other Alternative Solutions:

  1. Server-side coding: we can write code on our own server to perform a simple search. It would not be as fast as Google, but it would still serve the purpose for free.
  2. Cloud based:
  • Amazon CloudSearch - Amazon CloudSearch is a managed service in the AWS Cloud that makes it simple and cost-effective to set up, manage, and scale a search solution for your website or application. https://aws.amazon.com/cloudsearch/pricing/
  3. Others:
  • Swiftype - Create and manage a search experience tailored to your specific needs in no time, thanks to seamless indexing, best-in-class relevance and intuitive customization features. https://swiftype.com/pricing
  • Algolia - Algolia's Search API makes it easy to deliver a great search experience in your apps & websites. Algolia Search provides hosted full-text, numerical, faceted and geolocalized search. https://www.algolia.com/pricing
  • searchIQ - Supercharge your website's search functionality with searchIQ. With blazing speed get your users to what they are looking for faster than ever and get real-time analytics to help manage your content more effectively. https://searchiq.xyz/pricing.html
  • SearchBlox - SearchBlox is an enterprise search, sentiment analysis and text analytics platform for websites, intranets, file folders, databases and social content. https://www.searchblox.com/pricing-2/
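For option 1 (server-side coding), here is a rough sketch of what a simple self-hosted search could look like: plain term-frequency matching over your own pages. It is nowhere near Google's relevance, but it runs for free on your server. The pages and URLs below are made up for illustration:

```python
import re
from collections import Counter

def simple_search(pages, query):
    """Very naive server-side search: rank pages by how often they
    contain the query terms. `pages` maps URL -> page text."""
    terms = [t.lower() for t in re.findall(r"\w+", query)]
    scores = {}
    for url, text in pages.items():
        words = Counter(re.findall(r"\w+", text.lower()))
        score = sum(words[t] for t in terms)
        if score:
            scores[url] = score
    # highest-scoring pages first
    return sorted(scores, key=scores.get, reverse=True)

# Made-up pages for illustration.
pages = {
    "/gss-alternatives": "alternatives to google site search",
    "/robots-txt": "how google treats allow in robots txt",
    "/site-search-tips": "site search tips and site search tools",
}
results = simple_search(pages, "site search")
```

A real implementation would index pages ahead of time instead of scanning them per query, but the ranking idea is the same.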

Friday, February 20, 2015

How Google treats "allow" field in robots.txt

What is robots.txt? robots.txt is a set of instructions for blocking particular pages, folders, images, etc. from crawlers. We can also use robots.txt to block particular search engines, like Google, Bing, Yahoo, etc.

Allowing all search engines to access all files:

User-agent: *
Disallow:

Disallowing all search engines from accessing any files:
User-agent: *
Disallow: /

If we need to exclude all files except a few, what can we do?

We have to use the Disallow directive for every file that needs to be excluded; the other files remain accessible to search engines. Check the robotstxt.org advice here.

To exclude all files except one

This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/

As per Google, to exclude all files except one:

User-agent: *
Disallow: /
Allow: /2009/03/aarya-sarvam.html

Google disallows all files except /2009/03/aarya-sarvam.html because of the Allow directive. Google honors both the Disallow and Allow lines in robots.txt, even though the Allow field is not part of the original robotstxt.org standard. Google accepts the Allow directive and crawls accordingly.
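You can experiment with this Allow/Disallow behaviour locally using Python's urllib.robotparser. One caveat: Python's parser applies the first matching rule, whereas Google applies the most specific (longest) match regardless of order, so in this sketch the Allow line is placed before the Disallow to get the same outcome. The example.com URLs are placeholders:

```python
from urllib import robotparser

# The "exclude all files except one" rules from above, with the
# Allow line first because urllib.robotparser is order-sensitive.
rules = """\
User-agent: *
Allow: /2009/03/aarya-sarvam.html
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The excepted file is allowed; every other URL is disallowed.
print(rp.can_fetch("*", "http://example.com/2009/03/aarya-sarvam.html"))  # True
print(rp.can_fetch("*", "http://example.com/other-page.html"))            # False
```

This is a handy local sanity check, but the robots.txt Tester tool below remains the authoritative way to see how Googlebot itself will interpret your file.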

Check the screenshots: the excepted file shows as Allowed in the robots.txt Tester tool, while another URL checked in the same tool shows as Disallowed.

Robots.txt changes: before making any change to robots.txt, we should test it with the robots.txt Tester tool.

Robots.txt Tester tool: it is an online editor, so we can edit the robots.txt text in the tool and check the effect of the changes before uploading the file to the live server.

Test your robots.txt with the robots.txt Tester: https://support.google.com/webmasters/answer/6062598?hl=en

Tuesday, February 3, 2015

Is PageRank still worth considering?

What is PageRank? The PageRank equation is based on a site's backlinks (authority). If one site links to another, like

website A -> links -> website B

website A's rank juice (authority) flows to website B. Many people used this methodology to gain rank in a short period.

PageRank negatives: yes, it led to link farms, link exchanges, and paid links. People still believe in PageRank and use bad techniques, known as link schemes, to earn it. But Google is now alert to this and has changed its algorithm to clean spam websites out of the search results. To counter this kind of PageRank manipulation, Google introduced the "Penguin" algorithm,

which helps to prevent:

1. Low-quality links
2. Over-optimized anchor texts
3. Keyword stuffing

PageRank is dead: yes, Google's John Mueller finally announced it on 06 Oct 2014.

Ref: https://www.youtube.com/watch?v=-GlxLlpm3Ew#t=1230

But many people still believe in PageRank and keep posting about PR in SEO forums. Anyhow, the PR toolbar is dead and Google won't update it in the future. So what should we look at hereafter instead of PageRank?

Search Queries feature in Google Webmaster Tools: yes, you should examine this section to improve your site and its rankings. It provides a lot of information for webmasters.

View Search Queries:

  1. On the Webmaster Tools home page, click the site you want.
  2. On the left-hand menu, click Search Traffic, and then click Search Queries.
Ref: https://support.google.com/webmasters/answer/35252?hl=en

More information about PageRank:

  • PageRank: one of just 200 signals
  • Beyond PageRank: Graduating to actionable metrics
  • PageRank officially dead
  • My site's PageRank has gone up / gone down / not changed in months!


Also take a look at the links above. PageRank is just one of the roughly 200 ranking signals Google uses, so don't worry too much about one signal; it is better to build up the other 199 for the website.

Thursday, April 11, 2013

Google Plus Displays Recent Posts in Google Search

The next social media update from Google is for Google Plus. As of today (11 Apr 2013), Google Plus shows recent posts in the Google search results when we search for a business name.

Previously, Google Plus showed the business name, contact information (address, phone number), map directions, and reviews. After this Google Plus update, "Recent Posts" are added at the bottom of the Google Plus panel.

A nice update from Google: posts show up in the search results immediately and frequently after publication.

Thursday, January 17, 2013

Facebook Graph Search

Yesterday, Facebook announced a major new feature called "Graph Search".

It will be very useful for visitors who want to explore the connections between people, places, and interests.

Sunday, December 4, 2011

Google New Looks

Hi Guys,

Google is changing day by day. Check out Google's latest new look.
