Integrating Algolia Search for AEM Sites

Shankar Krishnan

Technology | June 18, 2021

Algolia, with over 10,000 customers as of January 2021 and 30 billion search queries a week, is currently second only to Google. The numbers clearly show the market's need for a good site search solution, a space once dominated at the enterprise level by the Google Search Appliance (GSA), followed by many Solr- and Elasticsearch-based implementations.

At Hashout, we are involved in multiple Algolia implementations, and we truly love and recommend Algolia to our customers. As Algolia rapidly launches new capabilities, we, as an implementation partner, are constantly learning about them and bringing them to our enterprise customers. Since we work extensively on the Adobe Experience Manager (AEM) platform, there are a few things we would like to share from our experience integrating AEM with Algolia to deliver a site search experience.

Algolia does not currently provide an official connector between AEM and Algolia. On client calls, we have seen AEM customers repeatedly ask Algolia questions along the lines of “Does Algolia work well with AEM?”.

Here is a summary of a few considerations if you are an AEM Sites customer or architect looking at Algolia:

  • The simplest out-of-the-box (OOTB) way to get your AEM content over to Algolia is their Crawler. Algolia does not currently offer an official Algolia AEM connector that can be installed on your AEM instances.
  • The Crawler is an add-on subscription, and it works quite well for public pages (outside your enterprise firewall). So, if you are an enterprise customer whose staging environment is locked down, you cannot use the Crawler there to test your search relevance, rules, etc.
  • The Crawler only sees the rendered HTML, so you have to make sure the markup of each page contains everything you need to drive search. Also note that there is a limitation of a once-a-day crawl schedule.
  • If you have DAM assets that are not linked from pages or listed in the sitemap, the Crawler will not be able to reach them. For one such need, we added all the resources to the sitemap and used a servlet to expose the asset metadata to be pulled by externalDataSource, a feature of the Algolia Crawler that merges additional information from a CSV.
  • The other option is to use the APIs to push the data to Algolia. We are going this route for some customers due to the above limitations of the Crawler, or to avoid its additional cost. As you may know, it is quite development-intensive to cover all use cases using the APIs.
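To make the externalDataSource point concrete, here is a minimal sketch of the kind of CSV feed a servlet could expose for DAM asset metadata. The column names and asset fields below are purely illustrative assumptions, not an official schema; check the Algolia Crawler documentation for the exact format it expects.

```python
import csv
import io

# Hypothetical asset metadata as it might be read from AEM's DAM
# (e.g. by a custom servlet); field names here are illustrative.
assets = [
    {"url": "https://www.example.com/content/dam/brochure.pdf",
     "title": "Product Brochure", "tags": "product;pdf"},
    {"url": "https://www.example.com/content/dam/datasheet.pdf",
     "title": "Datasheet", "tags": "specs;pdf"},
]

def to_external_data_csv(rows):
    """Serialize asset metadata as a CSV keyed by URL, the general
    shape an external data source merges into crawled records."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["url", "title", "tags"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_external_data_csv(assets))
```

Serving this CSV from a servlet keeps the metadata close to the DAM, so the feed stays current without a separate export pipeline.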
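For the API-push route, the sketch below maps AEM page properties to Algolia records; the page fields, index name, and credentials are placeholder assumptions, and the actual push (a network call via the `algoliasearch` Python client) is left commented out. Using the page path as the `objectID` makes repeated pushes idempotent, since Algolia upserts records by `objectID`.

```python
def build_record(page):
    """Map an AEM page's properties to an Algolia record.
    The page path doubles as objectID so re-pushes overwrite
    rather than duplicate.
    """
    return {
        "objectID": page["path"],
        "title": page["title"],
        "description": page.get("description", ""),
        "url": "https://www.example.com" + page["path"] + ".html",
    }

# Illustrative page data as it might be collected from the JCR.
pages = [
    {"path": "/content/site/en/products", "title": "Products",
     "description": "Our product catalog"},
]
records = [build_record(p) for p in pages]

# Uncomment to push for real (requires the `algoliasearch` package
# and valid credentials):
# from algoliasearch.search_client import SearchClient
# client = SearchClient.create("YOUR_APP_ID", "YOUR_ADMIN_API_KEY")
# client.init_index("aem_pages").save_objects(records)

print(records[0]["objectID"])
```

In practice you would trigger this from a replication event listener or a workflow step so the index stays in sync with publish/unpublish activity, which is where most of the development effort mentioned above goes.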

Please feel free to reach out to us if you are evaluating a search solution, especially Algolia, and would like to learn from our experience with the product.
