The most important sentence: "Probably not if your core business needs revolve around search."
Postgres full text search is very good, but once you get into the realms where Elasticsearch and Solr really shine (complex scoring based on combinations of fields, temporal conditions or in multiple passes, all that with additional faceting etc.), trying to rebuild all that on top of Postgres will be a pain.
While that doesn't break the article, it runs into a nasty problem: `unaccent` doesn't handle denormalized accents.
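To make the "denormalized accents" point concrete: Unicode allows "é" to be written either as a single precomposed code point (U+00E9) or as "e" plus a combining acute accent (U+0301). The report above is that `unaccent` only strips the precomposed form. A minimal Python sketch of the same failure mode (illustrative only, using a naive character map in place of Postgres's `unaccent` rules):

```python
import unicodedata

# Two visually identical strings with different code point sequences:
precomposed = "caf\u00e9"    # "café" as a single precomposed character
decomposed = "cafe\u0301"    # "cafe" + combining acute accent (U+0301)
assert precomposed != decomposed  # they compare unequal byte-for-byte

# A naive unaccenter that only knows precomposed characters, standing in
# for the reported `unaccent` behavior:
naive_unaccent = str.maketrans("\u00e9", "e")

print(precomposed.translate(naive_unaccent))  # "cafe" - accent stripped
print(decomposed.translate(naive_unaccent))   # combining accent survives

# Normalizing to NFC first collapses the decomposed form into the
# precomposed one, so the same mapping now works on both inputs:
normalized = unicodedata.normalize("NFC", decomposed)
print(normalized.translate(naive_unaccent))   # "cafe"
```

This is also what the Elasticsearch remark further down amounts to: the analyzer has to normalize (e.g. via ICU normalization) before any accent folding, or decomposed input slips through the same way.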
Thanks for your comment, I was not aware of the unaccent limitation.
This blog post only presents a solution for small/medium search needs without adding an extra dependency ... Postgres full text search is far from being a silver bullet.
Yep, that's how I understood it and I like the rest a lot, too. I just wanted to make the sentence stand out, as it is so far below. BTW, the problem seems to be that Postgres doesn't actually handle decomposed UTF-8 and is non-compliant in that regard:
(That problem is also present in Elasticsearch if you forget to configure the analyzer to normalize properly before unaccenting)