Hacker News

The most important sentence: "Probably not if your core business needs revolve around search."

Postgres full text search is very good, but once you get into the realms where Elasticsearch and SOLR really shine (complex scoring based on combinations of fields, temporal conditions or multiple passes, all that with additional faceting etc.), trying to rebuild all of that on top of Postgres will be a pain.

While that doesn't break the article, it runs into a nasty problem: `unaccent` doesn't handle denormalized accents.

    # SELECT unaccent(U&'\0065\0301');
     unaccent 
    ----------
     é
    (1 row)
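To illustrate why `unaccent` misses this case: the decomposed form above (`e` followed by a combining acute accent) renders identically to the precomposed `é` but consists of different codepoints. A minimal Python sketch of the distinction, using the standard `unicodedata` module:

```python
import unicodedata

decomposed = "e\u0301"  # U+0065 + U+0301 (combining acute), as in the SELECT above
composed = "\u00e9"     # U+00E9, precomposed 'é'

# The two strings display the same but compare unequal codepoint-by-codepoint.
print(decomposed == composed)  # False

# NFC normalization composes them into the same representation.
print(unicodedata.normalize("NFC", decomposed) == composed)  # True
```

`unaccent`'s rules file maps precomposed characters like U+00E9, so the decomposed sequence passes through untouched.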
(That problem is also present in Elasticsearch if you forget to configure the analyzer to normalize properly before unaccenting)


Thanks for your comment, I was not aware of the unaccent limitation. This blog post only presents a solution for small/medium search needs without adding an extra dependency ... the Postgres full text search is far from being a silver bullet


Yep, that's how I understood it, and I like the rest a lot, too. I just wanted to make the sentence stand out, as it is so far down. BTW, the underlying problem seems to be that Postgres doesn't actually handle decomposed UTF-8 and is non-compliant in that regard:

http://www.postgresql.org/message-id/53E1AB15.8050702@2ndqua...

So, probably making sure everything is in composed form before writing to the DB seems to be the best way to go.
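A minimal sketch of that approach, normalizing to NFC (composed form) at the application boundary before any value reaches the database (the `to_composed` helper and the psycopg-style usage are illustrative, not from the article):

```python
import unicodedata

def to_composed(text: str) -> str:
    """Normalize a string to NFC so accented characters are precomposed
    before being written to Postgres."""
    return unicodedata.normalize("NFC", text)

# Hypothetical usage with a DB-API cursor:
# cur.execute("INSERT INTO docs (body) VALUES (%s)", (to_composed(raw_input),))
```

With all stored text in NFC, `unaccent` sees the precomposed characters its rules file maps and strips accents as expected.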



