
Cary Li
For years I defaulted to MySQL — it was what I learned first, and it worked fine. But after switching to PostgreSQL for a recent project, I don't think I'm going back.
The project needed full-text search and JSON storage. MySQL can handle both, but the experience felt like a workaround. PostgreSQL treats them as first-class citizens.
JSON support is genuinely good. The jsonb type isn't just storage: you can put a GIN index on it, query inside documents with containment and path operators, and keep complex queries readable. MySQL's JSON support works but feels bolted on.
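Here's a minimal sketch of what that looks like in practice. The `events` table and its columns are hypothetical, just to illustrate the operators:

```sql
-- Hypothetical events table; names are illustrative.
CREATE TABLE events (
    id      bigserial PRIMARY KEY,
    payload jsonb NOT NULL
);

-- A GIN index makes containment queries on the whole document fast.
CREATE INDEX events_payload_idx ON events USING GIN (payload);

-- Containment: every event whose payload includes this key/value pair.
SELECT id FROM events WHERE payload @> '{"type": "signup"}';

-- Drill into nested structure; ->> extracts the value as text.
SELECT payload->'user'->>'email' FROM events WHERE id = 42;
```

The `@>` containment query is the one the GIN index accelerates, which is what makes jsonb feel like a queryable column rather than a blob.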
Full-text search is built in and flexible. With tsvector and tsquery, I got decent search without pulling in a separate service. Not Elasticsearch-level, but good enough for most apps.
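A rough sketch of the setup, assuming a hypothetical `articles` table with `title` and `body` text columns (the generated-column syntax needs PostgreSQL 12 or later):

```sql
-- Keep a tsvector column in sync automatically via a generated column.
ALTER TABLE articles
    ADD COLUMN search tsvector
    GENERATED ALWAYS AS (
        to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, ''))
    ) STORED;

CREATE INDEX articles_search_idx ON articles USING GIN (search);

-- Match with @@ and rank results; websearch_to_tsquery accepts
-- familiar search-engine syntax like quoted phrases.
SELECT title, ts_rank(search, q) AS rank
FROM articles, websearch_to_tsquery('english', 'postgres indexing') q
WHERE search @@ q
ORDER BY rank DESC;
```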
pgvector changed everything for AI projects. Being able to store embeddings and run similarity search in the same database where my other data lives is a big deal. No extra infrastructure, no sync issues.
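A sketch of the pgvector workflow; the `documents` table and the 1536 dimension count (typical for OpenAI-style embeddings) are assumptions, and `$1` stands in for a query embedding supplied by the application:

```sql
-- pgvector ships as an extension.
CREATE EXTENSION IF NOT EXISTS vector;

-- Store each embedding right next to the data it describes.
CREATE TABLE documents (
    id        bigserial PRIMARY KEY,
    content   text,
    embedding vector(1536)
);

-- HNSW index for fast approximate nearest-neighbor search
-- using cosine distance.
CREATE INDEX documents_embedding_idx
    ON documents USING hnsw (embedding vector_cosine_ops);

-- The five documents most similar to the query embedding.
SELECT id, content
FROM documents
ORDER BY embedding <=> $1
LIMIT 5;
```

The `<=>` operator is cosine distance, so ordering ascending by it returns the closest matches first, and it all happens in the same database as the rest of the app's data.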
If you're running a high-traffic read-heavy workload and have existing MySQL expertise on the team, there's no urgent reason to switch. MySQL is fast and well-understood.
For new projects — especially anything touching AI, search, or complex data shapes — PostgreSQL is the better default. The ecosystem has caught up, the tooling is great, and features like pgvector make it future-proof in a way MySQL isn't.