That's a wrap: Meilisearch Launch Week, October 13-17, 2025
New in Meilisearch: conversational AI, search personalization, multi-modal search, and more

Another Launch Week in the books. We shipped five major updates this week, designed to solve real problems our users told us they were struggling with. Each one of them makes your search experience more powerful, more flexible, and easier to work with.
Here's what we built:
Meilisearch Chat*: we brought conversational AI to Meilisearch
Search is becoming a conversation. Your users no longer want to scroll through results - they want to ask a question and get an answer. But building conversational AI has meant juggling LLM calls, vector databases, reranking logic, and weeks of infrastructure work.
Our new /chat endpoint brings complete RAG capabilities directly to your Meilisearch index. One API call handles everything: understanding the query, retrieving relevant data, and generating accurate responses.
Key features:
- Conversational AI for search: your users can ask natural-language questions and get precise, contextual answers directly from your Meilisearch index.
- Built-in RAG capabilities: retrieves relevant information, contextualizes it, and generates accurate responses - no separate vector database, reranker, or AI service required.
- Native integration: runs directly on your Meilisearch instance as a single API endpoint.
*This feature is in open beta.
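To give you a feel for it, here's a minimal sketch of calling the endpoint over plain HTTP. The request body (an OpenAI-style messages array) is an assumption modeled on common chat-completion APIs, not the documented schema; the /chat announcement and docs define the exact route and payload.
```python
import requests

MEILI_URL = "http://localhost:7700"  # your Meilisearch instance
API_KEY = "MEILI_API_KEY"            # a key with access to the chat feature

response = requests.post(
    f"{MEILI_URL}/chat",  # endpoint name as given in the announcement
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        # Hypothetical payload: a natural-language question answered from your index
        "messages": [
            {"role": "user", "content": "Which running shoes work best on trails?"}
        ]
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```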
Read the full announcement and start building with /chat →
Advanced GeoSearch
Location-based search is powerful, but until now, Meilisearch only supported basic shapes and points on the map. If your use case involved complex shapes, delivery zones, city boundaries, or irregular service areas, you had to work around it.
We've added full GeoJSON support with polygon and custom shape filtering. You can define a complex geographic boundary and search within it precisely. It's built on the same fast, filterable foundation you're used to, just far more flexible.
Key features:
- Advanced shape search: search within polygons, multi-points, and complex shapes
- Fine-grained filtering: mix geo queries with text relevance and filters
- Faster performance: optimized for real-time map and delivery use cases
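Here's a minimal sketch of combining a geo filter with a text query through the search endpoint. The `_geoPolygon` filter name and its vertex format are assumptions extrapolated from the existing `_geoRadius` and `_geoBoundingBox` filters; check the geosearch docs for the exact syntax.
```python
import requests

MEILI_URL = "http://localhost:7700"
API_KEY = "MEILI_API_KEY"

# A rough triangle over part of a city ([lat, lng] vertices), purely illustrative.
polygon_filter = "_geoPolygon([48.87, 2.30], [48.88, 2.36], [48.84, 2.34])"

response = requests.post(
    f"{MEILI_URL}/indexes/restaurants/search",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "q": "pizza",              # text relevance still applies
        "filter": polygon_filter,  # geo filter combined with the text query
    },
    timeout=10,
)
response.raise_for_status()
for hit in response.json()["hits"]:
    print(hit.get("name"))
```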
Explore geosearch with polygons in the docs →
Enhanced multi-modal: integrated image search
Image search has been possible in Meilisearch, but it required you to generate embeddings externally and inject them manually. For a lot of teams, that friction was enough to shelve the idea entirely.
Now it's built in. Provide images in base64 format, select an embedder (like CLIP), and Meilisearch handles the rest: embedding, indexing, and search. You can support text-to-image queries, image-to-image queries, or both. Whether you're building product catalogs, media libraries, or visual discovery tools, you can ship image search without the setup headache.
Key features:
- Integrated support: image-to-image and text-to-image search now built directly into Meilisearch
- Multi-modal queries: mix image, text, and embedding inputs seamlessly
- Simple setup: everything runs through a single API and index with flexible configuration
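As a rough sketch, assuming the base64 image travels alongside the text query in the search request: the `media` field and the embedder name `clip` are placeholders for illustration, not documented parameter names, while `hybrid.embedder` is Meilisearch's existing way of selecting an embedder for semantic search. The multi-modal docs describe the actual request shape.
```python
import base64
import requests

MEILI_URL = "http://localhost:7700"
API_KEY = "MEILI_API_KEY"

# Encode the query image as base64, as described in the announcement.
with open("sneaker.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"{MEILI_URL}/indexes/products/search",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "q": "red running shoes",        # optional text part of the query
        "hybrid": {"embedder": "clip"},  # an embedder configured on the index
        "media": {"image": image_b64},   # hypothetical field for the image input
    },
    timeout=30,
)
response.raise_for_status()
print([hit.get("title") for hit in response.json()["hits"]])
```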
Explore how multi-modal search works in our docs →
Search personalization: results that adapt to each user
Not everyone searches the same way. A user shopping for running shoes shouldn't see the same results as someone looking for dress boots - but most search engines treat every query identically.
Search personalization lets you rerank results based on user context: preferences, behavior, past interactions. Powered by Cohere Rerank 3.5, it works with a simple personalize parameter. You provide the user context, Meilisearch adjusts the ranking, and your users get results that actually match their intent.
Key features:
- Personalization API: individualize search results based on behavior
- Cohere Rerank 3.5: accurate, privacy-safe personalization
- Lightweight integration: add personalization without reindexing
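Here's a minimal sketch of what a personalized query could look like. The structure of the personalize object (a free-form userContext string here) is an assumption for illustration; the personalization docs define the real schema.
```python
import requests

MEILI_URL = "http://localhost:7700"
API_KEY = "MEILI_API_KEY"

response = requests.post(
    f"{MEILI_URL}/indexes/products/search",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "q": "shoes",
        # Hypothetical user context: stated preferences plus recent behavior.
        "personalize": {
            "userContext": "trail runner, prefers wide-fit shoes, recently browsed waterproof gear"
        },
    },
    timeout=10,
)
response.raise_for_status()
print([hit.get("title") for hit in response.json()["hits"]])
```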
Talk to our team to get access to personalized search →
Resource-based pricing: predictable costs that scale with you
Usage-based pricing works when your workload is unpredictable. But as you scale, you want to know exactly what you're paying for: compute, memory, and storage, without surprise bills tied to search volume.
We've introduced resource-based pricing alongside our existing usage-based model. Choose your dedicated resources (CPU and RAM) from predefined tiers, and your costs stay predictable as you grow. You can see real-time resource usage, upgrade when you need to, and only pay for what you allocate.
Key features:
- Resource-based pricing: transparent costs based on the resources you allocate
- Predictable scaling: pay as you grow with your data
- Simple migration: no changes needed to your setup
Learn about resource-based pricing →
What's next
These releases aren't just features; they're the foundation for where we're taking Meilisearch. Conversational AI, multi-modal search, and personalization all point toward search that understands context, not just keywords.
If you're building with Meilisearch or thinking about it, now's a good time to dig in. We're here if you need help.