Traditional search is built on keywords. You type "project budget", you get results containing "project" and "budget". That works fine for a search engine. It fails completely for personal memory.
The reason: people don't remember what words they used. They remember what they meant.
Vector embeddings: the technology behind semantic search
A vector embedding is a mathematical representation of meaning. When Maatix stores a piece of text, it converts it into a list of numbers — a vector in a high-dimensional space — where similar meanings sit close together.
When you ask "what did I talk about regarding costs for the blue project?", Maatix converts your question into a vector and finds the memory entries whose vectors are nearest to it. This means it can surface a conversation where you discussed "Q3 budget overruns" even if neither "costs" nor "blue project" appeared verbatim.
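The retrieval idea above can be sketched in a few lines. This is a toy illustration, not Maatix's actual pipeline: the vectors are hand-made 4-dimensional stand-ins (real embedding models produce hundreds or thousands of dimensions), and cosine similarity is one common choice of "nearness".

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical stored memories with hand-made illustrative vectors.
memories = {
    "Q3 budget overruns discussion": [0.9, 0.1, 0.2, 0.0],
    "weekend hiking plans":          [0.0, 0.8, 0.1, 0.5],
    "vendor invoice dispute":        [0.7, 0.0, 0.4, 0.1],
}

# Pretend this is the embedded query "costs for the blue project".
query = [0.8, 0.05, 0.3, 0.0]

ranked = sorted(memories,
                key=lambda m: cosine_similarity(query, memories[m]),
                reverse=True)
print(ranked[0])  # → Q3 budget overruns discussion
```

Note that the top result shares no words with the query; it is closest purely in vector space.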
Why this makes a real difference
We talk naturally and inconsistently. "I need to cut spend", "our burn rate is too high", "the budget is out of control" — these mean the same thing but share almost no keywords.
Keyword search would miss all but the exact phrase you search for. Semantic search finds all of them.
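The keyword failure is easy to demonstrate with a naive word-overlap matcher (a simplification of real keyword search, which adds stemming and stopword handling but has the same core limitation):

```python
def keyword_overlap(a, b):
    # Naive keyword matching: the set of words two phrases share.
    return set(a.lower().split()) & set(b.lower().split())

phrases = [
    "I need to cut spend",
    "our burn rate is too high",
    "the budget is out of control",
]

# Same meaning, almost no shared words.
print(keyword_overlap(phrases[0], phrases[1]))  # → set()
print(keyword_overlap(phrases[1], phrases[2]))  # → {'is'}
```

The only overlap is a stopword, so a keyword index treats these three phrases as unrelated.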
The tradeoffs
Semantic search is computationally more expensive than keyword search: an embedding must be generated every time content is stored, and compared against the query embedding at search time. We've optimized heavily for this — search results in Maatix return in under 200ms even across years of conversation history.
It also requires careful calibration of how "near" counts as a match. Set the similarity cutoff too loose and irrelevant results surface; too strict and relevant ones get missed. Getting this right is a large part of what we work on.
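The calibration tradeoff can be seen with a toy threshold filter. The scores and the thresholds here are invented for illustration; real systems tune cutoffs (or result counts) against evaluation data.

```python
# Hypothetical (memory, similarity-to-query) scores for one search.
candidates = [
    ("Q3 budget overruns", 0.91),
    ("vendor invoice dispute", 0.78),
    ("weekend hiking plans", 0.31),
]

def search(candidates, threshold):
    # Keep only memories at least `threshold` similar to the query.
    return [memory for memory, score in candidates if score >= threshold]

print(search(candidates, 0.85))  # too strict: drops a relevant memory
print(search(candidates, 0.25))  # too loose: an irrelevant one slips in
print(search(candidates, 0.60))  # calibrated: the two relevant memories
```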
The result is a search experience that feels, for the first time, like your AI actually understands what you're looking for.