
Embeddings

Numerical representations of text that capture meaning, used to find similar content for AI retrieval.

The Full Picture

Embeddings convert text into arrays of numbers (vectors) where semantically similar content ends up close together in vector space. When you search your codebase in Cursor, it uses embeddings to find files relevant to your query — even if you didn't use the exact same words. Embeddings power RAG systems, semantic search, recommendation engines, and deduplication.
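The "close together in vector space" idea is usually measured with cosine similarity. Here's a minimal sketch in plain Python with made-up toy vectors (real embeddings come from a model and have hundreds or thousands of dimensions, but the comparison works the same way):

```python
from math import sqrt

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|) — ranges from -1 to 1;
    # closer to 1 means the vectors point the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional vectors standing in for real embeddings.
query   = [0.9, 0.1, 0.0, 0.2]   # "how do I connect to the database"
doc_db  = [0.8, 0.2, 0.1, 0.3]   # file about database connections
doc_css = [0.1, 0.9, 0.8, 0.0]   # file about CSS styling

print(cosine_similarity(query, doc_db))   # high — semantically close
print(cosine_similarity(query, doc_css))  # low — unrelated topic
```

A retrieval system embeds your query, computes this score against every stored chunk's embedding, and returns the top matches — that's the core loop behind semantic search and RAG.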

Common options: OpenAI's text-embedding-3 models, Cohere's embedding models, and open-source alternatives.


Want to go deeper? I write about the real gaps vibe coding leaves behind.