
Semantic search product · Solo

Contextual AI: Qdrant + Neo4j ready, seed round closed

Contextual AI is a solo ML startup building semantic search infrastructure. Jerome needed production-ready Qdrant and Neo4j integrations before a seed demo. He found them pre-wired in ShipAI and shipped a working demo with live vector search and graph traversal in 6 days.

Jerome Okafor

Founder & ML Engineer, Contextual AI

Background

Jerome had spent three months building his own AI infrastructure stack. He had strong ML intuition but was spending disproportionate time on integration plumbing — connecting vector stores to the AI SDK, designing graph query patterns, wiring streaming responses. When a seed investor gave him a demo date six weeks out, he audited his timeline and realized he couldn't ship a compelling demo and build infrastructure simultaneously. He needed a starting point with the hard parts already done.

The challenge

The technical requirement was specific: a semantic search demo that used Qdrant for embedding-based retrieval and Neo4j for relationship traversal — both connected to a streaming AI interface, both using real production APIs, not mocked data. Most boilerplates didn't touch vector stores or graph databases. He needed something pre-integrated with the Vercel AI SDK that he could build on, not around.

How they built it

Qdrant integration operational on day one

ShipAI ships with Qdrant pre-configured for vector storage and retrieval, connected to the AI SDK's tool use system. Jerome pointed the integration at his target collection, defined the embedding schema for his search domain, and had live vector search running in the first day. The embedding pipeline, similarity search, and result formatting were already handled — he wrote the domain logic for his specific retrieval use case.

Neo4j graph traversal for relationship context

The semantic search demo needed to surface not just similar documents but related concepts and their connections. The Neo4j integration provided graph traversal queries that Jerome adapted to his knowledge graph schema. This was the most specialized component: he spent two days on the graph query logic, but he started from working connection patterns rather than from raw configuration.

Streaming AI interface for demo polish

The demo needed to feel responsive and professional. Jerome used the streaming AI handler to build the search interface — results streamed as they arrived, with real-time updates as the AI synthesized retrieved context. The streaming infrastructure was pre-built; he focused on the presentation layer.

Demo environment with real data

On day five, Jerome loaded his knowledge graph and document corpus into the live Qdrant and Neo4j instances. The Docker Compose environment meant he could run everything locally for testing, then deploy to Vercel for the investor demo with no infrastructure changes. The demo ran on real data from a public dataset relevant to the investor's domain.

Outcomes

Demo shipped in 6 days

A working semantic search product with live vector retrieval, graph traversal, and streaming AI interface — six days from starting with ShipAI.

Seed round closed on the demo

Investors specifically noted the product quality and technical credibility of the demo. The lead investor cited the live vector search as evidence of production readiness.

Zero integration time for Qdrant and Neo4j

Both database integrations were pre-configured and connected to the AI SDK. Jerome wrote domain logic on top of working infrastructure rather than building the infrastructure itself.

Three months of infrastructure work compressed to one week

The infrastructure Jerome had been building for three months — vector storage, graph queries, AI streaming — was replaced by ShipAI's pre-wired foundation in days.

In their own words

I'd spent three months building infrastructure that turned out to be available off the shelf. That's a painful realization, but also a useful one. The Qdrant and Neo4j integrations are exactly what I would have built, just already built. I focused six days entirely on the parts that only I could do — the knowledge graph design and the search UX — and closed the round.

Jerome Okafor

Founder & ML Engineer, Contextual AI

We needed Qdrant for semantic search and Neo4j for relationship graphs before our seed demo. I was dreading the integration work. ShipAI had both pre-wired with the Vercel AI SDK. We shipped a working demo with live vector search in 6 days. Our investors specifically called out the product quality.

Jerome Okafor

Frequently asked questions

How is Qdrant integrated with the Vercel AI SDK?

ShipAI wires Qdrant to the AI SDK's tool use system — semantic search is exposed as a tool that the AI can call during a conversation. The embedding, search, and result formatting are encapsulated in the tool implementation. Developers configure the collection and query parameters without touching the underlying SDK plumbing.

What does the Neo4j integration include?

The integration includes connection configuration, query helper utilities, and example traversal patterns. Jerome used the query helpers as templates and adapted them to his knowledge graph schema. The connection management and error handling were already handled.

Is this setup realistic for production beyond a demo?

Yes. Jerome used the same codebase from demo to production. After closing the seed round, he extended the same Qdrant and Neo4j foundations to his full product without a rewrite. The production scaling considerations — connection pooling, query optimization — were addressed as he grew.

Keywords

contextual ai case study · semantic search product case study · shipai.today customer story · next.js saas case study · ai saas launch story

https://shipai.today/cases/jerome-okafor

Ready to write your own case study?

Start from the same foundation.

Every outcome in these case studies started from ShipAI.today. Production auth, billing, AI infrastructure, admin panel — all included.

  • 12 builders and counting
  • All features from these case studies included
  • Full landing source + SEO infrastructure