Forward Deployed AI Engineer
Omnilex
Information Services
Zürich
- Employment type: Full-time
- CHF 8'000 – CHF 12'000 (as stated by the company)
- Hybrid
About this job
🧭 Who You Are
You like the last mile—the part where an AI product stops being a demo and starts surviving real life: inconsistent documents, weird naming conventions, strict access rules, stakeholders who notice every edge case, and workflows that were never designed for “AI assistants.”
You’re the person who can sit with a legal team, understand what they actually need, translate that into system behavior, and then implement it cleanly. You enjoy being the connective tissue between customers, domain experts, and the core engineering team—shipping practical improvements and leaving behind crisp documentation so the next rollout is smoother.
🏢 About Omnilex
Omnilex is an AI legal tech startup originating from ETH Zurich. Our interdisciplinary team (14+ people) helps legal professionals do better research by combining external legal sources, customer-internal knowledge, and our own AI-first legal commentaries—delivered through search + LLM workflows built for professional-grade reliability.
Tasks
🔧 What You’ll Do
As a Forward Deployed AI Engineer, your mission is to bring Omnilex into customer environments and make it work exceptionally well—then turn what you learn into reusable product capabilities.
Customer rollouts & customization (the heart of the job)
- Lead technical onboarding for new customers: ingest documents, build indexes, map metadata (jurisdiction, authority, recency), and run validation checks.
- Tune retrieval and reranking behavior to match customer expectations (practice area focus, internal taxonomies, document patterns, relevance definitions).
- Deliver customer-specific UX and workflow adaptations: templates, default filters, jurisdiction presets, citation formatting, permission-aware retrieval, and customized result views.
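The metadata-mapping step during onboarding might look roughly like the sketch below. The field names (`jurisdiction`, `authority`, `decidedAt`) and the `RawDoc` shape are illustrative assumptions, not Omnilex's actual schema:

```typescript
// Illustrative only: normalize heterogeneous customer documents into a
// common metadata shape used for filtering and recency-aware ranking.
interface RawDoc {
  title: string;
  court?: string; // naming conventions vary wildly per customer
  date?: string;  // inconsistent date formats are common in the wild
}

interface IndexedDoc {
  title: string;
  jurisdiction: "CH" | "DE" | "US" | "unknown";
  authority: string;
  decidedAt: Date | null; // null when the date cannot be parsed
}

// Hypothetical lookup table; a real rollout would build this per customer.
const COURT_JURISDICTION: Record<string, IndexedDoc["jurisdiction"]> = {
  BGer: "CH",   // Swiss Federal Supreme Court
  BGH: "DE",    // German Federal Court of Justice
  SCOTUS: "US",
};

function mapMetadata(doc: RawDoc): IndexedDoc {
  const authority = (doc.court ?? "unknown").trim();
  const parsed = doc.date ? new Date(doc.date) : null;
  return {
    title: doc.title,
    jurisdiction: COURT_JURISDICTION[authority] ?? "unknown",
    authority,
    decidedAt: parsed && !isNaN(parsed.getTime()) ? parsed : null,
  };
}
```

Validation checks then run over the normalized records (e.g. flagging documents whose jurisdiction or date could not be resolved) before the index goes live.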
Production-grade LLM workflows
- Adjust prompting and context strategies to meet strict requirements (grounding, traceability, citation style, explanation depth, fallback behavior).
- Build and enforce guardrails: provenance tracking, source-grounded generation, “no source → no statement” rules, and risk-aware uncertainty patterns suitable for legal contexts.
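A "no source → no statement" rule can be sketched as a post-generation filter; the `Statement` shape and function name below are hypothetical, chosen only to show the idea:

```typescript
// Illustrative guardrail: drop any generated statement that cannot be
// traced back to passages that were actually retrieved for this query.
interface Statement {
  text: string;
  sourceIds: string[]; // passage ids the model claims this is grounded on
}

function enforceGrounding(
  statements: Statement[],
  retrievedIds: Set<string>,
): Statement[] {
  return statements.filter(
    (s) =>
      // no citation → no statement; and every cited id must be a real retrieval
      s.sourceIds.length > 0 &&
      s.sourceIds.every((id) => retrievedIds.has(id)),
  );
}
```

In a legal context the dropped statements would typically be replaced by an explicit "insufficient sources" response rather than silently omitted.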
Field iteration & quality loops
- Create small but high-signal evaluation sets per customer (gold questions, acceptance criteria, “cannot fail” scenarios).
- Perform fast failure analysis and ship improvements: chunking changes, deduping, reranker adjustments, query interpretation tweaks, caching, and routing strategies.
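A per-customer evaluation set can stay very small and still be high-signal. One minimal sketch, with all names and shapes assumed for illustration:

```typescript
// Illustrative gold-question harness: each case lists the documents an
// acceptable answer must surface, and whether missing it blocks release.
interface GoldCase {
  question: string;
  mustCite: string[];  // doc ids an acceptable answer must retrieve
  cannotFail: boolean; // release-blocking scenario
}

interface EvalResult {
  passRate: number;
  blockingFailures: string[]; // failed questions from cannot-fail cases
}

function runEval(
  cases: GoldCase[],
  retrieve: (q: string) => string[], // the system under test
): EvalResult {
  let passed = 0;
  const blockingFailures: string[] = [];
  for (const c of cases) {
    const got = new Set(retrieve(c.question));
    if (c.mustCite.every((id) => got.has(id))) passed++;
    else if (c.cannotFail) blockingFailures.push(c.question);
  }
  return {
    passRate: cases.length ? passed / cases.length : 1,
    blockingFailures,
  };
}
```

Running this before and after each chunking or reranker change turns "it feels better" into a pass-rate delta and a hard gate on the "cannot fail" scenarios.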
Latency, cost, and operational reliability
- Keep response times and usage costs sane through batching, caching, early exits, and practical fallback paths.
- Track quality signals and usage patterns; convert feedback into measurable fixes and clear acceptance tests.
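The caching and fallback bullets above can be sketched as a minimal TTL cache with a stale-value escape hatch; a real deployment would use shared infrastructure, but the early-exit logic has the same shape (all names here are illustrative):

```typescript
// Illustrative: serve from cache when fresh, otherwise call the expensive
// path; on failure, fall back to a stale cached value rather than erroring.
interface Entry<T> {
  value: T;
  storedAt: number;
}

class TtlCache<T> {
  private store = new Map<string, Entry<T>>();
  constructor(private ttlMs: number) {}

  async getOrCompute(key: string, compute: () => Promise<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit && Date.now() - hit.storedAt < this.ttlMs) {
      return hit.value; // early exit: skip the expensive call entirely
    }
    try {
      const value = await compute();
      this.store.set(key, { value, storedAt: Date.now() });
      return value;
    } catch (err) {
      if (hit) return hit.value; // stale-but-usable fallback path
      throw err;
    }
  }
}
```

The same pattern extends naturally to batching (coalescing concurrent misses for the same key) and to routing cheaper models for queries that a cache or heuristic can already answer.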
Cross-team execution & knowledge capture
- Work closely with Customer Success and legal experts to convert pain into engineering work.
- Write deployment playbooks and integration “recipes” so customer solutions become repeatable patterns over time.
Requirements
✅ Must-haves
- Strong practical experience building or adapting search/retrieval systems in production (hybrid retrieval, reranking, indexing, query understanding).
- Experience taking LLM features from prototype to stable, real-world usage.
- Solid TypeScript/Node.js skills (our core stack).
- Hands-on experience with at least one of: Azure AI Search, pgvector/PostgreSQL, OpenSearch/Elasticsearch (or comparable systems).
- Strong engineering judgment: debugging skills, performance tuning, careful edge-case handling, and operational thinking.
- Comfortable working directly with customers: deep technical sessions, trade-off explanations, and clear written documentation.
- Fluent English; available full-time.
- Hybrid setup: at least two days per week on-site in Zurich.
➕ Nice-to-haves
- German proficiency (many sources are in German, as are many stakeholder conversations).
- Experience integrating customer document sources and pipelines (connectors, ETL, access controls).
- Experience with lightweight evaluation processes (human labeling loops, basic agreement checks, simple dashboards).
- Familiarity with sparse + dense retrieval approaches (BM25 variants included).
- Experience running and operating services (Docker a plus).
- Familiarity with Azure / NestJS / Next.js.
- Exposure to Swiss / German / US legal systems.
Benefits
- Tangible customer impact: your work directly affects daily trust and adoption inside legal teams.
- High ownership: you run deployments end-to-end and help define reusable solution patterns.
- Fast feedback loops: you’ll see real failure modes early and influence product direction with evidence.
- Compensation: CHF 8’000–12’000 per month + ESOP, depending on experience and skills.
We’re excited to meet builders who enjoy getting close to customers and shipping improvements that make legal research faster, more accurate, and more trustworthy. If you like turning real-world pain points into robust, repeatable solutions, we’d love to hear from you. Apply by pressing the Apply button.