AI Developer Advocate

    Posted Date: Aug 18, 2025
    Closing Date: Sep 9, 2025
  • Full Time
  • San Francisco, CA
  • $150,000 – $200,000 USD / Year

Eventual

Powering the future of AI with multimodal data at scale.

Job Description

Every breakthrough AI application, from foundation models to autonomous vehicles, relies on processing massive volumes of images, video, and complex data. But today’s data platforms (like Databricks and Snowflake) are built on top of tools made for spreadsheet-like analytics, not the petabytes of multimodal data that power AI. As a result, teams waste months on brittle infrastructure instead of conducting research and building their core product.

Eventual was founded in 2022 to solve this. Our mission is to make querying any kind of data—images, video, audio, text—as intuitive as working with tables, and powerful enough to scale to production workloads. Our open-source engine, Daft, is purpose-built for real-world AI systems: coordinating with external APIs, managing GPU clusters, and handling failures that traditional engines can’t. Daft already powers critical workloads at companies like Amazon, Mobileye, Together AI, and CloudKitchens.
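
To give a concrete sense of what "querying multimodal data like tables" means, here is a minimal sketch of a Daft query in Python. It is illustrative only, assuming Daft's URL and image expressions (url.download, image.decode, image.resize) as documented for the open-source project; the image URLs are placeholders, not real data.

    # Minimal Daft sketch: image URLs are just another column.
    # Assumes `pip install daft`; the URLs below are placeholders.
    import daft

    # A small DataFrame mixing tabular metadata with image URLs.
    df = daft.from_pydict({
        "label": ["cat", "dog"],
        "url": [
            "https://example.com/cat.jpg",  # placeholder
            "https://example.com/dog.jpg",  # placeholder
        ],
    })

    # Download, decode, and resize the images as ordinary DataFrame
    # expressions; Daft evaluates them lazily and in parallel.
    df = (
        df.with_column("image", daft.col("url").url.download().image.decode())
          .with_column("thumbnail", daft.col("image").image.resize(64, 64))
    )

    df.show()

The same query scales from a laptop to a distributed cluster, which is the kind of workflow this role will be demonstrating.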

We’ve assembled a world-class team from Databricks, AWS, Nvidia, Pinecone, GitHub Copilot, Tesla, and more, quadrupling in size within a year. With seed and Series A funding from Felicis, CRV, Microsoft M12, Citi, Essence, Y Combinator, Caffeinated Capital, Array.vc, and angel investors including the co-founders of Databricks and Perplexity, we’re now looking to double the team. Join us: Eventual is just getting started.

Please note: We’re looking for individuals who are excited to be a part of a tight-knit team working together 4 days/week in our SF Mission District office.

Your Role

As our AI Developer Advocate, you will champion AI workload use cases and drive adoption of Daft, our distributed query engine for multimodal data. You’ll work directly with users to build compelling end-to-end demonstrations that showcase Daft as the definitive engine for any modality of data at any scale, turning complex multimodal scenarios into accessible, reproducible examples. Beyond producing developer-facing content, you’ll also directly shape Daft’s roadmap with what you learn from our users.

Key Responsibilities

  • Work closely with users to understand their multimodal data challenges and translate them into compelling use cases

  • Develop tutorials, videos, and technical demos showcasing use cases such as:

    • Data ingest for multimodal RAG systems (documents, images, videos, and code repositories for AI assistants)

    • Agent data orchestration enabling AI agents to reason over massive unstructured data repositories

    • Foundation model training pipelines preparing multimodal datasets for LLM and vision models

    • Multimodal generation workflows combining text, image, and video generation with complex dependencies

    • Real-time ingestion of user documents, conversations, and media for AI assistants

  • Present at conferences, meetups, and events to grow the developer community

  • Drive feature requirements and improvements for the open-source project based on user feedback

What We Look For

  • Strong background in AI/ML and multimodal data processing

  • Experience with distributed systems and data engineering

  • Proven track record creating technical content and speaking at events

  • Ability to translate complex technical concepts into clear, actionable guidance

  • Open-source contribution experience preferred

Perks & Benefits

  • In-person team (4x a week in office)

  • Competitive salary and equity

  • Catered lunches and dinners (SF employees)

  • Commuter benefit

  • Team building events & poker nights

  • Health, vision, and dental coverage

  • Flexible PTO

  • Latest Apple equipment

  • 401(k) plan with match

To apply for this job, please visit smartapply.indeed.com.
