Agiloft - AI Data Platform Lead
Requirements
• Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related technical field required.
• 7–10 years of experience in data engineering, data architecture, or a related technical function, with at least 3 years focused on AI or ML data infrastructure.
• Deep expertise in modern data stack technologies: Snowflake required; experience with dbt, Airflow or equivalent orchestration, and ELT/ETL pipeline design.
• Demonstrated experience designing data architecture for AI consumption, including vector databases, embedding pipelines, RAG systems, or feature stores, not only for BI and reporting.
• Strong data modeling skills across multiple paradigms: dimensional modeling, normalized models, and AI-optimized schemas for agent and model consumption.
• Experience building and operating real-time or near-real-time data pipelines for operational AI use cases.
• Proficiency in Python and SQL; experience with cloud data infrastructure on AWS required.
• Experience designing data access patterns and governance controls for AI systems, including least-privilege access, audit logging, and AI-specific data security considerations.
• Demonstrated ability to own cross-functional technical programs, translating requirements from multiple business domains into coherent, prioritized data architecture decisions.
• Strong communication skills, with the ability to make complex data architecture decisions legible to non-technical executives and cross-functional stakeholders.
• Experience in private equity-backed SaaS organizations.
• Experience with agentic AI frameworks (LangGraph, Mastra, or equivalent) and the data infrastructure requirements they create.
• Experience building or operating RAG architectures at production scale, including vector store selection, chunking strategy, retrieval optimization, and evaluation.
• Experience with agent memory architectures and state persistence design for multi-step AI workflows.
• Familiarity with AI governance and compliance requirements for data used in automated decision-making.
• Experience supporting investment board or executive-level AI progress reporting from a technical infrastructure perspective.
• Experience with Tines or equivalent no-code/low-code orchestration platforms for simple agent pipelines.
• Exposure to contract lifecycle management, legal tech, or professional services data domains.
Benefits
Agiloft offers a comprehensive benefits package for US employees including but not limited to the following:
• Medical, dental, and vision insurance
• Short-term and long-term disability
• Life insurance and AD&D
• Supplemental life insurance (Employee/Spouse/Child)
• Health care and dependent care Flexible Spending Accounts
• 401(k) with company match
• Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position.
• Paid parental leave
• Voluntary benefits including pet insurance
Responsibilities
• Own the end-to-end data architecture for the Data Warehouse Foundation, designing for AI-first consumption across GPT assistants, AI agents, predictive models, and operational intelligence, in addition to BI and reporting.
• Lead data modeling across all 11 departments, designing canonical enterprise data models that serve cross-functional AI and analytics use cases without duplication or fragmentation.
• Design and implement the contextual intelligence layer, including RAG architecture, vector store strategy, knowledge base ingestion pipelines, and document and unstructured data processing, that powers Agiloft's enterprise knowledge system.
• Build and maintain the agentic data integration layer: real-time and near-real-time data access patterns, agent memory and state persistence design, orchestration data requirements, and agent output integration back into the warehouse.
• Own the AI/ML feature layer (feature engineering strategy and standards, training data pipeline design, feature store architecture, and model output integration), enabling predictive analytics across churn, pipeline health, and operational forecasting.
• Design and govern the operational data and GPT context layer, including structured context feed design for GPT assistants, data freshness and access SLAs for AI use cases, and cross-departmental data reuse standards.
• Lead the Data Warehouse Foundation build in partnership with the external consulting team: setting architecture standards, reviewing implementation against AI-first principles, and ensuring the five-wave build plan delivers a foundation that serves the full intelligence architecture.
• Design and manage data ingestion, ELT/ETL, and orchestration pipelines across all source systems, ensuring reliability, performance, and cost efficiency.
• Establish and enforce AI data engineering standards across the organization: prompt-adjacent data design, agent data access patterns, reusable pipeline components, and quality assurance processes.
• Own data access policy design and least-privilege access controls in partnership with Security, ensuring data made available to AI systems is governed, auditable, and compliant.
• Define data quality standards and monitoring processes for AI-consumed data, where quality failures have direct impact on model and agent performance.
• Partner with the Principal Data and Integrations Architect on infrastructure design, ensuring data modeling and AI consumption requirements are incorporated into pipeline and architecture decisions from the start, not retrofitted after build.
• Partner with the VP FP&A and Manager of BI & Data to ensure the semantic and metrics layers are technically sound and serve both AI use cases and reporting requirements.
• Manage the AI Ops data architecture roadmap, translating business and AI use case requirements from all 11 departments into sequenced, prioritized technical work.
• Maintain documentation and knowledge transfer standards for all data architecture, pipelines, and integration patterns, ensuring AI Ops-built infrastructure is reusable, auditable, and not dependent on any single individual.
• Collaborate with the AI Agent Engineer and GPT & AI Systems Lead to ensure data infrastructure supports agent orchestration, retrieval-augmented generation, and multi-step reasoning workflows.
• Define the roadmap for data science and AI data work in partnership with the VP of AI Operations; this role does not take direction from IT on resource allocation or prioritization, and all roadmapping is managed within AI Operations.
• Evaluate and recommend data tooling, frameworks, and platform components in alignment with AI Ops' technology-agnostic, build-for-leverage approach.