Senior Data Engineer
Requirements
• You are a curious engineer experienced in backend, platform, and data engineering who cares about building reliable, simple, easy-to-use systems. You don't need to know every tool on our list perfectly; we value your ability to learn and your foundational engineering skills.
• A solid grasp of system design and data structures, coupled with a foundational software engineering background. You write code that is tested, modular, and readable (we use Python, TypeScript, and Go).
• Proficiency with cloud-native development (AWS or Azure) and containerisation (Kubernetes/Docker).
• Experience with cloud data warehouses (Snowflake/Postgres) and data warehouse APIs (GraphQL).
• Experience in data engineering, including building and maintaining data pipelines. You treat data pipelines like software.
• Experience implementing CI/CD (CircleCI/GitHub Actions/GitLab), automated testing, and data observability (Datadog).
• You can articulate ideas clearly to both engineers and product partners.
• You have experience with platform engineering principles and can demonstrate empathy for users, prioritising usability, configurability, and long-term sustainability.
• You care deeply about code quality, testing, and documentation, and you aim to build systems that are easy to understand and operate.
• You are comfortable refactoring monolithic data structures into modular services that prioritise ease of use for the end consumer.
• Interest or experience in building infrastructure for GenAI, such as vector databases or MCP (Model Context Protocol) servers.
• Familiarity with event-driven architectures is a bonus.
Responsibilities
• Data Modelling & Refactoring
  - Take ownership of our data architecture. You will define schemas for new entities and refactor existing models to improve performance and clarity as the product evolves.
• Build Interfaces & APIs
  - Create APIs and connectors that allow users to access data easily without needing deep infrastructure knowledge.
  - Enable the "Universal Data Layer" that powers our AI agents and internal services.
• Enable AI/ML (via IDP)
  - Build features in our Internal Developer Platform that make it easy to deploy and manage AI models.
  - Remove the friction between "training a model" and "running it in production."
• Ensure Security and Privacy by Design
  - Automate security and compliance checks so that data is classified and safe by default.
  - Replace manual approval gates with automated guardrails, ensuring speed without compromising safety.
• Build Domain-Driven Services & APIs
  - Encapsulate business logic: design and develop services that wrap complex business logic into clean, reusable APIs.
  - Simplify data consumption: create a "Productised Data" layer, making it easy for non-technical stakeholders to pull high-fidelity reports and build dashboards without needing to understand the underlying raw tables.
  - Refactor for scale: transition legacy data scripts and coupled domains into robust, version-controlled services that the entire company can rely on.
Benefits
• Time off - 27 days holiday, plus 5 additional days off (1 life event day, 2 volunteer days, and 2 company-wide wellbeing days as part of our M-Powered Weekend), plus 8 bank holidays per year
• Health & Wellness - private medical insurance with Bupa, a medical cashback scheme, life insurance, gym membership and wellness resources through Wellhub, and access to Spill, an all-in-one mental health support service
• Hybrid work offering - for most roles we collaborate in the office three days per week, with the exception of Coaches and Instructors, who collaborate in the office once a month