wagey.gg

synthesia - Senior Data Engineer

London, Greater London, United Kingdom2mo ago
In Office · Senior · EMEA · Cloud Computing · Data Analytics · Data Engineer · Senior Data Engineer · SQL · Python · Airflow · Terraform · AWS



Requirements

• 5+ years of experience as a Data Engineer or in a closely related role, with a proven track record of building and operating production data systems.
• Experience working in an early-stage or scaling data function; you're comfortable taking ownership and wearing multiple hats when needed.
• Strong foundations in software engineering and data-modelling best practices, with the ability to design systems that are maintainable, scalable, and easy for others to build on.
• Deep expertise in SQL, and solid experience using Python or similar languages to build data pipelines, tooling, and orchestration (Airflow).
• Hands-on experience managing cloud infrastructure using infrastructure-as-code (e.g. Terraform) on AWS, GCP, or similar platforms.
• A pragmatic approach to data platform design, with an eye for performance, cost efficiency, and operational reliability.
• Excellent communication skills: you can work effectively with technical and non-technical stakeholders to gather requirements, explain trade-offs, and communicate data team needs.
• A product-oriented mindset, with an understanding of how data can shape decision-making and accelerate company growth.
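To make the pipeline-building requirement concrete, here is a minimal, hypothetical Python sketch of one everyday pipeline step: flattening a semi-structured JSON record into tabular columns before loading it into a warehouse. The function name, separator convention, and sample record are all invented for illustration; they are not part of the role or synthesia's stack.

```python
import json

def flatten_event(event: dict, parent_key: str = "", sep: str = "_") -> dict:
    """Recursively flatten a nested (semi-structured) record into one flat row."""
    row = {}
    for key, value in event.items():
        col = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Descend into nested objects, prefixing column names with the path.
            row.update(flatten_event(value, col, sep))
        else:
            row[col] = value
    return row

raw = '{"id": 1, "user": {"name": "Ada", "plan": "pro"}, "ts": "2024-05-04"}'
row = flatten_event(json.loads(raw))
# row -> {"id": 1, "user_name": "Ada", "user_plan": "pro", "ts": "2024-05-04"}
```

In production this kind of transform would typically run as a task inside an orchestrator such as Airflow rather than as a standalone script.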

Responsibilities

• Architect and scale robust, end-to-end data pipelines that ingest and transform complex semi-structured and structured data into our Snowflake data warehouse.
• Own the evolution of our dbt project, implementing modular modelling patterns and other best practices to ensure a "single source of truth" for the entire organisation.
• Manage platform infrastructure in Snowflake, AWS, and other tools.
• Continuously optimise warehouse performance and cost by diagnosing bottlenecks, tuning inefficient queries, and improving how compute resources are used as we scale.
• Bridge the gap between experimental data science workflows and production, building the infrastructure and orchestration needed to deploy and monitor batch ML jobs.
• Drive best practices in data security, governance, and compliance, particularly with regard to AI.
• Partner with cross-functional stakeholders to understand data requirements and translate them into technical solutions.
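The "single source of truth" responsibility above amounts to computing shared aggregates once, in the warehouse, instead of re-deriving them in each downstream tool. The hedged sketch below uses an in-memory SQLite database purely for illustration (the real work would target Snowflake via dbt models); the table, columns, and data are invented.

```python
import sqlite3

# Illustrative only: the pattern of pushing aggregation into the warehouse
# applies broadly, even though Snowflake/dbt specifics differ from SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# One canonical aggregate per user, computed in SQL rather than
# separately in each application that needs it.
totals = dict(conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id"
))
# totals -> {1: 15.0, 2: 7.5}
```

In a dbt project, the equivalent query would live in a versioned model file so every team reads the same definition of "total amount per user".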

