3+ years of experience in Data Infrastructure Engineer / Data Engineer / MLOps Engineer roles;
Have work experience or troubleshooting experience in the following areas:
- Analytical Databases: configuration, troubleshooting (Snowflake, Redshift, BigQuery)
- Data Pipelines: deployment, configuration, monitoring (Kafka, Airflow or similar)
- Data Modeling: DRY and structured approach, applying performance tuning techniques
- Containerizing applications and code: Docker, k8s
Fluent in SQL and Python;
At least Intermediate level of English;
Have experience in researching and integrating open-source technologies (data ingestion, data modelling, BI reporting, LLM applications, etc.);
Ability to identify performance bottlenecks;
Teamwork: GitOps, Continuous Integration, code reviews;
Technical university graduate.
Responsibilities
Enhance the Data team with architectural best practices and low-level optimizations
Help evolve data integration pipelines (Debezium, Kafka, dlt), data modelling (dbt), database engines (Snowflake), MLOps (Airflow, MLflow), BI reporting (Metabase, Observable, Text-2-SQL), and reverse ETL syncs (Census)
Support business units with feature requests, bug fixes, and data quality issues
Enforce code quality, automated testing, and a consistent code style
Benefits
Wheely expects the very best from our people, both on the road and in the office. In return, employees enjoy flexible working hours, stock options and an exceptional range of perks and benefits.
Office-based role located in West London
Competitive salary & equity package
Medical insurance, including dental
Life and critical illness insurance
Monthly credit for Wheely journeys
Lunch allowance
Professional development subsidies
Cycle to work scheme
Top-notch equipment
Relocation allowance (dependent on role level)
Wheely has an in-person culture but allows flexible working hours and work from home when needed.
All of your personal information will be collected, stored, and processed in accordance with Wheely’s Candidate Privacy Notice.