wagey.gg v1.0-4558734-20-Apr

perk - Analytics Engineer

Remote - London, United Kingdom · 2w ago
Remote · Mid · EMEA · Oil & Gas · Logistics · Analytics Engineer · Data Analyst · SQL · dbt · Reporting · Snowflake · Redshift


Requirements

• Experience: 3+ years in a data-focused role (e.g., Analytics Engineer, Data Analyst, BI Developer).
• SQL Mastery: Expert-level proficiency in writing and optimising complex SQL queries.
• Data Transformation Tooling: Hands-on experience with dbt (Data Build Tool) or a similar data transformation framework is essential.
• Data Warehousing: Experience with cloud-based data warehouses such as Snowflake, Google BigQuery, or Amazon Redshift.
• Data Modelling: Solid understanding of data warehousing concepts, ETL/ELT principles, and dimensional modelling techniques.
• Version Control: Proficiency with Git for collaborative development and version control.
• Reporting: Familiarity with reporting/BI tools such as Looker.
• Python: Expertise in Python for automation, integration, and orchestration.
• Orchestration: Experience with orchestration tools such as Airflow.
• Engineering Practices: Knowledge of modern software engineering practices applied to data (e.g., modularity, code review, testing).
• Our vision is a world where TravelPerk serves as the platform for human connection in real life (IRL). We take an IRL-first approach to work, with our team working together in person three days a week. This role therefore requires you to be based within commuting distance of our London or Barcelona hub (https://www.travelperk.com/contact/). We fundamentally believe that meeting in real life improves connectivity, productivity, and creativity, and ultimately makes us a great place to work.
• For certain roles, we can help with relocation from anywhere in the world. English is the official language at the office, so please submit your resume in English if you choose to apply. Don't forget to submit an updated portfolio and/or resume.

Responsibilities

• ELT Pipelines: Design, develop, and maintain efficient, scalable transformation workflows using tools like dbt, turning raw data loaded into our data warehouse (Snowflake) into clean, ready-to-use data models.
• Data Modelling: Apply best-practice data modelling and software engineering techniques (e.g., CI/CD, testing, lineage) so that data structures are optimised for performance, accuracy, and ease of use in reporting and analytical applications.
• Data Quality and Testing: Write comprehensive data quality checks, tests, and monitoring scripts to ensure the accuracy, completeness, and reliability of all transformed data assets. Establish and maintain documentation for all data transformations and models.
• Collaboration: Work closely with Data Analysts and business users to understand their reporting needs and optimise data models to support their analytical use cases. Collaborate with Data Engineers on data ingestion strategies and platform optimisation.
• Performance Optimisation: Tune and optimise SQL queries and data models to reduce latency and improve the performance of our data warehouse and downstream applications.
• Tool Adoption: Champion the adoption of modern data stack tools and practices (e.g., Git, CI/CD).
• Visualisation: Build scalable, compelling data visualisations in Looker that can be used by large teams.
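For a sense of the data-quality work described above: dbt ships with built-in schema tests such as not_null and unique that assert properties of a model's columns. The sketch below is a hypothetical, plain-Python illustration of what those two checks do; the orders model and its column names are invented examples, not part of the job post.

```python
# Illustrative sketch of dbt-style data-quality checks (not_null, unique).
# The `orders` model and its columns are hypothetical examples.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None (like dbt's not_null test)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once (like dbt's unique test)."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Example usage against a hypothetical `orders` model:
orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},
]
null_failures = check_not_null(orders, "customer_id")  # one failing row
dupe_failures = check_unique(orders, "order_id")       # [2]
```

In a real dbt project these assertions would live declaratively in a schema YAML file next to the model, and `dbt test` would run them as SQL against the warehouse rather than in Python.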

