Taxfix

Taxfix - Staff Data Engineer

Berlin, Germany · + Equity · 1mo ago
In Office · Staff · EMEA · Cryptocurrency · Fintech · Data Engineer · Backend Engineer · Documentation · Python · SQL · Data Quality · Snowflake


Requirements

Must-have

• 6+ years of experience in Data Engineering or a similar role (backend engineer working on data-intensive systems counts)
• Strong Python skills for data pipeline development — you write production code, not just scripts
• Strong SQL skills — window functions, CTEs, and query optimization are second nature
• Experience with event-driven data pipelines — CQRS, event ordering, idempotency, and the difference between initial load and incremental processing
• Expert with Airflow — you’ve built DAGs with proper task dependencies, retries, and monitoring
• Experience with Snowflake and/or BigQuery — you understand their architecture, performance characteristics, and how they differ from each other and from other analytical or operational tools
• Cloud platform experience — you’ve worked with GCP (GCS, Dataflow, Dataproc, etc.) or equivalent AWS/Azure services and understand how to manage cloud resources at scale
• Infrastructure as code — experience with Terraform, Helm, or similar tools for provisioning and managing cloud environments
• Kubernetes and Docker containerization — you package and deploy your own work
• Data quality mindset — you profile data, validate assumptions, build checks, and don’t trust that “the data looked clean”
• Data for AI readiness — you understand what it takes to prepare data for ML and AI: governance, lineage, privacy controls, and reproducibility
• Awareness of data privacy requirements — you can identify PII, understand GDPR, and know how to implement anonymization and deletion across multiple data layers
• AI-enabled engineering practices — you actively use AI assistants and code-generation tools to accelerate development and delivery, and you can establish standards for their effective use across the team

Nice-to-have

• Track record of coaching and growing engineers — you’ve helped teammates level up through pairing, code reviews, or structured mentorship
• Exposure to ML platforms (Vertex AI, SageMaker) or feature stores — you’ve helped ML teams get from raw data to production models
• Hands-on ML experience — training models, running experiments, and understanding the full lifecycle from data preparation to deployment
• Experience with Segment.io or similar event collection and customer data platforms
• Advanced data modeling — SCD Type 2, data vault, anchor modeling, or other patterns
• Advanced privacy engineering — crypto-shredding, differential privacy, consent management systems
• Domain knowledge in fintech, tax, or regulated industries — understanding of compliance-driven data requirements, audit trails, and data retention policies
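The event-pipeline requirements above (event ordering, idempotency, initial vs. incremental load) boil down to one core property: replaying or reordering a batch of change events must leave the final state unchanged. Here is a minimal, hypothetical sketch of that idea; the function name, event shape, and `version` field are illustrative assumptions, not Taxfix code:

```python
from typing import Iterable


def merge_events(state: dict, events: Iterable[dict]) -> dict:
    """Idempotently fold change events into a keyed state table.

    Each event carries an entity id, a monotonically increasing
    version (e.g. a CDC log sequence number), and a payload.
    Replaying the same batch, or receiving events out of order,
    leaves the final state unchanged.
    """
    for event in events:
        key, version = event["id"], event["version"]
        current = state.get(key)
        # Keep an event only if it is strictly newer than what we
        # already have; equal or older versions are replays / late arrivals.
        if current is None or version > current["version"]:
            state[key] = {"version": version, "payload": event["payload"]}
    return state


events = [
    {"id": "u1", "version": 2, "payload": {"plan": "pro"}},
    {"id": "u1", "version": 1, "payload": {"plan": "free"}},  # late arrival
    {"id": "u2", "version": 1, "payload": {"plan": "free"}},
]
state = merge_events({}, events)
state = merge_events(state, events)  # replay: idempotent, state unchanged
```

An initial load is then just a replay from version zero, while incremental processing feeds only the new tail of the event log through the same function.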

Responsibilities

• Build and maintain ingestion pipelines that capture changes from application databases, APIs, and SaaS sources, and deliver clean, analytics-ready tables to our cloud data warehouse
• Design data models with proper layering that handle real-world data complexity: out-of-order events, schema evolution, late arrivals, and backfills
• Own and evolve cloud platform infrastructure — manage GCP resources (GCS, Dataflow, Dataproc), provision and maintain environments with Terraform, and ensure the platform is cost-efficient and scalable
• Own data quality monitoring — build validation, monitoring, and alerting that catches problems before downstream consumers do
• Implement privacy and compliance controls — anonymization, pseudonymization, access policies, and deletion propagation (GDPR right to be forgotten) across raw and derived layers
• Prepare data for ML and AI use cases — build governed, privacy-safe datasets and feature pipelines that ML engineers and data scientists can use for model training, evaluation, and production inference
• Operate and improve our orchestration layer — scheduling, retries, SLA tracking, and observability for data pipelines
• Define and raise the bar on engineering standards — code quality, testing, CI/CD, documentation, and infrastructure as code
• Evaluate and adopt new technologies that help the team achieve its goals across data management, analytics, and machine learning
• Incorporate AI into platform services — enable AI-assisted development workflows and build internal AI backend services as part of the data platform offering
• Communicate across domains — work closely with analytics, product, compliance, and engineering teams; translate between technical and business language
• Mentor and grow with the team — share what you learn, support others, and contribute to a culture of honest technical discussion
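The data-quality responsibility above, validation that catches problems before downstream consumers do, can be as simple as a profiling check that an orchestrator runs after each load. The column names and threshold below are illustrative assumptions, not part of the posting:

```python
def check_null_rate(rows: list[dict], column: str, max_null_rate: float) -> dict:
    """Profile one column and fail the check if too many values are null.

    Returns a small report dict instead of raising, so an orchestrator
    (e.g. an Airflow task) can decide whether to alert or halt the DAG.
    """
    total = len(rows)
    nulls = sum(1 for r in rows if r.get(column) is None)
    null_rate = nulls / total if total else 1.0  # empty input counts as failed
    return {
        "column": column,
        "null_rate": null_rate,
        "passed": null_rate <= max_null_rate,
    }


rows = [
    {"user_id": "u1", "country": "DE"},
    {"user_id": "u2", "country": None},
    {"user_id": "u3", "country": "DE"},
    {"user_id": "u4", "country": "DE"},
]
# 1 null out of 4 rows is a 25% null rate, above the 10% threshold
report = check_null_rate(rows, "country", max_null_rate=0.10)
```

In practice a check like this would run against the warehouse rather than in-memory rows, and its report would feed the alerting described above; the shape of the logic is the same.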

Benefits

• A chance to do meaningful, people-centric work with an international team of passionate professionals.
• Holistic well-being with free mental health coaching sessions and yoga.
• A monthly allowance to spend on an extensive range of services, which you can use and roll over as flexibly as you like.
• Employee stock options for all employees—because everyone deserves to benefit from the success they help to create.
• 30 annual vacation days and flexible working hours.
• Work from abroad for up to six weeks every year. Just align with your team, and then enjoy your trip.
• Plenty of opportunities to socialise as a team. In addition to internal tech meetups, our international team hosts regular get-togethers—virtually, and in person when possible.
• Free tax declaration filing, of course, through the Taxfix app—and internal support for all personal tax-related questions.
• Have a four-legged friend in your life? We’re happy to have dogs join us in the office.
• Excited? So are we. Learn more about Team Taxfix on our blog and get a glimpse of our culture.
