Avra - Senior Data Engineer
Requirements
• Experience - At least 5 years of experience building and operating data systems in production environments
• Technical Proficiency - Strong Python and SQL skills, with experience in data pipelines, orchestration, and warehouse modeling
• Operational Mindset - Experience managing data quality, observability, failure modes, and long-running production workflows
• Collaboration - Strong communication skills and the ability to work across engineering, product, and business teams
Responsibilities
• Pipeline & Workflow Orchestration - Design and operate orchestrated workflows, ensuring clear execution semantics, observability, and resilience.
• Data Modeling & Transformation - Design and maintain scalable transformation layers using dbt, BigQuery, Spark, Ray, or equivalent tools, enabling reliable downstream consumption.
• Data Quality & Platform Standards - Define and enforce standards around data quality, contracts, schema validation, lineage, and metadata.
• Handling Real-World Data Complexity - Work with incomplete, inconsistent, and evolving data sources; handle schema drift, edge cases, and operational failures with strong engineering judgment.
• Cloud & Infrastructure Integration - Operate within cloud-native environments (GCP/AWS), working with storage, messaging (e.g., Pub/Sub), and infrastructure as code.
• Platform Improvement & Simplification - Continuously improve systems by increasing observability, reducing operational risk, and simplifying architecture.
You Stand Out If
• You have strong experience operating production data systems, not just building them once
• You have worked with orchestration tools like Dagster, Airflow, or Prefect
• You have experience designing robust data models and transformation layers
• You have strong hands-on experience with Python for data engineering (pipelines, tooling, and production systems)
• You think in terms of systems, reliability, and long-term maintainability
• You simplify complexity instead of adding unnecessary abstraction
• Experience with GCP, dbt, BigQuery, or event-driven architectures is a strong plus
Benefits
• Competitive Compensation - Attractive salary and equity participation
• Ownership & Impact - Direct influence on the foundation of Avra’s data platform and its evolution
• Technical Environment - Work on real-world data problems involving complex ingestion, reliability, and large-scale systems
• Lean, High-Quality Team - Work closely with experienced engineers and founders in a high-context environment
• Flexible Culture - Remote-first (Brazil), with flexible time off and a strong focus on autonomy
If you are motivated by building and operating real data systems, not just designing them, and want to work on hard problems involving messy data and production reliability, we would like to hear from you.