Senior Data Engineer
Requirements
• 3+ years of experience in data engineering, including building and maintaining large-scale data pipelines
• Extensive experience with SQL RDBMS (SQL Server or similar), including dimensional modeling using star schemas and foundational data warehousing concepts
• Hands-on experience with AWS services such as Redshift, Athena, S3, Kinesis, Lambda, and Glue
• Experience with DBT, Databricks, or similar data platform tooling
• Experience working with structured and unstructured data and implementing data quality frameworks
• Excellent communication and collaboration skills
• Demonstrated experience using AI coding tools (GitHub Copilot, Cursor, or similar), with an understanding of prompt engineering, to enhance development productivity
• Understanding of AI/ML concepts and data requirements, including feature stores, model versioning, and real-time inference pipelines
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field
• Experience in a SaaS or e-commerce environment with AI/ML products
• Knowledge of stream processing frameworks such as Kafka, Flink, or Spark Structured Streaming
• Familiarity with LLMOps and AI model deployment patterns in data infrastructure
• Experience with AI-powered data tools such as automated data catalogs, intelligent monitoring systems, or AI-assisted query optimization
• Proven ability to thrive in a fast-paced, agile environment with shifting priorities and emerging technologies
• Experience with containerization and orchestration tools such as Docker and Kubernetes
Responsibilities
Core Data Infrastructure
• Design and implement scalable ETL/ELT workflows that support both batch and streaming data using AWS primitives (e.g., S3, Kinesis, Glue, Redshift, Athena).
• Architect and maintain cloud-native data platforms with automated ingestion, transformation, and governance pipelines using modern tools such as DBT, Apache Spark, Delta Lake, Airflow, and Databricks.
• Work with stakeholders, including the Product, BI, and Support teams, to assist with data-related technical challenges and support their data and infrastructure needs.
• Collaborate with cross-functional Engineering teams, using analytics to predict data needs and proactively deliver solutions.
• Assist in the optimization of data lake/lakehouse infrastructure to support AI workloads and large-scale analytics.

Governance & Collaboration
• Ensure data quality, lineage, and observability.
• Develop and enforce data governance policies, including compliance monitoring and privacy protection.
• Partner with Data Scientists to optimize data pipelines for model training, inference, and continuous learning workflows.

Advanced Data Operations
• Build self-healing data pipelines with AI-driven error detection, root cause analysis, and automated remediation capabilities.
• Implement intelligent data lineage tracking to automatically discover relationships between datasets and predict the downstream impact of changes.
• Create AI-assisted data discovery systems that help stakeholders find relevant datasets and understand data semantics through natural language interfaces.
• Participate in the on-call rotation as needed.

AI-Enhanced Development
• Leverage AI coding assistants (GitHub Copilot, Cursor, etc.) to:
  • Accelerate development cycles, generate complex SQL queries, and automatically optimize data pipeline code.
  • Develop data quality monitoring using anomaly detection and data profiling tools to identify issues before they impact downstream systems.
  • Optimize pipeline orchestration with ML to predict optimal scheduling, resource allocation, and failure recovery patterns.
  • Generate and maintain living documentation that evolves with code changes.

Leadership & Innovation
• Participate in the full software development lifecycle, including both manual and AI-assisted requirements gathering, automated testing, and intelligent deployment strategies.
• Mentor junior engineers on both traditional data engineering practices and effective use of AI development tools.
• Lead tool evaluation and adoption for the data engineering team, establishing best practices for human-AI collaboration.
• Drive innovation in data architecture by experimenting with emerging technologies.

Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities required of the employee for this job. Duties, responsibilities, and activities may change at any time with or without notice.

What it’s like to work at Rithum

When you join Rithum, you can expect to work with smart risk-takers, courageous collaborators, and curious minds.

As part of the Rithum team, you are valued, supported, and included. Guided by a transparent culture and accessible, approachable leadership, we offer career opportunities aligned to your ambitions and talents. To ensure work-life balance works for you, we also offer an array of resources to support you and your families, including comprehensive benefits and wellness plans.

• Partner with the leading brands and retailers.
• Participate in an inclusive, welcoming work atmosphere.
• Achieve work-life balance through remote-first working conditions, generous time off, and wellness days.
• Receive industry-competitive compensation and total rewards benefits.
Benefits
• Medical, dental, and psychology benefits
• Life insurance and disability benefits
• Competitive time off package with 25 days of PTO, 8 company-paid holidays, 5 paid floating holidays (new in 2026!), 2 wellness days, and 1 paid volunteer day
• Voucher program for transportation, meals, and childcare
• Remote working stipend: €40/month, automatically applied in payroll
• Access to wellbeing tools such as the Calm app and an Employee Assistance Program
• Professional development stipend and learning and development offerings to help you build the skills and connections you need to move forward in your career
• Charitable contribution match per team member