Mutt Data - DataOps Engineer Specialist
About Mutt Data
• Leveraging our expertise, we build modern Machine Learning systems for demand planning and budget forecasting.
• Developing scalable data infrastructure, we enhance high-level decision-making, tailored to each client.
• Offering comprehensive Data Engineering and custom AI solutions, we optimize cloud-based systems.
• Using Generative AI, we help e-commerce platforms and retailers create higher-quality ads, faster.
• Building deep learning models, we enhance visual recognition and automation across industries, improving product categorization, quality control, and information retrieval.
• Developing recommendation models, we personalize user experiences in e-commerce, streaming, and digital platforms, driving engagement and conversions.

Our Values
• 📊 We are Data Nerds
• 🤗 We are Open Team Players
• 🚀 We Take Ownership
• 🌟 We Have a Positive Mindset

Requirements
• Senior Data Engineering Foundations: Strong experience in Python and deep mastery of SQL and Postgres.
• Modern Data Stack: Hands-on experience with Airflow, dbt, and Airbyte.
• Analytical Data Systems: Experience building analytical data systems over Data Lakes (e.g., AWS S3, Athena, EMR, Glue, Iceberg/Delta).
• Cloud Mastery: Solid understanding of AWS services (S3, EC2, RDS, IAM, etc.).
• Operations Mindset: A strong understanding of GitOps and CI/CD principles, and a passion for automation.
• Teamwork: Great capacity for collaborative, async, and written communication. You take ownership and follow through on your commitments.
• Observability: Experience setting up monitoring and alerting systems to ensure the health of data pipelines and infrastructure.
• Code Hygiene: A sharp sense of code hygiene, including code review, documentation, testing, and CI/CD (Continuous Integration/Continuous Delivery).
• Stream Processing: Experience with stream processing tools such as Kafka Streams, Kinesis, or Spark.
• Python's Scientific Stack: Proficiency with numpy, pandas, jupyter, matplotlib, scikit-learn, and related tools.
• Infrastructure & Containers: Proficiency with Terraform, Docker, and Kubernetes.
• English Proficiency: Solid command of written English for technical documents such as Design Documents.
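To give a flavor of the analytical SQL the role leans on, here is a minimal sketch using Python's stdlib sqlite3; the table and column names (`orders`, `client`, `day`, `amount`) are invented for illustration, not an actual Mutt Data schema:

```python
import sqlite3

# In-memory database with a tiny, made-up orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (client TEXT, day TEXT, amount REAL);
INSERT INTO orders VALUES
  ('acme',   '2024-01-01', 100.0),
  ('acme',   '2024-01-02', 150.0),
  ('globex', '2024-01-01',  80.0);
""")

# Window function: a running total per client, the kind of building
# block demand-planning queries are made of.
rows = conn.execute("""
SELECT client, day, amount,
       SUM(amount) OVER (PARTITION BY client ORDER BY day) AS running_total
FROM orders
ORDER BY client, day
""").fetchall()

for row in rows:
    print(row)
```

The same pattern (partitioned window aggregates) carries over directly to Postgres and Athena.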
Responsibilities
• Orchestration & Integration: Build and manage scalable data pipelines using Apache Airflow, dbt, and Airbyte, ensuring seamless data ingestion and movement.
• Product Development: Work hand in hand with Mixilo's product team to address real client and internal users' issues, design technical solutions, and help prioritize the roadmap.
• Infrastructure as Code (IaC): Use Terraform to provision and manage cloud resources on AWS, maintaining a secure and cost-effective infrastructure.
• Kubernetes & GitOps: Manage containerized applications and services on Kubernetes (EKS), implementing continuous delivery practices to keep Mixilo running smoothly.
• Enhance DX (Developer Experience): Abstract away complex DAG and dbt logic to reduce manual work for the team and optimize our time to market.
• Data Reliability: Implement rigorous testing frameworks—specifically leveraging dbt tests—to ensure data quality and catch errors before they impact client recommendations.
• CI/CD for Data: Maintain and improve our CI/CD pipelines to automate testing, deployment, and infrastructure changes.
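The Data Reliability bullet above can be pictured with a small dbt `schema.yml` using generic tests; the model and column names here are hypothetical, not Mixilo's actual schema:

```yaml
# Hypothetical dbt model config illustrating built-in generic tests.
version: 2

models:
  - name: daily_demand          # invented model name
    columns:
      - name: order_id
        tests:
          - unique              # no duplicate orders
          - not_null            # every row must have an id
      - name: channel
        tests:
          - accepted_values:    # catch unexpected categories early
              values: ['web', 'retail', 'marketplace']
```

Tests like these run in CI via `dbt test`, so bad data is caught before it reaches client-facing recommendations.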
Benefits
• Remote-first culture – work from anywhere! 🌍
• AWS, dbt, Google Cloud, Azure & Databricks certifications fully covered
• In-company English lessons
• Birthday off + an extra vacation week (Mutt Week! 🏖️)
• Referral bonuses – help us grow the team & get rewarded!
• Maslow: monthly credits to spend in our benefits marketplace
• ✈️🏝️ Annual Mutters' Trip – an unforgettable getaway with the team!