Data Engineer II
Requirements
• Minimum 4 years of professional experience in Data Engineering or a related role.
• Strong SQL expertise, including query optimisation, complex transformations, and data modelling.
• Solid Python skills for data engineering and pipeline development.
• Hands-on experience with AWS services such as Redshift, Lambda, Glue, and S3, as well as Airflow for orchestration.
• Familiarity with modern transformation and modelling practices using tools such as dbt.
• A collaborative mindset, strong problem-solving skills, critical thinking, and a willingness to take ownership and initiative.
• Experience contributing to large-scale cloud migration initiatives.
• Knowledge of real-time data streaming using Kafka.
• Exposure to CI/CD pipelines and infrastructure-as-code for data platforms.
• Interest in exploring AI/ML use cases within modern data ecosystems.
• Understanding of the payments domain or fintech data challenges.
Responsibilities
• Build and optimise scalable, maintainable, and high-performance data pipelines and workflows.
• Ingest, transform, and deliver high-volume data from a range of structured and unstructured sources, including MySQL databases and real-time Kafka streams.
• Design and maintain performant data models in Redshift, and optimise SQL queries to support analytics and reporting workloads.
• Contribute to our cloud migration journey and help evolve the data architecture with new ideas and improvements.
• Collaborate with cross-functional, globally distributed teams to translate business requirements into robust technical solutions.
• Embed data quality, reliability, observability, and security as first-class principles across the data platform.
• Support and enhance data solutions operating in a mission-critical payments environment.
• Continuously learn, share knowledge, and help raise standards across data governance, quality, and security practices.
Benefits
• We Value Performance: Through competitive salaries, performance bonuses, sales commissions, equity for specific roles, and recognition programs, we ensure that all our employees are well rewarded and incentivized for their hard work.
• We Care for Our Employees: The wellness of Nium’ers is our #1 priority. We offer medical coverage along with a 24/7 employee assistance program and generous vacation programs, including our year-end shutdown. We also provide a flexible hybrid working environment (3 days per week in the office).
• We Upskill Ourselves: We are curious and always want to learn more, with a focus on upskilling ourselves. We provide role-specific training, internal workshops, and a learning stipend.
• We Celebrate Together: We recognize that work is also about creating great relationships with each other. We celebrate together with company-wide social events, team bonding activities, happy hours, team offsites, and much more!