Data Engineer
Requirements
• Bachelor's degree in Computer Science, Engineering, or a related Information Technologies field
• 2+ years of experience as a Data Engineer or in a similar data-focused role
• Experience with the Scala or Java programming language
• Experience with the Go programming language is a plus
• Experience with data integration concepts and open-source tools such as Kafka Connect and Debezium
• Experience with big data processing frameworks such as Apache Spark
• Familiarity with real-time data processing and streaming architectures for low-latency analytics
• Familiarity with modern data architectures, including data lakes, data warehouses (Vertica), and lakehouse table formats (Apache Iceberg, Delta Lake)
• Experience with workflow orchestration tools such as Apache Airflow, Dagster, or Prefect
• Strong proficiency in SQL for data manipulation, querying, and optimization
• Experience with both RDBMS and NoSQL databases
• Experience with testing frameworks, including unit testing and data quality validation for data pipelines
• Experience with containerization technologies such as Docker and Kubernetes
• Familiarity with CI/CD pipelines for data engineering and infrastructure-as-code practices
• Strong problem-solving skills and the ability to work independently and collaboratively
• Strong English communication skills, written and verbal
Tech Stack
• Languages: Scala/Java (primary), Go (preferred), SQL (advanced), Python (nice to have)
• Data Integration: Kafka Connect, Debezium
• Data Processing: Apache Spark
• Streaming & Messaging: Apache Kafka
• Workflow Orchestration: Apache Airflow
• DevOps & Infrastructure: Docker, Kubernetes, GitLab CI, Terraform
• Version Control: Git
Responsibilities
• Build and maintain scalable data pipelines for batch and real-time data processing
• Work with data platforms and data warehouses to ensure reliable, cost-effective data processing solutions
• Build and maintain REST APIs to serve processed and aggregated data to downstream applications and teams
• Collaborate with cross-functional teams to define data architecture and infrastructure requirements
• Monitor and improve pipeline performance, scalability, and resilience
Benefits
• Hybrid working model with flexibility: a schedule that helps you find the right balance between flexibility and team bonding, including work-from-abroad opportunities and a summer working model.
• Customisable FlexBenefits budget: adjust your daily meal allowance, choose your health insurance package (and extend it to your spouse or children), and pick from additional benefits like fuel support or Trendyol shopping credits.
• Well-being support: access to location-based in-house doctors, as well as psychologist and dietitian support, and HPV vaccination provision.
• Personalised training allowance and learning opportunities: use your annual budget for any training or conference of your choice, explore our Learning Management System (LMS) anytime, and join in-person learning sessions offered throughout the year.
• Responsibility from day one: take full ownership from the start in a culture where every voice is heard and valued.
• A diverse, international team: collaborate with global peers across our offices in Berlin, Amsterdam, Dubai, and beyond, in a startup-spirited and collaborative environment.
• Opportunities to grow with the best: tackle meaningful challenges, develop through hands-on experience, and grow with the support of expert guidance and global mentoring.
• Meaningful connections beyond tasks: be part of team rituals, events, and social activities that help us stay connected and inspired.
Take the Next Step
If this role excites you, apply today; we look forward to taking the next step with you.
Want to get to know the team better first? Explore our Career Website, LinkedIn, or YouTube to learn more about #LifeatTrendyol and how we work.