Data Engineer
Requirements
• Experience with version control systems, such as Git
• Experience in data modeling, database design, and data schema optimization
• Experience with other cloud platforms, such as AWS or Azure
• Experience with machine learning workflows and pipelines

What you'll get:
• Compensated days without service delivery obligation (from 21 up to 31!)
• UNUM group insurance
• Private medical care (Luxmed) and sport card (Medicover Sport)
• Option to cooperate fully remotely or from one of their Polish offices: Rynek in Wrocław (Św. Mikołaja), Przeskok in Warsaw, and Wyzwolenia in Szczecin
• Company retreats abroad or in Poland once a year (bonding time, yeah!)
• Company equipment provided
• Budget for your training and development
• Access to the Google Cloud Skills Boost platform

Sounds interesting? Let's talk! :)

PS Recruitment stages:
1. 30-minute initial call with Recruiter Marta KK from Bee Talents.
2. 60-minute technical call with Mariusz Zyśk, Data & AI Evangelist, and Emil Najczuk, PM Lead at FOTC.

BEE TALENTS INFORMATION CLAUSE ⬇️
The Administrator of your personal data is Bee Talents PSA, ul. Garbary 35/12, 61-868 Poznań, NIP: 7792463296. The purposes of processing your personal data are: conducting the initial stage of recruitment, based on your consent; participation in future recruitment processes conducted for our clients, based on your consent; and defense against potential claims of our Clients for whom we conduct recruitment, in particular regarding non-compliance of the Candidate's profile with the requirements specified by the Client, which is a legitimate interest of the Data Administrator pursuant to Art. 6(1)(f) GDPR.
In connection with this data processing, you have the right to access your personal data, request a copy of it, rectify it, delete it or restrict its processing, withdraw your consent at any time, object to the processing of your personal data, and lodge a complaint with the President of the Personal Data Protection Office if the processing violates your GDPR rights. You can read the full content of the information clause here: https://beetalents.com/eng-gdpr.
Responsibilities
What (client) can offer you:
• 🎯 Role: Data Engineer
• ⏱ Working hours: flexible between 8:00 and 18:00
• 👩💻 Working mode: B2B and fully remote
• 📍 Office locations: Wrocław, Warsaw, Szczecin, Bucharest, Budapest
• 🇵🇱 🇬🇧 Languages: both Polish and English at a communicative level, as they work in a Polish and international environment

The biggest challenge in this role: you will learn a professional approach to data management and get to know the data life cycle from the source all the way to Business Intelligence systems.

• Building and maintaining data pipelines and workflows using Cloud Composer
• Designing and implementing data models, database schemas, and ETL processes using core GCP data services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions
• Collaborating with cross-functional teams to understand business requirements and develop solutions
• Ensuring data quality and reliability by monitoring and debugging data pipelines
• Maintaining and improving the existing data infrastructure and processes
• Staying up to date with industry trends and best practices in data engineering

What are they looking for?
• Min. 3 years of experience in Data Engineering, with at least 2 years focused on GCP
• Proficiency in SQL and Python
• Knowledge of Apache Airflow
• Hands-on experience with core GCP data services: BigQuery, Dataflow, Pub/Sub
• Experience in building data warehouses on GCP (Google Cloud Platform)
• English and Polish at a communicative level