Min. 3 years of experience in Data Engineering, with at least 2 years focused on GCP
Proficiency in SQL and Python
Knowledge of Apache Airflow
Hands-on experience with Cloud Composer (GCP's managed Apache Airflow service)
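For illustration only, not an additional requirement: a minimal Airflow DAG sketch of the kind of pipeline work Cloud Composer runs. All names are hypothetical, and the sketch assumes Airflow 2.4+ (the version line shipped with Cloud Composer 2).

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_transform_load():
        # Placeholder ETL step; a real pipeline would pull from a source,
        # transform the records, and load them into a warehouse such as BigQuery.
        print("extract -> transform -> load")


    with DAG(
        dag_id="daily_etl_pipeline",       # hypothetical pipeline name
        schedule="@daily",                 # `schedule` requires Airflow 2.4+
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="run_etl",
            python_callable=extract_transform_load,
        )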
Responsibilities
Develop and maintain data pipelines for big data processing.
Implement ETL processes to extract, transform, and load large datasets into our systems (see the sketch after this list).
Design scalable database architectures that can handle high volumes of queries efficiently.
Monitor system performance and optimize query execution times where necessary.
Ensure the security and integrity of sensitive information throughout data handling procedures.
Collaborate with cross-functional teams to align big data strategies with business objectives.
Stay updated on emerging technologies in Big Data, Machine Learning, and Cloud Computing relevant to our projects.
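As a concrete sketch of the load step referenced above, here is a hypothetical example using the google-cloud-bigquery client library; the project, dataset, table, and bucket names are placeholders, not real systems.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Append a daily CSV export from Cloud Storage into a BigQuery table.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the header row
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,      # infer the schema from the file
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/daily/events.csv",  # hypothetical source file
        "example-project.analytics.events",      # hypothetical target table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes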
Benefits
Equity options as part of the offer.
Paid time off (PTO).
Insurance coverage for employees.
Flexible working hours and location.
Remote work within Poland for full-time positions.