Data Engineer
Satellite Innovations

Remote - Warsaw · 2 months ago
Remote · Mid · EMEA · Cloud Computing · Data Engineer · Solutions Architect · Principal · Python · SQL · Databricks · Airflow · AWS

Requirements

• Skills needed: data engineering experience and knowledge of big data technologies.
• Years of experience: at least five years in a similar role, with strong analytical skills.
• Education: Bachelor's degree in Computer Science, Information Technology, or a related field preferred; a Master's degree is considered an advantage.
• Certifications: a professional certification such as AWS Certified Solutions Architect is highly desirable but not mandatory.
• ✅ Must-haves: proficiency with SQL and experience using data visualization tools such as Tableau.

Responsibilities

• Design, build, and maintain data pipelines and ETL workflows using Python.
• Write and optimize SQL queries for data transformation and validation.
• Develop data processing jobs in a Databricks-first environment using Spark and Delta Lake.
• Orchestrate data workflows using Airflow (DAGs, scheduling, dependencies, retries); a minimal sketch follows this section.
• Implement data quality checks to ensure the reliability and consistency of data pipelines.
• Support the Principal Data Engineer and collaborate closely with the Data Engineering team.
• Work with cloud-based data platforms, primarily within AWS.
• Use Git for version control and contribute to basic CI/CD workflows.
• Participate in code reviews and the continuous improvement of data solutions.

We're Looking For

• 2-4 years of experience as a Software Engineer or Data Engineer.
• Strong hands-on experience with Python for building and automating data pipelines.
• Strong SQL skills for querying, transforming, and validating data.
• Experience working with Databricks (Spark, Delta Lake).
• Hands-on experience with Airflow for workflow orchestration.
• Familiarity with AWS fundamentals in a cloud-based data environment.
• Experience building and maintaining ETL pipelines.
• Experience with Git and basic CI/CD processes.
• Nice to have: experience with streaming data pipelines.
• Ability to work effectively in a collaborative, fast-paced environment.
• Good English communication skills (verbal and written).
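To give candidates a concrete picture of the orchestration work described above, here is a minimal sketch of an Airflow DAG showing scheduling, task dependencies, retries, and a data quality check. It assumes Airflow 2.4 or later; the DAG name, task bodies, and metrics are hypothetical illustrations, not details taken from this posting.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a hypothetical source system.
    return [{"id": 1, "value": 42}]


def transform():
    # Placeholder: clean and reshape the extracted data.
    pass


def quality_check():
    # Placeholder data quality check: raising here fails the task,
    # which triggers the retry policy defined in default_args.
    row_count = 1  # hypothetical metric
    if row_count == 0:
        raise ValueError("quality check failed: no rows produced")


with DAG(
    dag_id="example_daily_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                     # scheduling
    catchup=False,
    default_args={
        "retries": 2,                      # automatic retries on failure
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    check_task = PythonOperator(task_id="quality_check", python_callable=quality_check)

    # Dependencies: extract -> transform -> quality check.
    extract_task >> transform_task >> check_task

In day-to-day work the placeholder callables would be replaced by real extraction, Spark/Delta Lake transformations, and quality assertions, but the DAG structure above is the shape of the orchestration the role involves.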
