Andela

Andela - Data Engineer

Remote - United States · 3w ago

Remote · Mid · NA · Logistics · Data Engineer · Python · SQL · Airflow · Terraform · Fivetran


Requirements

• 3+ years of experience in data engineering roles
• Expert-level SQL skills and experience with data warehousing (BigQuery preferred)
• Proficiency with data pipeline tools (Airflow, Terraform, Fivetran, dbt, or similar)
• Strong proficiency in Python, with a focus on writing clean, maintainable code for data pipeline development, API integrations, and ETL/ELT processes
• Proven ability to leverage AI-driven development tools to accelerate pipeline construction, automate documentation, and perform rapid root-cause analysis of data quality issues
• Strong understanding of data modeling principles and dimensional design
• Experience with data governance, security, and quality frameworks
• Ability to translate business requirements into technical data solutions
• Experience with the Looker semantic layer (LookML) to define business logic and create self-service reporting environments is highly desirable
• Bachelor's degree in Computer Science, Engineering, or a related technical field preferred
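The "clean, maintainable code for ETL/ELT processes" expectation above can be sketched as a minimal typed transform step. All names here (the `Shipment` record, the `order_id` and `weight_kg` fields) are illustrative assumptions, not part of the posting.

```python
from dataclasses import dataclass


# Hypothetical normalized record; the fields are illustrative, not from the posting.
@dataclass
class Shipment:
    order_id: str
    weight_kg: float


def transform(rows: list[dict]) -> list[Shipment]:
    """Validate and normalize raw API rows into typed records (a sketch).

    Rows missing the primary key are skipped so one bad record does not
    fail the whole batch.
    """
    out = []
    for row in rows:
        if not row.get("order_id"):
            continue
        out.append(Shipment(order_id=str(row["order_id"]),
                            weight_kg=float(row.get("weight_kg", 0.0))))
    return out


records = transform([{"order_id": "A1", "weight_kg": "2.5"}, {"weight_kg": 1.0}])
```

Keeping extraction, transformation, and loading in separate, individually testable functions like this is one common way to meet the maintainability bar described above.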

Responsibilities

Data Infrastructure
• Design, build, and maintain scalable data pipelines and ETL processes that ensure data quality and reliability
• Manage data warehouse architecture and optimization for performance and cost efficiency
• Implement data governance standards, security protocols, and monitoring systems
• Own data ingestion from multiple sources and ensure seamless integration into analytics platforms

Enabling Self-Service Analytics
• Build data models and semantic layers that enable business users to access insights independently
• Create and maintain data documentation, lineage, and data dictionaries
• Partner with BI Analysts to understand data needs and proactively improve data availability
• Implement automated data quality checks and alerting systems

Technical Excellence
• Utilize tools like Composer, Terraform, Airflow, Fivetran, and dbt to build robust data infrastructure
• Optimize SQL queries and warehouse performance for analytical workloads
• Stay current with modern data engineering best practices and tools
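The "automated data quality checks and alerting" responsibility above can be sketched as simple rule functions that return alert messages for a downstream notifier. The column name (`customer_id`) and the 1% null-rate threshold are illustrative assumptions.

```python
def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)


def run_checks(rows: list[dict]) -> list[str]:
    """Run quality rules and return human-readable alerts (thresholds are illustrative)."""
    alerts = []
    if not rows:
        alerts.append("table is empty")
    elif null_rate(rows, "customer_id") > 0.01:  # more than 1% nulls in a key column
        alerts.append("customer_id null rate above 1%")
    return alerts


alerts = run_checks([{"customer_id": None}, {"customer_id": "C7"}])
```

In practice such checks would typically run as a task inside the pipeline (e.g. an Airflow task after each load) with the returned alerts routed to whatever alerting system the team uses.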
