Recharge - Senior Data Engineer
Requirements
• Typically, 5+ years of experience in a data engineering role (Data Engineer, Data Platform Engineer, Analytics Engineer, etc.) with a track record of building scalable data pipeline, transformation, and platform solutions.
• 3+ years of hands-on experience designing and building data pipelines and models to ingest, transform, and deliver large amounts of data from multiple sources into a Dimensional (Star Schema) Data Warehouse or Data Lake.
• Experience with a variety of data warehouse, data lake, and enterprise data management platforms (Snowflake {preferred}, Redshift, Databricks, MySQL, Postgres, Oracle, RDS, AWS, GCP)
• Experience building data pipelines, models, and infrastructure powering external, customer-facing (in addition to internal, business-facing) analytics applications.
• Solid grasp of data warehousing methodologies such as Kimball and Inmon
• Experience working with a variety of ETL tools (Fivetran, dbt, Python, etc.)
• Experience with workflow orchestration engines such as Airflow and Cloud Composer
• Hands-on experience with data infrastructure tools like Kubernetes and Docker
• Expert proficiency in SQL
• Strong Python proficiency

Interview recording & AI notetakers
To protect privacy, legal compliance, and the interviewer/candidate experience, recording, transcribing, or using AI notetaker tools during interviews is not permitted without our prior written consent. Handwritten notes are welcome. If you need an accommodation (e.g., captions), email [email protected] before your interview and we'll arrange an approved solution.

Compensation
Recharge's compensation offerings are grounded in a pay-for-performance philosophy that recognizes exceptional individual and team performance. Salary ranges are designed to be competitive and aligned with country-specific practices, while individual compensation is determined by skills, qualifications, and experience. The compensation listed is not inclusive of any equity and benefits that might exist in your total compensation package.
• Hiring range in the US: $148,000 USD - $185,000 USD
• Hiring range in Canada: $108,000 CAD - $136,000 CAD
• Application window anticipated to close: 6/1/2025. If you're interested in this opportunity, please submit an application as soon as possible.
Responsibilities
• Build data pipeline, ELT, and infrastructure solutions to power internal data analytics/science and external, customer-facing data products.
• Create automated monitoring, auditing, and alerting processes that ensure data quality and consistency.
• Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
• Design, develop, implement, and optimize existing ETL processes that merge data from disparate sources for consumption by data analysts, business owners, and customers.
• Seek ways to continually improve the operations, monitoring, and performance of the data warehouse.
• Influence and communicate with all levels of stakeholders, including analysts, developers, business users, and executives.
• Redesign and optimize existing data processes using AI (e.g., LLM-assisted development, automated documentation, issue resolution) to improve team productivity and enhance data quality across the data warehouse.
• Live by and champion our values: #day-one, #ownership, #empathy, #humility.
Benefits
• Medical, dental, and vision plans
• Retirement plan with employer contribution
• Flexible Time Off
• Paid Parental Leave
• Monthly Remote Life and Merchant stipends