Senior Data Engineer
Requirements
• Bachelor of Science degree in Computer Science or equivalent practical experience.
• 4+ years of dedicated experience building and maintaining complex ETL/ELT pipelines.
• 3+ years of Python development experience, specifically building production-grade data APIs using FastAPI or similar frameworks.
• Strong expertise in SQL, including advanced query optimization and performance tuning.
• Expert-level proficiency in dbt implementation, including managing models, macros, and incremental and dynamic tables.
• Extensive hands-on experience with Snowflake, including performance optimization, data recovery using Time Travel, and advanced data modeling techniques.
• Practical experience with key AWS data services such as Kinesis, Firehose, Redshift, Spectrum, Elastic MapReduce, and Lambda; and container orchestration services like ECS and EKS for deploying and managing data applications.
• Proven experience designing and building near real-time data processing systems.
• Experience with data observability tools for proactive monitoring of data quality, lineage, and pipeline health.
• Familiarity with modern data orchestration platforms such as Argo or Airflow.
• Hands-on experience with the full software development lifecycle (SDLC), including strong CI/CD practices for data pipelines and proficiency with data testing frameworks.
• Demonstrated ability to identify and adopt new technologies that improve data quality and reliability.
• Deep understanding of the data lifecycle, emphasizing the importance of high-quality data in applications, machine learning, business analytics, and reporting.
• Proven track record of mentoring junior team members.
• Experience with data ingestion tools like Singer is a plus.
• Knowledge of analytics tools such as Tableau, Plotly, and Pandas.
• Experience in the financial services industry.
• Prolonged periods of sitting at a desk and working on a computer.
• Must be able to communicate via phone calls and/or video conferences (mainly for concierge and sales roles).
Responsibilities
• Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
• Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
• Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
• Lead and take ownership of assigned technical projects in a fast-paced environment.
• Drive continuous improvement in the quality, security, efficiency, and scalability of our data pipelines and infrastructure.
Benefits
• All roles at SmartAsset are currently and will remain remote - flexibility to work from anywhere in the Contiguous US.
• Medical, Dental, Vision - multiple packages available based on your individualized needs.
• Life/AD&D Insurance - basic coverage at 100% company paid, additional supplemental available.
• Supplemental Short-term and Long-term Disability.
• FSA: Medical and Dependent Care.
• Equity packages for each role.
• Time Off: Vacation, Sick and Parental Leave.
• EAP (Employee Assistance Program).
• Financial Literacy Mentoring Program.