wagey.gg v1.0-38ee235-5-May
Data Engineer - Element Solutions

Remote - United States · $135k - $155k + Equity · 2d ago
Remote · Mid · NA · Life Insurance · Insurance · Data Engineer · Solutions Engineer · Cloud Engineer · Snowflake · Kafka · Apache Spark · SQL · AWS


Requirements

• Bachelor’s degree in Computer Science, Information Systems, Engineering, Data Science, or a related field (or equivalent experience).
• 3+ years of experience in data engineering, data integration, or related technical roles.
• Strong hands-on experience with Apache Kafka for streaming data pipelines.
• Strong experience with Apache Spark for large-scale data processing (batch and/or streaming).
• Advanced SQL development experience, including complex queries, performance tuning, and data transformation logic.
• Experience integrating and managing data across multiple heterogeneous data sources.
• Experience working in the federal government or other highly regulated environments with security and compliance requirements.
• Strong understanding of data quality management, data validation, and data governance practices.
• Strong problem-solving and analytical thinking abilities.
• Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
• Strong attention to detail, especially in ensuring data accuracy and consistency.
• Ability to work independently in a fast-paced, mission-driven environment.
• Strong collaboration skills across cross-functional technical and business teams.
• US Citizenship or Permanent Residency required.
• Must reside in the Continental US.
• Depending on the government agency, specific requirements may include a public trust background check or security clearance.
• AWS Certified Data Engineer certification.
• Experience with cloud platforms such as AWS, Azure, or GCP (especially data services like S3, Glue, Databricks, or BigQuery).
• Familiarity with data orchestration tools (e.g., Airflow, NiFi, or similar).
• Experience supporting healthcare, insurance, or CMS-related data environments.
• Knowledge of data modeling techniques (dimensional modeling, star/snowflake schemas).
• Experience with DevOps practices, CI/CD pipelines, and infrastructure-as-code for data systems.
• Familiarity with real-time analytics and event-driven architectures.

Compensation

• $135,000 - $155,000 a year. The salary offered will depend on several factors including, but not limited to, relevant experience, skills, education, geographic location, internal equity, business needs, and other factors permitted by law. Posted pay ranges are a general guideline only and are not a guarantee of compensation or salary.
• We invest in the lives of our employees, both in and out of the workplace, by providing competitive pay and benefits packages. Benefits offered may include health care, dental, vision, and life insurance; 401(k); and paid time off including PTO, holidays, and any other paid leave required by law.
• Be in your Element. We are a remote-first company based out of Washington, DC.
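To make the "complex queries and data transformation logic" requirement concrete, here is a minimal, self-contained SQL sketch (table names, columns, and data are invented for illustration, not taken from the posting) that reconciles two hypothetical heterogeneous feeds by keeping only the most recently updated row per member:

```python
import sqlite3

# In-memory database standing in for two heterogeneous source feeds.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE feed_a (member_id TEXT, plan TEXT, updated_at TEXT);
    CREATE TABLE feed_b (member_id TEXT, plan TEXT, updated_at TEXT);
    INSERT INTO feed_a VALUES ('M1', 'gold',   '2024-01-10'),
                              ('M2', 'silver', '2024-01-12');
    INSERT INTO feed_b VALUES ('M1', 'platinum', '2024-02-01'),
                              ('M3', 'bronze',   '2024-01-05');
""")

# Reconcile the feeds: union both sources, then keep only the most
# recently updated row per member via a window function.
rows = conn.execute("""
    SELECT member_id, plan FROM (
        SELECT member_id, plan, updated_at,
               ROW_NUMBER() OVER (
                   PARTITION BY member_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM (SELECT * FROM feed_a UNION ALL SELECT * FROM feed_b)
    )
    WHERE rn = 1
    ORDER BY member_id
""").fetchall()

print(rows)  # most recent plan per member across both feeds
```

The window-function dedup pattern shown here is one common way to merge overlapping sources; in a production pipeline the same logic would typically run in Spark SQL or a warehouse rather than SQLite.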

Responsibilities

• Build and maintain high-volume, scalable data pipelines using Apache Kafka and Apache Spark, supporting both real-time and batch data processing needs.
• Design, develop, and optimize data ingestion, transformation, and integration workflows across enterprise systems.
• Ensure data quality, consistency, and integrity across four (4) disparate data sources, implementing validation, cleansing, and reconciliation processes.
• Develop and maintain SQL-based data solutions, including complex queries, stored procedures, performance tuning, and data modeling.
• Collaborate with data analysts, product owners, and application teams to define data requirements and ensure alignment with business needs.
• Implement monitoring, logging, and alerting mechanisms to ensure reliability and observability of data pipelines.
• Support data architecture design and contribute to best practices for scalable and secure data engineering solutions.
• Ensure compliance with federal data governance, security, and privacy requirements.
• Participate in Agile ceremonies and support iterative development and delivery of data capabilities.
• Troubleshoot and resolve data pipeline issues, ensuring minimal disruption to downstream systems and reporting.
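The validation-and-cleansing duty above can be sketched as a small per-record validator of the kind a pipeline stage might apply before loading; the field names and rules here are invented for illustration only:

```python
from datetime import datetime
from typing import Optional

def clean_record(raw: dict) -> Optional[dict]:
    """Validate and normalize one incoming record; return None to reject.

    Field names and rules are hypothetical, for illustration only.
    """
    member_id = (raw.get("member_id") or "").strip().upper()
    if not member_id:
        return None  # reject: missing primary key
    try:
        updated = datetime.strptime(raw.get("updated_at") or "", "%Y-%m-%d")
    except ValueError:
        return None  # reject: unparseable date
    return {
        "member_id": member_id,
        "plan": (raw.get("plan") or "unknown").strip().lower(),
        "updated_at": updated.date().isoformat(),
    }

batch = [
    {"member_id": " m1 ", "plan": "Gold", "updated_at": "2024-01-10"},
    {"member_id": "",     "plan": "Gold", "updated_at": "2024-01-10"},
    {"member_id": "M2",   "plan": None,   "updated_at": "not-a-date"},
]
cleaned = [r for r in (clean_record(b) for b in batch) if r is not None]
print(cleaned)  # only the first record survives, normalized
```

In practice the same validate/cleanse/reject pattern would be expressed inside a Spark or Kafka consumer stage, with rejected records routed to a dead-letter store for reconciliation.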

Benefits

• Make an impact that resonates: join our vibrant team and discover how you can improve lives through digital transformation. Our talented professionals bring unparalleled energy and engagement, setting a higher standard for impactful work. Come be a part of our team and shape a better future.
