Zencore - Data Engineer (Epic Clarity Migration)
Requirements
• Deep understanding of the Clarity data model, ETL Console, and underlying Oracle structures.
• Advanced PL/SQL scripting and performance tuning, including an understanding of legacy Oracle hardware constraints.
• Proven track record of moving on-premise data warehouses to the cloud (ETL/ELT).
• Proficiency in SQL is mandatory. Experience with Python or shell scripting for automation is highly valued.
• Experience with BigQuery architecture, partition/clustering strategies, and Google Standard SQL.
• Experience with HL7 and FHIR data standards.
• Familiarity with data validation frameworks (e.g., Great Expectations, dbt tests).
• Experience troubleshooting replication tools (e.g., GoldenGate, Dataflow, or similar CDC tools).
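To make the validation-framework requirement concrete, here is a minimal sketch of the kind of reconciliation check a framework like Great Expectations or a dbt test would formalize: comparing row counts and an order-independent content fingerprint between a source extract and its replicated copy. All table and column names (`pat_id`, `enc_id`) are invented for illustration.

```python
import hashlib

def table_fingerprint(rows, key_columns):
    """Order-independent fingerprint: XOR of per-row SHA-256 hashes over key columns."""
    fingerprint = 0
    for row in rows:
        digest = hashlib.sha256(
            "|".join(str(row[c]) for c in key_columns).encode()
        ).digest()
        # XOR makes the result insensitive to row order.
        fingerprint ^= int.from_bytes(digest[:8], "big")
    return len(rows), fingerprint

def reconcile(source_rows, target_rows, key_columns):
    """Compare row count, then content fingerprint; return (ok, detail)."""
    src_count, src_fp = table_fingerprint(source_rows, key_columns)
    tgt_count, tgt_fp = table_fingerprint(target_rows, key_columns)
    if src_count != tgt_count:
        return False, f"row count mismatch: {src_count} vs {tgt_count}"
    if src_fp != tgt_fp:
        return False, "content mismatch (fingerprints differ)"
    return True, "ok"

# Illustrative data: same rows, different order, should reconcile cleanly.
src = [{"pat_id": 1, "enc_id": "A"}, {"pat_id": 2, "enc_id": "B"}]
tgt = [{"pat_id": 2, "enc_id": "B"}, {"pat_id": 1, "enc_id": "A"}]
print(reconcile(src, tgt, ["pat_id", "enc_id"]))  # (True, 'ok')
```

In practice the per-row hashing would run inside each database (e.g., via `FARM_FINGERPRINT` in BigQuery) rather than in Python, so only counts and aggregate fingerprints cross the network.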
Responsibilities
• Design and execute technical processes to migrate data from Oracle (Clarity) to Google BigQuery.
• Handle complex schema mappings, specifically resolving incompatibilities between Oracle and BigQuery data types (e.g., timestamps, numerics, and character sets).
• Troubleshoot pipeline failures related to hardware latency and legacy infrastructure constraints.
• Develop automated SQL scripts and testing frameworks to validate data replication accuracy at scale.
• Perform deep-dive root cause analysis on data inconsistencies, distinguishing genuine data errors from artifacts of the conversion process.
• Create technical documentation for data lineage and conversion logic.
• Optimize SQL queries for performance within the BigQuery environment to manage costs and reduce latency.
• Collaborate with Data Analysts to verify business logic and ensure the migrated data meets clinical reporting standards.
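The schema-mapping responsibility above can be sketched as a small translation function. This is an illustrative assumption about one reasonable mapping policy, not a definitive conversion spec; precision thresholds follow BigQuery's documented limits (INT64 holds up to 18 full decimal digits; NUMERIC holds precision 38, scale 9).

```python
def oracle_to_bigquery(data_type: str, precision=None, scale=None) -> str:
    """Map a simplified Oracle type descriptor to a BigQuery Standard SQL type."""
    t = data_type.upper()
    if t in ("VARCHAR2", "NVARCHAR2", "CHAR", "NCHAR", "CLOB"):
        return "STRING"
    if t == "NUMBER":
        if scale in (None, 0):
            # NUMBER(p) with p <= 18 always fits in a 64-bit integer.
            if precision is not None and precision <= 18:
                return "INT64"
            return "NUMERIC"  # wider or unbounded integers
        # NUMERIC supports precision up to 38 with at most 9 decimal digits.
        if precision is not None and precision <= 38 and scale <= 9:
            return "NUMERIC"
        return "BIGNUMERIC"
    if t == "DATE":
        # Oracle DATE carries a time component, so DATETIME is the safer target.
        return "DATETIME"
    if t.startswith("TIMESTAMP"):
        # Zoned timestamps map to TIMESTAMP; naive ones to DATETIME.
        return "TIMESTAMP" if "TIME ZONE" in t else "DATETIME"
    if t in ("RAW", "BLOB"):
        return "BYTES"
    if t in ("BINARY_FLOAT", "BINARY_DOUBLE"):
        return "FLOAT64"
    return "STRING"  # conservative fallback for unhandled types

print(oracle_to_bigquery("NUMBER", 10, 0))             # INT64
print(oracle_to_bigquery("DATE"))                      # DATETIME
print(oracle_to_bigquery("TIMESTAMP(6) WITH TIME ZONE"))  # TIMESTAMP
```

A real migration would drive this from `ALL_TAB_COLUMNS` metadata and add per-column overrides; the point is that each incompatibility (lossy numerics, implicit time components, time zones) gets an explicit, reviewable rule.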