EnableComp - Principal Engineer, Data Insights
Requirements
• Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field
• 6–8+ years deploying production ML systems, preferably in financial services or healthcare domains
• Deep hands-on expertise with Databricks, MLflow, Spark optimization, and distributed computing architectures
• Modern ML frameworks: Python, scikit-learn, XGBoost, LightGBM, deep learning libraries
• Statistical modeling: time series forecasting, ensemble methods, feature engineering at scale, survival analysis, causal inference
• Production deployment: real-time scoring APIs, model monitoring, A/B testing frameworks
• Data engineering at scale: Unity Catalog governance, Delta Lake ACID transactions, streaming ingestion, schema evolution management
• Complex claims domain knowledge (VA, Workers’ Comp, MVA, Out-of-State Medicaid)
• Understanding of revenue cycle workflows and predictive opportunities
• Healthcare data experience: claims processing, payer behavior, denial patterns, medical coding
• Azure ML ecosystem and cloud infrastructure
• Data governance and HIPAA compliance requirements
• API design for cross-functional consumption
• Medallion architecture implementation (Bronze/Silver/Gold layers)
• Timely and regular attendance
• Equivalent combination of education and experience will be considered
• To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. Reasonable accommodations may be made to enable qualified individuals with disabilities to perform the essential functions.
Special Considerations & Prerequisites
• Practices and adheres to EnableComp’s Core Values, Vision, and Mission
• Exceptional technical leadership driving innovation in emerging technologies
• Ability to translate abstract AI capabilities into concrete RCM solutions
• Experience managing mixed onshore/offshore development teams
• Track record of delivering complex platforms that scale across business units
Responsibilities
• Develop and implement strategies for collecting, analyzing, and interpreting large datasets to derive actionable insights that drive business decisions.
• Collaborate with cross-functional teams across the organization to understand data needs and integrate solutions into existing systems, or build new ones as required.
• Design and execute experiments using statistical methods to validate hypotheses about customer behavior, market trends, and other factors affecting company performance.
• Monitor key performance indicators (KPIs) tied to business objectives and provide regular reports on data findings with actionable recommendations for improvement.
• Stay current with developments in big data technologies, analytics tools, and methodologies relevant to the organization; attend workshops, webinars, conferences, and other professional development opportunities as needed.
• Maintain a high level of technical proficiency across programming languages (e.g., Python, R), data manipulation tools (e.g., SQL, Excel), statistical software packages, and big data platforms such as Hadoop or Spark.
• Communicate complex analytical results to non-technical stakeholders clearly and concisely using visualizations such as charts, graphs, and dashboards, ensuring insights are understandable to all relevant parties.
• Participate actively in team meetings, brainstorming sessions, project reviews, and other collaborative activities to share knowledge, exchange ideas, give feedback on work progress, surface potential issues or roadblocks early, and contribute constructive suggestions.
• Maintain confidentiality of sensitive client and customer data at all times, working within the organization’s compliance policies and applicable privacy laws such as the GDPR (General Data Protection Regulation) and CCPA (California Consumer Privacy Act).
• Manage and organize large datasets effectively using data storage solutions such as databases and cloud services, ensuring access controls are in place to prevent unauthorized use of confidential information. Apply security measures such as encryption or anonymization where warranted by the nature of the client/customer data and by privacy regulations such as the GDPR and CCPA.
• Develop, test, deploy, monitor, troubleshoot, optimize, and maintain big data infrastructure such as Hadoop clusters and Spark engines, ensuring they run efficiently with minimal downtime. Apply performance-tuning best practices to maximize the speed of processing large datasets while minimizing resource consumption (e.g., CPU usage).
• Coll