Data Engineer
Requirements
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 3+ years of proven experience as a Data Engineer / BI Engineer.
• Data modeling (semantic layer).
• Power Query (M).
• Dashboard/report optimization.
• Strong hands-on experience with SQL (MySQL / PostgreSQL preferred) and query optimization.
• Proficiency in Python for data transformation, automation, and integrations.
• Experience with data warehousing / lakehouse principles and performance-tuning concepts.
• Databricks, Spark, PySpark.
• APIs / data integrations (Flask or similar frameworks).
• Utilities & tools: logging, requests, subprocess, regex, pytest.
• Familiarity with Git and modern collaborative development workflows.
• Comfortable working in Linux and writing basic shell scripts.
• Strong problem-solving skills and attention to detail.
• Excellent communication skills and the ability to work with cross-functional stakeholders.
• Adaptability and willingness to learn new tools/technologies as needed.

Our values (ELITE):
• E - Empowering: Enabling individuals to reach their full potential
• L - Leadership: Taking initiative and guiding each other toward success
• I - Innovation: Embracing creativity and new ideas to stay ahead
• T - Teamwork: Collaborating with empathy to achieve common goals
• E - Excellence: Striving for the highest quality in everything we do
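The utilities listed above (logging, regex, pytest) typically come together in small validation helpers inside a pipeline. A minimal illustrative sketch — the function, pattern, and logger name are hypothetical, not part of this role's codebase:

```python
import logging
import re

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")  # hypothetical logger name

# Hypothetical helper: parse an ISO date field with a regex,
# logging and skipping rows that fail validation.
DATE_RE = re.compile(r"^(\d{4})-(\d{2})-(\d{2})$")

def parse_iso_date(value: str):
    """Return (year, month, day) as ints, or None if the value is malformed."""
    m = DATE_RE.match(value.strip())
    if not m:
        log.warning("skipping malformed date: %r", value)
        return None
    return tuple(int(g) for g in m.groups())

# pytest-style checks simply assert on the helper's outputs:
def test_parse_iso_date():
    assert parse_iso_date("2024-01-31") == (2024, 1, 31)
    assert parse_iso_date("31/01/2024") is None
```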
Responsibilities
Data Engineering & Pipeline Development
• Design, develop, and maintain scalable ETL/ELT pipelines to process large volumes of data from multiple sources.
• Build and optimize data lakes / data warehouses / lakehouse structures for efficient reporting and analytics.
• Integrate structured and unstructured data from internal and external systems to create a unified and analysis-ready layer.
• Ensure data accuracy, consistency, and completeness through validation, cleansing, transformation, and reconciliation checks.
• Maintain strong documentation for data pipelines, datasets, Power BI models, and reporting logic.

Power BI Reporting & Semantic Modeling (Core Focus)
• Develop and maintain Power BI dashboards and reports for business and leadership stakeholders.
• Build optimized semantic models in Power BI (star/snowflake schema) for performance and scalability.
• Write and optimize DAX measures, calculated columns, and KPIs aligned to business metrics.
• Use Power Query (M) for transformations and ensure reporting datasets are refresh-ready and reliable.
• Manage Power BI deployments via the Power BI Service, including scheduled refresh, workspace access, and report sharing.
• Improve dashboard performance by optimizing data model design, relationships, and aggregations.

Requirements Gathering & Business Alignment
• Collaborate with product managers and business stakeholders to gather reporting requirements and translate them into technical deliverables.
• Participate in analysis sessions to understand business needs and propose best-fit BI/data solutions.
• Provide technical recommendations to improve reporting accuracy, usability, and speed of decision-making.

Agile Delivery
• Work in Agile development cycles, including sprint planning, daily stand-ups, and sprint reviews.
• Deliver features on time by managing priorities, dependencies, and scope in a fast-paced environment.
Testing, Debugging & Reliability
• Perform thorough testing of pipelines and reports to ensure reliability, performance, and data correctness.
• Write unit tests and validations for transformations and pipeline outputs.
• Conduct integration testing to ensure end-to-end reporting flows work as intended.
• Identify and resolve defects, performance bottlenecks, and model/report issues proactively.

Continuous Improvement & Innovation
• Stay up to date on trends in data engineering, BI analytics, and Power BI capabilities.
• Recommend improvements to enhance the scalability, maintainability, and performance of the data/reporting ecosystem.
• Continuously optimize existing codebases, ETL processes, and Power BI dashboards for better efficiency.
• Stay familiar with cloud platforms such as Azure / AWS / GCP and modern data-engineering practices.
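The star/snowflake semantic modeling described above has a simple relational shape: fact tables keyed to dimension tables, with measures aggregating across the join. A minimal sketch using Python's built-in sqlite3 — all table and column names here are hypothetical, for illustration only:

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
# Table/column names are hypothetical, chosen for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        category   TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        amount     REAL NOT NULL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "Books"), (2, "Games")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(10, 1, 12.5), (11, 1, 7.5), (12, 2, 30.0)])

def sales_by_category(conn):
    """Aggregate the fact table through the dimension -- the same shape
    a KPI measure in a Power BI semantic model takes."""
    return conn.execute("""
        SELECT d.category, SUM(f.amount)
        FROM fact_sales f
        JOIN dim_product d USING (product_id)
        GROUP BY d.category
        ORDER BY d.category
    """).fetchall()
```

Keeping measures on the fact table and descriptive attributes on dimensions is what lets the model stay fast as data volume grows.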
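The reconciliation and completeness checks mentioned in these responsibilities can be as simple as comparing key sets between a source extract and the loaded target. A minimal sketch — the function name and row shape are hypothetical:

```python
def reconcile_keys(source_rows, target_rows, key="id"):
    """Return keys present in the source but missing from the target --
    the basic completeness check run after a pipeline load."""
    src = {row[key] for row in source_rows}
    tgt = {row[key] for row in target_rows}
    return sorted(src - tgt)

def test_reconcile_keys():
    source = [{"id": 1}, {"id": 2}, {"id": 3}]
    target = [{"id": 1}, {"id": 3}]
    # Row 2 was dropped somewhere in the pipeline.
    assert reconcile_keys(source, target) == [2]
    # A clean load reconciles to an empty list.
    assert reconcile_keys(source, source) == []
```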