New Era Technology

New Era Technology - Data Platform Engineer III / MS Fabric

Remote - India · 1mo ago
Remote · Mid · APAC · Payments · Cloud Computing · Platform Engineer · Data Engineer · Python · SQL · Azure · Synapse · Power BI


Requirements

• Strong hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, Data Warehouse, Notebooks)
• Expertise in PySpark and Spark-based data processing
• Strong programming skills in Python
• Advanced knowledge of SQL and database optimization
• Experience with ETL/ELT pipeline development
• Understanding of data modeling concepts and data warehousing principles
• Experience working with large-scale structured and unstructured data
• Experience with Azure Data Factory / Azure Synapse / Databricks
• Knowledge of Power BI integration with Microsoft Fabric
• Familiarity with CI/CD pipelines and DevOps practices
• Experience with data governance and security frameworks
• Exposure to real-time or streaming data processing

Educational Qualification

• Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field
• Certification in Microsoft Fabric or Azure Data Engineering
• Experience working in cloud-based data platforms
• Shift Timings: 2PM to 11PM IST

📩 Interested candidates may share their updated resumes to: [email protected]

New Era Technology, LLC., and its subsidiaries (“New Era”, “we”, “us”, or “our”) in its operating regions worldwide are committed to respecting your privacy and recognize the need for appropriate protection and management of any Personal Data that you may provide us. In this, we are also committed to providing you with a positive experience on our websites and while using our products, services and solutions (“Solutions”).

View our Privacy Policy here: https://www.neweratech.com/us/privacy-policy/

We never ask candidates to pay any fees at any point in our hiring process. If you are ever asked to provide payment for training, certification, equipment, or any other purpose, it is not from our company. Only communications from our official company channels should be trusted. Please note our official email domain is @neweratech.com. If you suspect fraudulent activity, please contact us immediately at [email protected].
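For context on what the ETL/ELT and SQL-optimization requirements above typically involve in practice, here is a small illustrative sketch (not part of the posting, and not Fabric-specific) using Python's built-in sqlite3; all table and column names are invented for the example:

```python
import sqlite3

# Extract: toy rows as they might arrive from an upstream API or file.
raw_orders = [
    ("ORD-1", "2024-05-01", 120.0),
    ("ORD-2", "2024-05-01", 80.5),
    ("ORD-3", "2024-05-02", 200.0),
]

# Load into a staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, order_date TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", raw_orders)

# Transform: an aggregation of the kind a warehouse or Lakehouse layer serves.
# Indexing the grouping column is a basic database-optimization step.
conn.execute("CREATE INDEX idx_orders_date ON orders(order_date)")
daily = conn.execute(
    "SELECT order_date, SUM(amount) FROM orders "
    "GROUP BY order_date ORDER BY order_date"
).fetchall()
print(daily)  # [('2024-05-01', 200.5), ('2024-05-02', 200.0)]
```

In a Fabric environment the same extract → load → transform shape would run against Lakehouse tables with PySpark or T-SQL rather than sqlite3.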

Responsibilities

• Design, develop, and maintain data pipelines and ETL/ELT processes using Microsoft Fabric and PySpark.
• Build and manage Lakehouse and Data Warehouse solutions within the Microsoft Fabric ecosystem.
• Develop scalable data processing workflows using Python and PySpark.
• Write optimized SQL queries for data transformation, analysis, and performance tuning.
• Integrate data from various sources such as APIs, databases, cloud storage, and streaming platforms.
• Implement data modeling techniques to support analytics and reporting requirements.
• Ensure data quality, governance, and security across the data platform.
• Monitor and optimize data pipeline performance and reliability.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
• Document architecture, workflows, and technical processes.
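The pipeline responsibilities above follow the usual extract → transform → load shape. A minimal, framework-free Python sketch of that shape (all function names and data invented for illustration; a real Fabric pipeline would distribute the transform step with PySpark):

```python
from typing import Dict, Iterable, List, Tuple

def extract() -> Iterable[dict]:
    # Stand-in for reading from an API, database, or cloud storage.
    yield {"user": "a", "clicks": 3}
    yield {"user": "b", "clicks": 0}
    yield {"user": "a", "clicks": 2}

def transform(rows: Iterable[dict]) -> Dict[str, int]:
    # Aggregate clicks per user -- the kind of step PySpark would
    # run in parallel across a cluster at scale.
    totals: Dict[str, int] = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0) + row["clicks"]
    return totals

def load(totals: Dict[str, int], sink: List[Tuple[str, int]]) -> None:
    # Stand-in for writing to a Lakehouse or Warehouse table.
    sink.extend(sorted(totals.items()))

sink: List[Tuple[str, int]] = []
load(transform(extract()), sink)
print(sink)  # [('a', 5), ('b', 0)]
```

Keeping the three stages as separate functions mirrors how pipeline steps are monitored and retried independently in orchestration tools such as Data Factory.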
