Dijital Team - Senior Data Engineer
Requirements
• 4+ years of experience in Data Engineering
• Strong hands-on experience implementing data migration and processing solutions using Azure services (the more areas of experience, the better), such as: Azure Storage, Azure SQL DB / Data Warehouse, Azure Data Factory, Azure Stream Analytics, Azure Analysis Services, Azure Databricks, Azure Data Catalog, Event Hub, Cosmos DB, Azure Functions, ARM Templates
• Hands-on experience with Microsoft Fabric
• Experience working with serverless architectures and cloud-based data platforms
• Knowledge of big data, analytics, database management, and information processing
• Experience with IoT, event-driven architectures, or microservices-based cloud solutions
• Exposure to ML Studio, AI/ML workflows, or advanced analytics platforms
• Excellent communication skills; able to interact clearly with clients and peers
• Experience presenting to groups and facilitating workshops
• Experience leading and mentoring technical staff
• Background in consulting and experience working on client projects
• Consulting or client service delivery experience on Azure
• Ability to work remotely via Microsoft Teams
• Aptitude for critical and analytical thinking, and the capacity to leverage out-of-the-box solutions
• Keen interest in a fast-paced environment with opportunities to learn, grow, and be exposed to a variety of problems, clients, and technologies

We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Responsibilities
• Design and implement data models and data architecture solutions to support scalable analytics platforms
• Develop and maintain ELT pipelines, data integration workflows, and data migration processes, particularly using Azure Data Factory
• Design and implement data quality systems and governance processes
• Support DevOps practices, including CI/CD deployments for data solutions
• Develop and maintain solutions within Microsoft Fabric environments
• Analyze business requirements and translate them into technical specifications
• Document solutions, including data models, configurations, and deployment setups
• Work with large datasets to support analytics, reporting, and information analysis initiatives