Timing: 10:00 AM - 07:00 PM (Pakistan Standard Time)
Employment Type: Full Time (Onsite)
Location: Lahore
No. of positions: 1
Key Responsibilities:
- Analyze, design, and build modern data solutions using Microsoft Fabric.
- Explore existing implementations and determine the impact of new changes on existing applications.
- Extract, transform, and load data from different sources into Microsoft Fabric OneLake/Lakehouse/Warehouse.
- Create notebooks, Dataflow Gen2 dataflows, and pipelines in Microsoft Fabric.
- Create semantic models and write relational database queries.
- Develop and deploy comprehensive data analytics solutions in Microsoft Fabric, ensuring scalability and performance.
- Utilize deep knowledge of Lakehouse architecture to manage and store vast amounts of structured and unstructured data efficiently.
- Write clean, efficient, and optimized Python code in Notebooks to perform complex data transformations and manipulation tasks.
- Work with PySpark to handle and process large volumes of data, ensuring optimal performance in Spark environments.
- Design, develop, and implement end-to-end Power BI solutions on Microsoft Fabric, including dashboards, reports, and data visualizations.
- Leverage knowledge of CI/CD pipelines in Azure DevOps to automate deployment processes and ensure smooth integration of solutions into production environments.
- Create and manage ETL pipelines in Microsoft Fabric to streamline data processing, extraction, transformation, and loading across various data sources.
Minimum Skills Required:
- Azure Data Lake, Data Factory, Azure Storage
- Azure SQL Database
- Azure DevOps, Power BI
- SQL Server 2019, SQL Server 2017, SQL Server 2016
- Analytical problem solving
- Experience with Microsoft Fabric
- Excellent communication skills
Qualification:
- BS in Computer Science
- 3–4 years of experience
- Certification in Azure is a plus
- Certification in Microsoft Fabric is a plus
- Experience in the insurance domain is a plus