Open for Applications

Data Engineer

Kenya
PunditSpace
Remote
Job ID: 0091025

Job Description

About the Role 

We are hiring skilled Data Engineers to join a high-impact team delivering innovative, scalable, and cloud-first data solutions across industries. 

Key Responsibilities 

1. Tools & Experience 

  • Work with Power BI, Azure Data Factory, SSIS, and Databricks. 

  • Evaluate how these tools are used and improve efficiency, ensuring seamless integration and reliable performance. 

2. Data Wrangling & Transformation 

  • Write efficient DAX, SQL/T-SQL, and Python code for complex data transformations. 

  • Implement Row-Level Security (RLS) and region-based data restrictions in SQL. 

3. ETL & Architecture 

  • Design and implement ETL pipelines following best practices and a layered medallion architecture (Bronze/Silver/Gold). 

  • Demonstrate experience in end-to-end ETL design, testing, and deployment using Azure Data Factory and Databricks. 

4. Semantic Models & Security 

  • Develop semantic data models, define relationship cardinality, and configure access privileges. 

  • Apply views, stored procedures, and security logic to enforce data governance and compliance. 

5. Big Data Practices 

  • Handle large-scale fact and dimension tables using effective indexing and partitioning strategies. 

  • Apply CTEs, CTAS, and materialized views to optimize query performance. 


Must-Have Technical Skills 

Primary Technical Proficiencies (All Required) 

  • SQL Server Product Suite: 
    Strong expertise in SSMS, SSIS, SSAS, and SSRS. 

  • Microsoft Azure Cloud: 
    Hands-on experience with Azure Data Factory (ADF) and Logic Apps. 
    Experience implementing CI/CD practices in cloud environments. 

  • Programming for Data: 
    Advanced proficiency in Python and PySpark for data manipulation, transformation, and orchestration. 

Main Technologies (Mandatory) 

  • Azure Databricks & Spark Ecosystem: 
    Strong knowledge of Spark, Spark SQL, Delta Tables, PySpark, and Lakehouse Architecture. 

  • Data Integration: 
    Proven experience with Azure Data Factory (ADF) for orchestration and transformation pipelines. 

Nice to Have (Preferred but Not Mandatory) 

  • Experience with Azure Data Lake and understanding of Lakehouse design principles. 

  • Hands-on skills in DAX and Power BI for data modeling and visualization. 

  • Familiarity with Microsoft Fabric and ADF Dataflows for advanced data integration scenarios. 

Job Details

Application Deadline

October 27, 2025