Data Engineer job at Kivu Choice Ltd

Kivu Choice is the fastest-growing vertically integrated aquaculture company in Rwanda, with the country's largest hatchery, a fish production operation, and a growing number of branches selling fish throughout the country. Over the next five years our plan is to scale to become the largest and most sustainable protein producer in the country, producing and distributing over 50 million fish meals per year across Rwanda, the DRC, and Burundi.

About the Role

We are looking for a skilled and motivated Data Engineer to lead the integration of multiple data sources into our centralized Snowflake data warehouse. This role will design and maintain robust ETL pipelines, support predictive modeling efforts, and develop intuitive dashboards in Power BI to drive insights across the organization.

Key Responsibilities:

  • Build and manage end-to-end ETL pipelines to integrate diverse data sources (APIs, databases, flat files, etc.) into our Snowflake data warehouse.
  • Own and optimize our Snowflake architecture, including data modeling, performance tuning, and access control.
  • Partner with Data Analysts and Business Stakeholders to define and deliver clean, consistent, and reliable data.
  • Design and implement predictive models to support forecasting, optimization, and data-driven decision-making.
  • Develop and maintain Power BI dashboards for monitoring KPIs, operational reporting, and executive insights.
  • Ensure high data quality through validation frameworks, monitoring, and documentation.
  • Automate repetitive tasks and improve efficiency of data workflows using Python and/or orchestration tools.
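
The validation and pipeline work described above can be sketched in plain Python. Everything below is illustrative, not part of the posting: the file layout, column names, and validation rules are invented, and a real pipeline would stage the clean rows into Snowflake rather than print a summary.

```python
import csv
import io

# Hypothetical flat-file extract; column names and values are invented
# for illustration only.
RAW_CSV = """\
farm_id,harvest_date,fish_count,avg_weight_g
F001,2025-09-01,1200,350.5
F002,2025-09-01,,410.0
F003,not-a-date,900,275.2
F004,2025-09-02,1500,390.8
"""

def validate_row(row):
    """Return a typed, cleaned row dict, or None if validation fails."""
    try:
        count = int(row["fish_count"])
        weight = float(row["avg_weight_g"])
    except (TypeError, ValueError):
        return None
    # Basic date sanity check, a stand-in for a fuller validation framework.
    parts = row["harvest_date"].split("-")
    if len(parts) != 3 or not all(p.isdigit() for p in parts):
        return None
    return {
        "farm_id": row["farm_id"],
        "harvest_date": row["harvest_date"],
        "fish_count": count,
        "avg_weight_g": weight,
    }

def run_pipeline(raw_text):
    """Extract rows from CSV text, validate each, split clean vs. rejected."""
    clean, rejected = [], []
    for row in csv.DictReader(io.StringIO(raw_text)):
        cleaned = validate_row(row)
        if cleaned is None:
            rejected.append(row)   # quarantined for review
        else:
            clean.append(cleaned)  # ready to stage into the warehouse
    return clean, rejected

clean, rejected = run_pipeline(RAW_CSV)
print(f"{len(clean)} rows clean, {len(rejected)} rejected")
```

In practice a step like this would run under an orchestrator such as Airflow and load the clean rows via a Snowflake connector, with the rejected rows logged for the monitoring and documentation mentioned above.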

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Statistics, or related discipline.
  • 3+ years of professional experience in data engineering or data platform development.
  • Proven experience working with Snowflake in a production environment is a plus.
  • Expertise in SQL and data modeling for analytics and reporting.
  • Proficiency in Python (or similar language) for scripting, automation, and model development.
  • Strong experience with Power BI including DAX, data transformations, and visual storytelling.
  • Hands-on experience developing predictive models (e.g., regression, classification, time-series).
  • Familiarity with Git and workflow orchestration tools like Airflow, dbt, or similar.
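
As a toy illustration of the regression experience listed above — the weekly figures are invented and the model is deliberately minimal — an ordinary-least-squares trend fit and one-step forecast needs only a few lines of Python:

```python
def ols_fit(xs, ys):
    """Fit y = a + b*x by ordinary least squares; return (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var          # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

# Hypothetical weekly harvest volumes (tonnes) over six weeks.
weeks = [1, 2, 3, 4, 5, 6]
tonnes = [10.0, 11.5, 13.0, 14.5, 16.0, 17.5]

a, b = ols_fit(weeks, tonnes)
forecast_week_7 = a + b * 7
print(f"slope={b:.2f}, forecast for week 7: {forecast_week_7:.1f} tonnes")
```

A production forecasting model would of course use a proper library and richer features, but the same fit-then-extrapolate structure underlies the forecasting and optimization work the role describes.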

Nice to Have:

  • Experience in integrating third-party APIs or using tools like Airbyte, Fivetran, or Azure Data Factory.
  • Knowledge of MLOps practices and deploying models into production environments.
  • Understanding of data governance, data privacy, and security in cloud environments.
  • Experience in industries such as aquaculture or agriculture is a plus.

Submitting your application

If you are interested in this position, prepare the following:
  1. Job application letter
  2. Curriculum Vitae (CV)
  3. Copy of your academic documents
  4. Copy of your ID

Vacancy title:
Data Engineer

[Type: FULL_TIME, Industry: Agriculture, Food, and Natural Resources, Category: Science & Engineering]

Jobs at:
Kivu Choice Ltd

Deadline of this Job:
Friday, October 10 2025

Duty Station:
Kigali | Rwanda

Summary
Date Posted: Thursday, September 11 2025, Base Salary: Not Disclosed


Work Hours: 8

Experience in Months: 36

Level of Education: Bachelor's degree

Job application procedure
Interested in applying for this job? Submit your application through the online application link.


Job Info
Job Category: Data, Monitoring, and Research jobs in Rwanda
Job Type: Full-time
Deadline of this Job: Friday, October 10 2025
Duty Station: Kigali
Posted: 11-09-2025
No of Jobs: 1
Start Publishing: 11-09-2025