Data Engineer job at Kivu Choice Ltd

Vacancy title:
Data Engineer

[Type: FULL_TIME, Industry: Agriculture, Food, and Natural Resources, Category: Science & Engineering]

Jobs at:
Kivu Choice Ltd

Deadline of this Job:
Friday, October 10 2025

Duty Station:
Kigali | Rwanda

Summary
Date Posted: Thursday, September 11 2025, Base Salary: Not Disclosed

Similar Jobs in Rwanda
Learn more about Kivu Choice Ltd
Kivu Choice Ltd jobs in Rwanda

JOB DETAILS:

Kivu Choice is the fastest-growing vertically integrated aquaculture company in Rwanda, operating the country's largest hatchery alongside a fish production operation and a growing number of branches selling fish throughout the country. Over the next five years, we plan to scale to become the largest and most sustainable protein producer in the country, producing and distributing over 50 million fish meals per year across Rwanda, the DRC, and Burundi.

About the Role

We are looking for a skilled and motivated Data Engineer to lead the integration of multiple data sources into our centralized Snowflake data warehouse. This role will design and maintain robust ETL pipelines, support predictive modeling efforts, and develop intuitive dashboards in Power BI to drive insights across the organization.

Key Responsibilities:

  • Build and manage end-to-end ETL pipelines to integrate diverse data sources (APIs, databases, flat files, etc.) into our Snowflake data warehouse.
  • Own and optimize our Snowflake architecture, including data modeling, performance tuning, and access control.
  • Partner with Data Analysts and Business Stakeholders to define and deliver clean, consistent, and reliable data.
  • Design and implement predictive models to support forecasting, optimization, and data-driven decision-making.
  • Develop and maintain Power BI dashboards for monitoring KPIs, operational reporting, and executive insights.
  • Ensure high data quality through validation frameworks, monitoring, and documentation.
  • Automate repetitive tasks and improve efficiency of data workflows using Python and/or orchestration tools.

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Statistics, or related discipline.
  • 3+ years of professional experience in data engineering or data platform development.
  • Proven experience working with Snowflake in a production environment is a plus.
  • Expertise in SQL and data modeling for analytics and reporting.
  • Proficiency in Python (or similar language) for scripting, automation, and model development.
  • Strong experience with Power BI including DAX, data transformations, and visual storytelling.
  • Hands-on experience developing predictive models (e.g., regression, classification, time series).
  • Familiarity with Git and workflow orchestration tools like Airflow, dbt, or similar.

Nice to Have:

  • Experience in integrating third-party APIs or using tools like Airbyte, Fivetran, or Azure Data Factory.
  • Knowledge of MLOps practices and deploying models into production environments.
  • Understanding of data governance, data privacy, and security in cloud environments.
  • Experience in industries such as aquaculture or agriculture is a plus.

Submitting your application

If you are interested in this position, prepare the following:
  1. Job application letter 
  2. Curriculum Vitae (CV)
  3. Copy of your academic documents
  4. Copy of your ID

 

Work Hours: 8

Experience in Months: 36

Level of Education: Bachelor's degree

Job application procedure
Interested in applying for this job? Click here to submit your application now.


Job Info
Job Category: Data, Monitoring, and Research jobs in Rwanda
Job Type: Full-time
Deadline of this Job: Friday, October 10 2025
Duty Station: Kigali
Posted: 11-09-2025
No of Jobs: 1
Start Publishing: 11-09-2025
