Junior Data Engineer & Analytics

Position: Junior Data Engineer & Analytics

Organization: HABTech Solutions PLC

HABTech Solutions PLC is looking for a motivated Junior Data Engineer & Analytics professional to join our team in Addis Ababa. This hybrid role combines data engineering and analytics to support organizational operations and data-driven decision-making.

The ideal candidate is someone who can build robust data pipelines, manage databases, analyze datasets, and create insightful reports and dashboards. The role also involves documentation, collaboration with cross-functional teams, and contributing to machine learning initiatives under guidance.

Key Responsibilities

Data Engineering

  • Design, build, and maintain data pipelines and ETL processes (ingestion, linking, transformation).

  • Work with open-source tools such as Apache Airflow, Airbyte, and dbt, and Python libraries such as pandas.

  • Manage and optimize relational and columnar databases (e.g., PostgreSQL, ClickHouse).

  • Ensure database structure supports efficient storage and data processing.

Data Analytics

  • Clean, prepare, and transform raw datasets for analysis.

  • Perform statistical analysis to identify patterns, trends, and actionable insights.

  • Create interactive dashboards and reports using tools such as Power BI or Apache Superset.

  • Support early-stage machine learning models and experimentation.

Quality, Compliance & Documentation

  • Implement data quality checks and validation procedures.

  • Ensure data privacy and security compliance.

  • Prepare documentation and support internal/external training sessions.

Collaboration

  • Work closely with internal teams and external stakeholders (e.g., public health offices and digital systems teams).

  • Translate technical insights into clear, understandable outputs for non-technical audiences.

Job Requirements

Required Skills & Qualifications

Education

  • Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field.

Technical Skills

  • Strong proficiency in Python and SQL for data manipulation.

  • Hands-on experience with ETL/workflow orchestration tools:

    • Apache Airflow, Airbyte, dbt

  • Familiarity with data visualization tools:

    • Power BI, Apache Superset

  • Understanding of relational and columnar databases:

    • PostgreSQL, ClickHouse

Interpersonal & Soft Skills

  • Strong analytical and problem-solving ability.

  • Excellent communication and teamwork skills.

  • Ability to explain technical concepts to non-technical audiences.

  • Eagerness to learn and stay updated with new data technologies.

How To Apply

Qualified candidates should submit their applications to HABTech’s official job application email:

jobs@habtechsolution.com

Important Application Instructions:

  • Compile all application files (CV, cover letter, certificates) into one PDF document before submitting.

  • Only shortlisted candidates will be contacted.

Application Deadline: December 10, 2025

We also invite you to learn more about us by visiting our website:

https://www.habtechsolution.com


Location: Addis Ababa

Number of Positions: 1