S4D CONSULTING LLC
Senior Data Architect / Engineer
Role Overview
S4D Consulting LLC is seeking a Senior Data Architect / Engineer to lead the design and delivery of the Predictive Stock Intelligence Engine under the Zambia National Digital Health Intelligence Hub. The role focuses on data architecture, data modelling, and the engineering of robust data pipelines and storage infrastructure to support commodity stock-out prediction and supply chain decision-making across the Ministry of Health (MoH). The incumbent will work within S4D's delivery team and operate in the MoH technical environment.
Key Responsibilities
Data Architecture & Platform Design
• Design and document the end-to-end data platform architecture, covering ingestion, storage, transformation, and data serving layers.
• Define and enforce a medallion data layering strategy (Bronze, Silver, Gold) within a SQL-based data warehouse or lakehouse environment.
• Select and configure core platform technologies (e.g. PostgreSQL, Delta Lake, Apache Airflow) in line with MoH ICT standards and project requirements.
• Develop and maintain data models, entity-relationship diagrams, and data dictionaries for all platform datasets.
• Establish and govern data contracts between source systems and the central warehouse to ensure consistency and reliability.
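The Bronze/Silver/Gold layering named above can be sketched as follows. This is a minimal illustration using SQLite; the table names, columns, and sample commodity records are assumptions for the sketch, not the project's actual schema.

```python
# Minimal sketch of a medallion (Bronze/Silver/Gold) flow using SQLite.
# Table and column names (bronze_stock, facility_id, qty) are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: raw ingested records, kept as received (including invalid rows).
cur.execute("CREATE TABLE bronze_stock (facility_id TEXT, item TEXT, qty TEXT)")
cur.executemany(
    "INSERT INTO bronze_stock VALUES (?, ?, ?)",
    [("F001", "ORS", "120"), ("F001", "ORS", "-5"), ("F002", "ORS", "80")],
)

# Silver: cleaned and typed -- rows failing basic validity checks are dropped.
cur.execute(
    """CREATE TABLE silver_stock AS
       SELECT facility_id, item, CAST(qty AS INTEGER) AS qty
       FROM bronze_stock
       WHERE CAST(qty AS INTEGER) >= 0"""
)

# Gold: aggregated, analysis-ready table for stock-out modelling.
cur.execute(
    """CREATE TABLE gold_stock AS
       SELECT item, SUM(qty) AS total_qty, COUNT(*) AS n_facilities
       FROM silver_stock
       GROUP BY item"""
)

print(cur.execute("SELECT * FROM gold_stock").fetchall())  # [('ORS', 200, 2)]
```

The same progression (raw, validated, aggregated) applies whether the warehouse is PostgreSQL or a Delta Lake lakehouse; only the storage and DDL syntax change.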
Data Engineering & Pipeline Development
• Architect and build ETL/ELT pipelines to ingest MoH supply chain datasets (stock levels, consumption records, and delivery data) into the central warehouse.
• Develop and schedule pipeline workflows using Apache Airflow, Prefect, or equivalent orchestration tooling with Python 3.10+.
• Implement data quality checks, lineage tracking, and error-handling across all ingestion and transformation processes.
• Optimise query performance and storage efficiency across the data warehouse, with attention to scalability as data volumes grow.
• Version-control all pipeline and transformation code using Git, with documented deployment and rollback procedures.
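The quality-check and error-handling pattern described above can be sketched in plain Python. The function names, record shape, and quarantine approach are illustrative assumptions; in the actual platform each step would run as an Airflow or Prefect task rather than a direct function call.

```python
# Extract -> validate -> load sketch with a data quality gate that
# quarantines bad rows instead of failing the whole load.
from datetime import date


def extract() -> list[dict]:
    # Stand-in for pulling records from a source system export.
    return [
        {"facility_id": "F001", "item": "ORS", "qty": 120, "report_date": date(2026, 3, 1)},
        {"facility_id": "F002", "item": "ORS", "qty": None, "report_date": date(2026, 3, 1)},
    ]


def validate(records: list[dict]) -> tuple[list[dict], list[dict]]:
    # Quality gate: route rows with missing or negative quantities to a
    # quarantine list for later inspection.
    good, bad = [], []
    for r in records:
        (good if isinstance(r["qty"], int) and r["qty"] >= 0 else bad).append(r)
    return good, bad


def load(records: list[dict]) -> int:
    # Stand-in for writing validated rows to the warehouse; returns row count.
    return len(records)


good, quarantined = validate(extract())
loaded = load(good)
print(loaded, len(quarantined))  # 1 1
```

Keeping each step a pure function of its inputs is what makes the pipeline easy to schedule, retry, and lineage-track under an orchestrator.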
Data Governance & Documentation
• Produce and maintain comprehensive technical documentation for the platform architecture, data models, pipeline specifications, and data lineage.
• Apply data governance practices including data classification, metadata management, and compliance with MoH data policies.
• Conduct architecture reviews and contribute to technical governance forums within the project.
• Provide technical input to project reporting and stakeholder briefings as required.
Qualifications & Experience
Essential
• Bachelor of Science in Computer Science, Software Engineering, or a closely related field.
• Minimum 6 years of experience in cloud data engineering and data architecture at production scale.
• Demonstrated experience designing and implementing data warehouse or lakehouse solutions, including medallion-layer data modelling.
• Strong proficiency in Python (3.10+) and SQL for data engineering and transformation work.
• Solid experience with cloud data platforms (e.g. AWS, Azure, or GCP), including managed database, storage, and compute services.
• Hands-on experience with pipeline orchestration tools such as Apache Airflow or Prefect.
• Strong knowledge of relational databases (PostgreSQL, MySQL, or SQL Server) in production environments.
• Proficiency with Git for version control of data platform code and configuration.
Desirable
• Experience working within public health, government, or NGO data environments in sub-Saharan Africa.
• Experience with health commodity management or supply chain systems (e.g. eLMIS, DHIS2).
• Knowledge of Delta Lake or open-table formats.
• Experience with PySpark for large-scale data transformation.
• Exposure to data catalogue or metadata management tools.
Technical Skills Summary
Competencies & Personal Attributes
• Strong systems-thinking ability, with a track record of translating business requirements into scalable data architecture.
• Detail-oriented approach to data modelling, documentation, and code quality.
• Ability to work independently and manage delivery in resource-constrained environments.
• Clear written and verbal communication skills, with the ability to convey technical concepts to non-technical stakeholders.
• Collaborative and constructive in cross-functional teams alongside analysts, engineers, and programme staff.
• Commitment to data quality, platform reliability, and sound engineering practice.
Performance Indicators
• Data platform architecture documented and approved by technical leads within the agreed onboarding period.
• ETL/ELT pipelines operational and reliably loading MoH supply chain data into the central warehouse with documented quality checks.
• Medallion-layer data models (Bronze, Silver, Gold) implemented and maintained with current data dictionaries.
• All pipeline and architecture code version-controlled and documented in accordance with project standards.
• Technical documentation and data governance artefacts produced and kept up to date throughout the engagement.
Application Information
Candidates should submit a CV and cover letter to hrmanagement@s4dconsulting.com. Closing date: April 8, 2026. Applications should reference specific projects or platforms the candidate has designed or built. Submissions that do not evidence the required technical background will not be progressed.
Shortlisted candidates will complete a technical assessment before the interview stage.
