Project Summary

Mbarara University of Science & Technology (MUST) relied on scattered portals, PDFs and word-of-mouth to answer routine student questions—slowing onboarding and overloading staff. MISA (MUST Intelligent Student Advisor) introduces a 24/7 virtual assistant backed by a curated single source of truth.

Although still in development, the centralisation effort has already streamlined internal processes and raised data-quality standards across departments.

Client Profile

Industry & Sector

Public higher-education university

Size & Geography

≈5,000 students, 300 staff, 2 campuses in south-western Uganda

Digital Maturity

Modernising—core LMS in place, limited AI adoption

Existing Platforms

Moodle LMS, legacy intranet portals, assorted PDF handbooks

Problem Statement

Business Challenges

Students spent too much time hunting for course policies, timetables and fees across disparate systems; staff fielded repetitive queries.

Technical Blockers

Legacy portals lacked APIs, data lived in silos and MUST had little applied-AI expertise.

Strategic Driver

As an aspiring regional tech hub, MUST sought a GenAI showcase that would modernise support while signalling innovation.

Solution & Approach

Knowledge Centralisation & Curation

A cross-functional panel of key users and process owners now harvests information from syllabi, handbooks, calendars and websites. We delivered a bespoke knowledge-management platform—built to grow with new content and taxonomy rules.

Virtual Assistant Setup

  • Channel: Web widget embedded in the student portal
  • Language: English (with future local-language expansion roadmap)
  • Conversational engine: RAG pipeline powered by OpenAI LLMs, grounded in the curated vector store
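
To make that flow concrete, the sketch below shows how a question can be embedded, matched against the curated pgvector store and answered with an OpenAI model. The table name, model choices and connection string are illustrative assumptions, not MISA's production configuration.

```python
# Minimal sketch of the RAG request flow: embed the question, retrieve the
# closest curated passages from pgvector, then generate a grounded answer.
# Table name, models and connection string are placeholders, not MISA's setup.
import psycopg
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def answer(question: str) -> str:
    # 1. Embed the student's question.
    emb = client.embeddings.create(model="text-embedding-3-small", input=question)
    vector = "[" + ",".join(str(x) for x in emb.data[0].embedding) + "]"

    # 2. Retrieve the top matching passages from the curated vector store.
    with psycopg.connect("dbname=misa_kb") as conn:
        rows = conn.execute(
            "SELECT content FROM documents ORDER BY embedding <=> %s::vector LIMIT 5",
            (vector,),
        ).fetchall()
    context = "\n\n".join(r[0] for r in rows)

    # 3. Ask the LLM to answer using only the retrieved context.
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer only from the provided MUST knowledge base."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content
```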

Service & Technology Stack

Service category: AI Solutions (underpinned by our GISE™ methodology)

Next.js · PostgreSQL · pgvector · OpenAI · Python · Docker

Delivery Process

Incremental, phased roll-out: discovery → PoC → knowledge-base build → assistant pilot → campus-wide launch.

Outcome & Impact (to date)

While the assistant itself is still in pilot, early wins include:

Unified Knowledge Base Live

300+ documents consolidated, de-duplicated and tagged.

Cultural Shift

Departments now review content freshness quarterly, improving data quality.

Process Streamlining

Fewer inter-department email chains as staff reference the shared knowledge hub.

*Quantitative KPIs will be captured post-go-live (Q4 2025).

Technical Highlights

Architecture

Stateless Next.js front-end ↔ API layer ↔ Postgres/pgvector store; OpenAI for generation, with small custom-trained models handling intent classification to cut latency.
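
As an illustration of the storage side of that architecture, the following sketch sets up a pgvector-backed documents table with an approximate-nearest-neighbour index. The column layout, embedding dimension and index choice are assumptions for illustration rather than the deployed schema.

```python
# Hypothetical pgvector schema and index behind the store; the embedding
# dimension (1536, matching text-embedding-3-small) and the HNSW index are
# assumptions for illustration, not the deployed configuration.
import psycopg

STATEMENTS = [
    "CREATE EXTENSION IF NOT EXISTS vector",
    """CREATE TABLE IF NOT EXISTS documents (
           id            bigserial PRIMARY KEY,
           source        text UNIQUE NOT NULL,   -- e.g. handbook, timetable, web page
           content       text NOT NULL,          -- chunked passage shown to the LLM
           content_hash  text NOT NULL,          -- used by the nightly ingestion diff
           embedding     vector(1536)
       )""",
    # Approximate-nearest-neighbour index keeps similarity search fast.
    """CREATE INDEX IF NOT EXISTS documents_embedding_idx
           ON documents USING hnsw (embedding vector_cosine_ops)""",
]

with psycopg.connect("dbname=misa_kb") as conn:
    for statement in STATEMENTS:
        conn.execute(statement)
    conn.commit()
```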

Content Ingestion

Automated schedulers scrape PDFs, calendars and public web pages nightly; diffs trigger re-embedding only on change.
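
Change detection of this kind is commonly implemented by hashing each source and re-embedding only when the hash differs. The sketch below illustrates the idea; the column names follow the schema sketch above and are not the production ingestion jobs.

```python
# Sketch of hash-based change detection: a document is re-embedded only when
# its content hash differs from the stored one. Column names follow the schema
# sketch above; `client` is an OpenAI client as in the earlier example.
import hashlib


def sync_document(conn, client, source_id: str, text: str) -> None:
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()

    row = conn.execute(
        "SELECT content_hash FROM documents WHERE source = %s", (source_id,)
    ).fetchone()
    if row and row[0] == digest:
        return  # unchanged since the last nightly run: skip re-embedding

    emb = client.embeddings.create(model="text-embedding-3-small", input=text)
    vector = "[" + ",".join(str(x) for x in emb.data[0].embedding) + "]"
    conn.execute(
        """INSERT INTO documents (source, content, content_hash, embedding)
           VALUES (%s, %s, %s, %s::vector)
           ON CONFLICT (source) DO UPDATE
               SET content = EXCLUDED.content,
                   content_hash = EXCLUDED.content_hash,
                   embedding = EXCLUDED.embedding""",
        (source_id, text, digest, vector),
    )
```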

Latency Optimisation

Edge caching for unchanged answers; lighter distilled models for classification before escalating to the full LLM.
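
A simplified view of that escalation path is sketched below; the keyword classifier, canned responses and in-process cache are stand-ins for the distilled model and edge cache actually used.

```python
# Simplified two-tier answer path: cache first, a cheap intent check next, and
# the full RAG pipeline only when needed. The keyword classifier, canned
# responses and in-process cache are stand-ins for the real components.
from functools import lru_cache

CANNED_RESPONSES = {
    "library_hours": "Placeholder: link to the library opening-hours page.",
}


def classify_intent(question: str) -> str:
    # Stand-in for the small distilled classification model.
    return "library_hours" if "library" in question.lower() else "other"


@lru_cache(maxsize=1024)  # stand-in for edge caching of unchanged answers
def cached_answer(question: str) -> str:
    intent = classify_intent(question)
    if intent in CANNED_RESPONSES:
        return CANNED_RESPONSES[intent]  # answered without calling the full LLM
    return answer(question)  # escalate to the RAG pipeline sketched earlier
```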

Reflection & Lessons Learned

Phased delivery proved ideal, though we underestimated stakeholder enthusiasm: rapid feedback loops at MUST let us shorten approval cycles and iterate faster than planned.

Ready to transform your institution with AI?

Book a discovery call