Assessments
Exams alone are not enough.
The University prefers practical builds, observed task performance, review calibration, capstone defence, and supervised research over trivia-heavy testing.
Practical builds
Reconstruct a workspace, produce a research synthesis, or solve an operational problem under observation.
Observed task performance
Complete assigned tasks from the work queue. Quality, accuracy, and integrity are measured against rubrics.
Review calibration
Assess published manuscripts against the journal rubric. Your reviews are compared with expert consensus.
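Comparison against expert consensus can be quantified per rubric criterion. The sketch below is an illustrative assumption, not the University's actual scoring method: it computes a reviewer's mean absolute deviation from the consensus scores on the criteria both have rated.

```python
# Hypothetical review-calibration metric: mean absolute deviation of a
# reviewer's rubric scores from expert consensus. Criterion names and
# score scales here are illustrative, not the journal's real rubric.

def calibration_error(reviewer: dict[str, float],
                      consensus: dict[str, float]) -> float:
    """Average absolute gap from consensus across shared criteria."""
    shared = reviewer.keys() & consensus.keys()
    if not shared:
        raise ValueError("no overlapping rubric criteria")
    return sum(abs(reviewer[c] - consensus[c]) for c in shared) / len(shared)

reviewer = {"rigour": 4.0, "provenance": 3.0, "clarity": 5.0}
consensus = {"rigour": 4.5, "provenance": 3.5, "clarity": 4.0}
print(round(calibration_error(reviewer, consensus), 2))  # 0.67
```

A lower value means closer agreement with consensus; a threshold for "well calibrated" would be set by the journal, not by this formula.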
Capstone defence
Present and defend a piece of work that demonstrates mastery of the track you have studied.
Supervised research
Produce publishable work under faculty supervision. Assessed on rigour, provenance, and contribution to the commons.
Assessment principles
Assessment should predict downstream institutional value, not measure memorisation.
If a test does not help distinguish who will do high-quality work, it should be redesigned or removed.
Retakes are permitted after a cooling-off period. Certification is evidence of current capability, not a one-shot judgement.
All assessment criteria are published in advance. There are no hidden rubrics.
Specific assessment details are documented within each track. Every module specifies pass criteria and distinction criteria.