Fung and colleagues observed improvements in Prescribing Safety Assessment (PSA) pass rates following targeted teaching interventions in their medical school cohorts. They showed that the PSA captured the effect of the intervention, providing further evidence of its validity as a test of prescribing competence and of its sensitivity as an instrument for measuring it. Their work also demonstrates that a dedicated prescribing assessment can influence the design of student learning pathways and encourage the development of authentic prescribing tasks in undergraduate medical training. In contrast, the Applied Knowledge Test (AKT), which necessarily tests multiple domains of practice and knowledge, exclusively with single-best-answer (SBA) questions, did not detect the educational impact of the intervention. The PSA focuses on a range of prescribing skills, including drug selection, dosing, and timing and frequency of administration, as well as providing information to patients, recognising adverse drug reactions, performing dose calculations, monitoring, reviewing prescriptions and planning management. It uses a testing approach that aligns more closely with prescribing practice: simulation of electronic prescribing in ‘white space’ fields, which avoids the potential cueing effects of SBA questions, and a semi-open-book design in which candidates use the key resource they will use in practice, the British National Formulary. Fung and colleagues’ findings underscore the importance of passing the PSA as a measure of competence to enter practice, and the complementarity of the PSA and the AKT in determining this.