Accounting and Finance Verification Group 266
Central Verification Demystified
2010-2011 Session
• June 2011 was the first Central Verification session for the 2010 framework.
• The only Unit from the 2010 framework which was under scrutiny here was Graded Unit 1.
• Why June?
The purpose
• To ensure that standards are met and maintained
• Standardisation across ALL centres
Sample selection
• How is the selection made?
• Centres offering for the first time
• ‘Hold’ from previous submissions
• Random sample
Behind closed doors!
• Based on the size of the selection, the EV team decide how long the process will need
• The EV team discuss any issues arising in our own centres, or issues that centres have notified us of in comment forms or queries
• The EV team look to see which assessment instruments (AIs) have been used
• Check that Assessment Exemplars have been used or that the AI has been Prior Verified.
First steps
• Identify the areas of expertise in the room!
• Sort submissions into Units
• Ensure that there is a fair spread of the workload
Initial Review
• Look for the following inside the packs:
• Sample sheet – VS00
• Complete list of all candidates and grades
• IV records
• Exemplar/AI used with solutions
Sample Sheet and Class Lists – why?
• To ensure centres have sent a sample which includes a range of marks (see the sketch after this list)
• To ensure that centres have sent a representative sample of the whole cohort
• To allow the EV team to identify how many candidates continue on to complete the GU.
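Purely as an illustration of this kind of check, not any actual SQA or EV tool, a minimal sketch of confirming that a sample spans the cohort's mark range might look like the following. The function names, the 100-mark scale and the band boundaries are all assumptions.

```python
# Illustrative sketch only: checks that a submitted sample covers the
# low/middle/high mark bands present in the whole cohort.
# Band boundaries, the 100-mark scale and all names are assumptions.

def mark_band(mark: int) -> str:
    """Classify a mark out of 100 into a broad band (assumed thresholds)."""
    if mark < 50:
        return "low"
    if mark < 70:
        return "middle"
    return "high"

def sample_is_representative(sample_marks: list[int], cohort_marks: list[int]) -> bool:
    """True if every mark band present in the cohort also appears in the sample."""
    cohort_bands = {mark_band(m) for m in cohort_marks}
    sample_bands = {mark_band(m) for m in sample_marks}
    return cohort_bands <= sample_bands

# Example: the cohort has low, middle and high marks, but the sample
# contains no low marks, so the check fails.
print(sample_is_representative([72, 65, 80], [30, 45, 55, 62, 71, 88]))  # False
```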
The process
• The EV team select a sample from the submission and review each paper in detail to ensure that the standards have been met
• The EV team look for centres to identify where they have awarded marks and, where needed, what the marks are for
• The EV team check that at least a sample has been subject to IV
What happens if EVs don't agree with a centre’s marking?
• Disagree with marks?
• Continue to look at the chosen sample to see whether this is a one-off oversight or a recurring pattern
• Select a further sample from the submission
• If the EV team disagree with the marking on 4 out of the 12 scripts submitted, or on 25% of the total submission, we have to ‘not accept’ (see the sketch after this list)
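As an illustration only, the ‘not accept’ threshold above can be expressed as a simple decision rule. The sketch below is not part of any SQA process or system; the function name, its inputs and the exact interpretation of the rule are assumptions.

```python
# Illustrative sketch only: applies the 'not accept' thresholds described above.
# The names, inputs and exact rule interpretation are assumptions.

def verification_outcome(disputed_scripts: int, scripts_in_sample: int,
                         total_submission: int) -> str:
    """Return 'not accept' when disagreement reaches 4 of 12 sampled scripts
    or 25% of the whole submission; otherwise return 'accept'."""
    if scripts_in_sample >= 12 and disputed_scripts >= 4:
        return "not accept"
    if total_submission > 0 and disputed_scripts / total_submission >= 0.25:
        return "not accept"
    return "accept"

# Example: 4 disputed scripts out of a 12-script sample triggers 'not accept'.
print(verification_outcome(disputed_scripts=4, scripts_in_sample=12,
                           total_submission=30))
```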
Why might EVs disagree with centres’ marking?
• Inconsistency in marking
• Including inconsistency across sites
• Marking not in line with the solution
IV records – why?
• To allow EVs to see what issues each centre has identified, if any
• To see how the centre has approached the GU delivery and assessment
• To identify areas of good practice which EVs can share at events like this.
Discussions
• Any issues arising during the central verification session are discussed by the whole team
• Any submissions which may highlight problems are also discussed
• All problematic issues are reviewed by the SEV
Reports
• The reports are written by the EV who carried out the review, upon completion of that review
• The reports should give centres a clear indication of how EVs have found their submission
• Any report which identifies that a centre has not met the standards contains more detail and is reviewed by the SEV
Guidance tools
• SEV report
• Understanding Standards
• Your EV
• Online resources
• Network Events
SEV report
• This is compiled at the end of each academic session and summarises the findings of the EV team for the whole session
• Identifies areas of good practice
• Identifies any areas which may need to be clarified