Blurred Lines – roles in assessment and verification

June 23, 2013

In the public and voluntary sectors in Scotland (and by this I mean secondary schools, further education colleges and community-based contexts), the majority of accredited ESOL qualifications are provided by the Scottish Qualifications Authority (SQA). Of course, there are plenty of non-accredited courses, and you do get the odd course that prepares students for IELTS, or a Cambridge ESOL/Trinity/City & Guilds exam, but the most widely recognised awarding body in Scotland (for all subjects, not just ESOL) is SQA.

SQA’s ESOL qualifications are being revamped to reflect the new Curriculum for Excellence, which I have previously blogged about, but as far as I’m aware the key principles of assessment will remain the same, and it is the way that these assessments are conducted and marked that I would like to explore in this post.

Although SQA ESOL qualifications are awarded by a national body, most of the assessment is administered and marked internally by the centre that delivers the courses. At Intermediate 2 and Higher levels there is an external examination, but these courses also include internally-assessed units, and at lower levels the entire qualifications are delivered and assessed within the centre.

Obviously, relying on the centre that provides the course to assess candidate evidence throws up some major concerns with regard to reliability and standardization. Even when all centres are using the same assessment materials (and this is not always the case) there is no guarantee that they are all following the same standards or assessing their candidates under identical conditions. Furthermore, as course providers are coming under increasing pressure to use learner attainment as a key performance indicator, which can impact on the funding they receive, there is potential (I’m being diplomatic here) for a conflict of interest to exist. If future funding relies on good results, and if you have the power to declare what your own results are, it is conceivable that some centres may be tempted to adopt a rather over-positive approach when assessing their own candidates.

SQA makes a big thing of relying on the “professional integrity” of its centres; the assumption is that no centre that values its own professionalism would deliberately compromise its credibility by tampering with assessment results.

Let’s assume, though, that all centres have enough professional integrity not to do anything underhanded. There is still the possibility that centres will make mistakes, or do things incorrectly or inconsistently by accident. This is particularly true when you consider the level of subjectivity involved in assessing SQA ESOL units. Guidelines for assessors to follow when assessing various outcomes include phrases such as:

  • “Range of structures and vocabulary is appropriate to purpose and audience.” (How wide a range is that? How narrow must the range be to be inappropriate?)
  • “Use your discretion and accept synonyms or near-synonyms” (How near is a near-synonym? Just exactly how much discretion does the assessor have here?)
  • “Award the point if the answer is correct, even though it may be wrong in terms of spelling or grammar” (What if the answer is “money” and they write “many” – is that a spelling mistake? What if the grammar is so bad that the answer is not entirely clear – is it still correct if it is barely recognisable?)
  • “Sufficiently accurate to convey meaning on first reading” (So does this mean it can be full of grammar and spelling mistakes as long as it can be understood?)
  • “Sufficiently accurate to convey meaning to a sympathetic listener” (How sympathetic is sympathetic? As sympathetic as the person who has been teaching this candidate for the past 3 months and is totally tuned into their way of using English (however bad it might be) or as sympathetic as a doctor, say, who is meeting the candidate for the first time?)

When it comes to conditions for assessment, there are also a few comments that are open to interpretation. For example, candidates can sometimes have access to “notes” – what constitutes notes? This could be the notes they write down in a 10-minute preparation period, but it could equally be a pile of ring binders containing all the English they have ever learnt in their lives.

Questions raised by these examples may suggest that I have put quite a lot of thought into how to deliver SQA ESOL assessments. I’ve had to – not just to ensure my department is adhering to the appropriate standards and procedures, but also just to ensure that staff within my own department are following the same standards and procedures. The thing is though, even when you follow SQA’s procedures to the letter, there remain some aspects of the assessment process that are open to interpretation. Inevitably then, some centres will/can/may follow slightly different procedures or employ slightly different standards when delivering SQA ESOL units.

SQA addresses this problem by adding two more layers to the whole process – Internal Verification and External Verification. Internal Verification (IV) is conducted within the centre, and requires a subject specialist who is NOT involved in the assessment process to “verify” the units as they are delivered. The IV process involves taking samples of candidate evidence for each of the outcomes (Speaking, Writing, Listening and Reading) and ensuring that the assessment procedures are being conducted correctly.

External Verification involves an External Verifier (EV) visiting the centre as a representative of SQA and, like the IV, looking at samples of candidate evidence to ensure assessment procedures and standards are as they should be. If everything is OK, the centre maintains its status as an approved provider of the qualifications. If not, the EV can put a “hold” on the qualification, meaning the centre cannot deliver it until it has made changes or put conditions in place to ensure that any problems are resolved.

So, to summarise, there are four different “roles” played during the delivery of an SQA ESOL qualification.

  1. The Teacher (or tutor or lecturer, depending on where they work), who actually teaches the class and gives candidates the knowledge and skills they need to pass the assessments.
  2. The Assessor, who actually administers and marks the assessments.
  3. The Internal Verifier, whose job is to make sure that the assessments are delivered in the right conditions and marked according to the criteria provided by SQA. Every unit must be internally verified, meaning IVs need to look at samples of every assessment that the assessor delivers.
  4. Finally, the External Verifier, who comes into the centre to look at samples of candidate evidence and ensure the centre is complying with SQA criteria. External verification only happens occasionally; in 8 years at my current college we have had two EV visits for ESOL units, though I’m expecting another one soon.

It may be very light touch, but in principle SQA’s assessment and verification procedures are really quite effective. The different layers contribute positively to ensuring professional integrity is maintained. Although the assessments are delivered and marked internally, the knowledge that candidate evidence will be verified by a colleague or manager, along with the constant possibility of being “EV’d”, keeps assessors on their toes. As far as I’m aware, all SQA centres do their very best to ensure they are delivering units as well as they can.

However, there does appear to be a bit of a problem with regard to the above layers, and the importance of keeping them separate. The lines that are drawn between teacher and assessor, assessor and verifier, have the potential to become a bit blurred:

Am I teaching or am I assessing?

In most contexts, the teacher and the assessor are the same person. As assessments are usually delivered during lesson time, there is potential for students to expect the assessor to continue to provide support and feedback during the assessment, as they would if they were teaching. This is exacerbated by SQA’s somewhat ambiguous guidelines on how much feedback or support assessors can or should give, which could lead some assessors to give more support than others.

How involved in the assessment process is the internal verifier?

According to SQA guidelines:

“The internal verifier’s role is critical in ensuring that assessments are appropriately conducted and that any possibility of malpractice is minimised.”

So the IV’s role is to look at what the assessor has done and check that they haven’t done it badly. If they have, the IV recommends specific action for the assessor to take to address the problem. Necessarily, internal verification can only take place after an assessment has been delivered.

However, because of the ambiguities of the performance criteria, and perhaps also because the internal verifier is often a senior member of staff, there is a temptation for assessors to want to check their marks with the IV at the assessment stage. SQA seems to allow this to happen, stating:

“The internal verifier can have a developmental role for less experienced assessors by offering advice and guidance.”

New assessors need to get support from somewhere, and the expectation is that the IV will offer support throughout the delivery of the unit – in designing or selecting an assessment, identifying what knowledge and skills are being tested in a certain question, or deciding what possible answers could be accepted. But IV support is not about helping an assessor decide whether a candidate passes or fails. As soon as IVs get involved in the actual assessment process, they immediately compromise their position as verifiers. How can you verify that the assessments were conducted properly if you were actually involved in the assessment process? SQA also seems to acknowledge this, as it says in the same document:

“It must be stressed that no individual may act as assessor and internal verifier for the same group of candidates.”

Of course, sometimes it is useful to get a second opinion, and we get round the problem by encouraging assessors to discuss results with other colleagues who are delivering the same unit or who have delivered it in the past. However, they must not consult the IV for advice on whether or not a candidate meets the criteria. I’m not sure how widespread this practice is though, and I think there are some centres where the IVs get very hands-on in the assessment process.

Who does the External Verifier represent?

EVs visit centres as representatives of SQA, but they are also experienced practitioners who manage, deliver or have delivered SQA ESOL units as part of their own daily practice. A problem in our subject area is that a large number of external verifiers come from a very small number of centres. This means there is a possibility that external verifiers, who naturally view the practices in their own centres as good practice (after all, they usually implemented them), will visit other centres where things are done differently and assume that any deviation from their own model should be viewed negatively. I’m not saying that EVs don’t do a good job, or that they are biased in any way. I’m simply saying that it’s difficult for them to represent SQA and promote good practice without relating a centre’s practice to what they do themselves. (The same could be said of British Council inspectors, CELTA assessors, or others who perform similar roles, but that doesn’t diminish the potential for this problem to exist.)

Those of you who are used to ELT exams like IELTS or the Cambridge main suite may find it odd that centralised and nationally-accredited qualifications can be internally assessed. However, the verification process goes a long way towards ensuring the whole thing remains reasonably robust. The problem is that the roles can get blurred – between teacher and assessor, between assessor and internal verifier, and between the external verifier of one centre and the internal verifier of another. As the whole system is in the process of change anyway, SQA might want to consider removing some of the ambiguities that exist with regard to conditions of assessment and verification procedures.

2 Comments
  1. Gordon Wells

    You’ve steered a carefully worded course through the multiple pressure points in a system which demands high levels of frequently multi-hatted professionalism of those tasked with implementing it. Its internal complexity is one thing. Add in a fifth role/tier – the learners themselves – and the pressure on the teacher/assessor is compounded further, is it not? Particularly if success or failure can mean the difference between naturalisation/right of continued residence and perhaps even expulsion… In a world where the regulations seem to change from year to year I’m not completely au fait with where SQA ESOL qualifications currently stand in relationship to official “citizenship”. But, to the extent that there’s any linkage at all, is it not asking a lot of the English teaching/testing practitioner, often a very important intermediary in terms of helping their learners to interact constructively with their surrounding community, to also require them to play a de facto part in a gate-keeping role too?

  2. “a system which demands high levels of frequently multi-hatted professionalism of those tasked with implementing it.” – Yes, that sums it up very well, Gordon.
    You’re right that the attainment of SQA ESOL qualifications is very important for students applying for UK citizenship. This and other factors can really raise the stakes, and I agree it is asking a lot for a teacher to make the shift from nice-guy helper to objectively-minded assessor. The contrast between these roles is potentially very stark, and unless the roles are made very clear to the students it can be hard for them to understand how their teacher, who always seemed so nice, could turn round and fail them.
    I suppose that’s why I feel it’s so important to keep the roles clearly defined in everyone’s eyes, including the students’.
