Creating a vocationally relevant English assessment tool for learners with grade 4+

Lakes College

This project focused on learners who have already attained Level 2 in English (Functional Skills or GCSE 4/C+) and would benefit from further development of their English skills in a contextualised manner.

You can download a PDF of this report on the Excellence Gateway [LINK TBC].

Summary

Our college is a General Further Education College in West Cumbria, offering a broad range of subjects from Level 1 to Foundation Degrees.
The project aimed to address the challenge of establishing an accurate starting point and identifying skills gaps for learners. It also aimed to go beyond the general diagnostic information provided by popular systems currently available, which are often generic rather than subject-specific.

The existing system was also resulting in a lack of engagement with English after GCSE or Functional Skills exams.

Rationale

The project was designed to have a much broader target audience. However, for the purposes of our research, we decided to target a cohort of learners on Level 3 Information Technology. This meant that we could create resources specifically for this career pathway, while being able to apply the underpinning English skills to other vocations in the future.

Although we trialled the resources with the whole class, we selected three learners as case studies, after consultation with the department. The cohort was a mixture of learners with, or working towards, a Level 2 English qualification, the majority of whom had achieved their English GCSE.

Approach

At the start of the project, we identified which department and students to work with. The IT department was chosen, specifically Level 3 students, as this provided a mixed cohort to trial the assessment resources. From this cohort, three learners were identified to be our case studies.
We then designed a self-assessment tool (Figure 8a-1) to establish a clear starting point and delivered our first 30-minute briefing session, during which we explained the project, met the group and established our next steps.

Over the next few months, we worked together to create resources targeting three areas identified by the initial assessment: planning, writing objectively rather than subjectively, and proofreading. We then tested these over three sessions, gathering feedback as we went.

During the planning session, we did an introductory activity exploring the steps taken to make a cup of tea. This highlighted to the learners that plans can be beneficial to help avoid missing vital pieces of information. We then did a rotation activity of planning spider diagrams, which the learners used to plan exam responses in groups (Figure 8a-2).

In the second session, we handed out an exam-style response, worded subjectively, and had the learners change it to a more objective form, before discussing why it is important to be objective in IT exams.

In the third session, we used a different exam-style response containing various spelling, punctuation and grammar mistakes. We asked the learners to find them, and then discussed proofreading strategies, focusing on common errors such as using the incorrect homophone.

In the final session, we repeated the self-assessment tool from the first session to compare learners’ progress from their starting point. This, along with the resources trialled, forms our evidence base.

Professional Learning: Evidence of changes in teaching, learning and assessment practices

As an outcome of the project, we have created an initial assessment booklet, which the IT department can use going forward to assess the starting points of their future learners. The booklet also includes tasks that learners can complete throughout the year to develop their English skills.

Our research has helped us to improve our personal teaching practice as English teachers, especially by developing our confidence in linking specific English skills to vocational courses (IT or otherwise). In turn, we will be able to make our lessons more vocationally relevant, and engage our learners by highlighting when they will use the skills they are learning in their course or in their future workplace. It has also enabled us to develop a deeper understanding of initial assessments, reflect on how best to analyse the starting points of our learners, and consider new strategies for doing so.

A strength of this project was involving the learners in every step of the research, and creating an environment in which they feel comfortable sharing what is and is not working. This has taught us that in future initial and formative assessment we could allow learners to discuss how best to structure the activities for their own benefit, making assessment a more collaborative exercise. This helps learners feel valued, and may motivate them to complete an assessment they have helped to create.

Evidence of improved collaboration and changes in organisation practices

Through undertaking this project, the researchers from both the English department and the IT department have had a chance to work together to develop a resource that is useful for both parties.

There has also been an opportunity, through both the project sessions and drop-ins, for the English department staff to observe IT lessons. This has enabled them to make the initial assessment vocationally relevant, to build relationships with the participants, and to strengthen our relationships with those learners who are in our GCSE resit classes.
Further, through the Zoom meetings, our project leaders encouraged us to focus our research and maintain its intended direction. A by-product was also learning to use Zoom as collaborative software, which we could use in the future.

We were also able to access external training sessions, which provided useful guidance and next-step strategies, as well as the chance to network with other researchers and research leaders.

Further, we were given the opportunity to have a project lead visit. Sue provided useful advice which we implemented in the project, including getting feedback from each session, rather than having one large feedback session at the end of the project. This meant that the sessions were fresh in participants’ minds, and they were able to provide more focused and useful feedback.

Evidence of improvement in learners’ achievements, retention and progression

One clear impact on the participants is that learners with a grade 4 (or above) in English now see that their learning journey in English does not stop after passing their GCSE. One learner said they “didn’t realise how much English there was in IT” until they took part in the project, while another told us they were using strategies from the planning session in their coursework.

The IT department reported an increase in mock results compared to last year. While this research project is one of several strategies they have implemented this year, it has had a positive impact on how the learners are approaching the higher mark questions, with some using the planning strategies, or checking they are reading the questions properly.

Most of the learners who are still in GCSE classes also improved their mock exam results, with 100% achieving a grade 4 on at least one paper, and 40% on both. They have also given positive feedback about the impact of the project, with one learner saying they felt “it was helping [them] see the link, and useful for [them] to get as much English practice as possible.” Learner 3, in our case study, also said that they intend to progress to university, so they need to pass their English GCSE and develop skills such as essay writing and proofreading in order to cope with the academic demands of a degree.

After the research project is complete, the researchers are going to compare the results of the participants’ summative June exams, and reflect on the possible impact of the project.

Learning from this project

From this project, our knowledge claim is that a vocationally relevant initial assessment will provide a sound starting point for assessing a learner’s English skills.
We have also learned the benefit of asking critical questions to guide our thinking. This has led to specific findings including the following:

What went well?

Learner engagement: our biggest success from the project was how engaged the learners were in it. All identified participants took part in the sessions without complaint, and gave useful and detailed feedback on what they liked, and what they would change, about the project.

Collaboration across departments: through working with the IT department, we now have a much stronger link in our college between English and IT. This has enabled us to monitor and motivate current GCSE English students on Level 2 and Level 3 IT much more effectively, as those students know our departments work closely together to ensure they are attending and working to the best of their ability.

Development of the initial assessment document: although it has been challenging to strike a balance between being IT-relevant and still testing English skills, we have developed an initial assessment tool which we can use in future IT and/or English lessons (with IT learners) to establish starting points, without discouraging learners through heavily English-focused content.

In one researcher’s personal teaching practice, they are going to try to make formative assessment a more collaborative exercise with the learners. This project has shown them that learners want to be in control of their assessment, and respond well to assessment tasks when they have had some input into their creation. This could be put into practice by allowing learners to choose when the assessment is done, whether it is completed in one go or split into segments, or what topic it covers.

Even better if…

Starting earlier: on reflection, the project would have run more smoothly had we completed the initial assessment document before the project started, so that it could have been trialled with the participants at both the beginning and the end.

More sessions: we would have gathered richer data with more than four sessions, especially if we had been able to trial all the activities in the initial assessment tool. That way, we would have feedback on all of them and could use or adapt them accordingly. Two of the case study learners also said they would have liked more sessions, covering more topics from the initial self-assessment rather than just the three most frequently occurring ‘weak points’ identified in the first session.

In summary, going forward, we would ideally like to publish the initial assessment online, so it would be available both on paper and digitally, making it accessible to a larger group of people. A digital format could also allow for different activities, such as embedded videos and submittable quizzes.

We also feel that we have succeeded in our aim of helping learners to improve their English skills in a contextualised manner.