14a. BCoT

Improving feedback for assessments

Basingstoke College of Technology (BCoT)

This project aimed to explore whether feedback on GCSE and ESOL written tasks could be improved using a software extension called Mote. For the GCSE research group we predominantly chose a cohort of 16-19-year-old GCSE resit learners. The ESOL group was a cohort of adults completing an ESOL Entry Level 2 Skills for Life qualification.

You can download a PDF of this report on the Excellence Gateway (link pending).


Our intention was to identify a digital approach to feedback, development and target setting that would work for GCSE and ESOL learners. Learners do not often read the feedback provided in their books or after assessments, and written feedback is very time-consuming (we each have over 100 learners). We intended to create a digital learning feedback journal using Mote, a tool that allows teachers to add voice comments to Google documents. Learners would listen to the teacher's feedback and then reflect on and record their next steps.

Other Contextual Information

Our action research was part of the Education and Training Foundation’s OTLA 8 Programme. It took place in the English department of our FE college, where we worked with two groups of full-time 16-19 GCSE learners and one part-time class of adult ESOL Entry Level 2 learners. GCSE learners used the feedback given to set targets and understand any gaps in their learning. ESOL learners used the same feedback tool but were also able to read the transcript and then translate it into their chosen language. BCoT embraced technology during the pandemic and we had used Mote previously on tasks submitted digitally. Our intention was to attempt digital feedback on handwritten assessments.


We knew that we wanted to improve and streamline our marking and feedback processes, but also that some learners would be more receptive than others. All existing BCoT learners were used to online delivery (some had used Mote before as a method of feedback). The majority of learners in their first year since leaving school had not heard of, or used, Mote before.

Due to the success of using online tools in the pandemic and trying to steer away from a school approach, we decided upon this new approach for written task feedback.

GCSE learners:

Two different groups of GCSE learners were chosen. Both groups consisted of learners who had achieved grade 3, with one class working at a higher level than the other. In total 20 GCSE learners were chosen to receive online feedback, although not all engaged with the feedback given. Following an initial and diagnostic assessment, all learners had to complete three additional progress tests and a set of mocks throughout the academic year. We chose to:

  • Provide a Mote audio recording of up to three minutes for all learners – the feedback followed the form of What Went Well (WWW) and Even Better If (EBI), and included how to answer certain questions, use different vocabulary and improve their responses.
  • Make the feedback available as a transcript.
  • Have learners listen to the feedback and set targets.
  • Follow the same process for all three progress tests.

Feedback from GCSE Learner A, who gained a Grade 4 in the November exams:

I listened to my progress and targets from my verbal feedback. I was able to then share this feedback with both Emily and Jane during my extra English sessions… I think feedback from teachers will help me with getting the skills needed to find a part time job and improve my job at the radio station.

ESOL learners:
An Entry Level 2 class of 15 part-time ESOL learners was chosen. They completed a writing initial assessment in class. The teacher marked the spelling, punctuation and grammar (SPaG) errors on their writing but did not write the customary feedback on their work. Instead, the teacher recorded the feedback for each learner and produced an individualised QR code which was inserted into a presentation (see Appendix 3a). The presentation was shown in the next class and the learners were able to come to the board and scan their labelled code with their phones so they could listen to the feedback. The feedback consisted of what went well and how they could improve on their next piece of writing, with a focus on constructive feedback. The learners then completed another writing activity and the teacher analysed this to ascertain whether they had taken into account the feedback given previously.
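The workflow above – one recording per learner, each QR code labelled on a slide – can be sketched as a small script. This is an illustrative sketch only, not the project's actual tooling: the learner names and recording URLs are invented placeholders, and rendering the images would need a QR library such as the third-party `qrcode` package (shown in a comment).

```python
# Illustrative sketch of the per-learner QR feedback workflow.
# Learner names and recording URLs below are invented placeholders.
recordings = {
    "Learner A": "https://example.com/feedback/learner-a.mp3",
    "Learner B": "https://example.com/feedback/learner-b.mp3",
}

def slide_entries(recordings):
    """Return (label, url, image_filename) triples, one per learner,
    ready to be placed on a labelled feedback slide."""
    entries = []
    for name in sorted(recordings):
        filename = name.lower().replace(" ", "-") + "-qr.png"
        entries.append((name, recordings[name], filename))
    return entries

for label, url, filename in slide_entries(recordings):
    # With the third-party `qrcode` package installed, each image could
    # be rendered with: qrcode.make(url).save(filename)
    print(f"{label}: {filename} -> {url}")
```

Each labelled image would then be dropped onto the class presentation, so learners can scan the code against their own name.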

Functional skills English:
We also used Mote for a small number of learners resitting their Functional Skills English writing exams to pinpoint areas for improvement to assist them in the resit. The Mote feedback was sent as an MP3 recording to their learner email.

Feedback from Functional Skills learner C:

As a learner at BCOT, I was very impressed when I used Mote, it was incredibly easy to use and the instructions were easy to follow. One of the things I like about Mote is that I can quickly clearly hear feedback. In my opinion, voice comments are more clearly understood because you can hear the teacher’s tone of voice and the nuances of what they are saying. I would 100% recommend this product to teacher’s and other learners.

Outcomes and Impact

Teaching, Learning and Assessment

The methodology of the research changed during the project. We had hoped for an ongoing journal for learners, but we were unable to find a platform that provided this. Instead, we used Mote for the three progress tests for English and in preparation for the ESOL exams. As Mote developed, we used the tool for additional things such as voice-based questionnaires, QR codes and voice instructions. We found that most learners engaged with the tools offered, but some simply did not have the ability, or the enthusiasm, to work on feedback. We have a number of learners who have sat the exam more than twice; their confidence has been eroded by being expected to resit the GCSE year after year. A minority found the Mote process too difficult to follow, or could not work out which tool to use to listen to the feedback offered.

Throughout the process we gained feedback from learners to assess the impact on their learning. The activities chosen worked with the two types of learners identified, but we now need to identify how we can implement this across the entire cohort. We need to ensure that the teaching and support staff are given appropriate training and support to enable them to deliver and assess in the future.

We have had a number of successes with the Mote feedback. The learners have enjoyed scanning the QR codes and listening to the feedback. We have some case studies where learners have stated that the feedback directly impacted their learning and future skills. We have also embedded Mote recordings in Google Slides as verbal instructions and for whole-class feedback.

There have also been barriers. Not all learners have access to a QR reader on their mobile device. We also embed feedback using ‘hypermotes’, but then learners have to log on to a laptop and find the document, which can be lengthy and confusing for some. It can take ten minutes to listen to the recording, whereas written feedback, or 1:1 verbal feedback from the teacher, would have been instant. Additionally, many learners do not have headphones; they would rather listen when they return home, and may then forget to play the feedback. If we play it in the lesson, learners can hear 20 versions of the teacher giving individualised feedback.

Moving forward, we will continue to use Mote but will use it alongside other forms of feedback such as peer and self-marking. It is still a ‘work in progress’ as we have yet to find the right approach to using this for paper-based tasks. It works effortlessly when learners create a typed response using Google Docs as we highlight the text and then record the relevant feedback.

Lastly, we still need to work out how to store and track the progress made as a result of the recordings provided. We can see who opens the recordings, but we need to understand how and why this may improve their English skills, and what the next steps are in supporting learner progress. We attempted this during the year but have not yet created an accurate tracking system.

Organisational Development

We went into the project with an ambition to change and streamline our marking process for all GCSE and ESOL learners; however, the numbers were too great. By choosing smaller groups of learners across different abilities we were able to identify who benefited from the project. It was great to see the ESOL learners embrace the feedback given and we feel this is only the beginning for them. Working with the ESOL department and understanding how the learners developed their skills will continue after this project concludes. We were able to work closely with the Mote team to evaluate the correct tools for our learners and suggest improvements for future releases of the app. Elsewhere in the organisation, colleagues are using Mote effectively for digitally produced assignments, and we have shared our experiences of using the same tool on paper-based assessments.

Learning from this project

We have enjoyed the project and have realised that Mote is a very useful tool for feedback. As the project developed, we soon discovered that we could use the tool for many other purposes.

The main challenge we found was the quantity of individualised feedback we had to record and share with the learners. Every GCSE lesson is three hours long and during that time learners complete a task. At first, we found that we could not record feedback on a weekly basis for all 20 learners for each lesson: when the lesson had finished, we then had to record the feedback. It was more time-efficient to continue with our usual methods of in-class feedback, such as peer marking, whole-class feedback and face-to-face feedback as the teacher checked learners’ work during class.

Instead, we chose to use the Mote feedback method on the three progress tests for GCSE. This was much more effective and straightforward, as we had to provide clear feedback to enable learners to improve their practice. The same approach was used by the ESOL team as they delivered feedback following the assessments that took place during the year. We used this for the paper-based assessments: when learners received the marked paper they had the audio Mote feedback to listen to whilst looking through the WWW/EBI comments.

We attempted to use a Google form for group feedback following one of the progress tests. Each question from the test had a Mote recording explaining what worked well overall as a class and what needed to be worked on. Learners were then asked to set a target for each question where they achieved less than 50%. Due to the length of the feedback, the recordings lasted about 15 minutes and many learners struggled to retain the information.
We loved using the QR codes, which became easier to use and embed for feedback following a number of meetings with the developers. In the ESOL classes, QR codes were displayed on the class whiteboard under each individual learner’s name, and learners were able to scan and listen within the lesson.

We went on to use the QR codes for other tasks as well as providing feedback. Most recently, we developed Top Tips for English GCSE revision and these were added as QR codes and posted around the College and on the Google classroom.

Professional Development

This section uses the ETF’s Professional Standards for teachers and trainers. Please note, this report refers to the 2014-2022 standards.

  • 4. Be creative and innovative in selecting and adapting strategies to help learners to learn.

    Our project gave learners different opportunities to engage with the feedback provided and understand how they could make changes to improve their writing. Strategies were put in place as a form of target setting. As a result of giving feedback for three progress assessments, learners were able to identify their target areas and undertake differentiated revision activities to enable them to succeed.

  • 9. Apply theoretical understanding of effective practice in teaching, learning and assessment drawing on research and other evidence.

    We are always trying to improve our process for teaching, assessment and feedback.
    As Kay (2021) states:

    “Less is more…if teachers want learners to take notice of feedback, it needs to be short, specific and clear.”

    “Keep it focused…on the task and let learners know specifically what they can do to develop their work.”

    We wanted to ensure that we were providing this using the audio Mote feedback. As part of the feedback process we gave specific actions to enable the learners to improve.

  • 18. Apply appropriate and fair methods of assessment and provide constructive and timely feedback to support progression and achievement.

We have learned that learners react to feedback differently, depending on their individual learning preferences. For those learners who have difficulty reading, it was a huge advantage to listen to an audio recording. For the ESOL learners it was a fantastic tool: they were able to hear the audio to improve their English skills, transcribe it into their own language to improve their vocabulary, and then listen over and over to support their progression.


Appendix 2: Learner Case Studies

Appendix 3: Project Resources and Reflections


Kay, J. (2021). Improving English and Maths in Further Education: A Practical Guide. 1st ed. London: Open International Publishing Ltd.

13a. Blackburn College

Using digital readers to engage and build confidence in reading

Blackburn College

This project wanted to investigate how Microsoft Immersive Reader (IR) could be used to build reading confidence and help learners access more difficult texts. We began by exploring possibilities for classroom use and then moved on to explore its use with the help of Additional Learning Support (ALS) staff.

You can download a PDF of this report on the Excellence Gateway (link pending).


Using MS Immersive Reader to support students with reading

At Blackburn College many learners who begin study programmes have not yet achieved the required grade 9-4 in English Language and must resit their GCSE English. In 2021-22, 620 learners were retaking English, and 22% of those learners were identified as having additional learning support needs. Key to helping our learners obtain this qualification and move on to successful further study is building confidence in reading and improving comprehension skills.
Many of our learners are reluctant readers, easily put off by the length of texts and the sheer amount of new vocabulary that some GCSE as well as vocational reading requires. To aid with this, we spend a lot of time helping learners to break texts up, explore context to help understanding, and demonstrate how it isn’t necessary to understand every word. However, we felt that by exploring the use of IR with its built-in dictionary, translator and chunking function we might also make the task of reading more interesting.

Following the written text as it is read aloud can aid comprehension, as well as helping with the pronunciation of unknown words, the spelling of words which they recognise or use in speech and in doing so build fluency. Alongside these functions the tool also allows learners to customise their reading experience by speeding up or slowing it down, limiting the amount of text seen at one time, changing letter size and font, as well as background colour.
We felt these features not only stimulate engagement with the text but encourage learners to reflect on the strategies that work best for them and to take responsibility for these when reading.

Immersive Reader in use

Ultimately, our aim was to get learners reading, to encourage them to read more extensively to build up their confidence, and to support them to manage the more challenging 19th century texts in their GCSE as well as to prepare them for the different text types on their vocational programmes. Several empirical studies have shown that extensive reading, i.e. reading large quantities of varied text types purely for reading fluency rather than to complete a task, has positive effects on language acquisition and understanding (Mart, 2015) and is an effective way to enhance language proficiency (Maley, 2005). Although there has not been a great deal of research into the use of IR, one American study reported that teachers had found that the tool did facilitate access to a wider range of materials which in turn, ‘helped teachers find content aligned with their learners’ interests, at comprehension levels that were challenging and previously inaccessible.’ (McKnight, 2018, p.6).

Other Contextual Information

Blackburn College is a large General Further Education (FE) and Higher Education (HE) provider based in the Northwest of England. The two departments that work with the most learners across the college are Additional Learning Support (ALS) and English and Maths. We felt both departments were uniquely positioned to explore the use of the tool and would be ideally placed to share what was learned across the college.

For the purposes of this project, we worked with four English GCSE resit classes; two classes of 14 learners, with grade 3 teacher assessed grades (TAG) and two classes of 12 learners who had achieved a grade 2 TAG. Across these classes, 14 learners had been identified as having additional learning support needs. All classes were working on the Pearson Edexcel 2.0 lift curriculum with the target of moving up by a minimum of one grade and were from a variety of vocational backgrounds including Hairdressing, Motor Vehicle, Construction, Business, Art, Childcare and Health and Social Care. We then worked with 11 Additional Learning Support Assistants (ALSAs) who supported learners across the college.


The research was a mixed-method, learner- and teacher-focused plan that investigated how training in, awareness of, and use of IR in classrooms could impact on learners’ experience, both in the English classroom and, as the research progressed, across the wider college as learners transferred their use of IR to vocational lessons. The intention was to evaluate how easily IR could be introduced in classrooms, how user-friendly and portable it was, and whether it encouraged learners to read with more confidence.

  • Setting up the project
    • Initial assessment of what the tool could do and what learners would need to access it (Appendix 3).
    • Created a project description to explain to staff and learners what we were aiming to do.
    • Identified how the tool could be used in different ways, both in and out of lessons.
    • Set up a Padlet to collate materials at the end of the project.
  • Initial strategy
    • Principal researchers implemented the integration of IR activities into English classes.
    • Verbal feedback from staff and learners on how easy the tool was to use as a classroom learning tool, learners’ reactions, and any impact on reading confidence and comprehension.
    • Analysis of findings led to a new approach which focused on individual learners and on widening participation into other departments, supported by ALSAs.
  • Revised strategy
    • Training in the use of IR for English teachers and Additional Learning Support Assistants (ALSAs) to facilitate the roll-out of the trial (Appendix 3e).
    • MS Teams page set up to support the roll-out and provide technical support (Appendix 3f).
    • English teachers and ALSAs asked to identify which learners might be interested in or benefit from using this technology.
    • Referrals identified, and researchers attended learners’ English classes to help them adapt and include IR technology through use of their mobile phones during regular classroom time.
  • Evaluating impact
    • We collected feedback from group tasks on flipchart paper (reading task and evaluation of the IR tool).
    • We spoke with the individual learners we worked with and collected verbal feedback.
    • Two case studies were identified (Appendix 2).
    • We collected feedback from ALSAs through Microsoft Forms and a Padlet (Appendix 3g and …).
  • Sharing and next steps
    • Continue to work with the ALSAs to reflect on IR’s usefulness in different learning situations and how the tool responds to their learners’ specific needs.
    • Share findings with quality leads and amplify English reading skills through cross-college …
    • Expand and reinforce the use of the tool by training up personal tutors and appointing IR champions to support the sustainability of the approach.
    • Review the impact of IR on individual learners’ confidence and reading comprehension.

Outcomes and Impact

Teaching, Learning and Assessment

We began to explore the IR tool as part of whole class activities encouraging learners to experiment with the tool and tell us whether they thought that it helped them to understand the texts more easily. They were shown a short video explaining how to use the tool and we highlighted functions which we thought might interest them and be of use in practising for their GCSE English exam, e.g. identifying word class which is now a requirement on the language question on both GCSE English papers.

Feedback from learners on first being introduced to the tool was mixed. In some sessions learners said that they found listening to the software through headphones difficult and it limited their participation in the wider classroom. Similarly, some found the voice “really irritating”, and asked if it could be changed, while another noted that the reading aloud of text line numbers and punctuation was also annoying and interrupted the flow of the text.

“It is quite good but the line numbers are really irritating, can you take them out before the next lesson?”

We were pleased to find that learners were interested and quite happy to tell us whether they found the tool useful. In one class learners were asked to use IR to read a 19th century non-fiction text, a text type which had proved extremely challenging in a previous class. Learners were introduced to IR and shown how to access it through Microsoft Teams but they also had paper copies in their GCSE booklets. They were asked to work in groups to identify the main themes and ideas from the text and record their answers on flipchart paper (Appendix 3d). The tutor noted that the learners approached the reading with more enthusiasm and were far more animated in the group task than they had been in the previous session. They completed the task more swiftly and were keen to discuss the advantages and disadvantages of using the tool:

“It was really useful to know about this. It would have helped during lockdown when we working from home”.

Reflecting on how the session went and on the feedback from learners, tutors said that they were surprised to find out that learners who struggled more with reading found the tool distracting and “too simple” while stronger readers recognised that reading and listening aided their understanding as it was “helpfull (sic) to understand the situation”. Tutors thought that having a paper copy might have been more of a distraction which resulted in some less confident readers not really following the electronic version or engaging with the different functions. We now feel a more scaffolded approach which allows these learners to explore the functions in stages might make the process less confusing. We did find out however, that 3 learners from the class have gone on to use it in their vocational classes.

Other tutors have reflected on the difficulty of preparing resources for using the IR, e.g. having to extract line numbers, uploading texts to Teams, preparing learners to use the resource.

To address some of these barriers, we adapted our research strategy to implement the use of IR for use with individuals in lessons. Training in the use of the tool for both English teachers and ALSAs was then provided.

The feedback we have received from ALSAs who have been using IR with their learners has been very useful. The vast majority have found the tool easy to use, having had the training, and said that learners have been engaged. The different functions of the tool have been used in far more targeted ways by the ALSAs. Here are some of the comments fed back so far:

Working with one second language learner:

“I showed him how to translate task instructions using it to aid understanding”.

With another learner who needs to be more independent in his learning the ALSA said:

“I used it to help increase font size and also to block out text helping to chunk the reading”.

Another ALSA working with a learner with Autism reported:

“Helping a learner with their IT work, they were using Word and struggled with recalling information. So I typed the information within Immersive Reader and they used it that way. We would talk about what it was that they wanted to write about and then they could put it into their own words on the computer”.

We will be continuing to monitor how useful the IR is with our case study learners and are planning to continue this research until the end of the next term, when we are likely to have more specific data.

Professional Development

The project has provided us with a wonderful opportunity to build positive and collaborative relationships with colleagues who support our learners both in English classes and across the whole provision. We approached the manager of the ALS team; they were keen to accept training and explore the use of Immersive Reader with us, and have since suggested collaboration on other pilot projects. Reciprocal relationships are being developed to work more closely for the benefit of learners.

The training was well received. The second group of ALSAs told us that they were really looking forward to their session because, following the first session, there had been a real ‘buzz around the office’, with colleagues saying that the training was ‘really good’ and ‘CPD worth doing’. One of the ALSAs said:

“The immersive reader training was very insightful. It proves to be a useful tool for everyday use because it is simple to use. The additional tools such as translating, pace of reader and adjustable font size makes it even more helpful.”

All in all, the ALSAs were keen to explore the use of the tool as there were so many functions that could be of use to learners with specific learning difficulties and second language learners:

“I used immersive reader to translate a document for a student as English wasn’t their first language. A very useful tool.”

And another staff member said:
“It works well with Visually Impaired students as it allows them to highlight only relevant text.”

This project also provided the opportunity and impetus to explore research into the latest digital reading technology and build on the practices that had been forced through due to online learning in the COVID-19 lockdowns.

Although many teachers could appreciate the possible uses of the tool within their classes, especially to inspire and motivate learners, who tend to switch off when tackling archaic text types, we took their feedback regarding time constraints on board. In the summer we will be preparing off the shelf whole class sessions to help engage learners with 19th century texts as well as more scaffolded introductory sessions.

Organisational Development

The project has allowed us to work in collaboration with colleagues who support learners across all of our provision as well as vocational staff. The further involvement of the ALSAs has the potential to carry the use of digital reading technologies across all areas of the college. Its integration into classes could not only be a very useful aid for those with learning difficulties but also help reluctant readers access high level and varied content, ‘creating equity through access to learning materials.’ (McKnight, 2018, p.17). We believe the tool would be useful in theory lessons across the curriculum to support learners in Hairdressing, Plumbing, Motor Vehicle, Catering, to name but a few, to facilitate their understanding and interpretation of subject specific terminology to match their practical skills.

Learning from this project

The project has afforded the opportunity for English and Learning Support staff to work together more closely and provided us both with more time to reflect on how we can best support our learners and ensure that they get the most out of their classes. We will be collating further feedback on the impact of the tool from ALSAs later in the year and look forward to working together on other projects, inspired by this work, which are now in the pipeline.

The project has taught us that technology in classrooms can only be used productively once fully researched and with full support and training of those both using and facilitating access to the tools. At the beginning of the project, the use of IR proved problematic as there are several conditions that needed to be met for the software to be used effectively. Additional research and training were undertaken to prepare smart boards and computers to avoid problems when rolled out for use with other staff and learners.

Reactions from learners have also highlighted the significance of training at the right time of year. For example, Learner MP struggled to engage with a new tool midway through his programme and Learner FS seemed reluctant to engage in something that not everyone was using. Scaffolded sessions in which all learners are encouraged to explore the usefulness of the tool and share their experiences with each other should not only encourage confident use of the tool but reduce any sense of embarrassment in class.

We have taken feedback on board from teachers regarding the time implications of using the tool and we believe that by developing ready-to-go materials for English teachers to use in the summer we can encourage them to explore the use of the tool more thoroughly in whole class contexts. We also believe that a more scaffolded approach in which teachers and ALSAs gradually introduce the functions of the tool would encourage less confident readers to reassess its usefulness.

We have also learned that whilst it can enhance both access to learning and the learner experience, even free technology has cost implications: not only the necessary hardware and other licensed, chargeable software packages, but also the fact that it needs to be run online to be most effective. Although this is covered in college, asking learners to use it outside of lessons will have an impact on more economically disadvantaged learners who do not have unrestricted access to the internet or who have limited data allowances.

We have learned that no matter how exciting and shiny some digital tools may appear, or how high your expectations are, both learner and facilitator have to find them engaging and worthwhile, and the only way to really achieve this is to keep asking what is working and responding to the feedback. Rather than be daunted by initial criticisms, we took comments on board, adjusted the training, and adapted our approach to make sure the tool’s full use will be evaluated.


Appendix 2: Learner Case Studies

Appendix 3: Project Resources



10b. City of Bristol College

Can language learning apps enhance the classroom experience for ESOL learners?

City of Bristol College

This project aimed to explore a digital language learning package to support ESOL learners in the city of Bristol. The digital tool decided on was FlashAcademy. The project team sought to gain honest, accurate feedback from their learners on their experiences of using the digital learning package, in addition to feedback from teachers on its impact. The project explored how to use the tool both in and outside the classroom, in a blended learning format and through asynchronous activities. The project culminated in an event bringing all the project participants together: the managers, the teachers and the learners.

You can download a PDF of this report on the Excellence Gateway (link pending).


Our project took place with five groups of learners of different ages, genders and levels, within the ESOL department of City of Bristol College, covering both 16-18-year-old and adult ESOL learners. The groups ranged from Entry Level 1 to Level 1. The majority of the research took place with groups of more than 10 learners, and one lecturer worked individually with learners. There were three lecturers in total and two project leads.

The project team wanted to find out how effective (if at all) language learning apps are at supporting learning both in and out of the classroom. The pandemic, and the subsequent forced move to online delivery, brought the issue of digital language learning to the forefront of teacher discussions. Teachers of learners at all levels were taken by surprise at how well many learners coped with using their mobile phones to access their language learning. Towards the end of the last academic year, some teachers trialled a standalone language app with a small group to supplement their online lessons and wanted to extend this further with a different software package.

Other Contextual Information

City of Bristol College is the principal provider of ESOL courses in the city. The ESOL provision is large (approx. 1500 learners per year), extremely broad and aims to support all learners to gain language skills, qualifications and confidence to progress in their education, work and independent lives in the city.


Figure 1: Some of the topics on FlashAcademy


We chose the FlashAcademy platform for this project as it had a number of features that were attractive to the teachers, and we felt learners would enjoy using it. One learner log-in gave access across multiple devices, which meant learners could use college laptops or their own devices. It was accessible in more than 30 home languages and had content that fitted the required levels, including vocabulary, pronunciation and grammar. Behind the scenes, teachers could set specific lessons for their groups, or learners could work through the content independently. Teachers could track progress via the app's reporting settings, and learners could play games, allowing them to score points on a leader board.


Figure 2: Lessons set by teachers

After spending time becoming familiar with the app and showing it to learners, the teachers decided to use it in different ways. They used it to set tasks as homework or asynchronous lessons to supplement learning in the classroom. Two teachers also used it as an extension activity for learners who finished tasks early in the lesson, or as an independent learning activity while they held tutorials with individual learners.

Towards the end of the research period, each teacher used a tutorial session to capture learners' thoughts using a semi-structured interview format. This enabled the teachers to capture the views of the whole class, as not all learners were able to attend the wrap-up event.


Figure 3: Leader board

At the end of the research, the group decided to bring all of the learners involved in the research together for a final capture of evidence (see Appendix 3) and as a social activity to thank them for their participation. The teachers posed closed questions to the learners and got them to move around the room to the number that best reflected their answer. Following that, the learners were put into smaller focus groups and asked open ended questions. Prizes were awarded to the learner in each class that had scored the highest number of points and they were treated to a buffet lunch.

Outcomes and Impact

Teaching, Learning and Assessment

From analysing the evidence, we found that learners mostly enjoyed using the app to supplement their learning and, in most cases, the content of the app supported what was being taught in the classroom. This enabled learners to continue their learning at home. We asked learners questions about the level of challenge, and most found the content very easy. For the most part, learners found the app very easy to use and were able to navigate through its different functions, with no difference in response between the adults and the 16-18s. When we asked how much they felt they had learned from the app, the responses were very mixed and evenly spread across the markers: they felt it supplemented what they were doing in the classroom, but they did not learn much in the way of new content.

Within the appendices below, responses are shown for all questions, with some descriptive comments to give a feel for the numbers and statements. One thing that surprised us was that the majority of learners decided to use the app in English rather than their home language. One of the key selling points for the app was that learners can access it in more than thirty home languages, but some pointed out that there were mistakes in the translations and that, since they were there to learn English, they wanted it all in English!

The learners particularly liked the gamification of the app, especially the 16-18 age group who are predominantly male. They explained that they liked the competition and moving up the leader board. This was less of a highlight for the adult groups.

Organisational Development

This academic year, the 16-18 and Adult ESOL teams were merged. This project provided a great opportunity for staff who had previously never met to work together, as they worked on different campuses, within different departments and with different age groups. Apart from the final event, we conducted the whole project remotely. The team worked collaboratively using a Microsoft Teams page, Teams meetings and shared documents to work effectively without ever having met.

Following on from this project, the team are currently exploring other apps and platforms to support language learning in the next academic year. We think that by involving staff in the decision-making process and the trial, there has been a greater buy-in and commitment to the platform. The developers were very keen to support us in this project and offered several training and troubleshooting sessions for the staff to help them get up and running with it.

One of the teachers stated:

Normally, I don’t use apps in my teaching/classroom as I have regarded them as a distraction from traditional teaching and potentially creating more work for me. However, since starting this research I have been pleasantly surprised that in FlashAcademy I can facilitate learning through technology by setting tasks/lessons based on classroom topics for learners. For some learners their natural curiosity has led them to do different levels and lessons independently. My adult learners have many commitments and use this app to fit around their busy lives.

This teacher's full account can be found in Appendix 2.

Learning from this project

Reflecting on the use of online platforms and apps and what led us to make choices for ourselves and the learners has been a useful exercise. Some of the learners appeared to enjoy the attention of being part of a research project and having their opinions being valued too. This is something that we are keen to take forward as a college; having regular learner engagement events to discuss different topics will add a lot of value.

Within our organisation, as in most, funding is always a struggle. As much as we would like to invest in digital platforms, teachers often source their own or search out free equivalents. The teachers found many of the features of this app useful, e.g. being able to track learner progress via a dashboard, being able to use one log-in on multiple devices and having content that broadly followed the ESOL curriculum. However, they did find that it was occasionally glitchy. Some learners lost all of their 'points' and so were back at the bottom of the leader board despite their best efforts. The app also had a facelift halfway through the project, which confused both staff and learners when they logged back on.

It is difficult to strike the right balance between giving learners something to do and giving them something relevant and useful to current topics and skills, and between ease of use and usefulness. If an app is difficult to use or unreliable, it is no good to the busy teacher.

FlashAcademy falls down in some areas at the moment, although it also has merits, which come out in the research feedback; there were more positives from the more motivated adult learners.

Following on and inspired by the work on this project, we are considering which apps or platforms we would like to offer for our staff and students for the next academic year. This project has given us the tools to critique the different features they offer. We quickly challenged our own assumptions around digital learning and technology and will be spending time with the rest of the team so that they can see its benefits and be prepared for the year ahead.

Figure 4: Topics screen, with teacher and learner feedback including the multimodal format, the ability to repeat content as and when needed, and fulfilling a natural curiosity

Professional Development

This section uses the ETF's Professional Standards for teachers and trainers. Please note that this report refers to the 2014-2022 standards.

  • 1. Reflect on what works best in your teaching and learning to meet the diverse needs of learners.

    We utilised the electronic resource with a wide variety of learners, gathered feedback in various contexts and reflected on that feedback to inform how we could best meet the future needs of similar groups of learners. For example, we noted that the option to receive instruction in their mother tongue helped some learners, but not the majority, who preferred the simplicity of having both instructions and learning in English, the language being learned. This may inform our future use of similar electronic resources.

  • 5. Value and promote social and cultural diversity, equality of opportunity and inclusion.

    Our project involved learners from a range of backgrounds in terms of age, gender, ethnicity and disability. All were supported to participate, and those who struggled with the technology were provided with additional support. When we brought the learners together at the end of the project, they were able to socialise and meet people from other classes, usually based on other campuses. We managed to connect three learners who had come from a minority ethnic group within Afghanistan; they swapped numbers and have become friends.

  • 15. Promote the benefits of technology and support learners in its use.

    Not all of the teachers involved were keen users of technology in the classroom. One in particular used it very little. This project has given her the confidence to reflect on her practice and to work with more 'techy' colleagues to trial new things in her classroom. While the teachers work in the same department, it is very large and they did not know each other, so the project provided an opportunity to share practice and resources.

    One of the other teachers sits in the middle and uses some tech but, during the project, she applied for an internal position of ‘digital champion’ to support college staff with developing their digital skills.