Using technology to motivate and engage GCSE maths learners
Basingstoke College of Technology
This project’s premise was to use an action research approach to investigate digital learning and the effectiveness of learner-led digital activities. This includes not just what programs and software work best, but also which methods and approaches engage learners most successfully. We have used technology with our learners for a number of years but we aimed to refine it, with promising results.
You can download a PDF of this report on the Excellence Gateway.
Summary
I am Joe Wydrzynski, the project lead and a maths lecturer at Basingstoke College of Technology (BCoT). We aimed to detail what methods of delivery will be best suited to fully engage Further Education (FE) GCSE maths re-sit learners. Looking at how learners engage in lessons and how this affects their progression is a passion of the maths team and the College. Developing digital learning has been a significant part of the College’s development for a number of years.
Because our learners are re-sitting students at a vocational FE college with minimal entry requirements, we get a diverse cohort. We wanted to establish the best strategies for digital participation for our learners. We feel that properly planned flipped learning helps define the learner’s maths experience at the College and provides a different process than school. Due to the pandemic, we also wanted to ensure effective learner engagement for a potential year of remote learning. Lastly, we wanted to improve digital confidence and competence in the teaching team.
Our research built on the College’s decision to introduce an extra hour of flipped maths learning each week and we set out to determine how best to digitalise our workbooks and which method of online assessment would be most suitable.
We focused the action research project on enabling better general digital fluency, wanting to ascertain what works best from a variety of software, which increased our motivation to experiment. We also examined what works in a digitally focussed, learner-led flipped learning methodology in comparison to a conventional teacher-led approach, and how each of these approaches influences engagement and achievement.
Rationale
As a leading college in technology, we’ve been successfully embedding all forms of digital learning. A few years ago, the College decided that in addition to learners having three hours of maths lessons a week, they would have an extra hour of flipped maths learning. This included supervision from non-maths specialists and almost exclusive use of the artificial intelligence (AI) focused maths website, Century. This, combined with WiFi difficulties, meant that these sessions did not go very well.
As a department, it was difficult to regain learners’ trust in using technology in maths, especially if they also had a negative experience using maths-based technology at school, with programs such as MyMaths (Dowker, Sarkar, and Looi, 2016). Adding into the equation the challenge of being in the middle of a global pandemic, I thought it wise to focus on how to adapt and refine our use of technology to improve both staff and learner capability and willingness for using all kinds of digital tools/ applications. Our research focus included, but was not exclusive to, how best to digitalise our workbooks and which method of online assessment is most suitable.
The project’s focus was always on how to improve success with our demographic of learners, considering factors such as their particular socio-economic backgrounds and previous grades. We decided to make the project a general betterment in overall digital fluency, rather than putting the focus on a particular tool/ app. The reasons for this are varied but, essentially, this is due to us wanting to ascertain what works best from a variety of different software and not wanting to be limited in scope, thus increasing colleagues’ motivation to experiment.
Almost all learners who previously did not achieve a Grade 4 at GCSE maths will sit GCSE again at the College. A very small percentage take Functional Skills (FS) qualifications. This impacts engagement and strategies for resource management, as learners who may have just scraped a Grade 1 will be doing GCSE again. We wanted to ensure that their learner experience is the best it could possibly be.
Approach
Activities
As is common with the nature of action research, our activities have evolved and changed throughout the year. There is further detail in Appendix 7 showing how the learning resources and strategies we have used have developed:
At the beginning of the project, we had an ideal of what ‘teacher-led learning’ entailed and what ‘learner-led flipped digital learning’ would involve.
Teacher-led learning (A3) is essentially what happens in an average GCSE maths class. A teacher lectures from the front of the class, presenting a variety of topics at their discretion. Learners have access to a paper workbook. The teacher might wander around checking learners’ work and possibly go through some model answers on the board. The lesson may then include a plenary exercise or assessment at the end.
Learner-led flipped digital learning (A4) is the opposite in many regards; it is somewhat asynchronous but always occurs with a teacher present. The teacher informs learners where to locate resources and can then spend the rest of the lesson supporting and working preventively. Learners take the lead on their learning, dictating the speed (timeframe suggestions are provided) and order, and are also given some freedom over how they present their work. Tuition is provided via videos and slides. Answers are often located digitally, sometimes timed, so the teacher can focus on stretching some learners’ knowledge or spending that bit more time with those who are struggling. Whilst often heavily structured, the learner-led element comes from learners not having everything presented to them directly in a lecture; instead, independence is encouraged. Towards the end of a session, the teacher would then have learners take an assessment (Nouri, 2016).
At first, we used learner sets (that are split by previous GCSE maths grade and vocational area) and embedded differing teaching and learning techniques to see what was most effective in our lessons. For example, one group had a standard teacher-led maths lesson whilst another group completed the exact same work but with resources digitalised. The group that had the resources digitalised worked via a laptop in a more learner-led digital approach (A1). Both groups then completed the same assessment upon completion of learning (A2).
Later in the year, especially when moving to remote learning, we switched to almost all learners having a teacher-led digital lesson. When appropriate, these sessions became learner-led for particular topics. This developing pedagogy did alter our feedback focus too, and all the allocated weeks are detailed further in the appendices (D1-3).
One word of warning I would offer to those intending to take part in a similar project: be careful about the initial week when you introduce new technology. You will read further on about an issue I had at the start of the project. Start simpler than you might first intend and slowly build fluency.
Assessment
We focused on both summative and formative assessment (A5 and A6) to help evidence the research outcomes. This focus would enable us to see if the interventions we had made helped the learners to make progress in their maths learning. The plan was always flexible: if a particular group simply could not perform with a learner-led approach, we switched instead to a teacher-led lesson and left the learner-led approach for another time or used it with another group. The learner-led groups comprised 6 sample sets throughout the year, to ensure that learning was not disrupted and that no learners were put at any form of disadvantage.
At the end of the class, both groups (teacher-led and learner-led) used the same online Google Form assessment to assess their progress in that lesson’s topic (D1/ D2/ D3/ A7). The data we required was recorded via online self-marking assessments in order to compare who scored better out of the teacher-led group and the learner-led group. I populated the data onto formula-prepared spreadsheets (Appendix 7, A8).
Through Google Forms analytics (A9), we could instantly produce data on these assessment results (A10/ A11/ D4). Whilst this data is indicative only and many variables were at play, it gave an initial indication of the differing levels of progress under the teacher-led and learner-led approaches. However, it was the learner and teacher feedback that was the most important aspect throughout the action research project (A12/ A13).
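The group comparison described above, populating self-marking assessment scores into formula-prepared spreadsheets, can be sketched in a few lines of Python. This is an illustrative sketch only; the scores and the `compare_groups` helper below are hypothetical examples, not the project's actual data or tooling.

```python
# Sketch: comparing mean assessment scores between a teacher-led and a
# learner-led group, mirroring the spreadsheet comparison described in
# the report. All figures below are hypothetical.
from statistics import mean

# Hypothetical percentage scores exported from the self-marking forms.
teacher_led = [55, 62, 48, 70, 58]
learner_led = [60, 66, 52, 74, 63]

def compare_groups(a, b):
    """Return the mean of each group and the difference in means."""
    mean_a, mean_b = mean(a), mean(b)
    return mean_a, mean_b, mean_b - mean_a

t_mean, l_mean, diff = compare_groups(teacher_led, learner_led)
print(f"Teacher-led mean: {t_mean:.1f}%")
print(f"Learner-led mean: {l_mean:.1f}%")
print(f"Difference: {diff:+.1f} percentage points")
```

As the report notes, a raw difference in means like this is only indicative, since group composition and topic difficulty vary between classes.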
Feedback
Gathering feedback from small focus groups of learners and teachers was imperative to decide how we could best adapt the assessment process for the following session (A14/ A15). All learners involved completed a survey (A16/ A17/ A18) asking what they personally felt worked or did not work for them. The feedback from learners and staff was of vital importance to this research and is how we truly gauged what parts of the teacher-led and learner-led approaches worked for our learners. I also conducted class discussions where appropriate.
Example
An important feedback topic that came up with most of the class was how I had instructed the learners to answer questions. When planning new digital approaches, we discovered that we had to be careful that the pressure placed on learners’ digital skills did not detract from the maths that we were ultimately trying to teach them.
To keep the integrity of the work document in place, I asked learners to answer questions via the comment feature instead of writing all over the document. This is often used in workplaces and industries. It keeps the document tidy and is easy for the teacher to see what has been answered.
This process proved to be rather advanced for most learners and required higher levels of computer skills than I envisaged learners needing. Whilst some learners were able to understand this practice and therefore improve their digital skills, the focus of the lesson had moved away from maths and heavily onto IT. Therefore, we adapted the original resources to suit the needs of the learners for the following sessions and an example of this refinement is shown below.
First attempt with learners:
By using the comments feature, the integrity of the page stays the same in the live document. Whether a learner uses 20 lines or 200 words to answer a question, all of the questions will be kept on the same page and in the same place as they were previously.
The group struggled with this. Therefore, we listened to their advice and used the format below going forward:
Learners now had a dedicated place to demonstrate working out and answer the question, in a similar vein to most exam questions. Teachers still continue to use the comment feature for marking work, assessing and providing feedback.
Another benefit of this approach is that learners who have had a digital action research lesson seem to be better at answering questions in the correct format, for example by using specialist maths symbols such as ² and ÷. Overall scores for topics that might be easier on paper than on screen, such as angles and linear graphs, have been lower than for topics that typically work well digitally, like ratio and percentages. This is only an 8% difference from the average across the whole cohort, so not a vast one. Scores for learners taking part in the project have generally been higher than those of the rest of the cohort, although this may be due to the groups we chose to take part. Still, it is an interesting result and yet another reason why we will push for even more digitally-focussed provision next year.
Professional Learning: Evidence of changes in teaching, learning and assessment practices
Firstly, the feedback we received from learners has been extremely detailed, well balanced and incredibly useful. I will admit that I was not particularly convinced that we would get much honest feedback from most of our learners based on my previous experiences working with them.
Our engaged and motivated learners were likely to support the project. However, many of our learners dislike maths lessons, due to being forced to retake the course, and so I thought these learners would offer little or no feedback. We have completed surveys before, from asking how learners would prefer the lesson structured, to asking if learners want revision classes, but as a department we had never tried asking for opinions on resources and delivery.
What we have received has both humbled and inspired us. Here are a few of the quotes we received:
“It is great that the college want to get better, to help us to get better”.
“The stuff we did today (taking part in the project) made me feel like I’m involved in something that will make me actually get a grade 4”.
“I’ve never felt so good at using computers”.
“It has made me realise that I can use my phone to revise as I don’t have a computer at home”.
We have placed additional quotes and feedback from the team in Appendix 4.
Some of the most useful and, subjectively, most important feedback has come from learners who do not participate much in lessons. Thanks to the project, we now know why. From this feedback, we have so far implemented the following changes:
1. We have become more mindful that the pressure placed on learners’ digital skills must not detract from the maths that we are ultimately trying to teach them. At the same time, we have capitalised on the finding that learners answering questions online seem better at answering questions in the correct format, using specialist maths symbols.
2. The paper and digital workbooks have been designed differently, so learners can edit their digital work in a more natural and simple way, including having a dedicated space to demonstrate working out. This is a template we now use for all physical and online workbooks, and the refinement became vital during lockdown’s remote learning. 89% of learners asked agreed that the new layout was better. When we asked one of the case study learners which format of workbook he preferred, he gave the following feedback:
“It is better that I know where to write now as before it was messy.
I like when work looks tidy, it makes me want to try harder” – IB. (D5/ D6/ D7)
3. Our weekly exit ticket assessments that used to be taken at the end of lessons were designed to test learners’ knowledge on all of the topics taught in that lesson (D8). We now do mini assessments (topic tickets), at the end of each topic being taught, then move on to the next topic before a further topic ticket and repeat (D9). This break-up in assessment has helped engagement and learners generally prefer this method (69%). Staff have also agreed that this is more suitable for our demographic.
4. When completing a project session, we started to teach the independent part of the lesson in small segments (30 minutes), rather than in one large portion (1 hour 20 minutes). This was to keep up motivation, as most classes did not like being left to their own devices for long periods (57%). Teachers also noticed motivation drop after the 40-minute mark. Across the whole cohort, learners commonly enjoyed working independently (73%), but they felt that a flipped learning session of longer than an hour was undesirable. We found that many adult learners (21+ years) did not appreciate the independent parts of the lessons, whether half an hour or more (54%).
5. Learners like using technology and tell us that they see the advantages of being confident with its use. They found challenging topics difficult whether or not they were using digital mediums, but felt they performed better using a computer. However, the adult group, more than any other, appreciated the independent aspect the least. This may well be because of their level of maths comprehension (a lower grade than previous groups). In teaching, we know that, generally, learners of lower ability will struggle to work independently; I now have some quantifiable evidence. These learners scored, on average, 27% lower than usual in their assessment when learning independently. This could be because of the topics that came up or because of the digital aspects of their learning, but many in the group (50%) made clear that they do not wish to have an independent session again.
6. We have had a large amount of positive survey results in regard to both digital learning and independent studying. The following percentages are for surveys given to all areas of the cohort, throughout the year.
- Learners generally feel their technical skills have improved because of the project (71%).
- 67% felt that their ability to use technology within maths increased over the year due to taking part.
- 65% felt like they are more likely to independently try to revise at home from now on and 69% are likely to do so via digital mediums.
- 74% agreed that they are now more confident in using software related to maths to revise.
- 71% said that they have to be skilled in digital areas in order to flourish in their futures (up from 48% in the first term).
Secondly, the other aspect of the research that we are proudest of is the increase in technical skills, not just among the learners but also among staff engaging in digital resource making. This upskilling could not have come at a better time, due to the series of national lockdowns. A member of the team said:
“While I have generally felt a level of comfortability with technology, I feel that this year has opened my eyes to how digital resources can be so beneficial in streamlining our usual processes and opening doors for more collaborative working.
In recent circumstances, where we all have had to adapt our approaches, experimentation with technology has also taught me how online digital means can provide students with access to learning outside of the standard classroom environment and helped me gain a more flexible mindset to delivering the subject.”
During the last academic year, our teaching team were teaching remotely for about 9 months. Both the main curriculum (for remote learning) and the project have involved the learners using digital resources, which in turn means the teaching team has had to use and adapt those resources. While our teaching team are open to new ideas, some of the staff have not always been confident in their own digital skills.
We have now seen colleagues use Google Classroom for the first time, to great effect. We have also witnessed new staff develop and create digital workbooks for the learners to use, which are improving all the time in terms of presentation and adaptability. All of this has built confidence with digital teaching and learning. Further quotes can be seen in Appendix 7 (B1/ B2).
Having to use the same programs as the learners in order to teach the specification, the team have embraced the challenge and advanced their own knowledge of how these systems work. The feedback we received from the learners has not only helped us with the project but has also helped us to prepare our remote learning package. Our team now know how best to make resources to suit online delivery.
Staff have, for the first time:
- Made live question Google Doc workbooks (B3)
- Created assessments on Google Forms and Jamboards (via iPads) (B4)
- Opened up their teaching practice to try new things on Mathwhiteboard, Dr. Frost and Mathsbot
- Learned how to utilise a new marking tool, the Googledocs ‘Rubric’ (Google, 2021)
These developments will ensure that we improve our best practice and will lead to the team becoming stronger, with the learners rightly benefiting.
A statement from the deputy project lead:
“You could say I entered into the digital revolution that we now find ourselves in by dragging my heels. I was not the most enthusiastic advocate for the use of technology when, let’s face it, the exam is on paper!
Through participating in the OTLA 7 action research project my confidence has grown tremendously. I have completed some truly wonderful CPD with the ETF and I cannot thank the OTLA enough for all the support they have given me and the team.
Being a part of the project has allowed me to focus on and develop digital learning in a way that I would not have done without it. The action research has enabled me to be a part of the digital journey, and I feel that I have, with the learners, participated in this adventure together. The opportunities to reflect and hear feedback from learners have opened up a different approach and have been incredibly valuable, not just to myself but to the team as a whole.

I now feel confident to try new approaches and embed digital learning to enhance maths lessons, rather than feeling it is something I would try to fit in at the end of the lesson. I am truly humbled and amazed by not only the learners’ openness and adaptability to new approaches, but also the team as a whole. Joe has led the project in such a way that we have all felt part of it, and it has brought us as a team closer together.
I am now looking forward to our next steps and continuing the great work that the project has enabled us to achieve.”
If we did not have the rich feedback from the learners, this development would not have happened and the year definitely would not have been as seamless and successful as it has been. We feel the project has helped the team really listen to our learners.
We truly feel that engaging learners and staff on the project has raised expectations all round. Having the learners appreciate and understand that we are trying to improve our best practice in order to help them and knowing we need their assistance, gives the course a specialist and nurturing aspect. It has also helped us realise exactly what the learners can actually achieve.
Another important factor was the realisation that learners appreciate how we want to improve in order to help them develop. We are now working towards mastery of this practice, as we look to become excellent at it. We are always looking to give the department a prestigious and high-status appeal.
We have now been asked whether other colleges could come and see how we organise our online and digital practice, including our research project. We have presented our methods and findings at networking events for a number of years now and have been asked to present again. This work is boosting our team’s confidence tremendously.
Evidence of improved collaboration and changes in organisation practices
Due to our successes, confidence has risen fantastically in the team, especially in regard to digital ability and perseverance to try new things. One great example of this was the team adopting a new digital semi self-marking tool, the Googledocs Rubric.
Due to remote learning, we needed a way of assessing the learners’ ability on a group of topics at the end of half term two. One of the staff members, who had not had a particularly large role in the project, went ahead and found the required tool, learned how to use it, trialled it, and then helped us use it with our entire cohort. Rubric makes marking on a computer very quick and efficient (E1/ E2/ E3). It worked really well and we will use it in future for at least one assessment per year, if not more. I believe this would not have happened had we not been so successful in embedding the research feedback and ideas. Once confidence takes hold, a person is more likely to experiment and to have the willpower to succeed.
In the past, it was often only me introducing new technology from within the team. Witnessing experienced staff members now also having an increased confidence to experiment, has been an exceptional experience. Watching experienced staff members become experts in teaching with an iPad, using Jamboard and happily moving to using the Google suite, has been a brilliant experience. The fact that some staff may have been reluctant in using such applications previously only makes the development in their practice more meaningful (Ghurbhurun, 2020).
Evidence of improvement in learners’ achievements, retention and progression
We decided which learners we would track in the first term.
I chose a male learner (IB) and female learner (DE). IB was at the College last year whilst DE was a new learner to the College last September. We struggled at times to follow DE throughout the year, as her attendance became an issue. Fortunately, we still managed to gather a reasonable amount of data and feedback from both learners.
IB came to the College in 2019 from a specialist school, with a special educational needs background and various learning needs. He originally achieved a Grade 1 at school, sitting an exam in summer 2019, and had a fantastic academic year at the College last year. He is extremely hard working and, most importantly for the project, very approachable and open to giving feedback; when we ran a survey in the past, he offered meaningful suggestions. IB struggles with using technology, so he is a learner who I knew would provide much useful qualitative feedback. IB improved to a Grade 3 with last academic year’s predicted grades and also achieved a Grade 3 in the November re-sit. After another impressive year, we are hoping he achieves the elusive Grade 4.
DE was always going to be a particularly interesting learner to engage in a case study, due to having a very unfavourable experience with maths at school. She recalled many instances when she had been let down in terms of tuition. Fortunately, she has really enjoyed her time in maths at the College. DE became more and more confident throughout the year, to the point where she was happy to demonstrate her methods to the class. She is also open to giving feedback. DE found remote learning difficult during the previous lockdown, so as with IB, she provided a real test for how adaptive our digital provision was. DE received a Grade 3 in last year’s centre assessed grades.
It has been an overwhelmingly encouraging involvement for both and we predict that both learners will progress/achieve this year.
IB has had an incredible year. It is wonderful to see a learner from a special educational needs background develop so rapidly. He once struggled with using technology and would actively try to find alternatives to having to use it, but he is now asking, while in lesson, if he can revise on a computer. This has been excellent to observe.
DE struggled throughout the year for various reasons, so we missed many opportunities to gather feedback. However, her willingness and perseverance to work independently, using College videos and Century, is more than I could have asked for. She is a learner who once said, “doing maths on a computer is pointless”.
We have seen a nervous learner who could not complete a worksheet without asking for help grow into a confident learner who will happily catch up on missed work at home. This is the exact kind of scenario and outcome we were hoping for at the beginning of the project.
I have included fuller evidence and scope in the appendices (Appendix 4) for both of these learners, including outcomes and improvements.
Another element we are particularly proud of is the attendance of the main cohort. This includes both progress when going remote and when learners returned to college for face-to-face teaching. Our remote learning package provided by the department was both structured and purposeful. This, in no small part, is because of the action research project and learner feedback received. We strived to produce quality online lessons. Having members of the team with digital backgrounds made it work all the better. As we were able to keep our teaching and learning engaging throughout the lockdowns, our attendance upon returning face-to-face was surprisingly good. In comparison to other courses at the college, we are proud of how many learners attended since returning to college for face-to-face lessons.
Throughout the year, it has been interesting to analyse how the data has developed and changed. For example, at the beginning of the year, a particular class felt that the digital parts of the lesson meant that, overall, they did not perform as well as normal (72%). However, when the same class was asked the same question near the end of the year (after another lockdown), the result was completely different (20%). This kind of shift in the perception of technology was a common theme. When we asked a different class after their first flipped learning digital lesson whether they were more or less likely to try a lesson like this again, only 54% said yes. When asked again later in the year, the same class answered ‘yes’ at 85%. The department, as well as the project, had clearly proved its worth in creating a better digital learning environment.
Learning from this project
Staff confidence, cohesion and adaptability
We now have a more confident team in terms of their digital skills as well as personal willpower to try out innovative pedagogical methods. Embedding the perseverance to grow with the learners and adapt new techniques has been an emerging theme from this year. I find that the newer members of the team have developed fresh confidence in their ability and are happier to go ahead with their own decisions, from picking out differentiated tasks to creating shared presentation resources.
Some of the more senior staff members, who may not have been comfortable with their own technical abilities, have now developed to the point where they willingly try new software and then introduce it to the team. The project, with the impetus of the pandemic, has helped the team develop and become more determined to move even further towards being a digitally specialist area.
It isn’t only staff who have grown in confidence towards using digital means and methods. We have seen some great examples of our project having a positive effect on learners’ perceptions of digital learning. Since returning to face-to-face classes, most of the team have had multiple learners opt to use a laptop for class revision, which would have been an utmost rarity in the past.
- 81% of the learners asked said they are more likely to use technology to learn independently in lessons from now on.
- 63% said they are more likely to want to use technology in ALL lessons (including vocational).
- Learners have become happier to use Century (71% said they found it useful for learning how to use fractions) in comparison to a few years ago, when a majority of learners disliked using Century.
In conclusion, our mix of digital mediums/ methods to sit alongside the traditional approaches (that we now know work with our demographic), has resulted in one of the most positive experiences the department has had in many years. The success is all the more notable due to the fact that it has also been a year of a global pandemic. We are extremely proud of what we have achieved.
Appendices
Appendix 2 – Background information about the region
Appendix 3 – Topics and adapted resources
Appendix 4 – Case studies (learner journey), data and feedback
Appendix 5 – Digital methods to sit alongside traditional approaches
Appendix 6 – Final conclusion, recommendations and moving forward
Appendix 7 – Evidence references (A1-18, B1-5, C1-4, D1-9 and E1-6)
References
Dowker, A., Sarkar, A., & Looi, C. Y. (2016) Mathematics Anxiety: What Have We Learned in 60 Years? Frontiers in Psychology, 7, 508. Retrieved on 30/06/2021 from https://doi.org/10.3389/fpsyg.2016.00508

Ghurbhurun, R. (2020) If we don’t upskill teachers in digital skills, learners will suffer. Jisc. Retrieved on 30/06/2021 from https://www.jisc.ac.uk/blog/if-we-dont-upskill-teachers-in-digital-skills-learners-will-suffer-30-nov-2020

Google (2021) Create a Rubric using Googledocs. Google. Retrieved on 06/07/2021 from https://sites.google.com/a/mail.brandman.edu/edsu-533-classroom-tutorial/create-a-rubric-using-google-docs

Nouri, J. (2016) The flipped classroom: for active, effective and increased learning – especially for low achievers. International Journal of Educational Technology in Higher Education, 13, 33. Retrieved on 30/06/2021 from http://educationaltechnologyjournal.springeropen.com/articles/10.1186/s41239-016-0032-z