The Power of Data & How It Can Inform Learning

June 9 @ 10:00 - 11:00 BST

Teachers making notes from data.

We all know that the word assessment can send a shiver down the spine of any teacher: the sense of dread, the impending data drops.

High-stakes accountability and league tables have left their mark on education.

You may or may not agree with the need for exams, tests and end-of-key-stage assessments. Either way, there is something to be said about the way assessment has been used over the last 10-20 years: it has caused some unintended knock-on effects.

If you didn’t get the chance to attend the live webinar, take a listen to the recording below: 

Webinar Transcription:

Matt: Hello there. The focus for today’s session is the power of data and how it can inform learning. It’s always interesting that whenever we do a CPD webinar on data, we get a huge turnout, and this one is no different. The things we’re going to focus on today are particularly how data can inform learning. So we’re going to talk about lots of different types of data, how we use them at Learning Ladders, how our schools use them, and share some best practice. Hopefully that’ll be useful.

My name is Matt. I’m the founder here at Learning Ladders. I also chair EdTech at BESA, and I’m a founding member of a group called the EdTech Evidence Group. We look to help schools source appropriate technology by understanding evidence of efficacy: making sure the products actually work and do the jobs that you want them to do. And occasionally I do some work with the DfE as well.

So, some key points: what does good assessment look like? Good assessment is there to identify starting points of learning that you can then build on for individual children. It is also there to give you a review of what they can and can’t do at any point in time. And clearly, from a data point of view, it is very often used for reporting and effectively auditing learning processes. That’s quite often where the problems start, because an awful lot of the schools that come to us at Learning Ladders have historically relied on a tracking-based approach for school data. And tracking is not assessment. Data from tracking won’t improve teaching and learning, and the whole area becomes slightly muddled. So we’re going to try and untangle that slightly during the session today.

Have a think specifically about the difference between formative and summative assessment and what the correct use of each is. The answer, very simplistically, in our opinion: formative assessment is the stuff that you do day to day, subconsciously or deliberately, but it’s ongoing assessment. It feeds back into your teaching and learning: responsive teaching, assessment for learning, whatever you want to call it. It’s the stuff you do to judge whether your children have achieved the aims and objectives that you’ve set out in your school, based on your curriculum. So formative assessment should be highly tailored to your individual children, your setting, your curriculum and your particular needs. This is different from summative assessment, by which we mean commercially available summative assessment schemes: GL, NFER and all those kinds of providers, who will give you a snapshot in time based on a standardised test that can be administered in any school around the world in the same way and should provide similar results, enabling comparison across schools. Those two types of data should clearly be used in parallel wherever possible.

So most of the time at Learning Ladders we’re really interested in formative information, because the whole purpose of what we do here is to think about anything that will improve the teaching and learning process and help children achieve more, and data is clearly an important part of that. So that’s what we believe good assessment looks like, and that’s what we mean by formative and summative assessment. Importantly, around all of this, we talk a lot about setting the conditions for success: your data is only as good as the quality of the curriculum you’re evaluating. It’s only as good as the assessment policies you have in place. It’s only as good as the accuracy of the teacher judgements, and only as good as teachers actually entering data onto whatever system you’re using in an accurate and timely manner. So you can have the most sophisticated data analytics set-up in the world, but if the raw data going into it isn’t very high quality, you’re going to get very poor quality out. Likewise, you can have the opposite: a super simple system with very high quality data going into it, and chances are you’re going to get better understanding and better insights out of it.

Also, be specific about when tech can help and when sometimes it doesn’t. Tech can be really useful because, even in a primary school where you have maybe one class of 30 children and you get the opportunity to get to know them really well, it’s virtually impossible to retain all the learning objectives, all the misconceptions, all the real strengths and areas for development for every single child across every single subject. You need a system to help you do that. You need a way of sharing that in detail with your future colleagues. And you need a way of enabling senior leaders to get a helicopter view of what’s going on throughout the school so they can make systemic improvements over a number of years. Systems like Learning Ladders are extremely helpful for that side of things.

Obviously, in our particular case, it then automates an awful lot of the other tasks, like homework, reporting and upskilling parents. Where technology doesn’t help is when the technology is leading the process. Some of you have probably worked in schools (I certainly have) where assessment in the school is talked about as assessment, but it’s actually tracking, driven by a tracking system which isn’t flexible, doesn’t really help teaching and learning, and exists primarily for audit and inspection purposes. That isn’t helpful data, and it’s not what we’re interested in today. It’s perfectly achievable to use data for genuine teaching and learning improvement and still have fabulous data for inspection; the two are not mutually exclusive at all. As far as we know, we have never had a school anywhere in the world go down in its inspection rating while using Learning Ladders for its assessment.

And then the final point is obviously questions to ask. Data should prompt questions. Data isn’t necessarily, in our view, always about getting an answer. Data is quite often about identifying where you should be asking better questions and where you should be looking for more information. So a lot of the ways of thinking about data are predicated on this idea that it measures something definitively. It gives you an answer. It reports on something. Actually really good data in education should quite often prompt you to ask more questions. And we talk about this a lot. And all schools talk about this a lot as well. How can you use data to prompt better questions, which enables this sort of sense of curiosity and discovery? So a few basics as well.

Good principles of formative assessment: clarify understanding and share learning intentions. If you don’t know where you’re going, you’re clearly never going to get there. And sharing that with students, being explicit with students that this is what you’re doing, this is what was expected and this is what comes next, is clearly great from their learning perspective, from the metacognition point of view and all those other wonderful things. But it matters from a data perspective as well: if your teacher judgements are based on conversations with children, and children are involved in that process, you have a far greater chance of the data going into whatever system you’re using being accurate, consistent and reliable. If the data going into your system is generated by teachers asked to do a data drop at the end of a half term, it’s highly likely to be inaccurate, and highly likely to be manipulated, because teachers know what the required thresholds are. If data is generated by teachers after the event, searching for evidence of learning in an evening or at a weekend, a long time after the topic has happened, again it’s likely to be very inaccurate. So do it in the moment, involve the children, do it for the purpose of teaching and learning, and record it in the moment. Then you’ll get really high quality data. If you’re just doing it for data drops, you’re really unlikely to get high quality data. Start those classroom discussions, because that’s where you’re going to get the really good insights, and provide feedback that helps move learners forward.

So, coming back to the very first slide and the purpose of assessment: formative assessment is about identifying starting points and helping develop learning. If you’re not feeding that back to the learner, you’re clearly missing the main opportunity for that data to have a purpose and improve learning. Again, it’s simple stuff. It doesn’t matter how you do your data, but if your data is there purely for reporting, and you’re pulling data inputs from all the teachers, aggregating them into a report that then sits on a shelf or on a drive somewhere and doesn’t feed back into the teaching and learning process, that data process is broken and it’s not going to improve learning. It needs to be visible. It needs to be in the moment. It needs to help teachers and inform the teaching and learning process as it’s happening, not sit there as some sort of audit tool for somebody who wasn’t in the room at the time the learning actually happened. That’s not the purpose of this. And if you’re doing that and you are having those conversations, you can do things like activating students as learning resources for one another, giving them more ownership, and all that kind of stuff.

So the point here is that data shouldn’t be seen as something that’s siloed. It’s part of everyday teaching and learning. It should be part of the process. It shouldn’t be driving the process, but it should be part of the process. That’s part of the picture.

OK, obviously, in terms of what you’re aiming for in a school more generally: you want teachers, students and parents working together. You want students as independent learners. You want parents involved and scaffolding learning, not just ticking off homework tasks that they’ve read about for five minutes; you want them involved in the learning. And you want responsive teaching: you want teachers to have information and data that they can respond to. So another critical part of data is that it’s accurate, that it’s live, and that it’s shared across all key stakeholders in an effortless and intuitive way. Again, if it’s just siloed with the assessment manager and SLT, and it’s there purely in case there’s an inspection, it’s really not working hard enough, and most teachers will probably object to collecting it on a workload basis, with some justification. So share the data and make sure that it’s working hard. The way we do that at Learning Ladders: obviously people are entering data on the system and using it to record assessments, but then that data is working really hard. It’s giving you scenario-based reports, it’s helping you upskill parents automatically, it’s doing loads of other stuff. If your data entry only results in tracking on a graph, it’s not a great use of your time. You need that data entry to work harder, because the data is far more powerful when it’s shared.

Something to bear in mind as well, and something we talk about a lot at Learning Ladders: children know what they’re working on and what comes next, and the data will inform that. Teachers know exactly what every child needs, and the data will inform that too; really simple gap analysis type stuff. Parents know exactly what their child is learning and what needs to come next; again, the data will inform that. And clearly, from an SLT point of view, there is the data analytics side, maybe a more comfortable area that we’re more used to talking about. Thinking about all of these, fundamentally what we’re trying to do is generate better conversations about learning. Yes, part of the time data is there for inspections and we have to prove what we do, and it gives us an opportunity to show off our achievements and show how we’re on top of everything and managing the school. Of course that’s part of what we do. But really, it should be driving those learning conversations. If we switch that mentality from a historical, tracking-based approach to a genuine sense that the data is there to help learning, that again is a massive step forward and a massive win. To go through a few things very specifically: always ask, when you’re collecting the data and when you’re looking at the data, so what? What can you do with it? What’s the purpose of this data? What am I going to do with it? If there isn’t a clear purpose, it’s probably not going to be worth your time. Be specific: what is this data looking at, and is it the best way of getting the information to help me answer the question I’m asking? Ask yourselves those questions. To identify gaps in learning, a simple gap analysis is all you need.

For example, if I go into our live site, this is what gap analysis looks like. It is super simple: it’s just showing all the children against all the learning objectives. In this particular case we’re looking at Year 5 reading, and very easily I can toggle back. So, as a teacher, I want to know: what did my current Year 5 do when they were in Year 4? Are there any gaps in learning that I should be aware of? Yes, and it’s super quick, super easy. Were there any gaps in learning when my current Year 5 were in Year 3? Yes, a couple of children to be aware of, and it takes two seconds. That data is incredibly simple but incredibly powerful, and it is all you need at that particular moment in time. Yes, of course, the system will do all sorts of sophisticated analysis and produce a million graphs, but that’s not actually what you need at that moment. So it’s about getting the right data.
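To make that idea concrete, here is a minimal sketch, in plain Python, of the kind of pupil-by-objective gap check being described. The record format, status labels and objective names are invented for illustration; this is not the Learning Ladders data model or API.

```python
# A minimal sketch of the gap analysis described above: one row per pupil,
# one column per learning objective, with gaps flagged wherever an objective
# has not yet been secured. All fields and values are illustrative.

from collections import defaultdict

# (pupil, year group when assessed, objective, status) -- made-up data
records = [
    ("Amara", "Year 4", "R4.1 Infer meaning",        "secure"),
    ("Amara", "Year 4", "R4.2 Summarise main ideas", "working towards"),
    ("Ben",   "Year 4", "R4.1 Infer meaning",        "secure"),
    ("Ben",   "Year 4", "R4.2 Summarise main ideas", "secure"),
    ("Chloe", "Year 4", "R4.1 Infer meaning",        "not assessed"),
    ("Chloe", "Year 4", "R4.2 Summarise main ideas", "working towards"),
]

def gap_analysis(records, year_group):
    """Return {pupil: [objectives not yet secured]} for one year group's work."""
    gaps = defaultdict(list)
    for pupil, year, objective, status in records:
        if year == year_group and status != "secure":
            gaps[pupil].append(objective)
    return dict(gaps)

# "What did my current Year 5 class leave unsecured when they were in Year 4?"
for pupil, missing in gap_analysis(records, "Year 4").items():
    print(f"{pupil}: {len(missing)} gap(s) -> {missing}")
```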

What is the quality of that data? We talk about this endlessly, but it is so important in educational data. So many times data is fudged or faked in schools because it’s there only for accountability purposes. If data is being gathered purely for an inspector, it is very likely that that data will be manipulated, either deliberately or subconsciously, because it’s not actually there to inform teaching and learning. We need to change that. So in the UK, in England, Ofsted, our inspectorate here, has now specifically said to schools that they’re aware this is an issue: they know that if they inspect internal assessment data, that gives schools an incentive to manipulate it. So they now specifically do not look at internal data. They’re not saying that they don’t expect you to keep it, look at it, track it and manage it; they’re just saying that they know that if they ask to look at it, it’s probably going to be manipulated. So they do expect you to do it, but they’re not going to check it, so you don’t have to fake it. That’s a really important shift that we expect other countries and other regulators around the world to follow. I touched on the work-reward ratio as well, but this is critical. Teachers are super busy. There is always a debate about workload. No teacher I have ever met is afraid of hard work, but no teacher I’ve ever met likes wasted time, because they’re busy. So, like I said, if data to them means getting an email from SLT at the end of a half term asking them to do a data drop that goes into an aggregated spreadsheet and sits in a file somewhere, that’s a waste of their time and they see no value in it. If data enables them to save time on other things, get insights into learning, improve their teaching practice, automatically engage parents, and do their end-of-term pupil reports, homework, learning and all sorts of other stuff, clearly that is a good transaction from a time point of view.

Now, does it feed into an improvement process? That’s another thing to think about with data. If it’s stop-start, if we’re just collecting data for our results this year (we had a good year, we had an indifferent year, we had a challenging year) and then we do the same thing next year, that clearly is not a great use of data. The process should be that the data enables you to interrogate what you do so that you can identify things you might want to change. In education we’re quite guilty of typically looking at the children coming into the school machine, then analysing how far they get and where they get squirted out in different directions, and which groups go in which directions, but we don’t really look at the machine itself. One of the things we do a lot at Learning Ladders is enable schools to look at their internal data from a curriculum performance point of view. So, for example, do we consistently see, year in, year out, that boys in Year 4 struggle with poetry? If that’s been the case for the last two years, it’s a reasonable assumption that it might be the case this year, so we can pre-plan for that and put an intervention in place proactively. That is using data to improve the learning of a future cohort, as well as using data to improve learning from an ongoing teaching and learning perspective.
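As a rough illustration of that kind of year-on-year curriculum check, the sketch below flags strands that dip for cohort after cohort so they can be planned for proactively. The figures, strand names and 70% threshold are invented for the example and are not drawn from Learning Ladders.

```python
# A rough sketch: for each curriculum strand, compare how successive cohorts
# fared so recurring weak spots (e.g. Year 4 poetry) stand out.

from collections import defaultdict

# (academic year, curriculum strand, proportion of pupils secure) -- invented
strand_results = [
    ("2019/20", "Y4 Poetry",    0.58),
    ("2020/21", "Y4 Poetry",    0.61),
    ("2021/22", "Y4 Poetry",    0.57),
    ("2019/20", "Y4 Fractions", 0.82),
    ("2020/21", "Y4 Fractions", 0.69),
    ("2021/22", "Y4 Fractions", 0.85),
]

THRESHOLD = 0.70  # below this, treat the strand as a concern for that cohort

by_strand = defaultdict(list)
for year, strand, secure in strand_results:
    by_strand[strand].append(secure < THRESHOLD)

for strand, flags in by_strand.items():
    if all(flags):
        print(f"{strand}: below {THRESHOLD:.0%} in every recent cohort -> plan an intervention")
    elif any(flags):
        print(f"{strand}: dipped in {sum(flags)} of {len(flags)} years -> keep an eye on it")
```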

And then the final question with all of this is: what impact does it have? Again, Learning Ladders have done a study, and we know that schools which switch from tracking systems to Learning Ladders typically see around an 11 percent improvement in their primary results, on average, for every subject, within two years. How does that stack up against your processes? Do you have a mechanism in place for evaluating the impact of your data analysis? So it’s not just about using data well; it’s about then interrogating your use of that data and the impact that it’s actually having.

A few obvious points. Good data in education relies on a bespoke curriculum. It relies on a clear assessment policy and a simple system, with simple variations within that system: teacher assessment, self-marking quizzes, low-stakes quizzes, open-ended tasks and all that kind of stuff will give you different types of data and lots of rich information when you’re looking at systemised data and calculated judgements. Be aware that nowadays you shouldn’t even think about using a system that doesn’t enable you to tailor the curriculum yourself, without having to go to a help desk and pay for it. That should be a basic minimum. You should also be able to completely customise the settings for your assessment policy: how many milestones you have, what your language is, and so on. But you should also be able to tailor the underlying algorithms and calculations upon which the system bases its automated judgements.

If you’re using a system and you’re trusting that system’s judgement about which children are on track, above or below (whatever language you use), you need to be able to understand, manipulate and customise that calculation. If the algorithm is held in a secret black box, and teachers enter data and then, magically, out of the other side of that black box comes a judgement that doesn’t seem to make sense, teachers will be tempted, because they don’t believe that judgement or they know it not to be true, to manipulate the raw data instead, which completely defeats the whole point of doing any data analysis. So, again, using a system like Learning Ladders, where you can customise not only the curriculum and your assessment policy but also the algorithm, which I think is unique, gives you a far better chance of getting accurate automated data calculations. So, something to bear in mind: there are three elements of automated data calculation in education, the curriculum, the assessment policy, and then the algorithm that calculates it. If you can’t lift the hood on that, get underneath it, pick it apart, change it and customise it however you like, you’re never going to get the accurate data that you’re looking for.
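As a minimal sketch of that point about transparent calculations, the snippet below expresses an "on track" judgement as an ordinary function of coverage against thresholds the school can see and change. The band names, thresholds and field names are assumptions for illustration only; this is not the Learning Ladders algorithm.

```python
# A sketch of a customisable, visible "on track" calculation rather than a
# black box: the judgement is just coverage measured against the school's own
# assessment policy. Bands and thresholds below are invented.

from dataclasses import dataclass

@dataclass
class AssessmentPolicy:
    """The school's own band boundaries; nothing is hidden from teachers."""
    secure_threshold: float = 0.80      # >= 80% of taught objectives secured
    developing_threshold: float = 0.50

def judgement(objectives_secured: int, objectives_taught: int,
              policy: AssessmentPolicy) -> str:
    """Turn raw coverage into the judgement language the school has chosen."""
    if objectives_taught == 0:
        return "not yet assessed"
    coverage = objectives_secured / objectives_taught
    if coverage >= policy.secure_threshold:
        return "on track"
    if coverage >= policy.developing_threshold:
        return "developing"
    return "below track"

# Because the calculation is visible, a school can tighten or loosen it and
# teachers can see exactly why a child lands in a particular band.
policy = AssessmentPolicy(secure_threshold=0.75)
print(judgement(objectives_secured=18, objectives_taught=24, policy=policy))  # -> "on track"
```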

A thought about this in terms of creating the conditions for success with children: it’s a matter of sharing the information and bringing it to life for them, and again that can be really simple. We’re talking about data today, so I won’t dwell on this point, but in terms of getting accurate data, one of the things we do very simply, as well as having the digital platform, is that the system has publishing technology which enables you to create your own exercise books. A lot of schools prefer this because it means children can have very specific learning journey booklets and can take ownership of their learning. And because they’ve taken ownership of their learning and are very heavily involved in their own assessment, the assessment data for that school is not only much easier to gather but much, much more accurate. It’s a slightly tangential point for a data webinar, but ways of getting accurate data are quite often not simply about the calculation on the spreadsheet or the fields in the search functions; it’s about how we get the quality of the raw data to be as strong and consistent as possible. Like I said, the data is there to help, not merely to track, so you need to do all of these things: tailor the algorithms, share excellence, and so on. We tend to look at data as a deficit model, finding gaps in learning, finding issues, finding interventions, but we can equally find centres of excellence. We can find success, share it and replicate it. We can look at short-term and long-term patterns within a school, within a cohort, within a curriculum, and identify proactively the things we want to do more of and the things we want to rectify in some way.

This is our gap analysis; I went into it live just to show you how we do it. It’s something we do at Learning Ladders that I think is really useful and that you should look to be able to do in your own systems. This is looking at a particular part of the curriculum and comparing how different cohorts have gone through it over the last three years. What we’re looking at here is, I think, a Year 4 maths curriculum, and we’re looking at the different aspects of that curriculum: by the end of Summer 2, what were the overall assessment points for children in the different year groups, the current Year 6 when they were in Year 4, the current Year 5, and the current Year 4? This kind of analysis, when you do it for a full academic year, will enable you to identify those areas of the curriculum where you may want to revisit your curriculum, do some team teaching, share best practice, put an intervention in, whatever it may be. So again, using data for those specific educational purposes is really important. Now, on to different types of data; we’re going to do a webinar specifically on assessments.

This is one of the dashboards for GL Assessment. It’s showing CAT4 data, Progress Test data for different subjects, multiple years of CAT4 data, multiple years of Progress Test data, and PASS data on pupils’ attitudes to self and school. The idea of looking at the softer side, for want of a better phrase, of attitudes to learning gives you a much more rounded picture of a child. So when you’re looking at data to improve learning and you find data suggesting a child is struggling with maths, the answer may not be to give them more maths or different maths. The answer may well come from a different kind of data, something like PASS data, which might suggest there’s a challenge with attitudes to learning that is actually underlying the maths issues. So the ability to look at lots of different types of data very easily, for an individual child or a group of children, is again best practice in terms of using your data in your school.

Obviously you can use that data in other ways too. A really simple way of using something like your Progress Test data from a provider like this is to do a simple regression analysis. What we’re looking at here is children from one point in time to another, and how their SAS score has changed from one to the other. All the children above the line are making positive progress. The children down in the bottom-left quadrant are below 100, which is below where you would want them to be; the children in green, however, are making progress, so they’re not quite where you’d want them to be yet, but they are moving in the right direction. Conversely, the children in the top-right corner are way above expectations, way above average, but they’re actually going backwards: although they’re significantly above, they’re less significantly above than they were at the previous test, and that may be a cause for concern. So triangulating your data in education is really important, because different data will tell you different things. It’s highly likely that these children’s formative assessments may well stay stuck at significantly above or significantly below age-related expectations, and you’ll only really pick up those smaller changes by using something like the Progress Test data. So looking at it in the round is a really good way of doing it.
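As a small illustration of that quadrant view, the sketch below classifies each child by whether their standardised score sits above or below the average of 100 and whether it has risen or fallen since the previous test. The names and scores are made up for the example.

```python
# A simple sketch of the quadrant analysis described above: place each child by
# their standardised score (around the 100 average) and by whether that score
# has risen or fallen since the previous test point. Data is invented.

pupils = {
    # name: (previous score, current score)
    "Amara": (94, 99),    # below average but improving (the "green" group)
    "Ben":   (121, 112),  # well above average but slipping back
    "Chloe": (92, 88),    # below average and falling -- a priority
    "Dev":   (103, 109),  # above average and improving
}

for name, (previous, current) in pupils.items():
    band = "above average" if current >= 100 else "below average"
    trend = "improving" if current > previous else "slipping"
    note = ""
    if current < 100 and current <= previous:
        note = "  <- needs attention"
    elif current >= 100 and current < previous:
        note = "  <- above average, but check why the drop"
    print(f"{name}: {band}, {trend} ({previous} -> {current}){note}")
```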

The final point, and it wouldn’t be a Learning Ladders webinar without mentioning it: the whole point of collecting and using data in schools is to improve children’s learning, enjoyment and school performance, whatever you want to call it. And the single biggest, most influential factor in children’s learning is what happens at home: whether the parents, the adults at home, are supporting the child’s learning. This is not adults being aware of homework, or getting reward tokens or stickers, electronically or physically, or anything like that. This is adults being upskilled week in, week out, to specifically help, support and encourage the actual learning intentions being worked on in school. So it goes way beyond standard parental involvement, but it is by far the best way to improve learning, particularly at primary and in the early years. So if you’re interested in educational data, your data should be activating parents to get involved in their children’s learning. And that can be automated nowadays; it’s exactly what we do at Learning Ladders.

So we use every child’s assessment data from school to give them a live, personalised learning journey update and the supporting resources. Every child gets a personalised dashboard and learning journey identifying exactly what their teacher thinks they should be working on at the moment, for each part of the curriculum, for reading, writing, maths and science. We have broken the entire primary curriculum down into bite-sized chunks and created short tutorials that upskill the children, but also the adults at home, so the adults know how to help. This is a really powerful use of educational data: we’re taking the assessment data that teachers make for their own internal purposes and repurposing it, making that same data work harder so that it automatically informs students and upskills parents remotely, at scale, in 100 different languages, across all the subjects. All of that is using data intelligently, because you are activating a really powerful part of the learning process for no extra workload. So, again, thinking about how hard your data is working is a really critical part of the process, we would suggest. And it all needs to link together. We talk a lot about data being part of the process, but the process should not be for the data, and that is a really critical change that a lot of schools benefit from hugely when they move away from old-fashioned tracking. If you’ve ever worked in a school where the whole process of assessment, data, tracking and reporting is for audit and inspection, then the process is for the data. Everything is done for the data: the tail wagging the dog, to use an English expression. Data should be part of the process. It should inform and support the process, but it is not the process. The process itself is actually very simple.
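Purely as a toy sketch of "making the same data work harder", the snippet below maps the objectives flagged as a child's current targets to a bank of parent-facing tutorial links and builds a simple home-learning digest. The objective codes, URLs and helper names are all hypothetical; this is not how Learning Ladders is implemented.

```python
# A toy sketch: the objectives a teacher has flagged as a child's current
# targets are looked up against a bank of short parent-facing tutorials, so
# data entered for teaching purposes also drives the home-learning view.
# All identifiers and URLs are invented for illustration.

tutorials = {
    "M5.3 Equivalent fractions": "https://example.school/tutorials/m5-3",
    "R5.1 Infer characters' feelings": "https://example.school/tutorials/r5-1",
}

current_targets = {
    "Amara": ["M5.3 Equivalent fractions"],
    "Ben":   ["R5.1 Infer characters' feelings", "M5.3 Equivalent fractions"],
}

def parent_digest(pupil: str) -> str:
    """Build a 'here is what to help with at home' message for one child."""
    lines = [f"{pupil} is currently working on:"]
    for objective in current_targets.get(pupil, []):
        link = tutorials.get(objective, "ask the class teacher")
        lines.append(f"  - {objective} (how to help: {link})")
    return "\n".join(lines)

print(parent_digest("Ben"))
```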

This is the specific Learning Ladders process: a bespoke curriculum tailored for every school; a bespoke assessment policy; mechanisms to activate children, so that the raw data you’re putting into the system is as accurate and reliable as possible; teacher tools, gap analysis and simple analytics, so that teachers can actually use that data immediately and do something with it to improve their responsive teaching; repurposing the data and sharing it with parents and children remotely, so the data is working hard, has a much broader reach and is improving learning long after it was gathered; and then, obviously, the analytics that sit over the top of it all, so you can do things like curriculum analytics and design and everything else. In this way your data, however you look at it and whatever terminology you use, is part of the process and is working much, much harder for you, and will therefore improve results. And we know this works. We’ve been banging the drum for it for a number of years now, and we know that it works.

We get a lot of feedback on review sites, we’ve won lots of awards, all of that kind of stuff. Like I said, we’re doing a study at the moment which is showing an increase in performance, and we’ve never had a school go down in an inspection. There’s a ton of this kind of evidence, so we know that this works. It is perfectly possible to run school data purely for the purpose of improving teaching and learning and still have a really fabulous body of evidence for inspections; it doesn’t need to be the other way round. So that was a whistle-stop tour of some of the things to think about. We do specific webinars on a lot of the smaller areas that I’ve mentioned today, so if any of those are of interest, go and have a look. If you’re a member, you will be able to find the recordings within the help section; use the search functionality to find them. If you’re not a member, go to the website and you can find the recordings on the events page. Or reach out using the chat and ask us. And if you’re watching a recorded version of this because you were unable to attend the session today, and you have any questions, comments, thoughts or feedback, do let us know; we’re really, really interested in that.

And obviously, if you’d like to improve your data in school and you want to think about how we might be able to help you do that, then do get in touch. We’d be delighted to help. I hope that was useful. Thank you for watching the recording. Thank you.
