

Fireside Chat: Headteacher Chat & Learning Ladders

May 26 @ 10:00 - 11:00 BST

Headteacher Chat

We’ve teamed up with Headteacher Chat to provide you with a free, high-quality learning opportunity.

Join us in this ‘fireside chat’ with founder and CEO Matt Koster-Marcon, where we discuss in depth how to create better learning conversations, use assessments to inform teaching, and work effectively with pupil data.

Ask questions, engage and learn how to get the most out of your curriculum’s assessments.

If you missed this event, don’t worry: grab a cup of coffee and watch the recording here:

Webinar Transcription:

Jonathon Welcome to the third episode of our Headteacher Chat webinar, and I’ve got Matt here. We’re going to be talking a lot about assessments this morning.

To put it into context of where we are: this whole year has been very different for schools around the world, and here in the UK the standardised tests have not gone ahead. We have not got the SATs tests, and there are no Year 6 assessments going on this year, so everyone has had to find a new approach to assessment in school. So, what we’re going to discuss is how we can view summative assessment, how we can take the information from that summative assessment and use it to make an impact on children’s learning. That is something Learning Ladders does really well, and with the GL assessment it pinpoints where improvements may be needed in schools, and it’s very easy to use and easy to set up.

So, I would like to introduce Matt Koster-Marcon, the founder of Learning Ladders. We’ve been working quite closely with them for the last year, and it’s such an important teaching and learning platform, one that is really focused on the children and their learning.

Matt Hello. Thank you for having me.

Jonathon Can you tell us a little bit about how Learning Ladders was set up?

Matt The story of Learning Ladders? I used to be a primary school teacher in London. It was a sort of career change for me, actually; I came from a background in the commercial world, in marketing. And essentially, I suppose, Learning Ladders started off as a parental engagement tool, because in my schools, the real challenge that we had with trying to improve children’s learning was actually getting the adults at home involved in that learning. As everybody knows, if you can unlock the adults at home and parent power, then that will have more impact on progress than anything you can do in school.

So, the starting point was parental engagement, long before coronavirus, long before lockdown, long before remote learning; we were banging that particular drum. And then it seemed to make sense to connect formative assessment and curriculum design, because we wanted to start a conversation. What we wanted to do was not just have one-way communications with parents. We wanted to upskill parents, so they knew exactly what their child was working on, but also so they knew exactly how to help at home. And as part of that process, it clearly made sense to have a single platform which enabled the school to design any curriculum, implement any curriculum, evaluate it, but also share it and discuss it. The whole starting point for us was starting conversations; we endlessly talk about better conversations about learning.

And fundamentally, that’s what assessment is, it’s identifying the starting points for children’s learning, sharing it with children, explicitly sharing it with any other adults that are relevant and moving on from there. So, yeah, that’s how we started it. We started here in the UK where we’re based and increasingly in the last few years, we’ve worked with lots of international schools. So, we now have schools in about 25, 30 countries around the world and a whole range of different schools.

And you touched on GL assessments. We’ve recently built a whole load of dashboards to help people interpret their GL data. That’s not a particular promotion of their company; we work with them closely, we know and respect them, and it is a good product. We’re not selling GL above any other system particularly, it just happens to be a very popular one, so it’s part of the picture for us.

So, we do formative assessment, we do lots of conversations, you can do homework quizzes, you can do more formal formative assessment, and then you triangulate that against your summative assessments as well, to try and get a full picture of the child. So that’s the Learning Ladders history, if you like, in a couple of seconds.

Jonathon Do you want to go into a little bit more detail about the GL assessment? Because I had a look at the report you provided and it looked really, really good. You know, as a school leader in a school, you’ll be able to pick it up and identify the key areas in the school where you need to go and look a bit more closely.

Matt We’ve focused historically on the formative assessment side of things, and that’s about identifying where you can improve; it’s identifying centres of excellence for individual children, for classes, for groups, for teachers. And the way we approach that has proved extremely popular because it’s a scenario-based approach to data. Most people historically have confused school data with tracking, and confused assessment with tracking, and they’re obviously very different things. We were asked to look at summative data, and we built these dashboards.

So, the GL data is great. It’s very, very detailed. They have a whole suite of CAT4 programmes, Progress Tests and PASS tests looking at attitudes to learning, so lots and lots of data. And they provide their own reports, which are very detailed and should really be the fundamental basis for what most people are doing. But again, in terms of correlating that in an easy way for class teachers, we felt that there was a way of doing this.

So, some of the things that we do, for example, in terms of the dashboard: one of the challenges is comparing different sets of data. So, this here is looking at your current year data, so CAT4 data, and the various different parts of CAT4, against your Progress Test results for English, maths and science. And it’s just seeing, broadly speaking, and this is a deliberately over-simplistic way of looking at it, how a child’s Progress Test score compares to their CAT4 score. What does that tell you? What conversations might that spark? Then looking at trends over time in their CAT4 results, whether for an individual child or for a group of children.

So, looking at trends over time for Progress Test results in terms of where children are at, and trends over time in particular areas on their PASS survey, so their attitudes to learning, and comparing that to formative data. And we obviously have side-by-side data comparison for the two areas as well on an individual pupil basis. All of this is really just designed to start conversations. If I go here, this is what it looks like on a sort of live version, if you like: you can do all sorts of different analysis, and you can change your scores and view your data in various different ways.

But all of this is designed to start conversations. What is this telling me? What is it telling me, here at the moment, that this particular group of children are, on average, scoring at these values compared to their CAT4 test? What questions might I ask, because I can see there are some dips in their PASS survey attitudes to learning? This is about starting those conversations. This is about moving beyond ‘a child is struggling with maths, so I’m going to give them more maths, or I’m going to give them different maths.’ This is looking more broadly and saying, well, is the problem actually a broader challenge for them? Is it something that’s happened? Is it to do with their attitudes to learning? Should we be tackling that as part of our overall strategy for that child? Then sharing that with parents. You know, the ability to have a really detailed conversation with the adults at home, and with the child themselves when they’re old enough, is obviously hugely positive. That’s the purpose of all of this, and that’s how we do it.

Jonathon And for school leaders, do you have an overview of the whole school, looking at HPL assessment as well?

Matt The way the dashboard works is you select anything you like. So you can select an individual child and have a look at the dashboard for them, or you can select a group of children; obviously, the bigger that group, the broader it gets. But, yeah, absolutely, you could choose the entire school, or, I think, more schools might focus on a cohort or a peer group, particularly because of the way the GL tests are designed; they have a slightly different emphasis in different year groups.

Jonathon So shall we open up the floor and see if there are any questions from the people attending the webinar? Have they got any questions they would like to ask you?

Matt If there are any detailed questions about GL, then I’ll need to take them away and get back to you, so leave your email address. I actually have a meeting with them tomorrow, so I can raise it fairly promptly for you and get back to you. Or drop us an email, or a message on Twitter. The feedback that we’ve had so far on this particular aspect of it has been great, because, again, it’s not designed to replace the reports themselves. The GL reports themselves are far more detailed, but they take a bit of wading through; they’ll have the real detail behind every single child. This is designed to be a deliberate simplification of that, to give you really accessible information for everyone: for class teachers, for senior leaders, for your assessment leaders, whoever it may be. So that’s where this is pitched. And it shouldn’t replace those GL reports, because we don’t attempt to visualise on a dashboard things like confidence levels or the range of potential scores. So, you know, just use it for what it’s intended for, I would say.

Jonathon There we go, we have questions coming in. Can I compare my school with another school in the same country, as an international school?

Matt You can. I mean, you’d probably have to do it slightly manually at the moment, unless you’re both Learning Ladders schools; it would depend. Obviously, GL will provide both schools with the same comparable results and then you can compare, regardless of whether you do it through Learning Ladders. If you were both Learning Ladders schools, you could run the same search and compare your results. Is there the option within Learning Ladders to compare your school results with, for example, an average in your country? No, because we don’t have access to the full GL data; we only have access to each individual school’s data.

Jonathon And the next one, from John Roberts: we currently use Power BI to track GL assessments. How does Learning Ladders integrate?

Matt Power BI can integrate with Learning Ladders as well, and it’s a great tool for data visualisation. It tends to be a pro tool, though, so it tends to be used by schools who have someone on the payroll who’s been trained in Power BI. If you’re lucky enough to have that person, then that’s great, and they’ll do a lot of similar things. I mean, if you look at our dashboards and Power BI, the functionality is broadly the same. I suppose the big difference is that Learning Ladders is designed to be accessible, based on your school’s data, your formative data and everything else, very easy to share, and it’s part of the platform. So why would you go to the extra trouble of linking something else and creating another dashboard? To be completely fair, the flip side of that is that, with a lot of the schools we work with, one of the reasons why we integrate with Power BI is that you may choose to include other information in your Power BI dashboard which we just don’t have access to. So, some schools will link it in that way. There’s no blanket ‘Learning Ladders is better than Power BI’ or ‘Power BI is better than Learning Ladders’ for doing that, because it will depend on exactly what you’re trying to do. The headline, I would say, is that typically Power BI is very much a pro tool; the only way of inputting data into Power BI is through a spreadsheet, obviously, or through an API link, so that’s beyond a lot of schools. We’re pitching at everyday use, for the everyday teacher to be able to use.

Jonathon OK, the next question: I work as an analyst; do you feel that GL data aligns more or less with a particular type of teaching framework?

Matt That’s a very detailed question. I mean, most of our experience working with GL has been in the Middle East region, mainly in British Curriculum international schools, because obviously it’s mandated in that part of the world. In terms of overlapping with different regions, it’s a relatively new function for us, so personally I don’t feel I’m necessarily qualified to answer that one, I’m afraid. I would probably direct you in GL’s direction to answer that one; it’s not really fair for me to say something on their behalf.

Jonathon Next question: I was interested in what you said about assessment being different to tracking. Could you go into more detail about this and how we should look at assessments, as well as using them to support teaching?

Matt Yeah, this is a soapbox moment for me. Depending on the school, a lot of the time when we go into a school and they’re using Learning Ladders for the first time, we ask them to describe their previous practice: what does assessment look like in their school, and why are they doing it? The key driver behind the process that they describe is to audit teaching for somebody who wasn’t there at the time, normally an inspector. So, what tends to drive academic data is having to build a picture to evidence that you’re doing your job well, so that, should you be inspected, you get a favourable result. That tends to be the overriding thing, and most territories work in high-stakes accountability systems, so that’s incredibly important. But clearly what that does is distort the data usage. So, one of the things that we’ve worked very hard to do is to try and separate different types of data for different purposes.

And one of the reasons for combining your formative internal assessment data, and the aggregation of that to give you pictures of what’s going on in your school, with third-party commercial summative data, which is the same for every school in the world, is to build up different datasets for different purposes. What I mean by that is that your formative data, your internal data, should really be about painting a picture that’s specific to you as a school. It should really be about how it is going to help your teachers deliver the curriculum that you’ve decided is important for your school community, to the best of their ability, and provide them with all the information they need to make sure that the year group, the cohort and the individual pupil are, for want of a better phrase, kind of on track.

So, the kind of thing that we focus on is curriculum design and sensible assessment milestones, but also then things like: what are you going to do with that assessment information? So, we have linked resources. You know, we have a Curriculum Lab, so you can identify that this particular group of children in Year 4 are struggling with quadratic equations, and here are a load of fantastic resources from various different providers that work for this particular challenge. So, it’s moving assessment beyond just a long tick list, which gives you a very big graph, into something which is a useful tool for teaching and learning, because assessment is something that should be done with children, not something that’s done to them after the event with no impact on them. If your assessment waits until you get an email from senior leadership saying ‘I need your data’, and you spend two days at the weekend or in the evenings filling in data on a spreadsheet or some sort of commercial system to produce lots of graphs which sit on a shelf, and nothing happens with that information, that’s clearly a waste of time. It’s much, much better to use a mechanism and a process to start structured conversations. Use assessment for what it’s meant for.

Record that information, aggregate it by all means; that’s internal data. So, when you’re inspected, it’s perfectly reasonable to say: we have set out this as our curriculum, this is how we’re implementing it, and this is how we’re making sure that implementation is successful and comprehensive and that we’re not missing anybody out. And we benchmark ourselves against other schools and nationally by using the summative tests. So, we do a combination of the two. Even if your internal data paints a particular picture, it should be about starting that conversation. It’s an important difference, because if all you’re doing is entering tracking information onto a system, a point-in-time assessment of where a child is, and that doesn’t feed back into responsive teaching, doesn’t feed back into better, more personalised teaching and learning, what’s the point?

Certainly, here in the UK, Ofsted are very specific now: they don’t want to see internal data anymore, because they recognise that if they do, then it’s likely to provide an incentive to manipulate that data. But they’re not saying you shouldn’t do internal assessment; this has been a big misconception in the UK. There is very much an understanding that you need to be on top of what’s going on in your school, so there is an expectation that you have mechanisms in place for doing that, but they’re not looking to see the data itself; they’re looking to have that conversation. And that’s beginning to be the case in lots of the other schools that we work with around the world. So, it’s a sort of subtle difference. But if all of your assessment practice boils down to an Excel sheet that just produces graphs at the end of every period of time, that sit on someone’s shelf, that’s clearly not assessment, because assessment is identifying the starting points for children. It changes teaching practice, and you share it with the children, and you share it with other adults, and you get everybody involved, and you review it. So, very long answer, but this is sort of what we do, I suppose.

Jonathon In some ways we need to explore this further, because summative assessment is so important, especially in UK schools this year, because we rely so much on all the external testing. Generally speaking, in your opinion, where does summative assessment fit into the whole-school assessment process, and how should it be done?

Matt I think it’s part of the overall picture. It has no more significance than formative assessment; it’s just different, and it needs to be presented in that way. And to be clear, when we talk about summative assessment here, what we’re talking about, rightly or wrongly, is commercially available assessment, so people like GL, or whoever it might be, who do all these assessments, not necessarily an in-house end-of-topic quiz. Those in-house assessments are also incredibly valuable, but they’re not norm-referenced, they’re not executed in appropriate test conditions and all that kind of stuff, so that’s not what we’re talking about here. If the underlying reason for asking the question is that you want to find out as much as possible about a child’s starting points and the things that might influence their learning success, then your summative data gives you more information, so you can understand how a child is performing under classroom conditions and how they are performing under summative assessment conditions. And it may well be that when you get into the detail, that prompts some conversations. This child isn’t demonstrating a skill in this particular area in class, but they clearly are in summative tests. So that’s a good thing: I need to spend some time trying to observe that and make sure that I understand that they do know it. Or did they just get lucky in the test? That’s unlikely if it’s a well-designed test. Compare that to your formative assessment: if your teachers are saying these children are absolutely flying and they’re way above expectations, but the summative assessment data is saying something completely different, is your benchmarking of your assessments right? Have you got that challenge level right?

So that’s one conversation. Particularly, things like the PASS data and the attitudes-to-learning data will be quite interesting there, especially at the moment after lockdown; it’s going to be interesting to see how that changes in a lot of schools. So that would be the starting point, if your starting point is ‘I’m interested in how I improve teaching and learning.’ If I’m realistic here, though, because we go through this with lots of schools, some of you will be looking at it from an inspection point of view. Touch wood, every school we’ve ever worked with, anywhere in the world, has never gone down in their local inspection ratings, so I know this works. From an inspection point of view, it gives you a different angle; it gives you another thing to talk about. So, it’s particularly useful, for example, for children who are on the extremes of formative assessment, those children who are perennially, significantly below age-related expectations or significantly above age-related expectations. You can look at their results and see, well, they are significantly above expectations, but they’re less significantly above this year than they were last year. So actually, although they’re doing well, that’s a cause for concern, because they’re going backwards.

Likewise, children who are outside age-related expectations but are making rapid progress on their tests: that can be a really good thing. And we find, again, if inspection is your driver for this, that complete picture is incredibly valuable and incredibly positive for an inspector, because they see the children in the classroom involved in their assessments, articulating their learning. They can see that you’re sharing it with parents, they can see that you’re having proper, meaningful conversations about learning, that you’re completely on top of the teaching and learning process, and that you’re benchmarking it against a robust external system as well and triangulating that information. So, you know, it’s useful for those purposes, I would say.

Jonathon Can we merge internal data from an Excel sheet onto the Learning Ladders platform, and can we then triangulate internal and GL data?

Matt So, without seeing it, it’s difficult to commit to exactly what that data is. From a Learning Ladders perspective, you can have your internal data and you can compare it to GL data. There’s a whole other section, but without going into a product demo or a sales pitch, which I promised that I wouldn’t do: yes. What I would suggest is get in touch with the office, and I’ll give more details at the end. Learningladders.info is the website, and there’s a contact button there. Let’s have a look at your specific information, and we can answer that particularly for your school. But broadly speaking, I would say the answer is yes.

Jonathon I love talking about assessment; it’s one of my favourite things. Even as a class teacher, it makes such a difference and makes it easier to be a good teacher, because you know precisely where children are and what they need in order to improve. And we talked last week about cutting down on workload, because actually, if you get the assessment right, it makes things so much easier in the classroom.

Matt Yeah, I mean, funnily enough, even as somebody who runs a company that’s best known for assessment, I’m not a particular assessment nerd. I think it’s really about those beautiful moments, which is why we all go into teaching, what people colloquially call the light-bulb moments. You have a far better chance of having more of those, and more meaningful moments, if you know what children’s starting points are and you have a clear plan for where they’re going to go. Otherwise, it’s just luck; it’s just random. So, it’s about being specific and rigorous and having those structured conversations to generate that, and assessment obviously underpins that process. I think maybe, as a company that occupies this space and speaks at events about assessment, sometimes we can get too carried away with pretty graphs, we can get too carried away with the data. I think we always have to pull it back to: what am I going to do with this? And actually put it in the context of the whole picture of how I’m going to improve a child’s learning. Where does this fit? Is this the best use of my time, really getting into this granular detail? Or do I need a system that’s just going to do it for me really quickly and easily, that’s going to give me the information I need so I can get on with it? So, yes, I agree.

Obviously, it can be incredibly powerful, but one of the things that always worries me when you look at things like social media platforms is that you have very popular accounts on there talking about particular data visualisations and all this and the other, and it’s the wrong conversation for me. It should be about how this is improving a teacher’s life, making life easier for teachers so they can spend more time teaching the children.

Jonathon On the CAT4 data, we’ve got one more question: should pupils’ targets be shared with primary school children?

Matt Wow, that’s a big one, and that’s probably a conference in itself: should pupils’ targets be shared with children? It depends. This is a purely personal view now, and I’m not saying this based on any particular wisdom in the area; it’s quite a specific one, but let me go slightly outside my lane. In my experience working with schools, explicitly sharing the granular objective a child is currently working on, and how that fits into a learning sequence: absolutely, 100 percent, tailored to the individual child.

So, what we do at Learning Ladders is create very structured conversations by sharing very explicitly with the children: this is what you’re working on right now, this is what you need to do to achieve it, and this is what comes next. But the critical thing about that is that although it’s part of the school’s curriculum, and it’s consistent across the year group in the school, it’s individualised to every child. So, every child is working on something that’s appropriate for them, and they’re all making progress. So, if you’re sharing targets with children at that granular, very specific objective level, which in our system would be your overall structure of Learning Ladders objectives, and then within those you’d break them down into success criteria or whatever in individual lessons, that’s where we’d pitch it. One hundred percent, yes.

And all the research shows that that’s useful to do: starting conversations, being explicit with children, giving them ownership of their learning, giving them responsibility for it. It’s the first step of proper parental engagement, because if you’re not having conversations with children about their learning in class, they’re not going to magically be able to talk about their learning when they do remote learning or when they’re at home; you have to start that process. If you’re talking about saying to a child, ‘you’re currently a B and we really want you to be an A at the end of the year’, then no, personally I would say absolutely not. I think that’s a total waste of time and incredibly self-defeating, because it creates pressure, it creates classroom competition, and it creates playground chatter amongst parents. It’s totally unhelpful and it won’t improve the learning, in my particular experience, though I’m happy to be challenged on that.

Jonathon I think I would agree with that in the main. The child has that feedback on their work, or whatever it is, and that should link with the target. So, it should be that granular work with that individual child.

I really love talking about assessment. So, we would highly recommend you look at Learning Ladders, see how it works, and have a conversation with Matt about how it could save work in your school. All I’d like to say is thanks, everyone, for coming today. I really appreciate you joining us, and hopefully we’ll see you next time in a couple of weeks. Thank you very much.

Matt No problem. Thank you, everybody, and thank you for the questions.

Details

Date:
May 26
Time:
10:00 - 11:00 BST

Organiser

Learning Ladders