--
This group is managed by NALDIC, the UK's EAL Professional Association. Please visit our website for further information: www.naldic.org.uk
---
You received this message because you are subscribed to the Google Groups "EAL-Bilingual" group.
To unsubscribe from this group and stop receiving emails from it, send an email to eal-bilingua...@googlegroups.com.
To post to this group, send email to eal-bi...@googlegroups.com.
Visit this group at http://groups.google.com/group/eal-bilingual.
To view this discussion on the web, visit https://groups.google.com/d/msgid/eal-bilingual/70d6a040-97b6-4926-97b4-e86095464ccd%40googlegroups.com.
Points fully taken and in full agreement.
Nevertheless, the census return data is collated and reported nationally and gets media attention when released. There is usually an interest in the number of languages and the numbers speaking them, however broad-brush and incomplete that information might be.
It also becomes part of the school profile for external partners, whether LAs or others.
I do think it’s important to get it as right as possible with the tools we have to use – in particular, to get SIMS repopulated with the extended language codes. I increasingly suspect that problems with this are at the heart of the majority of the ‘pending’ and ‘language other than English’ returns in place for 10% of the recorded EAL demographic.
Next step: more adaptive and flexible reporting to capture pupils’ full linguistic repertoires – yes please.
But it is worth working to firm up the returns on behalf of that 10% in the meantime.
Di
This is all a bit intriguing then.
The specific languages (using or not using the extended codes in the First/Home Languages drop-down) do end up in the census report, which is published after the census return.
And there are definitely reported categories along the lines of first language ‘believed to be other than English’ etc. Catharine Driver picked that up last summer when the January 15 return was published – in about June/July, I think.
It surely shows what a pixillated and sorry mess the whole creaky system is, whichever end of the knot one hopefully pulls.
Di
I think it’s a case of having to look from multiple perspectives at once. The data collection process is a servant of many masters, and they don’t always want the same thing…
In school, these data returns have to be completed but don’t have an obvious relationship with what happens in the classroom – as Ann said, putting EAL information in the notes is a way of subverting the restrictions that the system puts in place. Most teachers don’t look at the information, but transforming knowledge-about-pupils into official data can have other useful (but originally unintended) consequences: as Helen said, it gets colleagues to recognise what’s happening with the pupils; as Diane said, it can also speak to partners outside the school.
The data system, though, is originally a management tool (this was unambiguous in interviews with the DfE). It’s for holding schools accountable … and that means different things at school level (where you generate ‘data’), at local level (where you probably know the LEA advisor well) and at national level (where they’re only interested in large-scale effects). This is also a problem for researchers – the NPD isn’t designed for research, just for management.
(Incidentally, it’s why Di found the system ‘pixillated’ – quite rightly: depending on which level you look at, and who’s writing the report, you can get quite different pictures. Raise, for example, really struggles to combine multiple categories, such as low-income Pakistani boys [FSM, language/nationality, gender], so there’s a limit on how much you’ll ever get out of a Raise report, even though the data’s there in the system.)
In effect, we’re all secondary users trying to make something of a system that was designed for people holding schools to account … BUT at the same time we’re expected to use this system to deepen our understanding and inform practice. They’re not easily compatible, which is why we end up with so many work-arounds. (I heartily recommend Jack Marwood’s blog on school data – polemical and statistically astute, as well as very readable: http://icingonthecakeblog.weebly.com).
So where does that leave us? Using a system that doesn’t work for actual teaching and learning, having to force our thinking out of these reductive categories every time we use SIMS. Data managers are brilliant, though. If you can persuade them to make you a bespoke spreadsheet with all the information (literacy practices, language repertoires, learning in schools and other settings, etc.), you could be on to a winner. But then they’ll still have to put all the ‘data’ (i.e. stuff the kids into Capita’s boxes) into the system all the same …
Rob