So, it was quite interesting to see in the recent Ofsted paper that there is a clear link between schools rated good or outstanding for safeguarding and those that lean towards the ‘managing’ side of technology, rather than those that look at technology just for ways to block things … to lock things down. Of course, alongside the management side of things there is a large chunk of education required, and there is plenty of advice and guidance out there about how to educate users when it comes to safeguarding and esafety, but I still want to delve a bit more into the technology side of things and spend some time looking at how technology becomes a positive tool for managing safeguarding.
I’ve written before about the different types of monitoring and filtering, and the idea of reactive and passive solutions. Being able to actively look, in real time, at activities in the classroom will always be my preferred option for effective classroom management through technology, and being able to interact with the students there and then is important, but occasionally you do need to take a step back from just the teacher / student relationship.
In a conversation earlier in the year with a Head I know, we were talking about concerns about filtering, how people rely on it too much to prevent unsuitable materials being available, and how some schools need to sit down and look at what they actually want to get out of any systems they use. For me, I still have concerns about email. Over the years the whole idea of what you use email filtering for has changed. Yes, you still have to deal with spam, as that is still a massive issue. For Northamptonshire schools alone, I have seen 82% of email traffic being spam (that is a total of nearly 2,350,000 emails) and this is a fairly good month. Spam is a massive problem, and the blocking of inappropriate emails from the outside world is important. When a student or member of staff goes home and uses a Windows Live account they find a spam filter … if they use a mail client such as Apple Mail there is a junk rule … and this is before we get into the malicious emails category. Yes, there are still too many viruses moved about via email. So, if you don’t block spam and potentially malicious emails / files then you are just on a hiding to nothing.
Then we get to the other aspect of email filters, something that not all schools see. Depending on which system you use for your email, you have the situation where one student sends another student an offensive email. Because this has not come from the outside world (or been sent to the outside world – don’t forget we also need to protect the world from us!) the normal filtering might not apply. We also have to consider that we may want to be more relaxed about communications between our own students.
Now the majority of filters are based on the idea of ‘if it is a naughty word then block it, or at least quarantine it!’ … and you have massive lists of what the system will deem inappropriate. Now don’t get me wrong … we are the people who put this list in there, and it is full of words we have concerns about. I’m not going to start listing them here, partly because it would mean a chunk of people would not see my blog anymore as it would be filtered, but also because there are some words in there *I* had to look up! I am obviously not down with the kids anymore. So, we have set up this list, we decide to add or remove things from it (or in many cases the LA / RBC will decide) and we leave it at that.
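To make that concrete, the blanket ‘naughty word’ approach boils down to something like the sketch below. The word list and messages are illustrative placeholders, not any real school’s configuration, and a real filter would obviously be far more sophisticated about tokenising and matching:

```python
# A minimal sketch of the blanket keyword filter: if any flagged word
# appears anywhere in the message, the whole message is quarantined.
FLAGGED_WORDS = {"badword", "worseword"}  # stands in for the real list

def classify(message: str) -> str:
    """Return 'quarantine' if any flagged word appears, else 'deliver'."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & FLAGGED_WORDS:
        return "quarantine"
    return "deliver"

print(classify("You are a badword!"))  # quarantine
print(classify("See you at lunch."))   # deliver
```

Note the all-or-nothing shape of it: one match and the email is held, with no sense of how bad the word is or how it is being used.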
Well, I am not happy with that … I want people to start taking more control of what is going on. Most schools don’t realise that some of the questionable emails that are sent (i.e. not clearly spam or abusive) might get quarantined … put into a pot to keep them out of the way until someone checks whether they are OK. The problem we have here, though, is time. It is hard for schools to set aside the time for double checking quarantined emails.
And what happens if there are some words you are OK about, but it is the context which makes them offensive? Another term for a cockerel is fine in many contexts, but not all. So this is where you get into the increasingly intelligent technology. How about if you put a score down for certain words? Then set a threshold of 20 points, and if an email reaches that point someone in the school gets a message to say “hey, there is this questionable email … I have let it through but you might want to check it out.” If it reaches 40 points then it doesn’t get through at all, and a stronger alert is sent to a designated person in the school. Why do I like this model so much? Well … you get to spot trends. You get to see if certain students are extracting the urine, you get to see if any new trends in language are cropping up so you can fine tune the filters, and you get a chance to monitor a number of areas of safeguarding, including bullying and esafety.
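The scoring model above can be sketched in a few lines. The words, weights and threshold values here are illustrative assumptions (the 20 and 40 point thresholds come from the example in the text); a real deployment would tune all of these over time:

```python
# A sketch of the weighted scoring model: each flagged word carries a
# score, and the total decides what happens to the email.
WORD_SCORES = {"mildword": 5, "ruderword": 15, "severeword": 40}
ALERT_THRESHOLD = 20   # deliver, but notify a designated member of staff
BLOCK_THRESHOLD = 40   # do not deliver; send a stronger alert

def score_message(message: str) -> int:
    """Sum the scores of every flagged word in the message."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    return sum(WORD_SCORES.get(w, 0) for w in words)

def action(message: str) -> str:
    """Decide what to do with a message based on its total score."""
    score = score_message(message)
    if score >= BLOCK_THRESHOLD:
        return "block-and-alert"
    if score >= ALERT_THRESHOLD:
        return "deliver-and-notify"
    return "deliver"

print(action("see you later"))       # deliver
print(action("mildword ruderword"))  # deliver-and-notify (5 + 15 = 20)
print(action("severeword"))          # block-and-alert (40)
```

The useful side effect is that every scored message is a data point: logging the scores per sender over time is what lets you spot the trends mentioned above, rather than just reacting email by email.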
Now I am not saying it works overnight … the first time I worked on this sort of scoring system (1999–2004) it took some months to fine tune … and you have to invest time in checking false positives and further tweaking. Then you have to look at how you will deal with this information. Have a look at your pastoral structure to work out who the best person is to work with this intelligence (and yes … it is intelligence … you have now entered the arms race) and how it will be integrated into existing procedures. This is the important point here … there is no point in using this in isolation. If it is not part of a joined up system for dealing with issues then it is technology driving the agenda, not technology as a tool to support it. It sits right alongside information that peer mentors may be given by other students, next to comments from staff and concerns from parents.
But back to the conversation between me and the Head … yes, he had a very valid point that if you are not careful, filtering emails will just force the problems elsewhere … via text message, verbal abuse on the bus home … but these will happen anyway. By taking control of how the filtering works for you, you take control of that information. And then we get onto the education side of things … it changes from “Don’t swear or the filters will block it” to “Are you sure that this is appropriate language to use with one another?” We both recognise that any filtering solution without the education to back it up will be misused. Everything from sending your first few emails through to advanced netiquette … from protecting your identity through to understanding data protection … but we have to start somewhere.
So this is why, when working with LP+, we have worked hard to make sure the email provided on the enable learning platform has a filtered email solution that is as granular as *you* want it to be. Anyone running an Exchange server should not only be filtering their incoming and outgoing emails, but also the emails between users, in a fine-tuned manner, not just a blanket ban!
Anyone taking a filtering system from their LA / RBC should be asking how it is configured. Schools should take a look at the options different technologies can give them to support pastoral issues. It might be that you only need to look at this occasionally, or you might not want to put the time into it as you need to concentrate on other options, but, as always, an informed decision allows you to plan for all areas and make better judgements.