Seven vacancies are about to be filled at the Cambridgeshire charity devoted to keeping the internet free from images and videos of child abuse.


Applicants are being told the job involves viewing criminal material and is therefore office-based only, “in a Covid-safe, supportive and friendly environment”.

The Internet Watch Foundation (IWF), based outside of Cambridge, is busy recruiting the new taskforce to grade and ‘hash’ two million indecent images and videos of children.

The hashes of the images will be used by police and internet companies around the world to prevent the spread of images and videos of children suffering sexual abuse.

The part-time jobs were advertised at salaries of up to £16,500 a year for 20 hours a week.

Six of the roles will be as assessors, with recruits asked to review and categorise potentially criminal images sourced from the Police National Child Abuse Image Database.

The foundation – which employs around 40 people and had a turnover last year of more than £3m – says it is on track for its busiest year.

Those being interviewed will be “compassionate, organised, and meticulous” people and will join the foundation’s hotline team.

The IWF is an international charity responsible for finding and removing child sexual abuse material from the internet. Their analysts assess images and videos of child sexual abuse in a bid to eradicate them.

In 2019, the IWF had a record year, with analysts processing 260,400 reports, up from 229,328 in 2018.

Of these reports, 132,700 showed images and/or videos of children being sexually abused. This compares to 105,047 reports of child sexual abuse material in 2018.

Chief executive Susie Hargreaves said their work has been accelerated during the coronavirus crisis. Data published in July showed the IWF received 44,809 reports from members of the public between March 23 and July 9 this year.

In the same period in 2019, the IWF received 29,698 reports – meaning there was an increase of about 50 per cent while the UK was under lockdown.

Ms Hargreaves said the new recruits would have a chance to make a real difference to children’s lives, but warned that this is not a job for everyone.

She said: “This is a tough job and only emotionally resilient people should apply. The interview stages will help us assess a candidate’s suitability to the role and successful candidates can expect a gold-standard welfare programme to support them with this task.

“Our staff are at the front line of a very real battle to keep children safe from predators online.

“The work is very much not for everyone, but those who are able to do it really are superheroes.

“They can go home every day and know they have made a difference to children’s lives. They know they can stop the spread of these images and videos across the internet, and that is the most rewarding thing on earth.”

The IWF says that since 1996 their analysts have manually assessed more than one million reports.

They have removed over 250,000 individual webpages of criminal content, primarily child sexual abuse images and videos.

Importantly, says the IWF, their work has also led to the rescue of victims from further abuse.

A spokesperson said: “As a result of our work with the internet industry, as well as our strong partnerships nationally and internationally, we have almost completely removed child sexual abuse imagery hosted in the UK.

“Less than one per cent of the content is now hosted in the UK. We want to help other countries achieve the same result.”

The IWF’s latest accounts say that among the funding streams supporting them is a grant of £505,000 from the European Union (EU) Commission.

Hash List explained:

Each image can be given a unique code, known as a hash. In 2018, the list held hashes relating to 345,961 individual images, an increase of 50,572 hashes over the course of the year.
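The hashing step itself is a standard computing technique. As a rough illustration of the idea only – the article does not describe the IWF’s actual tooling, and real image-matching systems often use perceptual rather than cryptographic hashes – the sketch below assumes a simple SHA-256 digest: a file is reduced to a short, fixed-length code that can be compared against a list of known hashes without the image itself being stored or viewed again.

    import hashlib

    def hash_image(path):
        # Reduce a file's bytes to a fixed-length hexadecimal code (illustrative only).
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Hypothetical hash list: codes of known images, never the images themselves.
    known_hashes = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

    if hash_image("upload.jpg") in known_hashes:  # "upload.jpg" is a placeholder path
        print("Match found - block the file and report it")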

Of these, just over a quarter related to the worst forms of abuse. IWF analysts in the same year reviewed 4,361 images each, alongside reviewing reports and actively searching for child sexual abuse images and videos.

The IWF also provides a list of webpages (URLs) with child sexual abuse content hosted abroad to companies who want to block them.
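How a company applies such a URL list is not spelled out in the article; a minimal, hypothetical sketch of the idea is simply to check each requested address against the supplied list before serving it.

    # Hypothetical blocklist check; the entry below is a placeholder, not a real URL.
    blocklist = {"example.com/blocked-page"}

    def is_blocked(url: str) -> bool:
        # Strip the scheme so "http://" and "https://" forms match the same entry.
        bare = url.removeprefix("https://").removeprefix("http://")
        return bare in blocklist

    print(is_blocked("https://example.com/blocked-page"))  # True -> do not serve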