Train passengers had their emotions recorded by AI cameras at major stations without their knowledge, it has been revealed.
Network Rail took photographs of people passing through ticket barriers as part of a trial launched in 2022, according to documents obtained by civil liberties group Big Brother Watch.
The images were sent for analysis by Amazon Rekognition software, which can detect emotions such as whether someone is happy, sad or angry.
The system, piloted at stations such as London Euston, Glasgow, Leeds and Reading, also recorded demographic details, such as a passenger’s gender and age range.
In the documents, obtained in response to a freedom of information (FOI) request, Network Rail said this analysis could be used to “measure satisfaction” and “maximise advertising and retail revenue”.
The cameras were part of a wider trial to use AI to tackle issues such as trespassing, overcrowding, bicycle theft and slippery floors.
Jake Hurfurt, head of research and investigations at Big Brother Watch, said: “Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain’s biggest stations, and I have submitted a complaint to the Information Commissioner about this trial.
“It is alarming that as a public body it decided to roll out a large-scale trial of Amazon-made AI surveillance in several stations with no public awareness, especially when Network Rail mixed safety tech in with pseudoscientific tools and suggested the data could be given to advertisers.
“Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used.
“AI-powered surveillance could put all our privacy at risk, especially if misused, and Network Rail’s disregard of those concerns shows a contempt for our rights.”
A Network Rail spokesperson said: “We take the security of the rail network extremely seriously and use a range of advanced technologies across our stations to protect passengers, our colleagues and the railway infrastructure from crime and other threats.
“When we deploy technology, we work with the police and security services to ensure that we’re taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.”
The Times reported that the AI trial continues, but the part analysing emotions and demographics has ended.