Speaking at Reuters today, Charles Randell, chair of the Financial Conduct Authority and the Payment Systems Regulator, continued that as firms start that process, people must remain in charge and accountable.
“People, not machines, need to understand and control the outcomes that the technology they are designing is producing; people, not machines, have to make the judgment as to whether these outcomes are ethically acceptable – and ensure that they don’t just automate and intensify unacceptable human biases that created the data of the past,” he said.
A strong focus on checking outcomes will be essential as some forms of machine learning, such as neural networks, may produce results through processes which cannot be fully replicated and explained.
He added that another danger is that removing people from the process may make them less willing to intervene because they feel less connected to consumer outcomes. He asked: "What if the cost of machines that think is people that don't?"
Human judgment more important than ever
He said firms need to build a culture that questions the results of automation and maintains good judgment.
“As Donella Meadows has written, ‘Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity – our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.’”
He outlined some real-world examples, including microinsurance, where AI promises to increase coverage for people on low incomes by improving risk modelling.
But on the negative side, he flagged a report in the New York Times that US credit card companies are cutting cardholders' credit limits when charges appear for marriage guidance counselling, since marriage breakdown is highly correlated with debt default.
Media reports earlier this year suggested that price comparison websites were quoting significantly higher car insurance premiums for people whose names suggest they are members of ethnic minorities. He added that the regulator is already concerned that firms are identifying customers who are less likely to shop around and offering them more expensive products.
“Some may say ‘buyer beware’ or ask why firms should not maximise their profits using all means at their disposal. But I don’t think that would be the prevailing view in our society,” said Randell.
“Society in general and policy makers in particular need to think about how to mitigate the risk that an algocracy – a society ruled by algorithms – exacerbates social exclusion and worsens access to financial services in the way that it identifies the most profitable or the most risky customers.”
The government has already launched a consultation on a proposed Centre for Data Ethics and Innovation, which opened on 13 June and closes on 5 September.