What happens if algorithms screw up your life? In an increasingly connected world, the chance of different systems working together to accidentally ruin your credit or put you under surveillance is growing fast. Who's responsible when that happens?
Microsoft researcher Kate Crawford reckons the answer could lie in an obscure legal concept: the 'deodand'. In medieval England, a deodand was an item of personal property held responsible for the death of a human being.
Haystacks, pigs and horses were all declared deodands at different times, and when they were, their owners were ordered to pay their value as a fine to the court. The principle eventually fell out of favour with the rise of corporations, but Crawford believes it's time to bring it back.
Crawford told an audience at Theorizing the Web in New York that we should be wary of letting technology companies invoke unseeable complexity as a reason to wash their hands of responsibility when things go awry. She thinks the deodand is the perfect tool for the job - if algorithms screw up your life in some way, the owners of those algorithms would have to pay their value as a fine.
It's easy to see a few immediate problems with the suggestion, most notably how to calculate the value of an algorithm. But if that can be cracked, then perhaps the courts of a few decades' time will once again ring with the language of the 11th century.