EMILY HENCHER | LEGAL JARGON WRITER

It is evident from recent news stories that algorithms play a larger part in our lives than we even understand. Recently, the social media platform Instagram has been accused of targeting plus-sized Black female influencers. The UK Government has been accused of discriminating against schools in poorer districts in its exam grading. The Home Office is also changing the traffic light system it uses in visa applications after allegations of racism in that approach. All of this comes down to the use of algorithms which, it seems, have a tendency to operate in a discriminatory way.

These stories usually end with the person responsible apologising and altering their approach, but I am not satisfied. Why are there so many cases of algorithmic bias? Who is actually at fault? How have we, as a society, allowed algorithms to sneak up behind us, completely unregulated? I cannot claim to know the answers to these questions, but I'd like to open up the discussion.

How are algorithms regulated?

In short – they aren't! Social media companies, internet service providers and search engines have always used algorithms simply to make their products work. Despite this, their complexity has led to them being largely ignored even as they developed rapidly. As algorithms evolved into the complex systems we use today, there was simply no awareness of the need to regulate them until it was too late. As a result, we have no harmonised approach to regulating the way algorithms are used or created. This lack of regulation is primarily down to the fact that algorithms are complex, ever-changing systems, and it is difficult to know where to start. As the prevalence of algorithmic technology has increased, more and more court cases and public discussions have taken place, but none of these has resulted in a comprehensive approach to dealing with algorithms when they misbehave.

Who is at fault?
The question of who is to blame for the 'behaviour' of algorithms introduces an incredibly intricate discussion. It might initially be thought that it is simply the company using the discriminatory algorithm that is to blame. But I invite you to challenge your critical mind here. In many other situations where a company uses a product which turns out to be faulty or produces incorrect outcomes, it is the manufacturer of that product, not the user, who is responsible. So, is there perhaps scope for companies to avoid liability and pin it on the creator of the algorithm?

We can dig even deeper than that. Advanced algorithms are often subject to machine learning, whereby the data given to the algorithm will alter its 'actions'. This could turn the fault back onto the company as the user of the algorithm – perhaps it has fed the algorithm a policy which is inherently discriminatory, and the outcome simply reflects that. If you want to get even deeper into this discussion, is it the actions and reactions of the users of Instagram (for example) which influence how the algorithm works? You can see that 'who is to blame' is far from a simple question when it comes to algorithms.

What next?

Though algorithms are not currently regulated, regulatory bodies across the world are suggesting ways to start, and there are a number of proposals on the best way to approach the issue. One example comes from the Ada Lovelace Institute, which is exploring the idea of algorithm audits: scrutinising the way in which algorithms operate and suggesting alterations. The most important next step is to make an effort to understand algorithms, so that when a regulatory system is created, it is comprehensive and effective.

Summary

This short blog can only hope to scratch the surface of the complexities of regulating algorithms.
I hope that this discussion has inspired some deep critical thinking about the importance of regulating algorithms and the complexities involved in creating that regulatory system.

EMILY HENCHER

Emily is a recent LLB graduate from Scotland, completing the Diploma in Legal Practice next year. She is passionate about innovation in the legal sector, showing particular interest in technology and corporate law. With a background in fundraising, she is hoping to specialise in charities and public sector work.

👨‍💻 Want to share feedback? Did we miss something important? Let us know! We would love to hear from you at firstname.lastname@example.org or simply comment below!