Ethics, governance and data for good at the AI & Big Data Expo

AI is more than a trend, and it's no longer a specialist space. This year, the topic was embedded across the tech conference calendar in London, with every event packed with people keen to learn and share their experiences.

The AI & Big Data Expo stood out for its great mix of speakers, who not only targeted people working in data but made the topics feel accessible to somebody like me, who isn't a data scientist by background. As the CEO of an infrastructure charity, I know our beneficiaries don't necessarily work closely with data, or hold it at the forefront of their minds, so it was fascinating to see how AI and big data affect a wide range of sectors, and the different strategies those sectors employ and deploy to meet the new challenges they bring.

I especially enjoyed the talks focused on ethics and governance, which resonate with our beneficiaries and the challenges they face. What's striking is that there seems to be a real drive to ensure ethics is baked into AI strategies from the outset. It's very heartening that ethics is being discussed at this early stage, rather than ignored as it may have been when past technologies developed this quickly.

One talk tackled governance and how governments are still playing catch-up. There seems to be an overarching feeling that AI has to be regulated, but whether the regulation people want is actually possible is the next big question. Can it be regulated, and how? Will the EU AI Act work well? Will legislation in the US be effective, or will it be watered down? What systems do you use, and therefore what do you endorse? Is this the right thing to do, is this the right way to deploy this sort of power, and what would the fallout be if we did?

As a charity working to serve other charities, safeguarding is a huge area of concern for us, so it's good to know that the mainstream is also thinking about transparency. AI providers and tools have not yet done enough to flag the potential risks for third sector organisations that, for example, routinely handle sensitive data about vulnerable individuals. This could lead to a number of issues for charities using AI for the first time, not least data breaches. We have already seen AI misused to replace services that still need to be led by humans: in one example, a chatbot that replaced a staffed helpline gave people with eating disorders dangerous dieting advice. Tech leaders and governments must take the lead by demonstrating responsible approaches and creating frameworks around safeguarding and risk. There will always be bad actors in this space, but there seems to be a 'coalition of the willing' that wants to ensure AI stays safe, and not just for those with enough resources to build their own safeguards.

As these debates continue and the technology develops apace, it's so important that there are spaces in which the third sector can be heard alongside private or statutory organisations. At the AI & Big Data Expo, we were able to showcase our work as a representative voice and build enthusiasm for the 'data for good' movement. We made some fantastic connections with others once we realised how aligned our overarching missions are. Testament to that was the enthusiasm of our audience, who asked our wonderful volunteers Adam and Alvaro tons of questions and chatted to us in person afterwards. We are thrilled to have been part of these conversations.

Finally, we want to say a big thank you to the organisers for the opportunity to get stuck into a cross-sector event like this. We’re looking forward to the next one!

To find out more about DataKind UK and how you can support our vision of a strong, thriving third sector that embraces data science to become more impactful, visit our website.
