Alice Thwaite

CogX AI Festival 2021 - context and complexity in ethics




The CogX AI Festival has wrapped up for 2021. It is a celebration of the future of AI and a huge conference that brings together practitioners and academics, engineers and ethicists, and business and policymakers.


There were 17 stages, and many of them focused on matters relating to ethics, sustainability, and culture.


It was a privilege to host the ‘Ethics and Society’ stage for 80% of the sessions. Two key themes stood out to me across the events and panels.


Context


Ethics is not about broad-brush principles; it is the detail that matters. This means there is no quick fix. But the lack of a quick fix doesn't mean we shouldn't invest in ethics. It means that the builders of technological systems need to include ethics teams or responsible business teams that can work alongside other business units. Ethics is a skill set, like many other professions, and it employs many different methods: imagining what the best form of society could be, communicating with and listening to diverse groups and communities, and then creating technical structures that strive towards those ideas.


It is the context in which the technology is situated that enables a good ethical decision.


Complexity


AI has traditionally been a field that takes a vast amount of data and reduces it into easy-to-understand categories in order to make decisions. But this approach is too simplistic. We need to recognise that the complexity of this data is a necessity, and indeed that more complexity needs to be added to it.


We also need to make the governance behind technological systems more complex. Diverse teams practising different methodologies should be behind their curation. We need to acknowledge the complex trade-offs that sit behind the design of AI technologies, and the impact these decisions have on the outcomes of the technology.


What was the ethics and society stage about?


Ethics has many definitions, but perhaps the broadest one is this: ethics is the study of how we want to live. What kind of societies do we want to create? Who do individuals want to be? What does it mean to be a human that is flourishing?


Despite being a discipline that has existed for thousands of years, ethics is in an embryonic state when it comes to creating systems in which ethical thinking is a considered part of the design and evaluation of technologies. How can we ensure that all stakeholders are part of the decision-making process? How do various power structures form, and how do they shape what is created and built for the future?


My hope is that you will all see that ethics is not just a talking shop - an armchair discipline with no place in technology labs - but rather a thriving community of professionals. The stage looked to reinforce these principles.


Pick of the Ethics and Society Stage


There were so many wonderful sessions. Here are three that particularly resonated with us.


The last decade has seen a rise in ethical principles seeking to carve a path for governments and industry to develop safe and responsible technology. This movement spawned teams like Google’s Ethical AI team, but Google’s decision to dismantle it raises inevitable questions about the future of ethics teams within corporations. Join this session to learn how we can ensure technology is developed and studied in a safe, responsible and meaningful way in the next 10 years.


Diversity and the sheer number of identities in the LGBTQIA+ community make algorithmic design, decision-making and prediction exceptionally difficult, particularly when people identify as queer or counter to the norm. This panel explores how identity and diversity in the LGBTQIA+ community can help create better, more exciting technology for everyone.


Despite what we learn about scientific neutrality and objectivity, science has a long history of legitimising and perpetuating racism. This history plays an important role in how we understand the state of AI research, and how it's actively shaping power dynamics in the AI research and development community. This session explores how AI researchers can use anti-racist tools and frameworks to reject harm as an inevitable byproduct of scientific progress, and to create better ways of working and being in this field.


_________________________________________________________________________


Keep up to date with all things tech ethics. Sign up to our newsletter today.

_________________________________________________________________________





