Artificial intelligence and machine learning influence our experiences of the public realm in all sorts of ways. Over the past 10 years, urban tech businesses such as Airbnb, Uber, Deliveroo and Voi have had a massive impact on societies worldwide.
“I spend more time interacting with a place using digital tools than I spend in the place itself!”
Research participant, NHS NEL Digital Placemaking scoping project
Common to online holiday rental marketplaces, food delivery companies and e-scooters is data – lots and lots of data. Add to that the data and digital technologies which manage ‘smart’ urban infrastructure, and the digital twins that replicate all kinds of tangible and intangible information and assets, and we can see that data are central to our experience of the built environment. Data ethics and ethical practice, therefore, should be too. What reasonable person would suggest otherwise?
At the recent Design+Engineering Expo in Birmingham’s National Exhibition Centre, I spoke about digital innovation being undertaken at speed and ethically. Innovation rooted in care – care for people, society and the natural world, with ethical practice as foundational. This article draws upon some aspects of my talk.
Dr Jo Morrison delivering a presentation entitled ‘Innovating at speed and with care’
Ethical practice in the ways that we design, deploy and use place-based digital tech and data is vital. Otherwise, we risk causing more harm to society than good – whether through building biased algorithms, delivering technology that negatively impacts the experience of public space or widening the digital divide.
Changing unhelpful narratives and practice
There is a pervasive and unhelpful narrative in the tech world that ethical practice gets in the way of innovation and therefore slows it down. The argument is that if you slow down you could lose your first-mover advantage, so let’s all ‘move fast and break things’. This Wild West approach has, for too long, led many tech companies to push against any adherence to compliance frameworks, regulation and accountability mechanisms. At this point I turn to the words of the UK’s Shadow Minister for Science, Innovation and Research, who describes this influential Silicon Valley culture and mindset:
“These companies were against any and all regulations and the internet is a good example of this; it had to be unregulated otherwise that would destroy innovation and it would destroy the internet. That’s led to the situation we’re now in where people are being harmed and where a lot of people don’t trust technology.”
Chi Onwurah MP, Life Scientific, BBC Radio 4, 24 May 2022
Ashish Jaiman, Director of Product Management at Microsoft, said recently:
“There’s a tension between privacy and innovation…it’s perceived that if we have to wait, we have to let go of ethical uses of AI or privacy concerns…individual rights and concerns are most important…privacy and innovation can go hand in hand.”
Ashish Jaiman, Microsoft, Synthetic LiveStream Event, February 2022
It is encouraging to read Jaiman’s words that set ethical practice and innovation as mutual. Indeed, governments are increasingly recognising that regulation and governance are intrinsic to supporting a fair, innovative and thriving society. The UK Government published its plans to regulate digital technology to drive growth and innovation last year, and the EU’s new draft set of rules to regulate internet platforms includes measures to ensure platforms are held accountable for their algorithms.
“Regulation is what makes technology safe and empowering for people and enables good businesses to work successfully.”
Chi Onwurah MP, Shadow Minister for Science, Innovation and Research
Life Scientific, BBC Radio 4, 24 May 2022
Embedding ethics into urban tech design and development
To minimise the harm and maximise the value of digital innovation projects, an ‘ethical by design’ approach must be baked into projects from the start, so that values can be woven into practical project management and workflow.
Values should be treated as active design ‘materials’ to be included across all phases – for instance, at the early stage of a project when the user experience is being imagined.
These values should be understood, agreed and applied by all members of the team, not just a lone voice who is ‘the ethics person’. Here are a couple of practical examples that you might adopt:
- When software developers are peer reviewing code, the documentation could contain relevant aspects of the ethical framework for examination.
- While agile development is a key approach in the tech industry, its adoption can lead to a way of working that breaks activities down into bite-sized tasks for individuals. This enables faster production of software through iterative design and development, but it can also lead to individuals focusing on their own work tickets rather than the project as a whole and its potential consequences. It is therefore important for the team to come together at key stages of a project and discuss it holistically. Peer-to-peer engagement and reflection are integral to the process and will help to ensure that everybody adopts ethical practice and is held to account.
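As a sketch of the first example above, a team could keep its ethical framework in the repository as structured data and surface unanswered items during peer review. Everything here – the principle names, the questions and the `review_notes` helper – is hypothetical, for illustration only, and not an actual Calvium tool.

```python
# Hypothetical sketch: an ethical-review checklist kept alongside the code,
# so reviewers are prompted to consider it when examining a change.
# The principles and wording below are illustrative, not a real framework.

ETHICAL_CHECKLIST = {
    "bias": "Could this change treat any group of users unfairly?",
    "privacy": "Does this change collect or expose more data than it needs?",
    "accessibility": "Can people with impairments still complete the task?",
}

def review_notes(answers: dict) -> list:
    """Return the checklist questions a reviewer has not yet answered."""
    return [q for key, q in ETHICAL_CHECKLIST.items() if key not in answers]

# A reviewer who has only considered privacy is still prompted
# to think about bias and accessibility before approving.
outstanding = review_notes({"privacy": "No new data collected."})
```

The point of the sketch is that the framework lives where the work happens – in the review itself – rather than in a separate policy document nobody opens.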
Having suggested a couple of ways that a project team can readily apply ethical practice in their day-to-day workflow, let’s take a look at one of Calvium’s own innovation projects that used an ethics by design approach – and sped up the decision-making process as a result.
NavSta: lessons in ethical decision-making
NavSta is Calvium’s mobile wayfinding system that was funded by the Department for Transport and Innovate UK, to help people with less visible impairments navigate railway stations independently and with confidence.
Three screens from the NavSta wayfinding flows
When designing, building and testing the NavSta Passenger App, Calvium used five core principles. Each of these active guiding principles emerged from user research workshops and was particular to this project, underpinning the decisions taken across its design and development:
- Trust — Trusting that the information in the system is reliable and will not mislead the user.
- Safety — A system that allows the safe passage of the user through physical and digital interaction.
- Clarity — The information in the system is displayed in a clear and understandable format.
- Personalisation — The user can tailor the services of the application to suit their individual needs.
- Usefulness and Relevance — The information presented in the system supports the user to complete their tasks effectively.
These principles became a set of active tools. They were front of mind throughout the project and were called upon to give us a steer on a number of occasions.
Crucially, rather than slowing down the iterative design and development process, having the principles as part of the project allowed us to speed up certain aspects of production.
For instance, we discovered some inaccuracies about the accessibility amenities of our test site, Canada Water Station, in a third party’s open data API. One of those inaccuracies was significant – it stated there were toilets on site when there were none. However, because we had the guiding principle of ‘trust’ – in this case, the passenger needed to trust the accuracy of the information presented in the app – we decided to pull the provider’s data from NavSta immediately, without any time-consuming back-and-forth conversations about whether to use the data or not.
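That decision can be pictured as a simple data-validation gate: records from a third-party feed only reach the app when they pass a check tied to the ‘trust’ principle. The field names, the feed contents and the verification rule below are assumptions made for illustration; this is not the actual NavSta code or the provider’s API.

```python
# Illustrative sketch only: filtering third-party accessibility data
# against a 'trust' principle before it reaches a passenger-facing app.
# Field names and validation rules are hypothetical.

def is_trustworthy(record, verified_amenities):
    """Keep a record only if the amenity it describes has been verified
    on site - unverified data risks misleading passengers."""
    return record.get("amenity") in verified_amenities

third_party_feed = [
    {"amenity": "toilets", "location": "concourse"},        # inaccurate entry
    {"amenity": "step_free_access", "location": "exit 2"},
]

# Amenities confirmed by the team's own site survey.
verified = {"step_free_access", "lifts"}

shown_in_app = [r for r in third_party_feed if is_trustworthy(r, verified)]
# The unverified 'toilets' record is dropped rather than debated.
```

Because the principle is encoded as a rule, the unreliable record is excluded automatically – the team does not need a meeting to decide.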
Doteveryone – the responsible technology think tank that is now housed in the Ada Lovelace Institute – has developed a process called ‘Consequence Scanning’ to help companies consider the potential consequences of their product or service on people, communities and the planet.
Companies are asked to answer three questions:
- What are the intended and unintended consequences of this product or feature?
- What are the positive consequences we want to focus on?
- What are the consequences we want to mitigate?
These questions are designed to help teams share knowledge and expertise, and raise concerns in a dedicated format so they can have conversations with colleagues and map the potential impact of the product or feature in question.
Doteveryone has also put together a list of unintended consequences of digital technology for people to consider. These include: imbalance in the benefits of technology, unforeseen uses, erosion of trust, impact on the environment, changes in norms and behaviours, and displacement and societal shifts.
Each of these comes with a ‘cause of consequences’, highlighting a number of important factors to consider when designing and applying ethical principles and frameworks. I recommend reading them and putting them into action.
By making sure ethics play an active part throughout the design and development process, thinking about the consequences (both intended and unintended), keeping an open dialogue with colleagues and holding ourselves and each other to account, we will greatly increase the chances of minimising the harm and maximising the value of digital innovation in our towns and cities.
It really is as simple as that.
I have discussed many aspects of this subject over the past five years – from future ethical questions for digital placemakers to risks and rewards to our urban future to the role of ethics in designing and building smart cities. Through these articles I’ve explored the need for creative, responsible and inclusive approaches to collecting data, highlighted the critical importance of building trust with citizens, and hopefully shown how data can improve our quality of life, help the environment and create a better place to live when we use it well.
Feature Image Source: Mauro Mora on Unsplash