
KnowRisk: responsible innovation does not happen overnight

Posted 17 Jan 2022

Responsible innovation does not happen overnight, but it’s important to start the conversation early

Nathan Coulson, Senior Technologist for Responsible and Ethical AI, Digital Catapult 


The sustainable development of advanced technologies depends on the ability to identify and mitigate risk. 


Unfortunately, without standards or regulations, using machine learning technology with confidential data can have negative consequences. Whether the harm is deliberate or unintentional, data can be incorrectly labelled, leaked or biased – leading to certain groups being marginalised or discriminated against, while a critical software failure could have devastating results. Therefore, ethics must be embedded from the start of development and throughout the product lifecycle, alongside any commercial objectives.


Responsible innovation doesn’t happen overnight, but it is a worthwhile endeavour. It’s essential to have the conversation around ethics in the early stages of projects, to set a precedent for how entire projects are governed and technologies developed.

Within the KnowRisk project, ethical tools and principles have been implemented as a practical way of reducing harm and risk.


Through the KnowRisk project, Digital Catapult is part of a consortium that aims to reduce the risk and impact of supply chain disruption.


In today’s globalised economy, supply chains are highly complex, and a single problem in any part of the network can affect multiple businesses. Consumers have now witnessed first-hand the effects of labour shortages, transport delays and supply disruptions, which have created surges in demand and panic buying. Business owners often have little insight into the risks within their supply chain, but with visibility across the network they could minimise the impact of adverse events.


The KnowRisk project uses real-time data to improve visibility in the supply chain to help organisations avoid problems, ensuring they have the right insurance in place for when things go wrong.


Using the KnowRisk solution, a business can combine its internal data, alongside accounting, insurance and legal (AIL) data, augmenting this with geospatial, Internet of Things (IoT) data and over 300 third-party data sources to create a 360-degree view of risk.


While advanced technologies progress rapidly, the regulations and legislation needed to govern them lag behind. Because the KnowRisk project combines various technologies, involves multiple players and could affect several companies financially, ethical concerns must be identified and addressed head on.


Revising the Digital Catapult Ethics Framework


To embed ethics principles into early-stage machine learning startups, Digital Catapult created an ethics framework. 


The ethics framework encourages the responsible use of algorithms and data in machine learning applications, through seven core principles:


  • Be clear about the benefits of the product or service
  • Use data responsibly
  • Know and manage the risks
  • Be worthy of trust
  • Promote diversity, equality and inclusion
  • Be open and understanding in communications
  • Consider the business model


However, compared to a single startup, KnowRisk is a multi-technology, multi-stakeholder environment that connects insurance companies and businesses of all sizes, so applying ethics is far more challenging. Therefore, it was necessary to revise the ethics framework to create a bespoke version that could meet the needs of a consortium.


For example, when raising the question of who would benefit from the platform, the consortium recognised that KnowRisk could potentially affect many global supply chains, economies and countries, and that a benefit to one country could come at the expense of others.


In creating visibility across the supply chain, access to this information may not benefit all stakeholders equally, giving some companies a competitive advantage. For example, larger organisations in a supply chain may exploit transparency to extract profit from smaller companies and undercut or exclude them from the supply chain.


The responsible use of data in a machine learning environment is not only an ethical consideration but fundamental to developing a high-quality product.


The distributed nature of both the KnowRisk consortium and the network of users using this platform raises a key ethical concern in maintaining the robustness of machine learning processes, while respecting the privacy of sensitive commercial data and fulfilling the need for transparency.


To address this concern, the KnowRisk platform uses a federated learning (FL) approach. Federated learning is essential when one or more data owners need to adopt machine learning solutions trained on and run using distributed confidential data. With federated learning, analysis is carried out on site, where the data is stored, to protect privacy without explicitly sharing data between parties.
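The on-site training and parameter sharing described above can be sketched as a minimal federated averaging loop. This is an illustrative toy (a one-parameter linear model with made-up data), not the KnowRisk platform's actual implementation, and all function names are hypothetical:

```python
# Minimal sketch of federated averaging: each data owner trains locally on
# its own confidential data and shares only model parameters, never raw data.
# Names and data here are illustrative, not the KnowRisk API.

def local_update(w, local_data, lr=0.05):
    """One gradient-descent step on a model y = w*x, run on-site so the
    raw data never leaves its owner."""
    grad = sum(2 * (w * x - y) * x for x, y in local_data) / len(local_data)
    return w - lr * grad

def federated_average(client_weights):
    """The coordinator only ever sees model parameters, not data."""
    return sum(client_weights) / len(client_weights)

# Two data owners holding confidential samples of the same process (y = 2x).
client_a = [(1.0, 2.0), (2.0, 4.0)]
client_b = [(3.0, 6.0), (4.0, 8.0)]

w = 0.0  # shared global model parameter
for _round in range(50):
    updates = [local_update(w, client_a), local_update(w, client_b)]
    w = federated_average(updates)

print(round(w, 2))  # the global model converges to the true slope 2.0
```

The key property the paragraph describes is visible in the loop: only the scalar parameter `w` crosses organisational boundaries, while each client's `(x, y)` records stay in place.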


Evaluating and adapting applied AI ethics tools


As part of the KnowRisk ethics workstream, Digital Catapult also adapted and tested two applied AI ethics tools to enhance the transparency and robustness of the federated learning system: 


Model score cards for federated model reporting: This model card is a living document describing the machine learning model developed for KnowRisk and adapted for a federated learning context.
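A model card of this kind is essentially structured documentation that travels with the model. The sketch below shows what such a card might look like when extended for a federated context; every field and value is hypothetical, not the actual KnowRisk model card:

```python
# Illustrative model card structure, loosely following the "Model Cards for
# Model Reporting" idea, with extra fields for a federated learning setting.
# All names and values are invented for illustration.

model_card = {
    "model_details": {
        "name": "supply-chain-risk-model (hypothetical)",
        "version": "0.1",
        "training_setup": "federated learning across consortium members",
    },
    "intended_use": "surface supply chain disruption risk to businesses and insurers",
    "out_of_scope_uses": ["fully automated insurance pricing decisions"],
    "federated_context": {
        # Fields a federated adaptation might add: data never leaves owners,
        # so data provenance must be described per participant.
        "parameter_sharing": "only model parameters are exchanged, never raw data",
        "data_visibility": "no party sees another party's confidential data",
    },
    "limitations": ["per-client data distributions cannot be centrally audited"],
}
```

Treating the card as data rather than free text makes it easy to keep it a "living document": it can be versioned alongside the model and validated automatically.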


Record on negative impact (RONI): Developed for use in a small consortium of organisations, this is an adaptation of Reject on Negative Impact, a defence mechanism against model corruption and data poisoning attacks targeting federated learning systems.
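The core idea behind a Reject on Negative Impact style defence can be sketched briefly: before accepting a client's contribution, the coordinator checks whether it degrades performance on a trusted held-out set, and discards it if so. This toy version continues the one-parameter model from the federated learning literature and uses invented names and data, not the consortium's actual mechanism:

```python
# Hedged sketch of a Reject on Negative Impact (RONI) style filter:
# candidate model updates that reduce accuracy on trusted held-out data
# are rejected before they can corrupt the global model.

def accuracy(w, holdout):
    """Fraction of held-out points the model y = w*x predicts within 0.5."""
    return sum(abs(w * x - y) < 0.5 for x, y in holdout) / len(holdout)

def roni_filter(global_w, candidate_updates, holdout, tolerance=0.0):
    """Keep only candidate parameters that do not lower held-out accuracy
    by more than `tolerance` relative to the current global model."""
    baseline = accuracy(global_w, holdout)
    return [w for w in candidate_updates
            if accuracy(w, holdout) >= baseline - tolerance]

holdout = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # trusted data: y = 2x
global_w = 2.0
updates = [2.1, 1.95, -5.0]  # the last value simulates a poisoned update

clean = roni_filter(global_w, updates, holdout)
print(clean)  # prints [2.1, 1.95]: the poisoned update is rejected
```

In a real deployment the held-out set and tolerance would need careful design, since an overly strict filter can also reject legitimate updates from clients whose data differs from the trusted sample.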

Practical ethics can enable practitioners to truly consider, from the outset, the implications of their technology and product development to mitigate risk.


In conclusion, as a demonstration of how ethics can be applied in practice, the consortium established that identifying risks and addressing ethical concerns must be a top priority for businesses at the beginning of product development.


The focus on ethics must be embedded into a product, kept alive through its lifecycle and not fall victim to ethics washing or other corporate social responsibility activities that fail to protect against harm.

To find out more, download the KnowRisk report here.
