“Data privacy assurance serves as an example of what should be done in this space. Organisations have become quite good at working out how to evaluate whether a particular form of corporate behaviour is appropriately protective of the data privacy rights of individuals. ‘Privacy impact assessments’ are conducted by privacy officers, lawyers and other professionals who are trained to understand whether or not a particular practice in the collection and handling of personal information about individuals may cause harm to those individuals,” says Prof. Leonard.
“This provides an example of how a pretty amorphous concept, a privacy right, can be protected through the use of a problem-specific risk assessment process that leads to concrete privacy risk mitigation recommendations that an organisation should implement.”
Bridging functional gaps in organisations
It is important to reduce disconnects between the key functional stakeholders who need to be involved in assuring fair, accountable and transparent end-to-end decision-making. These disconnects appear across many industry sectors, digital advertising services being one example. Chief marketing officers determine marketing strategies that depend on advertising technologies, which are in turn managed by a technology team, while data privacy is managed by yet another team. Prof. Leonard says these teams do not speak the same language as each other, which makes it difficult to arrive at a strategically cohesive decision.
Some organisations are addressing this issue by creating new roles, such as a chief data officer or customer experience officer, who are responsible for bridging these functional disconnects. Such individuals will often have a background in or experience with technology, data science and marketing, in addition to a broader understanding of the business than is often the case with the CIO.
“We’re at a transitional point in time where the traditional view of IT and information systems management doesn’t work anymore, because many of the issues arise out of analysis and uses of data,” says Prof. Leonard. “And those uses involve the making of decisions by people outside the technology team, many of whom don’t understand the limitations of the technology and the data.”
Why government regulators require teeth
Prof. Leonard was recently appointed to the inaugural NSW AI Government Committee, the first of its kind for any federal, state or territory government in Australia, which will deliver on key commitments in the state’s AI strategy. A key focus for the committee is ethics in AI. Prof. Leonard is critical of governments that publish aspirational statements and guidance on ethical principles of AI but fail to go further.
He gave the example of Minister for Industry, Science and Technology Karen Andrews’ announcement of ethics principles for artificial intelligence. “That statement was published more than 18 months ago,” he says. “I look at that, and say, ‘What good is that?’ It’s like the Ten Commandments, right? Yes, they’re a great thing. But are people actually going to follow them? And what are we going to do if they don’t?”
Prof. Leonard believes it’s not worth publishing statements of principles unless they go down to the harder levels of creating processes and methodologies for the assurance and governance of automation applications that include “true” AI and ML, and of aligning incentives within organisations with good practice. Some regulation will be needed to build the right incentives, but he says organisations first need to know how to assure good outcomes before they are legally sanctioned for bad ones.
Organisations need to be empowered to think their way through issues, and Prof. Leonard says there needs to be adequate transparency in the system, with government policy and regulators not lagging too far behind. A combination of these elements will help reduce the reliance on ethics within organisations, as they are provided with a strong framework for sound decision-making. “And then you come behind with a big stick if they’re not using the tools or they’re not using the tools properly. Carrots alone and sticks alone never work. You need the combination of the two,” says Prof. Leonard.
Risk management, checks and balances
A good example of the need for this can be seen in the Royal Commission into Misconduct in the Banking, Superannuation and Financial Services Industry. It noted that the key individuals who assess and make recommendations on prudential risk within banks are relatively powerless compared with those who control profit centres. “So, almost by definition, if you regard ethics and the policing of economics as a cost within an organisation, and not an integral part of the making of profits by an organisation, you will end up with bad results, because you don’t value highly enough the management of prudential, ethical or corporate social responsibility risks,” says Prof. Leonard. “You name me a sector, and I’ll give you an example of it.”
While he notes larger organisations will often “fumble their way through to a reasonably good decision”, another key risk exists among smaller organisations. “They don’t have processes around checks and balances and haven’t thought about corporate social responsibility yet, because they’re not required to,” says Prof. Leonard. Small organisations often work on the mantra of “moving fast and breaking things”, and this approach can have a “very big impact within a very short period of time” thanks to the potentially rapid growth rate of businesses in a digital economy.
“They’re the really dangerous ones, generally. This means the tools that you have to deliver have to be sufficiently simple and straightforward that they are readily applied, in such a way that an agile ‘move fast and break things’-type business will actually apply them and give effect to them, before they break things that really can cause harm,” he says.
This article is republished with permission from UNSW BusinessThink, the knowledge platform of UNSW Business School. You may access the original article here.