You will no doubt have received the emails yourself: don’t forget to opt in, click here to stay in touch, we don’t want to lose you. The General Data Protection Regulation, or GDPR, comes into force on May 25, and organisations and businesses large and small are racing to ensure that the way they collect, store and use the personal data of their customers and clients meets the higher standards of this new European Union privacy law.
Compliance with GDPR can be costly, requiring organisations to analyse the way they work, the data they use, and how that data is handled and secured. Documenting how personal data is held and processed is tedious and time consuming, as is developing procedures for dealing with individuals’ requests to see the data held on them, for responding to security breaches that involve loss of data, and for assessing the privacy impact of a new product or service.
To data protection authorities across the European Union, such as the UK Information Commissioner’s Office (ICO), this is just good practice – the cost of doing business in a free and open market. But what if yours is a non-profit organisation? Several UK charities have been fined for breaking existing data protection laws. Many others are acutely aware that a single penalty for non-compliance could put them out of business.
The ICO has produced guidance for charities, and reading it you might think that the challenges charities face are the same as those facing any small business. Both have limited resources, with little time or money to spare for ensuring compliance. Losing or misusing personal data erodes trust, whether those affected are paying customers or charity donors. But scratch beneath the surface and you can see how GDPR creates unique problems for small charities, particularly those that work to help society’s most vulnerable.
Duty of care
The new privacy regulations require that personal data is “processed in a manner that ensures appropriate security of the personal data”. Any security expert will tell you that perfect security is impossible, so businesses can meet this requirement by investing in security considered “good enough” to meet the duty of care to their clients and customers.
But for charities, the duty of care they feel towards both their vulnerable client base and their donors is so strong that a culture of cost-cutting has formed: money spent on overheads such as cybersecurity is seen as money diverted from the cause. And because charities often lack the expertise to understand the risks they face, they may wrongly believe they are avoiding risks, or accept risks without understanding the implications. Ultimately, this works against charities investing in the security they actually need. A report commissioned by the UK Department for Culture, Media and Sport in 2017 found this culture even led some charities to rely intentionally on out-of-date or low-technology solutions. In one case, a charity was even prepared to accept the risk of damaging data losses, in the hope that its donors would be sympathetic and accept that cybersecurity was a luxury the charity could not afford.
The new privacy regulations are built around fair treatment, but this framing fails to appreciate the ethical tensions charities face. Under GDPR, organisations can only collect data from individuals when they have a legal basis for doing so – for example, that the individual has given their consent (such as by signing up for an email newsletter), or that the organisation must collect the data to comply with a legal obligation (such as banking information required to meet money laundering regulations). Complications arise, however, because consent, once given, may also be withdrawn.
Imagine, for example, that Bob suffers from a drug addiction. In a moment of clarity, he checks into a private rehab centre for help, and gives consent for the centre to collect the personal data it requires. But Bob later relapses and – to keep this information from his family – withdraws his consent and exercises his right to be forgotten, demanding that the rehab centre delete the data it holds on him.
The GDPR provides some discretion for processing personal data in matters of life and death, but not if Bob is capable of giving consent. And so the rehab centre faces a dilemma: it can assert that Bob isn’t capable, exposing itself to the risk of a fine should he report it to the ICO. Alternatively, it can comply, exposing Bob to future risks that may threaten his health or life by deleting information that might one day help save it.
ICO guidance for not-for-profits should answer the sorts of questions regularly raised by charities. Instead, it treats small charities like any other small business. The ICO claims this is the information that charities want, but it is not the information they need. If guidance fails to acknowledge the risks to small charities, what incentive do charities have to invest time and money in following it?
What charities need is fewer platitudes on what they should be doing – they already know this – and more advice on how to do it, given the very particular challenges they face. In a speech to charities attending the Funding and Regulatory Compliance conference last year, the information commissioner said that getting privacy right can be done, that it should be done, and that she would say how it can be done. Yet as the deadline looms, charities are still waiting to hear about the “how”.