Claire Williams, Director of people and services, CIPHR


The socially responsible business: using technology ethically

Choosing, deploying and using digital HR tools in a socially responsible way.

In the race to keep pace with – or get ahead of – their competitors, organisations’ use of technology is accelerating rapidly. A recent survey found that 44% of businesses plan to increase their technology spending in 2020.

Consumers, on the other hand, are growing more wary of ‘big tech’. High-profile data leaks and scandals such as the one involving Cambridge Analytica mean that more people are thinking twice before offering up their digital information. So why aren’t businesses more concerned?


In this context, when did you last kick off a digital project by outlining some guiding ethical principles behind its selection, deployment and day-to-day usage? Often, ethical principles – related to fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability – can be overlooked when we discuss the pros and cons of deploying a new system or investing in new hardware.

There are inherent risks to using technology without considering these principles first. For example, would you sanction the introduction of a new digital tool if you knew it would create a health and wellbeing issue? Probably not – yet many businesses are switching to Slack as a primary communications tool, despite some branding it a “life ruiner” that stops you switching off at the end of the working day.

As an organisation, having a code of ethics that you stick to is good for your reputation, your profitability and performance, and your employer brand. As the CIPD notes, HR professionals are ‘uniquely placed’ to encourage principled decision-making. Here’s how you can apply your ethics and principles to making decisions about technology – whether that’s choosing to invest in a new HR system, mobile devices, artificial intelligence (AI), or manufacturing equipment.

Choosing and procuring tools

There are a couple of key areas where ethics comes into play when choosing and procuring technology. The first is the group of internal stakeholders you choose to engage with during the benchmarking or selection project. These stakeholders should represent all types of user, from across your organisation. If you’re a large company looking to purchase a new HR system, for example, then your heaviest users (in terms of hours spent on the system) are unlikely to be your HR administrators – it’ll be your managers and employees whose views matter the most.

When defining this stakeholder group, ensure it is as representative as possible; bear in mind different geographies and languages, remote workers and those who spend a lot of time on the road, and those with different levels of access to and experience of using desktop computers, for example. Given that one in five people in the UK has a disability, you’ll also need to consider those with additional access requirements, such as a need for screen magnifiers or readers, or speech recognition tools.


Secondly, you’ll want to consider the ethics and principles that are guiding the work of the technology vendor itself. This is particularly important if you’re choosing to invest in AI tools. Take the case of a machine-learning system built by Amazon to speed up its hiring processes. The AI was trained to rate applicants based on CVs submitted to the company over a 10-year period. Since most of the CVs in the dataset were from men, the AI taught itself that male candidates were preferable, and gave lower ratings to CVs that included the word ‘women’s’ (e.g. women’s chess club). Despite editing the programme, Amazon felt it couldn’t guarantee that gender bias wouldn’t re-emerge further down the line, and terminated the project. So if you’re choosing or working with an AI vendor, ask them about bias: what datasets does the programme learn from? Is the team of engineers creating the system demographically representative?

Finally, if environmental concerns are a key component of your company’s ethical principles, you’ll want to choose a vendor that is also focused on lessening its impact. Microsoft, for instance, has recently pledged to become ‘carbon negative’ by 2030, and hardware suppliers that support the principle of the circular economy will be making efforts to recycle and refurbish equipment as far as possible.

Deploying systems

Your representative group of stakeholders should stand you in good stead for a successful deployment and implementation phase: they can help communicate your organisation’s vision for the new technology and keep employees up to date with the project’s progress and expected benefits.

Nevertheless, you’ll need to prepare for any potential ethical objections, which may well centre around data privacy and security. The use of wearable devices such as fitness trackers, and the collection of biometric data, for example, are likely to be particularly contentious.

One of the GDPR’s guiding principles is data minimisation: the personal data you process should be adequate for its stated purpose, relevant to that purpose and, crucially, limited to only what is necessary. Some argue that employer-provided fitness trackers collect too much data: not only are they collecting data 24/7, it’s rarely vital that the data is available in real time (point-in-time health assessments can suffice for many office workers), and the data collected often isn’t revelatory, given that problems such as poor mental health, obesity and high blood pressure are widespread.

If your organisation is introducing technology or AI that will automate some activities, what impact will this have on your workforce? If jobs become redundant, does letting people go fit with your guiding ethical principles? Or should you invest in retraining, upskilling and redeploying affected people? It’s estimated that around 60% of occupations could have 30% or more of their constituent activities automated – a figure that will no doubt rise as technologies become more sophisticated, making this a pressing, HR-centric challenge in the years to come.

Using digital technology

Just because you’ve considered the ethics of choosing and deploying technology doesn’t mean the task is complete – you’ll want to monitor that systems and hardware are still being used ethically, and that they remain fit for purpose. You’ll need to ensure that data is being stored and processed in accordance with the GDPR, and that the privacy and security of personal information is protected by guarding against cyber-attacks and data breaches. More than half of cybersecurity incidents are caused by negligent employees, so HR teams should be working closely with IT and security departments to train staff on the importance of cybersecurity, and to create a security-aware culture. If you’re providing employees with hardware and systems to use outside the office, and encouraging them to use social media or communication apps, you might need to consider revising your fair usage policies.

In addition, if you are using AI to help make decisions – such as around hiring or promotion – which employees will be held responsible for the programmes’ outcomes? How will you justify this decision-making to applicants and staff? How will you guard against the risk of bias creeping in as the AI trains itself on your datasets?

Ultimately, technology is not inherently good or bad: it’s our usage – and the principles behind that usage – that really matters. In a competitive labour market, and with high-profile misuse of data occurring in the consumer sphere, now is the time for employers to put ethics at the heart of their technology strategy, with HR professionals leading the way.

Interested in this topic? Read Ethics in the workplace: what’s the point?
