As digitization steadily transforms the landscape of business, new opportunities continue to unfold. The depth, breadth, and sheer volume of information that organizations and employees have at their disposal today is unprecedented. Employers now have the ability to examine their workforce in remarkable detail and have strong incentives to do so. Employee data tells a story about the organization’s workforce – how employees spend their time, how productive they are, and how different factors contribute to their output. The information is now available, and while digitization unlocks new opportunities for understanding, it also raises new concerns.
Beyond legal considerations, progressive employers should be mindful of the ethical standards they adhere to while utilizing this information. Collecting and analyzing workforce data without appropriate communication and purpose may cause unease and distrust among employees. It raises a question: what are the ethical implications of people analytics?
Technological advancement and human acceptance of its outcomes are not always aligned. For example, to assess productivity, WorkSmart by Crossover takes photos of employees at their desks every 10 minutes. While this may provide organizations with valuable insight, this type of technology risks creating an environment employees perceive as "Big Brother"-esque. Employees who feel under surveillance and lack trust in their employer may be less engaged. One study estimated that 81% of people analytics projects are jeopardized by ethics and privacy concerns.
In this article, we first explore the shifting legal landscape around industry uses of personal data. We then discuss the matter of privacy, considering both the employer’s perspective and employee rights. Next, we provide recommendations to assist organizations in building a data policy that allows leaders to adopt analytical approaches to people management without compromising employees’ rights — or dampening their spirits.
The law provides an important starting point when considering the collection and use of employee information. But what does the law have to say about these matters? We are now in what may be seen as the Wild West of digital rights — that is, rights of employees in the digital era.
On May 25, 2018, the EU's General Data Protection Regulation (GDPR) went into effect. This legislation put forward a comprehensive set of regulations designed to protect individuals' rights with regard to how companies use their data. GDPR may be seen as a milestone in ethics for the digital era in that it focuses on individual rights, and the penalties for breaching it are severe. GDPR places heavy emphasis on consent for the use of personal data, and in an employment context, the meaning of consent is quite strict.
Apart from cases where the use of data is in the employee's best interest, consent is generally considered invalid because of the imbalanced nature of the employee-employer relationship. The EU website gives the example of an employer requesting employees' consent to install CCTV cameras in the hallway and at the washroom entrance in order to monitor time spent outside the office. In this case, even if employees do consent, the consent would not be considered valid, and the employer therefore could not install cameras on this basis.
The penalties for breaching GDPR can also be very strict. According to the text, “companies who fall foul of the regulation and are found to be misusing personal information face stiff fines of up to €20m or 4% of annual worldwide turnover, whichever is the greater of the two.” The stricter nature of this approach further indicates that protecting individuals’ privacy rights is of real concern and is now being addressed seriously.
While this represents a significant advancement of employee rights in the digital era, the scope of GDPR is somewhat limited. Most significantly, it protects only individuals in the EU. It also primarily protects personally identifiable information, leaving open questions around anonymized or non-identifiable data.
Ethics, in this case, is defined as the gap between what, in legal terms, companies can do when it comes to collecting and analyzing personal data, and what they should do. Although the purpose of the law is to discourage unethical behavior, it is nearly impossible for legislators to keep pace with technological advancement. For this reason, considering the implications of people analytics beyond the scope of the law is of great importance.
Privacy is defined as “the ability to regulate how much information about one’s self is known to others” (e.g. Westin, IBM), and is widely considered a universal human right. The United Nations’ Universal Declaration of Human Rights states that “No one shall be subjected to arbitrary interference with his privacy.”
However, since this declaration was written in 1948, the information landscape has drastically changed. The availability of information has increased exponentially, and organizations' capacity to gather, store, and analyze data has advanced by leaps and bounds. Today, our browsing history, shopping habits, and even our time spent sitting in traffic contribute to our digital footprint, which can be collected and sold by third parties. These new streams of information have made accessing data on employees' personal lives extremely simple.
This shifting landscape raises new questions, such as: What data do employers have the right to access? What rights should employees (or prospective employees) have in terms of what data is collected on them?
Privacy Matters during Recruitment and Hiring
The internet and social media are blurring the lines between public and private life, making it easier than ever before to acquire an individual’s personal information. To illustrate this point, a 21-year-old Russian photography student ran a project called “Your Face is Big Data”. He photographed 100 subway passengers using his phone (most claimed not to notice), then used facial recognition software to identify these individuals and retrieve their social media profiles. For individuals between 18 and 35, he claimed it was extremely easy to find their profiles, illustrating that our private lives are getting easier and easier to trace.
The availability of information on the internet opens up new opportunities for employers, particularly when it comes to recruitment and hiring. Hiring the right people is critical to organizations’ success, and the cost of a wrong hire can be very high. To avoid these costs, employers are motivated to make more informed decisions by seeking additional information on prospective job candidates via the internet. In a 2017 survey of 2,300 hiring managers sponsored by CareerBuilder, 70% reported using social media to screen candidates.
Furthermore, 54% reported finding information on social media that led them not to hire a prospective candidate for an open role. The most commonly cited factor was the candidate posting provocative or inappropriate content.
Do prospective employees anticipate that they will be googled by employers, and manage their online presence accordingly? Research by Execunet found that 82% of employees expect prospective employers to google them. Curiously, however, only 33% reported bothering to google themselves.
Beyond a quick online search, employers have been found to dig deep for prospective employees' personal information. In some cases, third-party vendors are used to acquire information on social media usage, online shopping activity, online gaming sites, and online auction sites such as eBay.
What does this imply for employee rights? If a prospective employee fails to adjust their privacy settings on Facebook and their salacious photographs are visible to a prospective employer, is this reasonable ground for exclusion? Meanwhile, a comparable candidate with equally questionable photos may still be considered, simply because those photos are not visible to the employer.
- Define the job and expected competencies as the first step. Let each interviewer rate the potential candidate on these parameters.
- Have a clear policy on what information is used during hiring, and be sure to make it known to potential job candidates.
- Be consistent. Sources of information that are assessed for one candidate must be assessed for all candidates.
- Have processes in place to ensure that this policy is followed.
- Background checks should be conducted by a professional third party rather than by hiring managers, so that no personal information (marital status, etc.) is revealed.
Privacy Matters in Gathering Workforce Data
The new and growing field of people analytics uses workforce data to support people management decisions across all human resource functions – from resourcing to change management to diversity initiatives.
Business leaders are focused on meeting business objectives and maximizing value for investors or shareholders. When it comes to the workforce, leaders have incentives to identify strong performers as well as poor performers. Gaining a picture of productivity levels is another concern, and may be of particular interest to Canadian employers, as Canadian productivity has been shown to be lower than American levels across firms of every size.
Data science is developing methods to reveal surprising insights from big data. For example, Target gathers data on its customers’ shopping behavior. Using this data, the American retailer’s data science team was able to identify when customers were pregnant. While this is in the realm of consumer behavior, many employees would fear their employer learning such personal information.
These days, data is collected automatically through routine activities. For example, browser history provides information that employers can use to track which websites were visited, at what time, and by whom. Does this data belong to employers? Is it reasonable for them to use it to evaluate employees' productivity?
Considering the employer’s perspective, with much activity now happening online, it may be easier for employees to attend to personal matters in the office. Prior to the advent of the internet, it would have been difficult to engage in personal activities such as shopping or social conversations with friends while at work. However, without monitoring search history and internet activity, it is difficult to gauge whether employees are focused on company matters during company time.
When employees are asked whether they consent to the collection of information, it is important to consider whether they are making this choice freely or under duress. If an employee feels they are at risk of facing negative consequences for declining to provide data, their freedom in making this decision will be compromised. For this reason, we recommend that managers remain blind to whether or not their team members have provided consent, ensuring that employees who do provide data do so freely and without concern for their superior's reaction.
When collecting data, consideration should be given to when, how, and by whom data collection practices are communicated. Language should be easy to understand, not legalese. In addition, employers should consider whether consent is an opt-out or an opt-in process. This will ensure procedural transparency so that employees can make a fully informed decision.
Should there be a distinction between personal and anonymized data? While of course, as an employee, it is logical to expect to be asked for consent before employers collect personally identifiable data, what about anonymized data? If data cannot be attributed to a specific individual, and only general, aggregate trends are analyzed, should employees have the choice to opt-out?
Our view is that obtaining consent is still advisable. Even if data is anonymized, obtaining employees' consent before collecting data will ensure that they do not feel distrustful of the employer. Feelings of mistrust threaten to reduce engagement and productivity, thereby defeating the purpose of the data collection in the first place.
- Be transparent about what data is collected (for example, browsing patterns) and for what purposes. This should be reviewed at the time of hire (ideally earlier) and regularly revisited.
- Obtain explicit consent from employees on the collection of any non-routine information. Request employees renew consent on a regular basis, such as once every quarter.
- Ensure managers are blind to employees’ decisions around whether or not to provide data.
- Avoid using people analytics to identify individuals. Remove identifiers and examine in the aggregate.
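The last recommendation above can be sketched in a few lines of Python. This is a minimal illustration, not a production anonymization scheme: the records, salt value, and field names are hypothetical, and a salted hash is pseudonymization rather than true anonymization, so the salt must be stored securely and separately from the data.

```python
import hashlib
from collections import defaultdict

# Hypothetical example records: (employee_id, hours_online, department)
records = [
    ("emp-001", 6.5, "Sales"),
    ("emp-002", 7.0, "Sales"),
    ("emp-003", 5.5, "Engineering"),
    ("emp-004", 8.5, "Engineering"),
]

SALT = "rotate-this-secret"  # placeholder; keep separate from the data itself

def pseudonymize(employee_id: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((SALT + employee_id).encode()).hexdigest()[:12]

# Strip direct identifiers before any analysis...
pseudonymized = [(pseudonymize(eid), hours, dept) for eid, hours, dept in records]

# ...and report only aggregate trends, never individual rows.
totals = defaultdict(lambda: [0.0, 0])
for _, hours, dept in pseudonymized:
    totals[dept][0] += hours
    totals[dept][1] += 1

averages = {dept: round(total / count, 2) for dept, (total, count) in totals.items()}
print(averages)  # {'Sales': 6.75, 'Engineering': 7.0}
```

Note that only department-level averages leave the analysis step; the hashed identifiers exist solely so that repeated observations of the same person can be linked without exposing who that person is.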
Our recommendations for ethically approaching employee data collection and analysis rest on one central theme: transparency. Having clear policies in place for these practices will foster fairness, clarity, and in turn, trust. We recommend building a policy that carefully considers the interests of both the employer and the employee.
The most effective approach to creating policies on the collection and use of personal data will depend on the social norms of the country in which your organization is located. Not all countries and cultures have the same attitudes towards personal data collection. A study of 17,432 workers in 24 countries was conducted by IBM to assess how willing individuals across different countries are to share their personal information with their employer.
The study revealed variation both between and within countries in worker attitudes towards sharing information with employers. Canada ranked third-highest in resistance to sharing data with employers, behind Germany and the USA. These norms around privacy are important to consider when establishing a company policy. In Canada, where individuals are generally hesitant to share personal information with their employers, policies that allow employees to opt in rather than opt out may be better received.
However, while the country rankings outlined by IBM give an idea of how employees may feel about personal data collection, they are not absolute. In India, for example, which ranked lowest in resistance to sharing data with employers, there were still numerous individuals who displayed high levels of resistance.
This demonstrates that comfort levels with personal data sharing may vary significantly across organizations and individuals. Therefore, understanding both the cultural norms around privacy within your country, and within the organizational culture itself will lead to the most effective policy.
The world is changing – and quickly. Data is being generated in unprecedented volumes, and this trend will only accelerate.
While people analytics is now being used to leverage this data and drive organizational efficiency, it remains largely uncharted territory when it comes to the law. While legislation like the European Union's GDPR is emerging, technology may advance faster than legislators can keep up. For this reason, as new tools and techniques to harness the power of people data continue to emerge, it is important to consider the ethical implications of practices that may fall beyond the scope of the law.
We recommend providing transparent information on the type of data that will be collected, and its purpose. It is good practice to outline the benefits and potential risks, and how the data collected will be used to enhance the organization. In addition, allow employees to opt out in cases where data collection is not essential, and keep managers blind to this decision.
Creating clear, consistent and up-to-date policies for all members of the organization will ensure that the expectations of employees and employers are aligned. In all, we believe the most important factor to be transparency, as it reduces the risk of employees feeling under surveillance or losing trust in their employer.
Following these guidelines will allow organizations to make the most of people analytics without breaching employee trust and confidence. After all, the purpose of people analytics is to increase organizational effectiveness – so measures should be taken to ensure it does just that.
Insight222 People Data Ethics & Privacy Research, November 2017
About the Authors
Heather Mann, PhD
Heather has long been interested in the science of human decision-making. She holds three degrees in Psychology, including a PhD from Duke University where she was advised by behavioural economist Dan Ariely. In recent years, Heather has become increasingly interested in how data can be used to enhance decision-making. She consults for tech and social enterprise clients in this domain. Previously, Heather led research and analytics for Ayogo, a health tech company. She is the COO of Percipient Solutions Ltd.
Claire is a Researcher at Percipient Solutions and a fourth-year student at the University of Victoria. She is completing a Bachelor of Science in Economics with a minor in Commerce. Claire is motivated by a desire to understand how individuals make decisions, and how data can be used to optimize this process.