Wearables at work allow employers to track productivity and health indicators—and pose tricky privacy issues
Wearable electronics, like the Fitbits and Apple Watches sported by runners and early adopters, are fast becoming on-the-job gear. These devices offer employers new ways to measure productivity and safety, and allow insurers to track workers’ health indicators and habits.
For employers, the prospect of tracking people’s whereabouts and productivity can be welcome. But collecting data on employees’ health—and putting that data to work—can trigger a host of privacy issues.
The Wall Street Journal asked John M. Simpson, director of the Privacy Project at the nonprofit advocacy group Consumer Watchdog; Chris Brauer, director of innovation and senior lecturer at Goldsmiths, University of London; and Edward McNicholas, co-leader of privacy, data security and information law at law firm Sidley Austin LLP, to weigh in on how companies should handle data collected from wearables. Here are edited excerpts of their discussion.
Do I have to?
WSJ: Should employers be able to require their employees to wear wearables?
MR. BRAUER: It’s about a social contract between employer and employee. It’s in nobody’s interest to have overworked, stressed and anxious employees who often aren’t even aware of their own condition. Making things visible is a good thing if there is a culture of trust and accountability.
The real challenge is in productivity and performance. Sport science has evolved remarkably in the last 10 years, and we can expect the same from management science.
Is it reasonable for a team to expect a football player to wear a sensor in his shirt to monitor granular movement and injury susceptibility—things that video, psychologists and pitchside observers just don’t pick up? Nowadays you can’t compete at top-level sport without this kind of wearable insight and analytics.
In the near future we’ll see the same kind of thing in all fields of endeavor. In most fields it may be a similar question, not so much of whether you should be able to require wearables as whether you can compete without them.
MR. SIMPSON: Wearables that track health data collect deeply personal information about an individual. Requiring an employee to wear such a device is an Orwellian overreach and an unjustified invasion of privacy.
Another issue to consider is how accurate the data from such devices actually is. There are serious questions about the accuracy of many of the apps that power them. Making decisions about people based on their private information is bad enough. Worse would be making decisions based on private health data that is wrong.
I don’t see how there is a legitimate place for mandatory health wearables in the workplace. Moreover, their required use would undermine employee morale, likely having a negative impact on productivity.
If the employer makes the case for access to some of the data and the employee agrees, that is a different situation. The problem is that the employee might feel under great pressure to agree to the use of their data. If data is shared on a voluntary basis, there must be provisions in place so there is no coercion.
MR. MCNICHOLAS: Some wearables will protect workers from radiation, accidents, particulate matter in their lungs. Such health protections should be treated differently.
Performance monitoring, however, raises other issues. Transparency and reasonableness strike me as key. Employers should be mandated to be transparent with their employees and to let the employees make the choice about whether it is reasonable.
When nobody is being harmed by a wearable, I think we have to acknowledge that the equation is different and leans toward more liberal use of wearables.
The danger of discrimination
WSJ: Here’s an example of how employers might use data from wearables: Imagine a company’s sales representatives wear trackers that measure sleep hours and quality. The boss has access to this data, and can use it to inform decisions.
Studying the sleep patterns of sales reps Jack and Jill, he notices that Jill slept well last night and Jack did not. He decides that Jill will make that afternoon’s client pitch, since the data gives him more confidence in her ability to perform that afternoon. Is this appropriate?
MR. BRAUER: If there is a very strong historical correlation between Jack and Jill’s sleeping patterns and their sales performance, then it makes sense to make a strategic decision to send one or the other into a big pitch using this data point. This assumes that Jack and Jill have volunteered or are contracted to wear the fitness tracker with the knowledge that the data from the device may be used by management to make these kinds of strategic resource-allocation decisions.
You’d also like to see organizations that understand sleep quality as a predictor of performance incorporating this into health and well-being strategies for their workforce—offering sleep training, for example, or sharing knowledge around anonymized and aggregated data.
We are also going to see lots of examples of individual employees developing biometric curricula vitae that indicate their productivity and performance under certain conditions, and they can use this to lobby employers or apply for jobs. So if the job requires high performance under stressful conditions, you can demonstrate in your data how you have performed under stressful conditions in the past. This primary data can potentially be a very reliable predictor of future performance.
MR. SIMPSON: Given that different people require different amounts of sleep, it would be difficult for any manager to make meaningful decisions about which employee to send on a client pitch based on how much sleep they had. I’d think past job performance and results would be much more useful.
Using private health data to apply for jobs would open the door to all sorts of unfair discrimination. A question: In this predicted world of wearable fitness devices in the workplace, would managers and executives be expected to share their private health data with employees?
MR. MCNICHOLAS: In some situations, employers need to know health information about employees in order to keep them safe. Employees operating dangerous machinery should also have some obligation to share with their employer whether they are under the influence of medicines that may impact their ability to do their job safely. The safety concerns here are often at least as much about other workers, customers, and the general public as they are about the health of the particular employee.
Perhaps an employer could get the same result by giving employees an incentive payment or award if they opt into sharing sleep patterns and hit their sleep goals.
The potential for discrimination against persons with physical or mental differences must be kept in mind. If the results of this sort of tracking led to discrimination against persons with conditions such as insomnia, depression or ADHD, the program would need to be reformed. White House and Federal Trade Commission reports on big data have highlighted the potential for the new world of big data to lead to such results.
The rubber will hit the road when we have artificial intelligence analyzing the massive data sets created by these wearable devices. To my mind, we should not deny ourselves the potential benefits of these technologies by banning them, but we must keep a critical eye on particular implementations to ensure that they do not become new ways of discriminating against people based on illegal or illicit criteria.
Duty to disclose?
WSJ: If employers collect employee data that indicate the employee may have a serious health condition, should they notify the employee? For example, some employers use retinal scans for identification when employees enter a secure facility, and retinal scans can indicate diabetes. Furthermore, should the employer be required to test employee data for serious illnesses?
MR. MCNICHOLAS: Imposing duties to scan data and warn people about information from big-data analysis could lead to crushing financial burdens on the development of big data. If everyone holding any data had some obligation to analyze it for every possible issue, how could anyone know whether they had conducted enough analyses? Indeed, even after running every analysis imaginable, might one more test later reveal something further?
It would be far more reasonable to ask an employer to make a good-faith effort to inform an employee when it happens to learn about a risk to employee health. It seems like an entirely different issue to use the threat of class action to force an employer to conduct tests solely in order to gain that knowledge.
Eventually some tests might become so inexpensive and commonplace that they will be conducted as a matter of course, but it will be important even then for employees to have some ability to decline to have sensitive personal data analyzed for nonemployment reasons. Some employees may well object to such testing, even if their employers could conduct it for free. At this point, the only thing that is clear is that we are at the cusp of what could be a dramatically different relationship between employers and employees.
MR. SIMPSON: If the retinal data collected in this case happens to show a problem, then the employer should inform the employee. I see this as something that is discovered while doing the security check. However, analyzing the retinal data for other purposes would be wrong. Employers should not be analyzing their employees’ personal health information and certainly should not be required to do so.
Ms. Haggin is a reporter for The Wall Street Journal in San Francisco. Email her at [email protected].