As state and local economies reopen, employers across the country are cautiously welcoming employees back to their jobs while guarding against a resurgence of the COVID-19 outbreak. For returning workers, the workplace will be different from before, including the extent to which their privacy will be protected, especially their medical and health information.
In recent weeks, the White House released guidelines for reopening the nation's economy, largely punting to state and local officials to assess whether they are sufficiently prepared to stay ahead of the COVID-19 spread and when to reopen non-essential businesses. In anticipation of bringing employees back to the workplace, the administration has also instructed employers to "develop and implement appropriate policies" to keep workers and patrons safe from contagion. That may be easier said than done.
With the rapid advance of technologies that enable constant connectivity, the once-sharp line between home and work began to blur over the last decade. Laws that govern employees' privacy rights were already lagging behind this reality when the COVID-19 coronavirus hit, and workplaces worldwide were suddenly converted into remote workplaces overnight.
The COVID-19 pandemic has forced much of the American workforce online, where employers are making use of a variety of platforms to facilitate remote work. Some of these platforms involve video recording or access by fingerprint, face scan, or retina or iris scan, which may result in the capture and storage of sensitive biometric information. As workplaces reopen, there will likely be an uptick in the collection of biometric data as employers turn to symptom screening technologies that collect biometric data, such as contactless thermometers that identify particular employees through facial recognition technology, and look to facial recognition and retina or iris scanning technologies to facilitate contactless security access.
Even before the onset of the COVID-19 pandemic, California businesses braced for the significant impact that the new California Consumer Privacy Act (CCPA) would have on their operations. This far-reaching law, the first consumer privacy law of its kind in the country, provides consumers with rights governing how businesses collect and use their data. The law, which took effect on January 1, 2020, was pushed by consumer privacy advocates after a series of data security breaches, like the 2014 hack of Sony associated with the release of its film "The Interview," which exposed its employees' emails and personally identifiable information.
For all of its ostensible benefits and efficiencies, widespread implementation of artificial intelligence (AI) poses significant danger to workers in the U.S. and elsewhere. As employers increase the amount of employee data they collect, so does the risk that the information will be abused, placing employee security and privacy in peril.
Science fiction movies and sensational headlines warn us that artificial intelligence (AI) is going to make our jobs obsolete, widen the chasm between the very rich and the barely-surviving poor, and even develop superior consciousness. Far-fetched fantasies aside, many of AI's applications pose some very real threats to the modern workplace.
Today's computer technology improves exponentially from year to year, putting tiny, yet ever more powerful, computers in the palms of our hands, on our bodies, or even under our skin. With the proliferation of wearable "Internet of Things" devices, many new technologies that track our physical and physiological traits are moving into the workplace - yet, our privacy laws are struggling to keep up. This gap between technology and the law can put employees' privacy rights at risk.
The Ninth Circuit, in tension with the Fifth and Tenth Circuits, holds that a public employee has a federal constitutional privacy right (under due process) not to be fired from a job because of an extramarital affair with a co-worker. A concurring judge in the panel agrees with the result, but offers a narrower rationale.