By: Karma Lei Angelo and Samuel H. Gilbert
Ministry of Fear
Mark was just hired for a new warehouse position. It's a third-shift freight handler job with excellent benefits at a Fortune 100 company. He beat out hundreds of other applicants for the chance to work for this corporation.
However, his excitement begins to wane on his first day. The more he learns about the company's internal affairs and procedures during orientation, the more concerned he grows.
Before Mark can even set foot in the warehouse, he must undergo an extensive physical, which also includes the insertion of an implant in the back of his right hand. Any devices, such as smartphones or tablets, must be checked in at the security desk and have Wi-Fi enabled. ID badges with credit card chips, personal data, and GPS trackers must be worn around the neck, and the scanned information must match the data on the implant. He's also equipped with a company-issued earpiece and microphone that monitor everything he says and hears.
As if that were not physically invasive enough, he's issued safety goggles with a two-way camera: one lens sees what he sees, while the other tracks his pupil and eye movement, documenting every blink, twitch, and focal point. Wrist and ankle bands will be required to ensure that his performance stays consistent each month, that his blood pressure and other health biometrics stay within normal parameters, and that he walks enough each day. Mark must also sign a waiver granting the company access to his vehicle and allowing it to install an additional GPS tracking unit that monitors the vehicle's location at all times.
Reluctantly, he accepts the terms and is outfitted with all the exterior—and interior—technology. He hopes he made the right decision to give up every aspect of his privacy. He hopes the company doesn’t use all the collected data against him.
Is this just a single company's attempt to control and monitor every one of its employees? Is this its way of preventing dishonesty, frivolous lawsuits, and theft? Or is this something darker? What happens when surveillance goes beyond opening postal mail and shooting video? Is this the new reality in the workplace?
Covert Affairs
Monitoring employees is nothing new in the workforce. Telephone monitoring has been around for decades. Government agencies have kept an eye on their own employees to prevent leaks of sensitive information. Even toll-free phone calls often start with the disclaimer: “This call may be recorded for quality assurance.”
Software monitoring can track which applications are used, which websites are visited, and which keys are pressed. It can also track the rate of keystrokes per hour and the mistakes made along the way. Key logging can capture specific words or store passwords, private messages, PINs, and usernames from third-party applications. Email monitoring is perhaps the most widely used form of electronic surveillance by employers. Emails, even those previously deleted, can be recovered and used in work performance evaluations or sexual harassment claims.
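To see how mundane the underlying mechanics are, here is a minimal sketch of keystroke-rate monitoring in Python. The event data, threshold, and function names are purely illustrative assumptions, not taken from any commercial monitoring product:

```python
from datetime import datetime, timedelta

def keystrokes_per_hour(timestamps, window=timedelta(hours=1)):
    """Count key events in the most recent window, scaled to an hourly rate."""
    if not timestamps:
        return 0.0
    cutoff = max(timestamps) - window
    recent = [t for t in timestamps if t >= cutoff]
    return len(recent) * (timedelta(hours=1) / window)

def flag_low_activity(timestamps, threshold=800):
    """Flag an employee whose hourly keystroke rate falls below the threshold."""
    return keystrokes_per_hour(timestamps) < threshold

if __name__ == "__main__":
    now = datetime.now()
    # Simulated key events: one keystroke every 6 seconds over the past hour.
    events = [now - timedelta(seconds=6 * i) for i in range(600)]
    print(round(keystrokes_per_hour(events)))  # ~600 keystrokes per hour
    print(flag_low_activity(events))           # True against an 800/hour threshold
```

A commercial product would presumably wire the same arithmetic to live input hooks and a reporting dashboard; the logic itself is simple.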
Location monitoring is typically used for tracking employees who do not remain in one location. Companies can track vehicles to ensure their employees deliver product as required.
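The verification itself can be a few lines of geometry: compare the vehicle's latest GPS fix with its expected stop using the haversine formula. The coordinates, tolerance, and function names below are hypothetical, not any vendor's actual fleet-tracking logic:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def on_route(vehicle_fix, expected_stop, tolerance_km=0.5):
    """Report whether the vehicle's GPS fix is within tolerance of its expected stop."""
    return haversine_km(*vehicle_fix, *expected_stop) <= tolerance_km

if __name__ == "__main__":
    truck = (40.7128, -74.0060)   # simulated GPS fix
    stop = (40.7150, -74.0080)    # expected delivery location
    print(on_route(truck, stop))  # True: within 0.5 km of the stop
```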
Best management practices have long included ways for employers to track their workers and run successful businesses. But that raises the questions: Do employees have an expectation of privacy when clocked in on the job? At what point do employers cross the technological privacy line and become too intrusive in an individual's life?
Get Smart
Several big-name companies stand behind the decision to use sophisticated sensors and microchips to monitor employee activity and productivity. These include companies we interact with every day:
- UPS has outfitted its delivery trucks with sensors that monitor the vehicle's engine and whether a driver has used their seatbelt.
- The largest retailer in the nation, Walmart, has filed a patent for an audio system that can listen in on customers. While the retail giant claims this is meant to filter keywords spoken by customers, the patent also states that the data could be used to monitor employee conversations.
- In the UK, a talent management company, Crossover, monitors its remote employees by taking photos through their webcams every 10 minutes. This ensures workers are not abusing company time by claiming to work when they are not.
These trends in surveillance may seem disturbing, but many employees have accepted them. Some enthusiastic individuals are even happy with the direction technological monitoring is taking in the workplace.
For some employees with health problems, biometric monitoring can catch conditions ahead of time. These could include stress triggers, potential heart disruptions, weight gain or loss, or a declining range of motion in the limbs.
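The logic behind such early warnings can be as simple as checking each reading against a normal range. The metrics, ranges, and sample values in this sketch are illustrative assumptions, not any wearable's actual thresholds:

```python
# Illustrative "normal" ranges; real clinical thresholds vary by person and context.
NORMAL_RANGES = {
    "systolic_bp": (90, 120),         # mmHg
    "diastolic_bp": (60, 80),         # mmHg
    "resting_heart_rate": (60, 100),  # beats per minute
    "daily_steps": (7000, float("inf")),
}

def out_of_range(readings):
    """Return the metrics whose values fall outside their normal range."""
    flags = {}
    for metric, value in readings.items():
        low, high = NORMAL_RANGES.get(metric, (float("-inf"), float("inf")))
        if not low <= value <= high:
            flags[metric] = value
    return flags

if __name__ == "__main__":
    sample = {"systolic_bp": 135, "diastolic_bp": 78,
              "resting_heart_rate": 88, "daily_steps": 3200}
    print(out_of_range(sample))  # {'systolic_bp': 135, 'daily_steps': 3200}
```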
In addition, such monitoring technology can strengthen the well-known link between physical activity and mental wellness by reminding employees to go for a walk, take a break, or even do some yoga if the workload is starting to affect their biometrics. A happy workforce is, after all, a productive one.
Furthermore, the digital citizenry of the world seems willing to make a tradeoff in their personal lives. A recent study shows that as long as consumers receive some tangible benefit, such as more tailored search results and useful ads, they are comfortable giving away some piece of themselves to data brokers.
It would follow, then, that framing such workplace intrusions as a matter of convenience and ease would garner buy-in from a generation that has become accustomed to commodification by data brokers.
Technology has not yet reached the point where it can invade a person's thoughts, but it is getting close. Deep learning is allowing marketing companies to target consumers on an individual level, and financial institutions will soon do the same. Some programs can even predict future behavior, if only in the short term, and the accuracy of those predictions over longer periods will only continue to grow. It seems a given, then, that employers will use this opportunity to predict, monitor, and assess their employees on an unprecedented scale.
Can we trust the accuracy of this technology?
Duplicity
Researchers at Dartmouth have recently developed a mobile-sensing system that uses fitness bracelets, a smartphone, and a custom app. They found that the system could correctly distinguish between low- and high-performing employees only 80% of the time. What about the other 20%? Are those workers misjudged and targeted erroneously?
It seems possible. Tech companies are so concerned about the biases and flaws in their algorithms that they are taking active steps to combat them.
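A quick back-of-the-envelope calculation shows the scale of the problem. The workforce size here is hypothetical, not a figure from the Dartmouth study:

```python
workforce = 1_000    # hypothetical number of employees being scored
accuracy = 0.80      # reported rate of correct low/high-performer classification

misclassified = round(workforce * (1 - accuracy))
print(f"Out of {workforce} employees, roughly {misclassified} would be labeled incorrectly.")
# Out of 1000 employees, roughly 200 would be labeled incorrectly.
```

At the scale of a national warehouse operation, a 20% error rate is not a rounding error; it is hundreds of careers judged by a mistaken score.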
What of the rights of employees and their privacy? There are few privacy laws in place to protect workers from the prying, and humanly flawed, eyes of their employers:
- A teacher’s aide was suspended in 2012 for refusing to share access to her social media account.
- A worker was fired when an AI flagged his profile because his manager had forgotten to renew his contract.
- At a single Amazon facility, multiple warehouse employees reportedly lost their jobs because statistics gathered by tracking software showed a decline in their productivity over time.
The news is filled with reports of firings driven by AI-based inferences. This is not simply a matter of employee privacy, but also a matter of deferred responsibility. If HR departments are willing to forgo some amount of paperwork and allow potentially flawed programming to make decisions, the whole process becomes easier but also more susceptible to error.
The Game
Look at the future of brain-machine and brain-computer interfaces (BMI and BCI). That future is already being written by the giants of the tech industry. Elon Musk and Mark Zuckerberg have invested heavily in BMI and BCI research in the hope of using the technology to improve the workplace. Researchers have monitored brainwaves, checking attention and concentration levels in test subjects. While this may be beneficial to a few workers (such as helping to prevent truckers from falling asleep at the wheel and, in turn, preventing fatal accidents), there are always weaknesses in such systems.
Ever since the age of big data began, our information has been at risk. Now imagine a hacker with access to your mind and to predictions about your behavior. Where is the security in such a system?
Data brokers have already been hacked, exposing the inner lives of millions. Such unprecedented access could allow hackers to meddle in the economy by planting false flags on employees for termination, or on a more intimate level open up the possibility of blackmail and extortion. The mental health benefits of monitoring would be rendered null in a digital biome where anyone is merely a single breach away from being pushed into unemployment or having their thoughts thrown into the wind.
Our world is growing closer. The line between the personal and the professional, the private and the public, grows thinner every year. The membrane that separates our lives, the firewall constructed to keep our thoughts private and the outer world out, is weakening. There may come a time when the panopticon, that old occult symbol of all-knowledge, is not just a mystical ideal but a brutal and omnipresent reality.