Apple is studying mood detection using iPhone data. Critics say the tech is flawed

New information about an ongoing study between UCLA and Apple shows that the iPhone maker is using facial recognition, speech patterns, and an array of other passive behavior tracking to detect depression. The report, from Rolfe Winkler of The Wall Street Journal, raises concerns about the company’s foray into a field of computing called emotion AI, which some scientists say rests on faulty assumptions.

Apple’s depression study was first announced in August 2020. Previous information suggested the company was using only certain health data points, like heart rate, sleep, and how a person interacts with their phone, to understand their mental health. But The Wall Street Journal reports that researchers will monitor people’s vital signs, movements, speech, sleep, and typing habits, even the frequency of typos, in an effort to detect stress, depression, and anxiety. Data will come from both the Apple Watch and the iPhone, utilizing the latter’s camera and microphone. Data obtained through Apple’s devices will be compared against mental health questionnaires and cortisol-level data (ostensibly retrieved from participants’ hair follicles).

Apple is also participating in studies that aim to detect cognitive decline and autism in children based on iPhone and Apple Watch data, according to The Journal. These studies are an extension of Apple’s existing interest in individual health. The company has invested heavily in tracking exercise, sleep, hearing, physical stability, menstruation, diet, and other markers of a person’s daily health. It even integrates medical records and can send a health report to your doctor.

Read more at Fast Company

The Starset Society


Have something to share? Become a Starset Society Contributor today.