
“Given that we suffer so much fake news and disinformation spreading, there is a benefit to these technologies.”

Image by standret on Freepik

A polygraph test ostensibly measures a person's breathing rate, pulse, blood pressure, and perspiration to determine whether they're lying, though the 85-year-old technology has long been debunked by scientists. The risk of false positives and the subjectivity involved in interpreting results greatly undermine the polygraph's usefulness as a lie detector. Tellingly, its results are generally not admissible in US courts.

Because it's 2024, researchers are now asking whether artificial intelligence might help. In a new study published in the journal iScience, a team led by University of Würzburg economist Alicia von Schenk found that yes, it just might. But as MIT Technology Review reports, the tool also led experimental subjects to make more accusations overall, in yet another warning about the far-reaching risks of replacing human intuition with algorithms.

First, the researchers asked participants to write down statements about their weekend plans. Those who lied about their plans without being found out received a small financial reward.

The collected statements were then used to train an algorithm based on Google's large language model BERT. The scientists found it could tell whether a given statement was a lie with a success rate of 67 percent, a significant improvement over humans, who tend to get it right only about 50 percent of the time.

Von Schenk and her colleagues then offered a separate group of volunteers a choice: pay a small fee to use the algorithm to detect lies, or rely on their own human intuition.

Only a third of the volunteers were willing to use the tool, but those who did became power users.

