
“We should not take computer scientists at their word that the paradigms for human emotions they have developed… can produce ground truth about human emotions.”

“Part of the reason is that machines are biased. Women, older employees, neurodiverse workers, and people of color are far more likely to be misread and mismeasured. What the algorithm flags as ‘disengagement’ may simply be fatigue, cultural difference, or, god forbid, a moment of quiet reflection. Yet those misreadings can influence performance reviews, promotions, and layoffs.”

Yeah. Computer scientists are in way over their heads on most AI applications.

