A Google user in China types “Buddhism” and launches a search. A second user in France searches the same term, as does a third in Boston. Will all three get the same results? New Harvard research says not necessarily, finding that online search responses can vary significantly, and even conflict, depending on the topic and language of the query.
The variation is the result of a hidden “language bias” embedded in the search algorithms of Google, ChatGPT, YouTube, and Wikipedia, says Queenie Luo, who studies artificial intelligence ethics and early Chinese history. Luo is co-author of a paper with her Ph.D. adviser, Michael Puett, the Walter C. Klein Professor of Chinese History and Anthropology, and with Michael D. Smith, former dean of the Harvard Griffin Graduate School of Arts & Sciences, who now teaches at the Harvard Paulson School of Engineering and Applied Sciences (SEAS).
Read more at the Harvard Gazette.