Think About It: Breaking news, AI caught hallucinating!
Sequim is fortunate to have people with diverse backgrounds living in and around it. Many come and join the pioneer families and long-term locals (I do not know how long we must live here to be considered true locals) to create a teeming environment of diversity in expertise, education, experience and values.
Greater Sequim has attracted people who vacationed here and found the area equally diverse in opportunities to enjoy the beauty of mountains and sea, all while standing in one place.
People who come here to retire after a long career typically arrive with enough money to settle and are good for our economy. Many also bring a wealth of knowledge, skill and experience built over years of working in their chosen field.
I had the opportunity to talk with a recent transplant who moved here three years ago. Editor Mike referred him to me for a possible column after hearing about his recent experience with AI (artificial intelligence).
I was especially interested when I learned about his odd experience with AI, one that alarmed him about the potential for misinformation, distortion and confabulation, with untold adverse consequences, intended or not.
The fact that he has a substantial background in AI added to the allure of the story. He held a leadership position in AI development in his industry, and he discovered and used an industry-appropriate AI product that greatly enhanced the quality and efficiency of decision making.
I immediately adopted him as an AI-expert for this and future columns about this complicated and worrying innovation that is becoming an influence in our lives. I will refer to this data scientist as AI-expert throughout this column.
Who am I?
The story starts when AI-expert decides to use Microsoft’s Co-pilot to search for himself.
Those of us who have Microsoft Edge know that most searches come with the option of asking Co-pilot to enhance the search. I have not used it enough to comment at length, but on the few occasions I have, Co-pilot produced a narrative intended to answer the question.
He entered his name into Co-pilot, and out came a person he did not recognize.
The description began by identifying AI-expert as an active resident of Sequim … “well known for his Christian faith, support for (one of the two candidates running to be our next president) …”
Then Co-pilot invited the reader to “explore perspectives related to (him).”
What followed was Co-pilot’s report of AI-expert’s point of view on a community issue expressed in his letter to the Sequim Gazette editor, including a supportive opinion from another letter writer and an unsupportive opinion from a third.
Then Co-pilot added “thoughts” related to a different letter written by AI-expert on a different day and claimed he was a Canadian who visited Sequim and wrote that George Floyd’s arrest history must be considered for a “measured perspective.”
The information Co-pilot provided about AI-expert’s identity is wrong in important areas. He does not support the candidate mentioned, and he is a Christian but not well known for it. He is not Canadian, and he has never written an opinion about George Floyd.
Presenting pros and cons on an opinion AI-expert expressed, as “perspectives related to” him, all based on a single letter to the editor, was a big lift by Co-pilot.
AI-expert gave Co-pilot a second chance and entered his name again. The description started with a claim that he had been a resident of Sequim for about 35 years and, later in the piece, that he was a Canadian who lived across the border in Vancouver, B.C., and visited Sequim often. The piece ended with the impression he was a Sequim influencer.
It looks like Co-pilot doubled down on its prior mistakes the second time. We already know AI-expert has only been in Sequim three years and is not Canadian. Neither description tells us much about the real, living AI-expert.
A warning!
AI-expert tells me that mistakes by artificial intelligence are called “hallucinations” in the AI industry. It turns out that using a term usually applied to humans is intended to humanize AI.
With all due disrespect for the idea, I laughed out loud for a long time when I heard that. Shouldn’t it get an MRI, therapy or at least medication? I wondered.
Reeducation might be a better recourse, given that Co-pilot contradicts itself within a few sentences.
The main source of the hallucination was Co-pilot taking information from letters to the editor and not knowing when to stop. Co-pilot used information from a letter by a Canadian who lives in Vancouver, B.C., and from another letter that referenced George Floyd, both of which followed AI-expert’s letters.
All that aside, what bothers AI-expert and now me is that those mistakes are about a human, are on the internet for life and do more than state facts about that human.
AI-expert has little concern about any impact on his life. I agree since he is barely recognizable in the description.
Rather, he is concerned by what appeared to be an intention to interpret or characterize views that may be controversial and may not be true, instead of describing a person with an objectivity well supported by the data.
His Co-pilot overview included a focus on the political, cultural, and philosophical divisions and tensions in our country.
To what end? AI-expert asks.
Why are the great divides part of our identity, part of our search?
What happens when Co-pilot becomes more sophisticated, and we can no longer identify the hallucinations?
Is part of our role in monitoring its development to call out the intentions built into AI from the very beginning?
AI-expert thinks so.
He informed Microsoft of his experience; its responses have been kind but unhelpful.
“I wouldn’t be surprised if it is an AI response,” he said.
Bertha Cooper, an award-winning featured columnist with the Sequim Gazette, spent her career years in health care administration, program development and consultation and is the author of the award-winning “Women, We’re Only Old Once.” Cooper and her husband have lived in Sequim more than 25 years. Reach her at [email protected].