UNC journalism professors contend with artificial intelligence – INDY Week


UNC-Chapel Hill’s Carroll Hall sits just off the main quad and is home to the Hussman School of Journalism and Media. Twelve granite steps lead up to its shallow east-facing portico shaded by a pediment held aloft by six Ionic columns, a nod to the Greeks. 

On this sunny spring day, the building’s would-be journalists are away, out for spring break. On one side of the lobby the words of the First Amendment are carved into the wall. On the other is the Hussman credo, which reads in part, “The pursuit of truth is a noble goal of journalism. But the truth is not always apparent or known immediately.” 

You can say that again.

Given the advances in generative artificial intelligence (AI) and its rapid deployment across the media landscape, this doesn’t seem likely to change anytime soon. For those charged with teaching the craft of journalism, the question is how to help the next generation of reporters navigate this seismic shift in technology while honoring their historical role: giving readers, listeners, and viewers the information they need to parse out the “truth.”

To understand how educators are tackling this challenge, the INDY spoke with several prominent journalists, including Hussman dean Raul Reis, about how AI is affecting the teaching and practice of the profession. The application of AI has broad implications, especially in this year’s presidential election. Going forward, the role of reporters in supplying facts and sussing out misinformation and disinformation remains as crucial as ever. 

Fastest-growing consumer application

Introduced by OpenAI in November 2022, the generative AI–driven chatbot ChatGPT quickly became the fastest-growing consumer application in history, according to the investment firm UBS. But what exactly is meant by generative AI? 

Put that query to ChatGPT and here’s what you get: “Generative AI refers to a class of artificial intelligence techniques that are designed to generate new data or content that resembles and often expands upon the patterns and characteristics of existing data. These systems learn from a dataset and can create new instances of data that are similar to the examples they were trained on.”

Specialized processors, better algorithms, and access to massive data sets are among the developments that make all this possible. Its impact is being felt everywhere. To again cite the chatbot, “It [generative AI] has shown promising results in creating realistic images, generating human-like text, synthesizing music, and much more. However, it also raises ethical concerns, particularly regarding the potential for generating misleading or harmful content, as well as issues related to intellectual property and privacy.”

Hussman’s Reis was named dean in July 2022, four short months before ChatGPT burst on the scene. Prior to joining UNC, Reis spent six years as dean of the School of Communication at Emerson College in Boston and before that, as dean of the School of Journalism and Mass Communication at Florida International University in Miami. He has worked as a reporter for both U.S. and Brazilian news organizations.

UNC Hussman Dean Raul Reis Credit: Photo courtesy of the subject

“Since I became an academic it’s been a nonstop train of emerging technologies,” says Reis. He places AI in the continuum of other new technologies that have disrupted the profession over the years, different in scale, possibly, but not in type. 

“I think we [Hussman] can take a leadership position in the way AI is adopted,” Reis says of the school’s role in helping manage this transition. “We want to be part of the process of determining what is the future of the industry and how the industry is disrupted.”

Reis points out that UNC’s journalism school has a long history of innovation in data-driven journalism, pioneered by the recently deceased Phil Meyer, who has been called the “father of computer-assisted reporting.” To that end, Reis anticipates working closely with other schools within UNC, including the data science and business schools. 

In considering how AI might play out in the classroom, Reis offers nothing revolutionary but rather is inclined to emphasize the basics. 

“We teach critical thinking,” he says. “Not only data gathering, but how do you process that data? How do you separate truth from hallucinations [i.e., incorrect predictions resulting from the use of incomplete or biased data]?”

The last major disrupter, the internet, laid waste to a broad swath of the journalism profession, and newsrooms around the world continue to feel its impact. What was true in the pre-internet age is even more true in the age of AI: misinformation spreads across a growing range of news and social media platforms, burying facts under an avalanche of data-driven deepfakes and further undermining trust in an industry whose credibility, according to research from Gallup, is already at or near an all-time low.

In this context, AI is seen simultaneously as both a powerful tool and an existential threat. That dichotomy was captured in a recent internal memo to employees from Mathias Döpfner, the CEO of the German publisher Axel Springer, which owns Politico among other outlets. 

“Artificial intelligence,” Döpfner wrote, “has the potential to make independent journalism better than it ever was—or simply replace it.” 

Shannon McGregor, an associate professor at Hussman and a principal investigator with the Center for Information, Technology, and Public Life, holds a less Manichean view. 

“It’s another technology but because it’s so new we tend to get this great excitement and also this moral panic,” McGregor says. “This is a cycle we tend to get over and over again—it’s the great hope, but also it’s going to ruin everything. Like many other technologies, most of the time it’s going to be used in pretty benign ways, but there will always be the outlier cases that can have these huge impacts.”  

Former Hussman professor Chris Roush, who recently stepped down from his role as dean of the School of Communications at Quinnipiac University to return to practicing journalism, says that in one sense, “everyone becomes an editor for AI” when it is used as a research tool. 

This can be good or bad. On the bad side, reporters can become increasingly dependent on a less-than-transparent technology to source the facts of a story or, with generative AI, to create the story itself. But AI can also be useful, summarizing lengthy documents, analyzing historical data, and writing reports, for example. 

“If AI can write basic stories, it frees journalists up for longer, more analytical pieces that are more important to society,” Roush says.

Transparency is clearly an important issue, as is disclosure. But transparency in the sense that a reader understands the process through which AI derives a result is not really possible in many instances. Disclosure, too, has its own set of challenges. McGregor questions how journalists can gauge the level of disclosure they should use when they’ve employed some type of AI. 

“A lot of that is an open question,” she says. “On the one hand, you could say that disclosure in any use case makes sense. But I don’t disclose when I’ve used the grammar editor on [Microsoft] Word.” 

Everyone is worried

This year’s focal point for those worried about disinformation and misinformation is likely to be the U.S. presidential election, but the issue is one of global concern. In a survey conducted at the World Economic Forum in Davos, Switzerland, in January, attendees agreed that “false or wrong information poses the biggest danger to the world in the next two years,” according to a story in Bloomberg. This beat out other perennial worries such as extreme weather and involuntary migration.

“At Bloomberg, AI is probably the most anticipated and discussed subject, and that’s not going to change,” says Matthew Winkler, who cofounded Bloomberg News in 1990 and now serves as editor in chief emeritus. UNC Hussman has partnered with Bloomberg since 2017 to give students from diverse backgrounds the opportunity to learn the fundamentals of journalism.

Matt Winkler of Bloomberg News Credit: Lori Hoffman/Bloomberg News

There’s general agreement that adhering to what has traditionally been considered best practices will go a long way toward inoculating the industry against potential AI-driven mischief. 

“If you obey best practices, you’re aware of how your reporting can go awry,” says Winkler. “Is it transparent? Do I trust this data? How do I know this is true?” 

“Journalists have to pay attention to second sourcing and verification when there is the potential that things have been manipulated,” says McGregor, noting that this will be particularly important around breaking news, where there is intense time pressure and “supercharged” issues. As in other areas, AI can cut both ways, making it easier to fact-check in some instances, for example, while in others “it [AI] makes it harder to discover the provenance of things,” she says.

Born in Indiana

The ability to sort through massive amounts of data to identify patterns—and through these patterns some kind of useful knowledge—is the defining feature of basic AI. Generative AI takes this a step further, creating new content by learning from the data. Almost by definition, these tasks aren’t replicable by humans in any practical time frame; the data sets are too big. 
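To make that pattern-learning idea concrete, here is a minimal sketch in Python, a toy word-level Markov chain. It is a vastly simplified stand-in for systems like ChatGPT, not a description of how they actually work: it learns which words tend to follow which in a training text, then generates new text that resembles what it was trained on.

```python
import random
from collections import defaultdict

# Training data: the "patterns" the model will learn from.
corpus = "the pursuit of truth is a noble goal of journalism".split()

# Learn the pattern: map each word to the words observed after it.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length, seed=0):
    """Generate up to `length` words by repeatedly sampling a
    plausible next word, based on the learned pattern."""
    random.seed(seed)
    words = [start]
    for _ in range(length - 1):
        options = follows.get(words[-1])
        if not options:  # no observed continuation; stop
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the", 6))
```

The output is new in the sense that it was not copied verbatim, but it can only recombine what was in the training data, which is also why such systems confidently produce fluent text that may be wrong: they model patterns, not facts.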

There is an old saying in the technology industry that “software is its own proof.” In other words, if it works, it works. But no code or algorithm can fully anticipate every circumstance in which it might find itself, and when it fails, there is often no traceable explanation for why. That opacity can be a problem, as anyone who has seen those videos of automated cars driving into a dead-end street over and over again can attest.

“Artificial intelligence has the potential to make independent journalism better than it ever was—or simply replace it.”

Mathias Döpfner, Axel Springer CEO

“ChatGPT continues to insist that I was born in Indiana,” Winkler says of an exercise in which he asked the algorithm to perform what should have been a simple task: pulling up his bio. (He was born in New York City.) “So if you’re a journalist and you think AI is going to do the work for you, forget about it,” he says. “AI cannot replace the journalist if we care about the things that matter.”

Presumably, ChatGPT will eventually discover where the Bloomberg News cofounder was born, but Winkler makes a larger point about how the profession needs to be taught in the world of AI. 

“Journalism schools have a big role to play in teaching the people who are going to lead our profession that the one thing that has never gone out of style is the need to verify whatever you report before you share it,” he says. “If it’s unique, it can’t possibly be taken at face value until we know why it’s unique. When something is three or four standard deviations from normal and there hasn’t been some biblical event, it has to be treated with great care and suspicion.” 

First, let’s fire all the reporters

In this world, the journalist remains an indispensable actor. But if reporters are more needed than ever, you wouldn’t know it from recent news coming out of the industry. January of this year saw massive layoffs, with more than 500 journalists losing their jobs, according to a report from Challenger, Gray & Christmas, an employment consulting firm. Some of this reflects the continued impact of the internet on business models, but the rapid adoption of AI across newsrooms is almost certainly a factor as well.

JournalismAI—a global initiative from the London School of Economics think tank Polis that’s supported by the Google News Initiative—recently surveyed more than 120 editors, journalists, technologists, and media makers from 105 large and small newsrooms around the world on their use of AI. In its report on the research, the organization describes the growth as “explosive.” 

Eighty-five percent of the survey’s respondents indicated they have experimented with generative AI technologies to author content and better interact with audiences, among other applications. More than 80 percent expected the use of AI to expand over the near term. Current uses included news gathering (75 percent), news production (90 percent), and news distribution (80 percent).

As to generative AI, those polled cite potential uses that include exploring new angles and perspectives that had not been considered before, generating hypotheses and scenarios that had not been previously considered, and creating more personalized content for readers. 

“Generative AI can enhance our journalistic skills and values and empower us to produce more relevant and impactful stories in ways that we can’t even imagine,” said one survey respondent.

Of course one individual’s personalized content is another’s news silo, so there are potential downsides as well. Sixty percent of those surveyed expressed concern about the “ethical implications of AI integration for editorial quality and other aspects of journalism,” including “values like accuracy, fairness, and transparency.” Debiasing—which goes to the sourcing of data by AI as well as the algorithms that underlie the technology—was cited as a major worry. 

There was also this: the concern that generative AI has made possible the “production and distribution of disinformation at a scale we haven’t seen before,” coupled with the inability to cross-check a wholly manufactured story. 

“AI-generated content is trickier to debunk because there’s no reference material to cross-check it with,” said another survey respondent. “It’s completely a work of fiction as opposed to a photo that has been manipulated—with this kind of disinformation you at least have an original photo with which to compare the false version.”  

Better, deeper fakes, in other words.

A critical role

Responsible, public-interest journalism plays a critical role in the health of society. If the JournalismAI survey is any indication, those in the industry believe it will continue to evolve and survive and that its uniquely human qualities will allow it to find a way to coexist with the increasingly humanlike intelligence of generative AI and whatever comes next. 

“News organizations have shown remarkable resilience and innovation in sustaining and sometimes thriving despite the challenges they have faced,” the JournalismAI survey concludes. “It might even be that in a world where gen AI is such a power, for ill as well as good, public interest journalism will be more important than ever.” 

Winkler, for one, maintains his faith in the role that journalists have to play. 

“If you’re interested in the pursuit of the truth, AI is not going to get you there today, tomorrow, or anytime soon,” he says.  

But of course, not everyone is interested in the pursuit of truth.

Mike MacMillan is a freelance writer based in Chapel Hill. He writes regularly on financial issues at his Substack, “Ask Archer.” 

Comment on this story at [email protected].
