The News-Letter’s Stance on Generative AI in Journalism: Upholding Integrity Over Efficiency
Generative AI has become a ubiquitous presence in domains ranging from corporate environments to educational institutions. At Johns Hopkins University, students are leveraging tools like ChatGPT for homework, creative writing, and class discussions. In response to this trend, the university has launched the Hopkins AI Lab, providing access to leading language models from companies like OpenAI, Anthropic, and Meta.
At The News-Letter, we recognize the advantages that AI can bring to our daily lives. It saves time, offers instant feedback, and can stimulate creativity. However, when it comes to journalism, we believe that the core values of our craft—accuracy, accountability, and ethical reporting—cannot be compromised for the sake of efficiency. For this reason, we have decided against utilizing generative AI in our writing and editing processes. Here are three fundamental reasons guiding our decision:
1. Accountability Matters
One of the primary benefits of generative AI is its ability to furnish detailed information on a wide range of topics. Yet AI systems make errors, and those errors can spread misinformation or inaccuracies that shape public perception.
Human journalists make mistakes too, but what sets them apart is their responsibility to address those errors. AI cannot own up to its shortcomings, nor can it pursue truth with the emotional intelligence that human reporters bring to their work. In a field where trust and accountability are paramount, allowing AI to take the reins would jeopardize the integrity of our reporting. Our community deserves news written by individuals who can engage with their audience sincerely and compassionately.
2. Copyright Concerns
The legal landscape surrounding generative AI is fraught with complexity, especially regarding copyright infringement. Recent high-profile cases, such as The New York Times' lawsuit against OpenAI, highlight the risks of using AI systems that draw on protected works without appropriate credit.
By relying on generative AI, The News-Letter would risk publishing content that is not wholly original, compromising our commitment to respecting copyright and properly crediting content creators. This dedication to originality is central to our mission of providing articles that genuinely reflect the voices of Hopkins students for the Hopkins community.
3. The Danger of Diluting Critical Thinking
Writing is more than just arranging words; it’s a critical thinking exercise that involves synthesizing information, distilling ideas, and crafting coherent narratives. Each piece published serves not only as information but also as an opportunity for deep engagement and discussion among our readers.
Utilizing generative AI could undermine this thoughtful process, reducing the opportunity for collaboration and dialogue with contributors. Our editorial process allows us to delve deeply into the ideas presented by our writers, fostering a rich exchange that enhances the overall quality of our journalism.
Upholding Local Voices
At its core, The News-Letter is committed to amplifying the voices within the Hopkins and Baltimore communities. Generative AI lacks the nuanced understanding and depth of experience that come from living and working within these specific contexts. AI-generated output does not capture a cohesive local narrative, nor does it engage meaningfully with our audience.
As generative AI continues to advance, the boundary between human and machine-generated content blurs. While AI holds promise in various applications, the risks—particularly in journalism—are too significant to overlook. Uncertainty regarding its ethical use and the potential erosion of journalistic integrity compels us to reject AI in our publication process.
The News-Letter will continue to prioritize authenticity and accountability, ensuring that the content we create is rooted in human experience, insight, and responsibility. We believe that the future of journalism lies not in automation, but in the genuine human connections that inform and enrich our storytelling.