Artificial Intelligence Policy

“Artificial Intelligence” (AI) in this policy means computer- or software-based language systems, especially large language models (LLMs) like ChatGPT, Claude, Gemini and similar systems. Large language models generate text by mathematically predicting the most likely words in a sequence. They do so based on an analysis of very large databases of pre-existing human-written texts, and through this analysis they map the statistical relationships between words and phrases across those texts.

These language systems have three main uses:

  1. Transcription – Large language models can be trained to take in audio of human speech and output a text transcript of that speech.
  2. Recognition and translation – LLMs can recognize text in a variety of formats, and in a variety of languages, and translate or transpose that text into other formats and other languages.
  3. Text generation – Given a prompt, LLMs can produce new text. Here they treat the prompt as the start of a text, and the new text they produce is the statistical output of their prediction of the next most likely words.

The Manchester Mirror has made limited use of large language models in the first two categories, but has not used, and will not use, LLMs in the third way.

In the first case, recording and transcribing audio is a long-standing journalistic practice. Machine transcription has been widely used in the industry for years, and LLM-based transcription is especially accurate and efficient. Every Mirror transcript is checked by the writer against the original audio, and the original audio is preserved as a record.

In the second case, LLMs are very good at recognizing text in non-text and/or non-editable file formats. The Mirror is often sent text in these formats. For example, we receive tables of names in PDFs where it is either impossible or impractical to extract the names by hand. LLMs can easily parse those files and output raw text, which we then check against the original for accuracy.

In the third case, The Manchester Mirror has not used, and will not use, an LLM to produce new text. We will also not knowingly accept submissions that were generated, in whole or in part, by an LLM.

The core mission of The Manchester Mirror is to collect and present factual and true information. The first two instances, when LLMs are used as support tools, advance this mission. They make gathering and presenting factual and true information faster and more efficient, allowing us to bring you more information in greater detail. This is especially important when direct quotes of public figures are needed to clarify events in our community.

By contrast, text that is generated by a mathematical prediction system like an LLM can be neither factual nor true.

Any “fact” produced by the system will be incidental, and will always have to be checked by a human. This is neither fast nor efficient. The pernicious reality of these systems is that they are trained to be persuasive first and accurate second. The consequence is that they are very good at, and often can’t avoid, inventing facts that “sound good,” ESPECIALLY when the real facts are absent. One convincing lie embedded among ten, fifty or a hundred facts makes the whole thing useless.

But more importantly, a computer system cannot produce truth. Humans are expressed through language. When we read something written by a person, or when we read a quote spoken by a person, we encounter that person. The encounter between a reader and a writer, or between a listener and a speaker, is truth. If we receive a statement from a public official that was generated by an LLM, it is less than meaningless. It does not and cannot correspond to anything happening inside of a person. So it has no value.

If you need help composing any piece of writing for the paper, please reach out to us. We will support and assist you.

A newspaper should work toward factual accuracy and human truth. The Manchester Mirror will use any tools that advance this mission, and we will reject any systems that impede it.

Adopted July 26, 2025

An Addendum on Facts and Truth

by Fritz Swanson

Both facts and truth are knowledge. Knowledge is any statement which corresponds to reality, where the speaker understands how it corresponds. This is a principle called Justified True Belief.

Knowledge is first an expression of a person. People know things. Rocks don’t know things.

Knowledge is an expression of both observation and reason. If I say, “I have a car,” that statement rests on my observation that there is a car behind my house, on my knowledge that my signature is on the car’s legal title, and on my understanding of what a title is and how car ownership works. So, even a simple fact statement like “I have a car” rests on some pretty complicated metaphysics that happens inside of me.

So, to know a fact is a special human quality. When we report something, it’s because we know a fact.

Truth is any fact where part of our knowledge of it rests on reasonable faith. So I can know that I own a car in a factual sense. But the statement “My mother loves me” is a truth rather than a fact because part of it rests on my faith in her.

Truth is also a reportable thing in journalism, but it is more subtle to capture. It’s why accurate quotes and statements from public figures are so important to reporting. It’s why reporting on events over time is so important. Because we aren’t just capturing facts alone. We are also trying to ascertain hidden but real things about people and events.

If we say “Politician A is a good person” that statement may be true, but we can only know it to be true by comparing that person’s statements and actions to our own sense of morality over time. And yet the “goodness” of a public official is as important to journalism as the numbers on the budget. More so in fact. But the goodness of any public person can’t be checked directly the way we check budget numbers, or car ownership. It’s a real thing, but it requires a lot of data points, a lot of judgment, and a certain amount of faith. This is because it is a hidden thing. A truth is a fact where some key element is permanently hidden, and therefore must be deduced through reason.

A computer can’t reason, and it can’t have faith. It can’t “know” things. So even if it contains statements that correspond to reality, we can’t say that these things are either facts or truth. They only become facts and truth in the mind of a reasoning person. Therefore a computer can’t generate useful text. It can capture text for us to use. But it can’t generate it. This is currently a definitional reality, as we understand it.

An Addendum on the Addendum

by Fritz Swanson

We normally wouldn’t delve into metaphysics when describing a policy for the paper. But the arrival of text-generating machine systems like Large Language Models has revealed a society-wide confusion about what writing is, what facts are, and what we are as people.

It is not enough to say what we will or will not do with AI. We feel that it is important to explain WHY. Because it is clear that throughout society there are elements of reality which have never been explained or explored. Whether we are the best voice to do that work is beside the point.

This policy is our attempt as an organization to clarify our thinking amid all of this confusion. We are open to comment and correction. We believe that humans in conversation can get better, and know more. We don’t assert here that we have a definitive answer. But we want to demonstrate that we have given this matter due consideration.