posted by:
Jason E. B. , On: Jan. 26, 2023, 9:48 p.m.
Subject: ChatGPT is fun, but not an author
Content: H. HOLDEN THORP
SCIENCE
26 Jan 2023
Vol 379, Issue 6630
p. 313, DOI: 10.1126/science.adg7879
More worrisome are the effects of ChatGPT on writing scientific papers. In a recent study, abstracts created by ChatGPT were submitted to academic reviewers, who caught only 63% of these fakes. That’s a lot of AI-generated text that could find its way into the literature soon.
For years, authors at the Science family of journals have signed a license certifying that “the Work is an original” (italics added). For the Science journals, the word “original” is enough to signal that text written by ChatGPT is not acceptable: It is, after all, plagiarized from ChatGPT. Further, our authors certify that they themselves are accountable for the research in the paper. Still, to make matters explicit, we are now updating our license and Editorial Policies to specify that text generated by ChatGPT (or any other AI tools) cannot be used in the work, nor can figures, images, or graphics be the products of such tools. And an AI program cannot be an author. A violation of these policies will constitute scientific misconduct no different from altered images or plagiarism of existing works. Of course, there are many legitimate data sets (not the text of a paper) that are intentionally generated by AI in research papers, and these are not covered by this change.
Most instances of scientific misconduct that the Science journals deal with occur because of an inadequate amount of human attention. Shortcuts are taken by using image manipulation programs such as Photoshop or by copying text from other sources. Altered images and copied text may go unnoticed because they receive too little scrutiny from each of the authors. On our end, errors happen when editors and reviewers don’t listen to their inner skeptic or when we fail to focus sharply on the details. At a time when trust in science is eroding, it’s important for scientists to recommit to careful and meticulous attention to details.
The scientific record is ultimately one of the human endeavor of struggling with important questions. Machines play an important role, but as tools for the people posing the hypotheses, designing the experiments, and making sense of the results. Ultimately the product must come from—and be expressed by—the wonderful computer in our heads.