
Since its launch, ChatGPT has been drawing attention to the increasingly sophisticated capabilities of writing tools powered by artificial intelligence (AI). Although AI tools can be a great help when conducting research and writing, it remains to be seen whether they can replace human authors.

While academia was still investigating the potential applications of AI tools, ChatGPT made its official debut in the world of scientific publishing: according to a recent report in Nature, ChatGPT was listed as a co-author on four scholarly papers. As AI tools like ChatGPT and other large language models appear more frequently in research papers, the Committee on Publication Ethics (COPE) emphasizes that AI tools should not be credited as authors of an article. Other organizations that share this view include the World Association of Medical Editors (WAME) and the JAMA (Journal of the American Medical Association) Network. In response to scientists' objections, journal editors, researchers, and publishers are debating the place of such AI tools in the published literature and whether it is proper to cite the bot as an author. Publishers are scrambling to create guidelines outlining the functions and obligations of such products.

AI technologies have proven extremely beneficial, but they are not without limitations. To generate output, AI systems rely on patterns in their training data, which means they are only as good as that data. Any biases or errors in the training data will be reflected in the tool's output, which can significantly affect the accuracy and trustworthiness of a study's conclusions.

There's no doubt that AI has made many tasks simpler, like answering inquiries, summarizing content, composing emails, and even exchanging clever banter. But let's face it: when it comes to editing, proofreading, and getting a book ready for publication, ChatGPT is about as effective as a fish trying to climb a tree.

Without a doubt, ChatGPT has been trained on a wide range of data and can identify common writing problems, including poor grammar and spelling. When it comes to spotting subtle flaws, however, such as style inconsistencies, unsuitable phrasing, or context-dependent grammatical errors, it is like a kid playing darts while wearing a blindfold. It also struggles to understand the context and the author's intentions, so a manuscript may receive suggestions from ChatGPT that are completely misguided, giving the impression that a machine created it. In short, even though ChatGPT is an excellent tool for generating material and providing basic feedback, it is advisable to leave getting your work publication-ready to human professionals. The potential effects of the academic community's over-reliance on AI technologies also need to be addressed.

Finally, the crucial question of whether AI tools will ever be properly regarded as "writers" in their own right needs to be addressed. While it is entirely conceivable to use AI technologies to generate writing, art, and other creative works, it is unclear whether the output produced by these tools can be regarded as "original".

Let's analyze this. When we talk about authorship, we mean the responsibility and accountability that come with having your name attached to a piece of work. An AI tool cannot take ownership of content submitted to a journal for publication: it lacks a conscience, a sense of right and wrong, and a sense of ownership over the outcome of the work. That alone disqualifies it from authorship.

Authorship involves more than just accepting responsibility for one's work, though. Conflicts of interest, copyright, and licensing arrangements are important aspects of the publication process. Because it is not a legal entity, ChatGPT cannot confirm the presence or absence of conflicts of interest, nor can it manage licensing and copyright agreements. These matters need to be understood, interpreted, and decided upon by humans.

What does this mean for authors who use AI technologies to write manuscripts, create graphics or other visual components for papers, or gather and analyze data? It means being open about which AI tool was used, and how, in the Materials and Methods (or a similar section) of the publication. This is essential so that readers and reviewers can understand how much the AI tool was used and what role it played in the final product. When it comes to a manuscript's content, however, the responsibility for any sections created with an AI tool still falls on you as the human author. Any transgression of publication ethics is your responsibility. Your reputation is on the line here!

What can we learn from this? First and foremost, it is crucial to be transparent about how AI tools are used in the publishing and academic processes. It is also important to remember that even if AI tools help authors create their work, the authors remain ultimately responsible for its content. Last but not least, we need to be wary of the effects of excessive dependence on AI technologies: technology cannot replace human judgment and decision-making. The ethics of authorship are complex and varied. While ChatGPT and other AI tools may not meet the requirements for authorship in research, they can undoubtedly help people who are struggling with their writing.
