Abstract
Background: Artificial Intelligence (AI) is at the forefront of a revolution in several areas of society, including science, where it automates tasks and analyses complex data. One of the most notable advances is the development of AI prompts: instructions or questions formulated to elicit specific responses from an AI model. Applying these technologies to the writing of scientific articles promises to speed up and simplify the process, from initial conception to final formatting. This study analyses several AI tools, such as GPT, Bard, Monica, SciSpace, Consensus, and SciBERT, to explore their capabilities and limitations in assisting scientific production, such as rapid and systematic reviews.
Objectives: To investigate the effectiveness of several AI tools in supporting the writing of scientific articles, with the aim of improving the efficiency and quality of scientific production.
Methods: We conducted a meticulous benchmarking of AI tools specialized in scientific writing. The evaluation criteria were selected to cover the automation of bibliographic research, efficiency in generating coherent and accurate text, data analysis capacity, and ease of referencing and formatting documents. Each AI was subjected to a standardized set of tasks designed to simulate the various phases of scientific writing. The results of these tests were systematized in a comparative table, allowing identification of the tools most effective for specific research needs. It should be noted that all the platforms evaluated offer grammar correction and support the English language.
Results: The analysis revealed significant differences in the capabilities of the AIs examined. Tools such as GPT and SciBERT stood out for their efficiency in generating coherent text and analysing data, while platforms such as SciSpace and Consensus were notable for automating literature searches and organizing references. We also identified functionalities common to the tools, revealing a shared core of resources essential for scientific writing.
Conclusions: The choice of the most appropriate tool should consider the specific needs of the project and the researcher. An integrated approach that combines the strengths of multiple AIs is recommended to maximize the efficiency of the scientific writing process.