PDF: ChatPDF reads documents and answers questions – 05/15/2023 – Tech
Since March, the ChatPDF platform has offered a personalized document summary service. The user uploads a PDF file to the website, and the artificial intelligence (AI) automatically delivers a synopsis and suggested questions.
On Twitter, users recommend the tool for speeding up academic work and studying. The platform's website claims it is effective for analyzing corporate documents, such as balance sheets and sales reports.
Like the famous text-generating AI ChatGPT, however, ChatPDF invents facts and people. This behavior worsens the more obscure and more recent the subject of the text under analysis is.
ChatPDF generates its responses using technology from OpenAI, the creators of ChatGPT. The information was given by platform co-creator Mathis Lichtenberger to Folha.
For texts that are highly cited on the internet, with established concepts and well-known characters, the tool quickly delivers accurate summaries with reference to the pages of the document.
The Twitter profile IA Explorador gained more than 5,000 likes promoting ChatPDF as a “productivity tool for students.”
Users can submit up to three files — or 120 pages — per day for free. To support the platform, users can subscribe for $5 (R$24.55); the paid version of the service supports 50 files or 2,000 pages daily.
ChatPDF uses the document's content as context for the questions it sends to OpenAI's artificial intelligence, so the answers stay tied to the PDF. When the document contains no answer to the question, that context imposes little constraint; hence the greater risk of "hallucinations".
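The grounding step described above can be sketched as "context stuffing": text extracted from the PDF is prepended to the user's question before the combined prompt is sent to a language model. The function and variable names below are illustrative assumptions, not ChatPDF's actual implementation.

```python
def build_prompt(pdf_chunks, question, max_chars=4000):
    """Fit as many document excerpts as the budget allows, then append the question."""
    context, used = [], 0
    for chunk in pdf_chunks:
        if used + len(chunk) > max_chars:
            break  # stop once the context budget is exhausted
        context.append(chunk)
        used += len(chunk)
    joined = "\n\n".join(context)
    return (
        "Answer using only the document excerpts below.\n\n"
        f"Excerpts:\n{joined}\n\n"
        f"Question: {question}"
    )

# Hypothetical excerpts standing in for text extracted from a PDF:
chunks = ["Page 1: Revenue grew 12% in 2022.", "Page 2: Net profit fell 3%."]
prompt = build_prompt(chunks, "How did revenue change?")
```

When the excerpts actually contain the answer, the model can quote them; when they do not, the prompt gives the model little to anchor on, which is exactly the hallucination risk the article describes.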
Prosecutor Hélio Telho tried to use the tool to summarize a criminal sentence involving several defendants and accusations. “The summary reversed some convictions – which were actually acquittals – and including convictions on charges that had not even been filed.”
To mitigate the artificial intelligence's confabulations, ChatPDF co-creator Mathis Lichtenberger suggests that users ask the platform to mention only information present in the text, or to cite its sources.
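Lichtenberger's suggestion amounts to adding a guardrail instruction to the prompt. The wording below is an assumption for illustration, not ChatPDF's actual system prompt.

```python
# Hypothetical guardrail instruction prepended to the user's question.
GUARDRAIL = (
    "Use only information present in the provided excerpts. "
    "Cite the page number for every claim. "
    "If the excerpts do not contain the answer, say so."
)

def guarded_question(question):
    """Wrap a user question with the anti-hallucination instruction."""
    return f"{GUARDRAIL}\n\nQuestion: {question}"
```

Instructions like these reduce, but do not eliminate, invented answers; the model can still fabricate citations when the source text is unfamiliar to it.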