Researchers create a system that translates brain signals directly into text




Researchers have developed a system that translates brain signals directly into text, setting the stage for a kind of "speech prosthesis" that could convert thoughts into text via a computer. Joseph Makin, a researcher at the University of California, told The Guardian: "We are not there yet, but this may be the beginning of a speech prosthesis."

The team described the system in a research paper in the journal Nature Neuroscience. The researchers recruited epilepsy patients who already had electrodes implanted in their brains to monitor their brainwaves, according to the Emirates Future Observatory website. Participants were asked to read texts aloud while their neural activity was recorded, and the resulting database was used to train an algorithm to decode the brain signals, even when the participants did not read the sentences aloud.

But the system has a weakness: it only works on the sentences it was trained on. Within that constraint, its accuracy is excellent, with an error rate of only about 3%.
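This closed-set limitation can be illustrated with a toy sketch. The following is not the researchers' actual model (which is a trained neural network); it is a made-up nearest-neighbor decoder over invented "neural signatures", showing why such a decoder can be very accurate on its training sentences yet can never output anything outside that set:

```python
import math
import random

random.seed(0)

# Hypothetical setup: each trained sentence gets a characteristic
# "neural signature" (random vectors standing in for recorded
# brainwave features; purely illustrative).
TRAINED_SENTENCES = [
    "those musicians harmonize marvelously",
    "the woman is holding a broom",
    "a little bird is watching the commotion",
]
signatures = {s: [random.gauss(0, 1) for _ in range(16)]
              for s in TRAINED_SENTENCES}

def decode(signal):
    """Return the trained sentence whose signature is closest to the signal.

    The decoder can only ever output one of TRAINED_SENTENCES --
    the closed-set limitation described in the article.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TRAINED_SENTENCES, key=lambda s: dist(signatures[s], signal))

# Simulate a noisy recording of the second sentence.
target = "the woman is holding a broom"
noisy = [x + random.gauss(0, 0.1) for x in signatures[target]]
print(decode(noisy))
```

Because the output is always the nearest trained sentence, accuracy on the trained set is high, but a genuinely new sentence can never be produced.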

On the other hand, OpenAI, a research company co-founded by tech entrepreneur Elon Musk, created an artificial intelligence system called GPT-2 that can generate human-like text. It was announced in February, but the company withheld the full model at the time over concerns that it might be used for spam and fake news.

According to the British newspaper Daily Mail, the model has now been released despite these concerns, which center on the fact that the underlying artificial intelligence can take a snippet of text and extrapolate that small prompt into a larger document.

The worry is that the model can produce a fairly convincing fake news story from a headline a person enters into the machine, and its abilities extend to more creative forms of writing, such as poetry and aphorisms, that usually distinguish human writing from machine writing.

GPT-2 was trained on 8 million documents and is able to generate surprisingly coherent and compelling text.
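GPT-2 itself is a large Transformer model and is not reproduced here, but the basic idea of extending a prompt word by word can be sketched with a toy model. Everything below (the continuation table and the sentences) is invented for illustration; a real language model learns these continuations from its training documents instead of a hand-written table:

```python
import random

random.seed(1)

# Toy next-word table standing in for a learned language model.
# GPT-2 learns continuations like these from millions of documents;
# this table is hand-written purely for illustration.
NEXT = {
    "the": ["robot", "story"],
    "robot": ["writes"],
    "story": ["spreads"],
    "writes": ["a"],
    "a": ["story"],
    "spreads": [],
}

def generate(prompt, max_words=8):
    """Extend a prompt one word at a time, sampling a continuation each step."""
    words = prompt.split()
    while len(words) < max_words:
        options = NEXT.get(words[-1], [])
        if not options:  # no learned continuation: stop generating
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the robot"))  # prints "the robot writes a story spreads"
```

The principle is the same as in GPT-2's text generation: given a prompt, repeatedly predict a plausible next token and append it, which is what lets a short headline grow into a full document.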
