Google's new AI language models can comprehend entire books

One of the prime challenges for a language-based AI model is understanding the context of the surrounding content. To address this, Google has introduced a new model called Reformer, which can process the context of up to 1 million words using just 16GB of memory. The company built Reformer to solve the shortcomings of its older model, Transformer, a neural network that compares the words in a passage to each other to understand the relationships between them. Current models support understanding only a few lines or paragraphs before and after the text in focus. And because Transformer matches every pair of words, its memory use grows rapidly, quadratically, with the length of the text.
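To see why pair matching is so costly, and how Reformer's locality-sensitive hashing (LSH) idea reduces the cost, here is a minimal counting sketch. This is not Google's implementation; the function names and the bucket size of 64 are illustrative assumptions, and real LSH attention involves hashing, sorting, and multiple hash rounds beyond what is shown here.

```python
# Illustrative sketch (not Google's code): compare the number of
# token-pair comparisons in standard attention vs. LSH-style bucketing.

def pairwise_comparisons(n: int) -> int:
    # Standard Transformer attention compares every token with every
    # other token, so the score matrix has n * n entries: memory and
    # compute grow quadratically with sequence length n.
    return n * n

def lsh_bucket_comparisons(n: int, bucket_size: int) -> int:
    # Reformer-style LSH attention hashes similar tokens into buckets
    # and compares tokens only within the same bucket, so the cost
    # grows roughly linearly with n for a fixed bucket size.
    num_buckets = n // bucket_size
    return num_buckets * bucket_size * bucket_size

for n in (1_000, 100_000):
    print(n, pairwise_comparisons(n), lsh_bucket_comparisons(n, 64))
```

At 100,000 tokens the pairwise count is 10 billion comparisons, while the bucketed count stays in the millions, which is the intuition behind fitting very long contexts into 16GB of memory.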