Google website offers new way to discover books and fun way to play with words


Google has launched a new website called Semantic Experiences. The site shows how Google is taking both word play and serious research to new levels, and it all rests on something called semantic search.

Ray Kurzweil, a Google director of engineering, wrote in his "The Kurzweil Accelerating Intelligence" newsletter that "Semantic search is based on searching meaning, rather than on keywords or phrases."

Developed with machine learning, "it uses 'natural language understanding' of words and phrases," he said.

On that note, the Semantic Experiences website offers two interactive tools: one called Talk to Books, the other called Semantris.

The latter is an amusing word-association game you can play for points, and one that is difficult to play just once; you will want to revisit it. The game comes in two modes, Arcade and Blocks, and you can try each. It also shows how well Google researchers have learned to predict semantically related words.

"When you enter a word or phrase, the game ranks all of the words on-screen, scoring them based on how well they respond to what you typed," said the Google Research blog. You can select words that are alike, opposite or neighboring concepts. Semantris' semantic model takes any of these.

"You may enjoy exploring how obscure you can be with your hints."

Abner Li in 9to5Google said he found the Blocks mode fun, "especially when you try to be opaque and see whether the natural language understanding can make even the most minor of connections."
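As a rough way to picture the mechanic, ranking words against a clue amounts to comparing embedding vectors. Below is a minimal sketch, not Google's implementation; the embed() stub is a hypothetical stand-in for a trained semantic model.

```python
import numpy as np

# Hypothetical stand-in for a trained semantic encoder; the real game uses
# a learned model. This stub returns arbitrary unit vectors (stable within
# a single run only), so the printed ranking is illustrative, not semantic.
def embed(text):
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(512)
    return vec / np.linalg.norm(vec)

def rank_words(clue, onscreen_words):
    """Score each on-screen word by cosine similarity to the typed clue."""
    clue_vec = embed(clue)
    scored = [(word, float(np.dot(clue_vec, embed(word)))) for word in onscreen_words]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# With a real encoder, "cat" would rank highest for the clue "feline".
print(rank_words("feline", ["cat", "bridge", "piano", "dog"]))
```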

Talk to Books is designed to be a useful tool that can help people find books. Brittany Roston at SlashGear noted that Talk to Books lets users enter a natural-language phrase into a text box. "The system searches books for similar phrases, showing users which books contain them."

Consider, then, that you have a research question but no verbatim quotation to go by, no book title, no author names, just the question itself, and a need to know which books have touched on it. No problem.

"Talk to Books is an entirely new way to explore books by starting at the sentence level, rather than the author or topic level," said Kurzweil and Rachel Bernstein in the Google Research Blog.

Why "Talk" to Books?

Google Research Blog: "You make a statement or ask a question, and the tool finds sentences in books that respond, with no dependence on keyword matching. In a sense you are talking to the books, getting responses which can help you determine if you're interested in reading them or not."

Also, they said, there are no predefined rules bounding the relationship between what you put in and the results you get.

The Kurzweil Accelerating Intelligence newsletter said: "For example, if you ask, 'Can AIs have consciousness?,' Talk to Books returns a list of books that include information on that specific question."
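To make the retrieval idea concrete, here is a hedged sketch of sentence-level semantic search in the spirit of Talk to Books, assuming the pre-trained Universal Sentence Encoder published on TF Hub; the module path and the sample "book sentences" are illustrative assumptions, not the production system.

```python
import numpy as np
import tensorflow_hub as hub

# Load a pre-trained Universal Sentence Encoder from TF Hub (path assumed).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

book_sentences = [
    "Machine consciousness remains a contested question among philosophers.",
    "The recipe calls for two cups of flour and a pinch of salt.",
    "Some researchers argue that subjective experience cannot be computed.",
]
query = "Can AIs have consciousness?"

# Embed the query and candidates together, then rank by inner product
# (the encoder's outputs are approximately unit length, so this tracks
# cosine similarity); no keyword matching is involved.
vectors = embed([query] + book_sentences).numpy()
query_vec, sentence_vecs = vectors[0], vectors[1:]
scores = sentence_vecs @ query_vec

for idx in np.argsort(-scores):
    print(f"{scores[idx]:.3f}  {book_sentences[idx]}")
```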

In another development, the focus on semantics is evident in a recently published paper, "Universal Sentence Encoder," which is now up on arXiv.

The paper describes the models for encoding sentences into embedding vectors in more detail. The paper's authors are Daniel Cer, Yinfei Yang, Sheng-yi Kong, Nan Hua, Nicole Limtiaco, Rhomni St. John, Noah Constant, Mario Guajardo-Cespedes, Steve Yuan, Chris Tar, Yun-Hsuan Sung, Brian Strope and Ray Kurzweil. Author affiliations are Google Research in Mountain View and New York and Google in Cambridge, MA.

More information: Universal Sentence Encoder, arXiv:1803.11175 [cs.CL], arxiv.org/abs/1803.11175

Abstract
We present models for encoding sentences into embedding vectors that specifically target transfer learning to other NLP tasks. The models are efficient and result in accurate performance on diverse transfer tasks. Two variants of the encoding models allow for trade-offs between accuracy and compute resources. For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance. Comparisons are made with baselines that use word level transfer learning via pretrained word embeddings as well as baselines that do not use any transfer learning. We find that transfer learning using sentence embeddings tends to outperform word level transfer. With transfer learning via sentence embeddings, we observe surprisingly good performance with minimal amounts of supervised training data for a transfer task. We obtain encouraging results on Word Embedding Association Tests (WEAT) targeted at detecting model bias. Our pre-trained sentence encoding models are made freely available for download and on TF Hub.
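The abstract's transfer-learning claim can be pictured with a short sketch: embed a handful of labeled sentences with the frozen pre-trained encoder and fit a small downstream classifier on the resulting vectors. Everything concrete here (the TF Hub path, the toy data, scikit-learn's LogisticRegression) is an illustrative assumption rather than the paper's experimental setup.

```python
import tensorflow_hub as hub
from sklearn.linear_model import LogisticRegression

# Sketch of sentence-level transfer learning: freeze the pre-trained
# encoder and fit a small classifier on its embeddings.
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

train_texts = [
    "I loved this book.",
    "A dull, plodding read.",
    "Absolutely wonderful prose.",
    "I could not finish it.",
]
train_labels = [1, 0, 1, 0]  # 1 = positive review, 0 = negative (toy data)

features = embed(train_texts).numpy()
classifier = LogisticRegression().fit(features, train_labels)

print(classifier.predict(embed(["What a delightful story!"]).numpy()))
```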

© 2018 Tech Xplore

Citation: Google website offers new way to discover books and fun way to play with words (2018, April 16) retrieved 20 April 2024 from https://techxplore.com/news/2018-04-google-website-fun-words.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
