Parsey McParseface understands short sentences!


Postby Natural ChemE on May 13th, 2016, 1:09 am 

Natural language processing, Wikipedia (links omitted) wrote:Natural language processing (NLP) is a field of computer science, artificial intelligence, and computational linguistics concerned with the interactions between computers and human (natural) languages. As such, NLP is related to the area of human–computer interaction. Many challenges in NLP involve: natural language understanding, enabling computers to derive meaning from human or natural language input; and others involve natural language generation.

Major software projects like Google's search engine use natural language processing (NLP) in an attempt to:
  • help understand what people are saying to it;
  • categorize human knowledge found on websites.
Wikipedia lists some common applications here.

Google has just released Parsey McParseface, a tool that they use in their own work. Note that parsing means converting symbols like written text (strings) into a structured representation that the system can recognize and work with.

Announcing SyntaxNet: The World’s Most Accurate Parser Goes Open Source, Slav Petrov, Senior Staff Research Scientist at Google, via Google Research Blog (2016-05-12) (links omitted) wrote:At Google, we spend a lot of time thinking about how computer systems can read and understand human language in order to process it in intelligent ways. Today, we are excited to share the fruits of our research with the broader community by releasing SyntaxNet, an open-source neural network framework implemented in TensorFlow that provides a foundation for Natural Language Understanding (NLU) systems. Our release includes all the code needed to train new SyntaxNet models on your own data, as well as Parsey McParseface, an English parser that we have trained for you and that you can use to analyze English text.

Parsey McParseface is built on powerful machine learning algorithms that learn to analyze the linguistic structure of language, and that can explain the functional role of each word in a given sentence. Because Parsey McParseface is the most accurate such model in the world, we hope that it will be useful to developers and researchers interested in automatic extraction of information, translation, and other core applications of NLU.

How does SyntaxNet work?

SyntaxNet is a framework for what’s known in academic circles as a syntactic parser, which is a key first component in many NLU systems. Given a sentence as input, it tags each word with a part-of-speech (POS) tag that describes the word's syntactic function, and it determines the syntactic relationships between words in the sentence, represented in the dependency parse tree. These syntactic relationships are directly related to the underlying meaning of the sentence in question. To take a very simple example, consider the following dependency tree for Alice saw Bob:
    [Image: dependency parse tree for "Alice saw Bob"]
This structure encodes that Alice and Bob are nouns and saw is a verb. The main verb saw is the root of the sentence and Alice is the subject (nsubj) of saw, while Bob is its direct object (dobj). As expected, Parsey McParseface analyzes this sentence correctly, but also understands the following more complex example:
    [Image: dependency parse tree for the more complex example sentence]
This structure again encodes the fact that Alice and Bob are the subject and object respectively of saw, in addition to the fact that Alice is modified by a relative clause with the verb reading, that saw is modified by the temporal modifier yesterday, and so on. The grammatical relationships encoded in dependency structures allow us to easily recover the answers to various questions, for example whom did Alice see?, who saw Bob?, what had Alice been reading about? or when did Alice see Bob?.

[more]
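
The dependency labels in those trees (nsubj, dobj and so on) are what downstream code actually consumes. As a rough sketch of that idea, the snippet below uses spaCy as a stand-in parser rather than SyntaxNet/Parsey McParseface itself (an assumption made for brevity; it requires spaCy and its en_core_web_sm English model to be installed) to produce the same style of part-of-speech and dependency annotations for "Alice saw Bob", then reads the nsubj label to answer "who saw Bob?":

    # Sketch only: spaCy stands in for Parsey McParseface here. It is a different
    # dependency parser, but it emits the same kind of POS tags and dependency
    # labels described above. Assumes: pip install spacy, then
    # python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Alice saw Bob")

    # Each token gets a part-of-speech tag, a dependency label, and a head word.
    for token in doc:
        print(f"{token.text:<6} pos={token.pos_:<5} dep={token.dep_:<6} head={token.head.text}")

    # Reading off the nominal subject (nsubj) answers "who saw Bob?".
    subjects = [t.text for t in doc if t.dep_ == "nsubj"]
    print("Who saw Bob?", subjects)

The exact labels are the model's guesses and can vary between parsers and versions, but the overall shape (one head per word plus a relation label) is the dependency structure the quoted post describes.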



Re: Parsey McParseface understands short sentences!

Postby TheVat on May 19th, 2016, 12:17 pm 

This is fascinating. Don't know why this thread didn't get more response. NLP seems to be at the point where it's really going to grapple with the tough nut of semantics, i.e. meaning. NLP can't really succeed until you have an AI that truly grasps meanings. Without meaning and the contextual understanding that goes with it, you will have machines that are confused by tiny little typos like, say, omitted commas. Compare the sentences....

"Let's eat, Grandma.'

"Let's eat Grandma."

A human can parse the second sentence as more likely to be a slightly mistyped suggestion to Grandma that it's suppertime than an invitation to an act of cannibalism. We understand words in a deep cultural context; we form understandings in what the philosopher Willard Quine called "a web of belief." That's the sort of web that Parsey and his pals will need in order to talk with humans in natural language.
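
For what it's worth, an off-the-shelf dependency parser already treats those two strings differently at the purely syntactic level, even without any deeper web of belief. Here's a small sketch (again using spaCy with the en_core_web_sm model as a stand-in parser, not Parsey McParseface) that prints each token's dependency label for both sentences:

    # Sketch only: spaCy stands in for Parsey McParseface. Requires spaCy and
    # the en_core_web_sm English model to be installed.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    for text in ["Let's eat, Grandma.", "Let's eat Grandma."]:
        doc = nlp(text)
        print(text)
        # Print each token's dependency label and the word it attaches to.
        for token in doc:
            print(f"  {token.text:<8} dep={token.dep_:<8} head={token.head.text}")
        print()

Typically the comma-less version comes out with Grandma as the direct object of eat, while the version with the comma attaches Grandma as a separate, vocative-style element, though the exact labels depend on the model. Which is really the point: the parser can see the structural difference, but deciding which reading the writer actually intended takes the cultural context a human brings.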


