Meaning | Structure of Language in NLP

Overview

Semantic Analysis is a subfield of NLP that helps an NLP model generate output from input. It helps the model, or the computer, understand the correct context of any input text and extract the right meaning from it. The same word can sometimes carry two different meanings depending on how it is used, so it is important to extract the intended sense from the sentence. Semantic analysis thus helps the computer approach human-level accuracy by extracting the important information from the input. Word Sense Disambiguation is one of the processes of semantic analysis: an automatic process that identifies the context of the words present in a sentence. This removes the ambiguity of a word and so helps the NLP model train better.

Pre-requisites

Before learning about semantic analysis and the structure of meaning in NLP, let us first learn some basics about NLP itself.

  • NLP stands for Natural Language Processing. In NLP, we analyse and synthesise the input text, and the trained NLP model then predicts the necessary output.
  • NLP is closely connected to technologies like Artificial Intelligence and Deep Learning.
  • In basic terms, NLP is nothing but a computer program's ability to process and understand the human language provided to it.
  • The NLP process starts by converting our input text into a series of tokens (called the Doc object) and then performing several operations on that Doc object.
  • A typical NLP pipeline consists of various stages such as the tokenizer, tagger, lemmatizer, parser, and entity recognizer. At every stage, the input is the Doc object and the output is the processed Doc.
  • Every stage makes its own kind of change to the Doc object and feeds it to the subsequent stage of the pipeline.
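The stage-by-stage flow above can be sketched in plain Python. The `Doc` class, the stage functions, and the tiny part-of-speech lookup below are illustrative inventions that mirror the idea, not any specific library's API:

```python
# Minimal sketch of an NLP pipeline: every stage takes the Doc object,
# annotates it, and returns it for the next stage.

class Doc:
    def __init__(self, text):
        self.text = text    # raw input text
        self.tokens = []    # filled in by the tokenizer stage
        self.tags = []      # filled in by the tagger stage

def tokenizer(doc):
    # Stage 1: split the raw text into tokens.
    doc.tokens = doc.text.split()
    return doc

def tagger(doc):
    # Stage 2: toy part-of-speech tagger backed by a hand-written lookup.
    lexicon = {"Ram": "NOUN", "is": "VERB", "great": "ADJ"}
    doc.tags = [lexicon.get(token, "X") for token in doc.tokens]
    return doc

def run_pipeline(text, stages):
    # Feed the Doc through every stage in order.
    doc = Doc(text)
    for stage in stages:
        doc = stage(doc)
    return doc

doc = run_pipeline("Ram is great", [tokenizer, tagger])
print(doc.tokens)  # ['Ram', 'is', 'great']
print(doc.tags)    # ['NOUN', 'VERB', 'ADJ']
```

A real pipeline (for example in a library such as spaCy) adds lemmatization, parsing, and entity recognition as further stages of the same shape.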

Introduction

As humans, we interact with each other in our natural language, and animals likewise communicate in theirs. Computers, for their part, understand human language with the use of NLP, or Natural Language Processing.

Humans can communicate with each other and express emotions, but how can a computer detect the emotion in a human's input and provide an adequate output? This question is answered by semantic analysis. In this article, we will learn about syntax and semantic analysis, focusing on how meaning is represented and extracted in NLP.

Semantic Analysis is quite important because a sentence (the input text) provided by the user can have multiple meanings. So how can the computer make sure it has picked the intended one? For example, analyze the sentence "Ram is great." Here the speaker may be talking either about Lord Ram or about a person named Ram. That is why the semantic analyzer's job of getting the proper meaning of the sentence is important, and it is done by the NLP subfield of Semantic Analysis. The purpose of semantic analysis is to draw the exact, or dictionary, meaning from the text; the work of the semantic analyzer is to check the text for meaningfulness.

Let us discuss semantic analysis in detail in the next section, along with its building blocks.

Building Blocks of Semantic System

Semantic Analysis helps the NLP model, or the computer, understand the correct context of any input text and extract the right meaning from it. Semantic analysis thus helps the computer approach human-level accuracy by extracting the important information from the input. Some of the use cases of semantic analysis are:

  • chatbots,
  • search engines,
  • text analyzer,
  • machine translators, etc.

Let us now learn about the building blocks of the semantic system and its working in detail.

Semantic analysis is an important step of the NLP process. It builds on another component of that process, lexical analysis. So let us first see what lexical analysis and a lexical analyzer are. Lexical analysis is the process of analyzing and identifying the structure of words and phrases. In the lexical field, we are concerned with the structure formed by lexemes, whereas in semantic analysis we are concerned with the underlying meaning that finds expression in those lexemes.

In semantic analysis, the syntax is verified first and then the semantics, or meaning. More formally, we can say that semantic analysis verifies the relationships between the lexical items (the lexemes generated by the lexical analyzer).

Lexical analysis involves the following steps.

  1. It first classifies the words, affixes, and sub-words (known as lexical items) in the input sentence.
  2. It then decomposes the previously classified lexical items.
  3. Finally, the similarities and differences between the generated lexical-semantic structures are analyzed and verified.
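The three steps above can be illustrated with a toy sketch in Python. The suffix list and the decomposition rule are invented for illustration; a real lexical analyzer would use a full morphological lexicon:

```python
# Toy illustration of the lexical-analysis steps: classify lexical
# items, decompose them into stem + affix, and compare the results.

SUFFIXES = ["ing", "ed", "s"]  # illustrative affix inventory

def decompose(word):
    # Step 2: split a word into (stem, suffix) when a known suffix matches.
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return (word[:-len(suffix)], suffix)
    return (word, "")

def analyze(sentence):
    # Step 1: treat each whitespace token as a lexical item, then decompose it.
    return [decompose(item) for item in sentence.lower().split()]

def same_suffix(a, b):
    # Step 3: a crude structural comparison - do two items share a suffix?
    return decompose(a)[1] == decompose(b)[1]

print(analyze("the athlete jumped hurdles"))
# [('the', ''), ('athlete', ''), ('jump', 'ed'), ('hurdle', 's')]
print(same_suffix("jumped", "walked"))  # True
```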

Let us learn about the various relations involved in the semantic analyzer.

  1. Hyponymy:
    This relation connects a generic term to its specific instances. The generic term is known as the hypernym, and its instances are known as hyponyms. For example, the verb to see has several hyponyms like glimpse, stare, gaze, ogle, and so on.
  2. Homonymy: This relation covers words that have the same spelling or form but unconnected, diverse meanings. For example, rose might mean the past tense of rise or a flower – the same spelling but different meanings; hence, rose is a homonym.
  3. Polysemy:
    This relation covers words with several distinct but related senses, i.e. the same spelling with varied yet connected meanings. For example, ‘man‘ may mean ‘the human species‘ or ‘a male human‘ or ‘an adult male human‘ – since all these different meanings bear a close association, the lexical term ‘man‘ is polysemous.
  4. Synonymy:
    Synonymy is derived from the word synonym, which means similarity. This relation holds between two lexical elements that have the same meaning but different forms. For example: (Job, Occupation), (Large, Big), (Stop, Halt).
  5. Antonymy:
    Antonymy is derived from the word antonym, which means dissimilarity. This relation also holds between two lexical elements, but here their meanings are opposite. In simpler terms, it checks whether the semantic components involved are symmetric with respect to an axis. For example: (Day, Night), (Hot, Cold), (Large, Small).
  6. Meronomy:
    This is the part–whole relation: it indicates that one word denotes a component part of what another word denotes. For example, the word finger is a meronym of the word hand, because a finger is part of a hand. Similarly, the word wheel is a meronym of the word automobile.
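These relations can be pictured as a tiny hand-built lexicon. In practice a resource such as WordNet supplies them; the entries below are hard-coded from the article's own examples:

```python
# A miniature lexicon mapping words to their semantic relations.
lexicon = {
    "see":  {"hyponyms": ["glimpse", "stare", "gaze", "ogle"]},
    "job":  {"synonyms": ["occupation"]},
    "day":  {"antonyms": ["night"]},
    "hand": {"meronyms": ["finger"]},
}

def related(word, relation):
    # Return the words standing in the given relation, or [] if unknown.
    return lexicon.get(word, {}).get(relation, [])

print(related("see", "hyponyms"))  # ['glimpse', 'stare', 'gaze', 'ogle']
print(related("day", "antonyms"))  # ['night']
```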

The meaning of a sentence is represented using semantic analysis. For this representation we have several methods and processes, and they all use the building blocks of the semantic system. Let us learn about these building blocks.

  • Entities:
    An entity is a unit in the world of semantic analysis, so a sentence can be described as a group of interrelated entities. Some examples of entities are names, positions, places, organizations, etc. For example, in the sentence "Ram is a boy.", Ram is an entity.
  • Concepts:
    Concepts represent generalized categories such as person, organization, location, etc. For example, in the sentence "Google it.", Google belongs to the concept organization.
  • Relations:
    Relations tell us about the relationships among the various concepts and entities of a text or a sentence. For example, the sentence "Ram is the brother of Shyam" depicts the relationship brother-of between the entities Ram and Shyam.
  • Predicates:
    Predicates represent the main verb structure of a sentence. For example, in the sentence "The athlete ran", the predicate is ran, because it tells us what the athlete did.
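A toy extractor shows how entities and relations can be pulled out of a sentence. The single `X is the R of Y` pattern below is purely illustrative; real systems use parsers and trained models:

```python
import re

def extract_relation(sentence):
    # Match sentences of the form "X is the <relation> of Y" and
    # return the two entities plus the relation linking them.
    match = re.match(r"(\w+) is the (\w+) of (\w+)", sentence)
    if match:
        entity1, relation, entity2 = match.groups()
        return {"entities": [entity1, entity2], "relation": relation}
    return None

print(extract_relation("Ram is the brother of Shyam"))
# {'entities': ['Ram', 'Shyam'], 'relation': 'brother'}
```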

Approaches to Meaning Representations

The meaning representation is a formal structure used to capture the meaning of linguistic inputs. To get that meaning, we assume the provided linguistic structure carries information that can express the state of the world; for example, it can be used to check whether someone has praised us or insulted us.

So, to get the actual meaning, we break the linguistic input into meaning structures and then link those structures to our real-world knowledge.

Now, we have numerous approaches to meaning representations. Let us see them.

  • Case Grammar
  • CD or Conceptual Dependency
  • Conceptual Graphs
  • Frames
  • FOPL or First-Order Predicate Logic
  • Rule-based Architecture
  • Semantic Nets
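One of these approaches, the semantic net, can be sketched as a dictionary of labelled edges between concepts. The nodes and relation labels below are invented for illustration:

```python
# A tiny semantic net: (node, relation) pairs map to related nodes.
semantic_net = {
    ("Ram", "is-a"): "boy",
    ("Ram", "brother-of"): "Shyam",
    ("boy", "is-a"): "person",
}

def lookup(node, relation):
    # Follow one labelled edge out of a node, if it exists.
    return semantic_net.get((node, relation))

print(lookup("Ram", "is-a"))        # boy
print(lookup("Ram", "brother-of"))  # Shyam
```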

Need for Meaning Representations

As we have discussed, the meaning representation is a formal structure used to capture the meaning of linguistic inputs. A question that arises here is: what is the need for using a meaning representation? Let us see the various reasons.

  1. In linking the non-linguistic and linguistic elements:
    One of the prime reasons for using a meaning representation is that it links the non-linguistic and linguistic elements so that the actual meaning can be derived.
  2. In representing the variety at the lexical level:
    The meaning representation is also used to represent canonical forms at the lexical level.
  3. In reasoning:
    The meaning representation also helps us check or verify whether a semantic representation is consistent with our knowledge of the world.
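The reasoning use case can be sketched as follows: given a small knowledge base of is-a facts (invented here for illustration), we check whether a statement follows from it by walking the transitive chain:

```python
# Hand-written is-a facts standing in for world knowledge.
IS_A = {"Ram": "boy", "boy": "person", "person": "mammal"}

def infers(entity, category):
    # Walk up the is-a chain until the category is found or the chain ends.
    current = entity
    while current in IS_A:
        current = IS_A[current]
        if current == category:
            return True
    return False

print(infers("Ram", "person"))  # True
print(infers("Ram", "plant"))   # False
```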

WSD

WSD, or Word Sense Disambiguation, is one of the processes of semantic analysis (the other is relationship extraction). It is the ability to map a word to its actual meaning in a particular context. Part-of-Speech tagging is one of the prime stages of the NLP process that supports Word Sense Disambiguation and helps achieve a high level of accuracy on the meaning of a word.

A word can have multiple meanings depending on the sentence, which is known as ambiguity, and Word Sense Disambiguation helps us resolve such lexical ambiguity. More formally, we can define Word Sense Disambiguation as an automatic process that identifies the context of the words present in a sentence. This removes the ambiguity of a word and thus helps the NLP model train better. Once the model is trained well, it can predict accurate output and match human-level accuracy in tasks such as sentiment analysis.

There are a lot of methods and approaches to Word Sense Disambiguation. Let us see them.

  • Knowledge-based Methods or Dictionary-based Methods
  • Unsupervised Methods
  • Semi-supervised Methods
  • Supervised Methods
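A minimal sketch of the knowledge-based (dictionary-based) family is the simplified Lesk algorithm: pick the sense whose dictionary gloss shares the most words with the sentence. The two glosses for rose below are hand-written for illustration:

```python
# Two hand-written glosses for the ambiguous word "rose".
SENSES = {
    "flower": "a garden plant with a fragrant bloom and thorny stem",
    "rise":   "past tense of rise meaning to move upward or stand up",
}

def disambiguate(sentence):
    # Score each sense by word overlap between its gloss and the sentence.
    context = set(sentence.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in SENSES.items():
        overlap = len(context & set(gloss.split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("she planted a rose in the garden"))  # flower
print(disambiguate("the sun rose up"))                   # rise
```

Real implementations use full dictionary glosses (for example from WordNet) and smarter scoring, but the overlap idea is the same.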

There are a lot of areas in which the Word Sense Disambiguation is used. Some of the use cases are:

  • IE or Information Extraction
  • IR or Information Retrieval
  • Machine Translation
  • Lexicography
  • Text Mining

Conclusion

  • Semantic Analysis helps the NLP model, or the computer, understand the correct context of any input text and extract the right meaning depicted by it.
  • Semantic analysis helps the computer achieve human-level accuracy by extracting important information from the input, making it one of the phases of the NLP process of generating output from the input. Word Sense Disambiguation, one of the processes of semantic analysis, is an automatic process that identifies the context of the words present in a sentence.
  • Some of the use cases of semantic analysis are chatbots, search engines, text analyzers, machine translators, etc.
  • The meaning representation is a formal structure used to capture the meaning of linguistic inputs; we assume the provided linguistic structure carries information that can express the state of the world.
  • WSD helps in removing the ambiguity of a word and thus helps the NLP model train better. Once the model is trained well, it can predict accurate output and match human-level accuracy in tasks such as sentiment analysis.
  • The meaning representation helps us in linking the non-linguistic and linguistic elements, representing the variety at the lexical level, etc.