Semantics (from Ancient
Greek: σημαντικός sēmantikós)[1][2] is
the study of meaning. It focuses on the relation between signifiers,
like words, phrases, signs, and symbols, and what
they stand for, their denotation. Linguistic semantics is the study of meaning
as it is used to understand human expression through language. Other forms
of semantics include the semantics of programming languages, formal logics, and
semiotics.
The word semantics itself denotes a range of ideas, from the popular to
the highly technical. It is often used in ordinary language for denoting a
problem of understanding that comes down to word selection or connotation.
This problem of understanding has been the subject of many formal enquiries,
over a long period of time, most notably in the field of formal semantics. In linguistics,
it is the study of the interpretation of signs or symbols as used by agents
or communities
within particular circumstances and contexts.[3]
Within this view, sounds, facial expressions, body language, and proxemics
have semantic (meaningful) content, and each comprises several branches of
study. In written language, things like paragraph structure and punctuation
bear semantic content; other forms of language bear other semantic content.[3]
The formal study of
semantics intersects with many other fields of inquiry, including lexicology,
syntax,
pragmatics,
etymology
and others, although semantics is a well-defined field in its own right,
often with synthetic properties.[4]
In philosophy of language,
semantics and reference
are closely connected. Further related fields include philology,
communication,
and semiotics.
The formal study of semantics is therefore complex. Semantics contrasts with syntax,
the study of the combinatorics of units of a language (without reference to
their meaning), and pragmatics,
the study of the relationships between the symbols of a language, their
meaning, and the users of the language.[5] In
international scientific
vocabulary, semantics is also called semasiology.
Linguistics
In linguistics,
semantics is the subfield that is devoted to the study of meaning, as
inherent at the levels of words, phrases, sentences, and larger units of discourse
(termed texts). The basic area of study is the meaning of signs, and the study of relations between
different linguistic units and compounds: homonymy, synonymy, antonymy, hypernymy, hyponymy, meronymy, metonymy, holonymy,
and paronymy. A key concern is how meaning attaches to larger chunks of text,
possibly as a result of the composition from smaller units of meaning.
Traditionally, semantics has included the study of sense
and denotative reference, truth
conditions, argument structure, thematic
roles, discourse analysis, and the linkage of all of
these to syntax.
Montague grammar
In the late 1960s, Richard
Montague proposed a system for defining semantic entries in the lexicon in
terms of the lambda calculus. In these terms, the syntactic parse of the
sentence John ate every bagel would consist of a subject (John) and
a predicate (ate every bagel); Montague demonstrated that the meaning of
the sentence as a whole could be decomposed into the meanings of its parts and
relatively few rules of combination. The logical predicate thus obtained
would be elaborated further, e.g. using truth theory models, which ultimately
relate meanings to a set of Tarskian universals, which may lie outside the
logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s.
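The compositional idea can be sketched with ordinary functions standing in for lambda terms. The toy model below (the domain of bagels, the "ate" relation, and the names John and Mary) is invented for illustration and is not Montague's own formalism, only a rough analogue of it:

```python
# A minimal sketch of Montague-style composition in a toy model.
# The domain, the "ate" relation, and all names are invented for illustration.

bagels = {"bagel1", "bagel2"}
ate = {("john", "bagel1"), ("john", "bagel2"), ("mary", "bagel1")}

# Lexical entry for the transitive verb: a function from objects to
# functions from subjects to truth values (a curried lambda term).
def ATE(obj):
    return lambda subj: (subj, obj) in ate

# "every bagel" denotes a generalized quantifier: it maps a property
# of individuals to True iff every bagel has that property.
def EVERY_BAGEL(prop):
    return all(prop(b) for b in bagels)

# [John ate every bagel] = EVERY_BAGEL(lambda x. ATE(x)(john))
print(EVERY_BAGEL(lambda x: ATE(x)("john")))  # True
print(EVERY_BAGEL(lambda x: ATE(x)("mary")))  # False: Mary only ate bagel1
```

The point of the sketch is that the sentence's truth value falls out of the meanings of its parts plus one rule of function application, which is the sense in which Montague's system is compositional.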
Despite its elegance, Montague
grammar was limited by the context-dependent variability in word sense, and
led to several attempts at incorporating context, such as:
- Situation semantics (1980s): truth-values are incomplete and get assigned based on context
- Generative lexicon (1990s): categories (types) are incomplete and get assigned based on context
In Chomskyan
linguistics there was no mechanism for the learning of semantic relations, and
the nativist view considered all semantic
notions as inborn. Thus, even novel concepts were proposed to have been dormant
in some sense. This view was also thought unable to address many issues, such as
metaphor and associative meanings; semantic change, where meanings within a linguistic
community change over time; and qualia, or subjective experience. Another issue not addressed
by the nativist model was how perceptual cues are combined in thought, e.g. in mental
rotation.[6]
This view of semantics, as an
innate finite meaning inherent in a lexical
unit that can be composed to generate meanings for larger chunks of
discourse, is now being fiercely debated in the emerging domain of cognitive linguistics[7]
and also in the non-Fodorian camp in philosophy of language.[8]
The challenge is motivated by:
- factors internal to language, such as the problem of resolving indexicals or anaphora (e.g. this x, him, last week). In these situations context serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic and the meaning of sentences is viewed as context change potentials instead of propositions.
- factors external to language, i.e. language is not a set of labels stuck on things, but "a toolbox, the importance of whose elements lie in the way they function rather than their attachments to things."[8] This view reflects the position of the later Wittgenstein and his famous game example, and is related to the positions of Quine, Davidson, and others.
A concrete example of the latter
phenomenon is semantic underspecification – meanings are not
complete without some elements of context. To take an example of one word, red,
its meaning in a phrase such as red book is similar to many other
usages, and can be viewed as compositional.[9]
However, the colours implied in phrases such as red wine (very dark),
and red hair (coppery), or red soil, or red skin are very
different. Indeed, these colours by themselves would not be called red
by native speakers. These instances are contrastive, so red wine is so
called only in comparison with the other kind of wine (which also is not white
for the same reasons). This view goes back to de Saussure:
Each
of a set of synonyms like redouter ('to dread'), craindre ('to
fear'), avoir peur ('to be afraid') has its particular value only
because they stand in contrast with one another. No word has a value that can
be identified independently of what else is in its vicinity.[10]
and may go back to earlier Indian
views on language, especially the Nyaya view of words as indicators and not carriers of meaning.[11]
An attempt to defend a system
based on propositional meaning for semantic underspecification can be found in
the generative lexicon model of James
Pustejovsky, who extends contextual operations (based on type shifting)
into the lexicon. Thus meanings are generated "on the fly" (as you
go), based on finite context.
Theories in semantics
Formal semantics
This theory originates from Montague's work
(see above). It is a highly formalized theory of natural language semantics in which
expressions are assigned denotations (meanings) such as individuals, truth
values, or functions from one of these to another. The truth of a sentence, and
more interestingly, its logical relation to other sentences, is then evaluated
relative to a model.
Truth-conditional semantics
Pioneered by the philosopher Donald Davidson, this is another formalized
theory, which aims to associate each natural language sentence with a
meta-language description of the conditions under which it is true, for
example: 'Snow is white' is true if and only if snow is white. The challenge is
to arrive at the truth conditions for any sentence from fixed meanings
assigned to the individual words and fixed rules for how to combine them. In
practice, truth-conditional semantics is similar to model-theoretic semantics;
conceptually, however, they differ in that truth-conditional semantics seeks to
connect language with statements about the real world (in the form of
meta-language statements), rather than with abstract models.
Lexical and conceptual semantics
This theory is an effort to
explain properties of argument structure. The assumption behind this theory is
that syntactic properties of phrases reflect the meanings of the words that
head them.[13]
With this theory, linguists can better deal with the fact that subtle
differences in word meaning correlate with other differences in the syntactic
structure that the word appears in.[13]
This is done by looking at the internal structure of words.[14]
These small parts that make up the internal structure of words are termed semantic
primitives.[14]
Lexical semantics
A linguistic theory that
investigates word meaning. This theory understands that the meaning of a word
is fully reflected by its context. Here, the meaning of a word is constituted
by its contextual relations.[15]
Therefore, a distinction is made between degrees of participation as well as modes of
participation.[15]
In order to accomplish this distinction any part of a sentence that bears a
meaning and combines with the meanings of other constituents is labeled as a
semantic constituent. Semantic constituents that cannot be broken down into
more elementary constituents are labeled minimal semantic constituents.[15]
Semantic models
Terms such as semantic
network and semantic data model are used to describe
particular types of data models characterized by the use of directed
graphs in which the vertices denote concepts or entities in the world, and
the arcs denote relationships between them.
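As a sketch, such a directed labeled graph can be represented as subject–relation–object triples, which is also the shape of an RDF statement. The concepts and relation names below are invented for illustration:

```python
# A toy semantic network: arcs are (subject, relation, object) triples.
# All concepts and relation names here are invented for illustration.

triples = [
    ("canary", "is_a", "bird"),
    ("bird", "is_a", "animal"),
    ("bird", "has_part", "wing"),
    ("canary", "has_property", "yellow"),
]

def related(subject, relation):
    """Objects linked to `subject` by arcs labeled `relation`."""
    return {o for s, r, o in triples if s == subject and r == relation}

def categories(subject):
    """All categories reachable from `subject` by following 'is_a' arcs."""
    found, frontier = set(), {subject}
    while frontier:
        frontier = {o for s in frontier for o in related(s, "is_a")} - found
        found |= frontier
    return found

print(sorted(categories("canary")))  # ['animal', 'bird']
```

Following the "is_a" arcs transitively is what lets such a network support inheritance: a canary is a bird, and therefore also an animal, without that fact being stored directly.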
The Semantic
Web refers to the extension of the World
Wide Web via embedding added semantic metadata, using
semantic data modelling techniques such as Resource Description Framework (RDF)
and Web Ontology Language (OWL).
Psychology
In psychology,
semantic
memory is memory for meaning – in other words, the aspect of
memory that preserves only the gist, the general significance, of
remembered experience – while episodic
memory is memory for the ephemeral details – the individual features,
or the unique particulars of experience. Word meanings are measured by the
company they keep, i.e. the relationships among words themselves in a semantic
network. The memories may be transferred intergenerationally or isolated in
one generation due to a cultural disruption. Different generations may have
different experiences at similar points in their own time-lines. This may then
create a vertically heterogeneous semantic net for certain words in an
otherwise homogeneous culture.[19]
In a network created by people analyzing their understanding of the word (such
as WordNet)
the links and decomposition structures of the network are few in number and
kind, and include part of, kind of, and similar links. In
automated ontologies
the links are computed vectors without explicit meaning. Various automated
technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines as well as natural language processing, neural
networks and predicate calculus techniques.
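The contrast between hand-built links and computed vectors can be sketched numerically. The vectors below are hand-picked toy values; real systems such as latent semantic indexing derive them from corpus statistics:

```python
import math

# Hand-picked toy word vectors; in practice these are computed from corpora.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: a standard closeness measure for word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# In this toy space, "king" is closer to "queen" than to "apple":
print(cosine(vectors["king"], vectors["queen"]) > cosine(vectors["king"], vectors["apple"]))  # True
```

Unlike the "part of" or "kind of" links in WordNet, the individual dimensions of such vectors carry no explicit meaning; only the geometry between vectors does.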
Ideasthesia
is a rare psychological phenomenon that in certain individuals associates
semantic and sensory representations. Activation of a concept (e.g., that of
the letter A) evokes sensory-like experiences (e.g., of red color).
What is semantics?
Semantics is a subdiscipline of linguistics which
focuses on the study of meaning. Semantics tries to understand what meaning is
as an element of language and how it is constructed by language as well as
interpreted, obscured and negotiated by speakers and listeners of language.[1]
Semantics is closely linked with another subdiscipline
of linguistics, pragmatics, which is
also, broadly speaking, the study of meaning. However, unlike pragmatics,
semantics is a highly theoretical research perspective, and looks at meaning in
language in isolation, in the language itself, whereas pragmatics is a more
practical subject and is interested in meaning in language in use.
Semantics is the study of meaning, but what do we mean by
'meaning'?
Meaning has been given different definitions
in the past.
Meaning = Connotation?
Is meaning simply the set of associations that a word
evokes, is the meaning of a word defined by the images that its users connect
to it?
So 'winter' might mean 'snow', 'sledging' and 'mulled
wine'. But what about someone who lives in the Amazon? Their 'winter' is still
wet and hot, so those associations are lost. Because the associations of a
word don't always apply, it was decided that this couldn't be the whole story.
Meaning = Denotation?
It has also been suggested that the meaning of a word is
simply the entity in the World which that word refers to. This makes perfect
sense for proper nouns like 'New York' and 'the Eiffel Tower', but there are
lots of words like 'sing' and 'altruism' that don't have a solid thing in
the world that they are connected to. So meaning cannot be entirely denotation
either.
Meaning = Extension and Intension
So meaning, in semantics, is defined as being Extension:
the thing in the world that the word/phrase refers to, plus Intension: the
concepts/mental images that the word/phrase evokes.[2]
Semantics is interested in:
- How meaning works in language:
The study of semantics looks at how meaning works in
language, and because of this it often uses native speakers' intuitions about the
meaning of words and phrases as a basis for research. We all understand semantics
already on a subconscious level; it's how we all understand each other when we
speak.
- How the way in which words are put together creates meaning:
One of the things that semantics looks at, and is based
on, is how the meaning of speech is not derived just from the meanings of the
individual words all put together, as the principle below makes precise.
The Principle of Compositionality says that the
meaning of speech is the sum of the meanings of the individual words plus the
way in which they are arranged into a structure.
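The principle can be sketched with a toy model in which the same words, arranged differently, compose into different meanings. The model and its single relation are invented for illustration:

```python
# Toy model: one fact holds in this little world - the dog likes the cat.
likes = {("dog", "cat")}

def meaning(subject, obj):
    """Meaning of [subject likes object]: True iff the pair is in the relation."""
    return (subject, obj) in likes

# Same three words, two arrangements, two different meanings:
print(meaning("dog", "cat"))  # True:  'the dog likes the cat'
print(meaning("cat", "dog"))  # False: 'the cat likes the dog'
```

The word meanings are fixed; only the structure they are arranged into changes, and the sentence meaning changes with it, which is exactly what the Principle of Compositionality describes.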
- The relationships between words:
Semantics also looks at the ways in which the meanings of
words can be related to each other. Here are a few of the ways in which words
can be semantically related.
Synonymy: Words are synonymous/synonyms when they can be used to
mean the same thing (at least in some contexts; words are rarely fully
identical in all contexts). Examples: begin and start; big and large;
youth and adolescent.
Antonymy: Words are antonyms of one another when they have
opposite meanings (again, at least in some contexts). Examples: big and
small; come and go; boy and girl.
Polysemy: A word is polysemous when it has two or more
related meanings. In this case the word takes one form but can be used to
mean two different things. In the case of polysemy, these two meanings must
be related in some way, and not be two completely unrelated meanings of the
word. Examples: bright (shining) and bright (intelligent); mouse (animal)
and mouse (computer device).
Homophony: Homophony is similar to polysemy in that it
refers to a single form of word with two meanings; however, a word is a
homophone when the two meanings are entirely unrelated. Examples: bat
(flying mammal) and bat (equipment used in cricket); pen (writing
instrument) and pen (small cage).
[2]
- The relationships between sentences:
- Ambiguity:
One of the aspects of how meaning works in language which
is studied most in semantics is ambiguity. A sentence is ambiguous when it has
two or more possible meanings, but how does ambiguity arise in language? A
sentence can be ambiguous for either (or both!) of the following reasons:
Lexical Ambiguity: A sentence is lexically ambiguous when
it can have two or more possible meanings due to polysemous words (words with
two or more related meanings) or homophonous words (a single word form with two
or more unrelated meanings).
Example of a lexically ambiguous sentence: 'Prostitutes
appeal to the Pope'. This sentence is ambiguous because the word 'appeal'
is polysemous and can mean 'ask for help' or 'are attractive to'.
Structural Ambiguity: A sentence is structurally ambiguous if
it can have two or more possible meanings because the words it contains can be
combined in different ways that create different meanings.
Example of a structurally ambiguous sentence: 'Enraged
cow injures farmer with axe'. In this sentence the ambiguity arises from
the fact that 'with axe' can refer either to the farmer, or to the act of
injuring being carried out (by the cow) 'with axe'.[2]
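The two readings can be sketched as alternative attachment sites for the prepositional phrase. The tuple notation below is invented for illustration, not the output of any standard parser:

```python
# Structural ambiguity as PP attachment: the phrase 'with axe' can attach
# to the object noun phrase or to the verb phrase, giving two parses.

def readings(subject, verb, obj, pp):
    np_attachment = (verb, subject, (obj, pp))  # the farmer has the axe
    vp_attachment = ((verb, pp), subject, obj)  # the injuring is done with the axe
    return [np_attachment, vp_attachment]

for r in readings("enraged cow", "injures", "farmer", "with axe"):
    print(r)
```

The words are identical in both parses; it is only the grouping that differs, which is why this kind of ambiguity is called structural rather than lexical.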
Semantics in the field of Linguistics
Semantics looks at these relationships in language and
looks at how these meanings are created, which is an important part of
understanding how language works as a whole. Understanding how meaning occurs
in language can inform other subdisciplines such as Language
acquisition, to help us to understand how speakers acquire a sense
of meaning, and Sociolinguistics, as the
achievement of meaning in language is important in language in a social
situation.
Semantics is also informed by other subdisciplines of
linguistics, such as Morphology, as
understanding the words themselves is integral to the study of their meaning,
and Syntax, which
researchers in semantics use extensively to reveal how meaning is created in
language, as how language is structured is central to meaning.
References
[1] http://www.universalteacher.org.uk/lang/semantics.htm [Accessed 3.05.2012]
[2] Wood, G. C. (2011). Lecture on Introduction to Semantics at the University of Sheffield.