Morphology and Syntax
How to read this page: This article maps the topic from beginner to expert across six levels: Remembering, Understanding, Applying, Analyzing, Evaluating, and Creating. Scan the headings to see the full scope, then read from wherever your knowledge starts to feel uncertain.
Morphology and Syntax are the branches of linguistics that deal with the structure of language. Morphology is the study of the internal structure of words (how they are built from smaller units of meaning called morphemes), while Syntax is the study of the rules that govern how words are combined to form grammatically correct sentences. Together, they provide the "blueprint" of a language, allowing speakers to generate an infinite number of novel expressions from a finite set of components. Understanding these structures is essential for everything from child language development to the engineering of natural language processing (NLP) systems.
Remembering
- Morphology — The study of the structure and formation of words.
- Syntax — The study of the rules for combining words into sentences.
- Morpheme — The smallest unit of meaning in a language (e.g., "un-", "break", "-able").
- Free Morpheme — A morpheme that can stand alone as a word (e.g., "dog", "walk").
- Bound Morpheme — A morpheme that must be attached to another to have meaning (e.g., prefixes like "re-" or suffixes like "-ed").
- Inflectional Morphology — Adding morphemes to change a word's grammatical function (e.g., "walk" -> "walked", "cat" -> "cats").
- Derivational Morphology — Adding morphemes to create a new word or change its category (e.g., "happy" -> "happiness").
- Constituent — A word or group of words that functions as a single unit within a sentence (e.g., a noun phrase).
- Phrase Structure Rules — Rules that generate the underlying structure of a sentence (e.g., S -> NP VP).
- Word Order — The typical arrangement of Subject, Verb, and Object in a language (e.g., English is SVO).
- Head — The core word in a phrase that determines its category (e.g., the noun "cat" in "the big cat").
- Argument — A phrase required by a verb to complete its meaning (e.g., "the ball" in "John kicked the ball").
- Adjunct — An optional phrase that adds extra information (e.g., "in the park" in "John kicked the ball in the park").
- Universal Grammar (UG) — The theory that all human languages share a common underlying syntactic structure.
Understanding
Language is a hierarchical system, not just a linear string of words.
The Building Blocks of Words (Morphology): Words are often complex machines. Consider the word "disproportionately": 1. dis- (bound prefix: negation) 2. pro- (bound prefix: forward/for) 3. portion (free root: a part) 4. -ate (bound suffix: forms an adjective here, as in "disproportionate") 5. -ly (bound suffix: adverbializer). By understanding these pieces, we can decode unfamiliar words and create new ones.
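This kind of outside-in decomposition can be sketched in code. The following toy segmenter repeatedly strips known prefixes and suffixes; the affix inventories are illustrative assumptions, not a complete list for English, and real morphological analyzers are far more sophisticated.

```python
# Toy morpheme segmenter: strip known affixes from the outside in.
# The PREFIXES and SUFFIXES lists are illustrative, not exhaustive.
PREFIXES = ["dis", "pro", "un", "re"]
SUFFIXES = ["ly", "ate", "ness", "able", "ed", "s"]

def segment(word):
    """Return a list of candidate morphemes for `word`."""
    prefixes, suffixes = [], []
    changed = True
    while changed:
        changed = False
        for p in PREFIXES:
            # Only strip if a plausible root (3+ letters) would remain.
            if word.startswith(p) and len(word) > len(p) + 2:
                prefixes.append(p + "-")
                word = word[len(p):]
                changed = True
        for s in SUFFIXES:
            if word.endswith(s) and len(word) > len(s) + 2:
                suffixes.insert(0, "-" + s)
                word = word[:-len(s)]
                changed = True
    return prefixes + [word] + suffixes

print(segment("disproportionately"))
# → ['dis-', 'pro-', 'portion', '-ate', '-ly']
```

Note that naive affix stripping overgenerates (it would wrongly split "reach" into "re-" + "ach" without the length guard), which is exactly why computational morphology is a hard problem.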
The Architecture of Sentences (Syntax): Syntax tells us who did what to whom. Even if a sentence is nonsensical, we can recognize its structure. Noam Chomsky's famous example: "Colorless green ideas sleep furiously" is syntactically perfect but semantically void.
- Phrase Structure: Sentences are built from "blocks" like Noun Phrases (NP) and Verb Phrases (VP).
- Recursion: A key feature of human syntax is the ability to nest phrases within each other ("The dog that chased the cat that ate the rat..."). This allows for infinite sentence lengths.
- Deep vs. Surface Structure: A single "deep" idea ("John kicked the ball") can be expressed in different "surface" forms ("The ball was kicked by John").
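Recursion, in particular, is easy to demonstrate in code: a rule that lets a noun phrase contain a relative clause that itself contains another noun phrase can nest indefinitely. The grammar and lexicon below are a toy assumption, not a real English grammar.

```python
import random

# A toy recursive grammar: NP -> noun (that chased NP).
# Because NP reappears inside its own expansion, nesting can
# repeat to any depth -- the "recursion" property of syntax.
random.seed(0)

def np(depth):
    noun = random.choice(["the dog", "the cat", "the rat"])
    # With some probability (and a depth cap), embed a relative clause.
    if depth > 0 and random.random() < 0.7:
        return f"{noun} that chased {np(depth - 1)}"
    return noun

def sentence(depth=3):
    return f"I saw {np(depth)}."

print(sentence())
```

The depth cap is only there to keep the toy program finite; the grammar itself places no upper bound on sentence length, which is the point of the recursion argument.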
Applying
Simulating a Basic Phrase Structure Parser: <syntaxhighlight lang="python">
def simple_parser(sentence):
    """
    A toy parser that identifies basic constituents
    (assuming SVO structure).
    """
    words = sentence.split()
    if len(words) < 3:
        return "Incomplete sentence"
    # Mock roles by position (in real NLP, use NLTK or spaCy for tagging)
    subject = words[0]
    verb = words[1]
    obj = " ".join(words[2:])
    return {
        "Sentence": sentence,
        "NP (Subject)": subject,
        "VP (Predicate)": {
            "V": verb,
            "NP (Object)": obj,
        },
    }

# Parsing a simple SVO sentence
print(simple_parser("Robot crushed city"))
# In real linguistics, tree diagrams are used to show this hierarchy.
</syntaxhighlight>
- Structural Diversity
- Analytic Languages → Rely on word order and separate particles (e.g., Mandarin Chinese, English).
- Synthetic Languages → Use complex morphology/endings to show relationships (e.g., Latin, Russian).
- Polysynthetic Languages → Can express an entire sentence as a single complex word (e.g., Inuktitut).
- SVO vs SOV → English says "I ate the apple" (SVO); Japanese says "I apple ate" (SOV).
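The word-order contrast in the last bullet can be made concrete with a small sketch. This assumes the clause has already been split into subject, verb, and object; it simply rearranges those slots to gloss how the same proposition surfaces under different typological orders.

```python
# Toy illustration of word-order typology: place pre-identified
# subject (S), verb (V), and object (O) slots in a given order.
def reorder(subject, verb, obj, order="SVO"):
    parts = {"S": subject, "V": verb, "O": obj}
    return " ".join(parts[slot] for slot in order)

print(reorder("I", "ate", "the apple", "SVO"))  # I ate the apple
print(reorder("I", "ate", "the apple", "SOV"))  # I the apple ate
```

Real translation is of course not slot shuffling (synthetic languages mark these roles with morphology rather than position), but the sketch shows why identifying constituents must precede ordering them.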
Analyzing
| Feature | Inflectional | Derivational |
|---|---|---|
| Changes Category? | No (walk -> walks) | Yes (teach -> teacher) |
| Meaning Change | Minimal / Grammatical | Significant / Semantic |
| Productivity | High (applies to most words) | Varied (may be restricted) |
| Order | Occurs after derivation | Occurs closer to the root |
The Ambiguity of Structure: Syntax resolves (or creates) ambiguity. Consider: "I saw the man with the telescope."
1. Interpretation A: I used a telescope to see the man (the telescope phrase is an adjunct of "saw").
2. Interpretation B: The man was holding a telescope (the telescope phrase is an adjunct of "man").
Linguists use "tree branching" to show which interpretation is intended based on how the phrases are grouped.
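The two attachment sites can be made explicit by writing each parse as a nested tree and printing it in labeled-bracket notation. The tuple encoding and `bracket` helper below are one illustrative way to represent trees, not a standard library API.

```python
# The two readings of "I saw the man with the telescope", encoded as
# nested (label, children...) tuples. The only difference is where
# the PP "with the telescope" attaches.
PP = ("PP", "with the telescope")

# Reading A: the PP attaches to the VP (I used the telescope).
reading_a = ("S", ("NP", "I"),
                  ("VP", ("V", "saw"),
                         ("NP", "the man"),
                         PP))

# Reading B: the PP attaches inside the object NP (the man has it).
reading_b = ("S", ("NP", "I"),
                  ("VP", ("V", "saw"),
                         ("NP", ("NP", "the man"),
                                PP)))

def bracket(tree):
    """Render a tree in labeled-bracket notation, e.g. [NP the man]."""
    if isinstance(tree, str):
        return tree
    label, *children = tree
    return "[" + label + " " + " ".join(bracket(c) for c in children) + "]"

print(bracket(reading_a))
# [S [NP I] [VP [V saw] [NP the man] [PP with the telescope]]]
print(bracket(reading_b))
# [S [NP I] [VP [V saw] [NP [NP the man] [PP with the telescope]]]]
```

The strings differ only in bracket grouping, which is precisely what a tree diagram shows graphically: same words, different constituency.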
Evaluating
Evaluating linguistic theories:
- Generative Capacity: Can the theory explain how children produce sentences they have never heard?
- Typological Coverage: Does the syntactic model work for non-European languages?
- Processing Speed: How does the brain parse complex syntax in real-time?
- Evolutionary Origin: How did the capacity for recursion emerge in humans but not in other primates?
Creating
Future Directions:
- Computational Morphology: Building AI systems that can handle "unseen" words by analyzing their morphemes (crucial for languages like Turkish or Finnish).
- Universal Grammar 2.0: Using big data from thousands of world languages to find the true universals of human speech.
- Syntactic Priming: Researching how using one sentence structure makes a speaker more likely to use it again (revealing the "latent" structures in the mind).
- Formalizing Sign Language Syntax: Proving that sign languages have the same structural complexity as spoken languages.