== Analysis ==

{| class="wikitable"
|+ Transformer architecture trade-offs
! Aspect !! Benefit !! Limitation
|-
| Self-attention || Captures long-range dependencies in O(1) steps || O(n²) memory and compute with sequence length
|-
| Parallelism || Trains much faster than RNNs on modern hardware || Requires large GPU memory for long contexts
|-
| Scale || Performance consistently improves with more parameters and data || Training cost is enormous (millions of dollars for frontier models)
|-
| Context window || Modern models handle 100k+ tokens || KV-cache memory grows linearly; long-context retrieval degrades
|-
| Positional encoding || Sinusoidal or RoPE encodings represent position || Generalization beyond the training context length is degraded
|}

'''Failure modes and nuances:'''
* '''Attention sink''' – Early tokens tend to receive disproportionate attention regardless of relevance, a consequence of each softmax row having to sum to 1 (the constraint is visible in the attention sketch below).
* '''Quadratic scaling''' – Self-attention is O(n²) in sequence length, making very long documents expensive. Mitigations include FlashAttention, sliding-window attention (Longformer), and linear-attention approximations (see the sliding-window mask sketch below).
* '''Position generalization''' – Transformers trained with absolute positional encodings often fail on sequences longer than those seen in training. RoPE (Rotary Position Embeddings) and ALiBi improve this (a RoPE sketch follows below).
* '''Repetition''' – Decoder-only models can fall into repetitive loops; nucleus sampling (top-p) and repetition penalties help (a top-p sketch closes this section).
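To make the first table row and the attention-sink bullet concrete, here is a minimal NumPy sketch of single-head scaled dot-product self-attention. The function and variable names are illustrative, not from any particular library: the (n, n) score matrix is where the quadratic cost comes from, and the softmax normalization of each row is the constraint behind attention sinks.

<syntaxhighlight lang="python">
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention (illustrative).

    x: (n, d) sequence of n token embeddings.
    The (n, n) score matrix is the source of the O(n^2)
    memory and compute cost noted in the table.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (n, n): quadratic in n
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # each row sums to 1
    return weights @ v

rng = np.random.default_rng(0)
n, d = 8, 16
x = rng.normal(size=(n, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) * d**-0.5 for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (8, 16)
</syntaxhighlight>

Doubling n quadruples the size of <code>scores</code>; that growth is the wall the mitigations in the bullet list attack.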
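The sliding-window mitigation can be sketched as an attention mask. This is a hand-rolled illustration of the idea, not Longformer's actual API (the helper name <code>sliding_window_mask</code> is mine), assuming a causal, local pattern: each query attends to at most its <code>window</code> most recent positions, so masked attention costs O(n·w) rather than O(n²).

<syntaxhighlight lang="python">
import numpy as np

def sliding_window_mask(n, window):
    """Boolean (n, n) mask: position i may attend to positions
    j in (i - window, i], i.e. causal and local. Each row has at
    most `window` True entries, giving O(n * window) attention."""
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    return (j <= i) & (j > i - window)

print(sliding_window_mask(6, 3).astype(int))
</syntaxhighlight>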
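For position generalization, below is a compact sketch of rotary position embeddings under the commonly cited formulation (base 10000, rotation applied to interleaved dimension pairs). It is illustrative only; real implementations vary in pairing convention and caching, and the function name is mine.

<syntaxhighlight lang="python">
import numpy as np

def rope(x, base=10000.0):
    """Rotary position embeddings for x of shape (n, d), d even.

    Each dimension pair (2i, 2i+1) is rotated by an angle proportional
    to the token position, so dot products between rotated queries and
    keys depend only on relative position.
    """
    n, d = x.shape
    pos = np.arange(n)[:, None]                   # (n, 1) positions
    freqs = base ** (-np.arange(0, d, 2) / d)     # (d/2,) per-pair rates
    angles = pos * freqs                          # (n, d/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

print(rope(np.ones((4, 8)))[0, :4])
</syntaxhighlight>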
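Finally, a sketch of nucleus (top-p) sampling, the repetition mitigation named in the last bullet: keep the smallest set of tokens whose cumulative probability reaches p, renormalize within that set, and sample. The function name and example logits are illustrative.

<syntaxhighlight lang="python">
import numpy as np

def nucleus_sample(logits, p=0.9, rng=None):
    """Sample from the smallest token set with cumulative probability
    >= p; truncating the tail makes degenerate repetition loops less
    likely than greedy decoding."""
    rng = rng or np.random.default_rng()
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]          # tokens by descending prob
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1     # size of the nucleus
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()   # renormalize inside nucleus
    return rng.choice(keep, p=kept)

logits = np.array([2.0, 1.5, 0.2, -1.0, -3.0])
print(nucleus_sample(logits, p=0.9, rng=np.random.default_rng(0)))
</syntaxhighlight>

In practice this is combined with a repetition penalty that down-weights logits of recently generated tokens before sampling.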