# Self-Attention: Relating Different Parts Of A Sequence

Ever wondered how a machine can understand the context of a word within a sentence, just like you do? How does it know that "bank" refers to a financial institution in one sentence and to the side of a river in another? The secret lies in a revolutionary mechanism called **Self-Attention**. This article dives deep into the core of self-attention, a fundamental building block of [The Transformer Architecture](https://techielearn.com/learn/nat