Positional Encoding

Positional Encoding: Also Known As Positional Embedding

Positional encoding is a simple way to let the model know where each word is placed in a sentence. The model needs the hint because it looks at all of the words at once rather than reading them one by one, so without extra position information it has no built-in sense of word order.

Here's an example:

"Darth Vader swung the lightsaber."

Now here's what might happen without positional encoding:

"lightsaber swung the Darth Vader."

Now, as funny as that would be (and you could probably get a model to write a story about a lightsaber swinging Sith Lords and Jedi around), it doesn't make much sense.
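
To see why the model is blind to order in the first place, here's a tiny NumPy sketch of bare self-attention, with the learned projections a real model would have stripped away. Shuffle the words and the outputs come back shuffled but otherwise identical, so attention alone can't tell the two sentences apart:

```python
import numpy as np

def self_attention(x):
    # Every word mixes with every other word, weighted by similarity.
    # Nothing here knows which position any word is in.
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return weights @ x

rng = np.random.default_rng(0)
words = rng.normal(size=(5, 8))      # 5 word embeddings, toy size 8

shuffle = [4, 2, 1, 0, 3]            # scramble the sentence
out_original = self_attention(words)
out_shuffled = self_attention(words[shuffle])

# Same vectors, just reordered: the model sees no difference
# between the two word orders.
print(np.allclose(out_original[shuffle], out_shuffled))  # True
```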

To make sure the placement of words stays stable throughout a generation, positional encoding holds each word in place by adding a vector to its embedding that depends on where in the sentence the word sits. The same word at two different positions now looks different to the model.
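
One classic way to build those position vectors is the sinusoidal scheme from the original Transformer paper, "Attention Is All You Need", where each position gets its own fixed pattern of sine and cosine values. Here's a minimal NumPy sketch; the function name and toy sizes are just for illustration:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # positions: a column vector [0, 1, ..., seq_len - 1]
    positions = np.arange(seq_len)[:, np.newaxis]
    # one frequency per pair of dimensions, from fast-changing to very slow
    dims = np.arange(0, d_model, 2)[np.newaxis, :]
    angles = positions / np.power(10000.0, dims / d_model)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)   # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)   # odd dimensions get cosine
    return pe

# Add each position's vector to the matching word embedding, so
# "lightsaber" at position 4 no longer looks identical to
# "lightsaber" at position 0.
words = np.random.randn(5, 16)     # 5 words, toy embedding size 16
words_with_positions = words + positional_encoding(5, 16)
```

Because nearby positions get similar patterns, the model can also pick up on relative distances between words, not just absolute slots. Plenty of newer models swap this scheme for learned or rotary position embeddings, but the job is the same: keep Darth Vader in front of the lightsaber.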