Natural Language Processing

Positional encoding generates embeddings using sine and cosine functions of different frequencies, which allows the model to learn the relative positions of words: the encoding of position pos + k can be expressed as a linear function of the encoding of position pos.
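As a minimal sketch, the sinusoidal scheme from the original Transformer paper assigns sin to even dimensions and cos to odd dimensions, with wavelengths forming a geometric progression:

```python
import numpy as np

def positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encodings of shape (max_len, d_model)."""
    pos = np.arange(max_len)[:, None]       # positions: (max_len, 1)
    i = np.arange(d_model)[None, :]         # dimensions: (1, d_model)
    # Frequency depends on the dimension pair index (i // 2).
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])    # even dims use sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])    # odd dims use cosine
    return pe

pe = positional_encoding(50, 16)
```

Each row is the encoding added to the token embedding at that position; at position 0 the even dimensions are sin(0) = 0 and the odd ones cos(0) = 1.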
