Sentence embedding

A sentence embedding is a technique in natural language processing in which sentences are mapped to vectors of real numbers, which can then be used for semantic similarity search. In transformer models this is usually achieved with a special classification token, typically the first token of a transformer encoder's hidden state, or by mean pooling over all tokens, taken from the last layer or from multiple layers.
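
As a concrete illustration, here is a minimal sketch of both approaches using the Hugging Face Transformers library. The checkpoint bert-base-uncased and the example sentences are illustrative assumptions; any BERT-style encoder works the same way.

  import torch
  from transformers import AutoTokenizer, AutoModel

  # bert-base-uncased is just an example; any encoder checkpoint works similarly.
  tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
  model = AutoModel.from_pretrained("bert-base-uncased")

  sentences = ["A robot may not injure a human being.",
               "Robots must not harm humans."]
  inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

  with torch.no_grad():
      hidden = model(**inputs).last_hidden_state       # (batch, seq, hidden)

  # Classification-token embedding: the hidden state of the first ([CLS]) token.
  cls_embeddings = hidden[:, 0]

  # Mean pooling: average hidden states over non-padding tokens only.
  mask = inputs["attention_mask"].unsqueeze(-1).float()
  mean_embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

  # Cosine similarity between the two sentence vectors.
  print(torch.nn.functional.cosine_similarity(
      mean_embeddings[0], mean_embeddings[1], dim=0).item())

Mean pooling is often preferred over the raw classification token because it uses information from every token, but which works better depends on the model and how it was trained.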

State of the art

As of October 2022, CLIP text embeddings from a 38M-parameter model have been found to outperform 110M-parameter BERT and Phrase-BERT models when using domain-aware prompting on sentences from news articles (CoNLL-2003), chemical-disease interactions (BC5CDR), and emerging and rare entity recognition (WNUT 2017).[1] Without domain-aware prompting, CLIP still outperformed the other models on sentences from news articles.
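
The sketch below shows how CLIP text embeddings with a domain-aware prompt might be extracted using the Hugging Face CLIP implementation. The checkpoint name and the prompt template are illustrative assumptions, not necessarily the exact ones used in the paper.

  import torch
  from transformers import CLIPModel, CLIPTokenizer

  model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
  tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")

  phrases = ["aspirin", "liver damage"]
  # Domain-aware prompting: wrap the phrase in a template naming its domain.
  # This template is an illustrative guess, not the paper's exact wording.
  prompts = [f"A photo of {p}, in the biomedical domain." for p in phrases]

  inputs = tokenizer(prompts, padding=True, return_tensors="pt")
  with torch.no_grad():
      embeds = model.get_text_features(**inputs)       # (batch, projection_dim)

  embeds = embeds / embeds.norm(dim=-1, keepdim=True)  # unit-normalize
  print(embeds @ embeds.T)                             # cosine similarity matrix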

Pretrained models

The Sentence Transformers library provides a variety of pretrained models for sentence embeddings.
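
A minimal usage sketch, assuming the library is installed (pip install sentence-transformers); all-MiniLM-L6-v2 is one of its commonly distributed pretrained models.

  from sentence_transformers import SentenceTransformer, util

  model = SentenceTransformer("all-MiniLM-L6-v2")
  embeddings = model.encode(["How do I bake bread?",
                             "What is a good bread recipe?"])
  # Cosine similarity between the two sentence embeddings.
  print(util.cos_sim(embeddings[0], embeddings[1]))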

References

  1. An Yan, Jiacheng Li, Wanrong Zhu, Yujie Lu, William Yang Wang, Julian McAuley (2022). "CLIP also Understands Text: Prompting CLIP for Phrase Understanding". arXiv:2210.05836