GMU:The Hidden Layer:Topics

For a general explanation look here: [https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/ The amazing power of word vectors]
Some word vector algorithms:
==Word2vec==
Made by Google; uses a neural network and performs well on semantics.
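
To make "performs well on semantics" concrete, here is a minimal sketch of querying word2vec vectors. It assumes the gensim library and its downloader module, neither of which is mentioned on this page; the downloader fetches the pretrained Google News model (about 1.6 GB) on first use.

<syntaxhighlight lang="python">
# Minimal sketch (assumes gensim is installed: pip install gensim).
import gensim.downloader as api

# Download/load pretrained word2vec vectors trained on Google News.
model = api.load("word2vec-google-news-300")

# The classic semantic analogy:
# vector("king") - vector("man") + vector("woman") is close to vector("queen").
print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# Cosine similarity between two words.
print(model.similarity("coffee", "tea"))
</syntaxhighlight>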
Where to get pretrained models:
* [https://github.com/3Top/word2vec-api#where-to-get-a-pretrained-models https://github.com/3Top/word2vec-api]: mostly GloVe, some word2vec; English; trained on news, Wikipedia, Twitter
* [https://github.com/facebookresearch/fastText/blob/master/pretrained-vectors.md https://github.com/facebookresearch/fastText/blob/master/pretrained-vectors.md]: fastText, all imaginable languages, trained on Wikipedia
* [https://radimrehurek.com/gensim/scripts/glove2word2vec.html https://radimrehurek.com/gensim/scripts/glove2word2vec.html]: convert GloVe vectors to the word2vec format (see the sketch after this list)
* [https://levyomer.wordpress.com/2014/04/25/dependency-based-word-embeddings/ https://levyomer.wordpress.com/2014/04/25/dependency-based-word-embeddings/]: an interesting approach that gives similarities between syntactically equivalent words
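
As a sketch of how the conversion script linked above is used in practice: the following assumes gensim and a downloaded GloVe file. The filename glove.6B.100d.txt is one of the standard Stanford GloVe downloads and is used here only as an illustration; it is not from this page.

<syntaxhighlight lang="python">
# Minimal sketch: load GloVe vectors with gensim by first converting them
# to word2vec's text format using the glove2word2vec script linked above.
from gensim.scripts.glove2word2vec import glove2word2vec
from gensim.models import KeyedVectors

# Illustrative filenames; replace with your own paths.
glove2word2vec("glove.6B.100d.txt", "glove.6B.100d.w2v.txt")
vectors = KeyedVectors.load_word2vec_format("glove.6B.100d.w2v.txt", binary=False)

# GloVe 6B has a lowercased vocabulary.
print(vectors.most_similar("berlin", topn=5))
</syntaxhighlight>

The same load_word2vec_format call also reads the plain-text .vec files offered on the fastText page above, since they use the word2vec text format.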