GMU:The Hidden Layer:Topics

From Medien Wiki
Revision as of 10:09, 8 May 2017

==General Information on word embeddings==
For a general explanation look here: [1]

==Word2vec==
Made by Google, uses a neural net, performs well on semantics.

=== Installation + getting started ===
<code>pip install gensim</code><br>
Here are some of the things you can do with the model: [http://textminingonline.com/getting-started-with-word2vec-and-glove-in-python]<br>
Here is a bit of background information and an explanation of how to train your own models: [https://rare-technologies.com/word2vec-tutorial/].


==fastText==

Made by Facebook, based on word2vec. Better at capturing syntactic relations (e.g. apparent → apparently); see here: [4]. Pretrained model files are HUGE.
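fastText's edge on syntactic pairs comes from representing each word as a bag of character n-grams, so "apparent" and "apparently" share most of their subword units. A rough stdlib-only sketch of that idea (the 3–6 n-gram range matches fastText's defaults; the function name is made up):

```python
def char_ngrams(word, nmin=3, nmax=6):
    """Character n-grams of a word, padded with the boundary
    markers < and > the way fastText does."""
    padded = f"<{word}>"
    grams = set()
    for n in range(nmin, nmax + 1):
        for i in range(len(padded) - n + 1):
            grams.add(padded[i:i + n])
    return grams

a = char_ngrams("apparent")
b = char_ngrams("apparently")

# The two words share most of their subword units, so their
# embeddings (sums of n-gram vectors) end up close together.
print(len(a & b) / len(a))
```

This is also why fastText can produce vectors for words it never saw in training: an unseen word still decomposes into known n-grams.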

==GloVe==

Pre-trained models are available.
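The pre-trained GloVe files are plain text: one word per line, followed by its vector components separated by spaces. A small stdlib-only loader sketch — the sample lines are fabricated 4-dimensional entries just to show the format (real files such as glove.6B.50d.txt have 50 or more components per line):

```python
def load_glove(lines):
    """Parse GloVe's text format: 'word v1 v2 ...' per line."""
    vectors = {}
    for line in lines:
        parts = line.rstrip().split(" ")
        vectors[parts[0]] = [float(x) for x in parts[1:]]
    return vectors

# Fabricated sample; in practice pass an open file handle, e.g.
# load_glove(open("glove.6B.50d.txt", encoding="utf8"))
sample = [
    "king 0.1 0.3 -0.2 0.7",
    "queen 0.1 0.4 -0.1 0.8",
]
vecs = load_glove(sample)
print(sorted(vecs))       # ['king', 'queen']
print(len(vecs["king"]))  # 4
```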