Class

Revision as of 11:53, 23 October 2018

https://courses.nvidia.com/dashboard

Linguistic Concepts

  • coreference - anaphors
  • gang of four design
  • null subject
  • recursion

Word Embeddings

  • HMMs, CRFs, PGMs
    • CBoW - Bag of Words / n-grams - one feature per word / n-gram
    • One-hot sparse input - create a vector the size of the entire vocabulary
  • Stop Words
  • TF-IDF (see the feature sketch after this list)
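
A minimal sketch of the bag-of-words and TF-IDF features above, assuming scikit-learn is available; the two-document corpus and variable names are illustrative, not from the course.

  # Bag-of-words and TF-IDF features over a toy corpus (illustrative only).
  from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

  corpus = [
      "stocks rally as markets open higher",
      "markets fall on weak earnings",
  ]

  # Bag of words: one count feature per vocabulary item, with stop words removed.
  bow = CountVectorizer(stop_words="english")
  X_bow = bow.fit_transform(corpus)          # sparse matrix, one row per document

  # TF-IDF: reweight the counts by inverse document frequency.
  tfidf = TfidfVectorizer(stop_words="english")
  X_tfidf = tfidf.fit_transform(corpus)

  print(bow.get_feature_names_out())
  print(X_tfidf.toarray().round(2))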

Word2Vec

Skip-Gram


  • Firth 1957 - Distributional Hypothesis
  • Word Cloud
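
A minimal Word2Vec skip-gram sketch, assuming the gensim library; the toy sentences and hyperparameters are illustrative.

  # Train a tiny skip-gram Word2Vec model with gensim (toy data, illustrative settings).
  from gensim.models import Word2Vec

  sentences = [
      ["the", "bank", "raised", "interest", "rates"],
      ["the", "central", "bank", "cut", "interest", "rates"],
  ]

  # sg=1 selects the skip-gram objective: predict context words from the centre word.
  model = Word2Vec(sentences, vector_size=50, window=2, sg=1, min_count=1, epochs=50)

  # Nearest neighbours in the learned embedding space.
  print(model.wv.most_similar("bank", topn=3))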

Text Classification

Text/Machine Translation (NMT)

Financial News

Tools:

  • GloVe (see the loading sketch after this list)
    • dot product
  • FastText
    • Skipgram
    • Continuous bag of words
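
A sketch of loading pretrained GloVe text vectors and comparing words by dot product / cosine similarity; the file path glove.6B.100d.txt and the example words are assumptions, not from the notes.

  # Load GloVe text vectors ("word v1 v2 ... vd" per line) and compare words.
  import numpy as np

  def load_glove(path, vocab=None):
      vectors = {}
      with open(path, encoding="utf-8") as f:
          for line in f:
              word, *values = line.rstrip().split(" ")
              if vocab is None or word in vocab:
                  vectors[word] = np.asarray(values, dtype=np.float32)
      return vectors

  # Path is an assumption; any GloVe release in plain-text format works the same way.
  emb = load_glove("glove.6B.100d.txt", vocab={"stock", "share", "banana"})

  def cosine(a, b):
      return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

  print(cosine(emb["stock"], emb["share"]))   # related words score higher
  print(cosine(emb["stock"], emb["banana"]))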

Multi-channel LSTM Network

Keras with TensorFlow

Utilize the GloVe and FastText skip-gram pretrained embeddings; this allows the underlying network to access a larger feature space to build complex features on top of.

Can utilize combinations of various corpora and embedding methods for better performance.

A bidirectional LSTM network is used to encode sequential information on top of the embedding layers.

A dense layer projects the final output classification.
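
A sketch of the multi-channel bidirectional LSTM classifier described above, using the Keras functional API with TensorFlow; the vocabulary size, sequence length, embedding matrices, LSTM width, and class count are placeholders rather than values from the notes.

  # Two-channel (GloVe + FastText) bidirectional LSTM text classifier (placeholder sizes).
  import numpy as np
  from tensorflow.keras import layers, models, initializers

  vocab_size, seq_len, emb_dim, num_classes = 20000, 100, 300, 3
  glove_matrix = np.zeros((vocab_size, emb_dim))     # stand-in for pretrained GloVe weights
  fasttext_matrix = np.zeros((vocab_size, emb_dim))  # stand-in for pretrained FastText weights

  def channel(weights):
      # One channel: frozen pretrained embedding followed by a bidirectional LSTM encoder.
      inp = layers.Input(shape=(seq_len,))
      emb = layers.Embedding(vocab_size, emb_dim,
                             embeddings_initializer=initializers.Constant(weights),
                             trainable=False)(inp)
      enc = layers.Bidirectional(layers.LSTM(64))(emb)
      return inp, enc

  glove_in, glove_enc = channel(glove_matrix)
  fasttext_in, fasttext_enc = channel(fasttext_matrix)

  # Combine both channels and project to the final output classification.
  merged = layers.concatenate([glove_enc, fasttext_enc])
  outputs = layers.Dense(num_classes, activation="softmax")(merged)

  model = models.Model(inputs=[glove_in, fasttext_in], outputs=outputs)
  model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
  model.summary()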