Category: Data Science / Concepts : NLP (6 posts)
욱이의 냉철한 공부 (Wook's blog)
* Word Representation perspective (Word Embedding)
  1. Discrete Representation: Local Representation
    1) One-hot Vector
    2) Count Based: Bag of Words (BoW), Document-Term Matrix (DTM) / Term-Document Matrix (TDM), Term Frequency-Inverse Document Frequency (TF-IDF), N-gram Language Model (N-gram)
  2. Continuous Representation
    1) Prediction Based (Distributed Representation): Neural Network Language ..
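The discrete, count-based representations listed above can be sketched in a few lines of plain Python; the vocabulary and token lists here are invented purely for illustration:

```python
from collections import Counter

def one_hot(word, vocab):
    """One-hot vector: 1 at the word's vocabulary index, 0 everywhere else."""
    return [1 if w == word else 0 for w in vocab]

def bag_of_words(tokens, vocab):
    """Count-based representation: how often each vocabulary word occurs."""
    counts = Counter(tokens)
    return [counts[w] for w in vocab]

vocab = ["i", "like", "nlp", "deep", "learning"]
print(one_hot("nlp", vocab))                           # [0, 0, 1, 0, 0]
print(bag_of_words(["i", "like", "nlp", "i"], vocab))  # [2, 1, 1, 0, 0]
```

Stacking such bag-of-words rows for several documents yields exactly the Document-Term Matrix (DTM) named in the outline.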

* Word Representation taxonomy
  1. Discrete Representation: Local Representation
    1) One-hot Vector
    2) Count Based: Bag of Words (BoW), Document-Term Matrix (DTM) / Term-Document Matrix (TDM), Term Frequency-Inverse Document Frequency (TF-IDF), N-gram Language Model (N-gram)
  2. Continuous Representation
    1) Prediction Based (Distributed Representation): Neural Network Language Model (NNLM) or..

* Sources and reference papers
  - Paper: GloVe: 2014, Global Vectors for Word Representation
* Word Representation taxonomy
  1. Discrete Representation: Local Representation
    1) One-hot Vector
    2) Count Based: Bag of Words (BoW), Document-Term Matrix (DTM) / Term-Document Matrix (TDM), Term Frequency-Inverse Document Frequency (TF-IDF), N-gram Language Model (N-gram)
  2. Continuous Representation
    1) Prediction Base..

* Sources and reference papers
  - Lecture: Coursera, Prof. Andrew Ng's online course
  - Paper: Word2Vec: 2013, Efficient Estimation of Word Representations in Vector Space
* Word Representation perspective: building Word Embeddings
  1. Discrete Representation: Local Representation
    1) One-hot Vector
    2) Count Based: Bag of Words (BoW), Document-Term Matrix (DTM) / Term-Document Matrix (TDM), Term Frequency-Inverse Document Frequency (TF-IDF), N-..
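Word2Vec's skip-gram model, introduced in the 2013 paper cited above, learns embeddings from (center, context) word pairs. A minimal sketch of how such pairs are generated from a token sequence (the window size and sentence are illustrative):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in word2vec's skip-gram model."""
    pairs = []
    for i, center in enumerate(tokens):
        # Every token within `window` positions of the center is a context word.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

A neural network is then trained to predict the context word from the center word; the trained input weights become the dense word vectors.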

* Sources and reference papers
  - Lecture: Coursera, Prof. Andrew Ng's online course
  - Paper: NPLM: A Neural Probabilistic Language Model
* Word Representation taxonomy
  1. Discrete Representation: Local Representation
    1) One-hot Vector
    2) Count Based: Bag of Words (BoW), Document-Term Matrix (DTM) / Term-Document Matrix (TDM), Term Frequency-Inverse Document Frequency (TF-IDF), N-gram Language Model (N-gram)
  2. Continuous Repre..

* Sources and reference papers
  - Lecture: Coursera, Prof. Andrew Ng's online course
  - Paper: Linguistic Regularities in Continuous Space Word Representations
* Contents: Introduction to Word Embedding
  1. Word Representation (Sparse Representation, Local Representation) -> Word Embedding (Dense Representation, Distributed Representation)
  2. Using word embeddings
  3. Properties of word embeddings
  4. Embedding matrix
  1. Word Representation (Sp..
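The "Linguistic Regularities" paper cited above observed that analogies such as king - man + woman ≈ queen emerge from simple vector arithmetic on learned embeddings. A toy illustration with hand-made 2-dimensional vectors (the numbers are invented solely to make the property visible; real embeddings have hundreds of dimensions):

```python
def add(u, v): return [a + b for a, b in zip(u, v)]
def sub(u, v): return [a - b for a, b in zip(u, v)]

def cos(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

# Toy 2-d embeddings: dimension 0 ≈ "royalty", dimension 1 ≈ "gender".
emb = {
    "king":  [0.9,  0.8],
    "queen": [0.9, -0.8],
    "man":   [0.1,  0.8],
    "woman": [0.1, -0.8],
}

# Analogy query: king - man + woman, then find the nearest other word.
query = add(sub(emb["king"], emb["man"]), emb["woman"])
best = max((w for w in emb if w != "king"), key=lambda w: cos(emb[w], query))
print(best)  # queen
```

The subtraction cancels the shared "royalty" component of king and man, and adding woman flips the gender dimension, landing the query vector on queen.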