AAA (All About AI)

    (Presentation Slides) Negative Sampling & Hierarchical Softmax

    Paper presentation materials for "Deep Learning for Natural Language Processing" (Department of Artificial Intelligence major course)


    Seunghan Lee


    Deep Learning, Data Science, Statistics

    • Seoul, S.Korea
    • Email
    • GitHub

    Negative Sampling & Hierarchical Softmax

    Paper presentation materials from the "Deep Learning for Natural Language Processing" class session held on 2021/03/16.

    Paper : Distributed Representations of Words and Phrases and their Compositionality
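    As a quick refresher on the two techniques the slides cover, here is a minimal NumPy sketch of my own (an illustration, not code from the paper or the slides): the skip-gram negative-sampling loss, and the hierarchical-softmax probability of a word as a product of sigmoid branch decisions along its Huffman-tree path. All vectors, dimensions, and the branch signs below are made-up toy inputs.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def negative_sampling_loss(v_in, v_pos, v_negs):
        """Skip-gram negative-sampling objective (Mikolov et al., 2013):
        maximize  log s(v_pos . v_in) + sum_k log s(-v_neg_k . v_in),
        returned here as a loss to minimize (the negated objective)."""
        pos_term = np.log(sigmoid(v_pos @ v_in))
        neg_term = np.sum(np.log(sigmoid(-(v_negs @ v_in))))
        return -(pos_term + neg_term)

    def hierarchical_softmax_prob(v_in, node_vecs, signs):
        """P(word | context) as a product of sigmoids over the internal
        nodes on the word's Huffman-tree path; `signs` holds +1/-1 for
        the left/right branch taken at each node (toy encoding)."""
        return np.prod(sigmoid(signs * (node_vecs @ v_in)))

    # Toy example with small random vectors (illustration only).
    rng = np.random.default_rng(0)
    d, k, depth = 50, 5, 4               # embedding dim, negatives, tree depth
    v_in = 0.1 * rng.normal(size=d)      # input ("center") word vector
    v_pos = 0.1 * rng.normal(size=d)     # true context word vector
    v_negs = 0.1 * rng.normal(size=(k, d))      # k sampled negative vectors
    node_vecs = 0.1 * rng.normal(size=(depth, d))  # internal-node vectors
    signs = np.array([1.0, -1.0, 1.0, 1.0])        # branch directions

    loss = negative_sampling_loss(v_in, v_pos, v_negs)
    prob = hierarchical_softmax_prob(v_in, node_vecs, signs)
    ```

    Both tricks avoid normalizing over the full vocabulary: negative sampling scores only 1 + k word pairs, while hierarchical softmax scores only the O(log V) nodes on one tree path.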

    Tags: Hierarchical Softmax, Negative Sampling, NLP, Word2vec

    Categories: NLP, PPT

    Updated: May 3, 2021



    © 2025 Seunghan Lee. Powered by Jekyll & Minimal Mistakes.