This is a summary of the most widely used attention mechanisms, based on the paper below.

In addition to the Global/Local attention covered in the paper, this post introduces Soft/Hard attention
and compares Bahdanau vs. Luong attention, the former of which predates the paper below.
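As a rough illustration of the difference between the two, the score functions can be sketched in NumPy. This is a minimal sketch with toy dimensions; the weight names (`W_a`, `v_a`) follow the papers' notation, but all shapes and values here are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy hidden size (assumed for illustration)

h_t = rng.standard_normal(d)        # current decoder hidden state
h_s = rng.standard_normal((5, d))   # 5 encoder (source) hidden states

# Luong "general" (multiplicative) score: score(h_t, h_s) = h_t^T W_a h_s
W_a = rng.standard_normal((d, d))
score_luong = h_s @ W_a.T @ h_t     # shape (5,), one score per source position

# Bahdanau (additive) score: score(h_t, h_s) = v_a^T tanh(W [h_t; h_s])
W = rng.standard_normal((d, 2 * d))
v_a = rng.standard_normal(d)
concat = np.concatenate([np.tile(h_t, (5, 1)), h_s], axis=1)  # (5, 2d)
score_bahdanau = np.tanh(concat @ W.T) @ v_a                  # (5,)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Both variants turn scores into attention weights that sum to 1
a_luong = softmax(score_luong)
a_bahdanau = softmax(score_bahdanau)
```

The practical distinction is that Luong's multiplicative form needs a single matrix product, while Bahdanau's additive form passes the concatenated states through a small feed-forward layer before scoring.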

Attention Mechanism(Seq2Seq)

Introduction to Attention Mechanism (Neural Machine Translation / Python, Tensorflow)

Source: www.slideshare.net/healess/attention-mechanismseq2seq

Effective Approaches to Attention-based Neural Machine Translation

Comments: 11 pages, 7 figures, EMNLP 2015 camera-ready version, more training details
Subjects: Computation and Language (cs.CL)
Cite as: arXiv:1508.04025 [cs.CL] (or arXiv:1508.04025v5 [cs.CL] for this version)

The paper's source code was implemented in Python using TensorFlow v1.3.

healess/Attention

Attention based neural machine translation

Source: github.com/healess/Attention

Categories: Paper Study
