wmt/007-wmt-phase2-attention #3

@utterances-bot

SameTime WMT Series: Attention — From Temporal Chain to Fully-Connected Adjacency Graph | GrepCode

An RNN confines the relations between all words to a single temporal chain — Attention instead places each word in its own bucket and uses an adjacency matrix to preserve the fully-connected graph. SoftBLEU's next bottleneck: the vocabulary diluting the gradients.

https://www.grepcode.cn/wmt/007-wmt-phase2-attention.html
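The contrast in the teaser above can be sketched in a few lines: in self-attention the softmax weights form a dense N×N matrix in which every token is connected to every other token, rather than only to its temporal neighbor. This is a minimal NumPy sketch of standard scaled dot-product self-attention (with the queries, keys, and values all taken as the raw inputs for brevity), not code from the linked post:

```python
import numpy as np

def attention_adjacency(X):
    """Scaled dot-product self-attention over token matrix X (N x d).

    Every token attends to every other token, so the softmax weights
    form a dense N x N matrix — the 'fully-connected adjacency graph',
    as opposed to an RNN's single step-by-step temporal chain.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # N x N pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ X, weights                      # new reps, adjacency

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))        # 5 tokens, embedding dim 8
out, adj = attention_adjacency(X)
print(adj.shape)                   # (5, 5): one weight per word pair
```

Each row of `adj` sums to 1 and gives the attention distribution of one word over all words in the sentence, which is exactly the "edge weights of a fully-connected graph" framing.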
