From 1b4715ee95f77c7aa3e77fb499572ee1dec8ec00 Mon Sep 17 00:00:00 2001
From: Matsumoto Ruko
Date: Tue, 9 Jul 2024 11:54:26 +0800
Subject: [PATCH] [Docathon][Add Overview Doc No.19-21] add doc of docathon 19-21 (#6504)

* add doc of docathon 19-21

* fix some issues
---
 docs/api/paddle/nn/Overview_cn.rst | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/docs/api/paddle/nn/Overview_cn.rst b/docs/api/paddle/nn/Overview_cn.rst
index bee31da0886..ee40272f162 100644
--- a/docs/api/paddle/nn/Overview_cn.rst
+++ b/docs/api/paddle/nn/Overview_cn.rst
@@ -205,6 +205,7 @@ Transformer related
     " :ref:`paddle.nn.MultiHeadAttention ` ", "Multi-head attention mechanism"
     " :ref:`paddle.nn.functional.scaled_dot_product_attention ` ", "Dot-product attention mechanism, with scaling of the attention weights added on top"
+    " :ref:`paddle.nn.functional.sparse_attention ` ", "Sparse version of the attention API; sparsifies the attention matrix in the Transformer module to reduce memory consumption and computation"
     " :ref:`paddle.nn.Transformer ` ", "Transformer model"
     " :ref:`paddle.nn.TransformerDecoder ` ", "Transformer decoder"
     " :ref:`paddle.nn.TransformerDecoderLayer ` ", "Transformer decoder layer"
@@ -264,6 +265,7 @@ Loss layers
     " :ref:`paddle.nn.CrossEntropyLoss ` ", "Cross-entropy loss layer"
     " :ref:`paddle.nn.CTCLoss ` ", "CTCLoss layer"
     " :ref:`paddle.nn.HSigmoidLoss ` ", "Hierarchical sigmoid loss layer"
+    " :ref:`paddle.nn.HingeEmbeddingLoss ` ", "HingeEmbeddingLoss layer"
     " :ref:`paddle.nn.KLDivLoss ` ", "Kullback-Leibler divergence loss layer"
     " :ref:`paddle.nn.L1Loss ` ", "L1 loss layer"
     " :ref:`paddle.nn.MarginRankingLoss ` ", "MarginRankingLoss layer"
@@ -271,6 +273,7 @@ Loss layers
     " :ref:`paddle.nn.NLLLoss ` ", "NLLLoss layer"
     " :ref:`paddle.nn.GaussianNLLLoss ` ", "GaussianNLLLoss layer"
     " :ref:`paddle.nn.PoissonNLLLoss `", "PoissonNLLLoss layer"
+    " :ref:`paddle.nn.RNNTLoss `", "RNNTLoss layer"
     " :ref:`paddle.nn.SmoothL1Loss ` ", "Smooth L1 loss layer"
     " :ref:`paddle.nn.SoftMarginLoss ` ", "SoftMarginLoss layer"
     " :ref:`paddle.nn.TripletMarginLoss ` ", "TripletMarginLoss layer"
@@ -516,6 +519,8 @@ Embedding-related functions
     " :ref:`paddle.nn.functional.triplet_margin_loss ` ", "Computes TripletMarginLoss"
     " :ref:`paddle.nn.functional.triplet_margin_with_distance_loss ` ", "Computes the triplet margin loss with a user-defined distance function"
     " :ref:`paddle.nn.functional.multi_label_soft_margin_loss ` ", "Computes the hinge loss for multi-class classification"
+    " :ref:`paddle.nn.functional.hinge_embedding_loss ` ", "Computes the `hinge embedding loss` between the input and the label (containing 1 and -1)"
+    " :ref:`paddle.nn.functional.rnnt_loss ` ", "Computes the RNNT loss, also called softmax with RNNT"
     " :ref:`paddle.nn.functional.multi_margin_loss ` ", "Computes the multi margin loss"
     " :ref:`paddle.nn.functional.adaptive_log_softmax_with_loss ` ", "Adaptive log softmax with loss function"
@@ -548,6 +553,7 @@ Embedding-related functions
     " :ref:`paddle.nn.functional.temporal_shift ` ", "Shifts the input X along the temporal dimension T; the operation used in TSM"
     " :ref:`paddle.nn.functional.upsample ` ", "Resizes the images in a batch"
     " :ref:`paddle.nn.functional.class_center_sample ` ", "Class-center sampling for PartialFC"
+    " :ref:`paddle.nn.functional.channel_shuffle ` ", "Splits the channels of a Tensor of shape [N, C, H, W] or [N, H, W, C] into g groups, giving a Tensor of shape [N, g, C/g, H, W] or [N, H, W, g, C/g]; transposes it to [N, C/g, g, H, W] or [N, H, W, C/g, g]; and finally rearranges it back to the original shape"

 .. _about_initializer:
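The hinge embedding loss added above can be illustrated with a short sketch. This assumes the standard definition (for label 1 the loss is the input itself; for label -1 it is max(0, margin - input), averaged by default); the helper `hinge_embedding_loss_ref` is a hypothetical NumPy reference for illustration, not the Paddle API:

```python
import numpy as np

def hinge_embedding_loss_ref(x, label, margin=1.0):
    # Standard hinge embedding loss (assumption, mirroring the documented
    # behaviour): x where label == 1, max(0, margin - x) where label == -1,
    # reduced with a mean.
    loss = np.where(label == 1, x, np.maximum(0.0, margin - x))
    return loss.mean()

x = np.array([0.5, 2.0])
label = np.array([1, -1])
loss = hinge_embedding_loss_ref(x, label)  # (0.5 + max(0, 1 - 2.0)) / 2 = 0.25
```

Inputs with label -1 contribute nothing once they exceed the margin, which is what pushes embeddings of dissimilar pairs apart.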
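The channel_shuffle entry describes a split/transpose/reshape sequence; as a sanity check on that description, here is a minimal NumPy sketch for NCHW input (the helper name `channel_shuffle_ref` is hypothetical and stands in for `paddle.nn.functional.channel_shuffle`):

```python
import numpy as np

def channel_shuffle_ref(x, groups):
    # [N, C, H, W] -> [N, g, C/g, H, W] -> transpose axes 1 and 2
    # -> [N, C/g, g, H, W] -> reshape back to [N, C, H, W],
    # following the documented split/transpose/rearrange sequence.
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)

x = np.arange(4).reshape(1, 4, 1, 1)      # channels [0, 1, 2, 3]
y = channel_shuffle_ref(x, groups=2)      # channels interleaved: [0, 2, 1, 3]
```

The effect is to interleave channels across the g groups so that a following grouped convolution sees information from every group.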