This repository contains various types of attention mechanisms, such as Bahdanau, Soft Attention, Additive Attention, and Hierarchical Attention, implemented in PyTorch, TensorFlow, and Keras.
TensorFlow 2.0 tutorials on RNN-based architectures for textual problems.
My current project is about building a chatbot with deep neural networks. This repo shows my chatbot code for deployment purposes.
Image captioning is the process of generating a textual description of an image. It combines Natural Language Processing and Computer Vision to generate the captions.
A Seq2Seq model implemented in PyTorch, using Bahdanau attention and Luong attention.
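The two attention variants named above differ mainly in their score functions: Luong attention scores encoder states multiplicatively against the current decoder state, while Bahdanau attention uses an additive feed-forward network. A minimal NumPy sketch (not taken from any of the listed repositories; all array and weight names are illustrative):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def luong_dot_score(h_dec, H_enc):
    # Luong (multiplicative, "dot" variant): score(h_t, h_s) = h_t . h_s
    # h_dec: (d,) decoder state; H_enc: (T, d) encoder states -> (T,) scores
    return H_enc @ h_dec

def bahdanau_score(h_dec, H_enc, W1, W2, v):
    # Bahdanau (additive): score(h_t, h_s) = v . tanh(W1 h_t + W2 h_s)
    # W1, W2: (k, d) projections; v: (k,) -> (T,) scores
    return np.tanh(h_dec @ W1.T + H_enc @ W2.T) @ v

def attend(scores, H_enc):
    # Context vector = attention-weighted sum of encoder states.
    w = softmax(scores)
    return w @ H_enc, w
```

In a full decoder these scores are recomputed at every output step, and the context vector is concatenated with the decoder state before prediction.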
A simple, easy-to-understand NLP tutorial.
Sequence-to-sequence with attention mechanisms in TensorFlow v2.
This repository contains TensorFlow/Keras models for implementing an Encoder-Decoder architecture for sequence-to-sequence tasks. It includes components such as Encoder, Decoder, Embedding Layer, LSTM Layer, Attention Mechanism, and more.
Successfully established a Seq2Seq-with-attention model that performs English-to-Spanish translation with an accuracy of almost 97%.
Successfully established a text summarization model using Seq2Seq modeling with Luong attention, which produces short, concise summaries of global news headlines.