Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs
Spanish word embeddings computed with different methods and from different corpora
Using pre-trained word embeddings (fastText, Word2Vec)
CNN-based model for aspect extraction from restaurant reviews, based on pre-trained word embeddings and part-of-speech tagging
Taking a pretrained GloVe model and using it as a TensorFlow embedding weight layer that lives on the GPU. Only the word indices need to be sent over the GPU data transfer bus, reducing data transfer overhead.
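A minimal sketch of this idea, assuming a plain-text GloVe file and tf.keras; the vocabulary, file path, and dimensions below are placeholders, not taken from the repository:

```python
import numpy as np
import tensorflow as tf

# Hypothetical vocabulary mapping words to integer indices (index 0 reserved for padding).
vocab = {"<pad>": 0, "the": 1, "movie": 2, "was": 3, "great": 4}
embedding_dim = 100  # must match the GloVe file used

# Load pretrained GloVe vectors from a standard "word v1 v2 ..." text file.
embedding_matrix = np.zeros((len(vocab), embedding_dim), dtype="float32")
with open("glove.6B.100d.txt", encoding="utf-8") as f:  # path is an assumption
    for line in f:
        word, *values = line.rstrip().split(" ")
        if word in vocab:
            embedding_matrix[vocab[word]] = np.asarray(values, dtype="float32")

# The embedding matrix is uploaded once as layer weights; afterwards only
# integer word indices cross the host-to-device transfer bus.
embedding_layer = tf.keras.layers.Embedding(
    input_dim=len(vocab),
    output_dim=embedding_dim,
    embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
    trainable=False,  # keep the GloVe vectors frozen
    mask_zero=True,
)

indices = tf.constant([[1, 2, 3, 4]])  # a batch of word indices
vectors = embedding_layer(indices)     # shape (1, 4, 100), embedded on device
print(vectors.shape)
```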
Fully batched seq2seq example based on practical-pytorch, with extra features added.
GloVe word vector embedding experiments (similar to Word2Vec)
This sentiment analysis project determines whether Turkish-language tweets posted on Twitter are positive or negative.
A smart question-answering system for both short and long documents. It can automatically find answers to matching questions directly from the documents: a deep learning language model converts the questions and documents into semantic vectors, which are compared to find the matching answer.
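The repository's actual model is not shown here; as an illustration of semantic-vector matching, here is a sketch using averaged GloVe vectors and cosine similarity (file path and example passages are placeholders):

```python
import numpy as np

def load_glove(path):
    """Load a plain-text GloVe file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            vectors[word] = np.asarray(values, dtype="float32")
    return vectors

def embed(text, vectors):
    """Average the word vectors of a text; crude, but illustrates the idea."""
    words = [w for w in text.lower().split() if w in vectors]
    if not words:
        return np.zeros(next(iter(vectors.values())).shape, dtype="float32")
    return np.mean([vectors[w] for w in words], axis=0)

def best_passage(question, passages, vectors):
    """Return the passage whose semantic vector is closest to the question's."""
    q = embed(question, vectors)
    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0
    return max(passages, key=lambda p: cosine(q, embed(p, vectors)))

glove = load_glove("glove.6B.100d.txt")  # path is an assumption
passages = ["the library opens at nine in the morning",
            "tickets can be refunded within thirty days"]
print(best_passage("when does the library open", passages, glove))
```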
NLP tutorial
A TensorFlow 2 / Keras implementation of POS tagging using a bidirectional LSTM-CRF on the Penn Treebank (WSJ) corpus
Sentiment analysis using LSTM cells in recurrent networks. GloVe word embeddings were used for the vector representation of words, and Amazon product reviews were used as the dataset.
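A minimal Keras sketch of such a classifier; the layer sizes are illustrative assumptions, and the embedding layer would be initialised from a GloVe matrix as in the earlier sketch:

```python
import tensorflow as tf

# Illustrative sizes; the repository's actual hyperparameters are not specified here.
vocab_size, embedding_dim, max_len = 20000, 100, 200

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    # In the real setup this Embedding would be initialised with GloVe weights.
    tf.keras.layers.Embedding(vocab_size, embedding_dim, mask_zero=True),
    tf.keras.layers.LSTM(128),                       # recurrent encoder over the review
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive/negative probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```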
Fake news generator and detector using keras
Machine learning pipeline to predict relevant categories based on a product name and product description, using NLP and deep learning concepts.
A neural network-based AI chatbot designed with LSTMs as the training model for both encoding and decoding. The chatbot works as an open-domain chatbot that can answer the day-to-day questions involved in human conversations. Word embeddings are the most important part of designing a neural network-based chatbot. GloVe word embedding…
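A minimal encoder-decoder sketch in Keras, assuming integer-encoded question/answer sequences; all sizes and names are illustrative, and the GloVe embedding matrix would be plugged into the Embedding layers as in the earlier sketch:

```python
import tensorflow as tf

vocab_size, embedding_dim, units = 10000, 100, 256  # illustrative sizes

# Encoder: embeds the input question and keeps the final LSTM states.
encoder_inputs = tf.keras.Input(shape=(None,), name="question_tokens")
enc_emb = tf.keras.layers.Embedding(vocab_size, embedding_dim, mask_zero=True)(encoder_inputs)
_, state_h, state_c = tf.keras.layers.LSTM(units, return_state=True)(enc_emb)

# Decoder: generates the answer, initialised with the encoder states.
decoder_inputs = tf.keras.Input(shape=(None,), name="answer_tokens")
dec_emb = tf.keras.layers.Embedding(vocab_size, embedding_dim, mask_zero=True)(decoder_inputs)
dec_out, _, _ = tf.keras.layers.LSTM(
    units, return_sequences=True, return_state=True
)(dec_emb, initial_state=[state_h, state_c])
outputs = tf.keras.layers.Dense(vocab_size, activation="softmax")(dec_out)

model = tf.keras.Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```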
A private, free, open-source search engine built on a P2P network
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Chapter 11: Transfer Learning/Domain Adaptation
This GitHub repository provides an implementation of the paper "MAGNET: Multi-Label Text Classification using Attention-based Graph Neural Network". MAGNET is a state-of-the-art approach for multi-label text classification, leveraging the power of graph neural networks (GNNs) and attention mechanisms.
[not maintained anymore] [for study purposes] A simple PyTorch implementation of "Global Vectors for Word Representation".