Transformer ("Attention Is All You Need") implementation in PyTorch
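Not the repository's own code, but a minimal PyTorch sketch of the scaled dot-product attention the paper builds on, run here on random toy tensors:

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in the paper."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)        # (..., seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v, weights

# Toy usage: batch of 2, 4 tokens, model dimension 8.
q = k = v = torch.randn(2, 4, 8)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)  # torch.Size([2, 4, 8]) torch.Size([2, 4, 4])
```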
JAX implementation of the bart-base model
JAX implementation of the T5 model: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
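Neither repository's code is shown here; as a rough point of reference, Hugging Face's Flax classes give a comparable JAX model in a few lines. The `t5-small` checkpoint and the translation prefix below are only example choices, and `FlaxBartForConditionalGeneration` is the analogous class for bart-base.

```python
# Assumes `transformers`, `jax`, and `flax` are installed and that the chosen
# checkpoint ships Flax weights (otherwise pass `from_pt=True`).
from transformers import AutoTokenizer, FlaxT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = FlaxT5ForConditionalGeneration.from_pretrained("t5-small")

# T5 casts every task as text-to-text, so the task is selected with a prefix.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="np")
output_ids = model.generate(inputs["input_ids"], max_length=40).sequences
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))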
Censored tweets annotated for specificity; AAAI 2019 paper: Predicting and Analyzing Language Specificity in Social Media Posts
An NLP algorithm I developed to determine the similarity or relatedness between two documents or Wikipedia articles. Inspired by the cosine similarity algorithm and built on WordNet.
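The project's exact scoring is not reproduced here; as a baseline, plain cosine similarity over bag-of-words term frequencies looks like the sketch below, with the WordNet-based expansion being what the algorithm adds on top.

```python
import math
from collections import Counter

def cosine_similarity(doc_a: str, doc_b: str) -> float:
    """Cosine of the angle between simple term-frequency vectors."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

print(cosine_similarity("the cat sat on the mat", "a cat sat on a mat"))
```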
Train a T5 model to generate simple fake news and use a RoBERTa model to classify what's fake and what's real.
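A sketch of the inference side only, assuming the `transformers` library; the checkpoint paths are placeholders for locally fine-tuned models, not artifacts published by the repository.

```python
from transformers import pipeline

generator = pipeline("text2text-generation", model="./t5-fake-news")       # placeholder fine-tuned T5
classifier = pipeline("text-classification", model="./roberta-fake-real")  # placeholder fine-tuned RoBERTa

headline = generator("headline: local election results", max_length=64)[0]["generated_text"]
print(headline, classifier(headline))
```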
An interactive application that leverages a pre-trained language model (GPT-2) to generate human-like text based on user prompts. The application can produce various types of text, such as stories, reports, or dialogues, making it a versatile tool for creative writing, brainstorming, or content generation.
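For reference, prompt-based generation with the stock pretrained GPT-2 from the `transformers` library takes only a few lines; the interactive UI around it is not shown here.

```python
from transformers import pipeline

# Downloads the pretrained GPT-2 checkpoint on first use.
generator = pipeline("text-generation", model="gpt2")
prompt = "Once upon a time in a small village,"
result = generator(prompt, max_length=60, num_return_sequences=1, do_sample=True)
print(result[0]["generated_text"])
```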
Advanced NLP model implementation in PyTorch featuring a transformer architecture, multi-head attention, and a comprehensive training pipeline with mixed precision, gradient accumulation, and dynamic learning rate scheduling.
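As an illustration of how those pieces typically fit together in PyTorch (not this repository's actual pipeline), the skeleton below uses a toy model, random data, and placeholder hyperparameters, and assumes a CUDA GPU.

```python
import torch

device = "cuda"
model = torch.nn.Linear(512, 512).to(device)                   # stand-in for the transformer
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
loader = [(torch.randn(8, 512), torch.randn(8, 512)) for _ in range(16)]
accum_steps = 4                                                # gradient accumulation factor
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=3e-4, total_steps=len(loader) // accum_steps)
scaler = torch.cuda.amp.GradScaler()

for step, (x, y) in enumerate(loader):
    with torch.cuda.amp.autocast():                            # mixed-precision forward pass
        loss = torch.nn.functional.mse_loss(model(x.to(device)), y.to(device))
    scaler.scale(loss / accum_steps).backward()                # accumulate scaled gradients
    if (step + 1) % accum_steps == 0:
        scaler.step(optimizer)                                 # unscales, then steps the optimizer
        scaler.update()
        optimizer.zero_grad()
        scheduler.step()                                       # dynamic learning-rate schedule
```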
Transform emoticons to text, e.g., :) => Smile.
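A minimal sketch of the idea with a small hand-written mapping; the real lookup table would be much larger.

```python
import re

EMOTICONS = {":)": "Smile", ":(": "Sad", ":D": "Laugh", ";)": "Wink"}

def emoticons_to_text(text: str) -> str:
    pattern = re.compile("|".join(re.escape(e) for e in EMOTICONS))
    return pattern.sub(lambda m: EMOTICONS[m.group(0)], text)

print(emoticons_to_text("Great job :) see you tomorrow ;)"))
# Great job Smile see you tomorrow Wink
```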
A Streamlit-based spam classifier that predicts whether a message is spam or not spam using machine learning.
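A sketch of such an app, where a scikit-learn pipeline trained on a few toy messages stands in for the repository's actual model and dataset; save it as `app.py` and launch with `streamlit run app.py`.

```python
import streamlit as st
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data in place of a real labeled SMS/email corpus.
texts = ["win a free prize now", "claim your reward", "meeting at 3pm", "see you at lunch"]
labels = ["spam", "spam", "not spam", "not spam"]
model = make_pipeline(TfidfVectorizer(), MultinomialNB()).fit(texts, labels)

st.title("Spam classifier")
message = st.text_area("Message to check")
if st.button("Classify") and message:
    st.write(f"Prediction: {model.predict([message])[0]}")
```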