This repository contains a notebook for sentiment analysis of text using a fine-tuned BERT model.


okaditya84/Fine-tuning-BERT-Sentime-analysis


Emotion Classification with DistilBERT

Emotion Classification with DistilBERT is a project that fine-tunes the DistilBERT transformer model to classify emotions in text. Emotions play a fundamental role in human communication, and identifying them in text is valuable for applications such as sentiment analysis, customer feedback analysis, and mental health monitoring.

Installation

To run the code in this repository, you need to install the necessary dependencies. Use the following command to install the required packages:

  pip install transformers[torch] datasets accelerate -U

Usage

Clone this repository:

git clone https://github.com/yourusername/emotion-classification.git
cd emotion-classification

Run the provided code using a Python environment:

python emotion_classification.py

Datasets

The emotion dataset used in this project is loaded using the datasets library. The dataset consists of textual data labeled with various emotions.

Model Fine-Tuning

We fine-tune DistilBERT for emotion classification using the Hugging Face Trainer and TrainingArguments APIs. The model is trained for a specified number of epochs with the chosen hyperparameters.
