
Text-Toxicity-Detector

Uses the text toxicity detection model from TensorFlow.js.

Live Demo

Detects whether the entered text contains toxic content in any of the following categories (see the usage sketch after this list):

  • identity attack
  • insult
  • obscene
  • severe toxicity
  • sexual explicit
  • threat
  • toxicity

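This README does not include code, but the flow is the standard one for the TensorFlow.js toxicity model: load the classifier with a confidence threshold, then pass sentences to `classify` and read back a match flag per label. Below is a minimal sketch assuming the `@tensorflow-models/toxicity` npm package; the threshold value, the helper name `detectToxicity`, and the example sentence are illustrative assumptions, not taken from this repository.

```typescript
// Minimal sketch, not the repository's actual code.
import '@tensorflow/tfjs';                       // peer dependency of the model package
import * as toxicity from '@tensorflow-models/toxicity';

// Predictions with confidence below this threshold are reported as undecided.
const threshold = 0.9;

async function detectToxicity(sentences: string[]): Promise<void> {
  // Load the pre-trained classifier; an empty label list selects all seven
  // labels (identity_attack, insult, obscene, severe_toxicity,
  // sexual_explicit, threat, toxicity).
  const model = await toxicity.load(threshold, []);

  // `classify` returns one entry per label, each with per-sentence
  // probabilities and a `match` flag.
  const predictions = await model.classify(sentences);

  for (const prediction of predictions) {
    prediction.results.forEach((result, i) => {
      console.log(`"${sentences[i]}" -> ${prediction.label}: ${result.match}`);
    });
  }
}

// Illustrative input only.
detectToxicity(['You are such an idiot!']);
```

In a browser app like this one, the same calls would typically run on text taken from an input field, with the per-label match flags used to report which categories fired.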