text-emotion-classification

A text emotion recognition application that runs locally and can be deployed quickly. Run main.py to start interactive inference.

Chinese version

Features

  • Local Inference: Loads the sentiment_roberta model directory within the repository for text emotion classification.
  • Label Mapping: Reads the id -> Chinese emotion name mapping from text-emotion.yaml.
  • Interactive CLI: Enter text in the command line to output the emotion category and confidence level.
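
The exact schema of text-emotion.yaml is not shown here. Assuming a flat `id: name` layout under a top-level key, the mapping could be read with a minimal stdlib-only sketch like the following (`parse_label_yaml` and the sample labels are hypothetical; main.py may well use a YAML library instead):

```python
# Minimal sketch for reading an id -> emotion-name mapping.
# ASSUMPTION: text-emotion.yaml uses a flat "id: name" layout; the real
# file's schema (and main.py's loader) may differ.
def parse_label_yaml(text: str) -> dict[int, str]:
    labels = {}
    for line in text.splitlines():
        line = line.strip()
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        if key.strip().isdigit() and value.strip():
            labels[int(key.strip())] = value.strip()
    return labels

sample = """\
labels:
  0: joy
  1: anger
  2: sadness
"""
print(parse_label_yaml(sample))  # -> {0: 'joy', 1: 'anger', 2: 'sadness'}
```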

Directory Structure (Key Files)

  • main.py: Entry script (run directly).
  • sentiment_roberta/: Exported Transformers model directory (contains config.json, model.safetensors, tokenizer, etc.).
  • text-emotion.yaml: Label mapping file.
  • release-note.md: Release notes (used by GitHub Actions as the release body).
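
As a quick sanity check before running, the presence of the key model files can be verified programmatically. A small sketch (the file names mirror the list above; this helper is illustrative, not part of main.py):

```python
from pathlib import Path

# Report which of the expected model files are missing from the
# sentiment_roberta/ directory (file names taken from the list above).
def missing_model_files(root: str = "sentiment_roberta") -> list[str]:
    required = ["config.json", "model.safetensors"]
    directory = Path(root)
    return [name for name in required if not (directory / name).is_file()]

print(missing_model_files())  # empty list means the directory looks complete
```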

Environment Requirements

  • Python 3.10 (Recommended, matches the author's environment; 3.9+ is theoretically compatible but not fully verified).
  • Dependency Management: Conda environment (recommended) or venv.
  • PyTorch:
    • CPU Inference: Install the CPU version of torch.
    • GPU Inference: Requires an NVIDIA GPU + corresponding CUDA version (the author's environment uses torch==2.10.0+cu128 / torchvision==0.25.0+cu128 built with CUDA 12.8).

The author's conda environment export file is provided: environment.yml.

Installation

Using Conda Environment File (Recommended)

conda env create -f environment.yml
conda activate text-emotion-classification

Usage

python main.py

Follow the prompts to enter text:

  • Enter any text: Outputs emotion prediction and confidence.
  • Empty input (Enter): Exits the program.
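
The loop behavior above can be sketched with a stand-in classifier (`classify` here is a placeholder returning a fixed result, not the actual model call in main.py):

```python
# Sketch of the interactive loop: classify each non-empty line,
# stop on empty input. classify() is a stand-in for the real model.
def classify(text: str) -> tuple[str, float]:
    return ("joy", 0.97)  # placeholder label and confidence

def run_cli(inputs: list[str]) -> list[str]:
    results = []
    for text in inputs:
        if not text.strip():  # empty input (Enter): exit
            break
        label, confidence = classify(text)
        results.append(f"{label} ({confidence:.2%})")
    return results

print(run_cli(["I passed the exam!", "", "never reached"]))  # -> ['joy (97.00%)']
```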

FAQ

  • Cannot find model directory sentiment_roberta
    • Ensure sentiment_roberta/ exists in the root directory and contains files like config.json and model.safetensors.
  • Inference Device
    • The program automatically selects cuda if available; otherwise, it defaults to cpu.
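
The device choice described above is commonly written as the following PyTorch idiom (assumed, not copied from main.py; the ImportError guard is only so the snippet degrades gracefully where torch is not installed):

```python
# Pick cuda when available, otherwise fall back to cpu.
# This mirrors the behavior described above; main.py's exact code may differ.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    device = "cpu"  # torch missing: inference could not run anyway

print(device)
```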

License

This project is released under the Apache License 2.0.
