The increasing deployment of large language models (LLMs) in natural language processing (NLP) tasks raises concerns about energy efficiency and sustainability. While prior research has largely focused on energy consumption during model training, the inference phase has received comparatively less attention.
This project evaluates the trade-offs between model accuracy and energy consumption in text classification inference.
Supported models include traditional classifiers (e.g., linear models or XGBoost) as well as large language models available on https://huggingface.co/
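The core of such an evaluation is a harness that reports both accuracy and a cost metric for each model's inference. A minimal stdlib-only sketch is shown below; it uses wall-clock time per example as a crude stand-in for energy (a real setup would use a power meter or a measurement library), and the keyword model is a toy placeholder, not part of this project:

```python
import time

def evaluate(predict, texts, labels):
    """Return (accuracy, seconds_per_example) for a predict() function.

    Wall-clock time is only a rough proxy for inference energy;
    actual energy measurement requires dedicated tooling or hardware.
    """
    start = time.perf_counter()
    preds = [predict(t) for t in texts]
    elapsed = time.perf_counter() - start
    accuracy = sum(p == y for p, y in zip(preds, labels)) / len(labels)
    return accuracy, elapsed / len(texts)

# Toy baseline: classify by keyword presence (a stand-in for a
# traditional model such as a linear classifier).
def keyword_model(text):
    return "positive" if "good" in text else "negative"

texts = ["good movie", "bad plot", "really good acting", "terrible"]
labels = ["positive", "negative", "positive", "negative"]
accuracy, secs_per_example = evaluate(keyword_model, texts, labels)
```

Running the same harness over both a traditional model and an LLM-backed classifier makes the accuracy/cost trade-off directly comparable on one test set.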