The increasing deployment of large language models (LLMs) in natural language processing (NLP) tasks raises concerns about energy efficiency and sustainability. While prior research has largely focused on energy consumption during model training, the inference phase has received comparatively less attention.
This project evaluates the trade-offs between model accuracy and energy consumption in text classification inference.
The tool supports both traditional models (e.g., linear classifiers or XGBoost) and large language models available on https://huggingface.co/
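As a minimal sketch of the accuracy-versus-energy trade-off being evaluated, the snippet below times batched inference of a toy linear text classifier. All names here (`WEIGHTS`, `classify`) are illustrative and not part of the project; wall-clock time is used only as a crude stand-in for energy, whereas a real evaluation would attach an energy meter (e.g., a tool like CodeCarbon) to both the traditional and the LLM-based classifiers.

```python
import time

# Illustrative toy linear classifier: a bag-of-words score with fixed
# weights. This is NOT the project's model, just a stand-in to show
# what "inference cost" means for a traditional model.
WEIGHTS = {"good": 1.0, "great": 1.5, "bad": -1.0, "terrible": -1.5}

def classify(text: str) -> str:
    """Sum the weights of known tokens; non-negative score -> positive."""
    score = sum(WEIGHTS.get(tok, 0.0) for tok in text.lower().split())
    return "positive" if score >= 0 else "negative"

# Time a batch of inferences as a rough proxy for energy cost.
texts = ["a great movie", "a terrible plot", "good acting"] * 1000
start = time.perf_counter()
labels = [classify(t) for t in texts]
elapsed = time.perf_counter() - start
print(f"{len(texts)} inferences in {elapsed:.4f}s")
```

Running the same timing harness around a Hugging Face model's inference call would yield the latency side of the comparison; accuracy is then measured on a held-out test set for both model families.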