
Hugging Face's Auto classes inspect a checkpoint's configuration, detect the right architecture, and load the matching model class — so the same code works with any supported model.
from transformers import AutoTokenizer, AutoModel, AutoModelForSequenceClassification
# Works with any model — bert, roberta, distilbert, etc.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

AutoModelForSequenceClassification — text classification
AutoModelForTokenClassification — NER, POS tagging
AutoModelForQuestionAnswering — extractive QA
AutoModelForSeq2SeqLM — translation, summarization
AutoModelForCausalLM — text generation

Reference:
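Under the hood, each Auto class keeps a mapping from a checkpoint's config model type (e.g. "bert", "roberta") to a concrete model class, and dispatches to it at load time. A minimal self-contained sketch of that dispatch idea — the registry and class names here are hypothetical stand-ins, not the real transformers internals:

```python
# Hypothetical stand-ins for concrete architecture classes.
class BertForSequenceClassification:
    pass

class RobertaForSequenceClassification:
    pass

# Registry mapping a config's model_type string to its concrete class.
_MODEL_REGISTRY = {
    "bert": BertForSequenceClassification,
    "roberta": RobertaForSequenceClassification,
}

class AutoModelForSequenceClassificationSketch:
    """Toy Auto class: picks the concrete class from the model type."""

    @classmethod
    def from_model_type(cls, model_type: str):
        try:
            return _MODEL_REGISTRY[model_type]()
        except KeyError:
            raise ValueError(f"Unrecognized model type: {model_type!r}")

model = AutoModelForSequenceClassificationSketch.from_model_type("bert")
print(type(model).__name__)  # BertForSequenceClassification
```

The real library resolves `model_type` from the checkpoint's `config.json`, which is why a single `AutoModelForSequenceClassification.from_pretrained(...)` call works across bert, roberta, distilbert, and the rest.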
TaskLoco™ — The Sticky Note GOAT