Week 8
| Aspect | PLM | LLM |
|---|---|---|
| Scope | General term for pretrained models | Subset of PLMs with large parameter count |
| Size | Can be small to medium | Usually very large (billions of params) |
| Architecture | Encoder, decoder, or both | Mostly decoder-only |
| Training Goal | Pretrain + fine-tune | General-purpose + prompt-based use |
| Usage style | Fine-tuning based | Prompt-based, few-shot or zero-shot |
| Examples | BERT, RoBERTa, T5 | GPT-3/4, LLaMA |
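The "usage style" row is the practical difference: a PLM like BERT is adapted by fine-tuning on labeled data, while an LLM is typically steered at inference time by the prompt itself. A minimal sketch of few-shot prompting (the function name and the sentiment task are illustrative, not from any specific library):

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: a handful of labeled in-context
    examples followed by the unlabeled query.

    An LLM is expected to complete the text after the final
    'Sentiment:' marker; a PLM like BERT would instead be fine-tuned
    on the (text, label) pairs directly.
    """
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}\n")
    # The query is left without a label for the model to fill in.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n".join(lines)


examples = [
    ("Great movie, loved every minute.", "positive"),
    ("Terrible plot and wooden acting.", "negative"),
]
prompt = build_few_shot_prompt(examples, "An absolute delight.")
print(prompt)
```

With zero examples in the list this degenerates to zero-shot prompting; the same text could be sent to any decoder-only LLM without updating its weights.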