Creating your own LLM from open-source models
Sebastiano Galazzo (Synapsia AI)
Language: English
Time: 14:30 - 15:15
From a "simple" fine-tuning all the way to your own Mixture of Experts model, built from open-source models.
Nowadays, training an LLM from scratch is a huge effort even for very large companies. Starting from pre-trained models to build your own model is no longer just a fallback for resource-constrained companies, but often a necessary starting point.
- LoRA (see sketch 1 after this list)
- Quantization and QLoRA (sketch 2)
- Injecting an embedding model into the LoRA pipeline to manage multiple LoRA adapters (sketch 3)
- Mixing models (sketch 4)
- Creating your own MoE (Mixture of Experts) model from several of your fine-tuned models (sketch 5)
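A minimal sketch of the LoRA step using Hugging Face PEFT. The base model name and the hyperparameters (rank, alpha, target modules) are illustrative assumptions, not the speaker's exact recipe.

```python
# Sketch 1: LoRA fine-tuning with Hugging Face PEFT.
# Model name and hyperparameters are illustrative assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")

lora_config = LoraConfig(
    r=16,                                 # rank of the low-rank update matrices
    lora_alpha=32,                        # scaling factor for the update
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the base weights
# Train `model` with your usual training loop; only the adapter weights update.
```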
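QLoRA combines a 4-bit quantized, frozen base model with trainable LoRA adapters. A sketch assuming bitsandbytes is installed; the model name is again a placeholder.

```python
# Sketch 2: QLoRA = 4-bit quantized base model + LoRA adapters (needs bitsandbytes).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # NormalFloat4, as in the QLoRA paper
    bnb_4bit_use_double_quant=True,        # also quantize the quantization constants
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",           # placeholder model name
    quantization_config=bnb_config,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)  # stabilizes training on quantized weights
model = get_peft_model(base, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))
```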
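The abstract does not spell out how the embedding model is "injected"; one common reading is to keep several LoRA adapters loaded on one base model and let a sentence-embedding model route each request to the closest-matching adapter. A sketch under that assumption; the adapter paths, names, and domain descriptions are hypothetical.

```python
# Sketch 3: routing between multiple LoRA adapters with a sentence-embedding model.
# Adapter paths/names and domain descriptions are hypothetical.
from transformers import AutoModelForCausalLM
from peft import PeftModel
from sentence_transformers import SentenceTransformer, util

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base, "adapters/legal", adapter_name="legal")
model.load_adapter("adapters/medical", adapter_name="medical")

encoder = SentenceTransformer("all-MiniLM-L6-v2")
domains = {
    "legal": "contracts, compliance, case law",
    "medical": "symptoms, diagnoses, clinical notes",
}
names = list(domains)
domain_emb = encoder.encode(list(domains.values()), convert_to_tensor=True)

def route(prompt: str) -> str:
    """Activate the adapter whose domain description is closest to the prompt."""
    q = encoder.encode(prompt, convert_to_tensor=True)
    best = util.cos_sim(q, domain_emb).argmax().item()
    model.set_adapter(names[best])
    return names[best]

print(route("Draft a non-disclosure agreement clause."))  # -> "legal"
```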
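"Mixing models" can mean several merge strategies (linear interpolation, SLERP, TIES, and others); the simplest is a weighted average of two checkpoints that share the same architecture, shown below as a self-contained sketch.

```python
# Sketch 4: the simplest form of model mixing, a weighted average ("model soup")
# of two checkpoints with identical architecture.
import torch

def merge_state_dicts(sd_a: dict, sd_b: dict, alpha: float = 0.5) -> dict:
    """Return alpha * A + (1 - alpha) * B for every tensor in the checkpoints."""
    assert sd_a.keys() == sd_b.keys(), "architectures must match exactly"
    return {k: alpha * sd_a[k] + (1.0 - alpha) * sd_b[k] for k in sd_a}

# Toy demonstration with two identically shaped modules:
a, b = torch.nn.Linear(4, 4), torch.nn.Linear(4, 4)
merged = torch.nn.Linear(4, 4)
merged.load_state_dict(merge_state_dicts(a.state_dict(), b.state_dict(), alpha=0.3))
```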
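Finally, a self-contained sketch of the MoE idea: a learned gate sends each token to its top-k experts and mixes their outputs. In the talk's setting the experts would be blocks taken from your own fine-tuned models; here they are toy feed-forward networks so the code runs as-is.

```python
# Sketch 5: a minimal top-k Mixture-of-Experts layer. In practice the experts
# would be FFN blocks lifted from several fine-tuned models; toy FFNs here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts, bias=False)  # learned router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, dim)
        weights, idx = self.gate(x).topk(self.k, dim=-1)  # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)              # normalize over the k picks
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

moe = TopKMoE(dim=64, num_experts=4, k=2)
print(moe(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```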