Creating your own LLM from open-source models
Sebastiano Galazzo (Synapsia AI)
Language: Italian
Time: 20:00 - 20:45
From a "simple" fine-tuning to your own Mixture of Experts model, built entirely from open-source models.
Nowadays, training an LLM from scratch is a huge effort even for very large companies. Starting from pre-trained models to build your own is no longer just a workaround for companies with limited resources, but often a necessary starting point.
- LoRA
- Quantization and QLoRA (a QLoRA sketch follows this list)
- Injecting an embedding model into the LoRA stack to manage multiple LoRA adapters (see the routing sketch after this list)
- Mixing (merging) models
- Creating your own MoE (Mixture of Experts) model from several fine-tuned models of your own (a toy MoE sketch closes this list)
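
As a concrete starting point for the LoRA and QLoRA items above, here is a minimal sketch using the Hugging Face transformers, peft, and bitsandbytes libraries. The model name and all hyperparameters are illustrative choices, not the ones from the talk.

```python
# A minimal QLoRA sketch with Hugging Face transformers, peft and bitsandbytes.
# Model name and hyperparameters are illustrative, not from the talk.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit base weights: the "Q" in QLoRA
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",            # any open-source causal LM works here
    quantization_config=bnb_config,
    device_map="auto",
)

lora_config = LoraConfig(
    r=16,                                   # rank of the low-rank update matrices
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],    # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)  # freezes the base, adds trainable adapters
model.print_trainable_parameters()          # typically well under 1% of all weights
```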
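
One plausible reading of the embedding-routing item: embed each incoming prompt with a small sentence-embedding model, pick the LoRA adapter whose description is closest by cosine similarity, and activate it with peft. The adapter names, paths, and descriptions below are hypothetical; `base_model` is the quantized model from the previous sketch.

```python
# A sketch of routing between multiple LoRA adapters with an embedding model.
# Adapter names, paths and descriptions are hypothetical placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer
from peft import PeftModel

embedder = SentenceTransformer("all-MiniLM-L6-v2")

# One short description per adapter, embedded once at startup.
adapter_descriptions = {
    "legal": "contracts, law, compliance questions",
    "medical": "symptoms, diagnoses, clinical notes",
    "code": "programming, debugging, software questions",
}
adapter_vecs = {k: embedder.encode(v) for k, v in adapter_descriptions.items()}

# base_model is the quantized base LLM from the previous sketch.
peft_model = PeftModel.from_pretrained(base_model, "adapters/legal", adapter_name="legal")
peft_model.load_adapter("adapters/medical", adapter_name="medical")
peft_model.load_adapter("adapters/code", adapter_name="code")

def route(prompt: str) -> str:
    """Return the name of the adapter whose description is closest to the prompt."""
    q = embedder.encode(prompt)
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(adapter_vecs, key=lambda name: cos(q, adapter_vecs[name]))

peft_model.set_adapter(route("My landlord broke the lease, what can I do?"))  # -> "legal"
```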
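
To make the MoE idea concrete, here is a toy top-k routed Mixture-of-Experts layer in PyTorch: a learned gate scores each token and sends it to its best-matching expert feed-forward networks. In practice, a tool such as mergekit's mergekit-moe can assemble an actual MoE checkpoint whose experts reuse the feed-forward weights of your fine-tuned dense models; the dimensions and `top_k` below are illustrative only.

```python
# A toy token-level Mixture-of-Experts layer in PyTorch (illustrative sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=4, top_k=2):
        super().__init__()
        self.gate = nn.Linear(d_model, n_experts)   # router: scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)                # in a real build, each expert FFN
        )                                            # would come from a fine-tuned model
        self.top_k = top_k

    def forward(self, x):                            # x: (tokens, d_model)
        scores = self.gate(x)                        # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)         # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                # tokens routed to expert e at slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(10, 512)                             # 10 token embeddings
print(MoELayer()(x).shape)                           # torch.Size([10, 512])
```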