Model Training ‐ Introduction
So, we are going to train a LoRA model. The essence of LoRA is that, during image generation, it inserts its own small additional layers in between the layers of the base neural network.
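To make that idea concrete, here is a minimal sketch in PyTorch of what such an injected layer can look like: a frozen layer from the base checkpoint plus a small trainable low-rank update, scaled by alpha/rank. The class and parameter names (`LoRALinear`, `rank`, `alpha`) are illustrative, not taken from any particular training tool.

```python
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen Linear layer and adds a trainable low-rank update (LoRA)."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 8.0):
        super().__init__()
        self.base = base                      # original checkpoint weights, kept frozen
        self.base.requires_grad_(False)
        self.down = nn.Linear(base.in_features, rank, bias=False)   # projects down to rank r
        self.up = nn.Linear(rank, base.out_features, bias=False)    # projects back up
        nn.init.zeros_(self.up.weight)        # starts as a no-op: output == base output
        self.scale = alpha / rank

    def forward(self, x):
        # base checkpoint output + small learned correction
        return self.base(x) + self.scale * self.up(self.down(x))
```

Because the up-projection starts at zero, the wrapped layer initially behaves exactly like the base checkpoint; only the two small matrices are trained, which is why LoRA files stay so compact compared to full checkpoints.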
Say we generate an image of a random person; that is our base image. If we add a LoRA to the generation, the base image shifts in a new direction, for example the style changes or the person's appearance changes to the one the LoRA has been trained on.
Here are the base image and the same image generated with a Lana Del Rey LoRA.

An important advantage we will rely on heavily is that a LoRA can be trained on any checkpoint and then applied to any checkpoint based on the same base network. That is, a LoRA trained on any SD1.5-based checkpoint can be applied to any other SD1.5-based checkpoint, and the same holds for SDXL and the rest. It may not perform equally well everywhere, but technically it will work. Finding the best checkpoint to train on will be a big part of the comparisons later on.
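As an illustration of that portability, here is a hedged sketch using the diffusers library: a LoRA file trained on one SD1.5-based checkpoint is loaded on top of a different SD1.5-based checkpoint. The checkpoint and LoRA file names are placeholders, not files from this guide.

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder: any SD1.5-based checkpoint, not necessarily the one the LoRA was trained on.
pipe = StableDiffusionPipeline.from_single_file(
    "anySD15Checkpoint.safetensors",
    torch_dtype=torch.float16,
).to("cuda")

# Placeholder LoRA file, trained on some other SD1.5-based checkpoint.
pipe.load_lora_weights(".", weight_name="lana_del_rey_lora.safetensors")

image = pipe("portrait photo of a woman", num_inference_steps=30).images[0]
image.save("with_lora.png")
```

The same pattern applies to SDXL pipelines; the only hard requirement is that the LoRA and the checkpoint share the same base architecture.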
Next - Model Training ‐ Basics