Saturday, October 25, 2025

Sampler - Stable Diffusion

  1. DDPM: Denoising Diffusion Probabilistic Model
    1. basic sampler -> Stable Diffusion's basic sampler 
    2. 1000 steps 
  2. DDIM: Denoising Diffusion Implicit Model
    1. DDPM -> DDIM 
    2. 10 ~ 50 steps 
  3. Euler Discrete
    1. solves an Ordinary Differential Equation (ODE) 
    2. simple & fast 
    3. similar step count to DDIM, sometimes more
  4. OLSS: Optimal Linear Subspace Search
    1. fast, and can reuse an existing (old) model 
    2. 5 ~ 10 steps 
  5. CM: Consistency Model
    1. 1 step
    2. pixel space
    3. complex calculation
    4. limited support for high-resolution images 
  6. LCM: Latent Consistency Model
    1. 1 ~ 4 steps 
    2. latent space
    3. low-dimensional computation
    4. supports high-resolution images 
    5. needs an LCM-LoRA or a re-trained (distilled) model; cannot directly use an old model 
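The main practical difference between these samplers is how many denoising steps they need. As a rough sketch of why DDIM gets away with 10-50 steps, here is one deterministic DDIM update (eta = 0) on a toy latent; the `predict_noise` stand-in and all variable names are assumptions for illustration, not Stable Diffusion's actual API:

```python
import torch

# Toy stand-in for the U-Net noise predictor (assumption, not the real model).
def predict_noise(x_t, t):
    return 0.1 * x_t

# Linear beta schedule over the 1000 training steps, as in the DDPM paper.
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

def ddim_step(x_t, t, t_prev):
    """One deterministic DDIM update from timestep t down to t_prev."""
    a_t = alphas_cumprod[t]
    a_prev = alphas_cumprod[t_prev] if t_prev >= 0 else torch.tensor(1.0)
    eps = predict_noise(x_t, t)
    # Predict x_0 from the current noisy sample, then jump straight to t_prev.
    x0_pred = (x_t - torch.sqrt(1 - a_t) * eps) / torch.sqrt(a_t)
    return torch.sqrt(a_prev) * x0_pred + torch.sqrt(1 - a_prev) * eps

# 50 sampling steps instead of 1000: stride through the trained timesteps.
x = torch.randn(4, 4)  # toy "latent"
timesteps = list(range(T - 1, -1, -(T // 50)))
for t, t_prev in zip(timesteps, timesteps[1:] + [-1]):
    x = ddim_step(x, t, t_prev)
```

Because each update is deterministic and jumps over many training timesteps, DDIM trades DDPM's 1000 small stochastic steps for a few large ones.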

Saturday, January 18, 2025

LVM - DDPM, DDIM

  1. Evaluation
    1. Inception Score
    2. FID (Fréchet Inception Distance)
  2. LVM - Large Visual Model
    1. GAN
    2. DDPM: denoising diffusion probabilistic models (https://arxiv.org/abs/2006.11239)
    3. DDIM: denoising diffusion implicit models (https://arxiv.org/abs/2010.02502)
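FID compares the Gaussian statistics of Inception features of real vs. generated images: FID = ||mu_r - mu_g||^2 + Tr(S_r + S_g - 2*sqrt(S_r S_g)). A minimal sketch on precomputed feature arrays (the Inception feature extraction itself is omitted; `scipy` is assumed to be available):

```python
import numpy as np
from scipy import linalg

def fid(feats_real, feats_gen):
    """Fréchet Inception Distance between two feature sets of shape (N, D)."""
    mu1, mu2 = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    s1 = np.cov(feats_real, rowvar=False)
    s2 = np.cov(feats_gen, rowvar=False)
    covmean = linalg.sqrtm(s1 @ s2)
    if np.iscomplexobj(covmean):  # numerical noise can leak into the imaginary part
        covmean = covmean.real
    diff = mu1 - mu2
    return diff @ diff + np.trace(s1 + s2 - 2.0 * covmean)

rng = np.random.default_rng(0)
a = rng.normal(size=(500, 8))
print(fid(a, a))        # identical feature sets -> ~0
print(fid(a, a + 1.0))  # mean shifted by 1 in all 8 dims -> ~8
```

Lower FID is better; unlike Inception Score, it compares generated images against a reference set rather than scoring them in isolation.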



Thursday, October 10, 2024

PyTorch - get the total number of model parameters

Total number of model parameters


1. simple version

pytorch_total_params = sum(p.numel() for p in model.parameters())

2. listed version

def count_parameters(model):
  str_name = "name"
  str_parameter = "parameter"
  print(f"{str_name:50s}: {str_parameter:>10s}")
  total_params = 0
  for name, parameter in model.named_parameters():
    # skip frozen parameters
    if not parameter.requires_grad:
      continue
    params = parameter.numel()
    print(f"{name:50s}: {params:10d}")  # :10d, not :10s -- params is an int
    total_params += params
  print(f"Total Trainable Params: {total_params}")
  return total_params
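A quick sanity check of the simple one-liner on a toy model (the layer sizes here are arbitrary):

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 1))
# Linear(10, 5): 10*5 weights + 5 biases = 55
# Linear(5, 1):  5*1 weights + 1 bias   = 6
total = sum(p.numel() for p in model.parameters())
print(total)  # 61
```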