LoRA Lab
Prepare datasets, review captioned character features, launch RunPod-backed training jobs, and monitor ComfyUI or Musubi execution without leaving YoAI.
Source Assets
Select images from your projects to build a captioned training dataset.
Dataset Pipeline
Caption first, analyze recurring features, decide what belongs in the trigger, then generate training captions.
Current Dataset Snapshot
No dataset selected yet.
Feature Review
Tell YoAI which recurring features should be treated as part of the subject identity and stripped from per-image captions.
Caption Preview
Review generated dataset captions and spot-check how the trigger word and stripped features are landing.
Training Run
Launch ComfyUI- or Musubi-backed training with the currently prepared dataset and pod selection.
Launch Context
A quick sanity check of what will be sent to the training route.
Recent Jobs
Click a job to open it in the monitor tab.
Jobs
Recent training runs with live refresh.
No Job Selected
Pick a job on the left to inspect its current stage, telemetry, and recent events.
Loss Curve
Most recent step-loss and average-loss samples.
Pod Telemetry
Pod helper status and GPU hints from the job’s resolved pod.
Recent Events
Stage updates and parser-friendly progress rows.
Config Snapshot
The exact training configuration currently attached to the selected job.
RunPod Control
Manage the pieces the training stack depends on: pods, templates, registry auths, and attached network storage.
Pods
Synced RunPod resources for this account.
Create Pod
Launch and save a pod definition tied to the current user and an optional project.