A Checkpoint is a saved snapshot of an AI model's weights and parameters at a specific point during or after training. It represents the complete state of the model at that moment, preserving everything the model has learned up to that stage, and it allows the model to be loaded later for generation or continued training without starting from scratch.
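The save-and-restore cycle described above can be sketched in a few lines. This is a minimal illustration, not any particular framework's API: plain Python dictionaries stand in for the model's weights and optimizer state (in a real framework such as PyTorch these would be tensors in a `state_dict`), and `pickle` stands in for the framework's serialization format.

```python
import os
import pickle
import tempfile

# Hypothetical checkpoint contents: weights, plus the extra training state
# (epoch counter, optimizer settings) needed to resume rather than restart.
checkpoint = {
    "epoch": 12,
    "weights": {"layer1.w": [0.42, -1.3], "layer1.b": [0.05]},
    "optimizer": {"lr": 1e-4, "momentum": 0.9},
}

path = os.path.join(tempfile.gettempdir(), "model_epoch12.ckpt")

# Save: serialize the complete state to disk at this point in training.
with open(path, "wb") as f:
    pickle.dump(checkpoint, f)

# Load: restore the exact saved state; no retraining is required.
with open(path, "rb") as f:
    restored = pickle.load(f)

print(f"resumed from epoch {restored['epoch']}")
```

Because the checkpoint carries the optimizer state as well as the weights, a loaded model can continue training exactly where it left off, not just run inference.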
In practice, checkpoints serve several purposes. During training, they allow progress to be preserved at regular intervals so that if training is interrupted or produces degraded results, it can be resumed or rolled back to an earlier state. In the broader AI image and video generation community, checkpoints are the primary format in which trained models are distributed and shared. Different checkpoints built on the same base architecture, such as Stable Diffusion, can produce radically different visual styles depending on the data they were trained on, making checkpoint selection a key creative decision for practitioners working with open-source or community-trained models.
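The interval-checkpointing and rollback pattern described above can be sketched as a toy training loop. Everything here is illustrative: a single scalar weight stands in for a real model, and the "validation loss" is simulated, but the logic of keeping the best checkpoint and rolling back to it when later epochs degrade is the real pattern.

```python
import copy
import random

random.seed(0)  # deterministic toy run

# Toy "model": one weight, nudged each epoch to simulate training updates.
weights = {"w": 0.0}
best_loss = float("inf")
best_checkpoint = None  # best saved state seen so far

for epoch in range(1, 11):
    weights["w"] += random.uniform(-0.5, 1.0)  # simulated training step
    loss = abs(3.0 - weights["w"])             # simulated validation loss

    # Checkpoint whenever results improve; deepcopy snapshots the state
    # so later (possibly degraded) updates cannot mutate it.
    if loss < best_loss:
        best_loss = loss
        best_checkpoint = {"epoch": epoch, "weights": copy.deepcopy(weights)}

# Training may have degraded after the best epoch; roll back to the
# best saved state instead of keeping the final one.
weights = copy.deepcopy(best_checkpoint["weights"])
print(f"rolled back to epoch {best_checkpoint['epoch']}")
```

In practice the snapshots would be written to disk (as in the save/load example for any checkpoint format), and many training setups keep the last N checkpoints rather than only the best, so any recent state can be restored after an interruption.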
For creators using AI generation platforms, checkpoints are often presented as selectable model options that determine the base aesthetic and capability of what the tool produces. Understanding that each checkpoint represents a distinct trained entity, rather than simply a settings variation of the same model, helps creators make more informed choices when selecting between available options and interpreting the differences in their outputs.