FLUX uses a split checkpoint (the UNet, VAE, and CLIP models are not contained in a single file), so it can only be used in the "Advanced" checkpoint mode.
After automatic installation, a "metacheckpoint" named "FLUX.1 Schnell" (or similar, depending on your choice) will be created. Selecting that option in "Simple" mode is enough.
After manual installation, the models must be selected by hand in the "Advanced" checkpoint mode as follows:
Due to licensing restrictions, Metastable is unable to provide a fully automatic installation procedure for FLUX. The models can still be installed manually.
If you don't have a HuggingFace account yet, create one at https://huggingface.co/join.
Log into your HuggingFace account.
Choose a model and navigate to:
Fill in the "You need to agree to share your contact information to access this model" form and submit it. Access should be granted instantly.
Download the following files:
Model file (depending on the model you're trying to use):
Text encoders:
VAE: https://huggingface.co/black-forest-labs/FLUX.1-schnell/resolve/main/ae.safetensors
Open Metastable.
Go to "Settings", "About Metastable" and click on the "Reveal in explorer" button in the "Storage" section.
In the newly opened file explorer window, open the "models" directory.
Move your model file to the "checkpoint" directory.
Move your VAE file (ae.safetensors) to the "vae" directory and rename it to fluxvae.safetensors.
Move your text encoder files (clip_l.safetensors, t5xxl_fp16.safetensors) to the "text_encoder" directory.
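The file placement steps above can be sketched as a small shell script. This is an illustration only: it uses a local sandbox directory and dummy files, and the model filename `flux1-schnell.safetensors` is an example (yours depends on the model you downloaded). In practice, replace the sandbox paths with your real download folder and the "models" directory revealed via "Settings" → "About Metastable" → "Storage".

```shell
#!/bin/sh
set -eu

# Sandbox stand-ins for your real Downloads folder and Metastable's "models" directory.
SANDBOX="./flux_sandbox"
DOWNLOADS="$SANDBOX/downloads"
MODELS_DIR="$SANDBOX/models"

# Create the directory layout and dummy downloaded files for the illustration.
mkdir -p "$DOWNLOADS" "$MODELS_DIR/checkpoint" "$MODELS_DIR/vae" "$MODELS_DIR/text_encoder"
touch "$DOWNLOADS/flux1-schnell.safetensors" \
      "$DOWNLOADS/ae.safetensors" \
      "$DOWNLOADS/clip_l.safetensors" \
      "$DOWNLOADS/t5xxl_fp16.safetensors"

# Model file goes into "checkpoint".
mv "$DOWNLOADS/flux1-schnell.safetensors" "$MODELS_DIR/checkpoint/"

# VAE goes into "vae", renamed to fluxvae.safetensors.
mv "$DOWNLOADS/ae.safetensors" "$MODELS_DIR/vae/fluxvae.safetensors"

# Text encoders go into "text_encoder".
mv "$DOWNLOADS/clip_l.safetensors" "$DOWNLOADS/t5xxl_fp16.safetensors" "$MODELS_DIR/text_encoder/"
```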
| Feature | FLUX |
| --- | --- |
| Text-to-image | ✅ |
| Image-to-image | ✅ |
| Inpainting | ✅ |
| LORA | ✅ |
| ControlNet | ✅ |
| IPAdapter | ❌ |
| PULID | ❌ |