import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained('C:\\Python\\Projects\\test1\\flux1dev', torch_dtype=torch.bfloat16)
pipe.enable_sequential_cpu_offload()

prompt = "beach ball"
image = pipe(
    prompt,
    height = 1024,
    width = 1024,
    guidance_scale = 3.5,
    num_inference_steps = 50,
    max_sequence_length = 512,
    generator = torch.Generator("cpu").manual_seed(0)
).images[0]
image.save("beach ball.png")

I ran into an issue running this simple test of Flux.1

Every time I tried to run the code, it would simply stop after loading some of the pipeline components.

No exception or error code was thrown and no output was given; the program just stopped.

I really have no idea what I'm doing; I was just messing around with Flux to see what it could do.

There's also a really weird issue in the terminal where the last thing being loaded gets pushed onto the next line.

  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. Commented Aug 22, 2024 at 2:16
  • "No exception or error code was thrown and no output was given": this code has no print statements, so what output were you expecting to see? Commented Aug 22, 2024 at 2:35
  • @JohnGordon Sorry for not clarifying. This script should exit after an image has been generated but instead just terminated part of the way through without throwing any exception. Commented Aug 23, 2024 at 20:33
  • "terminated part of the way through": how do you know this? What actual behavior do you see that tells you the script did not fully execute? Commented Aug 23, 2024 at 20:36

2 Answers


I run FLUX.1-dev using the CPU alone, with NO GPU, on an old rack server at home over SSH via PuTTY (an obsolete, elderly potato), and generating images works just fine for me.

CAVEAT. Sadly, you may not be able to add 192 GB of RAM to your PC as I have in my old rack server, but that's life. I have heard, unconfirmed, that any home desktop machine with 64 GB is usable.

NOTE. There are tricks to getting FLUX, both -schnell and -dev, to work on ANY PC using the CPU alone.

That 'weird issue in the terminal' is a progress bar, and it is supposed to look like that. If your CPU and system are slow like mine, it will not change for a long time.

If you had left it alone, it might have produced an image after a few days.
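To see why the bar looks "pushed into the next line", here is a minimal plain-Python sketch (not diffusers' actual code, which uses tqdm) of how such a bar redraws one terminal line in place:

```python
# Minimal sketch of a tqdm-style progress bar: every frame begins with
# '\r', so it overwrites the previous frame on the same terminal line.
# If a frame is wider than the terminal window, its tail wraps onto the
# next line -- the "weird issue" described in the question.
import sys
import time

def render_bar(step, total, width=20):
    """Return one progress-bar frame, e.g. '\rStep 2/5 [########------------]'."""
    filled = int(width * step / total)
    return f"\rStep {step}/{total} [" + "#" * filled + "-" * (width - filled) + "]"

total = 5
for step in range(total + 1):
    sys.stdout.write(render_bar(step, total))
    sys.stdout.flush()
    time.sleep(0.05)  # a slow CPU can sit on one inference step for minutes
print()  # move to a fresh line once the loop finishes
```

On a fast machine the frames replace each other too quickly to notice; on a slow CPU the same frame just sits there unchanged, exactly as described above.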

Changes to your Python program that might help:

  • REMOVE the pipe.enable_sequential_cpu_offload() line. It is a work-around for a hardware limitation of CUDA graphics cards used as math accelerators: consumer cards lack the VRAM to do real work, and the 80 GB cards are, as of 2024, out of the reach of mere mortals. On a CPU-only system it serves no purpose.

  • Change torch_dtype=torch.bfloat16 to torch_dtype=torch.float32. This is counter-intuitive, as 32-bit floats would seem slower than the all-new-and-shiny bfloat16, or even float16. BUT on old potatoes geared up for float64 and float32, not every CPU has F16C or AVX-512 capabilities, so every single one of those bfloat16 values must be converted (time-consumingly) into a floating-point format your system DOES recognize, then back again to bfloat16, which kills performance. Check your CPU's capabilities and you'll likely find bfloat16, and even float16, aren't for you. If a CPU upgrade is not possible, just use float32 and you'll find it works at least at visible human speeds. My elderly Xeon CPUs, for instance, don't even have AVX or AVX2, and FLUX.1-dev and FLUX.1-schnell work with float32 just fine, albeit slowly on my system: about 5 minutes per image, with a visibly moving text progress bar.

  • DO NOT GO MAD with image sizes and inference steps; 1024x768 in 6 to 8 steps is often usable.
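The "check your CPU's capabilities" step above can be sketched as follows. This is a Linux-only sketch that parses /proc/cpuinfo (Windows users can see the same flags with a tool like CPU-Z); the flag names f16c and avx512bf16 are the kernel's real spellings, but the dtype recommendation is my own rough heuristic, not anything from diffusers:

```python
# Hedged sketch: inspect the CPU feature flags that make half-precision
# arithmetic fast, and suggest a torch dtype accordingly.
def cpu_flags(cpuinfo_text):
    """Parse the 'flags' line of /proc/cpuinfo-style text into a set."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

def dtype_advice(flags):
    """Rough heuristic: pick a torch dtype from the advertised features."""
    if "avx512bf16" in flags:
        return "bfloat16"   # native bfloat16 arithmetic
    if "f16c" in flags:
        return "float16"    # fast fp16 <-> fp32 conversion
    return "float32"        # old potato: stick to full precision

# On Linux you would feed it the real file:
#   flags = cpu_flags(open("/proc/cpuinfo").read())
sample = "processor\t: 0\nflags\t\t: fpu sse2 avx f16c\n"
print("Suggested torch_dtype:", dtype_advice(cpu_flags(sample)))
```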

Be reasonable and don't ask for the Earth; FLUX.1-dev and FLUX.1-schnell are more than usable on home systems.

It would appear the developers of FLUX.1 assumed everyone has a brand-new CPU in a tricked-out home supercomputer, with a brand-new $100,000 accelerator card plugged in as an afterthought, likely parked next to their Ferrari and their private yacht, which in 2024 is likely not the case.

So the code to get FLUX.1-schnell to run on just CPUs becomes:

import torch
from diffusers import FluxPipeline

DEVICE = "cpu"

print("Creating Pipeline...")

# Load the model from a local folder (Windows-style path from the question):
pipe = FluxPipeline.from_pretrained("C:\\Python\\Projects\\test\\flux1schnell", torch_dtype=torch.float32).to(torch.device(DEVICE))

# Or download it directly from the Hugging Face Hub:
# pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-schnell", torch_dtype=torch.float32).to(torch.device(DEVICE))

prompt = f"beach ball with a sign saying {DEVICE}-ONLY"

print("Generating Images...")

image = pipe(
    prompt=prompt,
    height=512, width=512,
    guidance_scale=3.5, num_inference_steps=5, max_sequence_length=256,
    output_type="pil", num_images_per_prompt=1,
    generator=torch.Generator(DEVICE).manual_seed(0)
).images

print("Output Images...")

for i, img in enumerate(image):
    print(f"Saving image {i}...")
    img.save(f"{prompt}_on_{DEVICE}_{i}.png")

print("Done.")

Luckily, the AI revolution is open to everyone if you know the way to get it to work.




I had the same problem. The post by David H Parry did not seem to help at all. After a bit of trial and error I found a fix. Here's my code:

import torch
from diffusers import FluxPipeline
from huggingface_hub import login

login()
pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev",
                                    torch_dtype=torch.bfloat16).to(torch.device("cuda"))

prompt = "dog"
image = pipe(
    prompt,
    height=256,
    width=256,
    guidance_scale=3.5,
    num_inference_steps=50,
    max_sequence_length=512,
    generator=torch.Generator("cpu").manual_seed(0)
).images[0]
image.show()
print("Done!")

The main problem is RAM. Running FLUX.1-dev requires about 50-60 GB of RAM (you can also use a pagefile). After setting up a pagefile, it should start generating the image.
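A quick way to see whether you are anywhere near that 50-60 GB before loading the pipeline is to check available memory. This is a hedged, Linux-only sketch that parses /proc/meminfo (Windows users can check Task Manager instead); the 50 GiB threshold is just the rough figure from this answer:

```python
# Hedged sketch: read MemAvailable from /proc/meminfo-style text and
# compare it against a rough RAM requirement before loading FLUX.1-dev.
def available_gib(meminfo_text):
    """Parse MemAvailable (reported in kB) into GiB; 0.0 if not found."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemAvailable:"):
            kib = int(line.split()[1])
            return kib / (1024 * 1024)
    return 0.0

NEEDED_GIB = 50  # rough figure for FLUX.1-dev from this answer

# On Linux you would feed it the real file:
#   avail = available_gib(open("/proc/meminfo").read())
sample = "MemTotal:       65536000 kB\nMemAvailable:   52428800 kB\n"
avail = available_gib(sample)
if avail < NEEDED_GIB:
    print(f"Only {avail:.1f} GiB available; configure a pagefile/swap first.")
else:
    print(f"{avail:.1f} GiB available; should be enough.")
```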

