Submitted by laprika0 t3_yj5xkp in MachineLearning
papinek t1_iums091 wrote
Reply to comment by BlazeObsidian in [D] Machine learning prototyping on Apple silicon? by laprika0
Works very well. I use Stable Diffusion on my Mac M1, and with MPS it's blazing fast.
caedin8 t1_iunmrya wrote
By "blazing fast" he means about as fast as a GTX 1060. My 3070 is 5x faster than my M1 Pro.
papinek t1_iunqbum wrote
Well, on my M1 with MPS it takes about 20 seconds to generate an image with SD at 30 steps. On the CPU it would take many minutes per image. So I'd say it works well on the M1.
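For anyone wanting to try this, here's a minimal sketch of what running Stable Diffusion on the MPS backend looks like with the Hugging Face `diffusers` library (assumes `torch` and `diffusers` are installed and the public `runwayml/stable-diffusion-v1-5` weights are available; the helper name `pick_device` is mine, not from the thread):

```python
import torch

def pick_device() -> str:
    """Prefer Apple's Metal (MPS) backend, then CUDA, then fall back to CPU."""
    if torch.backends.mps.is_available():
        return "mps"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

def generate(prompt: str, steps: int = 30):
    """Generate one image; 30 steps matches the setting mentioned above."""
    from diffusers import StableDiffusionPipeline

    device = pick_device()
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5"
    ).to(device)
    return pipe(prompt, num_inference_steps=steps).images[0]
```

On an M1, `pick_device()` should return `"mps"`, which is where the roughly 20-seconds-per-image figure comes from; forcing `"cpu"` instead is what pushes generation into the minutes-per-image range.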
caedin8 t1_iunqou5 wrote
I don't even think you can use CPU to make images using stable diffusion, but maybe you can.
Yeah, my M1 Pro takes about 25-30 seconds per image; some of that depends on configuration. But my RTX 3070 cranks them out in about 4 to 5 seconds per image.
BlazeObsidian t1_iumubsj wrote
Did you run into memory issues? I assumed it wouldn't work with only 8 GB of unified memory.
papinek t1_iunm0b5 wrote
I have 32 GB and never ran into issues. Alongside SD I run Photoshop, IntelliJ IDEA, and Chrome with 20 tabs, and it's always been enough.
BlazeObsidian t1_iunmgfn wrote
Hmm, might give it a try. Usually I use Colab. If there isn't much of a difference during inference, local is better.