
[Bug]: Don't start in RX 7600 (Linux) #12409

Open
1 task done
fernandoisnaldo opened this issue Aug 8, 2023 · 12 comments · May be fixed by #16780
Labels
bug-report Report of a bug, yet to be confirmed platform:amd Issues that apply to AMD manufactured cards

Comments

@fernandoisnaldo

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

Without setting HSA_OVERRIDE_GFX_VERSION=11.0.0, this program doesn't start on my RX 7600.

I was able to solve this by adding this variable, so I suggest adding it in webui.sh.
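
For illustration, a minimal sketch of the kind of conditional that could go in webui.sh. The `Navi 3` match pattern and the hard-coded `gpu_info` value are assumptions; webui.sh's actual GPU detection logic may differ:

```shell
# Hypothetical sketch: auto-export the ROCm override for RDNA3 cards.
# In webui.sh, gpu_info would come from something like `lspci | grep -E "VGA|3D"`;
# here it is hard-coded so the snippet is self-contained.
gpu_info="Navi 33 [Radeon RX 7600]"

if echo "$gpu_info" | grep -q "Navi 3"; then
    # Tell ROCm to treat the card as gfx1100 (version 11.0.0)
    export HSA_OVERRIDE_GFX_VERSION=11.0.0
fi

echo "HSA_OVERRIDE_GFX_VERSION=${HSA_OVERRIDE_GFX_VERSION:-unset}"
# prints HSA_OVERRIDE_GFX_VERSION=11.0.0
```

The `export` matters: the variable has to reach the Python process that ROCm runs in, not just the launcher script.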

Steps to reproduce the problem

bash webui.sh

What should have happened?

webui.sh should set HSA_OVERRIDE_GFX_VERSION=11.0.0 in its source-code conditionals, but this setting is missing.

Version or Commit where the problem happens

1.5.1

What Python version are you running on ?

Python 3.10.x

What platforms do you use to access the UI ?

Linux

What device are you running WebUI on?

AMD GPUs (RX 6000 above)

Cross attention optimization

Automatic

What browsers do you use to access the UI ?

Google Chrome

Command Line Arguments

Running `HSA_OVERRIDE_GFX_VERSION=11.0.0 bash webui.sh` solves the problem.

List of extensions

None

Console logs

Not relevant: the problem is solved by setting the environment variable that webui.sh fails to export.

Additional information

I think this bug report didn't need so many fields.

@fernandoisnaldo fernandoisnaldo added the bug-report Report of a bug, yet to be confirmed label Aug 8, 2023
@catboxanon catboxanon added the platform:amd Issues that apply to AMD manufactured cards label Aug 8, 2023
@Boom-Hacker

So you can run it with the override set?

@Boom-Hacker

Can you join my AIT test? It can provide 170% speed.

@fernandoisnaldo
Author

@Boom-Hacker

Yep, I can assure you that with HSA override this works perfectly.

Without the HSA override, the server crashes with a segfault.

Regarding the "AIT test", I won't participate, as I don't know you and have no reference as to who you are; but assuming good intentions, I thank you all the same.

@Boom-Hacker

nothing

@Boom-Hacker


What did you do? I can't start it.

@fernandoisnaldo
Author

fernandoisnaldo commented Sep 20, 2023

What did you do? I can't start it.

I got it working successfully with this command:

HSA_OVERRIDE_GFX_VERSION=11.0.0 bash webui.sh

Alternatively, you can export HSA_OVERRIDE_GFX_VERSION=11.0.0 as an environment variable.

@Boom-Hacker

What ROCm and PyTorch version? I'm stuck on "Creating model from config: /home/beforespace/stable-diffusion-webui/configs/v1-inference.yaml".

@Boom-Hacker

Traceback (most recent call last):
  File "/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/diffusers/utils/import_utils.py", line 684, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py", line 20, in <module>
    from transformers import CLIPImageProcessor, CLIPTextModel, CLIPTokenizer
ImportError: cannot import name 'CLIPImageProcessor' from 'transformers' (/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/transformers/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/beforespace/RX7600/sd/test.py", line 2, in <module>
    from diffusers import StableDiffusionPipeline
  File "<frozen importlib._bootstrap>", line 1231, in _handle_fromlist
  File "/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/diffusers/utils/import_utils.py", line 675, in __getattr__
    value = getattr(module, name)
  File "/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/diffusers/utils/import_utils.py", line 675, in __getattr__
    value = getattr(module, name)
  File "/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/diffusers/utils/import_utils.py", line 674, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/diffusers/utils/import_utils.py", line 686, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion because of the following error (look up to see its traceback):
cannot import name 'CLIPImageProcessor' from 'transformers' (/home/beforespace/RX7600/sd/venv/lib/python3.11/site-packages/transformers/__init__.py)

@fernandoisnaldo
Author

What ROCm and PyTorch version? I'm stuck on "Creating model from config: /home/beforespace/stable-diffusion-webui/configs/v1-inference.yaml".

From what I read in the webui.sh source code, the Torch version for Navi 3 is this:

https://download.pytorch.org/whl/nightly/rocm5.6
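
If one wanted to force that index, webui.sh honors a `TORCH_COMMAND` environment variable during setup. A hedged sketch of overriding it; the exact package list is an assumption, and nightly wheel availability changes over time:

```shell
# Hypothetical: point webui.sh at the ROCm 5.6 nightly wheel index.
# webui.sh runs $TORCH_COMMAND when it installs PyTorch into the venv.
export TORCH_COMMAND="pip install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/rocm5.6"
echo "$TORCH_COMMAND"
```

After exporting it, launching with `bash webui.sh` should install from that index instead of the default one.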

@Boom-Hacker

I tested it; no effect.

@Boom-Hacker

I think it's related to the PyTorch 2.2 update?

@cbayle

cbayle commented Jan 10, 2025

I managed to make it work on RX7600 XT with:


export TORCH_COMMAND="pip install --extra-index-url https://download.pytorch.org/whl/rocm5.7  torch==2.3.1+rocm5.7 torchvision==0.18.1+rocm5.7 torchaudio==2.3.1+rocm5.7"
export HSA_OVERRIDE_GFX_VERSION=11.0.0

I'm going to open a pull request to fix this in webui.sh.

@cbayle cbayle linked a pull request Jan 10, 2025 that will close this issue