System Info
Python 3.10.6
Ubuntu 22.04.2 LTS
transformers 4.39.3
Reproduction
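A minimal sketch of the reproduction described below, assuming bitsandbytes' bnb.nn.StableEmbedding (the sizes are illustrative GPT-2-like values, not the exact attached code):

```python
import torch
import bitsandbytes as bnb

def report(tag: str) -> None:
    # memory_allocated: bytes held by live tensors; memory_reserved: bytes
    # held by PyTorch's caching allocator (what empty_cache releases).
    print(f"{tag}: allocated={torch.cuda.memory_allocated()/2**20:.1f} MiB, "
          f"reserved={torch.cuda.memory_reserved()/2**20:.1f} MiB")

report("baseline")
for i in range(3):
    # GPT-2-like vocab/embedding sizes, purely illustrative
    emb = bnb.nn.StableEmbedding(50257, 768).cuda()
    report(f"iter {i}: after create")
    del emb
    torch.cuda.empty_cache()
    # Expected: back to baseline. Observed: allocated memory keeps growing.
    report(f"iter {i}: after del + empty_cache")
```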
Expected behavior
The snippet above (under Reproduction) is a simplification of the actual problem I'm facing. I've built a version of GPT that uses StableEmbedding, and my use case requires constantly reloading new versions of that model. I find that this leads to a gradual increase in GPU memory usage over time.

To investigate, I tried initialising the StableEmbedding components on their own, as in the snippet above, and then deleting them. The issue is that even after I delete the variable and call torch.cuda.empty_cache(), the GPU memory is still occupied. I would expect allocated memory to return to its pre-initialisation baseline after the delete and the cache flush.
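One hypothesis worth checking (an assumption on my part, to be verified against the installed bitsandbytes version): StableEmbedding registers itself with the GlobalOptimManager singleton in its constructor, and that registration holds a live reference to the module, so del alone can never free the weights. A quick probe:

```python
import torch
import bitsandbytes as bnb

emb = bnb.nn.StableEmbedding(50257, 768).cuda()
del emb
torch.cuda.empty_cache()

# StableEmbedding.__init__ calls GlobalOptimManager.register_module_override,
# which (in the bitsandbytes source I've read) stores a
# (module, param_name, config) triple on the singleton -- a reference that
# del can't clear. The attribute name below is an assumption; verify it
# against your installed version.
mgr = bnb.optim.GlobalOptimManager.get_instance()
triples = getattr(mgr, "module_weight_config_triple", [])
print(f"manager still holds {len(triples)} module reference(s)")
print(f"allocated after del: {torch.cuda.memory_allocated()/2**20:.1f} MiB")
```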