Has no one noticed that on Windows, step 8 of your instructions causes a problem: socat listens on port 11434 and ollama also listens on port 11434, which causes a port conflict. #253
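A quick way to confirm the collision is to look at what already owns the port on each side. The netstat invocation is quoted later in this thread; the ss check on the WSL side is an added suggestion, not something the thread itself uses:

```bash
# Windows (cmd or PowerShell): list anything bound to Ollama's default port
netstat -ano | findstr :11434

# WSL: list listeners on the Linux side; socat and ollama serve cannot
# both bind 11434 here, which is the conflict this issue describes
ss -ltnp | grep 11434
```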
Comments
I cannot even see any log entries in Ollama for the requests Convex has made.
Yes, I have been having this problem as well on multiple machines... I basically brute-force it every time. Will post in a minute when I get it working again.
Honestly, for now I am using the no-clerk branch; it makes my machine run cooler anyhow. Maybe one of the devs of the main fork will chime in eventually when they see this. I can't figure out how I got past this last time.
Okay, so I found out that if you are using WSL you have to make sure you have Ollama installed in WSL and not on Windows itself. I am not sure whether there is a port problem with having it on Windows, but I wouldn't have it there just to be safe. Secondly, install Ollama inside WSL. Thirdly, once Ollama is installed you are good to go; don't worry about starting the server manually. Now, back in WSL (remember, every one of these steps happens in WSL, not cmd, except the netstat -ano | findstr :11434 step and the kill step), do a quick check that Ollama responds. I am going to check the other socat config and Convex port-assignment steps as well to make sure they still work. [EDIT]: with socat, first set your host IP, then start the forward (see the sketch below this comment).
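Since the inline commands in the comment above did not survive formatting, here is a hedged reconstruction of the sequence it describes. The Ollama install script is the standard Linux one from ollama.com; the HOST_IP line uses the common trick of reading the Windows host address out of /etc/resolv.conf, which is an assumption here; the socat line is the one quoted in the reply below. Note the tension that reply raises: if Ollama itself is listening on 11434 inside WSL, socat cannot bind the same port.

```bash
# 1) Install Ollama inside WSL (standard Linux install script)
curl -fsSL https://ollama.com/install.sh | sh

# 2) On the Windows side (cmd, not WSL): find and kill anything
#    already holding the port
#      netstat -ano | findstr :11434
#      taskkill /PID <pid> /F        # <pid> comes from the netstat output

# 3) In WSL: resolve the Windows host IP (assumption: the usual
#    /etc/resolv.conf nameserver trick)
export HOST_IP=$(grep nameserver /etc/resolv.conf | awk '{print $2}')

# 4) Forward WSL's localhost:11434 to the Windows host, then confirm
#    socat is running
socat TCP-LISTEN:11434,fork TCP:$HOST_IP:11434 &
ps aux | grep socat
```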
I don't understand your idea. If I run ollama serve in WSL together with socat TCP-LISTEN:11434,fork TCP:$HOST_IP:11434 & and then ps aux | grep socat, I find a process conflict, because they both use 11434. Maybe your ollama serve started on another port?
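One way to read the disagreement (my framing, not confirmed anywhere in the thread): the two setups are mutually exclusive on port 11434, so either pick one of them or move a listener. Ollama honors the OLLAMA_HOST environment variable for its bind address, which gives a third option:

```bash
# Option A: Ollama runs inside WSL -- no socat needed, nothing to bridge
ollama serve                          # binds 127.0.0.1:11434 by default

# Option B: Ollama runs on Windows -- only socat owns 11434 inside WSL,
# so make sure no WSL-side ollama serve is running first
socat TCP-LISTEN:11434,fork TCP:$HOST_IP:11434 &

# Option C: run both inside WSL by moving Ollama to a different port
OLLAMA_HOST=127.0.0.1:11435 ollama serve
```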
Have you solved the problem?
When I skipped step 8, my Convex backend was completely unable to access Ollama and kept telling me:
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: Request to http://localhost:11434/api/embeddings forbidden
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] [LOG] 'Texts to be sent for embedding: ' [ 'Bob is talking to Stella' ]
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] [LOG] 'Sending data for embedding: {"model":"mxbai-embed-large","prompt":"Bob is talking to Stella"}'
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: Request to http://localhost:11434/api/embeddings forbidden
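A quick sanity check (not from the thread itself) is to confirm that the endpoint in these logs is reachable from the same shell the Convex backend runs in; the payload below mirrors the one logged above:

```bash
curl http://localhost:11434/api/embeddings \
  -d '{"model":"mxbai-embed-large","prompt":"Bob is talking to Stella"}'
```

If this curl succeeds while the backend still logs "forbidden", the port forwarding is fine and the problem lies elsewhere; if it fails, nothing is answering on 11434 in that environment.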