Add GitHub Models support #113

Open · gs-101 wants to merge 1 commit into main
1 change: 1 addition & 0 deletions NEWS.org
@@ -1,4 +1,5 @@
* Version 0.21.0
- Add support for GitHub Models
- Add ~llm-models-add~ as a convenience method to add a model to the known list.
* Version 0.20.0
- Add ability to output according to a JSON spec.
5 changes: 5 additions & 0 deletions README.org
@@ -55,6 +55,11 @@ Microsoft Azure has an Open AI integration, although it doesn't support everythi
- ~:key~, the Azure key for Azure OpenAI service.
- ~:chat-model~, the chat model, which must be deployed in Azure.
- ~:embedding-model~, the embedding model, which must be deployed in Azure.
** GitHub Models
GitHub now has its own platform for interacting with AI models. For a list of models, check the [[https://github.com/marketplace/models][marketplace]]. You can set it up with ~make-llm-github~, with the following parameters:
- ~:key~, a GitHub token or an Azure AI production key.
- ~:chat-model~, the chat model, which can be any of the ones you have access to (currently o1 is restricted).
- ~:embedding-model~, the embedding model, which can be found [[https://github.com/marketplace?type=models&task=Embeddings][through a filter]].
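A minimal setup sketch, assuming the token is in the ~GITHUB_TOKEN~ environment variable; the model names below are illustrative, so substitute any models you have access to on the marketplace:

#+begin_src emacs-lisp
;; Sketch: create a GitHub Models provider from an environment token.
;; The model names are illustrative examples, not defaults.
(require 'llm-github)

(setq my-llm-provider
      (make-llm-github
       :key (getenv "GITHUB_TOKEN")
       :chat-model "gpt-4o-mini"
       :embedding-model "text-embedding-3-small"))
#+end_src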
** Gemini (not via Google Cloud)
This is Google's AI model. You can get an API key via their [[https://makersuite.google.com/app/apikey][page on Google AI Studio]].
Set this up with ~make-llm-gemini~, with the following parameters:
56 changes: 56 additions & 0 deletions llm-github.el
@@ -0,0 +1,56 @@
;;; llm-github.el --- llm module for integrating with GitHub Models -*- lexical-binding: t; package-lint-main-file: "llm.el"; byte-compile-docstring-max-column: 200 -*-

;; Copyright (c) 2024 Free Software Foundation, Inc.

;; Author: Gabriel Santos de Souza <[email protected]>
;; Homepage: https://github.com/ahyatt/llm
;; SPDX-License-Identifier: GPL-3.0-or-later

;; This program is free software: you can redistribute it and/or
;; modify it under the terms of the GNU General Public License as
;; published by the Free Software Foundation; either version 3 of the
;; License, or (at your option) any later version.

;; This program is distributed in the hope that it will be useful, but
;; WITHOUT ANY WARRANTY; without even the implied warranty of
;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
;; General Public License for more details.

;; You should have received a copy of the GNU General Public License
;; along with GNU Emacs. If not, see <https://www.gnu.org/licenses/>.

;;; Commentary:
;; This file implements the llm functionality defined in llm.el,
;; for the GitHub Models platform.

;;; Code:

(require 'llm)
(require 'llm-openai)
(require 'cl-lib)

(cl-defstruct (llm-github (:include llm-openai-compatible (url "https://models.inference.ai.azure.com"))))
Owner:

Interesting that this is an Azure URL. Can or should we base it off of the Azure provider instead? Perhaps it could just be an Azure provider with a different default value for the URL.

Author:

> Interesting that this is an Azure URL. Can or should we base it off of the Azure provider instead? Perhaps it could just be an Azure provider with a different default value for the URL.

It could be, as it also supports the Azure SDK:

[Screenshot taken 18-12-2024 of the model's page, showing this Python example:]

import os
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

endpoint = "https://models.inference.ai.azure.com"
model_name = "Meta-Llama-3.1-405B-Instruct"
token = os.environ["GITHUB_TOKEN"]

client = ChatCompletionsClient(
    endpoint=endpoint,
    credential=AzureKeyCredential(token),
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="What is the capital of France?"),
    ],
    temperature=1.0,
    top_p=1.0,
    max_tokens=1000,
    model=model_name
)

print(response.choices[0].message.content)

For basing it off (as in, making a new file like I did), I don't think that would be necessary, as it already works (the page in the screenshot only has the Azure example, but I was able to make a request with the current configuration).

But if we were to expand the Azure provider to support GitHub Models, that would require setting the chat and embedding URLs as arguments, or adding a conditional to change the URL formatting.

Example with chat URL:

(cl-defmethod llm-provider-chat-url ((provider llm-azure))
-  (format "%s/openai/deployments/%s/chat/completions?api-version=2024-08-01-preview"
-          (llm-azure-url provider)
-          (llm-azure-chat-model provider)))
(cl-defmethod llm-provider-chat-url ((provider llm-azure))
+  (if (llm-azure-github provider)
+      (format "%s/chat/completions"
+              (llm-azure-url provider))
+    (format "%s/openai/deployments/%s/chat/completions?api-version=2024-08-01-preview"
+            (llm-azure-url provider)
+            (llm-azure-chat-model provider))))

Owner:

You shouldn't need the if, though; you could have your struct include the llm-azure struct, and then keep your code as normal.

(cl-defmethod llm-provider-chat-url ((provider llm-github))
  (format "%s/chat/completions" (llm-azure-url provider)))

Author (@gs-101, Dec 18, 2024):

> You shouldn't need the if, though; you could have your struct include the llm-azure struct, and then keep your code as normal.

Thanks, sorry about that, I don't know much about Common Lisp.

I tried both of these:

-(require 'llm-openai)
+(require 'llm-azure)

-(cl-defstruct (llm-github (:include llm-openai-compatible (url "https://models.inference.ai.azure.com"))))
+(cl-defstruct (llm-github (:include llm-azure (url "https://models.inference.ai.azure.com"))))

 (cl-defmethod llm-provider-chat-url ((provider llm-github))
   (format "%s/chat/completions"
-          (llm-github-url provider)))
+          (llm-azure-url provider)))

 (cl-defmethod llm-provider-embedding-url ((provider llm-github) &optional _)
   (format "%s/embeddings/"
-          (llm-github-url provider)))
+          (llm-azure-url provider)))

But my requests timed out.


(cl-defmethod llm-nonfree-message-info ((_ llm-github))
  "Return GitHub's nonfree terms of service."
  "https://docs.github.com/en/site-policy/github-terms/github-terms-of-service")

(cl-defmethod llm-provider-chat-url ((provider llm-github))
  (format "%s/chat/completions"
          (llm-github-url provider)))

(cl-defmethod llm-provider-embedding-url ((provider llm-github) &optional _)
  (format "%s/embeddings/"
          (llm-github-url provider)))

(cl-defmethod llm-provider-headers ((provider llm-github))
  `(("api-key" . ,(llm-github-key provider))))

(cl-defmethod llm-capabilities ((_ llm-github))
  (list 'streaming 'embedding))

(cl-defmethod llm-name ((provider llm-github))
  (format "GitHub Models %s" (llm-github-chat-model provider)))

(provide 'llm-github)
;;; llm-github.el ends here
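As a rough usage sketch for the new provider: this assumes the generic entry points from llm.el (llm-chat and llm-make-simple-chat-prompt), and reuses the model name from the Python example above; it is illustrative, not part of this patch.

#+begin_src emacs-lisp
;; Hypothetical usage sketch: a single-turn chat request through the
;; new GitHub Models provider, using llm.el's generic API.
(require 'llm)
(require 'llm-github)

(let ((provider (make-llm-github
                 :key (getenv "GITHUB_TOKEN")
                 :chat-model "Meta-Llama-3.1-405B-Instruct")))
  ;; Returns the model's response as a string.
  (llm-chat provider
            (llm-make-simple-chat-prompt
             "What is the capital of France?")))
#+end_src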
5 changes: 5 additions & 0 deletions llm-integration-test.el
@@ -37,6 +37,8 @@
;; - AZURE_EMBEDDING_MODEL: The name of the embedding model to test.
;; - AZURE_SLEEP: The number of seconds to sleep between tests, to avoid rate
;; limiting.
;; - GITHUB_TOKEN: The key for GitHub models. Can either be a GitHub token or
;; an Azure production key.
;;
;; If any of these are set (except for Azure, which needs multiple), the
;; corresponding provider will be tested.
@@ -138,6 +140,9 @@ else. We really just want to see if it's in the right ballpark."
:chat-model (getenv "AZURE_CHAT_MODEL")
:embedding-model (getenv "AZURE_EMBEDDING_MODEL"))
providers))
(when (getenv "GITHUB_TOKEN")
(require 'llm-github)
(push (make-llm-github :key (getenv "GITHUB_TOKEN")) providers))
(when (getenv "OLLAMA_CHAT_MODELS")
(require 'llm-ollama)
;; This variable is a list of models to test.