Release prep #116

Merged · 3 commits · Jul 29, 2024
2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -1,6 +1,6 @@
 Package: chattr
 Title: Interact with Large Language Models in 'RStudio'
-Version: 0.1.0.9006
+Version: 0.2.0
 Authors@R: c(
     person("Edgar", "Ruiz", , "[email protected]", role = c("aut", "cre")),
     person(given = "Posit Software, PBC", role = c("cph", "fnd"))
2 changes: 1 addition & 1 deletion NEWS.md
@@ -1,4 +1,4 @@
-# chattr (dev)
+# chattr 0.2.0
 
 ## General
 
4 changes: 2 additions & 2 deletions R/backend-databricks.R
@@ -34,9 +34,9 @@ ch_databricks_complete <- function(prompt, defaults, stream = TRUE) {
 
   host <- ch_databricks_host(defaults)
   host_url <- url_parse(host)
-  if(is.null(host_url$scheme)) host_url$scheme <- "https"
+  if (is.null(host_url$scheme)) host_url$scheme <- "https"
 
-  user_agent <-paste0("chattr/", utils::packageVersion('chattr'))
+  user_agent <- paste0("chattr/", utils::packageVersion("chattr"))
 
   req_result <- host_url %>%
     url_build() %>%
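Beyond the spacing and quoting fixes, this hunk sits in the host-normalization step. As context, a minimal standalone sketch of that logic (the `normalize_host()` helper name is hypothetical; `url_parse()` and `url_build()` are the httr2 functions chattr uses):

```r
library(httr2)

# Hypothetical helper mirroring the scheme-defaulting step in
# ch_databricks_complete(): if the configured Databricks host has no
# URL scheme, assume "https" before rebuilding the URL.
normalize_host <- function(host) {
  host_url <- url_parse(host)
  if (is.null(host_url$scheme)) host_url$scheme <- "https"
  url_build(host_url)
}

normalize_host("https://example.cloud.databricks.com")
```

The conditional means a host that already specifies a scheme (e.g. `http://` behind a proxy) is left alone; only a missing scheme is defaulted.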
3 changes: 1 addition & 2 deletions R/backend-openai.R
@@ -201,12 +201,11 @@ ch_openai_error <- function(x, use_abort = TRUE) {
       "Error from OpenAI\n",
       substr(x, 10, nchar(x))
     )
-    if(use_abort) {
+    if (use_abort) {
       abort(error_msg)
     } else {
       cli_alert_warning(error_msg)
     }
-
   }
   invisible()
 }
2 changes: 1 addition & 1 deletion R/ch-r.R
@@ -49,7 +49,7 @@ ch_r_error <- function() {
   err <- ch_env$r_session$read_error()
   if (err != "") {
     error_marker <- "! {error}"
-    if(substr(err, 1, nchar(error_marker)) == error_marker) {
+    if (substr(err, 1, nchar(error_marker)) == error_marker) {
       err <- substr(err, nchar(error_marker) + 1, nchar(err))
     }
     out <- err
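The marker-stripping logic this hunk reformats can be sketched as a standalone helper (the `strip_error_marker()` name is hypothetical; the string handling matches the hunk above):

```r
# Hypothetical standalone version of the logic in ch_r_error(): errors read
# back from the background R session arrive prefixed with "! {error}", and
# that prefix is trimmed off before the message is surfaced to the user.
strip_error_marker <- function(err) {
  error_marker <- "! {error}"
  if (substr(err, 1, nchar(error_marker)) == error_marker) {
    err <- substr(err, nchar(error_marker) + 1, nchar(err))
  }
  err
}

strip_error_marker("! {error} object 'x' not found")
#> [1] " object 'x' not found"
```

Note the leading space is retained, since only the nine-character marker itself is removed.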
1 change: 0 additions & 1 deletion R/chattr-defaults.R
@@ -68,7 +68,6 @@ chattr_defaults <- function(type = "default",
       type <- "console"
     } else {
       type <- ui_current()
-
     }
   }

2 changes: 1 addition & 1 deletion R/chattr.R
@@ -43,7 +43,7 @@ chattr <- function(prompt = NULL,
   while (ch_r_state() == "busy") {
     curr_text <- ch_r_output()
     ret <- c(ret, curr_text)
-    if(ui_current() == "markdown") {
+    if (ui_current() == "markdown") {
       cat(curr_text)
     } else {
       ide_paste_text(curr_text)
34 changes: 20 additions & 14 deletions cran-comments.md
@@ -1,24 +1,30 @@
-## New package re-submission
+## Package Submission
 
-Addresses CRAN comments
+* Fixes how it identifies the user's current UI (console, app, notebook) and
+  appropriately outputs the response from the model end-point (#92)
 
-Original:
+* Adds support for the Databricks foundation model API (DBRX, Meta Llama 3 70B,
+  Mixtral 8x7B) (#99)
 
-This is a new package submission. Enables user interactivity with large-language
-models ('LLM') inside the 'RStudio' integrated development environment ('IDE').
-The user can interact with the model using the 'Shiny' app included in this
-package, or directly in the 'R' console. It comes with back-ends for 'OpenAI',
-'GitHub' 'Copilot', and 'LlamaGPT'.
+* Fixes how errors from the model end-point are displayed when chattr is used
+  in a notebook or in the app with OpenAI
+
+* Fixes how the errors from OpenAI are parsed and processed. This should make
+  it easier for users to determine where a downstream issue could be.
+
+* Adds `model` to defaults for Copilot
+
+* Improves Copilot token discovery
 
 ## R CMD check environments
 
-- Mac OS M3 (aarch64-apple-darwin23), R 4.3.3 (Local)
+- Mac OS M3 (aarch64-apple-darwin23), R 4.4.1 (Local)
 
-- Mac OS x86_64-apple-darwin20.0 (64-bit), R 4.3.3 (GH Actions)
-- Windows x86_64-w64-mingw32 (64-bit), R 4.3.3 (GH Actions)
-- Linux x86_64-pc-linux-gnu (64-bit), R 4.3.3 (GH Actions)
-- Linux x86_64-pc-linux-gnu (64-bit), R 4.4.0 (dev) (GH Actions)
-- Linux x86_64-pc-linux-gnu (64-bit), R 4.2.3 (old release) (GH Actions)
+- Mac OS x86_64-apple-darwin20.0 (64-bit), R 4.4.1 (GH Actions)
+- Windows x86_64-w64-mingw32 (64-bit), R 4.4.1 (GH Actions)
+- Linux x86_64-pc-linux-gnu (64-bit), R 4.4.1 (GH Actions)
+- Linux x86_64-pc-linux-gnu (64-bit), (dev) (GH Actions)
+- Linux x86_64-pc-linux-gnu (64-bit), R 4.3.3 (old release) (GH Actions)

## R CMD check results

2 changes: 1 addition & 1 deletion tests/testthat/test-chattr-use.R
@@ -20,7 +20,7 @@ test_that("Menu works", {
       return(1)
     }
   )
-  print(ch_get_ymls(menu = TRUE) )
+  print(ch_get_ymls(menu = TRUE))
   expect_true(
     ch_get_ymls(menu = TRUE) %in% c("gpt35", "gpt4", "gpt4o")
  )