
KL-UCB #59

Closed
52 commits (changes shown from 49 commits)
77eaef7
Add BayesUCB to DDPG and accelerated the speed of learning
shashist Oct 11, 2023
0323e96
Merge branch 'refactoring' into 'main'
shashist Oct 11, 2023
eea822a
[CI] Running CI in CI_DEFAULT_BRANCH and publish dev packages
OnlyDeniko Oct 13, 2023
25504eb
Merge branch 'feature/ci_update' into 'main'
monkey0head Oct 13, 2023
eceef07
Fix package registry path
OnlyDeniko Oct 16, 2023
a220e55
Merge branch 'fix/ci' into 'main'
monkey0head Oct 16, 2023
a6aebfd
Move some functionality to experimental part
EgorBodrov Oct 20, 2023
61144aa
Merge branch 'fix/split_on_experimental' into 'main'
monkey0head Oct 20, 2023
52e8a25
Add dataset functionality
EgorBodrov Oct 24, 2023
3af8e74
Merge branch 'feature/dataset_interface' into 'main'
OnlyDeniko Oct 24, 2023
3c7c6b2
[Feature] Building experimental package
OnlyDeniko Nov 1, 2023
c928f3d
Merge branch 'feature/experimental_packages' into 'main'
AleXXL1986 Nov 1, 2023
c2c241f
Add new splitters
EgorBodrov Nov 2, 2023
b5232e4
Merge branch 'feature/add_splitters' into 'main'
bysheva Nov 2, 2023
6fe0339
Feature/add new preprocessing
EgorBodrov Nov 9, 2023
5746fdc
Merge branch 'feature/add_new_preprocessing' into 'main'
bysheva Nov 9, 2023
66d1d33
Update metrics interfaces
OnlyDeniko Nov 10, 2023
dc9cabd
Merge branch 'feature/metrics' into 'main'
monkey0head Nov 10, 2023
c19bcd8
Fix metrics
OnlyDeniko Nov 10, 2023
0a194cc
Merge branch 'feature/metrics' into 'main'
OnlyDeniko Nov 10, 2023
5ef760b
Update models interfaces for Dataset concept
wowMalow Nov 13, 2023
d49083e
Merge branch 'feature/dataset2models' into 'main'
monkey0head Nov 13, 2023
523e1e1
Dependencies updates
EgorBodrov Nov 13, 2023
298fb58
Merge branch 'fix/dependency_version_changes' into 'main'
bysheva Nov 13, 2023
0d8b7ca
Pyspark - optional dependency
OnlyDeniko Nov 15, 2023
b77702d
Merge branch 'feature/pyspark_optional' into 'main'
monkey0head Nov 15, 2023
8554308
Torch - optional dependency in default part
OnlyDeniko Nov 15, 2023
ffe208e
Merge branch 'feature/torch_optional' into 'main'
OnlyDeniko Nov 15, 2023
3e7568a
Update filters to classes
wowMalow Nov 16, 2023
d143b18
Merge branch 'feature/update_filters' into 'main'
AleXXL1986 Nov 16, 2023
b1cb808
Upd Pyarrow version
crazy8nick Nov 16, 2023
fc6c5d4
Merge branch 'fix/pyarrow_new_version' into 'main'
OnlyDeniko Nov 16, 2023
a072065
Rename experiments to examples
Nov 16, 2023
71766ce
Fixed yaml
Nov 16, 2023
59d3974
fix
Nov 16, 2023
0ffa5b2
Merge branch 'feature/rename_examples' into 'main'
OnlyDeniko Nov 16, 2023
cdb6859
Update CI 0.0.7
OnlyDeniko Nov 16, 2023
f4236e2
Merge branch 'fix/ci' into 'main'
OnlyDeniko Nov 16, 2023
f4dadaa
unix
Nov 16, 2023
df63da2
Merge branch 'release/v0.13.0' into 'main'
monkey0head Nov 20, 2023
b9c9183
Little fixes in toml
OnlyDeniko Nov 20, 2023
fab0f2d
Merge branch 'fix/toml' into 'main'
OnlyDeniko Nov 20, 2023
abd825a
KL-UCB implementation with docs and experiments
Arkadiy-Vladimirov Aug 24, 2023
81c1cbc
tests for kl_ucb
Aug 25, 2023
2c75433
some naming and docs fixes
Arkadiy-Vladimirov Aug 29, 2023
0dccfc5
pylint fixes
Arkadiy-Vladimirov Nov 7, 2023
44a2d5e
fixed tests,linter
levensons Nov 20, 2023
dfd56b3
fixed tests
Nov 21, 2023
62cb686
rebase with main
Nov 22, 2023
2cc22f3
fixes
Nov 23, 2023
a5d89f2
fixed tests
Nov 28, 2023
7867dd7
fixes
Dec 1, 2023
2 changes: 1 addition & 1 deletion .coveragerc
@@ -1,4 +1,4 @@
[run]
omit =
replay/models/extensions/ann/index_stores/hdfs_index_store.py
replay/models/extensions/spark_custom_models/*
replay/experimental/*
7 changes: 6 additions & 1 deletion .gitignore
@@ -5,6 +5,10 @@
.ipynb_checkpoints
*/.ipynb_checkpoints/*

# Poetry (since they are generated from template automatically)
/poetry.lock
/pyproject.toml

### Python template
# Byte-compiled / optimized / DLL files
__pycache__/
@@ -37,6 +41,7 @@ share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
*.xml
MANIFEST

# PyInstaller
@@ -168,7 +173,7 @@ requirements.txt
airflow.yaml

# temporary
experiments/tests
examples/tests

### d3rlpy logs
d3rlpy_logs/
2 changes: 1 addition & 1 deletion .gitlab/Dockerfile
@@ -11,7 +11,7 @@ RUN update-alternatives --install /usr/bin/java java /usr/local/openjdk-11/bin/j

WORKDIR /root

RUN pip install --no-cache-dir --upgrade pip wheel poetry==1.5.1 lightfm \
RUN pip install --no-cache-dir --upgrade pip wheel poetry==1.5.1 poetry-dynamic-versioning lightfm==1.17 \
&& python -m poetry config virtualenvs.create false
COPY . .
RUN poetry install
158 changes: 143 additions & 15 deletions .gitlab/workflows/main.yml
@@ -1,73 +1,201 @@
workflow:
rules:
- if: $CI_PIPELINE_SOURCE == 'merge_request_event'
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
- if: $CI_PIPELINE_SOURCE == "schedule"
- if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
- if: $CI_COMMIT_TAG

image: "${CI_REGISTRY_IMAGE}:${VERSION}_py39"

variables:
VERSION: "0.0.5"
VERSION: "0.0.7"
PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"

.setup_env: &setup_env
- pip install -q --upgrade pip wheel poetry==1.5.1 lightfm
- pip install -q --upgrade pip wheel poetry==1.5.1 poetry-dynamic-versioning

.setup_experimental_env: &setup_experimental_env
- *setup_env
- pip install -q --upgrade lightfm==1.17

.install_replay: &install_replay
before_script:
- *setup_env
- poetry install
- ./poetry_wrapper.sh install

.install_experimental_replay: &install_experimental_replay
before_script:
- *setup_experimental_env
- ./poetry_wrapper.sh --experimental install

.install_experimental_replay_with_spark: &install_experimental_replay_with_spark
before_script:
- *setup_experimental_env
- ./poetry_wrapper.sh --experimental install --all-extras

cache: &global_cache
key: ${CI_COMMIT_REF_NAME}_${CI_COMMIT_SHORT_SHA}
paths:
- .cache/pip
- .cache/pypoetry
- ./.cache/pip
- ./.cache/pypoetry
policy: pull

stages:
- resolve
- code_quality
- test
- merge coverage
- examples
- build packages

resolve-job:
stage: resolve
cache:
<<: *global_cache
policy: push
script:
- *setup_env
- *setup_experimental_env
- poetry --version
- pip --version
- poetry install
- ./poetry_wrapper.sh --experimental install --all-extras
- dependencies="${CI_COMMIT_REF_NAME}_${CI_COMMIT_SHORT_SHA}_dependencies.txt"
- dependencies=$(echo ${dependencies} | sed -e 's/[^0-9a-zA-Z.-]/_/g') # removed invalid characters
- pip list > ${dependencies}
artifacts:
paths:
- poetry.lock
- projects/experimental/poetry.lock
- ${dependencies}
expire_in: 2 week
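The `sed` call in `resolve-job` above replaces every character outside the class `0-9a-zA-Z.-` with an underscore, so branch names containing slashes still yield valid artifact filenames. A standalone illustration (the branch and SHA values here are made up):

```shell
# Same sanitization as resolve-job: any character outside
# [0-9a-zA-Z.-] (e.g. '/', and '_' itself) is rewritten to '_'.
dependencies="feature/kl_ucb_abc1234_dependencies.txt"
dependencies=$(echo ${dependencies} | sed -e 's/[^0-9a-zA-Z.-]/_/g')
echo "${dependencies}"   # feature_kl_ucb_abc1234_dependencies.txt
```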

pylint-job:
<<: *install_replay
<<: *install_experimental_replay_with_spark
stage: code_quality
script:
- pylint replay

pycodestyle-job:
<<: *install_replay
<<: *install_experimental_replay_with_spark
stage: code_quality
script:
- pycodestyle replay tests

sphinx-job:
<<: *install_replay
<<: *install_experimental_replay_with_spark
stage: code_quality
script:
- make -C docs clean html

test-job:
<<: *install_replay
pytest-core:
stage: test
script:
- ./poetry_wrapper.sh install
- pytest -m core tests/
- mv .coverage .coverage_core
needs: ["pylint-job", "pycodestyle-job", "sphinx-job"]
artifacts:
paths:
- .coverage_core
expire_in: 1 day

pytest-torch:
stage: test
script:
- ./poetry_wrapper.sh install -E torch
- pytest -m "not spark and not experimental" tests/
- mv .coverage .coverage_torch
needs: ["pylint-job", "pycodestyle-job", "sphinx-job"]
artifacts:
paths:
- .coverage_torch
expire_in: 1 day

pytest-spark:
stage: test
script:
- ./poetry_wrapper.sh install -E spark
- pytest -m "not torch and not experimental" tests/
- mv .coverage .coverage_spark
needs: ["pylint-job", "pycodestyle-job", "sphinx-job"]
artifacts:
paths:
- .coverage_spark
expire_in: 1 day

pytest-spark-and-torch:
stage: test
script:
- pytest
- ./poetry_wrapper.sh install --all-extras
- pytest -m "not experimental" --ignore=replay/experimental --ignore=tests/experimental
- mv .coverage .coverage_all
needs: ["pylint-job", "pycodestyle-job", "sphinx-job"]
artifacts:
paths:
- .coverage_all
expire_in: 1 day

pytest-experimental:
stage: test
script:
- ./poetry_wrapper.sh --experimental install --all-extras
- pytest -m "experimental"
- mv .coverage .coverage_experimental
needs: ["pylint-job", "pycodestyle-job", "sphinx-job"]
artifacts:
paths:
- .coverage_experimental
expire_in: 1 day

merge-coverage:
stage: merge coverage
before_script:
- *setup_env
- ./poetry_wrapper.sh install --only dev
script:
- coverage combine .coverage_core .coverage_spark .coverage_torch .coverage_all .coverage_experimental
- coverage report --fail-under=93
- coverage xml
needs: ["pytest-core", "pytest-experimental", "pytest-spark", "pytest-spark-and-torch", "pytest-torch"]
coverage: '/TOTAL.*\s+(\d+%)$/'
artifacts:
when: always
reports:
coverage_report:
coverage_format: cobertura
path: coverage.xml


examples-execute-job:
<<: *install_replay
rules:
- when: never
stage: examples
script:
- export EXAMPLES_EXCLUDE=02_models_comparison.ipynb,06_item2item_recommendations.ipynb
- cd examples
- for i in *.ipynb; do [[ ! "$EXAMPLES_EXCLUDE" =~ "$i" ]] && jupyter nbconvert --to notebook --execute $i; done
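A note on the notebook loop in `examples-execute-job`: because the right-hand side of `=~` is quoted, bash matches it literally rather than as a regex, so the condition is effectively "is this filename a substring of `$EXAMPLES_EXCLUDE`". A self-contained sketch of that check (the `should_run` helper and the first notebook name are illustrative):

```shell
# A quoted RHS of [[ =~ ]] disables regex metacharacters: this is a
# literal substring test against the comma-separated exclude list.
EXAMPLES_EXCLUDE=02_models_comparison.ipynb,06_item2item_recommendations.ipynb
should_run() { [[ ! "$EXAMPLES_EXCLUDE" =~ "$1" ]]; }

should_run 01_replay_basics.ipynb && echo "run 01"
should_run 02_models_comparison.ipynb || echo "skip 02"
```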

build-production-package:
rules:
- if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
stage: build packages
script:
- *setup_env
- export PACKAGE_SUFFIX=.dev${CI_JOB_ID}
- echo $PACKAGE_SUFFIX
- ./poetry_wrapper.sh --generate
- poetry version
- poetry config repositories.replay ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi
- poetry publish --build -r replay -u gitlab-ci-token -p ${CI_JOB_TOKEN}


build-experimental-package:
rules:
- if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
stage: build packages
script:
- export PACKAGE_SUFFIX=.preview${CI_JOB_ID}
- echo $PACKAGE_SUFFIX
- ./poetry_wrapper.sh --experimental --generate
- poetry version
- poetry config repositories.replay ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi
- poetry publish --build -r replay -u gitlab-ci-token -p ${CI_JOB_TOKEN}