For the results, please see the thesis PDF.
This project requires Python 3.8+ and TensorFlow 2.2+ (TensorFlow 2.3 recommended). See the Dockerfile.
- All settings and hyperparameters are set in `src/utils/types.py`
- For explanatory notebooks, see `src/notebooks`
- For experiments, see `src/experiments` or run `src/run_experiments.py`
- To train, evaluate, and generate action placement for beat maps by hand, modify and run `src/experiment_by_hand.py`
- `research` is for previous iterations and experimentation during development
- Links to the songs used for comparing this project, OxAI DeepSaberv2, and Beat Sage are in `data/evaluation_dataset/song_urls.txt`
- Download the OxAI beat maps or your own beat maps
- Unzip them into `data/human_beatmaps/new_dataformat`
  - Or change `config.dataset.beat_maps_folder` to point at your folder (see the configuration sketch after this list)
- Run `src/generate_initial_dataset.py`
  - `data/new_dataformat` (as set in `config.dataset.storage_folder`) should be created and filled with pickled DataFrames
- Run `src/notebooks/create_action_embeddings.ipynb`
  - FastText action embeddings should be created
- Run `src/generate_initial_dataset.py` again, or start experimenting with `src/experiment_by_hand.py`
- The project is now ready for use
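The snippet below is a minimal sketch of overriding the dataset folders from Python. Only the `config.dataset.beat_maps_folder` and `config.dataset.storage_folder` attributes and the `src/utils/types.py` location come from this README; the `Config` class name and import path are assumptions.

```python
# Minimal sketch: point the pipeline at custom folders before generating the
# dataset. The `Config` class name and import path are assumptions; the
# attribute names are the ones referenced in the steps above.
from utils.types import Config  # assumed import when running from inside src/

config = Config()
config.dataset.beat_maps_folder = "data/human_beatmaps/new_dataformat"  # unzipped human beat maps
config.dataset.storage_folder = "data/new_dataformat"                   # where pickled DataFrames are written
```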
- Explore the data set in `src/notebooks/data_exploration.ipynb` (a minimal pandas sketch for a quick look at the pickled DataFrames follows this list)
- Experiment by hand with `src/experiment_by_hand.py`
- Run the experiments with `src/run_experiments.py` (this takes a long time)
- Explore the results in `src/notebooks/results_exploration.ipynb`
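For a quick look at the generated data outside the notebooks, here is a minimal pandas sketch; the `*.pkl` glob is an assumption, since the exact file names written by `src/generate_initial_dataset.py` are not listed here.

```python
# Minimal sketch: load and inspect the pickled DataFrames produced by
# src/generate_initial_dataset.py. The *.pkl pattern is an assumption.
from pathlib import Path
import pandas as pd

storage = Path("data/new_dataformat")  # config.dataset.storage_folder
for pkl_path in sorted(storage.rglob("*.pkl")):
    df = pd.read_pickle(pkl_path)
    print(pkl_path, df.shape)
    print(df.head())
    break  # inspect just the first file
```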
- Beat maps are sentences; actions are words (see the FastText sketch following this list)
  - Word2Vec and FastText are used to create action embeddings
  - A dataset of action analogies
- Evaluation of new features for learning to choreograph
  - Part of the song, difficulty, MFCC, etc. (see the MFCC sketch following this list)
- Multi-LSTM architecture
  - Handles multiple different input streams well (see the multi-stream LSTM sketch following this list)
- A local metric based on action embeddings
- A global metric that measures the similarity between human and synthetic choreography based on the distribution of new actions
- Video of the current version coming soon.
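A minimal sketch of the "beat maps are sentences, actions are words" idea using gensim's FastText (gensim 4 API assumed). The toy action tokens and vector size are made up; the real project derives actions from beat-map note placements and trains on the full dataset. The cosine similarity at the end only illustrates the idea behind the local, embedding-based metric.

```python
# Minimal sketch: treat each beat map as a sentence of action "words" and
# train FastText embeddings on them (toy data; gensim 4 API assumed).
import numpy as np
from gensim.models import FastText

# Each beat map becomes an ordered list of action tokens (made-up tokens here).
beat_map_sentences = [
    ["L0_down", "R1_up", "L2_left", "R0_down"],
    ["L0_down", "R0_down", "L1_up", "R1_up"],
]

model = FastText(vector_size=64, window=5, min_count=1)
model.build_vocab(corpus_iterable=beat_map_sentences)
model.train(corpus_iterable=beat_map_sentences,
            total_examples=len(beat_map_sentences), epochs=50)

def action_similarity(a: str, b: str) -> float:
    """Cosine similarity between two action embeddings (local-metric idea)."""
    va, vb = model.wv[a], model.wv[b]
    return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))

print(action_similarity("L0_down", "R0_down"))
```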
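A minimal sketch of extracting MFCC features with librosa; the bundled example clip and the number of coefficients are placeholders, while the real pipeline runs on the songs behind the beat maps.

```python
# Minimal sketch: extract MFCC frames for a song with librosa.
# librosa's bundled example clip stands in for a real Beat Saber song here.
import librosa

y, sr = librosa.load(librosa.ex("trumpet"))          # replace with a song file
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)   # shape: (20, n_frames)
print(mfcc.shape)
```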
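A minimal multi-stream LSTM sketch in Keras, not the thesis architecture: every size, stream name, and the output vocabulary below are assumptions. It only illustrates feeding several input streams (audio features, previous actions, per-step context) through separate LSTMs and merging them.

```python
# Minimal sketch of a multi-stream LSTM: one recurrent branch per input
# stream, concatenated before a per-step softmax over an action vocabulary.
# All sizes and stream names are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers

SEQ_LEN = 64          # time steps per training window (assumption)
N_MFCC = 20           # MFCC coefficients per frame (assumption)
ACTION_EMB_DIM = 64   # action embedding size (assumption)
N_ACTIONS = 2048      # size of the action vocabulary (assumption)

audio_in = layers.Input(shape=(SEQ_LEN, N_MFCC), name="mfcc")
actions_in = layers.Input(shape=(SEQ_LEN, ACTION_EMB_DIM), name="prev_actions")
context_in = layers.Input(shape=(SEQ_LEN, 2), name="difficulty_and_song_part")

audio_h = layers.LSTM(128, return_sequences=True)(audio_in)
actions_h = layers.LSTM(128, return_sequences=True)(actions_in)
context_h = layers.LSTM(32, return_sequences=True)(context_in)
merged = layers.Concatenate()([audio_h, actions_h, context_h])

next_action = layers.TimeDistributed(
    layers.Dense(N_ACTIONS, activation="softmax"), name="next_action")(merged)

model = tf.keras.Model([audio_in, actions_in, context_in], next_action)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```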