Hey 👋
You are here because you considered contributing to blackjax for at least a split second. Thank you! But sometimes we are just not quite sure what to work on, or don't want to bother the maintainers. We've been there. That is why we put together a list of the projects that are up for grabs on blackjax.
You can pick any of these, open an issue to signal that you are working on it, and run with it.
Documentation 📖
Documentation is obviously lacking, and as the user base is expanding, writing documentation is probably the biggest contribution one could make. Even setting things up (with Sphinx) would be a huge help!
Move the documentation to ReadTheDocs (but still execute the examples)
Move examples to a different repo to ease CI
Add a matplotlib style sheet for the plots in the docs
Use daft to represent models in the examples
Better API documentation
Examples 🎡
Examples are useful not only to get to know the library, but also to learn about the pros and cons of different algorithms. We are looking for any example contribution, and would be extra happy with:
Any comparison with optimization (looking at you, neural networks!);
Comparisons between sampling algorithms;
Examples with many dimensions / big datasets; problems where you typically wouldn't use Bayesian inference;
Complex combinations of algorithms: mgrad within Gibbs within SMC for a GP, for instance;
How to distribute gradient computation for HMC?
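One way the last point could be sketched (in plain JAX, not blackjax's API; `logdensity`, `distributed_grad`, and the toy Gaussian model below are all made up for illustration): for an i.i.d. model the log-density, and hence its gradient, is a sum over data points, so each device can differentiate its own data shard and the partial gradients can be combined with a `psum` collective.

```python
import jax
import jax.numpy as jnp

# Toy i.i.d. Gaussian model; this is an illustrative stand-in,
# not a blackjax API.
def logdensity(theta, data):
    return -0.5 * jnp.sum((data - theta) ** 2)

def distributed_grad(theta, data):
    # Split the data into one shard per device, differentiate the
    # per-shard log-density on each device, then sum the partial
    # gradients across devices with psum.
    n_dev = jax.local_device_count()
    shards = data.reshape(n_dev, -1)
    grads = jax.pmap(
        lambda shard: jax.lax.psum(jax.grad(logdensity)(theta, shard), "dev"),
        axis_name="dev",
    )(shards)
    return grads[0]  # every device ends up holding the same summed gradient

data = jnp.arange(6.0)
theta = jnp.array(0.0)
full = jax.grad(logdensity)(theta, data)  # gradient computed on all the data
parts = distributed_grad(theta, data)     # same gradient, computed shard-wise
```

The summed gradient could then be fed to an HMC integrator step; on a single-device machine the sketch degenerates to one shard, but the same code runs unchanged on multiple accelerators.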
Algorithms 🔨
We are of course always looking for new algorithms! The library currently has a focus on gradient-based algorithms and SMC, and we are currently looking for the following to expand the library's scope:
Adaptation ✨
In practice, adaptation algorithms (which compute reasonable values for the sampling algorithms' parameters) are as important as the samplers themselves. Here are the algorithms that we know of and would like to implement:
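To give a feel for what an adaptation algorithm looks like, here is a minimal plain-Python sketch of dual-averaging step-size adaptation in the style of Hoffman & Gelman's NUTS paper; the function name and default values are ours for illustration, not blackjax's implementation.

```python
import math

def dual_averaging(acceptance_probs, target=0.8, mu=math.log(1.0),
                   gamma=0.05, t0=10, kappa=0.75):
    """Adapt the step size so the average acceptance probability
    approaches `target` (dual averaging, Hoffman & Gelman, 2014)."""
    h_bar, log_eps_bar = 0.0, 0.0
    for t, alpha in enumerate(acceptance_probs, start=1):
        # running average of the "error" between target and observed acceptance
        h_bar += ((target - alpha) - h_bar) / (t + t0)
        # propose a (log) step size that counteracts the accumulated error
        log_eps = mu - math.sqrt(t) / gamma * h_bar
        # polynomially decaying average of the proposals
        eta = t ** -kappa
        log_eps_bar = eta * log_eps + (1 - eta) * log_eps_bar
    return math.exp(log_eps_bar)
```

If the observed acceptance rate sits exactly at the target, the step size stays put; acceptance consistently above the target pushes the step size up, and below pushes it down.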
Approximation 🍰
The next best thing when you can't easily sample from the posterior! There are a lot of exciting new algorithms that could be used alone or combined with MCMC.
Meta-algorithms ☁️
Testing 🔎
Performance is good, but accuracy and correctness are often better. We welcome implementations of the following algorithms: