Add GIF animations of Dashboard
c-bata committed Oct 28, 2020
1 parent bc6b5c1 commit 0a2aae2
Showing 1 changed file with 10 additions and 7 deletions.
README.md (17 changes: 10 additions & 7 deletions)
@@ -9,12 +9,6 @@ Distributed hyperparameter optimization framework, inspired by [Optuna](https://github.com/optuna/optuna).
This library is particularly designed for machine learning, but you can optimize anything as long as you can define an objective function
(e.g., the number of goroutines of your server or the memory buffer size of your caching system).
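
For example, an objective function is just a Go function that receives a `goptuna.Trial` and returns the value to minimize. Here is a minimal sketch (it assumes the TPE sampler and the float-suggestion API; the exact suggest method names vary between Goptuna versions, so check the repository examples):

```go
package main

import (
	"log"
	"math"

	"github.com/c-bata/goptuna"
	"github.com/c-bata/goptuna/tpe"
)

// objective is the function to minimize: (x1-2)^2 + (x2+5)^2.
func objective(trial goptuna.Trial) (float64, error) {
	x1, err := trial.SuggestFloat("x1", -10, 10)
	if err != nil {
		return 0, err
	}
	x2, err := trial.SuggestFloat("x2", -10, 10)
	if err != nil {
		return 0, err
	}
	return math.Pow(x1-2, 2) + math.Pow(x2+5, 2), nil
}

func main() {
	// Create a study that uses the TPE sampler, then run 100 trials.
	study, err := goptuna.CreateStudy(
		"sphere-example",
		goptuna.StudyOptionSampler(tpe.NewSampler()),
	)
	if err != nil {
		log.Fatal(err)
	}
	if err := study.Optimize(objective, 100); err != nil {
		log.Fatal(err)
	}

	v, _ := study.GetBestValue()
	p, _ := study.GetBestParams()
	log.Printf("best value=%f, best params=%v", v, p)
}
```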

**Key features:**

| State-of-the-art algorithms | Optuna compatible RDB backend |
| --------------------------- | ----------------------------- |
| <img width="750" alt="state-of-the-art-algorithms" src="https://user-images.githubusercontent.com/5564044/88860180-2d66cc00-d236-11ea-9a2f-de731c54a870.png"> | <img width="750" alt="optuna-compatibility" src="https://user-images.githubusercontent.com/5564044/88843168-a3aa0500-d21b-11ea-8fc1-d1cdca890a3f.png"> |

**Supported algorithms:**

Goptuna supports various state-of-the-art Bayesian optimization, evolution strategy, and multi-armed bandit algorithms.
@@ -28,6 +22,12 @@ These algorithms are implemented in pure Go and continuously benchmarked on GitHub Actions; a sampler-selection sketch follows the list below.
* Median Stopping Rule [6]
* ASHA: Asynchronous Successive Halving Algorithm (Optuna flavored version) [1,7,8]
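
Samplers are chosen per study. Here is a sketch of switching to CMA-ES (it assumes CMA-ES is exposed as a relative sampler via `goptuna.StudyOptionRelativeSampler` and the `cmaes` package; treat the option and package names as assumptions and check the repository examples):

```go
package main

import (
	"log"

	"github.com/c-bata/goptuna"
	"github.com/c-bata/goptuna/cmaes"
)

func main() {
	// CMA-ES proposes all parameters of a trial jointly, so it is plugged in
	// as the relative sampler; independent samplers such as TPE would use
	// goptuna.StudyOptionSampler instead.
	study, err := goptuna.CreateStudy(
		"cmaes-example",
		goptuna.StudyOptionRelativeSampler(cmaes.NewSampler()),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = study // run study.Optimize(objective, n) as in the sketch above
}
```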

**Built-in dashboard:**

| Manage optimization results | Interactive live-updating graphs |
| --------------------------- | -------------------------------- |
| <img width="750" alt="state-of-the-art-algorithms" src="https://user-images.githubusercontent.com/5564044/97099702-4107be80-16cf-11eb-9d97-f5ceec98ce52.gif"> | <img width="750" alt="visualization" src="https://user-images.githubusercontent.com/5564044/97099797-66e19300-16d0-11eb-826c-6977e3941fb0.gif"> |

**Projects using Goptuna:**

* [Kubeflow/Katib: Kubernetes-based system for hyperparameter tuning and neural architecture search.](https://github.com/kubeflow/katib)
@@ -96,7 +96,6 @@ Furthermore, I recommend using the RDB storage backend for the following purposes (a sketch follows the list below).
* Continue from where we stopped in the previous optimizations.
* Scale studies to tens of workers connecting to the same RDB storage.
* Check optimization results via the built-in dashboard.
* Visualize parameters in Jupyter notebooks using Optuna.
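
Here is a minimal sketch of wiring a study to an RDB storage so that workers and the dashboard share results. It uses SQLite; the `rdb.v2` import path, the gorm driver, and the assumption that the schema already exists are illustrative and may differ between Goptuna versions:

```go
package main

import (
	"log"

	"github.com/c-bata/goptuna"
	rdb "github.com/c-bata/goptuna/rdb.v2"
	"gorm.io/driver/sqlite"
	"gorm.io/gorm"
)

func main() {
	// Open a SQLite database and wrap it as a Goptuna storage backend.
	// This sketch assumes the tables were created beforehand (e.g. with the
	// goptuna CLI); it does not run migrations itself.
	db, err := gorm.Open(sqlite.Open("db.sqlite3"), &gorm.Config{})
	if err != nil {
		log.Fatal(err)
	}
	storage := rdb.NewStorage(db)

	// Every trial written through this study lands in the database, so other
	// workers can resume it and the built-in dashboard can read the same data.
	study, err := goptuna.CreateStudy(
		"rdb-example",
		goptuna.StudyOptionStorage(storage),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = study // run study.Optimize(objective, n) as in the sketch above
}
```

With results collected in one place, pointing the built-in dashboard at the same storage gives the live-updating views shown in the GIFs above.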

### Advanced usage

@@ -205,6 +204,10 @@ References:
* [8] [Liam Li, Kevin Jamieson, Afshin Rostamizadeh, Ekaterina Gonina, Moritz Hardt, Benjamin Recht, and Ameet Talwalkar. Massively parallel hyperparameter tuning. arXiv preprint arXiv:1810.05934, 2018.](https://arxiv.org/abs/1810.05934)
* [9] [J. Snoek, H. Larochelle, and R. Adams. Practical Bayesian optimization of machine learning algorithms. In Advances in Neural Information Processing Systems 25, pages 2960–2968, 2012.](https://arxiv.org/abs/1206.2944)

Presentations:

* :jp: [Goptuna Distributed Bayesian Optimization Framework at Go Conference 2019 Autumn](https://www.slideshare.net/c-bata/goptuna-distributed-bayesian-optimization-framework-at-go-conference-2019-autumn-187538495)

Blog posts:

* [Practical bayesian optimization using Goptuna](https://medium.com/@c_bata_/practical-bayesian-optimization-in-go-using-goptuna-edf97195fcb5).
