PyHopper is a hyperparameter optimizer that integrates seamlessly with existing training pipelines and lets you define hyperparameters of NumPy array type with millions of dimensions.
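For example, a search space can mix scalar and NumPy-array-valued hyperparameters. The following is a minimal sketch: `train_and_evaluate` is a placeholder for your own training code, and the exact parameter-expression signatures (e.g. the `shape` and `log` arguments) may differ slightly in your installed PyHopper version.

```python
import pyhopper

def objective(params: dict) -> float:
    # Train a model with `params` and return the validation metric to maximize.
    # `train_and_evaluate` is a placeholder for your own training code.
    return train_and_evaluate(params)

search = pyhopper.Search(
    {
        "lr": pyhopper.float(1e-5, 1e-2, log=True),  # scalar, log-uniform
        "hidden_size": pyhopper.int(64, 512),        # scalar integer
        # Array hyperparameter: a whole NumPy vector is sampled and mutated at once
        "per_layer_dropout": pyhopper.float(0.0, 0.5, shape=(8,)),
    }
)

best_params = search.run(objective, "maximize", "30min")
```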
With a single argument, you can run multiple training processes in parallel across all of your machine's GPUs without modifying your training code. PyHopper takes care of querying the available GPUs and setting up the environment variables accordingly.
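Concretely, the parallelization is a single argument to `search.run`. The snippet below reuses the `search` and `objective` objects from the sketch above and assumes the `n_jobs="per-gpu"` option:

```python
# One evaluation process per available GPU; the training code stays unchanged.
# Each worker is pinned to its GPU via environment variables
# (typically CUDA_VISIBLE_DEVICES).
best_params = search.run(
    objective,
    "maximize",
    "4h",               # total tuning budget
    n_jobs="per-gpu",   # spawn one worker per GPU detected on the machine
)
```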
Define your own hyperparameter types and sampling strategies with minimal code. PyHopper also has built-in support for tracking your experiments in TensorBoard, MLflow, and Weights & Biases.
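As a sketch of a custom type, the example below builds a permutation-valued hyperparameter from a seeding function (random initialization) and a mutation function (local perturbation). The `pyhopper.custom` call and its signature are assumptions here; check the PyHopper docs for the exact registration API.

```python
import numpy as np
import pyhopper

def seed_layer_order():
    # Seeding strategy: draw a random permutation of 10 layer indices.
    return np.random.permutation(10)

def mutate_layer_order(value):
    # Mutation strategy: swap two random positions of the current best value.
    mutated = value.copy()
    i, j = np.random.choice(len(mutated), size=2, replace=False)
    mutated[i], mutated[j] = mutated[j], mutated[i]
    return mutated

search = pyhopper.Search(
    {
        # Hypothetical custom parameter built from the two strategies above;
        # the exact `pyhopper.custom` signature may differ in your version.
        "layer_order": pyhopper.custom(seed_layer_order, mutate_layer_order),
    }
)
```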