When optimizing hyperparameters I may want to fix a specific parameter.
For example, whether to train the bonsai or parabel variant of omikuji.
This could possibly be configured by not optimizing hyperparameters that are already defined in the project.
I had a look at Optuna a while back; it doesn't allow this by default. The option I see is to wrap the trial in a proxy that handles this.
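To illustrate the proxy idea: a minimal sketch of a wrapper that intercepts Optuna's `suggest_*` calls and returns a preset value for any parameter listed as fixed, delegating everything else to the real trial. The class and parameter names here are illustrative, not Annif's or Optuna's actual API; `StubTrial` stands in for `optuna.trial.Trial` so the demo is self-contained.

```python
class FixedParamTrial:
    """Proxy around an Optuna trial that pins selected parameters.

    `fixed` maps parameter names to the values they should always take;
    all other parameters are suggested by the wrapped trial as usual.
    """

    def __init__(self, trial, fixed):
        self._trial = trial
        self._fixed = fixed

    def suggest_categorical(self, name, choices):
        if name in self._fixed:
            return self._fixed[name]
        return self._trial.suggest_categorical(name, choices)

    def suggest_int(self, name, low, high):
        if name in self._fixed:
            return self._fixed[name]
        return self._trial.suggest_int(name, low, high)

    def suggest_float(self, name, low, high):
        if name in self._fixed:
            return self._fixed[name]
        return self._trial.suggest_float(name, low, high)


class StubTrial:
    """Stand-in for optuna.trial.Trial, used only to demo the proxy."""

    def suggest_categorical(self, name, choices):
        return choices[0]

    def suggest_int(self, name, low, high):
        return low

    def suggest_float(self, name, low, high):
        return low
```

The objective function would then receive `FixedParamTrial(trial, fixed)` instead of the raw trial, so the study never explores the pinned dimensions. (Newer Optuna versions offer `PartialFixedSampler` for this, but it was not available when this issue was filed.)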
This makes sense, and I've had the same thought for example about fastText (where hyperopt isn't supported yet) - sometimes it would make sense to fix e.g. loss=hs when performing hyperparameter optimization, because the other options would be way too slow.
However, I'm not sure if "not optimizing hyperparameters defined in the project" is the best way - how about instead being able to specify the fixed parameters on the command line? This could be done using the --backend-param option, which is already implemented for e.g. the suggest command.
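As a sketch of how that might look on the command line, reusing the `BACKEND.PARAM=VALUE` syntax that `--backend-param` already uses for the suggest command (the parameter name and command support here are hypothetical, since `hyperopt` does not accept this option yet):

```shell
# Hypothetical: pin one omikuji parameter while optimizing the rest
annif hyperopt --backend-param omikuji.cluster_balanced=false my-project docs/
```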