Commit 326f05a

fix: optimizer was not used in workflow with multiple fits
For the optimizer to be used, `approximator.compile` has to be called, which was not happening. I adapted the `build_optimizer` function to match the description in its docstring, and made the compilation conditional on its output: the function now returns the newly configured optimizer, or None when an optimizer already exists and nothing new was configured.
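
A minimal sketch of the resulting control flow. The class below is a hypothetical stand-in (the fixed learning rate, the strategy branch condition, and the final `fit` call are illustrative and not from the commit); only the `build_optimizer` contract and the conditional `compile` mirror the change:

import keras

class WorkflowSketch:
    """Hypothetical stand-in for bayesflow.workflows.BasicWorkflow."""

    def __init__(self, approximator):
        self.approximator = approximator
        self.optimizer = None

    def build_optimizer(self, epochs: int, num_batches: int, strategy: str):
        # Docstring contract: if an optimizer is already configured, leave it
        # untouched and return None.
        if self.optimizer is not None:
            return None
        learning_rate = 1e-3  # placeholder; the real workflow picks its own rate
        # Branch condition is illustrative; the diff only shows the two branches.
        if strategy == "online":
            self.optimizer = keras.optimizers.Adam(learning_rate, clipnorm=1.5)
        else:
            self.optimizer = keras.optimizers.AdamW(learning_rate, weight_decay=5e-3, clipnorm=1.5)
        # Returning the fresh optimizer signals that (re)compilation is needed.
        return self.optimizer

    def _fit(self, dataset, epochs: int, strategy: str = "online", **kwargs):
        optimizer = self.build_optimizer(epochs, dataset.num_batches, strategy=strategy)
        if optimizer is not None:
            # Compile exactly when a new optimizer was configured. The old
            # guard `if not self.approximator.built:` skipped compilation for
            # an already-built approximator, so a freshly configured optimizer
            # was never attached.
            self.approximator.compile(optimizer=self.optimizer, metrics=kwargs.pop("metrics", None))
        return self.approximator.fit(dataset=dataset, epochs=epochs, **kwargs)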
Parent: d333870 · Commit: 326f05a

1 file changed: 5 additions, 3 deletions

bayesflow/workflows/basic_workflow.py

Lines changed: 5 additions & 3 deletions
@@ -914,6 +914,7 @@ def build_optimizer(self, epochs: int, num_batches: int, strategy: str) -> keras
             self.optimizer = keras.optimizers.Adam(learning_rate, clipnorm=1.5)
         else:
             self.optimizer = keras.optimizers.AdamW(learning_rate, weight_decay=5e-3, clipnorm=1.5)
+        return self.optimizer
 
     def _fit(
         self,
@@ -955,9 +956,10 @@ def _fit(
         else:
             kwargs["callbacks"] = [model_checkpoint_callback]
 
-        self.build_optimizer(epochs, dataset.num_batches, strategy=strategy)
-
-        if not self.approximator.built:
+        # returns None if no new optimizer was built and assigned to self.optimizer, which indicates we do not have
+        # to (re)compile the approximator.
+        optimizer = self.build_optimizer(epochs, dataset.num_batches, strategy=strategy)
+        if optimizer is not None:
             self.approximator.compile(optimizer=self.optimizer, metrics=kwargs.pop("metrics", None))
 
         try:
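
With the conditional compile in place, a second fit in the same workflow reuses the existing optimizer instead of silently training without it. A hypothetical usage sketch (constructor arguments are elided, and `fit_online` is assumed as a public entry point that forwards to `_fit`):

import bayesflow as bf

# Constructor arguments elided; see the BayesFlow documentation.
workflow = bf.BasicWorkflow(...)

# First fit: build_optimizer configures self.optimizer and returns it,
# so _fit compiles the approximator with that optimizer.
history_first = workflow.fit_online(epochs=10)

# Second fit: build_optimizer returns None because an optimizer already
# exists, so _fit skips recompilation and the optimizer state (e.g., Adam
# moment estimates) carries over.
history_second = workflow.fit_online(epochs=10)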
