adv. user 1.5

| If | Then | Ref |
| --- | --- | --- |
| used `self.log(sync_dist_op=...)` | use `self.log(reduce_fx=...)` instead; passing `"mean"` will still work, and it also accepts a callable | PR7891 |
| used the argument `model` from `pytorch_lightning.utilities.model_helper.is_overridden` | use `instance` instead | PR7918 |
| returned values from `training_step` that had `.grad` defined (e.g., a loss) and expected `.detach()` to be called for you | call `.detach()` manually | PR7994 |
| imported `pl.utilities.distributed.rank_zero_warn` | import `pl.utilities.rank_zero.rank_zero_warn` | |
| relied on the `DataModule.has_prepared_data` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_setup_fit` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_setup_validate` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_setup_test` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_setup_predict` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_teardown_fit` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_teardown_validate` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_teardown_test` attribute | manage the data lifecycle in custom methods | PR7657 |
| relied on the `DataModule.has_teardown_predict` attribute | manage the data lifecycle in custom methods | PR7657 |
| used `DDPPlugin.task_idx` | use `DDPStrategy.local_rank` | PR8203 |
| used `Trainer.disable_validation` | use the condition `not Trainer.enable_validation` | PR8291 |
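The `reduce_fx` argument accepts either the string `"mean"` or a callable. A minimal pure-Python sketch of how such a string-or-callable parameter can be resolved (the helper name, the `"sum"` branch, and the dispatch logic are hypothetical, for illustration only; this is not Lightning's implementation):

```python
import statistics
from typing import Callable, Union


def resolve_reduce_fx(reduce_fx: Union[str, Callable]) -> Callable:
    """Resolve a reduction spec that may be a string or a callable.

    Hypothetical helper mirroring the behavior described for
    ``self.log(reduce_fx=...)``: passing "mean" still works, and any
    callable (e.g. ``max``) is used as-is.
    """
    if callable(reduce_fx):
        return reduce_fx
    if reduce_fx == "mean":
        return statistics.fmean
    if reduce_fx == "sum":
        return sum
    raise ValueError(f"Unsupported reduce_fx: {reduce_fx!r}")


values = [1.0, 2.0, 3.0]
print(resolve_reduce_fx("mean")(values))  # 2.0
print(resolve_reduce_fx(max)(values))     # 3.0
```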
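For the `is_overridden` migration, the first argument is now named `instance`. A simplified sketch of what such an override check does, using the renamed argument (this is illustrative pseudologic, not the actual Lightning implementation):

```python
def is_overridden(method_name, instance, parent):
    """Return True if ``instance``'s class overrides ``method_name``
    from ``parent``.

    Simplified sketch using the renamed ``instance`` argument; the
    real utility lives in pytorch_lightning's model helper module.
    """
    if instance is None:
        return False
    child_attr = getattr(type(instance), method_name, None)
    parent_attr = getattr(parent, method_name, None)
    if child_attr is None or parent_attr is None:
        return False
    # Overridden iff the child's attribute is a different function object.
    return child_attr is not parent_attr


class Base:
    def setup(self):
        pass


class Child(Base):
    def setup(self):  # overrides Base.setup
        pass


print(is_overridden("setup", Child(), Base))  # True
print(is_overridden("setup", Base(), Base))   # False
```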
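With the `has_prepared_data`, `has_setup_*`, and `has_teardown_*` attributes removed, any "has this hook already run?" bookkeeping moves into your own methods. A minimal pure-Python sketch of that pattern (the class and flag names are made up for illustration; a real module would subclass `pytorch_lightning.LightningDataModule`):

```python
class MyDataModule:
    """Sketch of managing data-lifecycle state in custom methods.

    Illustrative only: the flag names are hypothetical, and a real
    implementation would subclass pytorch_lightning.LightningDataModule.
    """

    def __init__(self):
        self._prepared = False
        self._setup_stages = set()

    def prepare_data(self):
        # Guard one-time work with our own flag instead of the
        # removed ``has_prepared_data`` attribute.
        if self._prepared:
            return
        # ... expensive one-time work (download, tokenize) goes here ...
        self._prepared = True

    def setup(self, stage):
        # Run per-stage setup once; replaces ``has_setup_fit`` and friends.
        if stage in self._setup_stages:
            return
        # ... build datasets for `stage` here ...
        self._setup_stages.add(stage)


dm = MyDataModule()
dm.prepare_data()
dm.setup("fit")
dm.setup("fit")  # second call is a no-op
print(dm._prepared, sorted(dm._setup_stages))  # True ['fit']
```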


© Copyright (c) 2018-2023, Lightning AI et al.
