Source code for mmselfsup.core.hooks.densecl_hook

# Copyright (c) OpenMMLab. All rights reserved.
from mmcv.runner import HOOKS, Hook


@HOOKS.register_module()
class DenseCLHook(Hook):
    """Hook for DenseCL.

    This hook includes ``loss_lambda`` warmup in DenseCL.
    Borrowed from the authors' code: `<https://github.com/WXinlong/DenseCL>`_.

    Args:
        start_iters (int, optional): The number of warmup iterations to set
            ``loss_lambda=0``. Defaults to 1000.
    """

    def __init__(self, start_iters=1000, **kwargs):
        self.start_iters = start_iters

    def before_run(self, runner):
        assert hasattr(runner.model.module, 'loss_lambda'), \
            "The runner must have attribute \"loss_lambda\" in DenseCL."
        self.loss_lambda = runner.model.module.loss_lambda

    def before_train_iter(self, runner):
        assert hasattr(runner.model.module, 'loss_lambda'), \
            "The runner must have attribute \"loss_lambda\" in DenseCL."
        cur_iter = runner.iter
        if cur_iter >= self.start_iters:
            runner.model.module.loss_lambda = self.loss_lambda
        else:
            runner.model.module.loss_lambda = 0.
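
Since the hook is registered in ``HOOKS``, in MMSelfSup 0.x it is normally enabled through the ``custom_hooks`` field of a config file rather than constructed by hand. Because DenseCL weights its dense contrastive term by ``loss_lambda``, zeroing it during warmup means the model trains with the global contrastive loss alone for the first ``start_iters`` iterations. A minimal sketch follows; the config values are illustrative, not taken from a shipped config:

# Enable the warmup hook in a DenseCL config. For the first 1000
# iterations the hook overrides ``model.loss_lambda`` with 0., then
# restores the value it captured in ``before_run``.
custom_hooks = [
    dict(type='DenseCLHook', start_iters=1000),
]

# Equivalently, the hook can be built directly from the registry:
from mmcv.runner import HOOKS

hook = HOOKS.build(dict(type='DenseCLHook', start_iters=1000))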