
Note

You are reading the documentation for MMSelfSup 0.x, which will be gradually deprecated starting at the end of 2022. We recommend upgrading to MMSelfSup 1.0.0rc to benefit from the new features and improved performance brought by OpenMMLab 2.0. See the release notes and documentation of MMSelfSup 1.0.0rc for more information.

Source code for mmselfsup.core.hooks.densecl_hook

# Copyright (c) OpenMMLab. All rights reserved.
from mmcv.runner import HOOKS, Hook


@HOOKS.register_module()
class DenseCLHook(Hook):
    """Hook for DenseCL.

    This hook includes ``loss_lambda`` warmup in DenseCL.
    Borrowed from the authors' code: `<https://github.com/WXinlong/DenseCL>`_.

    Args:
        start_iters (int, optional): The number of warmup iterations to set
            ``loss_lambda=0``. Defaults to 1000.
    """

    def __init__(self, start_iters=1000, **kwargs):
        self.start_iters = start_iters

    def before_run(self, runner):
        assert hasattr(runner.model.module, 'loss_lambda'), \
            "The runner must have attribute \"loss_lambda\" in DenseCL."
        self.loss_lambda = runner.model.module.loss_lambda

    def before_train_iter(self, runner):
        assert hasattr(runner.model.module, 'loss_lambda'), \
            "The runner must have attribute \"loss_lambda\" in DenseCL."
        cur_iter = runner.iter
        if cur_iter >= self.start_iters:
            runner.model.module.loss_lambda = self.loss_lambda
        else:
            runner.model.module.loss_lambda = 0.
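
In effect, the hook records the model's original ``loss_lambda`` before training, forces it to 0 for the first ``start_iters`` iterations (disabling the dense contrastive term during warmup), and then restores the recorded value. As a minimal sketch, assuming the hook is enabled through MMCV's standard ``custom_hooks`` config list (the ``start_iters`` value below simply mirrors the default shown above):

# Config sketch (assumption: DenseCLHook is registered via the
# `custom_hooks` list that mmcv.runner passes to the runner).
custom_hooks = [
    dict(type='DenseCLHook', start_iters=1000)
]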