jittor.weightnorm

class jittor.weightnorm.WeightNorm(name: str, dim: int)[source]

• name (str): name of the weight variable to normalize;

• dim (int): dimension along which the weight is normalized; -1 means the last dimension.
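Weight normalization reparameterizes a weight tensor w into a direction part v and a magnitude part g, so that w = g · v / ‖v‖ (these correspond to the `weight_v` and `weight_g` attributes shown below). The sketch here illustrates the decomposition in plain NumPy; the axis convention (norm reduced along `dim`) is an illustrative assumption and may differ in detail from Jittor's internal reduction.

```python
import numpy as np

def decompose(w, dim=-1):
    """Split w into magnitude g and direction v.
    Illustrative convention: the norm is reduced along `dim`."""
    g = np.linalg.norm(w, axis=dim, keepdims=True)  # plays the role of weight_g
    v = w                                           # plays the role of weight_v
    return g, v

def compute_weight(g, v, dim=-1):
    """Recombine: w = g * v / ||v||, so each slice along `dim` has norm g."""
    return g * v / np.linalg.norm(v, axis=dim, keepdims=True)

w = np.array([[3.0, 4.0], [1.0, 0.0]])
g, v = decompose(w)
# the round trip is lossless: decompose then recombine recovers w
assert np.allclose(compute_weight(g, v), w)
```

During training the optimizer updates g and v independently, and the effective weight is recomputed from them on each forward pass.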

>>> import jittor as jt
>>> from jittor import weightnorm
>>> wn = weightnorm.WeightNorm("weight", -1)
>>> linear_layer = jt.nn.Linear(3,4)
>>> wn.apply(linear_layer, "weight", -1) # normalize linear_layer's weight variable
<jittor.weightnorm.WeightNorm object at 0x0000012FB9C3C400>
>>> hasattr(linear_layer, 'weight_g')
True
>>> wn.remove(linear_layer)
>>> hasattr(linear_layer, 'weight_g')
False

compute_weight(module: Module)[source]
remove(module: Module) → None[source]
jittor.weightnorm.remove_weight_norm(module, name: str = 'weight')[source]

• module (Module): module from which to remove weight normalization;

• name (str, optional): name of the weight attribute. Default: 'weight'.
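Conceptually, removing weight normalization folds `weight_g` and `weight_v` back into a single `weight` and deletes the auxiliary attributes. A minimal dictionary-based sketch of that semantics (the `_g`/`_v` suffixes follow the attribute names shown above; the reduction axis is an illustrative assumption):

```python
import numpy as np

def remove_weight_norm(params, name='weight', dim=-1):
    """Fold g and v back into one weight, dropping the auxiliary entries."""
    g = params.pop(name + '_g')
    v = params.pop(name + '_v')
    params[name] = g * v / np.linalg.norm(v, axis=dim, keepdims=True)
    return params

p = {'weight_g': np.full((2, 1), 2.0), 'weight_v': np.eye(2)}
p = remove_weight_norm(p)
assert 'weight' in p and 'weight_g' not in p and 'weight_v' not in p
```

After removal the module behaves like an ordinary layer again: the weight is a plain parameter and is no longer recomputed from g and v on each forward pass.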

>>> from jittor import weightnorm
>>> from jittor import nn
>>> model = nn.Linear(20, 40)
>>> model = weightnorm.weight_norm(model, 'weight', -1)
>>> hasattr(model, "__fhook2__")
True
>>> model = weightnorm.remove_weight_norm(model, 'weight')
>>> hasattr(model, "__fhook2__")
False

jittor.weightnorm.weight_norm(module, name, dim)[source]

• module (Module): the input module;

• name (str): name of the parameter to normalize;

• dim (int): dimension along which the weight is normalized.
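The choice of `dim` controls which slices of the weight get a unit direction. Assuming the norm is reduced along the given axis (an illustrative convention; check the implementation for the exact reduction), `dim=-1` normalizes each row of a 2-D weight, while `dim=0` normalizes each column:

```python
import numpy as np

w = np.array([[3.0, 4.0], [0.0, 2.0]])
rows = w / np.linalg.norm(w, axis=-1, keepdims=True)  # dim=-1: per-row direction
cols = w / np.linalg.norm(w, axis=0, keepdims=True)   # dim=0: per-column direction
assert np.allclose(np.linalg.norm(rows, axis=-1), 1.0)
assert np.allclose(np.linalg.norm(cols, axis=0), 1.0)
```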

>>> import jittor as jt
>>> from jittor import weightnorm
>>> class jt_module(jt.nn.Module):
...     def __init__(self, weight):
...         super().__init__()
...         self.linear = jt.array(weight)
...
...     def execute(self, x):
...         return jt.matmul(self.linear, x)
...
>>> weight = [[1.0, 2.0], [3.0, 4.0]]  # example data for the module's weight
>>> jm = jt_module(weight)
>>> weightnorm.weight_norm(jm, 'linear', -1)