jittor.optim

This is the API documentation for Jittor's optimizer module. You can access it with from jittor import optim.

class jittor.optim.Adam(params, lr, eps=1e-08, betas=(0.9, 0.999), weight_decay=0)

Adam Optimizer.

Example:

optimizer = nn.Adam(model.parameters(), lr, eps=1e-8, betas=(0.9, 0.999))
optimizer.step(loss)

step(loss)
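
For context, here is a slightly fuller sketch of a typical training loop with Adam. The toy model, random data, and squared-error loss are illustrative assumptions, not part of this module:

import jittor as jt
from jittor import nn, optim

model = nn.Linear(10, 1)                                      # toy model (assumed)
optimizer = optim.Adam(model.parameters(), lr=1e-3, eps=1e-8, betas=(0.9, 0.999))

x = jt.random((32, 10))                                       # random input batch
y = jt.random((32, 1))                                        # random targets

for _ in range(5):
    pred = model(x)
    loss = ((pred - y) ** 2).mean()                           # mean squared error
    optimizer.step(loss)                                      # computes gradients and updates parameters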
class jittor.optim.Optimizer(params, lr, param_sync_iter=10000)

Base class for optimizers.

Example:

optimizer = nn.SGD(model.parameters(), lr)
optimizer.step(loss)

pre_step(loss)

Things that should be done before step, such as computing gradients, MPI synchronization, and so on.

Example:

class MyOptimizer(Optimizer):
    def step(self, loss):
        self.pre_step(loss)  # compute gradients, MPI sync, etc.
        ...

step(loss)
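
As a rough sketch of the pattern above, a custom optimizer calls pre_step(loss) to compute and collect gradients, then applies its own update rule. The param_groups layout used below ("params" / "grads" lists, per-group "lr") mirrors Jittor's built-in optimizers and is an assumption here, not part of the documented interface:

class MySGD(Optimizer):
    def step(self, loss):
        self.pre_step(loss)                       # calc gradients, mpi sync, and so on
        for pg in self.param_groups:              # assumed layout: {"params": [...], "grads": [...]}
            lr = pg.get("lr", self.lr)
            for p, g in zip(pg["params"], pg["grads"]):
                if p.is_stop_grad():              # skip frozen parameters
                    continue
                p.update(p - lr * g)              # plain gradient-descent update

optimizer = MySGD(model.parameters(), lr=0.1)
optimizer.step(loss)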
class jittor.optim.RMSprop(params, lr=0.01, eps=1e-08, alpha=0.99)

RMSprop Optimizer.

Args:

params(list): parameters of model.
lr(float): learning rate.
eps(float): term added to the denominator to avoid division by zero, default 1e-8.
alpha(float): smoothing constant, default 0.99.

Example:

optimizer = nn.RMSprop(model.parameters(), lr)
optimizer.step(loss)

step(loss)
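
The same usage with the documented eps and alpha arguments passed explicitly (model, lr, and loss are placeholders, as in the example above):

optimizer = nn.RMSprop(model.parameters(), lr=0.01, eps=1e-8, alpha=0.99)
optimizer.step(loss)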
class jittor.optim.SGD(params, lr, momentum=0, weight_decay=0, dampening=0, nesterov=False)

SGD Optimizer.

Example:

optimizer = nn.SGD(model.parameters(), lr, momentum=0.9)
optimizer.step(loss)

step(loss)
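
A short, self-contained sketch of SGD with the remaining constructor options set explicitly. The toy model and random data are illustrative assumptions:

import jittor as jt
from jittor import nn, optim

model = nn.Linear(4, 2)                                       # toy model (assumed)
optimizer = optim.SGD(model.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=1e-4,
                      dampening=0, nesterov=True)

x = jt.random((8, 4))                                         # random input batch
loss = (model(x) ** 2).mean()                                 # placeholder loss
optimizer.step(loss)                                          # backward pass and parameter update in one call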