
Note: This method modifies the module in-place.
This is useful for weakening an assumption to the finite case. The conditions are also convenient to define the dual notion of a finitely cogenerated module M.

So it should be called before constructing the optimizer if the module will live on XPU while being optimized.
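A minimal sketch of this ordering, assuming a PyTorch environment (the device check falls back to CPU so the snippet stays runnable where no XPU is present):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)

# Move the module in place *before* creating the optimizer, so the
# optimizer's parameter references point at the already-moved tensors.
# "xpu" is an assumption here; we fall back to CPU for portability.
device = "xpu" if getattr(torch, "xpu", None) and torch.xpu.is_available() else "cpu"
model.to(device)

# The optimizer captures references to the on-device parameters.
opt = torch.optim.SGD(model.parameters(), lr=0.1)
```

Constructing the optimizer first and moving the module afterwards would leave the optimizer holding references to the pre-move tensors.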

Module — PyTorch 1.9.0 documentation
Otherwise, yields only parameters that are direct members of this module.
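A short illustration of the `recurse` flag's effect, assuming a container module that owns no parameters of its own:

```python
import torch.nn as nn

# A Sequential container: its parameters all live in submodules.
net = nn.Sequential(nn.Linear(3, 3), nn.Linear(3, 1))

# recurse=True (the default) yields parameters of this module and all submodules.
all_params = list(net.parameters())

# recurse=False yields only parameters that are direct members of this module;
# the Sequential itself owns none, so this is empty.
direct_params = list(net.parameters(recurse=False))
```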
Finitely generated module
This number equals the maximal number of A-linearly independent vectors in M, or equivalently the rank of a maximal free submodule of M.
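As a small worked example (an illustration added here, not from the source), take A = Z and a module with both free and torsion parts:

```latex
% Illustration: A = \mathbb{Z},\quad M = \mathbb{Z}^2 \oplus \mathbb{Z}/6\mathbb{Z}.
% Torsion elements can never belong to a linearly independent set
% (6 \cdot (0,0,\bar{1}) = 0), so
\operatorname{rank} M
  = \max\{\, n : m_1,\dots,m_n \in M \text{ are } \mathbb{Z}\text{-linearly independent} \,\}
  = 2,
% realized by the maximal free submodule \mathbb{Z}^2 \oplus 0 \subseteq M.
```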
Then M is a G-module.
Returns: The Parameter referenced by target. Return type: torch.nn.Parameter. Keys are the corresponding parameter and buffer names.
Similarly, an Artinian module M is co-Hopfian: any injective endomorphism f is also a surjective endomorphism. The hook can modify the output.


The user can either return a tuple or a single modified value in the hook.
Returns a handle that can be used to remove the added hook by calling handle.remove().
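A minimal sketch tying these two points together: a forward hook that returns a single modified value to replace the output, and removal of the hook via the returned handle:

```python
import torch
import torch.nn as nn

layer = nn.Linear(2, 2)

# A forward hook may return a single modified value (or a tuple)
# that replaces the layer's output.
def double_output(module, inputs, output):
    return output * 2

handle = layer.register_forward_hook(double_output)

x = torch.ones(1, 2)
hooked = layer(x)   # output doubled by the hook

# Removing the hook via the handle restores the original behavior.
handle.remove()
plain = layer(x)
```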
The category of G-modules can be identified with the category of left (resp. right) Z[G]-modules. The term G-module is also used for the more general notion of an R-module on which G acts linearly, i.e., by R-module automorphisms.
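Concretely, the identification works by extending the G-action Z-linearly to the group ring (a standard construction, spelled out here for illustration):

```latex
% A G-action on an abelian group M extends \mathbb{Z}-linearly to \mathbb{Z}[G]:
\Big(\sum_{g \in G} n_g\, g\Big) \cdot m
  \;=\; \sum_{g \in G} n_g\,(g \cdot m),
  \qquad n_g \in \mathbb{Z},\; m \in M,
% and this assignment is the identification of G-modules
% with left \mathbb{Z}[G]-modules.
```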
For a Noetherian ring R, finitely generated, finitely presented, and coherent are equivalent conditions on a module. See the above example for how to specify a fully-qualified string.
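A brief sketch of what a fully-qualified target string looks like, assuming a simple Sequential model (the path is dot-separated through named submodules down to the attribute):

```python
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 1))

# "0.weight" walks into submodule "0" (the first Linear) and
# fetches its "weight" attribute.
w = net.get_parameter("0.weight")
```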

The parameter can be accessed as an attribute using the given name.

Otherwise, yields only buffers that are direct members of this module.
This method is helpful for freezing part of the module for finetuning or training parts of a model individually (e.g., GAN training).
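A minimal sketch of freezing one submodule in place so that only the rest of the model receives gradients:

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 4), nn.Linear(4, 2))

# Freeze the first layer in place; requires_grad_ recursively sets
# requires_grad on that submodule's parameters.
model[0].requires_grad_(False)

# Only the second layer's parameters remain trainable.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
```

An optimizer for fine-tuning would then typically be built over only the still-trainable parameters.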
Buffers, by default, are persistent and will be saved alongside parameters.
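A short sketch of the persistence distinction, assuming a small custom module: a default (persistent) buffer appears in the state_dict, while one registered with `persistent=False` does not:

```python
import torch
import torch.nn as nn

class Stats(nn.Module):
    def __init__(self):
        super().__init__()
        # Persistent by default: saved in state_dict alongside parameters.
        self.register_buffer("running_mean", torch.zeros(3))
        # Non-persistent: part of the module's state, but excluded
        # from state_dict (and hence from checkpoints).
        self.register_buffer("scratch", torch.zeros(3), persistent=False)

m = Stats()
keys = set(m.state_dict().keys())
```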