olm.nn.activations.gelu¶
Classes¶
| Class | Description |
|---|---|
| GELU(*args, **kwargs) | GELU activation wrapper. |
class olm.nn.activations.gelu.ActivationBase(*args: Any, **kwargs: Any)¶
Bases: Module, ABC
Abstract base class for all activation functions.
Ensures a consistent interface for activation layers, handling device and dtype initialization. Subclasses must implement the forward method.
device¶
The device the module is on.
- Type: torch.device, optional
dtype¶
The data type of the module parameters.
- Type: torch.dtype
abstractmethod forward(x: torch.Tensor) → torch.Tensor¶
Apply activation to x.
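As noted above, ActivationBase is abstract and every subclass must implement forward. A minimal sketch of that pattern, using plain Python floats in place of torch.Tensor; the Identity subclass is hypothetical, for illustration only:

```python
from abc import ABC, abstractmethod

class ActivationBase(ABC):
    """Sketch of the abstract activation interface described above."""

    @abstractmethod
    def forward(self, x: float) -> float:
        """Apply activation to x."""

class Identity(ActivationBase):
    """Hypothetical subclass: satisfies the interface by returning x unchanged."""

    def forward(self, x: float) -> float:
        return x

# The abstract base cannot be instantiated directly; subclasses
# that implement forward can.
act = Identity()
print(act.forward(2.5))  # → 2.5
```

Attempting `ActivationBase()` raises TypeError, which is how the ABC machinery enforces the consistent interface the class promises.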
class olm.nn.activations.gelu.GELU(*args: Any, **kwargs: Any)¶
Bases: ActivationBase
GELU activation wrapper.
forward(x: torch.Tensor) → torch.Tensor¶
Apply activation to x.
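GELU computes x · Φ(x), where Φ is the standard normal CDF. A torch-free numeric sketch of the exact (non-approximated) formula, assuming this wrapper follows the standard definition, as `torch.nn.GELU` does by default:

```python
import math

def gelu(x: float) -> float:
    """Exact GELU: x * Phi(x), with Phi the standard normal CDF via erf."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

print(round(gelu(1.0), 4))  # → 0.8413
print(gelu(0.0))            # → 0.0
```

Unlike ReLU, GELU is smooth everywhere and weights inputs by how likely they are to be positive under a standard normal, which is why `gelu(1.0)` is slightly below 1.0 rather than passing the input through unchanged.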