Category: master index of the 《深入浅出Pytorch函数》 series
Related articles:
· 机器学习中的数学——激活函数:Softmax函数
· 深入浅出Pytorch函数——torch.softmax/torch.nn.functional.softmax
· 深入浅出Pytorch函数——torch.nn.Softmax
Applies the Softmax function to all slices along dim, rescaling them so that the elements lie in the range [0, 1] and each slice sums to 1.
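Concretely, each element x_i of a slice taken along dim is mapped by the standard Softmax formula (the same one quoted in the implementation's docstring below):

Softmax(x_i) = exp(x_i) / Σ_j exp(x_j)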
Syntax
torch.softmax(input, dim, *, dtype=None) -> Tensor
torch.nn.functional.softmax(input, dim=None, _stacklevel=3, dtype=None) -> Tensor
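Both entry points dispatch to the same underlying operation; torch.softmax simply does not expose the internal _stacklevel argument. A quick check that they agree (output illustrative):

>>> x = torch.randn(4, 5)
>>> torch.equal(torch.softmax(x, dim=1), torch.nn.functional.softmax(x, dim=1))
True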
Parameters
- input: [Tensor] the input tensor.
- dim: [int] the dimension along which Softmax will be computed, i.e. every slice along dim sums to 1.
- dtype: [optional, torch.dtype] the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed, which is useful for preventing data-type overflows (see the sketch after this list). Default: None.
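A minimal sketch of the dtype argument (tensor shape and values here are illustrative): casting a half-precision input up to torch.float32 before the exponentials are computed avoids precision and overflow problems, and the returned tensor carries the requested dtype.

>>> x = torch.randn(3, 5, dtype=torch.float16)
>>> y = torch.softmax(x, dim=1, dtype=torch.float32)  # input is cast to float32 first
>>> y.dtype
torch.float32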
Return value
A Tensor with the same shape as input, whose values lie in the range [0, 1].
Example
>>> x = torch.randn(4, 5)
>>> torch.nn.functional.softmax(x, dim=0)
tensor([[0.4259, 0.5448, 0.1935, 0.3904, 0.1963],
        [0.1370, 0.1053, 0.1966, 0.2625, 0.4343],
        [0.0540, 0.2823, 0.5101, 0.2082, 0.0905],
        [0.3832, 0.0676, 0.0998, 0.1390, 0.2789]])
>>> torch.nn.functional.softmax(x, dim=1)
tensor([[0.0728, 0.1720, 0.1233, 0.5541, 0.0779],
        [0.1139, 0.0667, 0.2677, 0.4361, 0.1156],
        [0.2576, 0.0811, 0.0671, 0.2975, 0.2968],
        [0.1874, 0.0358, 0.2240, 0.4470, 0.1057]])
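To confirm the normalization described above, summing the result over the same dim gives (up to floating-point rounding) all ones:

>>> torch.nn.functional.softmax(x, dim=1).sum(dim=1)
tensor([1.0000, 1.0000, 1.0000, 1.0000])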
Implementation
def softmax(input: Tensor, dim: Optional[int] = None, _stacklevel: int = 3, dtype: Optional[DType] = None) -> Tensor:
    r"""Applies a softmax function.

    Softmax is defined as:

    :math:`\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}`

    It is applied to all slices along dim, and will re-scale them so that the elements
    lie in the range `[0, 1]` and sum to 1.

    See :class:`~torch.nn.Softmax` for more details.

    Args:
        input (Tensor): input
        dim (int): A dimension along which softmax will be computed.
        dtype (:class:`torch.dtype`, optional): the desired data type of returned tensor.
          If specified, the input tensor is casted to :attr:`dtype` before the operation
          is performed. This is useful for preventing data type overflows. Default: None.

    .. note::
        This function doesn't work directly with NLLLoss,
        which expects the Log to be computed between the Softmax and itself.
        Use log_softmax instead (it's faster and has better numerical properties).
    """
    if has_torch_function_unary(input):
        return handle_torch_function(softmax, (input,), input, dim=dim, _stacklevel=_stacklevel, dtype=dtype)
    if dim is None:
        dim = _get_softmax_dim("softmax", input.dim(), _stacklevel)
    if dtype is None:
        ret = input.softmax(dim)
    else:
        ret = input.softmax(dim, dtype=dtype)
    return ret
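Regarding the note in the docstring: NLLLoss expects log-probabilities, so it should be paired with log_softmax rather than softmax (or replaced by cross_entropy, which fuses the two steps). A minimal sketch, with illustrative shapes and targets:

import torch
import torch.nn.functional as F

logits = torch.randn(4, 5)           # 4 samples, 5 classes
target = torch.tensor([0, 2, 1, 4])  # ground-truth class indices

# NLLLoss consumes log-probabilities, so compute log_softmax first:
loss = F.nll_loss(F.log_softmax(logits, dim=1), target)
# Equivalent in a single, numerically stable call:
loss_fused = F.cross_entropy(logits, target)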