Tensors
- Tensors are data structures similar to arrays and matrices (in PyTorch, tensors are used to encode a model's inputs and outputs, as well as the model's parameters).
- Tensors are similar to NumPy's ndarrays, but they can also run on GPUs or other specialized hardware to accelerate computation.
Tensor Initialization
- Tensors can be initialized in several ways.
Directly from data
- A tensor can be created directly from data (the data type is inferred automatically). For example:
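A minimal sketch using the standard `torch.tensor` constructor (variable names are illustrative):

``` python
import torch

# Create a tensor directly from nested Python lists;
# the dtype (here torch.int64) is inferred from the data.
data = [[1, 2], [3, 4]]
x_data = torch.tensor(data)
```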
From a NumPy array
- A tensor can be created from a NumPy array, preserving the array's shape and data type (and vice versa, see Bridge with NumPy). For example:
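A minimal sketch using `torch.from_numpy`:

``` python
import numpy as np
import torch

# Build a tensor from a NumPy array; shape and dtype are preserved.
np_array = np.array([[1, 2], [3, 4]])
x_np = torch.from_numpy(np_array)
```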
From another tensor
- The new tensor retains the properties (shape, datatype) of the source tensor, unless explicitly overridden. For example:
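A minimal sketch, reusing `x_data` from the sketch above:

``` python
# Retains the shape and dtype of x_data.
x_ones = torch.ones_like(x_data)

# Keeps the shape but overrides the datatype of x_data.
x_rand = torch.rand_like(x_data, dtype=torch.float)
```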
With random or constant values
`shape` is a tuple of tensor dimensions; in the functions below, it determines the dimensionality of the output tensor.
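A minimal sketch that would produce output of the form shown below (the exact random values will differ):

``` python
import torch

shape = (2, 3)
rand_tensor = torch.rand(shape)
ones_tensor = torch.ones(shape)
zeros_tensor = torch.zeros(shape)

print(f"Random Tensor:\n {rand_tensor}\n")
print(f"Ones Tensor:\n {ones_tensor}\n")
print(f"Zeros Tensor:\n {zeros_tensor}")
```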
``` output
Random Tensor:
 tensor([[0.3904, 0.6009, 0.2566],
        [0.7936, 0.9408, 0.1332]])

Ones Tensor:
 tensor([[1., 1., 1.],
        [1., 1., 1.]])

Zeros Tensor:
 tensor([[0., 0., 0.],
        [0., 0., 0.]])
```
Tensor Attributes
- Tensor attributes describe their shape, datatype, and the device on which they are stored. For example:
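A minimal sketch that prints these three attributes:

``` python
import torch

tensor = torch.rand(3, 4)

print(f"Shape of tensor: {tensor.shape}")
print(f"Datatype of tensor: {tensor.dtype}")
print(f"Device tensor is stored on: {tensor.device}")
```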
Tensor Operations
- Over 100 tensor operations, including transposing, indexing, slicing, mathematical operations, linear algebra, random sampling, and more are comprehensively described here.
- All of these operations can be run on the GPU (usually much faster than on the CPU).
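A minimal sketch of moving a tensor to the GPU when one is available (reusing `tensor` from the attributes sketch above):

``` python
# Tensors are created on the CPU by default; move to the GPU if available.
if torch.cuda.is_available():
    tensor = tensor.to('cuda')
```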
Some operations
If you’re familiar with the NumPy API, you’ll find the Tensor API a breeze to use.
Standard numpy-like indexing and slicing
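A minimal sketch of NumPy-style indexing and slicing on a tensor:

``` python
import torch

tensor = torch.ones(4, 4)
tensor[:, 1] = 0      # zero out the second column
print(tensor)
```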
Joining tensors
- We can use `torch.cat` to concatenate a sequence of tensors along a given dimension.
- See also `torch.stack`, another tensor joining op that is subtly different from `torch.cat`, and `torch.split`.
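A minimal sketch of `torch.cat`, reusing the 4x4 `tensor` from the slicing sketch above:

``` python
# Concatenate three copies of `tensor` along dimension 1 (columns).
t1 = torch.cat([tensor, tensor, tensor], dim=1)
print(t1.shape)       # torch.Size([4, 12])
```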
Multiplying tensors
- Computing the element-wise product
- Computing the matrix multiplication between two tensors
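A minimal sketch of both products, again reusing the 4x4 `tensor` from the slicing sketch:

``` python
# Element-wise product: the two spellings are equivalent.
print(tensor.mul(tensor))
print(tensor * tensor)

# Matrix multiplication: the two spellings are equivalent.
print(tensor.matmul(tensor.T))
print(tensor @ tensor.T)
```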
In-place operations
- Operations that have a `_` suffix are in-place: they modify the tensor they are called on. For example, `x.copy_(y)` and `x.t_()` will change `x`.
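A minimal sketch of an in-place operation:

``` python
import torch

x = torch.ones(2, 2)
x.add_(5)             # in-place: every element of x becomes 6
print(x)
```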
Note
In-place operations save some memory, but they can be problematic when computing derivatives because the operation history is lost. Hence, their use is discouraged.
Bridge with NumPy
Tensors on the CPU and NumPy arrays can share their underlying memory locations, and changing one will change the other.
Tensor to NumPy array
- A change in the tensor reflects in the NumPy array.
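A minimal sketch showing the shared memory in the tensor-to-NumPy direction:

``` python
import torch

t = torch.ones(5)
n = t.numpy()

t.add_(1)             # in-place change to the tensor...
print(t)              # tensor([2., 2., 2., 2., 2.])
print(n)              # [2. 2. 2. 2. 2.]  ...is visible in the NumPy array
```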
NumPy array to Tensor
- Changes in the NumPy array reflect in the tensor.
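A minimal sketch for the NumPy-to-tensor direction:

``` python
import numpy as np
import torch

n = np.ones(5)
t = torch.from_numpy(n)

np.add(n, 1, out=n)   # in-place change to the array...
print(t)              # tensor([2., 2., 2., 2., 2.], dtype=torch.float64)
print(n)              # [2. 2. 2. 2. 2.]  ...is visible in the tensor
```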