
Warm-up: numpy

A third-order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing squared Euclidean distance.

This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
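The backward pass in the code below is derived by hand. For the loss \(L=\sum_i(\hat y_i - y_i)^2\) with \(\hat y = a + bx + cx^2 + dx^3\), the chain rule gives:

```latex
\frac{\partial L}{\partial a} = \sum_i 2(\hat y_i - y_i), \qquad
\frac{\partial L}{\partial b} = \sum_i 2(\hat y_i - y_i)\,x_i,
```
```latex
\frac{\partial L}{\partial c} = \sum_i 2(\hat y_i - y_i)\,x_i^2, \qquad
\frac{\partial L}{\partial d} = \sum_i 2(\hat y_i - y_i)\,x_i^3.
```

These four sums are exactly the `grad_a` through `grad_d` expressions computed inside the training loop.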

A numpy array is a generic n-dimensional array; it knows nothing about deep learning, gradients, or computational graphs, and is simply a way to perform generic numerical computations.
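Because numpy has no autograd, hand-derived gradients like the ones below are easy to get wrong. A quick sanity check (a sketch on a simpler one-parameter loss, not part of the tutorial code) is to compare the analytic gradient against a central finite-difference estimate:

```python
import numpy as np

# Toy setup: loss L(b) = sum((a + b*x - y)^2), with the hand-derived
# gradient dL/db = sum(2 * (pred - y) * x)
x = np.linspace(-1.0, 1.0, 100)
y = 2.0 * x
a, b = 0.5, 1.0

def loss(a, b):
    return np.square(a + b * x - y).sum()

# Analytic gradient, derived by the chain rule
grad_b = (2.0 * (a + b * x - y) * x).sum()

# Numerical estimate via central differences
eps = 1e-6
grad_b_num = (loss(a, b + eps) - loss(a, b - eps)) / (2 * eps)

print(grad_b, grad_b_num)  # the two values should agree closely
```

The same check applies to each of the four gradients in the polynomial example: perturb one coefficient, difference the loss, and compare.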

99 1546.0365145892183
199 1095.8992907857576
299 777.6096767345516
399 552.5352611528602
499 393.3692169177716
599 280.80664270842703
699 201.19883404382793
799 144.89548639133625
899 105.07298180275444
999 76.90616893555887
1099 56.982896477615675
1199 42.89011969702015
1299 32.9212810307606
1399 25.869417918780947
1499 20.880874497663395
1599 17.35185928109817
1699 14.855296144964393
1799 13.08909475122299
1899 11.839566794279794
1999 10.955552494109163
Result: y = 0.04887728059779718 + 0.854335392917212 x + -0.008432144224579809 x^2 + -0.09298823470267067 x^3

import numpy as np
import math

# Create random input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')

Total running time of the script: (0 minutes 0.347 seconds)

Gallery generated by Sphinx-Gallery
