Deep Learning Basics with PyTorch (Boostcourse) - Lab-04-1 Multivariable Linear Regression
In linear regression, what should we do when x has multiple features (i.e., multivariable linear regression)?
=> Use matmul!
It is more concise, the code does not need to change when the number of features in x changes, and it is also faster.
hypothesis = x_train.matmul(W) + b
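For reference, here is a minimal sketch of the low-level version that this line comes from; the W, b, and sample data defined here are illustrative assumptions, not part of the original notes:

import torch

# Illustrative data: 3 samples with 3 features each
x_train = torch.FloatTensor([[73, 80, 75],
                             [93, 88, 93],
                             [89, 91, 90]])
y_train = torch.FloatTensor([[152], [185], [180]])

# One weight per feature, packed into a (3, 1) column vector, plus a scalar bias
W = torch.zeros((3, 1), requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# Without matmul, each feature needs its own term:
#   hypothesis = x1 * w1 + x2 * w2 + x3 * w3 + b
# With matmul, one line covers any number of features:
hypothesis = x_train.matmul(W) + b          # shape (3, 1), matching y_train
cost = torch.mean((hypothesis - y_train) ** 2)

The nn.Module version below expresses the same model with nn.Linear(3, 1), which manages W and b internally.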
import torch
import torch.nn as nn
import torch.nn.functional as F
class MultivariateLinearRegressionModel(nn.Module):
    def __init__(self):
        super().__init__()
        # 3 input features -> 1 output value
        self.linear = nn.Linear(3, 1)

    def forward(self, x):
        return self.linear(x)
x_train = torch.FloatTensor([[73, 80, 75],
                             [93, 88, 93],
                             [89, 91, 90],
                             [96, 98, 100],
                             [73, 66, 70]])
y_train = torch.FloatTensor([[152], [185], [180], [196], [142]])
model = MultivariateLinearRegressionModel()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-5)
nb_epochs = 20
for epoch in range(nb_epochs + 1):
    # H(x): predictions from the current parameters
    hypothesis = model(x_train)
    # Mean squared error between prediction and target
    cost = F.mse_loss(hypothesis, y_train)

    # Gradient descent update
    optimizer.zero_grad()
    cost.backward()
    optimizer.step()

    print('Epoch {:4d}/{} hypothesis: {} Cost: {:.6f}'.format(
        epoch, nb_epochs, hypothesis.squeeze().detach(), cost.item()
    ))
>>>
Epoch 0/20 hypothesis: tensor([-62.1786, -79.0841, -75.5887, -83.1113, -60.8045]) Cost: 59994.242188
Epoch 1/20 hypothesis: tensor([33.4710, 35.8815, 37.6879, 40.2441, 26.8857]) Cost: 18809.871094
Epoch 2/20 hypothesis: tensor([ 87.0213, 100.2469, 101.1071, 109.3062, 75.9807]) Cost: 5900.755859
Epoch 3/20 hypothesis: tensor([117.0017, 136.2830, 136.6131, 147.9714, 103.4675]) Cost: 1854.430298
Epoch 4/20 hypothesis: tensor([133.7861, 156.4586, 156.4915, 169.6186, 118.8568]) Cost: 586.119507
Epoch 5/20 hypothesis: tensor([143.1827, 167.7544, 167.6205, 181.7379, 127.4731]) Cost: 188.569061
Epoch 6/20 hypothesis: tensor([148.4430, 174.0789, 173.8511, 188.5230, 132.2975]) Cost: 63.955395
Epoch 7/20 hypothesis: tensor([151.3876, 177.6200, 177.3392, 192.3216, 134.9988]) Cost: 24.893152
Epoch 8/20 hypothesis: tensor([153.0358, 179.6029, 179.2920, 194.4482, 136.5117]) Cost: 12.646570
Epoch 9/20 hypothesis: tensor([153.9581, 180.7133, 180.3851, 195.6387, 137.3590]) Cost: 8.805440
Epoch 10/20 hypothesis: tensor([154.4740, 181.3353, 180.9970, 196.3051, 137.8338]) Cost: 7.598882
Epoch 11/20 hypothesis: tensor([154.7624, 181.6839, 181.3394, 196.6781, 138.1001]) Cost: 7.218121
Epoch 12/20 hypothesis: tensor([154.9234, 181.8793, 181.5310, 196.8868, 138.2495]) Cost: 7.096243
Epoch 13/20 hypothesis: tensor([155.0131, 181.9890, 181.6381, 197.0036, 138.3336]) Cost: 7.055464
Epoch 14/20 hypothesis: tensor([155.0629, 182.0508, 181.6979, 197.0689, 138.3811]) Cost: 7.040148
Epoch 15/20 hypothesis: tensor([155.0903, 182.0856, 181.7313, 197.1053, 138.4081]) Cost: 7.032846
Epoch 16/20 hypothesis: tensor([155.1052, 182.1055, 181.7498, 197.1256, 138.4236]) Cost: 7.028001
Epoch 17/20 hypothesis: tensor([155.1131, 182.1169, 181.7601, 197.1369, 138.4327]) Cost: 7.023935
Epoch 18/20 hypothesis: tensor([155.1171, 182.1236, 181.7657, 197.1431, 138.4382]) Cost: 7.020101
Epoch 19/20 hypothesis: tensor([155.1189, 182.1277, 181.7687, 197.1465, 138.4417]) Cost: 7.016383
Epoch 20/20 hypothesis: tensor([155.1194, 182.1302, 181.7702, 197.1482, 138.4440]) Cost: 7.012684
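As a quick usage check (not in the original notes; the new sample's feature values are hypothetical), the trained model can be asked to predict a score for an unseen input:

# Hypothetical new sample with the same 3 features; values chosen for illustration
new_input = torch.FloatTensor([[75, 85, 72]])
with torch.no_grad():               # gradients are not needed for inference
    prediction = model(new_input)   # shape (1, 1)
print(prediction)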