Dropout: randomly zeroing the inputs going into a node so that those parts are not trained.

  • A way to avoid overfitting
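
As a quick illustration (a minimal sketch; the tensor of ones is arbitrary), torch.nn.Dropout(p) zeroes each input element with probability p during training and rescales the survivors by 1/(1-p) so the expected value is unchanged, while in eval mode it passes the input through untouched:

import torch
torch.manual_seed(0)
drop = torch.nn.Dropout(0.8)   # each element is zeroed with probability 0.8
x = torch.ones(10)
drop.train()     # training mode: random mask, survivors scaled by 1/(1-0.8)=5
print(drop(x))   # roughly 8 of the 10 entries are 0, the rest are 5.0
drop.eval()      # evaluation mode: dropout acts as the identity
print(drop(x))   # all ones, unchanged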

import torch 
import matplotlib.pyplot as plt 
torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
plt.plot(X,y)

Network setup, optimizer, loss

X.shape,y.shape
(torch.Size([100, 1]), torch.Size([100, 1]))
torch.manual_seed(1) # same initial weights every run; set up a net to fit the plot above
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=512), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=512,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()

Model training

for epoc in range(1000): 
    ## 1 forward pass
    yhat=net(X) 
    ## 2 compute the loss
    loss=loss_fn(yhat,y) 
    ## 3 backpropagate
    loss.backward()
    ## 4 update the parameters, then reset the gradients
    optimizer.step()
    net.zero_grad() 
plt.plot(X,y) 
plt.plot(X,yhat.data)
  • The fitted yhat should not chase y; since y is pure random noise, the right answer is a flat line
  • A textbook overfitting case
  • Also, from the data alone we cannot be sure whether a trend is real or whether the truth must be 0
  • So we split into train/validation, fit on train, and ask whether the fitted curve also matches validation; anything that does not is overfitting

train / validation

X1=X[:80]   # first 80 points: train
y1=y[:80]
X2=X[80:]   # last 20 points: validation
y2=y[80:] 
torch.manual_seed(1) 
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=512), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=512,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
for epoc in range(1000): 
    ## 1 
    y1hat=net(X1) 
    ## 2 
    loss=loss_fn(y1hat,y1) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step() 
    net.zero_grad() 
plt.plot(X,y)
plt.plot(X1,net(X1).data,'--r') 
plt.plot(X2,net(X2).data,'--g') 
  • The net chases whatever pattern it can see in the training data to drive the loss down,
  • which produces the overfitting shown above

Dropout

X1=X[:80]
y1=y[:80]
X2=X[80:]
y2=y[80:] 
torch.manual_seed(1) 
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=512), 
    torch.nn.ReLU(),
    torch.nn.Dropout(0.8), # 80% of the activations passed on from the previous layer are zeroed
    torch.nn.Linear(in_features=512,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
for epoc in range(1000): 
    ## 1 
    y1hat=net(X1) 
    ## 2 
    loss=loss_fn(y1hat,y1) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step() 
    net.zero_grad() 
 
net.eval() ## switch the network to evaluation mode
# we zero activations while training, but there is no need to zero them at evaluation time
plt.plot(X,y)
plt.plot(X1,net(X1).data,'--r') 
plt.plot(X2,net(X2).data,'--g') 

Warning: when the 512 nodes get updated, the updates pile onto the particular nodes that happen to work well. This can itself bring on overfitting.
  • Keep torch.nn.Dropout(0.8) in place while training and disable it when evaluating
  • Looking only at the 80 training points, whether the dropout fit is better can be a matter of perspective,
  • but on the 20 held-out points it is clear that the dropout version fits better
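
One way to see what net.eval() actually toggles (a minimal sketch reusing net and X1 from above): in training mode every forward pass draws a fresh dropout mask, so two calls disagree; in eval mode dropout is the identity and the output is deterministic.

net.train()                              # dropout active: a fresh random mask per call
print(torch.allclose(net(X1), net(X1))) # False (almost surely)
net.eval()                               # dropout off: forward pass is deterministic
print(torch.allclose(net(X1), net(X1))) # True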

ref: https://pytorch.org/docs/stable/generated/torch.nn.Dropout.html

  • During training, randomly zeroes some of the elements of the input tensor with probability p using samples from a Bernoulli distribution. Each channel will be zeroed out independently on every forward call.
  • This has proven to be an effective technique for regularization and preventing the co-adaptation of neurons as described in the paper Improving neural networks by preventing co-adaptation of feature detectors.
  • Dropout is one of the techniques most often used to improve the generalization of a neural network
  • Randomly deleting some of the connections between layers during training acts like averaging several networks, and that ensemble effect is what improves generalization

Comparing the training runs (warning: the code gets messy)

- Generate the data

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1) 
y=torch.randn(100).reshape(100,1) 

- tr/val split

X_tr=X[:80]
y_tr=y[:80]
X_val=X[80:]
y_val=y[80:] 

- Set up the networks, optimizers, and loss function

  • A network with dropout (net2) and one without (net1)
  • A matching optimizer for each
  • The loss function
torch.manual_seed(1) 
net1=torch.nn.Sequential(
    torch.nn.Linear(1,512), 
    torch.nn.ReLU(),
    torch.nn.Linear(512,1)) 
optimizer_net1 = torch.optim.Adam(net1.parameters()) 
net2=torch.nn.Sequential(
    torch.nn.Linear(1,512), 
    torch.nn.ReLU(),
    torch.nn.Dropout(0.8),
    torch.nn.Linear(512,1)) 
optimizer_net2 = torch.optim.Adam(net2.parameters())
loss_fn=torch.nn.MSELoss()
tr_loss_net1=[]  # containers to record the results of the run
val_loss_net1=[]
tr_loss_net2=[] 
val_loss_net2=[] 
for epoc in range(1000): 
    ## 1 
    yhat_tr_net1 = net1(X_tr) 
    ## 2 
    loss_tr = loss_fn(yhat_tr_net1, y_tr) 
    ## 3 
    loss_tr.backward() 
    ## 4 
    optimizer_net1.step()
    net1.zero_grad() 
    ## 5 record
    ### tr 
    tr_loss_net1.append(loss_tr.item())
    
    ### val 
    yhat_val_net1 = net1(X_val) 
    loss_val = loss_fn(yhat_val_net1,y_val) 
    val_loss_net1.append(loss_val.item())
for epoc in range(1000): 
    ## 1 
    yhat_tr_net2 = net2(X_tr) 
    ## 2 
    loss_tr = loss_fn(yhat_tr_net2, y_tr) 
    ## 3 
    loss_tr.backward() 
    ## 4 
    optimizer_net2.step()
    net2.zero_grad() 
    ## 5 record
    ### tr 
    net2.eval() # record in eval mode so dropout is switched off
    tr_loss_net2.append(loss_tr.item())
    ### val 
    yhat_val_net2 = net2(X_val) 
    loss_val = loss_fn(yhat_val_net2,y_val) 
    val_loss_net2.append(loss_val.item())
    net2.train() # back to training mode for the next epoch
net2.eval() # needed once more before plotting
fig , ((ax1,ax2),(ax3,ax4)) = plt.subplots(2,2)
ax1.plot(X,y,'.');ax1.plot(X_tr,net1(X_tr).data); ax1.plot(X_val,net1(X_val).data); 
ax2.plot(X,y,'.');ax2.plot(X_tr,net2(X_tr).data); ax2.plot(X_val,net2(X_val).data); 
ax3.plot(tr_loss_net1);ax3.plot(val_loss_net1);
ax4.plot(tr_loss_net2);ax4.plot(val_loss_net2);
  • ax3: the validation loss falls (fitting well) and then rises (overfitting)
  • ax4: the validation loss keeps drifting down (fitting well)
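
Because the loss curves are kept in plain Python lists, the epoch where net1 tips from fitting into overfitting can be read straight off (a small sketch; numpy is used only for argmin):

import numpy as np
best_epoch = int(np.argmin(val_loss_net1))   # epoch with the lowest validation loss for net1
print(best_epoch, val_loss_net1[best_epoch]) # training past this point only hurts validation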

- This all works, but writing the code is exhausting.

  • Come to think of it, we would also need to build mini-batches, and move the parameters onto GPU memory with the data split into those batches.
  • Features such as early stopping (stopping before overfitting sets in) and more would have to be implemented as well.
  • Later we would want to rerun everything with different learning rates and record the results $\to$ that is how a good learning rate gets chosen.
  • Writing steps 1~4 inside a for loop again and again is sheer repetition.
  • And so on..

- What these all have in common: easy to picture in your head, fiddly to actually implement.

Note: pseudocode 의사코드: In computer science, pseudocode is a plain language description of the steps in an algorithm or another system. Pseudocode often uses structural conventions of a normal programming language, but is intended for human reading rather than machine reading. It typically omits details that are essential for machine understanding of the algorithm, such as variable declarations and language-specific code. The programming language is augmented with natural language description details, where convenient, or with compact mathematical notation. The purpose of using pseudocode is that it is easier for people to understand than conventional programming language code, and that it is an efficient and environment-independent description of the key principles of an algorithm. It is commonly used in textbooks and scientific publications to document algorithms and in planning of software and other algorithms.

- What we actually want to do

  • Design the architecture: it often has to be tailored to the data (this part is on us)
  • Loss functions: statistics professors research these
  • Optimizers: industrial engineering professors research these

- The professor's take

  • What companies want: design deep-learning architectures that analyze real data $\to$ observe the results architecture by architecture (conveniently) $\Longrightarrow$ fastai + real data
  • What students want: at the same time, to understand in fine detail how the model works $\Longrightarrow$ pytorch + toy examples (mostly regression and the like)
  • What researchers want: to modify existing models slightly for their own use $\Longrightarrow$ (pytorch + fastai) + any data

- tensorflow + keras vs pytorch + fastai

pytorch + fastai

- Create the datasets.

X_tr=X[:80]
y_tr=y[:80]
X_val=X[80:]
y_val=y[80:] 
ds1=torch.utils.data.TensorDataset(X_tr,y_tr) 
ds2=torch.utils.data.TensorDataset(X_val,y_val) 
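
TensorDataset simply pairs the two tensors index by index; a quick check (the names x0, y0 are mine):

x0, y0 = ds1[0]                      # returns the matching (X_tr[0], y_tr[0]) pair
print(x0 == X_tr[0], y0 == y_tr[0])  # tensor([True]) tensor([True])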

- Create the dataloaders.

  • Up to this point everything is at the pytorch level
  • From here on it is fastai
dl1 = torch.utils.data.DataLoader(ds1, batch_size=80) # use all 80 points in one batch
dl2 = torch.utils.data.DataLoader(ds2, batch_size=20) 
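
To check what a loader actually emits, pull one mini-batch (a small sketch; xb, yb are my names):

xb, yb = next(iter(dl1))    # the one and only training batch, since batch_size=80
print(xb.shape, yb.shape)   # torch.Size([80, 1]) torch.Size([80, 1])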

- Create the DataLoaders object.

from fastai.vision.all import * 
dls=DataLoaders(dl1,dl2) 

Version without dropout

- Network design (without dropout)

torch.manual_seed(1) 
net_fastai = torch.nn.Sequential(
    torch.nn.Linear(in_features=1, out_features=512),
    torch.nn.ReLU(),
    #torch.nn.Dropout(0.8),
    torch.nn.Linear(in_features=512, out_features=1)) 
#optimizer 
loss_fn=torch.nn.MSELoss() # a plain torch loss can be passed in as-is

- Learner object (the object that runs the for loop for us)

lrnr= Learner(dls,net_fastai,opt_func=Adam,loss_func=loss_fn) 

- Set the number of epochs and train right away

lrnr.fit(1000)

epoch train_loss valid_loss time
0 1.277156 0.491314 00:00
1 1.277145 0.455286 00:00
2 1.275104 0.444275 00:00
3 1.274429 0.465787 00:00
4 1.273436 0.507203 00:00
... (output truncated: train_loss falls steadily for all 1000 epochs, while valid_loss bottoms out around 0.418 near epoch 150-160 and then climbs) ...
997 1.119822 0.814241 00:00
998 1.119760 0.796303 00:00
999 1.119699 0.822363 00:00

- The losses are recorded per epoch as well

lrnr.recorder.plot_loss()
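
The numbers behind the plot can also be read off the recorder directly; assuming the fastai v2 layout, where lrnr.recorder.values keeps one [train_loss, valid_loss] pair per epoch, the last epoch's losses come out like this (a sketch):

final_tr, final_val = lrnr.recorder.values[-1] # losses recorded at the final epoch
print(final_tr, final_val)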

- The parameters of net_fastai have been updated too

# list(net_fastai.parameters())
  • Inspecting the list shows that net_fastai's parameters were moved to the GPU automatically and trained there.

- Plot

net_fastai.to("cpu") # the parameters were moved to the GPU automatically during training, so bring them back to the CPU before plotting
plt.plot(X,y,'.')
plt.plot(X_tr,net_fastai(X_tr).data) 
plt.plot(X_val,net_fastai(X_val).data) 

Version with dropout

- Network design (with dropout)

torch.manual_seed(1) 
net_fastai = torch.nn.Sequential(
    torch.nn.Linear(in_features=1, out_features=512),
    torch.nn.ReLU(),
    torch.nn.Dropout(0.8),
    torch.nn.Linear(in_features=512, out_features=1)) 
#optimizer 
loss_fn=torch.nn.MSELoss()

- Learner object (the object that runs the for loop for us)

lrnr= Learner(dls,net_fastai,opt_func=Adam,loss_func=loss_fn) 

- Set the number of epochs and train right away

lrnr.fit(1000)

epoch train_loss valid_loss time
0 1.585653 0.428918 00:00
1 1.552326 0.434847 00:00
2 1.568810 0.442775 00:00
3 1.543528 0.449585 00:00
4 1.562597 0.456666 00:00
... (output truncated: with dropout, valid_loss stays in roughly the 0.41-0.51 band without the steady climb seen in the no-dropout run) ...
847 1.273688 0.431738 00:00
848 1.272119 0.430599 00:00
849 1.271450 0.429238 00:00
850 1.272066 0.427790 00:00
851 1.271806 0.426300 00:00
852 1.272571 0.424746 00:00
853 1.272507 0.423409 00:00
854 1.273250 0.422637 00:00
855 1.272741 0.421491 00:00
856 1.271687 0.420503 00:00
857 1.272371 0.419858 00:00
858 1.273018 0.419335 00:00
859 1.273156 0.419283 00:00
860 1.272646 0.419418 00:00
861 1.271833 0.419791 00:00
862 1.271710 0.420650 00:00
863 1.272065 0.421545 00:00
864 1.271676 0.421997 00:00
865 1.272315 0.422431 00:00
866 1.272371 0.422683 00:00
867 1.273441 0.423463 00:00
868 1.273368 0.423953 00:00
869 1.273838 0.424227 00:00
870 1.273423 0.424003 00:00
871 1.273252 0.424145 00:00
872 1.272922 0.423755 00:00
873 1.272439 0.423490 00:00
874 1.271831 0.423762 00:00
875 1.271342 0.423780 00:00
876 1.270275 0.423963 00:00
877 1.271327 0.424505 00:00
878 1.272343 0.424360 00:00
879 1.271777 0.424242 00:00
880 1.270334 0.423722 00:00
881 1.270589 0.423348 00:00
882 1.270954 0.422993 00:00
883 1.270391 0.422629 00:00
884 1.270513 0.422359 00:00
885 1.271272 0.422419 00:00
886 1.272306 0.421880 00:00
887 1.272898 0.420643 00:00
888 1.271767 0.419487 00:00
889 1.271009 0.419040 00:00
890 1.271372 0.418528 00:00
891 1.271396 0.417933 00:00
892 1.269892 0.417698 00:00
893 1.269399 0.417749 00:00
894 1.270068 0.418024 00:00
895 1.271702 0.418193 00:00
896 1.270700 0.418013 00:00
897 1.270333 0.418358 00:00
898 1.270212 0.418953 00:00
899 1.269929 0.419519 00:00
900 1.269445 0.420907 00:00
901 1.269470 0.422438 00:00
902 1.270324 0.424143 00:00
903 1.269367 0.425467 00:00
904 1.269738 0.427331 00:00
905 1.269594 0.428506 00:00
906 1.269876 0.428487 00:00
907 1.268683 0.428035 00:00
908 1.268298 0.427372 00:00
909 1.268516 0.426644 00:00
910 1.270161 0.425629 00:00
911 1.269550 0.424801 00:00
912 1.269741 0.423837 00:00
913 1.269373 0.422717 00:00
914 1.270666 0.421385 00:00
915 1.271440 0.419991 00:00
916 1.271428 0.418825 00:00
917 1.270471 0.418084 00:00
918 1.269040 0.417153 00:00
919 1.267550 0.416482 00:00
920 1.266732 0.416238 00:00
921 1.267777 0.416306 00:00
922 1.267468 0.416335 00:00
923 1.266902 0.416258 00:00
924 1.266400 0.416091 00:00
925 1.266268 0.416181 00:00
926 1.267537 0.416459 00:00
927 1.267546 0.416593 00:00
928 1.267580 0.417033 00:00
929 1.267100 0.417179 00:00
930 1.267429 0.417122 00:00
931 1.266354 0.416993 00:00
932 1.266780 0.416593 00:00
933 1.267334 0.416380 00:00
934 1.268203 0.416275 00:00
935 1.268690 0.416225 00:00
936 1.268106 0.416299 00:00
937 1.267751 0.416181 00:00
938 1.267971 0.416177 00:00
939 1.267857 0.416043 00:00
940 1.267757 0.416107 00:00
941 1.267885 0.415961 00:00
942 1.268943 0.415743 00:00
943 1.268653 0.415579 00:00
944 1.268244 0.415384 00:00
945 1.268926 0.415173 00:00
946 1.269704 0.415183 00:00
947 1.268462 0.415245 00:00
948 1.269335 0.415451 00:00
949 1.269407 0.415598 00:00
950 1.269982 0.415618 00:00
951 1.270360 0.415674 00:00
952 1.269101 0.415805 00:00
953 1.269473 0.416018 00:00
954 1.268484 0.416352 00:00
955 1.268797 0.416684 00:00
956 1.267822 0.416871 00:00
957 1.267788 0.416918 00:00
958 1.267305 0.416894 00:00
959 1.267411 0.416783 00:00
960 1.265259 0.416556 00:00
961 1.264081 0.415936 00:00
962 1.264263 0.415769 00:00
963 1.264485 0.415452 00:00
964 1.264702 0.414969 00:00
965 1.263372 0.414560 00:00
966 1.262719 0.414424 00:00
967 1.263197 0.414142 00:00
968 1.264656 0.414061 00:00
969 1.265447 0.414135 00:00
970 1.263721 0.414354 00:00
971 1.264755 0.414610 00:00
972 1.265135 0.414962 00:00
973 1.265999 0.415164 00:00
974 1.265511 0.415425 00:00
975 1.264708 0.415568 00:00
976 1.263597 0.415669 00:00
977 1.263624 0.415896 00:00
978 1.264635 0.415878 00:00
979 1.264610 0.415971 00:00
980 1.264012 0.416127 00:00
981 1.265062 0.416171 00:00
982 1.264798 0.416271 00:00
983 1.264278 0.416447 00:00
984 1.264879 0.416630 00:00
985 1.265420 0.417135 00:00
986 1.265610 0.417360 00:00
987 1.265444 0.417375 00:00
988 1.266299 0.417734 00:00
989 1.264446 0.417796 00:00
990 1.264124 0.418030 00:00
991 1.263804 0.417795 00:00
992 1.264004 0.417678 00:00
993 1.264086 0.417720 00:00
994 1.264177 0.417654 00:00
995 1.265382 0.417681 00:00
996 1.265916 0.417580 00:00
997 1.266178 0.417524 00:00
998 1.265989 0.417092 00:00
999 1.266020 0.416904 00:00

- The losses are also recorded for every epoch

lrnr.recorder.plot_loss()
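
- The recorder also exposes the raw numbers; a minimal sketch for plotting them by hand (assuming fastai v2's Recorder, where lrnr.recorder.values holds one [train_loss, valid_loss, ...] row per epoch):

tr_loss  = [row[0] for row in lrnr.recorder.values]  # per-epoch train loss
val_loss = [row[1] for row in lrnr.recorder.values]  # per-epoch valid loss
plt.plot(tr_loss)
plt.plot(val_loss)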

- The parameters of net_fastai have been updated as well

 
  • Checking the parameter list shows that fastai moved the parameters of net_fastai to the GPU automatically and trained them there, as the check below shows.
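
- This is easy to verify with plain PyTorch, for example:

set(p.device for p in net_fastai.parameters())  # e.g. {device(type='cuda', index=0)} right after lrnr.fit(...)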

- Plot

net_fastai.to("cpu") 
plt.plot(X,y,'.')
plt.plot(X_tr,net_fastai(X_tr).data) 
plt.plot(X_val,net_fastai(X_val).data) 
[<matplotlib.lines.Line2D at 0x7fb55d157d00>]

CPU vs GPU time comparison

import time  # standard library module for wall-clock timing
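
- One caveat: CUDA kernels run asynchronously, so time.time() can be read before the GPU has actually finished. A minimal sketch of a safer pattern, using the standard torch.cuda.synchronize() call:

def elapsed(fn):
    # run fn() and return wall-clock seconds, waiting for any pending
    # GPU work to finish so the measurement is not cut short
    if torch.cuda.is_available():
        torch.cuda.synchronize()
    t1 = time.time()
    fn()
    if torch.cuda.is_available():
        torch.cuda.synchronize()
    return time.time() - t1

- The cells below use plain time.time(), so treat the GPU figures as approximate.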

CPU (512)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=512), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=512,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
0.5415534973144531

GPU (512)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=512), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=512,out_features=1)) 
net.to("cuda:0")
X=X.to("cuda:0")
y=y.to("cuda:0")
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
2.0947225093841553

- ?? The CPU is faster!! With only 512 hidden nodes the workload is tiny, so the overhead of moving data to the GPU and launching kernels outweighs any gain from parallel computation; as the width grows, the balance flips.

CPU (20480)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=20480), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=20480,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
3.8015544414520264

GPU (20480)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=20480), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=20480,out_features=1)) 
net.to("cuda:0")
X=X.to("cuda:0")
y=y.to("cuda:0")
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
2.3244359493255615

- With 20480 nodes the GPU (2.32s) now edges out the CPU (3.80s): the computation is finally large enough to amortize the transfer and launch overhead.

CPU (204800)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=204800), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=204800,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
62.91938018798828

GPU (204800)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=204800), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=204800,out_features=1)) 
net.to("cuda:0")
X=X.to("cuda:0")
y=y.to("cuda:0")
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
2.087972640991211

- With 204800 nodes the GPU finishes in about 2.1 seconds versus 62.9 seconds on the CPU, roughly a 30x speed-up.
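
- To see the crossover in one place, a minimal sketch (not from the lecture; it reuses X, y, and loss_fn from the cells above and sweeps the hidden width):

def bench(width, device):
    # time 1000 epochs of the same training loop for a given hidden width
    torch.manual_seed(1)
    net = torch.nn.Sequential(
        torch.nn.Linear(in_features=1, out_features=width),
        torch.nn.ReLU(),
        torch.nn.Linear(in_features=width, out_features=1)).to(device)
    Xd, yd = X.to(device), y.to(device)
    optimizer = torch.optim.Adam(net.parameters())
    t1 = time.time()
    for epoc in range(1000):
        loss = loss_fn(net(Xd), yd)
        loss.backward()
        optimizer.step()
        net.zero_grad()
    if device != "cpu":
        torch.cuda.synchronize()  # wait for pending GPU work before reading the clock
    return time.time() - t1

for width in [512, 20480, 204800]:
    print(width, bench(width, "cpu"), bench(width, "cuda:0"))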

Homework

- Run the code below on the machine you are currently working on, print the elapsed time, and submit a screenshot

CPU (512)

torch.manual_seed(5) 
X=torch.linspace(0,1,100).reshape(100,1)
y=torch.randn(100).reshape(100,1)*0.01
torch.manual_seed(1) # same initial weights
net=torch.nn.Sequential(
    torch.nn.Linear(in_features=1,out_features=512), 
    torch.nn.ReLU(),
    torch.nn.Linear(in_features=512,out_features=1)) 
optimizer= torch.optim.Adam(net.parameters())
loss_fn= torch.nn.MSELoss()
t1=time.time()
for epoc in range(1000): 
    ## 1 
    yhat=net(X) 
    ## 2 
    loss=loss_fn(yhat,y) 
    ## 3 
    loss.backward()
    ## 4 
    optimizer.step()
    net.zero_grad() 
t2=time.time()    
t2-t1
0.5490124225616455
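
- It may help to print the environment next to the timing so the screenshot is self-describing; a minimal sketch using standard library and torch calls:

import platform
print(platform.processor())       # CPU identifier (may be empty on some OSes)
print(torch.__version__)          # PyTorch version
print(torch.cuda.is_available())  # whether a GPU is visible to torch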