Semifinal_Other Outlier Detection

GODE
Author

SEOYEON CHOI

Published

August 5, 2023

| Method | random.seed | contamination | Iteration |
|---|---|---|---|
| GODE | - | O | X |
| LOF | - | O | X |
| kNN | - | O | X |
| CBLOF | O (random_state) | O | X |
| OCSVM | - | O (nu) | O (max_iter) |
| MCD | O (random_state) | O | X |
| Feature Bagging | O (random_state) | O | X |
| ABOD | - | O | X |
| Isolation Forest | O | X (not possible) | X |
| HBOS | - | O | X |
| SOS | - | O | X |
| SO-GAAL | - | O | O (stop_epochs, default=20) |
| MO-GAAL | - | O | O (stop_epochs, default=20) |
| LSCP | O (random_state) | O | X |

(O: can be / was set, X: cannot be set, -: no dedicated argument; for those detectors np.random.seed is set globally in the runs below.)
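The comparisons below follow two seeding patterns depending on whether a detector exposes its own random_state argument. A minimal sketch of both patterns, using the same values (seed 77, contamination 0.05) as the runs below:

import numpy as np
from sklearn.neighbors import LocalOutlierFactor   # no random_state argument
from pyod.models.mcd import MCD                    # accepts random_state

# pattern 1 (detectors without a random_state argument): fix NumPy's global seed
np.random.seed(77)
lof = LocalOutlierFactor(contamination=0.05)

# pattern 2 (detectors with a random_state argument): pass the seed explicitly
mcd = MCD(contamination=0.05, random_state=77)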
Simple Linear (paper): Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.926 0.961 0.961 0.961
kNN (Ramaswamy et al., 2000) 0.950 1.000 0.947 0.973
CBLOF (He et al., 2003) 0.972 0.985 0.985 0.985
OCSVM (Schölkopf et al., 2001) 0.935 0.991 0.940 0.965
MCD (Hardin and Rocke, 2004) 0.998 0.999 0.999 0.999
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.993 0.993 0.993
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.868 0.999 0.862 0.925
HBOS (Goldstein and Dengel, 2012) 0.960 0.978 0.980 0.979
SOS (Janssens et al., 2012) 0.916 0.956 0.956 0.956
SO-GAAL (Liu et al., 2019) 0.936 0.966 0.966 0.966
MO-GAAL (Liu et al., 2019) 0.940 0.965 0.972 0.969
LSCP (Zhao et al., 2019) 0.988 0.994 0.994 0.994
Simple Linear (paper): Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.990 0.995 0.995 0.995
kNN (Ramaswamy et al., 2000) 0.994 0.997 0.997 0.997
CBLOF (He et al., 2003) 0.972 0.985 0.985 0.985
OCSVM (Schölkopf et al., 2001) 0.965 0.981 0.982 0.982
MCD (Hardin and Rocke, 2004) 0.998 0.999 0.999 0.999
Feature Bagging (Lazarevic and Kumar, 2005) 0.988 0.994 0.994 0.994
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.895 1.000 0.889 0.942
HBOS (Goldstein and Dengel, 2012) 0.960 0.978 0.980 0.979
SOS (Janssens et al., 2012) 0.916 0.956 0.956 0.956
SO-GAAL (Liu et al., 2019) 0.934 0.965 0.965 0.965
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.984 0.992 0.992 0.992
linear_rst
Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.990 0.995 0.995 0.995
kNN (Ramaswamy et al., 2000) 0.994 0.997 0.997 0.997
OCSVM (Schölkopf et al., 2001) 0.965 0.981 0.982 0.982
MCD (Hardin and Rocke, 2004) 0.998 0.999 0.999 0.999
Feature Bagging (Lazarevic and Kumar, 2005) 0.988 0.994 0.994 0.994
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.895 1.000 0.889 0.942
HBOS (Goldstein and Dengel, 2012) 0.960 0.978 0.980 0.979
SOS (Janssens et al., 2012) 0.916 0.956 0.956 0.956
SO-GAAL (Liu et al., 2019) 0.934 0.965 0.965 0.965
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.984 0.992 0.992 0.992

The outlier signal is drawn from \(U^\star\), a mixture of the uniform distributions \(U(5,7)\) and \(U(-7,-5)\).

Orbit (paper): Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.954 0.976 0.976 0.976
kNN (Ramaswamy et al., 2000) 0.948 0.999 0.946 0.972
CBLOF (He et al., 2003) 0.918 0.957 0.957 0.957
OCSVM (Schölkopf et al., 2001) 0.908 0.977 0.925 0.950
MCD (Hardin and Rocke, 2004) 0.916 0.956 0.956 0.956
Feature Bagging (Lazarevic and Kumar, 2005) 0.942 0.969 0.969 0.969
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.443 0.992 0.417 0.587
HBOS (Goldstein and Dengel, 2012) 0.935 0.960 0.973 0.966
SOS (Janssens et al., 2012) 0.950 0.974 0.974 0.974
SO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.988 0.994 0.994 0.994
Orbit (paper): Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.950 0.974 0.974 0.974
kNN (Ramaswamy et al., 2000) 0.990 0.995 0.995 0.995
CBLOF (He et al., 2003) 0.918 0.957 0.957 0.957
OCSVM (Schölkopf et al., 2001) 0.951 0.975 0.974 0.974
MCD (Hardin and Rocke, 2004) 0.916 0.956 0.956 0.956
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.978 0.978 0.978
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.348 0.990 0.317 0.480
HBOS (Goldstein and Dengel, 2012) 0.935 0.960 0.973 0.966
SOS (Janssens et al., 2012) 0.950 0.974 0.974 0.974
SO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.988 0.994 0.994 0.994
orbit_rst.round(3)
Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.950 0.974 0.974 0.974
kNN (Ramaswamy et al., 2000) 0.990 0.995 0.995 0.995
OCSVM (Schölkopf et al., 2001) 0.951 0.975 0.974 0.974
MCD (Hardin and Rocke, 2004) 0.916 0.956 0.956 0.956
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.978 0.978 0.978
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.348 0.990 0.317 0.480
HBOS (Goldstein and Dengel, 2012) 0.935 0.960 0.973 0.966
SOS (Janssens et al., 2012) 0.950 0.974 0.974 0.974
SO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.988 0.994 0.994 0.994

The outlier signal is drawn from \(U^\star\), a mixture of the uniform distributions \(U(3,7)\) and \(U(-7,-3)\).

Stanford Bunny (paper): Accuracy Precision Recall F1
GODE 0.998 0.995 0.993 0.994
LOF (Breunig et al., 2000) 0.913 0.955 0.953 0.954
kNN (Ramaswamy et al., 2000) 0.942 0.997 0.942 0.969
CBLOF (He et al., 2003) 0.978 0.989 0.987 0.988
OCSVM (Schölkopf et al., 2001) 0.935 0.992 0.939 0.965
MCD (Hardin and Rocke, 2004) 0.982 0.992 0.989 0.990
Feature Bagging (Lazarevic and Kumar, 2005) 0.954 0.977 0.974 0.976
ABOD (Kriegel et al., 2008) 0.979 0.990 0.988 0.989
Isolation Forest (Liu et al., 2008) 0.827 0.995 0.822 0.900
HBOS (Goldstein and Dengel, 2012) 0.919 0.958 0.956 0.957
SOS (Janssens et al., 2012) 0.912 0.955 0.953 0.954
SO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
MO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
LSCP (Zhao et al., 2019) 0.978 0.990 0.987 0.989
Stanford Bunny (paper): Accuracy Precision Recall F1
GODE 0.998 0.995 0.993 0.994
LOF (Breunig et al., 2000) 0.956 0.978 0.976 0.977
kNN (Ramaswamy et al., 2000) 0.982 0.992 0.989 0.990
CBLOF (He et al., 2003) 0.978 0.989 0.987 0.988
OCSVM (Schölkopf et al., 2001) 0.959 0.979 0.978 0.979
MCD (Hardin and Rocke, 2004) 0.982 0.992 0.990 0.991
Feature Bagging (Lazarevic and Kumar, 2005) 0.954 0.977 0.975 0.976
ABOD (Kriegel et al., 2008) 0.979 0.990 0.988 0.989
Isolation Forest (Liu et al., 2008) 0.791 0.997 0.783 0.877
HBOS (Goldstein and Dengel, 2012) 0.919 0.958 0.956 0.957
SOS (Janssens et al., 2012) 0.912 0.955 0.953 0.954
SO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
MO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
LSCP (Zhao et al., 2019) 0.978 0.990 0.987 0.989
bunny_rst
Accuracy Precision Recall F1
GODE 0.988 0.995 0.993 0.994
LOF (Breunig et al., 2000) 0.956 0.978 0.976 0.977
kNN (Ramaswamy et al., 2000) 0.982 0.992 0.989 0.990
OCSVM (Schölkopf et al., 2001) 0.959 0.979 0.978 0.979
MCD (Hardin and Rocke, 2004) 0.982 0.992 0.990 0.991
Feature Bagging (Lazarevic and Kumar, 2005) 0.954 0.977 0.975 0.976
ABOD (Kriegel et al., 2008) 0.979 0.990 0.988 0.989
Isolation Forest (Liu et al., 2008) 0.791 0.997 0.783 0.877
HBOS (Goldstein and Dengel, 2012) 0.919 0.958 0.956 0.957
SOS (Janssens et al., 2012) 0.912 0.955 0.953 0.954
SO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
MO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
LSCP (Zhao et al., 2019) 0.978 0.990 0.987 0.989

Import

import numpy as np
import matplotlib.pyplot as plt
import matplotlib
from sklearn.svm import OneClassSVM
from sklearn.linear_model import SGDOneClassSVM
from sklearn.pipeline import make_pipeline

import pandas as pd
from sklearn.neighbors import LocalOutlierFactor

import rpy2
import rpy2.robjects as ro 
from rpy2.robjects.vectors import FloatVector 
from rpy2.robjects.packages import importr

from sklearn.datasets import fetch_kddcup99, fetch_covtype, fetch_openml
from sklearn.preprocessing import LabelBinarizer

import tqdm

from pygsp import graphs, filters, plotting, utils

from sklearn.metrics import confusion_matrix
from sklearn.metrics import precision_score, recall_score, f1_score, accuracy_score

import plotly.graph_objects as go
from IPython.display import HTML

import plotly.express as px

from sklearn.covariance import EmpiricalCovariance, MinCovDet

from alibi_detect.od import IForest
# from pyod.models.iforest import IForest

from pyod.models.abod import ABOD
from pyod.models.cblof import CBLOF

from sklearn import svm

from pyod.models.lscp import LSCP
from pyod.models.hbos import HBOS

from pyod.models.so_gaal import SO_GAAL
from pyod.models.mcd import MCD
from pyod.models.mo_gaal import MO_GAAL
from pyod.models.knn import KNN
from pyod.models.lof import LOF
from pyod.models.ocsvm import OCSVM

from pyod.models.feature_bagging import FeatureBagging
from pyod.models.sos import SOS

Class Code

tab_linear = pd.DataFrame(columns=["Accuracy","Precision","Recall","F1"])
tab_orbit = pd.DataFrame(columns=["Accuracy","Precision","Recall","F1"])
tab_bunny = pd.DataFrame(columns=["Accuracy","Precision","Recall","F1"])
class Conf_matrx:
    def __init__(self,original,compare,tab):
        self.original = original
        self.compare = compare
        self.tab = tab
    def conf(self,name):
        self.conf_matrix = confusion_matrix(self.original, self.compare)
        
        fig, ax = plt.subplots(figsize=(5, 5))
        ax.matshow(self.conf_matrix, cmap=plt.cm.Oranges, alpha=0.3)
        for i in range(self.conf_matrix.shape[0]):
            for j in range(self.conf_matrix.shape[1]):
                ax.text(x=j, y=i,s=self.conf_matrix[i, j], va='center', ha='center', size='xx-large')
        plt.xlabel('Predictions', fontsize=18)
        plt.ylabel('Actuals', fontsize=18)
        plt.title('Confusion Matrix', fontsize=18)
        plt.show()
        
        self.acc = accuracy_score(self.original, self.compare)
        self.pre = precision_score(self.original, self.compare)
        self.rec = recall_score(self.original, self.compare)
        self.f1 = f1_score(self.original, self.compare)
        
        print('Accuracy: %.3f' % self.acc)
        print('Precision: %.3f' % self.pre)
        print('Recall: %.3f' % self.rec)
        print('F1 Score: %.3f' % self.f1)
        
        self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
class Linear:
    def __init__(self,df):
        self.df = df
        self.y = df.y.to_numpy()
        #self.y1 = df.y1.to_numpy()
        self.x = df.x.to_numpy()
        self.n = len(self.y)
        self.W = w
    def _eigen(self):
        d= self.W.sum(axis=1)
        D= np.diag(d)
        self.L = np.diag(1/np.sqrt(d)) @ (D-self.W) @ np.diag(1/np.sqrt(d))
        self.lamb, self.Psi = np.linalg.eigh(self.L)
        self.Lamb = np.diag(self.lamb)      
    def fit(self,sd=20): # fit with ebayesthresh
        self._eigen()
        self.ybar = self.Psi.T @ self.y # fbar := graph fourier transform of f
        self.power = self.ybar**2 
        ebayesthresh = importr('EbayesThresh').ebayesthresh
        self.power_threshed=np.array(ebayesthresh(FloatVector(self.ybar**2),sd=sd))
        self.ybar_threshed = np.where(self.power_threshed>0,self.ybar,0)
        self.yhat = self.Psi@self.ybar_threshed
        self.df = self.df.assign(yHat = self.yhat)
        self.df = self.df.assign(Residual = self.df.y- self.df.yHat)
class Orbit:
    def __init__(self,df):
        self.df = df 
        self.f = df.f.to_numpy()
        self.x = df.x.to_numpy()
        self.y = df.y.to_numpy()
        self.n = len(self.f)
        self.theta= None
    def get_distance(self):
        self.D = np.zeros([self.n,self.n])
        locations = np.stack([self.x, self.y],axis=1)
        for i in tqdm.tqdm(range(self.n)):
            for j in range(i,self.n):
                self.D[i,j]=np.linalg.norm(locations[i]-locations[j])
        self.D = self.D + self.D.T
    def get_weightmatrix(self,theta=1,beta=0.5,kappa=4000):
        self.theta = theta
        dist = np.where(self.D < kappa,self.D,0)
        self.W = np.exp(-(dist/self.theta)**2)
    def _eigen(self):
        d= self.W.sum(axis=1)
        D= np.diag(d)
        self.L = np.diag(1/np.sqrt(d)) @ (D-self.W) @ np.diag(1/np.sqrt(d))
        self.lamb, self.Psi = np.linalg.eigh(self.L)
        self.Lamb = np.diag(self.lamb)       
    def fit(self,sd=5,ref=20): # fit with ebayesthresh
        self._eigen()
        self.fbar = self.Psi.T @ self.f # fbar := graph fourier transform of f
        self.power = self.fbar**2 
        ebayesthresh = importr('EbayesThresh').ebayesthresh
        self.power_threshed=np.array(ebayesthresh(FloatVector(self.fbar**2),sd=sd))
        self.fbar_threshed = np.where(self.power_threshed>0,self.fbar,0)
        self.fhat = self.Psi@self.fbar_threshed
        self.df = self.df.assign(fHat = self.fhat)
        self.df = self.df.assign(Residual = self.df.f- self.df.fHat)
        self.bottom = np.zeros_like(self.f)
        self.width=0.05
        self.depth=0.05
class BUNNY:
    def __init__(self,df):
        self.df = df 
        self.f = df.f.to_numpy()
        self.z = df.z.to_numpy()
        self.x = df.x.to_numpy()
        self.y = df.y.to_numpy()
        self.noise = df.noise.to_numpy()
        self.fnoise = self.f + self.noise
        self.W = _W
        self.n = len(self.f)
        self.theta= None
    def _eigen(self):
        d= self.W.sum(axis=1)
        D= np.diag(d)
        self.L = np.diag(1/np.sqrt(d)) @ (D-self.W) @ np.diag(1/np.sqrt(d))
        self.lamb, self.Psi = np.linalg.eigh(self.L)
        self.Lamb = np.diag(self.lamb)       
    def fit(self,sd=5,ref=6): # fit with ebayesthresh
        self._eigen()
        self.fbar = self.Psi.T @ self.fnoise # fbar := graph fourier transform of f
        self.power = self.fbar**2 
        ebayesthresh = importr('EbayesThresh').ebayesthresh
        self.power_threshed=np.array(ebayesthresh(FloatVector(self.fbar**2),sd=sd))
        self.fbar_threshed = np.where(self.power_threshed>0,self.fbar,0)
        self.fhat = self.Psi@self.fbar_threshed
        self.df = self.df.assign(fnoise = self.fnoise)
        self.df = self.df.assign(fHat = self.fhat)
        self.df = self.df.assign(Residual = self.df.f + self.df.noise - self.df.fHat)
        self.bottom = np.zeros_like(self.f)
        self.width=0.05
        self.depth=0.05
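All three classes follow the same spectral recipe: build a weight matrix W (consecutive neighbours for Linear, a Gaussian kernel \(\exp(-(\text{dist}/\theta)^2)\) on pairwise distances below a cutoff \(\kappa\) for Orbit, a precomputed _W for BUNNY), form the symmetric normalized Laplacian, take the graph Fourier transform of the signal, shrink small spectral coefficients with EbayesThresh, transform back, and flag points with large residuals. A self-contained toy sketch of that recipe, with a crude hard threshold standing in for EbayesThresh so it runs without R (the threshold value 0.5 is illustrative only):

import numpy as np

rng = np.random.default_rng(0)
n = 200
f = np.sin(np.linspace(0, 4 * np.pi, n)) + 0.1 * rng.standard_normal(n)
f[50] += 5.0                                   # plant one obvious outlier

# path-graph weight matrix: consecutive observations are connected (as in Linear)
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1

# symmetric normalized graph Laplacian and its eigendecomposition
d = W.sum(axis=1)
L = np.diag(1 / np.sqrt(d)) @ (np.diag(d) - W) @ np.diag(1 / np.sqrt(d))
lamb, Psi = np.linalg.eigh(L)

fbar = Psi.T @ f                               # graph Fourier transform of f
fbar_threshed = np.where(fbar**2 > 0.5, fbar, 0.0)  # crude stand-in for EbayesThresh
fhat = Psi @ fbar_threshed                     # denoised reconstruction
residual = f - fhat

print(np.argsort(residual**2)[-3:])            # the planted index 50 should rank among the largest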

Linear EbayesThresh

%load_ext rpy2.ipython
The rpy2.ipython extension is already loaded. To reload it, use:
  %reload_ext rpy2.ipython
%%R
library(EbayesThresh)
set.seed(1)
epsilon = rnorm(1000)
# signal_1 = sample(c(runif(25,-2,-1.5), runif(25,1.5,2), rep(0,950)))
signal_1 = sample(c(runif(25,-7,-5), runif(25,5,7), rep(0,950)))
index_of_trueoutlier_1 = which(signal_1!=0)
index_of_trueoutlier_1
x_1=signal_1+epsilon
%R -o x_1
%R -o index_of_trueoutlier_1
%R -o signal_1
ebayesthresh = importr('EbayesThresh').ebayesthresh
outlier_true_index_1 = index_of_trueoutlier_1
outlier_true_value_1 = x_1[index_of_trueoutlier_1]
outlier_true_one_1 = signal_1.copy()
outlier_true_one_1 = list(map(lambda x: -1 if x!=0 else 1,outlier_true_one_1))
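For readers without an R/rpy2 setup, roughly the same signal can be generated in plain NumPy (a sketch only; the results in this post come from the R cell above with set.seed(1), so exact indices will differ):

import numpy as np

rng = np.random.default_rng(1)
epsilon = rng.standard_normal(1000)

# 50 outlier signals from U(-7,-5) and U(5,7), 950 zeros, shuffled into random positions
signal = np.concatenate([rng.uniform(-7, -5, 25), rng.uniform(5, 7, 25), np.zeros(950)])
rng.shuffle(signal)

index_of_trueoutlier = np.flatnonzero(signal)
x = signal + epsilon
outlier_true_one = np.where(signal != 0, -1, 1).tolist()   # -1 = outlier, 1 = inlier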

Linear

_x_1 = np.linspace(0,2,1000)
_y1_1 = 5*_x_1
_y_1 = _y1_1 + x_1 # x_1 = signal_1 + epsilon (the noise plus the planted outlier signal)
_df=pd.DataFrame({'x':_x_1, 'y':_y_1})
X = np.array(_df)
# _df.to_csv('simple_linear_df.csv')
# pd.DataFrame(outlier_true_one_1).to_csv('simple_linear_outlier.csv')

GODE

w=np.zeros((1000,1000))
for i in range(1000):
    for j in range(1000):
        if i==j :
            w[i,j] = 0
        elif np.abs(i-j) <= 1 : 
            w[i,j] = 1
_Linear = Linear(_df)
_Linear.fit(sd=20)
outlier_simul_one = (_Linear.df['Residual']**2).tolist()
  • The cutoff x > 9.8 is chosen so that roughly 5% of the points are flagged as outliers.
outlier_simul_one = list(map(lambda x: -1 if x > 9.8 else 1,outlier_simul_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_simul_one,tab_linear)
outlier_simul_one.count(1)
950
outlier_simul_one.count(-1)
50
_conf.conf("GODE")

Accuracy: 0.998
Precision: 0.999
Recall: 0.999
F1 Score: 0.999
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
one = _conf.tab
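The 9.8 cutoff above is tuned by hand to flag about 5% of the points. An alternative sketch takes the cutoff from the empirical 95% quantile of the squared residuals instead, which enforces the 5% flag rate by construction (hypothetical variable names; this is not how the post's numbers were produced):

import numpy as np

squared_resid = (_Linear.df['Residual'] ** 2).to_numpy()
cutoff = np.quantile(squared_resid, 0.95)                 # top 5% of squared residuals
outlier_simul_quantile = np.where(squared_resid > cutoff, -1, 1).tolist()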

LOF

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

  • novelty
    • True: fit on training data, then flag outliers among new, unseen samples (novelty detection; see the sketch below)
    • False (default): fit on the given data itself and flag the outliers within it
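A minimal sketch of the novelty=True mode, with made-up training data, to contrast it with the fit_predict usage below:

import numpy as np
from sklearn.neighbors import LocalOutlierFactor

X_train = np.random.RandomState(0).normal(size=(500, 2))   # made-up inlier cloud
X_new = np.array([[0.1, -0.2], [8.0, 8.0]])                 # second point lies far away

lof_novelty = LocalOutlierFactor(novelty=True, contamination=0.05)
lof_novelty.fit(X_train)
print(lof_novelty.predict(X_new))                           # 1 = inlier, -1 = outlier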
np.random.seed(77)
clf = LocalOutlierFactor(contamination=0.05)
_conf = Conf_matrx(outlier_true_one_1,clf.fit_predict(X),tab_linear)
_conf.conf("LOF (Breunig et al., 2000)")

Accuracy: 0.990
Precision: 0.995
Recall: 0.995
F1 Score: 0.995
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
two = one.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  two = one.append(_conf.tab)
two
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737

KNN

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

from pyod.models.knn import KNN
np.random.seed(77)
clf = KNN(contamination=0.05)
clf.fit(_df[['x', 'y']])
_df['knn_Clf'] = clf.labels_
outlier_KNN_one = list(clf.labels_)
outlier_KNN_one = list(map(lambda x: 1 if x==0  else -1,outlier_KNN_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_KNN_one,tab_linear)
_conf.conf("kNN (Ramaswamy et al., 2000)")

Accuracy: 0.994
Precision: 0.997
Recall: 0.997
F1 Score: 0.997
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
three = two.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  three = two.append(_conf.tab)
three
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842

CBLOF (error)

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

import pandas as pd
import matplotlib.pyplot as plt
_df =  pd.read_csv('simple_linear_df.csv')
outlier_true_one_1 = pd.read_csv('simple_linear_outlier.csv').iloc[:,1].tolist()
clf = CBLOF(contamination=0.05,check_estimator=False, random_state=77)
clf.fit(_df[['x', 'y']])
_df['CBLOF_Clf'] = clf.labels_
/home/csy/anaconda3/envs/pygsp/lib/python3.10/site-packages/sklearn/cluster/_kmeans.py:1412: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
  super()._check_params_vs_input(X, default_n_init=10)
clf = CBLOF(contamination=0.05,check_estimator=False, random_state=77)
clf.fit(_df[['x', 'y']])
_df['CBLOF_Clf'] = clf.labels_

outlier_CBLOF_one = list(clf.labels_)

outlier_CBLOF_one = list(map(lambda x: 1 if x==0  else -1,outlier_CBLOF_one))

_conf = Conf_matrx(outlier_true_one_1,outlier_CBLOF_one,tab_linear)
/home/csy/anaconda3/envs/pygsp/lib/python3.10/site-packages/sklearn/cluster/_kmeans.py:1412: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
  super()._check_params_vs_input(X, default_n_init=10)
_conf.conf("CBLOF (He et al., 2003)")

Accuracy: 0.972
Precision: 0.985
Recall: 0.985
F1 Score: 0.985
AttributeError: 'DataFrame' object has no attribute 'append'
# four = three.append(_conf.tab)
  • Accuracy: 0.972
  • Precision: 0.985
  • Recall: 0.985
  • F1 Score: 0.985
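The AttributeError comes from DataFrame.append having been removed in newer pandas (the FutureWarnings in earlier cells point to pandas.concat as the replacement). A sketch of the concat-based equivalent of the commented-out line, assuming three and the fitted _conf are still in scope:

import pandas as pd

# pandas >= 2.0 removed DataFrame.append; build the CBLOF row and concatenate instead
cblof_row = pd.DataFrame(
    {"Accuracy": [_conf.acc], "Precision": [_conf.pre], "Recall": [_conf.rec], "F1": [_conf.f1]},
    index=["CBLOF (He et al., 2003)"],
)
four = pd.concat([three, cblof_row])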

OCSVM

1. Was random.seed set? O

2. Was contamination set? O, via nu

3. Can the number of iterations be set? O, via max_iter (see the sketch below)
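A one-line sketch with both knobs set explicitly; the max_iter value is illustrative (the run below keeps scikit-learn's default of no iteration limit):

from sklearn import svm

# nu plays the role of the contamination fraction; max_iter caps the solver iterations
clf_capped = svm.OneClassSVM(nu=0.05, max_iter=1000)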

np.random.seed(77)
clf = svm.OneClassSVM(nu=0.05)
clf.fit(X)
OneClassSVM(nu=0.05)
outlier_OSVM_one = list(clf.predict(X))
_conf = Conf_matrx(outlier_true_one_1,outlier_OSVM_one,tab_linear)
_conf.conf("OCSVM (Sch ̈olkopf et al., 2001)")

Accuracy: 0.965
Precision: 0.981
Recall: 0.982
F1 Score: 0.982
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
five = three.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  five = three.append(_conf.tab)
five
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589

MCD\(\star\)

1. Was random.seed set? O, via random_state

2. Was contamination set? O

3. Can the number of iterations be set? X

clf = MCD(contamination=0.05, random_state = 77)
clf.fit(_df[['x', 'y']])
_df['MCD_clf'] = clf.labels_
outlier_MCD_one = list(clf.labels_)
outlier_MCD_one = list(map(lambda x: 1 if x==0  else -1,outlier_MCD_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_MCD_one,tab_linear)
_conf.conf("MCD (Hardin and Rocke, 2004)")

Accuracy: 0.998
Precision: 0.999
Recall: 0.999
F1 Score: 0.999
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
six = five.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  six = five.append(_conf.tab)
six
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947

Feature Bagging\(\star\)

1. Was random.seed set? O, via random_state

2. Was contamination set? O

3. Can the number of iterations be set? X

clf = FeatureBagging(contamination=0.05, random_state=77)
clf.fit(_df[['x', 'y']])
_df['FeatureBagging_clf'] = clf.labels_
outlier_FeatureBagging_one = list(clf.labels_)
outlier_FeatureBagging_one = list(map(lambda x: 1 if x==0  else -1,outlier_FeatureBagging_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_FeatureBagging_one,tab_linear)
_conf.conf("Feature Bagging (Lazarevic and Kumar, 2005)")

Accuracy: 0.986
Precision: 0.993
Recall: 0.993
F1 Score: 0.993
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
seven = six.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  seven = six.append(_conf.tab)
seven
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.992632 0.992632 0.992632

ABOD\(\star\)

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

np.random.seed(77)
clf = ABOD(contamination=0.05)
clf.fit(_df[['x', 'y']])
_df['ABOD_Clf'] = clf.labels_
outlier_ABOD_one = list(clf.labels_)
outlier_ABOD_one = list(map(lambda x: 1 if x==0  else -1,outlier_ABOD_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_ABOD_one,tab_linear)
_conf.conf("ABOD (Kriegel et al., 2008)")

Accuracy: 0.988
Precision: 0.994
Recall: 0.994
F1 Score: 0.994
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
eight = seven.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  eight = seven.append(_conf.tab)
eight
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.992632 0.992632 0.992632
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684

IForest\(\star\)

1. Was random.seed set? O

2. Was contamination set? X, not possible

3. Can the number of iterations be set? X

-> threshold=0. is an option that must be set explicitly here.

np.random.seed(77)
od = IForest(
    threshold=0.
)
od.fit(_df[['x', 'y']])
preds = od.predict(
    _df[['x', 'y']],
    return_instance_score=True
)
_df['IF_alibi'] = preds['data']['is_outlier']
outlier_alibi_one = _df['IF_alibi']
outlier_alibi_one = list(map(lambda x: 1 if x==0  else -1,outlier_alibi_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_alibi_one,tab_linear)
_conf.conf("Isolation Forest (Liu et al., 2008)")

Accuracy: 0.893
Precision: 0.999
Recall: 0.888
F1 Score: 0.940
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
nine = eight.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  nine = eight.append(_conf.tab)
nine
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.992632 0.992632 0.992632
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.893 0.998817 0.888421 0.940390
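The alibi-detect IForest used here exposes no contamination argument, which is why the score threshold had to be fixed at 0. The pyod Isolation Forest (the import commented out in the Import section) does accept contamination and random_state; a hedged sketch of that alternative:

from pyod.models.iforest import IForest as PyodIForest

# pyod's Isolation Forest with the same knobs as the other detectors in this comparison
clf_if = PyodIForest(contamination=0.05, random_state=77)
clf_if.fit(_df[['x', 'y']])
outlier_if_one = [1 if lab == 0 else -1 for lab in clf_if.labels_]  # 0 -> inlier (1), 1 -> outlier (-1)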

HBOS\(\star\)

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

np.random.seed(77)
clf = HBOS(contamination=0.05)
clf.fit(_df[['x', 'y']])
_df['HBOS_clf'] = clf.labels_
outlier_HBOS_one = list(clf.labels_)
outlier_HBOS_one = list(map(lambda x: 1 if x==0  else -1,outlier_HBOS_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_HBOS_one,tab_linear)
_conf.conf("HBOS (Goldstein and Dengel, 2012)")

Accuracy: 0.960
Precision: 0.978
Recall: 0.980
F1 Score: 0.979
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
ten = nine.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  ten = nine.append(_conf.tab)
ten
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.992632 0.992632 0.992632
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.893 0.998817 0.888421 0.940390
HBOS (Goldstein and Dengel, 2012) 0.960 0.977941 0.980000 0.978970

SOS\(\star\)

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

np.random.seed(77)
clf = SOS(contamination=0.05)
clf.fit(_df[['x', 'y']])
_df['SOS_clf'] = clf.labels_
outlier_SOS_one = list(clf.labels_)
outlier_SOS_one = list(map(lambda x: 1 if x==0  else -1,outlier_SOS_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_SOS_one,tab_linear)
_conf.conf("SOS (Janssens et al., 2012)")

Accuracy: 0.916
Precision: 0.956
Recall: 0.956
F1 Score: 0.956
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
eleven = ten.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  eleven = ten.append(_conf.tab)
eleven
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.992632 0.992632 0.992632
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.893 0.998817 0.888421 0.940390
HBOS (Goldstein and Dengel, 2012) 0.960 0.977941 0.980000 0.978970
SOS (Janssens et al., 2012) 0.916 0.955789 0.955789 0.955789

SO_GAAL

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? O, via stop_epochs (default 20); see the sketch below
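As a hedged sketch of the iteration knob mentioned in item 3 (stop_epochs is a pyod SO_GAAL argument; with the default of 20, the log below counts epochs out of 60):

from pyod.models.so_gaal import SO_GAAL

# illustrative only: a shorter run by lowering stop_epochs from its default of 20
clf_short = SO_GAAL(stop_epochs=5, contamination=0.05)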

np.random.seed(77)
clf = SO_GAAL(contamination=0.05)
clf.fit(_df[['x', 'y']])
_df['SO_GAAL_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/keras/optimizers/legacy/gradient_descent.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super().__init__(name, **kwargs)
(Keras training log: Epoch 1 of 60 through Epoch 60 of 60; from epoch 22 on, the reported per-step loss rises from about 1.11 to 1.86.)
outlier_SO_GAAL_one = list(clf.labels_)
outlier_SO_GAAL_one = list(map(lambda x: 1 if x==0  else -1,outlier_SO_GAAL_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_SO_GAAL_one,tab_linear)
_conf.conf("SO-GAAL (Liu et al., 2019)")

Accuracy: 0.936
Precision: 0.965
Recall: 0.967
F1 Score: 0.966
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
twelve = eleven.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  twelve = eleven.append(_conf.tab)
twelve
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.986 0.992632 0.992632 0.992632
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.893 0.998817 0.888421 0.940390
HBOS (Goldstein and Dengel, 2012) 0.960 0.977941 0.980000 0.978970
SOS (Janssens et al., 2012) 0.916 0.955789 0.955789 0.955789
SO-GAAL (Liu et al., 2019) 0.936 0.965336 0.967368 0.966351

MO_GAAL\(\star\)

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? O, via stop_epochs (default 20)

np.random.seed(77)
clf = MO_GAAL(contamination=0.05)
clf.fit(_df[['x', 'y']])
_df['MO_GAAL_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/keras/optimizers/legacy/gradient_descent.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super().__init__(name, **kwargs)
(Keras training log: Epoch 1 of 60 onward, with per-step losses printed from epoch 21; the captured output breaks off during epoch 30.)
16/16 [==============================] - 0s 2ms/step - loss: 1.1280
16/16 [==============================] - 0s 2ms/step - loss: 1.3659
16/16 [==============================] - 0s 3ms/step - loss: 1.4504
16/16 [==============================] - 0s 6ms/step - loss: 1.4822
16/16 [==============================] - 0s 3ms/step - loss: 1.4919
16/16 [==============================] - 0s 3ms/step - loss: 1.4948
16/16 [==============================] - 0s 3ms/step - loss: 1.4954
16/16 [==============================] - 0s 4ms/step - loss: 1.4953
Epoch 31 of 60

Testing for epoch 31 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.3451
16/16 [==============================] - 0s 1ms/step - loss: 0.8179
16/16 [==============================] - 0s 2ms/step - loss: 1.1300
16/16 [==============================] - 0s 3ms/step - loss: 1.3696
16/16 [==============================] - 0s 4ms/step - loss: 1.4539
16/16 [==============================] - 0s 4ms/step - loss: 1.4853
16/16 [==============================] - 0s 3ms/step - loss: 1.4948
16/16 [==============================] - 0s 3ms/step - loss: 1.4976
16/16 [==============================] - 0s 2ms/step - loss: 1.4981
16/16 [==============================] - 0s 3ms/step - loss: 1.4980

Testing for epoch 31 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.3471
16/16 [==============================] - 0s 4ms/step - loss: 0.8124
16/16 [==============================] - 0s 3ms/step - loss: 1.1240
16/16 [==============================] - 0s 3ms/step - loss: 1.3565
16/16 [==============================] - 0s 3ms/step - loss: 1.4393
16/16 [==============================] - 0s 2ms/step - loss: 1.4701
16/16 [==============================] - 0s 3ms/step - loss: 1.4793
16/16 [==============================] - 0s 3ms/step - loss: 1.4820
16/16 [==============================] - 0s 2ms/step - loss: 1.4825
16/16 [==============================] - 0s 3ms/step - loss: 1.4823
Epoch 32 of 60

Testing for epoch 32 index 1:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.3438
16/16 [==============================] - 0s 3ms/step - loss: 0.8174
16/16 [==============================] - 0s 3ms/step - loss: 1.1375
16/16 [==============================] - 0s 5ms/step - loss: 1.3757
16/16 [==============================] - 0s 5ms/step - loss: 1.4597
16/16 [==============================] - 0s 3ms/step - loss: 1.4907
16/16 [==============================] - 0s 2ms/step - loss: 1.4999
16/16 [==============================] - 0s 2ms/step - loss: 1.5025
16/16 [==============================] - 0s 2ms/step - loss: 1.5029
16/16 [==============================] - 0s 3ms/step - loss: 1.5028

Testing for epoch 32 index 2:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 7ms/step - loss: 0.3342
16/16 [==============================] - 0s 2ms/step - loss: 0.8204
16/16 [==============================] - 0s 3ms/step - loss: 1.1519
16/16 [==============================] - 0s 2ms/step - loss: 1.4001
16/16 [==============================] - 0s 3ms/step - loss: 1.4863
16/16 [==============================] - 0s 5ms/step - loss: 1.5185
16/16 [==============================] - 0s 3ms/step - loss: 1.5280
16/16 [==============================] - 0s 2ms/step - loss: 1.5306
16/16 [==============================] - 0s 4ms/step - loss: 1.5311
16/16 [==============================] - 0s 6ms/step - loss: 1.5309
Epoch 33 of 60

Testing for epoch 33 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.3439
16/16 [==============================] - 0s 2ms/step - loss: 0.8037
16/16 [==============================] - 0s 2ms/step - loss: 1.1181
16/16 [==============================] - 0s 4ms/step - loss: 1.3523
16/16 [==============================] - 0s 2ms/step - loss: 1.4324
16/16 [==============================] - 0s 6ms/step - loss: 1.4618
16/16 [==============================] - 0s 3ms/step - loss: 1.4703
16/16 [==============================] - 0s 3ms/step - loss: 1.4725
16/16 [==============================] - 0s 2ms/step - loss: 1.4728
16/16 [==============================] - 0s 4ms/step - loss: 1.4726

Testing for epoch 33 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.3266
16/16 [==============================] - 0s 2ms/step - loss: 0.8214
16/16 [==============================] - 0s 1ms/step - loss: 1.1596
16/16 [==============================] - 0s 2ms/step - loss: 1.4149
16/16 [==============================] - 0s 2ms/step - loss: 1.5020
16/16 [==============================] - 0s 4ms/step - loss: 1.5344
16/16 [==============================] - 0s 5ms/step - loss: 1.5437
16/16 [==============================] - 0s 3ms/step - loss: 1.5462
16/16 [==============================] - 0s 2ms/step - loss: 1.5466
16/16 [==============================] - 0s 4ms/step - loss: 1.5464
Epoch 34 of 60

Testing for epoch 34 index 1:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.3171
16/16 [==============================] - 0s 2ms/step - loss: 0.8333
16/16 [==============================] - 0s 2ms/step - loss: 1.1858
16/16 [==============================] - 0s 4ms/step - loss: 1.4524
16/16 [==============================] - 0s 2ms/step - loss: 1.5425
16/16 [==============================] - 0s 2ms/step - loss: 1.5756
16/16 [==============================] - 0s 2ms/step - loss: 1.5849
16/16 [==============================] - 0s 2ms/step - loss: 1.5875
16/16 [==============================] - 0s 8ms/step - loss: 1.5878
16/16 [==============================] - 0s 4ms/step - loss: 1.5876

Testing for epoch 34 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.3186
16/16 [==============================] - 0s 5ms/step - loss: 0.8261
16/16 [==============================] - 0s 2ms/step - loss: 1.1709
16/16 [==============================] - 0s 4ms/step - loss: 1.4344
16/16 [==============================] - 0s 2ms/step - loss: 1.5229
16/16 [==============================] - 0s 2ms/step - loss: 1.5553
16/16 [==============================] - 0s 1ms/step - loss: 1.5644
16/16 [==============================] - 0s 2ms/step - loss: 1.5669
16/16 [==============================] - 0s 3ms/step - loss: 1.5672
16/16 [==============================] - 0s 3ms/step - loss: 1.5670
Epoch 35 of 60

Testing for epoch 35 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.3032
16/16 [==============================] - 0s 6ms/step - loss: 0.8445
16/16 [==============================] - 0s 2ms/step - loss: 1.2128
16/16 [==============================] - 0s 4ms/step - loss: 1.4949
16/16 [==============================] - 0s 4ms/step - loss: 1.5888
16/16 [==============================] - 0s 2ms/step - loss: 1.6229
16/16 [==============================] - 0s 3ms/step - loss: 1.6324
16/16 [==============================] - 0s 2ms/step - loss: 1.6350
16/16 [==============================] - 0s 2ms/step - loss: 1.6352
16/16 [==============================] - 0s 2ms/step - loss: 1.6350

Testing for epoch 35 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.3089
16/16 [==============================] - 0s 3ms/step - loss: 0.8364
16/16 [==============================] - 0s 5ms/step - loss: 1.1932
16/16 [==============================] - 0s 3ms/step - loss: 1.4684
16/16 [==============================] - 0s 3ms/step - loss: 1.5592
16/16 [==============================] - 0s 3ms/step - loss: 1.5920
16/16 [==============================] - 0s 3ms/step - loss: 1.6009
16/16 [==============================] - 0s 4ms/step - loss: 1.6033
16/16 [==============================] - 0s 4ms/step - loss: 1.6035
16/16 [==============================] - 0s 2ms/step - loss: 1.6033
Epoch 36 of 60

Testing for epoch 36 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.2996
16/16 [==============================] - 0s 3ms/step - loss: 0.8391
16/16 [==============================] - 0s 2ms/step - loss: 1.2059
16/16 [==============================] - 0s 1ms/step - loss: 1.4884
16/16 [==============================] - 0s 2ms/step - loss: 1.5806
16/16 [==============================] - 0s 3ms/step - loss: 1.6135
16/16 [==============================] - 0s 4ms/step - loss: 1.6224
16/16 [==============================] - 0s 5ms/step - loss: 1.6247
16/16 [==============================] - 0s 2ms/step - loss: 1.6248
16/16 [==============================] - 0s 4ms/step - loss: 1.6246

Testing for epoch 36 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 7ms/step - loss: 0.3025
16/16 [==============================] - 0s 3ms/step - loss: 0.8445
16/16 [==============================] - 0s 3ms/step - loss: 1.2172
16/16 [==============================] - 0s 3ms/step - loss: 1.5034
16/16 [==============================] - 0s 3ms/step - loss: 1.5961
16/16 [==============================] - 0s 2ms/step - loss: 1.6289
16/16 [==============================] - 0s 2ms/step - loss: 1.6377
16/16 [==============================] - 0s 2ms/step - loss: 1.6399
16/16 [==============================] - 0s 3ms/step - loss: 1.6400
16/16 [==============================] - 0s 5ms/step - loss: 1.6397
Epoch 37 of 60

Testing for epoch 37 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.2955
16/16 [==============================] - 0s 4ms/step - loss: 0.8428
16/16 [==============================] - 0s 2ms/step - loss: 1.2231
16/16 [==============================] - 0s 2ms/step - loss: 1.5135
16/16 [==============================] - 0s 3ms/step - loss: 1.6066
16/16 [==============================] - 0s 2ms/step - loss: 1.6393
16/16 [==============================] - 0s 3ms/step - loss: 1.6479
16/16 [==============================] - 0s 4ms/step - loss: 1.6500
16/16 [==============================] - 0s 2ms/step - loss: 1.6500
16/16 [==============================] - 0s 2ms/step - loss: 1.6498

Testing for epoch 37 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2974
16/16 [==============================] - 0s 3ms/step - loss: 0.8399
16/16 [==============================] - 0s 2ms/step - loss: 1.2210
16/16 [==============================] - 0s 3ms/step - loss: 1.5109
16/16 [==============================] - 0s 2ms/step - loss: 1.6030
16/16 [==============================] - 0s 3ms/step - loss: 1.6350
16/16 [==============================] - 0s 4ms/step - loss: 1.6433
16/16 [==============================] - 0s 2ms/step - loss: 1.6453
16/16 [==============================] - 0s 3ms/step - loss: 1.6453
16/16 [==============================] - 0s 3ms/step - loss: 1.6450
Epoch 38 of 60

Testing for epoch 38 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2792
16/16 [==============================] - 0s 2ms/step - loss: 0.8512
16/16 [==============================] - 0s 4ms/step - loss: 1.2571
16/16 [==============================] - 0s 2ms/step - loss: 1.5640
16/16 [==============================] - 0s 3ms/step - loss: 1.6606
16/16 [==============================] - 0s 2ms/step - loss: 1.6938
16/16 [==============================] - 0s 2ms/step - loss: 1.7022
16/16 [==============================] - 0s 2ms/step - loss: 1.7042
16/16 [==============================] - 0s 3ms/step - loss: 1.7041
16/16 [==============================] - 0s 2ms/step - loss: 1.7038

Testing for epoch 38 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2901
16/16 [==============================] - 0s 6ms/step - loss: 0.8366
16/16 [==============================] - 0s 2ms/step - loss: 1.2294
16/16 [==============================] - 0s 2ms/step - loss: 1.5230
16/16 [==============================] - 0s 1ms/step - loss: 1.6149
16/16 [==============================] - 0s 1ms/step - loss: 1.6462
16/16 [==============================] - 0s 2ms/step - loss: 1.6540
16/16 [==============================] - 0s 2ms/step - loss: 1.6558
16/16 [==============================] - 0s 3ms/step - loss: 1.6556
16/16 [==============================] - 0s 3ms/step - loss: 1.6553
Epoch 39 of 60

Testing for epoch 39 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2905
16/16 [==============================] - 0s 2ms/step - loss: 0.8364
16/16 [==============================] - 0s 2ms/step - loss: 1.2316
16/16 [==============================] - 0s 2ms/step - loss: 1.5243
16/16 [==============================] - 0s 3ms/step - loss: 1.6149
16/16 [==============================] - 0s 2ms/step - loss: 1.6451
16/16 [==============================] - 0s 4ms/step - loss: 1.6524
16/16 [==============================] - 0s 2ms/step - loss: 1.6539
16/16 [==============================] - 0s 3ms/step - loss: 1.6537
16/16 [==============================] - 0s 3ms/step - loss: 1.6534

Testing for epoch 39 index 2:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2874
16/16 [==============================] - 0s 2ms/step - loss: 0.8347
16/16 [==============================] - 0s 1ms/step - loss: 1.2355
16/16 [==============================] - 0s 2ms/step - loss: 1.5317
16/16 [==============================] - 0s 2ms/step - loss: 1.6227
16/16 [==============================] - 0s 2ms/step - loss: 1.6528
16/16 [==============================] - 0s 3ms/step - loss: 1.6601
16/16 [==============================] - 0s 2ms/step - loss: 1.6616
16/16 [==============================] - 0s 1ms/step - loss: 1.6614
16/16 [==============================] - 0s 3ms/step - loss: 1.6611
Epoch 40 of 60

Testing for epoch 40 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.2839
16/16 [==============================] - 0s 2ms/step - loss: 0.8481
16/16 [==============================] - 0s 5ms/step - loss: 1.2645
16/16 [==============================] - 0s 2ms/step - loss: 1.5698
16/16 [==============================] - 0s 2ms/step - loss: 1.6625
16/16 [==============================] - 0s 2ms/step - loss: 1.6928
16/16 [==============================] - 0s 1ms/step - loss: 1.7000
16/16 [==============================] - 0s 2ms/step - loss: 1.7015
16/16 [==============================] - 0s 2ms/step - loss: 1.7012
16/16 [==============================] - 0s 3ms/step - loss: 1.7008

Testing for epoch 40 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2847
16/16 [==============================] - 0s 2ms/step - loss: 0.8394
16/16 [==============================] - 0s 3ms/step - loss: 1.2524
16/16 [==============================] - 0s 2ms/step - loss: 1.5537
16/16 [==============================] - 0s 2ms/step - loss: 1.6444
16/16 [==============================] - 0s 2ms/step - loss: 1.6736
16/16 [==============================] - 0s 5ms/step - loss: 1.6804
16/16 [==============================] - 0s 3ms/step - loss: 1.6817
16/16 [==============================] - 0s 3ms/step - loss: 1.6814
16/16 [==============================] - 0s 7ms/step - loss: 1.6810
Epoch 41 of 60

Testing for epoch 41 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.2729
16/16 [==============================] - 0s 5ms/step - loss: 0.8445
16/16 [==============================] - 0s 2ms/step - loss: 1.2732
16/16 [==============================] - 0s 5ms/step - loss: 1.5836
16/16 [==============================] - 0s 6ms/step - loss: 1.6759
16/16 [==============================] - 0s 2ms/step - loss: 1.7053
16/16 [==============================] - 0s 3ms/step - loss: 1.7120
16/16 [==============================] - 0s 5ms/step - loss: 1.7131
16/16 [==============================] - 0s 5ms/step - loss: 1.7127
16/16 [==============================] - 0s 2ms/step - loss: 1.7123

Testing for epoch 41 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2893
16/16 [==============================] - 0s 3ms/step - loss: 0.8372
16/16 [==============================] - 0s 2ms/step - loss: 1.2508
16/16 [==============================] - 0s 4ms/step - loss: 1.5485
16/16 [==============================] - 0s 3ms/step - loss: 1.6360
16/16 [==============================] - 0s 1ms/step - loss: 1.6633
16/16 [==============================] - 0s 4ms/step - loss: 1.6694
16/16 [==============================] - 0s 4ms/step - loss: 1.6703
16/16 [==============================] - 0s 2ms/step - loss: 1.6699
16/16 [==============================] - 0s 3ms/step - loss: 1.6694
Epoch 42 of 60

Testing for epoch 42 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2739
16/16 [==============================] - 0s 3ms/step - loss: 0.8525
16/16 [==============================] - 0s 2ms/step - loss: 1.2937
16/16 [==============================] - 0s 2ms/step - loss: 1.6094
16/16 [==============================] - 0s 3ms/step - loss: 1.7018
16/16 [==============================] - 0s 3ms/step - loss: 1.7300
16/16 [==============================] - 0s 2ms/step - loss: 1.7363
16/16 [==============================] - 0s 2ms/step - loss: 1.7373
16/16 [==============================] - 0s 3ms/step - loss: 1.7368
16/16 [==============================] - 0s 2ms/step - loss: 1.7364

Testing for epoch 42 index 2:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2656
16/16 [==============================] - 0s 3ms/step - loss: 0.8523
16/16 [==============================] - 0s 2ms/step - loss: 1.3032
16/16 [==============================] - 0s 4ms/step - loss: 1.6241
16/16 [==============================] - 0s 5ms/step - loss: 1.7170
16/16 [==============================] - 0s 2ms/step - loss: 1.7449
16/16 [==============================] - 0s 4ms/step - loss: 1.7510
16/16 [==============================] - 0s 5ms/step - loss: 1.7517
16/16 [==============================] - 0s 2ms/step - loss: 1.7512
16/16 [==============================] - 0s 4ms/step - loss: 1.7507
Epoch 43 of 60

Testing for epoch 43 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2737
16/16 [==============================] - 0s 4ms/step - loss: 0.8490
16/16 [==============================] - 0s 3ms/step - loss: 1.2940
16/16 [==============================] - 0s 4ms/step - loss: 1.6083
16/16 [==============================] - 0s 3ms/step - loss: 1.6984
16/16 [==============================] - 0s 2ms/step - loss: 1.7251
16/16 [==============================] - 0s 3ms/step - loss: 1.7309
16/16 [==============================] - 0s 3ms/step - loss: 1.7316
16/16 [==============================] - 0s 1ms/step - loss: 1.7310
16/16 [==============================] - 0s 4ms/step - loss: 1.7306

Testing for epoch 43 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2868
16/16 [==============================] - 0s 3ms/step - loss: 0.8237
16/16 [==============================] - 0s 3ms/step - loss: 1.2425
16/16 [==============================] - 0s 2ms/step - loss: 1.5341
16/16 [==============================] - 0s 3ms/step - loss: 1.6170
16/16 [==============================] - 0s 5ms/step - loss: 1.6411
16/16 [==============================] - 0s 4ms/step - loss: 1.6460
16/16 [==============================] - 0s 3ms/step - loss: 1.6464
16/16 [==============================] - 0s 4ms/step - loss: 1.6459
16/16 [==============================] - 0s 2ms/step - loss: 1.6454
Epoch 44 of 60

Testing for epoch 44 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2686
16/16 [==============================] - 0s 2ms/step - loss: 0.8542
16/16 [==============================] - 0s 3ms/step - loss: 1.3157
16/16 [==============================] - 0s 2ms/step - loss: 1.6323
16/16 [==============================] - 0s 2ms/step - loss: 1.7218
16/16 [==============================] - 0s 2ms/step - loss: 1.7474
16/16 [==============================] - 0s 1ms/step - loss: 1.7526
16/16 [==============================] - 0s 2ms/step - loss: 1.7530
16/16 [==============================] - 0s 2ms/step - loss: 1.7524
16/16 [==============================] - 0s 5ms/step - loss: 1.7519

Testing for epoch 44 index 2:
32/32 [==============================] - 0s 987us/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2645
16/16 [==============================] - 0s 4ms/step - loss: 0.8529
16/16 [==============================] - 0s 3ms/step - loss: 1.3214
16/16 [==============================] - 0s 3ms/step - loss: 1.6400
16/16 [==============================] - 0s 3ms/step - loss: 1.7295
16/16 [==============================] - 0s 3ms/step - loss: 1.7550
16/16 [==============================] - 0s 2ms/step - loss: 1.7600
16/16 [==============================] - 0s 2ms/step - loss: 1.7604
16/16 [==============================] - 0s 3ms/step - loss: 1.7597
16/16 [==============================] - 0s 3ms/step - loss: 1.7592
Epoch 45 of 60

Testing for epoch 45 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2705
16/16 [==============================] - 0s 3ms/step - loss: 0.8460
16/16 [==============================] - 0s 2ms/step - loss: 1.3052
16/16 [==============================] - 0s 4ms/step - loss: 1.6143
16/16 [==============================] - 0s 4ms/step - loss: 1.6997
16/16 [==============================] - 0s 3ms/step - loss: 1.7234
16/16 [==============================] - 0s 3ms/step - loss: 1.7279
16/16 [==============================] - 0s 2ms/step - loss: 1.7280
16/16 [==============================] - 0s 2ms/step - loss: 1.7273
16/16 [==============================] - 0s 5ms/step - loss: 1.7268

Testing for epoch 45 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.2724
16/16 [==============================] - 0s 2ms/step - loss: 0.8481
16/16 [==============================] - 0s 4ms/step - loss: 1.3104
16/16 [==============================] - 0s 3ms/step - loss: 1.6209
16/16 [==============================] - 0s 2ms/step - loss: 1.7059
16/16 [==============================] - 0s 3ms/step - loss: 1.7292
16/16 [==============================] - 0s 2ms/step - loss: 1.7335
16/16 [==============================] - 0s 5ms/step - loss: 1.7336
16/16 [==============================] - 0s 3ms/step - loss: 1.7329
16/16 [==============================] - 0s 3ms/step - loss: 1.7323
Epoch 46 of 60

Testing for epoch 46 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.2586
16/16 [==============================] - 0s 2ms/step - loss: 0.8545
16/16 [==============================] - 0s 2ms/step - loss: 1.3358
16/16 [==============================] - 0s 2ms/step - loss: 1.6567
16/16 [==============================] - 0s 2ms/step - loss: 1.7432
16/16 [==============================] - 0s 3ms/step - loss: 1.7667
16/16 [==============================] - 0s 3ms/step - loss: 1.7709
16/16 [==============================] - 0s 2ms/step - loss: 1.7709
16/16 [==============================] - 0s 2ms/step - loss: 1.7701
16/16 [==============================] - 0s 2ms/step - loss: 1.7696

Testing for epoch 46 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2490
16/16 [==============================] - 0s 3ms/step - loss: 0.8656
16/16 [==============================] - 0s 2ms/step - loss: 1.3673
16/16 [==============================] - 0s 2ms/step - loss: 1.7015
16/16 [==============================] - 0s 2ms/step - loss: 1.7915
16/16 [==============================] - 0s 5ms/step - loss: 1.8150
16/16 [==============================] - 0s 2ms/step - loss: 1.8193
16/16 [==============================] - 0s 2ms/step - loss: 1.8193
16/16 [==============================] - 0s 3ms/step - loss: 1.8185
16/16 [==============================] - 0s 2ms/step - loss: 1.8179
Epoch 47 of 60

Testing for epoch 47 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.2567
16/16 [==============================] - 0s 1ms/step - loss: 0.8645
16/16 [==============================] - 0s 3ms/step - loss: 1.3621
16/16 [==============================] - 0s 3ms/step - loss: 1.6901
16/16 [==============================] - 0s 2ms/step - loss: 1.7773
16/16 [==============================] - 0s 5ms/step - loss: 1.7996
16/16 [==============================] - 0s 2ms/step - loss: 1.8035
16/16 [==============================] - 0s 2ms/step - loss: 1.8034
16/16 [==============================] - 0s 3ms/step - loss: 1.8026
16/16 [==============================] - 0s 3ms/step - loss: 1.8020

Testing for epoch 47 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2497
16/16 [==============================] - 0s 4ms/step - loss: 0.8630
16/16 [==============================] - 0s 3ms/step - loss: 1.3677
16/16 [==============================] - 0s 3ms/step - loss: 1.6995
16/16 [==============================] - 0s 2ms/step - loss: 1.7867
16/16 [==============================] - 0s 5ms/step - loss: 1.8086
16/16 [==============================] - 0s 2ms/step - loss: 1.8123
16/16 [==============================] - 0s 2ms/step - loss: 1.8120
16/16 [==============================] - 0s 3ms/step - loss: 1.8111
16/16 [==============================] - 0s 2ms/step - loss: 1.8106
Epoch 48 of 60

Testing for epoch 48 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.2430
16/16 [==============================] - 0s 3ms/step - loss: 0.8748
16/16 [==============================] - 0s 4ms/step - loss: 1.4001
16/16 [==============================] - 0s 6ms/step - loss: 1.7426
16/16 [==============================] - 0s 3ms/step - loss: 1.8318
16/16 [==============================] - 0s 4ms/step - loss: 1.8542
16/16 [==============================] - 0s 3ms/step - loss: 1.8580
16/16 [==============================] - 0s 1ms/step - loss: 1.8578
16/16 [==============================] - 0s 5ms/step - loss: 1.8569
16/16 [==============================] - 0s 1ms/step - loss: 1.8564

Testing for epoch 48 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2516
16/16 [==============================] - 0s 5ms/step - loss: 0.8616
16/16 [==============================] - 0s 3ms/step - loss: 1.3706
16/16 [==============================] - 0s 7ms/step - loss: 1.7016
16/16 [==============================] - 0s 6ms/step - loss: 1.7869
16/16 [==============================] - 0s 3ms/step - loss: 1.8079
16/16 [==============================] - 0s 2ms/step - loss: 1.8112
16/16 [==============================] - 0s 2ms/step - loss: 1.8109
16/16 [==============================] - 0s 2ms/step - loss: 1.8100
16/16 [==============================] - 0s 2ms/step - loss: 1.8095
Epoch 49 of 60

Testing for epoch 49 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2607
16/16 [==============================] - 0s 8ms/step - loss: 0.8499
16/16 [==============================] - 0s 2ms/step - loss: 1.3438
16/16 [==============================] - 0s 5ms/step - loss: 1.6606
16/16 [==============================] - 0s 4ms/step - loss: 1.7408
16/16 [==============================] - 0s 2ms/step - loss: 1.7599
16/16 [==============================] - 0s 4ms/step - loss: 1.7627
16/16 [==============================] - 0s 3ms/step - loss: 1.7622
16/16 [==============================] - 0s 2ms/step - loss: 1.7612
16/16 [==============================] - 0s 5ms/step - loss: 1.7607

Testing for epoch 49 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2398
16/16 [==============================] - 0s 2ms/step - loss: 0.8724
16/16 [==============================] - 0s 2ms/step - loss: 1.4071
16/16 [==============================] - 0s 3ms/step - loss: 1.7500
16/16 [==============================] - 0s 2ms/step - loss: 1.8362
16/16 [==============================] - 0s 2ms/step - loss: 1.8567
16/16 [==============================] - 0s 2ms/step - loss: 1.8597
16/16 [==============================] - 0s 3ms/step - loss: 1.8592
16/16 [==============================] - 0s 3ms/step - loss: 1.8582
16/16 [==============================] - 0s 3ms/step - loss: 1.8576
Epoch 50 of 60

Testing for epoch 50 index 1:
32/32 [==============================] - 0s 4ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2464
16/16 [==============================] - 0s 3ms/step - loss: 0.8709
16/16 [==============================] - 0s 5ms/step - loss: 1.4021
16/16 [==============================] - 0s 4ms/step - loss: 1.7389
16/16 [==============================] - 0s 2ms/step - loss: 1.8224
16/16 [==============================] - 0s 2ms/step - loss: 1.8419
16/16 [==============================] - 0s 2ms/step - loss: 1.8446
16/16 [==============================] - 0s 3ms/step - loss: 1.8440
16/16 [==============================] - 0s 2ms/step - loss: 1.8429
16/16 [==============================] - 0s 6ms/step - loss: 1.8424

Testing for epoch 50 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2460
16/16 [==============================] - 0s 4ms/step - loss: 0.8598
16/16 [==============================] - 0s 2ms/step - loss: 1.3860
16/16 [==============================] - 0s 3ms/step - loss: 1.7171
16/16 [==============================] - 0s 2ms/step - loss: 1.7985
16/16 [==============================] - 0s 5ms/step - loss: 1.8172
16/16 [==============================] - 0s 3ms/step - loss: 1.8197
16/16 [==============================] - 0s 2ms/step - loss: 1.8190
16/16 [==============================] - 0s 2ms/step - loss: 1.8180
16/16 [==============================] - 0s 3ms/step - loss: 1.8174
Epoch 51 of 60

Testing for epoch 51 index 1:
32/32 [==============================] - 0s 4ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.2397
16/16 [==============================] - 0s 3ms/step - loss: 0.8597
16/16 [==============================] - 0s 3ms/step - loss: 1.3960
16/16 [==============================] - 0s 4ms/step - loss: 1.7264
16/16 [==============================] - 0s 3ms/step - loss: 1.8066
16/16 [==============================] - 0s 2ms/step - loss: 1.8247
16/16 [==============================] - 0s 2ms/step - loss: 1.8268
16/16 [==============================] - 0s 3ms/step - loss: 1.8260
16/16 [==============================] - 0s 3ms/step - loss: 1.8249
16/16 [==============================] - 0s 5ms/step - loss: 1.8243

Testing for epoch 51 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2460
16/16 [==============================] - 0s 2ms/step - loss: 0.8608
16/16 [==============================] - 0s 3ms/step - loss: 1.3967
16/16 [==============================] - 0s 4ms/step - loss: 1.7260
16/16 [==============================] - 0s 4ms/step - loss: 1.8056
16/16 [==============================] - 0s 2ms/step - loss: 1.8235
16/16 [==============================] - 0s 2ms/step - loss: 1.8257
16/16 [==============================] - 0s 2ms/step - loss: 1.8249
16/16 [==============================] - 0s 3ms/step - loss: 1.8238
16/16 [==============================] - 0s 2ms/step - loss: 1.8232
Epoch 52 of 60

Testing for epoch 52 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 6ms/step - loss: 0.2342
16/16 [==============================] - 0s 2ms/step - loss: 0.8745
16/16 [==============================] - 0s 2ms/step - loss: 1.4368
16/16 [==============================] - 0s 7ms/step - loss: 1.7798
16/16 [==============================] - 0s 4ms/step - loss: 1.8618
16/16 [==============================] - 0s 2ms/step - loss: 1.8799
16/16 [==============================] - 0s 4ms/step - loss: 1.8821
16/16 [==============================] - 0s 3ms/step - loss: 1.8812
16/16 [==============================] - 0s 2ms/step - loss: 1.8801
16/16 [==============================] - 0s 2ms/step - loss: 1.8795

Testing for epoch 52 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2429
16/16 [==============================] - 0s 4ms/step - loss: 0.8624
16/16 [==============================] - 0s 3ms/step - loss: 1.4095
16/16 [==============================] - 0s 3ms/step - loss: 1.7418
16/16 [==============================] - 0s 3ms/step - loss: 1.8205
16/16 [==============================] - 0s 2ms/step - loss: 1.8377
16/16 [==============================] - 0s 2ms/step - loss: 1.8395
16/16 [==============================] - 0s 2ms/step - loss: 1.8386
16/16 [==============================] - 0s 6ms/step - loss: 1.8375
16/16 [==============================] - 0s 2ms/step - loss: 1.8369
Epoch 53 of 60

Testing for epoch 53 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2407
16/16 [==============================] - 0s 3ms/step - loss: 0.8647
16/16 [==============================] - 0s 2ms/step - loss: 1.4187
16/16 [==============================] - 0s 3ms/step - loss: 1.7523
16/16 [==============================] - 0s 2ms/step - loss: 1.8301
16/16 [==============================] - 0s 2ms/step - loss: 1.8467
16/16 [==============================] - 0s 4ms/step - loss: 1.8484
16/16 [==============================] - 0s 5ms/step - loss: 1.8474
16/16 [==============================] - 0s 7ms/step - loss: 1.8463
16/16 [==============================] - 0s 3ms/step - loss: 1.8457

Testing for epoch 53 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2457
16/16 [==============================] - 0s 2ms/step - loss: 0.8634
16/16 [==============================] - 0s 3ms/step - loss: 1.4154
16/16 [==============================] - 0s 1ms/step - loss: 1.7468
16/16 [==============================] - 0s 3ms/step - loss: 1.8234
16/16 [==============================] - 0s 2ms/step - loss: 1.8396
16/16 [==============================] - 0s 5ms/step - loss: 1.8411
16/16 [==============================] - 0s 2ms/step - loss: 1.8401
16/16 [==============================] - 0s 3ms/step - loss: 1.8390
16/16 [==============================] - 0s 2ms/step - loss: 1.8384
Epoch 54 of 60

Testing for epoch 54 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.2491
16/16 [==============================] - 0s 2ms/step - loss: 0.8555
16/16 [==============================] - 0s 1ms/step - loss: 1.3991
16/16 [==============================] - 0s 6ms/step - loss: 1.7216
16/16 [==============================] - 0s 1ms/step - loss: 1.7947
16/16 [==============================] - 0s 2ms/step - loss: 1.8097
16/16 [==============================] - 0s 3ms/step - loss: 1.8108
16/16 [==============================] - 0s 2ms/step - loss: 1.8096
16/16 [==============================] - 0s 2ms/step - loss: 1.8084
16/16 [==============================] - 0s 2ms/step - loss: 1.8077

Testing for epoch 54 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2488
16/16 [==============================] - 0s 2ms/step - loss: 0.8498
16/16 [==============================] - 0s 2ms/step - loss: 1.3926
16/16 [==============================] - 0s 2ms/step - loss: 1.7141
16/16 [==============================] - 0s 3ms/step - loss: 1.7866
16/16 [==============================] - 0s 1ms/step - loss: 1.8014
16/16 [==============================] - 0s 2ms/step - loss: 1.8025
16/16 [==============================] - 0s 4ms/step - loss: 1.8014
16/16 [==============================] - 0s 3ms/step - loss: 1.8002
16/16 [==============================] - 0s 3ms/step - loss: 1.7996
Epoch 55 of 60

Testing for epoch 55 index 1:
32/32 [==============================] - 0s 4ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2347
16/16 [==============================] - 0s 3ms/step - loss: 0.8676
16/16 [==============================] - 0s 3ms/step - loss: 1.4441
16/16 [==============================] - 0s 5ms/step - loss: 1.7837
16/16 [==============================] - 0s 2ms/step - loss: 1.8597
16/16 [==============================] - 0s 2ms/step - loss: 1.8751
16/16 [==============================] - 0s 2ms/step - loss: 1.8763
16/16 [==============================] - 0s 4ms/step - loss: 1.8752
16/16 [==============================] - 0s 3ms/step - loss: 1.8740
16/16 [==============================] - 0s 2ms/step - loss: 1.8734

Testing for epoch 55 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.2231
16/16 [==============================] - 0s 4ms/step - loss: 0.8842
16/16 [==============================] - 0s 2ms/step - loss: 1.4912
16/16 [==============================] - 0s 2ms/step - loss: 1.8481
16/16 [==============================] - 0s 4ms/step - loss: 1.9275
16/16 [==============================] - 0s 2ms/step - loss: 1.9435
16/16 [==============================] - 0s 6ms/step - loss: 1.9447
16/16 [==============================] - 0s 4ms/step - loss: 1.9435
16/16 [==============================] - 0s 3ms/step - loss: 1.9423
16/16 [==============================] - 0s 3ms/step - loss: 1.9417
Epoch 56 of 60

Testing for epoch 56 index 1:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2268
16/16 [==============================] - 0s 2ms/step - loss: 0.8747
16/16 [==============================] - 0s 2ms/step - loss: 1.4723
16/16 [==============================] - 0s 4ms/step - loss: 1.8206
16/16 [==============================] - 0s 1ms/step - loss: 1.8969
16/16 [==============================] - 0s 4ms/step - loss: 1.9119
16/16 [==============================] - 0s 2ms/step - loss: 1.9129
16/16 [==============================] - 0s 4ms/step - loss: 1.9116
16/16 [==============================] - 0s 3ms/step - loss: 1.9104
16/16 [==============================] - 0s 1ms/step - loss: 1.9097

Testing for epoch 56 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2272
16/16 [==============================] - 0s 3ms/step - loss: 0.8713
16/16 [==============================] - 0s 4ms/step - loss: 1.4699
16/16 [==============================] - 0s 5ms/step - loss: 1.8185
16/16 [==============================] - 0s 3ms/step - loss: 1.8946
16/16 [==============================] - 0s 1ms/step - loss: 1.9095
16/16 [==============================] - 0s 4ms/step - loss: 1.9105
16/16 [==============================] - 0s 2ms/step - loss: 1.9093
16/16 [==============================] - 0s 4ms/step - loss: 1.9081
16/16 [==============================] - 0s 2ms/step - loss: 1.9075
Epoch 57 of 60

Testing for epoch 57 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2293
16/16 [==============================] - 0s 3ms/step - loss: 0.8743
16/16 [==============================] - 0s 3ms/step - loss: 1.4763
16/16 [==============================] - 0s 3ms/step - loss: 1.8239
16/16 [==============================] - 0s 3ms/step - loss: 1.8987
16/16 [==============================] - 0s 3ms/step - loss: 1.9131
16/16 [==============================] - 0s 3ms/step - loss: 1.9139
16/16 [==============================] - 0s 4ms/step - loss: 1.9126
16/16 [==============================] - 0s 3ms/step - loss: 1.9113
16/16 [==============================] - 0s 3ms/step - loss: 1.9107

Testing for epoch 57 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2216
16/16 [==============================] - 0s 3ms/step - loss: 0.8819
16/16 [==============================] - 0s 6ms/step - loss: 1.5024
16/16 [==============================] - 0s 3ms/step - loss: 1.8595
16/16 [==============================] - 0s 4ms/step - loss: 1.9356
16/16 [==============================] - 0s 2ms/step - loss: 1.9500
16/16 [==============================] - 0s 3ms/step - loss: 1.9506
16/16 [==============================] - 0s 2ms/step - loss: 1.9492
16/16 [==============================] - 0s 2ms/step - loss: 1.9479
16/16 [==============================] - 0s 2ms/step - loss: 1.9473
Epoch 58 of 60

Testing for epoch 58 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.2374
16/16 [==============================] - 0s 4ms/step - loss: 0.8646
16/16 [==============================] - 0s 2ms/step - loss: 1.4559
16/16 [==============================] - 0s 6ms/step - loss: 1.7928
16/16 [==============================] - 0s 1ms/step - loss: 1.8635
16/16 [==============================] - 0s 2ms/step - loss: 1.8764
16/16 [==============================] - 0s 5ms/step - loss: 1.8768
16/16 [==============================] - 0s 2ms/step - loss: 1.8754
16/16 [==============================] - 0s 4ms/step - loss: 1.8742
16/16 [==============================] - 0s 4ms/step - loss: 1.8735

Testing for epoch 58 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.2310
16/16 [==============================] - 0s 4ms/step - loss: 0.8798
16/16 [==============================] - 0s 4ms/step - loss: 1.4927
16/16 [==============================] - 0s 2ms/step - loss: 1.8421
16/16 [==============================] - 0s 4ms/step - loss: 1.9149
16/16 [==============================] - 0s 2ms/step - loss: 1.9282
16/16 [==============================] - 0s 2ms/step - loss: 1.9286
16/16 [==============================] - 0s 2ms/step - loss: 1.9271
16/16 [==============================] - 0s 2ms/step - loss: 1.9258
16/16 [==============================] - 0s 5ms/step - loss: 1.9251
Epoch 59 of 60

Testing for epoch 59 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.2316
16/16 [==============================] - 0s 2ms/step - loss: 0.8690
16/16 [==============================] - 0s 2ms/step - loss: 1.4743
16/16 [==============================] - 0s 2ms/step - loss: 1.8164
16/16 [==============================] - 0s 2ms/step - loss: 1.8867
16/16 [==============================] - 0s 1ms/step - loss: 1.8992
16/16 [==============================] - 0s 2ms/step - loss: 1.8995
16/16 [==============================] - 0s 3ms/step - loss: 1.8980
16/16 [==============================] - 0s 2ms/step - loss: 1.8967
16/16 [==============================] - 0s 3ms/step - loss: 1.8961

Testing for epoch 59 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2227
16/16 [==============================] - 0s 3ms/step - loss: 0.8765
16/16 [==============================] - 0s 3ms/step - loss: 1.5017
16/16 [==============================] - 0s 3ms/step - loss: 1.8541
16/16 [==============================] - 0s 2ms/step - loss: 1.9259
16/16 [==============================] - 0s 3ms/step - loss: 1.9386
16/16 [==============================] - 0s 4ms/step - loss: 1.9389
16/16 [==============================] - 0s 3ms/step - loss: 1.9373
16/16 [==============================] - 0s 4ms/step - loss: 1.9360
16/16 [==============================] - 0s 5ms/step - loss: 1.9353
Epoch 60 of 60

Testing for epoch 60 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.2258
16/16 [==============================] - 0s 4ms/step - loss: 0.8650
16/16 [==============================] - 0s 2ms/step - loss: 1.4787
16/16 [==============================] - 0s 5ms/step - loss: 1.8211
16/16 [==============================] - 0s 2ms/step - loss: 1.8896
16/16 [==============================] - 0s 2ms/step - loss: 1.9013
16/16 [==============================] - 0s 3ms/step - loss: 1.9012
16/16 [==============================] - 0s 3ms/step - loss: 1.8996
16/16 [==============================] - 0s 2ms/step - loss: 1.8983
16/16 [==============================] - 0s 2ms/step - loss: 1.8976

Testing for epoch 60 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.2214
16/16 [==============================] - 0s 2ms/step - loss: 0.8761
16/16 [==============================] - 0s 3ms/step - loss: 1.5090
16/16 [==============================] - 0s 2ms/step - loss: 1.8610
16/16 [==============================] - 0s 2ms/step - loss: 1.9310
16/16 [==============================] - 0s 2ms/step - loss: 1.9428
16/16 [==============================] - 0s 3ms/step - loss: 1.9428
16/16 [==============================] - 0s 2ms/step - loss: 1.9411
16/16 [==============================] - 0s 1ms/step - loss: 1.9398
16/16 [==============================] - 0s 3ms/step - loss: 1.9391
32/32 [==============================] - 0s 2ms/step
outlier_MO_GAAL_one = list(clf.labels_)
outlier_MO_GAAL_one = list(map(lambda x: 1 if x==0  else -1,outlier_MO_GAAL_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_MO_GAAL_one,tab_linear)
_conf.conf("MO-GAAL (Liu et al., 2019)")

Accuracy: 0.936
Precision: 0.965
Recall: 0.967
F1 Score: 0.966
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
thirteen = twelve.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  thirteen = twelve.append(_conf.tab)
thirteen
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.988 0.993684 0.993684 0.993684
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.895 1.000000 0.889474 0.941504
HBOS (Goldstein and Dengel, 2012) 0.960 0.977941 0.980000 0.978970
SOS (Janssens et al., 2012) 0.916 0.955789 0.955789 0.955789
SO-GAAL (Liu et al., 2019) 0.934 0.965263 0.965263 0.965263
MO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359
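The FutureWarning above comes from the deprecated DataFrame.append used both inside Conf_matrx and in the table-building cells. A minimal sketch of the pandas.concat replacement, with toy frames standing in for the notebook's twelve and _conf.tab:

import pandas as pd

# Toy stand-ins for the accumulated table (`twelve`) and the new result row (`_conf.tab`)
twelve = pd.DataFrame({"Accuracy": [0.998]}, index=["GODE"])
new_row = pd.DataFrame({"Accuracy": [0.936]}, index=["MO-GAAL (Liu et al., 2019)"])

# pd.concat is the non-deprecated way to stack result rows
thirteen = pd.concat([twelve, new_row])
print(thirteen)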

LSCP\(\star\)

1. Is a random seed specified? O, via random_state

2. Is contamination specified? O

3. Can the number of iterations be specified? X

detectors = [KNN(), LOF(), OCSVM()]
clf = LSCP(detectors,contamination=0.05, random_state=77)
clf.fit(_df[['x', 'y']])
_df['LSCP_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/pyod/models/lscp.py:382: UserWarning: The number of histogram bins is greater than the number of classifiers, reducing n_bins to n_clf.
  warnings.warn(
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/scipy/stats/_stats_py.py:4424: ConstantInputWarning: An input array is constant; the correlation coefficient is not defined.
  warnings.warn(stats.ConstantInputWarning(msg))
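The first UserWarning is raised because LSCP's default number of histogram bins exceeds the three base detectors, so pyod shrinks n_bins internally. A small sketch that sets n_bins explicitly to silence the warning (assuming pyod's documented LSCP signature; otherwise identical to the cell above):

from pyod.models.knn import KNN
from pyod.models.lof import LOF
from pyod.models.ocsvm import OCSVM
from pyod.models.lscp import LSCP

detectors = [KNN(), LOF(), OCSVM()]
# Keep n_bins no larger than the number of base detectors so pyod
# does not have to reduce it and warn.
clf = LSCP(detectors, contamination=0.05, n_bins=len(detectors), random_state=77)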
outlier_LSCP_one = list(clf.labels_)
outlier_LSCP_one = list(map(lambda x: 1 if x==0  else -1,outlier_LSCP_one))
_conf = Conf_matrx(outlier_true_one_1,outlier_LSCP_one,tab_linear)
_conf.conf("LSCP (Zhao et al., 2019)")

Accuracy: 0.984
Precision: 0.992
Recall: 0.992
F1 Score: 0.992
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
fourteen = thirteen.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  fourteen = thirteen.append(_conf.tab)
fourteen
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.990 0.994737 0.994737 0.994737
kNN (Ramaswamy et al., 2000) 0.994 0.996842 0.996842 0.996842
OCSVM (Schölkopf et al., 2001) 0.965 0.981073 0.982105 0.981589
MCD (Hardin and Rocke, 2004) 0.998 0.998947 0.998947 0.998947
Feature Bagging (Lazarevic and Kumar, 2005) 0.988 0.993684 0.993684 0.993684
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.895 1.000000 0.889474 0.941504
HBOS (Goldstein and Dengel, 2012) 0.960 0.977941 0.980000 0.978970
SOS (Janssens et al., 2012) 0.916 0.955789 0.955789 0.955789
SO-GAAL (Liu et al., 2019) 0.934 0.965263 0.965263 0.965263
MO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359
LSCP (Zhao et al., 2019) 0.984 0.991579 0.991579 0.991579

Linear Result

\(U^\star\), which is a mixture of uniform distributions \(U(5,7)\) and \(U(-7,-5)\).

fourteen.round(3)
Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.990 0.995 0.995 0.995
kNN (Ramaswamy et al., 2000) 0.994 0.997 0.997 0.997
OCSVM (Schölkopf et al., 2001) 0.965 0.981 0.982 0.982
MCD (Hardin and Rocke, 2004) 0.998 0.999 0.999 0.999
Feature Bagging (Lazarevic and Kumar, 2005) 0.988 0.994 0.994 0.994
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.895 1.000 0.889 0.942
HBOS (Goldstein and Dengel, 2012) 0.960 0.978 0.980 0.979
SOS (Janssens et al., 2012) 0.916 0.956 0.956 0.956
SO-GAAL (Liu et al., 2019) 0.934 0.965 0.965 0.965
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.984 0.992 0.992 0.992
linear_rst = fourteen.round(3)
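For reference, a minimal numpy sketch of drawing from the \(U^\star\) mixture described above, assuming equal weight on the two uniform components (the helper name sample_u_star is illustrative only):

import numpy as np

rng = np.random.default_rng(1)

def sample_u_star(n):
    # Mixture of U(5, 7) and U(-7, -5) with equal weights:
    # draw from U(5, 7), then flip the sign of each draw with probability 1/2.
    signs = rng.choice([-1.0, 1.0], size=n)
    return signs * rng.uniform(5.0, 7.0, size=n)

print(sample_u_star(5))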

Orbit EbayesThresh

%load_ext rpy2.ipython
The rpy2.ipython extension is already loaded. To reload it, use:
  %reload_ext rpy2.ipython
%%R
library(EbayesThresh)
set.seed(1)
epsilon = rnorm(1000)
signal = sample(c(runif(25,-7,-5), runif(25,5,7), rep(0,950)))
index_of_trueoutlier = which(signal!=0)
index_of_trueoutlier
x=signal+epsilon
plot(1:1000,x)
points(index_of_trueoutlier,x[index_of_trueoutlier],col=2,cex=4)

#plot(x,type='l')
#mu <- EbayesThresh::ebayesthresh(x,sdev=2)
#lines(mu,col=2,lty=2,lwd=2)

%R -o x
%R -o index_of_trueoutlier
%R -o signal
ebayesthresh = importr('EbayesThresh').ebayesthresh
xhat = np.array(ebayesthresh(FloatVector(x)))
# plt.plot(x)
# plt.plot(xhat)
outlier_true_index = index_of_trueoutlier
outlier_true_value = x[index_of_trueoutlier]

For comparison with the packages, outliers are labeled -1 and inliers 1.

outlier_true_one = signal.copy()
outlier_true_one = list(map(lambda x: -1 if x!=0 else 1,outlier_true_one))
# pd.DataFrame(outlier_true_one).to_csv('orbit_outlier.csv')

Orbit

np.random.seed(777)
pi=np.pi
n=1000
ang=np.linspace(-pi,pi-2*pi/n,n)
r=5+np.cos(np.linspace(0,12*pi,n))
vx=r*np.cos(ang)
vy=r*np.sin(ang)
f1=10*np.sin(np.linspace(0,6*pi,n))
f = f1 + x
_df = pd.DataFrame({'x' : vx, 'y' : vy, 'f' : f})
X = np.array(_df)
# save_data(_df,'Orbit.pkl')

GODE

_Orbit = Orbit(_df)
_Orbit.get_distance()
100%|██████████| 1000/1000 [00:03<00:00, 304.56it/s]
_Orbit.get_weightmatrix(theta=(_Orbit.D[_Orbit.D>0].mean()),kappa=2500) 
_Orbit.fit(sd=15,ref=20)
outlier_simul_one = (_Orbit.df['Residual']**2).tolist()
outlier_simul_one = list(map(lambda x: -1 if x > 13 else 1,outlier_simul_one))
outlier_simul_one.count(1)
950
outlier_simul_one.count(-1)
50
_conf = Conf_matrx(outlier_true_one,outlier_simul_one,tab_orbit)
_conf.conf("GODE")

Accuracy: 0.998
Precision: 0.999
Recall: 0.999
F1 Score: 0.999
one = _conf.tab

LOF\(\star\)

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = LocalOutlierFactor(contamination=0.05)
_conf = Conf_matrx(outlier_true_one,clf.fit_predict(X),tab_orbit)
_conf.conf("LOF (Breunig et al., 2000)")

Accuracy: 0.950
Precision: 0.974
Recall: 0.974
F1 Score: 0.974
two = one.append(_conf.tab)
two
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684

KNN

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = KNN(contamination=0.05)
clf.fit(_df[['x', 'y','f']])
_df['knn_clf'] = clf.labels_
outlier_KNN_one = list(clf.labels_)
outlier_KNN_one = list(map(lambda x: 1 if x==0  else -1,outlier_KNN_one))
_conf = Conf_matrx(outlier_true_one,outlier_KNN_one,tab_orbit)
_conf.conf("kNN (Ramaswamy et al., 2000)")

Accuracy: 0.990
Precision: 0.995
Recall: 0.995
F1 Score: 0.995
three = two.append(_conf.tab)
three
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737

CBLOF

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

import pickle
_df = load_data('Orbit.pkl')
outlier_true_one = pd.read_csv('orbit_outlier.csv').iloc[:,1].tolist()
clf = CBLOF(contamination=0.05,check_estimator=False, random_state=77)
clf.fit(_df[['x', 'y','f']])
_df['CBLOF_Clf'] = clf.labels_
/home/csy/anaconda3/envs/pygsp/lib/python3.10/site-packages/sklearn/cluster/_kmeans.py:1412: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
  super()._check_params_vs_input(X, default_n_init=10)
outlier_CBLOF_one = list(clf.labels_)
outlier_CBLOF_one = list(map(lambda x: 1 if x==0  else -1,outlier_CBLOF_one))
_conf = Conf_matrx(outlier_true_one,outlier_CBLOF_one,tab_orbit)
_conf.conf("CBLOF (He et al., 2003)")
# four = three.append(_conf.tab)

Accuracy: 0.916
Precision: 0.956
Recall: 0.956
F1 Score: 0.956

OCSVM

1. Is random.seed specified? O

2. Is contamination specified? O, via nu

3. Can the number of iterations be specified? X

nu

  • float, default=0.5
  • An upper bound on the fraction of training errors and a lower bound of the fraction of support vectors. Should be in the interval (0, 1]. By default 0.5 will be taken; the sketch below shows what this bound looks like in practice.
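A minimal sketch on toy Gaussian data (not the Orbit set) of what this bound means: with nu=0.05, roughly 5% of the training points come back labeled -1.

import numpy as np
from sklearn import svm

rng = np.random.default_rng(0)
X_toy = rng.normal(size=(1000, 3))   # toy data with three features, like (x, y, f)

clf = svm.OneClassSVM(nu=0.05)       # nu caps the fraction of training errors / flagged points
labels = clf.fit_predict(X_toy)      # +1 = inlier, -1 = outlier
print("flagged fraction:", np.mean(labels == -1))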
np.random.seed(77)
clf = svm.OneClassSVM(nu=0.05)
clf.fit(X)
OneClassSVM(nu=0.05)
outlier_OSVM_one = list(clf.predict(X))
_conf = Conf_matrx(outlier_true_one,outlier_OSVM_one,tab_orbit)
_conf.conf("OCSVM (Sch ̈olkopf et al., 2001)")

Accuracy: 0.951
Precision: 0.975
Recall: 0.974
F1 Score: 0.974
five = three.append(_conf.tab)
five
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197

MCD\(\star\)

1. Is random.seed specified? O, via random_state

2. Is contamination specified? O

3. Can the number of iterations be specified? X

clf = MCD(contamination=0.05 , random_state = 77)
clf.fit(_df[['x', 'y','f']])
_df['MCD_clf'] = clf.labels_
outlier_MCD_one = list(clf.labels_)
outlier_MCD_one = list(map(lambda x: 1 if x==0  else -1,outlier_MCD_one))
_conf = Conf_matrx(outlier_true_one,outlier_MCD_one,tab_orbit)
_conf.conf("MCD (Hardin and Rocke, 2004)")

Accuracy: 0.916
Precision: 0.956
Recall: 0.956
F1 Score: 0.956
six = five.append(_conf.tab)
six
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789

Feature Bagging\(\star\)

1. Is random.seed specified? O, via random_state

2. Is contamination specified? O

3. Can the number of iterations be specified? X

clf = FeatureBagging(contamination=0.05, random_state=77)
clf.fit(_df[['x', 'y','f']])
_df['FeatureBagging_clf'] = clf.labels_
outlier_FeatureBagging_one = list(clf.labels_)
outlier_FeatureBagging_one = list(map(lambda x: 1 if x==0  else -1,outlier_FeatureBagging_one))
_conf = Conf_matrx(outlier_true_one,outlier_FeatureBagging_one,tab_orbit)
_conf.conf("Feature Bagging (Lazarevic and Kumar, 2005)")

Accuracy: 0.958
Precision: 0.978
Recall: 0.978
F1 Score: 0.978
seven = six.append(_conf.tab)
seven
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895

ABOD\(\star\)

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = ABOD(contamination=0.05)
clf.fit(_df[['x', 'y','f']])
_df['ABOD_Clf'] = clf.labels_
outlier_ABOD_one = list(clf.labels_)
outlier_ABOD_one = list(map(lambda x: 1 if x==0  else -1,outlier_ABOD_one))
_conf = Conf_matrx(outlier_true_one,outlier_ABOD_one,tab_orbit)
_conf.conf("ABOD (Kriegel et al., 2008)")

Accuracy: 0.988
Precision: 0.994
Recall: 0.994
F1 Score: 0.994
eight = seven.append(_conf.tab)
eight
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684

IForest\(\star\)

1. Is random.seed specified? O

2. Is contamination specified? X (it cannot be set directly; see the sketch after this list)

3. Can the number of iterations be specified? X
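Since this IForest (the alibi-detect implementation used below) takes a score threshold rather than a contamination rate, a minimal sketch of one possible workaround, assuming alibi-detect's infer_threshold API and toy data, is to calibrate the threshold so that roughly the top 5% of scores are flagged:

import numpy as np
from alibi_detect.od import IForest

rng = np.random.default_rng(0)
X_toy = rng.normal(size=(1000, 3))

od = IForest(threshold=None)                      # no contamination argument available
od.fit(X_toy)
od.infer_threshold(X_toy, threshold_perc=95)      # flag roughly the top 5% of scores
preds = od.predict(X_toy, return_instance_score=True)
print("flagged fraction:", preds['data']['is_outlier'].mean())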

np.random.seed(77)
od = IForest(
    threshold=0.
)
od.fit(_df[['x', 'y','f']])
preds = od.predict(
    _df[['x', 'y','f']],
    return_instance_score=True
)
_df['IF_alibi'] = preds['data']['is_outlier']
outlier_alibi_one = _df['IF_alibi']
outlier_alibi_one = list(map(lambda x: 1 if x==0  else -1,outlier_alibi_one))
_conf = Conf_matrx(outlier_true_one,outlier_alibi_one,tab_orbit)
_conf.conf("Isolation Forest (Liu et al., 2008)")

Accuracy: 0.348
Precision: 0.990
Recall: 0.317
F1 Score: 0.480
nine = eight.append(_conf.tab)
nine
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.348 0.990132 0.316842 0.480064

HBOS\(\star\)

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = HBOS(contamination=0.05)
clf.fit(_df[['x', 'y','f']])
_df['HBOS_clf'] = clf.labels_
outlier_HBOS_one = list(clf.labels_)
outlier_HBOS_one = list(map(lambda x: 1 if x==0  else -1,outlier_HBOS_one))
_conf = Conf_matrx(outlier_true_one,outlier_HBOS_one,tab_orbit)
_conf.conf("HBOS (Goldstein and Dengel, 2012)")

Accuracy: 0.935
Precision: 0.960
Recall: 0.973
F1 Score: 0.966
ten = nine.append(_conf.tab)
ten
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.348 0.990132 0.316842 0.480064
HBOS (Goldstein and Dengel, 2012) 0.935 0.959502 0.972632 0.966022

SOS\(\star\)

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = SOS(contamination=0.05)
clf.fit(_df[['x', 'y','f']])
_df['SOS_clf'] = clf.labels_
outlier_SOS_one = list(clf.labels_)
outlier_SOS_one = list(map(lambda x: 1 if x==0  else -1,outlier_SOS_one))
_conf = Conf_matrx(outlier_true_one,outlier_SOS_one,tab_orbit)
_conf.conf("SOS (Janssens et al., 2012)")

Accuracy: 0.950
Precision: 0.974
Recall: 0.974
F1 Score: 0.974
eleven = ten.append(_conf.tab)
eleven
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.348 0.990132 0.316842 0.480064
HBOS (Goldstein and Dengel, 2012) 0.935 0.959502 0.972632 0.966022
SOS (Janssens et al., 2012) 0.950 0.973684 0.973684 0.973684

SO_GAAL\(\star\)

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = SO_GAAL(contamination=0.05)
clf.fit(_df[['x', 'y','f']])
_df['SO_GAAL_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/keras/optimizers/legacy/gradient_descent.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super().__init__(name, **kwargs)
Epoch 1 of 60 … Epoch 60 of 60 (per-epoch Keras training output omitted; the reported loss rises from roughly 1.09 to 1.81 over the later epochs)
outlier_SO_GAAL_one = list(clf.labels_)
outlier_SO_GAAL_one = list(map(lambda x: 1 if x==0  else -1,outlier_SO_GAAL_one))
_conf = Conf_matrx(outlier_true_one,outlier_SO_GAAL_one,tab_orbit)
_conf.conf("SO-GAAL (Liu et al., 2019)")

Accuracy: 0.950
Precision: 0.950
Recall: 1.000
F1 Score: 0.974
twelve = eleven.append(_conf.tab)
twelve
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.348 0.990132 0.316842 0.480064
HBOS (Goldstein and Dengel, 2012) 0.935 0.959502 0.972632 0.966022
SOS (Janssens et al., 2012) 0.950 0.973684 0.973684 0.973684
SO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359

MO_GAAL\(\star\)

np.random.seed(77)
clf = MO_GAAL(contamination=0.05)
clf.fit(_df[['x', 'y','f']])
_df['MO_GAAL_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/keras/optimizers/legacy/gradient_descent.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super().__init__(name, **kwargs)
Epoch 1 of 60 … Epoch 39 of 60 (per-epoch Keras training output for the first 39 epochs omitted)
Epoch 40 of 60

Testing for epoch 40 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.5896
16/16 [==============================] - 0s 3ms/step - loss: 1.4418
16/16 [==============================] - 0s 2ms/step - loss: 1.5620
16/16 [==============================] - 0s 3ms/step - loss: 1.5701
16/16 [==============================] - 0s 8ms/step - loss: 1.5733
16/16 [==============================] - 0s 3ms/step - loss: 1.5742
16/16 [==============================] - 0s 3ms/step - loss: 1.5744
16/16 [==============================] - 0s 4ms/step - loss: 1.5744
16/16 [==============================] - 0s 3ms/step - loss: 1.5745
16/16 [==============================] - 0s 2ms/step - loss: 1.5745

Testing for epoch 40 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.5973
16/16 [==============================] - 0s 3ms/step - loss: 1.4584
16/16 [==============================] - 0s 2ms/step - loss: 1.5759
16/16 [==============================] - 0s 3ms/step - loss: 1.5840
16/16 [==============================] - 0s 4ms/step - loss: 1.5871
16/16 [==============================] - 0s 2ms/step - loss: 1.5881
16/16 [==============================] - 0s 3ms/step - loss: 1.5883
16/16 [==============================] - 0s 3ms/step - loss: 1.5883
16/16 [==============================] - 0s 3ms/step - loss: 1.5883
16/16 [==============================] - 0s 1ms/step - loss: 1.5883
Epoch 41 of 60

Testing for epoch 41 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.5964
16/16 [==============================] - 0s 3ms/step - loss: 1.4481
16/16 [==============================] - 0s 4ms/step - loss: 1.5639
16/16 [==============================] - 0s 2ms/step - loss: 1.5718
16/16 [==============================] - 0s 5ms/step - loss: 1.5749
16/16 [==============================] - 0s 2ms/step - loss: 1.5758
16/16 [==============================] - 0s 5ms/step - loss: 1.5760
16/16 [==============================] - 0s 4ms/step - loss: 1.5760
16/16 [==============================] - 0s 2ms/step - loss: 1.5760
16/16 [==============================] - 0s 2ms/step - loss: 1.5760

Testing for epoch 41 index 2:
32/32 [==============================] - 0s 5ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.6039
16/16 [==============================] - 0s 2ms/step - loss: 1.4573
16/16 [==============================] - 0s 3ms/step - loss: 1.5728
16/16 [==============================] - 0s 2ms/step - loss: 1.5807
16/16 [==============================] - 0s 6ms/step - loss: 1.5837
16/16 [==============================] - 0s 2ms/step - loss: 1.5846
16/16 [==============================] - 0s 4ms/step - loss: 1.5848
16/16 [==============================] - 0s 2ms/step - loss: 1.5848
16/16 [==============================] - 0s 4ms/step - loss: 1.5848
16/16 [==============================] - 0s 1ms/step - loss: 1.5848
Epoch 42 of 60

Testing for epoch 42 index 1:
32/32 [==============================] - 0s 5ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.6053
16/16 [==============================] - 0s 4ms/step - loss: 1.4831
16/16 [==============================] - 0s 5ms/step - loss: 1.6017
16/16 [==============================] - 0s 5ms/step - loss: 1.6098
16/16 [==============================] - 0s 1ms/step - loss: 1.6128
16/16 [==============================] - 0s 2ms/step - loss: 1.6137
16/16 [==============================] - 0s 3ms/step - loss: 1.6139
16/16 [==============================] - 0s 2ms/step - loss: 1.6139
16/16 [==============================] - 0s 1ms/step - loss: 1.6140
16/16 [==============================] - 0s 5ms/step - loss: 1.6140

Testing for epoch 42 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6125
16/16 [==============================] - 0s 5ms/step - loss: 1.4932
16/16 [==============================] - 0s 4ms/step - loss: 1.6117
16/16 [==============================] - 0s 1ms/step - loss: 1.6197
16/16 [==============================] - 0s 2ms/step - loss: 1.6225
16/16 [==============================] - 0s 4ms/step - loss: 1.6235
16/16 [==============================] - 0s 3ms/step - loss: 1.6237
16/16 [==============================] - 0s 3ms/step - loss: 1.6237
16/16 [==============================] - 0s 2ms/step - loss: 1.6237
16/16 [==============================] - 0s 2ms/step - loss: 1.6237
Epoch 43 of 60

Testing for epoch 43 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6134
16/16 [==============================] - 0s 3ms/step - loss: 1.5079
16/16 [==============================] - 0s 4ms/step - loss: 1.6281
16/16 [==============================] - 0s 5ms/step - loss: 1.6361
16/16 [==============================] - 0s 2ms/step - loss: 1.6389
16/16 [==============================] - 0s 3ms/step - loss: 1.6398
16/16 [==============================] - 0s 2ms/step - loss: 1.6400
16/16 [==============================] - 0s 3ms/step - loss: 1.6401
16/16 [==============================] - 0s 1ms/step - loss: 1.6401
16/16 [==============================] - 0s 3ms/step - loss: 1.6401

Testing for epoch 43 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.6192
16/16 [==============================] - 0s 1ms/step - loss: 1.5022
16/16 [==============================] - 0s 2ms/step - loss: 1.6202
16/16 [==============================] - 0s 3ms/step - loss: 1.6280
16/16 [==============================] - 0s 2ms/step - loss: 1.6307
16/16 [==============================] - 0s 6ms/step - loss: 1.6316
16/16 [==============================] - 0s 2ms/step - loss: 1.6318
16/16 [==============================] - 0s 5ms/step - loss: 1.6318
16/16 [==============================] - 0s 2ms/step - loss: 1.6318
16/16 [==============================] - 0s 2ms/step - loss: 1.6318
Epoch 44 of 60

Testing for epoch 44 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6209
16/16 [==============================] - 0s 3ms/step - loss: 1.5241
16/16 [==============================] - 0s 2ms/step - loss: 1.6445
16/16 [==============================] - 0s 4ms/step - loss: 1.6524
16/16 [==============================] - 0s 1ms/step - loss: 1.6551
16/16 [==============================] - 0s 2ms/step - loss: 1.6561
16/16 [==============================] - 0s 1ms/step - loss: 1.6563
16/16 [==============================] - 0s 4ms/step - loss: 1.6563
16/16 [==============================] - 0s 4ms/step - loss: 1.6563
16/16 [==============================] - 0s 3ms/step - loss: 1.6563

Testing for epoch 44 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.6285
16/16 [==============================] - 0s 3ms/step - loss: 1.5348
16/16 [==============================] - 0s 3ms/step - loss: 1.6553
16/16 [==============================] - 0s 2ms/step - loss: 1.6631
16/16 [==============================] - 0s 4ms/step - loss: 1.6657
16/16 [==============================] - 0s 3ms/step - loss: 1.6666
16/16 [==============================] - 0s 3ms/step - loss: 1.6668
16/16 [==============================] - 0s 2ms/step - loss: 1.6669
16/16 [==============================] - 0s 2ms/step - loss: 1.6669
16/16 [==============================] - 0s 4ms/step - loss: 1.6669
Epoch 45 of 60

Testing for epoch 45 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6294
16/16 [==============================] - 0s 2ms/step - loss: 1.5465
16/16 [==============================] - 0s 3ms/step - loss: 1.6682
16/16 [==============================] - 0s 4ms/step - loss: 1.6759
16/16 [==============================] - 0s 1ms/step - loss: 1.6785
16/16 [==============================] - 0s 6ms/step - loss: 1.6795
16/16 [==============================] - 0s 2ms/step - loss: 1.6797
16/16 [==============================] - 0s 2ms/step - loss: 1.6797
16/16 [==============================] - 0s 2ms/step - loss: 1.6797
16/16 [==============================] - 0s 2ms/step - loss: 1.6797

Testing for epoch 45 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6357
16/16 [==============================] - 0s 3ms/step - loss: 1.5459
16/16 [==============================] - 0s 1ms/step - loss: 1.6663
16/16 [==============================] - 0s 3ms/step - loss: 1.6737
16/16 [==============================] - 0s 4ms/step - loss: 1.6763
16/16 [==============================] - 0s 2ms/step - loss: 1.6772
16/16 [==============================] - 0s 6ms/step - loss: 1.6774
16/16 [==============================] - 0s 3ms/step - loss: 1.6774
16/16 [==============================] - 0s 2ms/step - loss: 1.6774
16/16 [==============================] - 0s 4ms/step - loss: 1.6774
Epoch 46 of 60

Testing for epoch 46 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6386
16/16 [==============================] - 0s 2ms/step - loss: 1.5682
16/16 [==============================] - 0s 3ms/step - loss: 1.6908
16/16 [==============================] - 0s 5ms/step - loss: 1.6984
16/16 [==============================] - 0s 3ms/step - loss: 1.7009
16/16 [==============================] - 0s 3ms/step - loss: 1.7019
16/16 [==============================] - 0s 4ms/step - loss: 1.7021
16/16 [==============================] - 0s 6ms/step - loss: 1.7021
16/16 [==============================] - 0s 3ms/step - loss: 1.7021
16/16 [==============================] - 0s 3ms/step - loss: 1.7021

Testing for epoch 46 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.6447
16/16 [==============================] - 0s 2ms/step - loss: 1.5654
16/16 [==============================] - 0s 2ms/step - loss: 1.6862
16/16 [==============================] - 0s 4ms/step - loss: 1.6935
16/16 [==============================] - 0s 2ms/step - loss: 1.6960
16/16 [==============================] - 0s 2ms/step - loss: 1.6970
16/16 [==============================] - 0s 3ms/step - loss: 1.6971
16/16 [==============================] - 0s 2ms/step - loss: 1.6972
16/16 [==============================] - 0s 3ms/step - loss: 1.6972
16/16 [==============================] - 0s 2ms/step - loss: 1.6972
Epoch 47 of 60

Testing for epoch 47 index 1:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6475
16/16 [==============================] - 0s 2ms/step - loss: 1.5843
16/16 [==============================] - 0s 3ms/step - loss: 1.7070
16/16 [==============================] - 0s 3ms/step - loss: 1.7145
16/16 [==============================] - 0s 4ms/step - loss: 1.7169
16/16 [==============================] - 0s 3ms/step - loss: 1.7179
16/16 [==============================] - 0s 2ms/step - loss: 1.7181
16/16 [==============================] - 0s 2ms/step - loss: 1.7181
16/16 [==============================] - 0s 2ms/step - loss: 1.7181
16/16 [==============================] - 0s 2ms/step - loss: 1.7181

Testing for epoch 47 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.6546
16/16 [==============================] - 0s 1ms/step - loss: 1.5858
16/16 [==============================] - 0s 3ms/step - loss: 1.7068
16/16 [==============================] - 0s 2ms/step - loss: 1.7141
16/16 [==============================] - 0s 2ms/step - loss: 1.7165
16/16 [==============================] - 0s 3ms/step - loss: 1.7174
16/16 [==============================] - 0s 3ms/step - loss: 1.7176
16/16 [==============================] - 0s 2ms/step - loss: 1.7176
16/16 [==============================] - 0s 2ms/step - loss: 1.7176
16/16 [==============================] - 0s 2ms/step - loss: 1.7176
Epoch 48 of 60

Testing for epoch 48 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.6556
16/16 [==============================] - 0s 2ms/step - loss: 1.5897
16/16 [==============================] - 0s 2ms/step - loss: 1.7107
16/16 [==============================] - 0s 2ms/step - loss: 1.7179
16/16 [==============================] - 0s 2ms/step - loss: 1.7203
16/16 [==============================] - 0s 5ms/step - loss: 1.7212
16/16 [==============================] - 0s 2ms/step - loss: 1.7214
16/16 [==============================] - 0s 3ms/step - loss: 1.7214
16/16 [==============================] - 0s 2ms/step - loss: 1.7214
16/16 [==============================] - 0s 2ms/step - loss: 1.7214

Testing for epoch 48 index 2:
32/32 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6635
16/16 [==============================] - 0s 2ms/step - loss: 1.5928
16/16 [==============================] - 0s 5ms/step - loss: 1.7123
16/16 [==============================] - 0s 2ms/step - loss: 1.7194
16/16 [==============================] - 0s 5ms/step - loss: 1.7216
16/16 [==============================] - 0s 2ms/step - loss: 1.7225
16/16 [==============================] - 0s 2ms/step - loss: 1.7227
16/16 [==============================] - 0s 2ms/step - loss: 1.7227
16/16 [==============================] - 0s 2ms/step - loss: 1.7227
16/16 [==============================] - 0s 2ms/step - loss: 1.7227
Epoch 49 of 60

Testing for epoch 49 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.6683
16/16 [==============================] - 0s 2ms/step - loss: 1.6206
16/16 [==============================] - 0s 1ms/step - loss: 1.7425
16/16 [==============================] - 0s 4ms/step - loss: 1.7495
16/16 [==============================] - 0s 3ms/step - loss: 1.7518
16/16 [==============================] - 0s 2ms/step - loss: 1.7528
16/16 [==============================] - 0s 6ms/step - loss: 1.7529
16/16 [==============================] - 0s 2ms/step - loss: 1.7530
16/16 [==============================] - 0s 2ms/step - loss: 1.7530
16/16 [==============================] - 0s 2ms/step - loss: 1.7530

Testing for epoch 49 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.6756
16/16 [==============================] - 0s 3ms/step - loss: 1.6212
16/16 [==============================] - 0s 2ms/step - loss: 1.7406
16/16 [==============================] - 0s 3ms/step - loss: 1.7474
16/16 [==============================] - 0s 6ms/step - loss: 1.7495
16/16 [==============================] - 0s 2ms/step - loss: 1.7505
16/16 [==============================] - 0s 3ms/step - loss: 1.7506
16/16 [==============================] - 0s 3ms/step - loss: 1.7507
16/16 [==============================] - 0s 2ms/step - loss: 1.7507
16/16 [==============================] - 0s 2ms/step - loss: 1.7507
Epoch 50 of 60

Testing for epoch 50 index 1:
32/32 [==============================] - 0s 5ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.6819
16/16 [==============================] - 0s 2ms/step - loss: 1.6547
16/16 [==============================] - 0s 4ms/step - loss: 1.7767
16/16 [==============================] - 0s 1ms/step - loss: 1.7836
16/16 [==============================] - 0s 1ms/step - loss: 1.7858
16/16 [==============================] - 0s 1ms/step - loss: 1.7868
16/16 [==============================] - 0s 3ms/step - loss: 1.7870
16/16 [==============================] - 0s 1ms/step - loss: 1.7870
16/16 [==============================] - 0s 1ms/step - loss: 1.7870
16/16 [==============================] - 0s 2ms/step - loss: 1.7870

Testing for epoch 50 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.6896
16/16 [==============================] - 0s 5ms/step - loss: 1.6543
16/16 [==============================] - 0s 1ms/step - loss: 1.7729
16/16 [==============================] - 0s 2ms/step - loss: 1.7796
16/16 [==============================] - 0s 2ms/step - loss: 1.7816
16/16 [==============================] - 0s 2ms/step - loss: 1.7826
16/16 [==============================] - 0s 8ms/step - loss: 1.7828
16/16 [==============================] - 0s 3ms/step - loss: 1.7828
16/16 [==============================] - 0s 1ms/step - loss: 1.7828
16/16 [==============================] - 0s 2ms/step - loss: 1.7828
Epoch 51 of 60

Testing for epoch 51 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 6ms/step - loss: 0.6887
16/16 [==============================] - 0s 2ms/step - loss: 1.6473
16/16 [==============================] - 0s 1ms/step - loss: 1.7643
16/16 [==============================] - 0s 2ms/step - loss: 1.7708
16/16 [==============================] - 0s 1ms/step - loss: 1.7728
16/16 [==============================] - 0s 2ms/step - loss: 1.7738
16/16 [==============================] - 0s 2ms/step - loss: 1.7740
16/16 [==============================] - 0s 2ms/step - loss: 1.7740
16/16 [==============================] - 0s 1ms/step - loss: 1.7740
16/16 [==============================] - 0s 1ms/step - loss: 1.7740

Testing for epoch 51 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.7000
16/16 [==============================] - 0s 4ms/step - loss: 1.6676
16/16 [==============================] - 0s 2ms/step - loss: 1.7852
16/16 [==============================] - 0s 5ms/step - loss: 1.7918
16/16 [==============================] - 0s 2ms/step - loss: 1.7937
16/16 [==============================] - 0s 2ms/step - loss: 1.7947
16/16 [==============================] - 0s 2ms/step - loss: 1.7949
16/16 [==============================] - 0s 5ms/step - loss: 1.7949
16/16 [==============================] - 0s 4ms/step - loss: 1.7949
16/16 [==============================] - 0s 2ms/step - loss: 1.7949
Epoch 52 of 60

Testing for epoch 52 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.7028
16/16 [==============================] - 0s 3ms/step - loss: 1.6785
16/16 [==============================] - 0s 5ms/step - loss: 1.7968
16/16 [==============================] - 0s 2ms/step - loss: 1.8034
16/16 [==============================] - 0s 1ms/step - loss: 1.8053
16/16 [==============================] - 0s 3ms/step - loss: 1.8063
16/16 [==============================] - 0s 1ms/step - loss: 1.8064
16/16 [==============================] - 0s 2ms/step - loss: 1.8065
16/16 [==============================] - 0s 4ms/step - loss: 1.8065
16/16 [==============================] - 0s 5ms/step - loss: 1.8065

Testing for epoch 52 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7118
16/16 [==============================] - 0s 2ms/step - loss: 1.6876
16/16 [==============================] - 0s 2ms/step - loss: 1.8054
16/16 [==============================] - 0s 1ms/step - loss: 1.8119
16/16 [==============================] - 0s 4ms/step - loss: 1.8137
16/16 [==============================] - 0s 5ms/step - loss: 1.8148
16/16 [==============================] - 0s 2ms/step - loss: 1.8149
16/16 [==============================] - 0s 3ms/step - loss: 1.8150
16/16 [==============================] - 0s 2ms/step - loss: 1.8150
16/16 [==============================] - 0s 4ms/step - loss: 1.8150
Epoch 53 of 60

Testing for epoch 53 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.7128
16/16 [==============================] - 0s 2ms/step - loss: 1.6905
16/16 [==============================] - 0s 2ms/step - loss: 1.8079
16/16 [==============================] - 0s 2ms/step - loss: 1.8143
16/16 [==============================] - 0s 1ms/step - loss: 1.8161
16/16 [==============================] - 0s 3ms/step - loss: 1.8171
16/16 [==============================] - 0s 1ms/step - loss: 1.8173
16/16 [==============================] - 0s 2ms/step - loss: 1.8173
16/16 [==============================] - 0s 1ms/step - loss: 1.8173
16/16 [==============================] - 0s 2ms/step - loss: 1.8173

Testing for epoch 53 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7235
16/16 [==============================] - 0s 4ms/step - loss: 1.7051
16/16 [==============================] - 0s 3ms/step - loss: 1.8214
16/16 [==============================] - 0s 4ms/step - loss: 1.8278
16/16 [==============================] - 0s 1ms/step - loss: 1.8296
16/16 [==============================] - 0s 2ms/step - loss: 1.8306
16/16 [==============================] - 0s 1ms/step - loss: 1.8307
16/16 [==============================] - 0s 3ms/step - loss: 1.8308
16/16 [==============================] - 0s 2ms/step - loss: 1.8308
16/16 [==============================] - 0s 2ms/step - loss: 1.8308
Epoch 54 of 60

Testing for epoch 54 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.7304
16/16 [==============================] - 0s 2ms/step - loss: 1.7378
16/16 [==============================] - 0s 2ms/step - loss: 1.8562
16/16 [==============================] - 0s 2ms/step - loss: 1.8628
16/16 [==============================] - 0s 2ms/step - loss: 1.8646
16/16 [==============================] - 0s 2ms/step - loss: 1.8656
16/16 [==============================] - 0s 2ms/step - loss: 1.8658
16/16 [==============================] - 0s 4ms/step - loss: 1.8658
16/16 [==============================] - 0s 2ms/step - loss: 1.8658
16/16 [==============================] - 0s 3ms/step - loss: 1.8658

Testing for epoch 54 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7352
16/16 [==============================] - 0s 1ms/step - loss: 1.7225
16/16 [==============================] - 0s 1ms/step - loss: 1.8369
16/16 [==============================] - 0s 2ms/step - loss: 1.8432
16/16 [==============================] - 0s 1ms/step - loss: 1.8449
16/16 [==============================] - 0s 5ms/step - loss: 1.8459
16/16 [==============================] - 0s 2ms/step - loss: 1.8461
16/16 [==============================] - 0s 2ms/step - loss: 1.8461
16/16 [==============================] - 0s 2ms/step - loss: 1.8461
16/16 [==============================] - 0s 1ms/step - loss: 1.8461
Epoch 55 of 60

Testing for epoch 55 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.7407
16/16 [==============================] - 0s 3ms/step - loss: 1.7460
16/16 [==============================] - 0s 2ms/step - loss: 1.8617
16/16 [==============================] - 0s 1ms/step - loss: 1.8680
16/16 [==============================] - 0s 2ms/step - loss: 1.8698
16/16 [==============================] - 0s 2ms/step - loss: 1.8708
16/16 [==============================] - 0s 2ms/step - loss: 1.8709
16/16 [==============================] - 0s 2ms/step - loss: 1.8709
16/16 [==============================] - 0s 2ms/step - loss: 1.8710
16/16 [==============================] - 0s 1ms/step - loss: 1.8710

Testing for epoch 55 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7499
16/16 [==============================] - 0s 1ms/step - loss: 1.7524
16/16 [==============================] - 0s 3ms/step - loss: 1.8662
16/16 [==============================] - 0s 2ms/step - loss: 1.8724
16/16 [==============================] - 0s 3ms/step - loss: 1.8741
16/16 [==============================] - 0s 3ms/step - loss: 1.8751
16/16 [==============================] - 0s 2ms/step - loss: 1.8752
16/16 [==============================] - 0s 2ms/step - loss: 1.8752
16/16 [==============================] - 0s 2ms/step - loss: 1.8753
16/16 [==============================] - 0s 3ms/step - loss: 1.8753
Epoch 56 of 60

Testing for epoch 56 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7520
16/16 [==============================] - 0s 3ms/step - loss: 1.7601
16/16 [==============================] - 0s 3ms/step - loss: 1.8736
16/16 [==============================] - 0s 2ms/step - loss: 1.8798
16/16 [==============================] - 0s 3ms/step - loss: 1.8814
16/16 [==============================] - 0s 2ms/step - loss: 1.8824
16/16 [==============================] - 0s 2ms/step - loss: 1.8825
16/16 [==============================] - 0s 2ms/step - loss: 1.8826
16/16 [==============================] - 0s 2ms/step - loss: 1.8826
16/16 [==============================] - 0s 2ms/step - loss: 1.8826

Testing for epoch 56 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7582
16/16 [==============================] - 0s 2ms/step - loss: 1.7561
16/16 [==============================] - 0s 2ms/step - loss: 1.8669
16/16 [==============================] - 0s 2ms/step - loss: 1.8729
16/16 [==============================] - 0s 2ms/step - loss: 1.8745
16/16 [==============================] - 0s 3ms/step - loss: 1.8755
16/16 [==============================] - 0s 2ms/step - loss: 1.8756
16/16 [==============================] - 0s 3ms/step - loss: 1.8756
16/16 [==============================] - 0s 2ms/step - loss: 1.8756
16/16 [==============================] - 0s 2ms/step - loss: 1.8756
Epoch 57 of 60

Testing for epoch 57 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7649
16/16 [==============================] - 0s 3ms/step - loss: 1.7843
16/16 [==============================] - 0s 2ms/step - loss: 1.8968
16/16 [==============================] - 0s 3ms/step - loss: 1.9030
16/16 [==============================] - 0s 2ms/step - loss: 1.9046
16/16 [==============================] - 0s 3ms/step - loss: 1.9055
16/16 [==============================] - 0s 2ms/step - loss: 1.9057
16/16 [==============================] - 0s 2ms/step - loss: 1.9057
16/16 [==============================] - 0s 2ms/step - loss: 1.9057
16/16 [==============================] - 0s 2ms/step - loss: 1.9057

Testing for epoch 57 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7728
16/16 [==============================] - 0s 2ms/step - loss: 1.7873
16/16 [==============================] - 0s 2ms/step - loss: 1.8989
16/16 [==============================] - 0s 3ms/step - loss: 1.9050
16/16 [==============================] - 0s 1ms/step - loss: 1.9066
16/16 [==============================] - 0s 1ms/step - loss: 1.9076
16/16 [==============================] - 0s 5ms/step - loss: 1.9077
16/16 [==============================] - 0s 1ms/step - loss: 1.9077
16/16 [==============================] - 0s 2ms/step - loss: 1.9077
16/16 [==============================] - 0s 2ms/step - loss: 1.9077
Epoch 58 of 60

Testing for epoch 58 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 7ms/step - loss: 0.7734
16/16 [==============================] - 0s 7ms/step - loss: 1.7901
16/16 [==============================] - 0s 2ms/step - loss: 1.9017
16/16 [==============================] - 0s 2ms/step - loss: 1.9077
16/16 [==============================] - 0s 5ms/step - loss: 1.9092
16/16 [==============================] - 0s 2ms/step - loss: 1.9102
16/16 [==============================] - 0s 2ms/step - loss: 1.9103
16/16 [==============================] - 0s 1ms/step - loss: 1.9104
16/16 [==============================] - 0s 4ms/step - loss: 1.9104
16/16 [==============================] - 0s 1ms/step - loss: 1.9104

Testing for epoch 58 index 2:
32/32 [==============================] - 0s 983us/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7871
16/16 [==============================] - 0s 1ms/step - loss: 1.8161
16/16 [==============================] - 0s 1ms/step - loss: 1.9284
16/16 [==============================] - 0s 2ms/step - loss: 1.9346
16/16 [==============================] - 0s 2ms/step - loss: 1.9361
16/16 [==============================] - 0s 2ms/step - loss: 1.9371
16/16 [==============================] - 0s 2ms/step - loss: 1.9372
16/16 [==============================] - 0s 1ms/step - loss: 1.9373
16/16 [==============================] - 0s 1ms/step - loss: 1.9373
16/16 [==============================] - 0s 1ms/step - loss: 1.9373
Epoch 59 of 60

Testing for epoch 59 index 1:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7876
16/16 [==============================] - 0s 2ms/step - loss: 1.8171
16/16 [==============================] - 0s 3ms/step - loss: 1.9287
16/16 [==============================] - 0s 2ms/step - loss: 1.9349
16/16 [==============================] - 0s 3ms/step - loss: 1.9363
16/16 [==============================] - 0s 4ms/step - loss: 1.9373
16/16 [==============================] - 0s 1ms/step - loss: 1.9374
16/16 [==============================] - 0s 2ms/step - loss: 1.9375
16/16 [==============================] - 0s 2ms/step - loss: 1.9375
16/16 [==============================] - 0s 1ms/step - loss: 1.9375

Testing for epoch 59 index 2:
32/32 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.7968
16/16 [==============================] - 0s 5ms/step - loss: 1.8233
16/16 [==============================] - 0s 3ms/step - loss: 1.9329
16/16 [==============================] - 0s 1ms/step - loss: 1.9391
16/16 [==============================] - 0s 1ms/step - loss: 1.9405
16/16 [==============================] - 0s 1ms/step - loss: 1.9415
16/16 [==============================] - 0s 1ms/step - loss: 1.9416
16/16 [==============================] - 0s 3ms/step - loss: 1.9416
16/16 [==============================] - 0s 5ms/step - loss: 1.9416
16/16 [==============================] - 0s 1ms/step - loss: 1.9416
Epoch 60 of 60

Testing for epoch 60 index 1:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 7ms/step - loss: 0.7940
16/16 [==============================] - 0s 4ms/step - loss: 1.8102
16/16 [==============================] - 0s 2ms/step - loss: 1.9179
16/16 [==============================] - 0s 1ms/step - loss: 1.9240
16/16 [==============================] - 0s 3ms/step - loss: 1.9254
16/16 [==============================] - 0s 1ms/step - loss: 1.9263
16/16 [==============================] - 0s 2ms/step - loss: 1.9264
16/16 [==============================] - 0s 2ms/step - loss: 1.9264
16/16 [==============================] - 0s 1ms/step - loss: 1.9264
16/16 [==============================] - 0s 1ms/step - loss: 1.9264

Testing for epoch 60 index 2:
32/32 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.8151
16/16 [==============================] - 0s 1ms/step - loss: 1.8641
16/16 [==============================] - 0s 2ms/step - loss: 1.9738
16/16 [==============================] - 0s 2ms/step - loss: 1.9801
16/16 [==============================] - 0s 1ms/step - loss: 1.9815
16/16 [==============================] - 0s 4ms/step - loss: 1.9825
16/16 [==============================] - 0s 1ms/step - loss: 1.9826
16/16 [==============================] - 0s 1ms/step - loss: 1.9826
16/16 [==============================] - 0s 2ms/step - loss: 1.9826
16/16 [==============================] - 0s 2ms/step - loss: 1.9826
32/32 [==============================] - 0s 2ms/step
outlier_MO_GAAL_one = list(clf.labels_)
# map PyOD labels (0 = inlier, 1 = outlier) to the 1/-1 convention used in the comparison tables
outlier_MO_GAAL_one = list(map(lambda x: 1 if x==0  else -1,outlier_MO_GAAL_one))
_conf = Conf_matrx(outlier_true_one,outlier_MO_GAAL_one,tab_orbit)
_conf.conf("MO-GAAL (Liu et al., 2019)")

Accuracy: 0.950
Precision: 0.950
Recall: 1.000
F1 Score: 0.974
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
thirteen = twelve.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  thirteen = twelve.append(_conf.tab)
thirteen
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.348 0.990132 0.316842 0.480064
HBOS (Goldstein and Dengel, 2012) 0.935 0.959502 0.972632 0.966022
SOS (Janssens et al., 2012) 0.950 0.973684 0.973684 0.973684
SO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359
MO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359

LSCP\(\star\)

1. Is random.seed specified? O (random_state)

2. Is contamination specified? O

3. Can the number of iterations be specified? X

detectors = [KNN(), LOF(), OCSVM()]
clf = LSCP(detectors,contamination=0.05, random_state=77)
clf.fit(_df[['x', 'y','f']])
_df['LSCP_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/pyod/models/lscp.py:382: UserWarning: The number of histogram bins is greater than the number of classifiers, reducing n_bins to n_clf.
  warnings.warn(
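The UserWarning above fires because LSCP's default n_bins (10) is larger than the three base detectors supplied. On a future run it could be silenced by passing n_bins explicitly; a minimal sketch, assuming the installed pyod version exposes the n_bins argument (clf_nb is a hypothetical name):

# matching n_bins to the number of base detectors avoids the n_bins > n_clf warning
clf_nb = LSCP([KNN(), LOF(), OCSVM()], n_bins=3, contamination=0.05, random_state=77)
clf_nb.fit(_df[['x', 'y', 'f']])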
outlier_LSCP_one = list(clf.labels_)
outlier_LSCP_one = list(map(lambda x: 1 if x==0  else -1,outlier_LSCP_one))
_conf = Conf_matrx(outlier_true_one,outlier_LSCP_one,tab_orbit)
_conf.conf("LSCP (Zhao et al., 2019)")

Accuracy: 0.988
Precision: 0.994
Recall: 0.994
F1 Score: 0.994
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
fourteen = thirteen.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  fourteen = thirteen.append(_conf.tab)
fourteen
Accuracy Precision Recall F1
GODE 0.998 0.998947 0.998947 0.998947
LOF (Breunig et al., 2000) 0.950 0.973684 0.973684 0.973684
kNN (Ramaswamy et al., 2000) 0.990 0.994737 0.994737 0.994737
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.974710 0.973684 0.974197
MCD (Hardin and Rocke, 2004) 0.916 0.955789 0.955789 0.955789
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.977895 0.977895 0.977895
ABOD (Kriegel et al., 2008) 0.988 0.993684 0.993684 0.993684
Isolation Forest (Liu et al., 2008) 0.348 0.990132 0.316842 0.480064
HBOS (Goldstein and Dengel, 2012) 0.935 0.959502 0.972632 0.966022
SOS (Janssens et al., 2012) 0.950 0.973684 0.973684 0.973684
SO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359
MO-GAAL (Liu et al., 2019) 0.950 0.950000 1.000000 0.974359
LSCP (Zhao et al., 2019) 0.988 0.993684 0.993684 0.993684

Orbit Result

round(fourteen,3)
Accuracy Precision Recall F1
GODE 0.998 0.999 0.999 0.999
LOF (Breunig et al., 2000) 0.950 0.974 0.974 0.974
kNN (Ramaswamy et al., 2000) 0.990 0.995 0.995 0.995
OCSVM (Sch ̈olkopf et al., 2001) 0.951 0.975 0.974 0.974
MCD (Hardin and Rocke, 2004) 0.916 0.956 0.956 0.956
Feature Bagging (Lazarevic and Kumar, 2005) 0.958 0.978 0.978 0.978
ABOD (Kriegel et al., 2008) 0.988 0.994 0.994 0.994
Isolation Forest (Liu et al., 2008) 0.348 0.990 0.317 0.480
HBOS (Goldstein and Dengel, 2012) 0.935 0.960 0.973 0.966
SOS (Janssens et al., 2012) 0.950 0.974 0.974 0.974
SO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
MO-GAAL (Liu et al., 2019) 0.950 0.950 1.000 0.974
LSCP (Zhao et al., 2019) 0.988 0.994 0.994 0.994
orbit_rst = fourteen

Bunny


For saving the Bunny data

from pygsp import graphs, filters, plotting, utils
def save_data(data_dict,fname):
    with open(fname,'wb') as outfile:
        pickle.dump(data_dict,outfile)
import numpy as np
G = graphs.Bunny()
n = G.N
g = filters.Heat(G, tau=75) 
n=2503
normal = np.random.randn(n)
unif = np.concatenate([np.random.uniform(low=3,high=7,size=60), np.random.uniform(low=-7,high=-3,size=60),np.zeros(n-120)]); np.random.shuffle(unif)
noise = normal + unif
index_of_trueoutlier2 = np.where(unif!=0)
f = np.zeros(n)
f[1000] = -3234
f = g.filter(f, method='chebyshev') 
2023-07-04 17:37:32,017:[WARNING](pygsp.graphs.graph.lmax): The largest eigenvalue G.lmax is not available, we need to estimate it. Explicitly call G.estimate_lmax() or G.compute_fourier_basis() once beforehand to suppress the warning.
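As the warning itself suggests, the largest eigenvalue can be estimated once up front so that later g.filter calls run quietly (a one-line fix, using the same G as above):

# estimate G.lmax beforehand; the Chebyshev filtering then no longer warns
G.estimate_lmax()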
G.coords.shape
(2503, 3)
_W = G.W.toarray()
_x = G.coords[:,0]
_y = G.coords[:,1]
_z = -G.coords[:,2]
import pandas as pd
_df = pd.DataFrame({'x':_x,'y':_y,'z':_z})
import pickle
_df = {'W':_W,'x':_x,'y':_y,'z':_z, 'fnoise':f+noise,'f' : f, 'noise': noise,'unif':unif,'index_of_trueoutlier2':index_of_trueoutlier2}
save_data(_df,'Bunny.pkl')
_df
{'W': array([[0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.],
        ...,
        [0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.]]),
 'x': array([ 0.26815193, -0.58456893, -0.02730755, ...,  0.15397547,
        -0.45056488, -0.29405249]),
 'y': array([ 0.39314334,  0.63468595,  0.33280949, ...,  0.80205526,
         0.6207154 , -0.40187451]),
 'z': array([-0.13834514, -0.22438843,  0.08658215, ...,  0.33698514,
         0.58353051, -0.08647485]),
 'fnoise': array([-1.63569131,  0.49423926, -1.04026277, ..., -1.0694093 ,
        -0.24395499,  0.41729667]),
 'f': array([-1.54422488, -0.03596483, -0.93972715, ..., -0.01924028,
        -0.02470869, -0.26266752]),
 'noise': array([-0.09146643,  0.53020409, -0.10053563, ..., -1.05016902,
        -0.2192463 ,  0.67996419]),
 'unif': array([0., 0., 0., ..., 0., 0., 0.]),
 'index_of_trueoutlier2': (array([  15,   33,   34,   36,   45,   52,   61,  153,  227,  228,  235,
          240,  249,  267,  270,  273,  291,  313,  333,  353,  375,  389,
          397,  402,  439,  440,  447,  449,  456,  457,  472,  509,  564,
          569,  589,  638,  700,  713,  714,  732,  749,  814,  836,  851,
          858,  888,  910,  927,  934,  948,  953,  972,  986, 1002, 1041,
         1073, 1087, 1090, 1139, 1182, 1227, 1270, 1276, 1344, 1347, 1459,
         1461, 1467, 1499, 1500, 1512, 1515, 1544, 1562, 1610, 1637, 1640,
         1649, 1665, 1695, 1699, 1737, 1740, 1783, 1788, 1808, 1857, 1868,
         1882, 1928, 1941, 1954, 1973, 2014, 2017, 2020, 2065, 2108, 2115,
         2135, 2153, 2191, 2198, 2210, 2219, 2241, 2274, 2278, 2283, 2292,
         2314, 2328, 2340, 2341, 2357, 2387, 2399, 2477, 2485, 2487]),)}

def load_data(fname):
    with open(fname, 'rb') as outfile:
        data_dict = pickle.load(outfile)
    return data_dict
_df1 = load_data('Bunny.pkl')
_df = pd.DataFrame({'x': _df1['x'],'y':_df1['y'],'z':_df1['z'],'fnoise':_df1['fnoise'],'f':_df1['f'],'noise':_df1['noise']})
unif = _df1['unif']
_df1['index_of_trueoutlier2']
(array([  15,   33,   34,   36,   45,   52,   61,  153,  227,  228,  235,
         240,  249,  267,  270,  273,  291,  313,  333,  353,  375,  389,
         397,  402,  439,  440,  447,  449,  456,  457,  472,  509,  564,
         569,  589,  638,  700,  713,  714,  732,  749,  814,  836,  851,
         858,  888,  910,  927,  934,  948,  953,  972,  986, 1002, 1041,
        1073, 1087, 1090, 1139, 1182, 1227, 1270, 1276, 1344, 1347, 1459,
        1461, 1467, 1499, 1500, 1512, 1515, 1544, 1562, 1610, 1637, 1640,
        1649, 1665, 1695, 1699, 1737, 1740, 1783, 1788, 1808, 1857, 1868,
        1882, 1928, 1941, 1954, 1973, 2014, 2017, 2020, 2065, 2108, 2115,
        2135, 2153, 2191, 2198, 2210, 2219, 2241, 2274, 2278, 2283, 2292,
        2314, 2328, 2340, 2341, 2357, 2387, 2399, 2477, 2485, 2487]),)
# _df = pd.DataFrame({'x' : _x, 'y' : _y, 'z' : _z, 'fnoise':f+noise,'f' : f, 'noise': noise})
outlier_true_one_2 = unif.copy()
outlier_true_one_2 = list(map(lambda x: -1 if x !=0  else 1,outlier_true_one_2))
# pd.DataFrame(outlier_true_one_2).to_csv('bunny_outlier.csv')
X = np.array(_df)[:,:4]
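Since the contamination above injects 60 + 60 nonzero uniform entries, a quick sanity check is that exactly 120 of the 2503 labels end up as -1:

# ground-truth label counts for the Bunny data
print(outlier_true_one_2.count(-1))   # 120 true outliers
print(outlier_true_one_2.count(1))    # 2383 normal vertices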

GODE

_W = _df1['W']
_BUNNY = BUNNY(_df)
_BUNNY.fit(sd=20,ref=10)
len(_BUNNY.f)
2503
2503*0.05
125.15
outlier_simul_one = (_BUNNY.df['Residual']**2).tolist()
# outlier_simul_one = list(map(lambda x: -1 if x > 8.7 else 1,outlier_simul_one))
outlier_simul_one = list(map(lambda x: -1 if x > 8.05 else 1,outlier_simul_one))
outlier_simul_one.count(1)
2378
outlier_simul_one.count(-1)
125
_conf = Conf_matrx(outlier_true_one_2,outlier_simul_one,tab_bunny)
_conf.conf("GODE")

Accuracy: 0.988
Precision: 0.995
Recall: 0.993
F1 Score: 0.994
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
one = _conf.tab
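The 8.05 cutoff above was tuned by hand so that 125 points (about 5% of 2503) are flagged. The same contamination level can be written directly as the 95th percentile of the squared residuals; a minimal sketch reusing _BUNNY.df from above:

import numpy as np
# flag the top 5% of squared residuals instead of hard-coding the 8.05 cutoff
resid2 = (_BUNNY.df['Residual']**2).to_numpy()
cut = np.quantile(resid2, 0.95)
outlier_simul_one = [-1 if r > cut else 1 for r in resid2]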

LOF

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = LocalOutlierFactor(contamination=0.05)
_conf = Conf_matrx(outlier_true_one_2,clf.fit_predict(X),tab_bunny)
_conf.conf("LOF (Breunig et al., 2000)")

Accuracy: 0.956
Precision: 0.978
Recall: 0.976
F1 Score: 0.977
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
two = one.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  two = one.append(_conf.tab)
two
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891

KNN

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = KNN(contamination=0.05)
clf.fit(_df[['x', 'y','fnoise']])
_df['knn_Clf'] = clf.labels_
outlier_KNN_one = list(clf.labels_)
outlier_KNN_one = list(map(lambda x: 1 if x==0  else -1,outlier_KNN_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_KNN_one,tab_bunny)
_conf.conf("kNN (Ramaswamy et al., 2000)")

Accuracy: 0.982
Precision: 0.992
Recall: 0.989
F1 Score: 0.990
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
three = two.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  three = two.append(_conf.tab)
three
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336

CBLOF

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

_df1 = load_data('Bunny.pkl')
outlier_true_one_2 = pd.read_csv('bunny_outlier.csv').iloc[:,1].to_list()
_df = pd.DataFrame({'x': _df1['x'],'y':_df1['y'],'z':_df1['z'],'fnoise':_df1['fnoise'],'f':_df1['f'],'noise':_df1['noise']})
clf = CBLOF(contamination=0.05,check_estimator=False, random_state=77)
clf.fit(_df[['x', 'y','fnoise']])
_df['CBLOF_Clf'] = clf.labels_
/home/csy/anaconda3/envs/pygsp/lib/python3.10/site-packages/sklearn/cluster/_kmeans.py:1412: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
  super()._check_params_vs_input(X, default_n_init=10)
outlier_CBLOF_one = list(clf.labels_)
outlier_CBLOF_one = list(map(lambda x: 1 if x==0  else -1,outlier_CBLOF_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_CBLOF_one,tab_bunny)
_conf.conf("CBLOF (He et al., 2003)")

Accuracy: 0.974
Precision: 0.988
Recall: 0.985
F1 Score: 0.987
AttributeError: 'DataFrame' object has no attribute 'append'
# four = three.append(_conf.tab)
  • Accuracy: 0.974
  • Precision: 0.988
  • Recall: 0.985
  • F1 Score: 0.987
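The AttributeError above comes from pandas 2.x, where DataFrame.append was removed; the FutureWarnings printed by the earlier environment flagged the same deprecation. A minimal sketch of the pandas.concat replacement for the pattern used throughout, assuming the same three and _conf.tab objects as above (the self.tab.append call inside Conf_matrx would need the same change):

import pandas as pd
# pandas.concat replaces the removed DataFrame.append
four = pd.concat([three, _conf.tab])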

OCSVM

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = svm.OneClassSVM(nu=0.05)
clf.fit(X)
OneClassSVM(nu=0.05)
outlier_OSVM_one = list(clf.predict(X))
_conf = Conf_matrx(outlier_true_one_2,outlier_OSVM_one,tab_bunny)
_conf.conf("OCSVM (Sch ̈olkopf et al., 2001)")

Accuracy: 0.959
Precision: 0.979
Recall: 0.978
F1 Score: 0.979
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
five = three.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  five = three.append(_conf.tab)
five
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580

MCD

1. Is random.seed specified? O (random_state)

2. Is contamination specified? O

3. Can the number of iterations be specified? X

clf = MCD(contamination=0.05 , random_state = 77)
clf.fit(_df[['x', 'y','fnoise']])
_df['MCD_clf'] = clf.labels_
outlier_MCD_one = list(clf.labels_)
outlier_MCD_one = list(map(lambda x: 1 if x==0  else -1,outlier_MCD_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_MCD_one,tab_bunny)
_conf.conf("MCD (Hardin and Rocke, 2004)")

Accuracy: 0.982
Precision: 0.992
Recall: 0.990
F1 Score: 0.991
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
six = five.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  six = five.append(_conf.tab)
six
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756

Feature Bagging

1. Is random.seed specified? O (random_state)

2. Is contamination specified? O

3. Can the number of iterations be specified? X

clf = FeatureBagging(contamination=0.05, random_state=77)
clf.fit(_df[['x', 'y','fnoise']])
_df['FeatureBagging_clf'] = clf.labels_
outlier_FeatureBagging_one = list(clf.labels_)
outlier_FeatureBagging_one = list(map(lambda x: 1 if x==0  else -1,outlier_FeatureBagging_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_FeatureBagging_one,tab_bunny)
_conf.conf("Feature Bagging (Lazarevic and Kumar, 2005)")

Accuracy: 0.954
Precision: 0.977
Recall: 0.975
F1 Score: 0.976
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
seven = six.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  seven = six.append(_conf.tab)
seven
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050

ABOD

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = ABOD(contamination=0.05)
clf.fit(_df[['x', 'y','fnoise']])
_df['ABOD_Clf'] = clf.labels_
outlier_ABOD_one = list(clf.labels_)
outlier_ABOD_one = list(map(lambda x: 1 if x==0  else -1,outlier_ABOD_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_ABOD_one,tab_bunny)
_conf.conf("ABOD (Kriegel et al., 2008)")

Accuracy: 0.979
Precision: 0.990
Recall: 0.988
F1 Score: 0.989
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
eight = seven.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  eight = seven.append(_conf.tab)
eight
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076

The results look slightly different here because the normal (Gaussian) noise was not fixed with a seed.

IForest

1. Is random.seed specified? O

2. Is contamination specified? X (cannot be specified)

3. Can the number of iterations be specified? X

np.random.seed(77)
od = IForest(
    threshold=0.
)
od.fit(_df[['x', 'y','fnoise']])
preds = od.predict(
    _df[['x', 'y','fnoise']],
    return_instance_score=True
)
_df['IF_alibi'] = preds['data']['is_outlier']
outlier_alibi_one = _df['IF_alibi']
outlier_alibi_one = list(map(lambda x: 1 if x==0  else -1,outlier_alibi_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_alibi_one,tab_bunny)
_conf.conf("Isolation Forest (Liu et al., 2008)")

Accuracy: 0.791
Precision: 0.997
Recall: 0.783
F1 Score: 0.877
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
nine = eight.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  nine = eight.append(_conf.tab)
nine
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076
Isolation Forest (Liu et al., 2008) 0.791051 0.996795 0.783047 0.877086
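The alibi-detect IForest used here takes a score threshold rather than a contamination rate (hence the X in item 2 above). A roughly comparable 5% cut could be obtained by inferring the threshold from the instance scores; a sketch under the assumption that the installed alibi-detect version provides infer_threshold with a threshold_perc argument:

# infer the score threshold so that about 5% of the training points are flagged
od = IForest()
od.fit(_df[['x', 'y', 'fnoise']])
od.infer_threshold(_df[['x', 'y', 'fnoise']], threshold_perc=95)
preds = od.predict(_df[['x', 'y', 'fnoise']], return_instance_score=True)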

HBOS

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be specified? X

np.random.seed(77)
clf = HBOS(contamination=0.05)
clf.fit(_df[['x', 'y','fnoise']])
_df['HBOS_clf'] = clf.labels_
outlier_HBOS_one = list(clf.labels_)
outlier_HBOS_one = list(map(lambda x: 1 if x==0  else -1,outlier_HBOS_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_HBOS_one,tab_bunny)
_conf.conf("HBOS (Goldstein and Dengel, 2012)")

Accuracy: 0.919
Precision: 0.958
Recall: 0.956
F1 Score: 0.957
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
ten = nine.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  ten = nine.append(_conf.tab)
ten
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Schölkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076
Isolation Forest (Liu et al., 2008) 0.791051 0.996795 0.783047 0.877086
HBOS (Goldstein and Dengel, 2012) 0.918897 0.958368 0.956358 0.957362
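Conf_matrx is the notebook's own helper class; for readers without it, a minimal sketch of the same Accuracy/Precision/Recall/F1 computation with scikit-learn, assuming the 1 (normal) / -1 (outlier) labels used above:

from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def summarize(y_true, y_pred, name):
    # treat label 1 (normal points) as the positive class
    acc = accuracy_score(y_true, y_pred)
    pre = precision_score(y_true, y_pred, pos_label=1)
    rec = recall_score(y_true, y_pred, pos_label=1)
    f1 = f1_score(y_true, y_pred, pos_label=1)
    print(f"{name}: Accuracy {acc:.3f}, Precision {pre:.3f}, Recall {rec:.3f}, F1 {f1:.3f}")
    return acc, pre, rec, f1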

SOS

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? X

np.random.seed(77)
clf = SOS(contamination=0.05)
clf.fit(_df[['x', 'y','fnoise']])
_df['SOS_clf'] = clf.labels_
outlier_SOS_one = list(clf.labels_)
outlier_SOS_one = list(map(lambda x: 1 if x==0  else -1,outlier_SOS_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_SOS_one,tab_bunny)
_conf.conf("SOS (Janssens et al., 2012)")

Accuracy: 0.912
Precision: 0.955
Recall: 0.953
F1 Score: 0.954
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
eleven = ten.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  eleven = ten.append(_conf.tab)
eleven
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Schölkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076
Isolation Forest (Liu et al., 2008) 0.791051 0.996795 0.783047 0.877086
HBOS (Goldstein and Dengel, 2012) 0.918897 0.958368 0.956358 0.957362
SOS (Janssens et al., 2012) 0.912105 0.954985 0.952581 0.953782
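The same relabeling step recurs for every PyOD detector: clf.labels_ is 0 for inliers and 1 for outliers, and it gets mapped to the 1/-1 convention expected by Conf_matrx. A vectorized one-liner that does the same thing as the map/lambda (a sketch, assuming numpy is imported as np):

# PyOD: labels_ == 0 means inlier, 1 means outlier.
# Convert to 1 (normal) / -1 (outlier) for Conf_matrx.
outlier_SOS_one = list(np.where(np.asarray(clf.labels_) == 0, 1, -1))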

SO_GAAL

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? O (via stop_epochs, default 20)

np.random.seed(77)
clf = SO_GAAL(contamination=0.05)
clf.fit(_df[['x', 'y','fnoise']])
_df['SO_GAAL_clf'] = clf.labels_
(Keras training log omitted: 60 epochs, each with 5 sub-iterations of progress-bar output, followed by a UserWarning that the legacy lr argument is deprecated in favor of learning_rate.)
outlier_SO_GAAL_one = list(clf.labels_)
outlier_SO_GAAL_one = list(map(lambda x: 1 if x==0  else -1,outlier_SO_GAAL_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_SO_GAAL_one,tab_bunny)
_conf.conf("SO-GAAL (Liu et al., 2019)")

Accuracy: 0.952
Precision: 0.952
Recall: 1.000
F1 Score: 0.975
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
twelve = eleven.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  twelve = eleven.append(_conf.tab)
twelve
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Schölkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076
Isolation Forest (Liu et al., 2008) 0.791051 0.996795 0.783047 0.877086
HBOS (Goldstein and Dengel, 2012) 0.918897 0.958368 0.956358 0.957362
SOS (Janssens et al., 2012) 0.912105 0.954985 0.952581 0.953782
SO-GAAL (Liu et al., 2019) 0.952058 0.952058 1.000000 0.975440
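The "Epoch k of 60" lines are consistent with PyOD's SO_GAAL training for roughly three times its stop_epochs setting (default 20). To shorten or lengthen the run, stop_epochs can be passed explicitly; a sketch that reuses the notebook's _df and assumes the PyOD signature:

np.random.seed(77)
# stop_epochs controls training length; with the default of 20
# the log above reports 60 (= 3 * 20) epochs
clf = SO_GAAL(stop_epochs=10, contamination=0.05)
clf.fit(_df[['x', 'y', 'fnoise']])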

MO_GAAL

1. Was random.seed set? O

2. Was contamination set? O

3. Can the number of iterations be set? O (via stop_epochs, default 20)

np.random.seed(77)
clf = MO_GAAL(contamination=0.05)
clf.fit(_df[['x', 'y','fnoise']])
_df['MO_GAAL_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/keras/optimizers/legacy/gradient_descent.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super().__init__(name, **kwargs)
(Keras training log omitted: 60 epochs, each with 5 sub-iterations of generator and discriminator progress-bar output.)
16/16 [==============================] - 0s 2ms/step - loss: 1.8553
16/16 [==============================] - 0s 3ms/step - loss: 1.8544

Testing for epoch 27 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1238
16/16 [==============================] - 0s 2ms/step - loss: 1.6028
16/16 [==============================] - 0s 2ms/step - loss: 1.8730
16/16 [==============================] - 0s 3ms/step - loss: 1.9310
16/16 [==============================] - 0s 2ms/step - loss: 1.9331
16/16 [==============================] - 0s 2ms/step - loss: 1.9149
16/16 [==============================] - 0s 2ms/step - loss: 1.9020
16/16 [==============================] - 0s 4ms/step - loss: 1.8952
16/16 [==============================] - 0s 2ms/step - loss: 1.8912
16/16 [==============================] - 0s 2ms/step - loss: 1.8902

Testing for epoch 27 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1168
16/16 [==============================] - 0s 4ms/step - loss: 1.6344
16/16 [==============================] - 0s 2ms/step - loss: 1.9166
16/16 [==============================] - 0s 3ms/step - loss: 1.9781
16/16 [==============================] - 0s 4ms/step - loss: 1.9810
16/16 [==============================] - 0s 3ms/step - loss: 1.9628
16/16 [==============================] - 0s 2ms/step - loss: 1.9502
16/16 [==============================] - 0s 4ms/step - loss: 1.9437
16/16 [==============================] - 0s 4ms/step - loss: 1.9397
16/16 [==============================] - 0s 2ms/step - loss: 1.9387

Testing for epoch 27 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1144
16/16 [==============================] - 0s 4ms/step - loss: 1.6924
16/16 [==============================] - 0s 8ms/step - loss: 1.9940
16/16 [==============================] - 0s 3ms/step - loss: 2.0605
16/16 [==============================] - 0s 5ms/step - loss: 2.0633
16/16 [==============================] - 0s 4ms/step - loss: 2.0426
16/16 [==============================] - 0s 4ms/step - loss: 2.0277
16/16 [==============================] - 0s 4ms/step - loss: 2.0201
16/16 [==============================] - 0s 3ms/step - loss: 2.0155
16/16 [==============================] - 0s 4ms/step - loss: 2.0144
Epoch 28 of 60

Testing for epoch 28 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1141
16/16 [==============================] - 0s 5ms/step - loss: 1.6646
16/16 [==============================] - 0s 2ms/step - loss: 1.9580
16/16 [==============================] - 0s 4ms/step - loss: 2.0202
16/16 [==============================] - 0s 2ms/step - loss: 2.0216
16/16 [==============================] - 0s 3ms/step - loss: 2.0019
16/16 [==============================] - 0s 4ms/step - loss: 1.9877
16/16 [==============================] - 0s 3ms/step - loss: 1.9803
16/16 [==============================] - 0s 4ms/step - loss: 1.9760
16/16 [==============================] - 0s 3ms/step - loss: 1.9749

Testing for epoch 28 index 2:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1181
16/16 [==============================] - 0s 2ms/step - loss: 1.6206
16/16 [==============================] - 0s 2ms/step - loss: 1.9088
16/16 [==============================] - 0s 4ms/step - loss: 1.9691
16/16 [==============================] - 0s 3ms/step - loss: 1.9696
16/16 [==============================] - 0s 2ms/step - loss: 1.9502
16/16 [==============================] - 0s 2ms/step - loss: 1.9360
16/16 [==============================] - 0s 3ms/step - loss: 1.9287
16/16 [==============================] - 0s 2ms/step - loss: 1.9244
16/16 [==============================] - 0s 2ms/step - loss: 1.9234

Testing for epoch 28 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.1140
16/16 [==============================] - 0s 2ms/step - loss: 1.6620
16/16 [==============================] - 0s 2ms/step - loss: 1.9577
16/16 [==============================] - 0s 6ms/step - loss: 2.0168
16/16 [==============================] - 0s 3ms/step - loss: 2.0163
16/16 [==============================] - 0s 3ms/step - loss: 1.9950
16/16 [==============================] - 0s 3ms/step - loss: 1.9799
16/16 [==============================] - 0s 3ms/step - loss: 1.9723
16/16 [==============================] - 0s 4ms/step - loss: 1.9679
16/16 [==============================] - 0s 3ms/step - loss: 1.9669

Testing for epoch 28 index 4:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1127
16/16 [==============================] - 0s 3ms/step - loss: 1.6321
16/16 [==============================] - 0s 2ms/step - loss: 1.9212
16/16 [==============================] - 0s 3ms/step - loss: 1.9825
16/16 [==============================] - 0s 3ms/step - loss: 1.9846
16/16 [==============================] - 0s 4ms/step - loss: 1.9663
16/16 [==============================] - 0s 2ms/step - loss: 1.9529
16/16 [==============================] - 0s 3ms/step - loss: 1.9460
16/16 [==============================] - 0s 3ms/step - loss: 1.9418
16/16 [==============================] - 0s 3ms/step - loss: 1.9408

Testing for epoch 28 index 5:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1140
16/16 [==============================] - 0s 2ms/step - loss: 1.6441
16/16 [==============================] - 0s 4ms/step - loss: 1.9362
16/16 [==============================] - 0s 3ms/step - loss: 1.9930
16/16 [==============================] - 0s 2ms/step - loss: 1.9915
16/16 [==============================] - 0s 3ms/step - loss: 1.9689
16/16 [==============================] - 0s 3ms/step - loss: 1.9528
16/16 [==============================] - 0s 3ms/step - loss: 1.9451
16/16 [==============================] - 0s 4ms/step - loss: 1.9407
16/16 [==============================] - 0s 3ms/step - loss: 1.9397
Epoch 29 of 60

Testing for epoch 29 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1115
16/16 [==============================] - 0s 2ms/step - loss: 1.6333
16/16 [==============================] - 0s 3ms/step - loss: 1.9178
16/16 [==============================] - 0s 2ms/step - loss: 1.9741
16/16 [==============================] - 0s 2ms/step - loss: 1.9732
16/16 [==============================] - 0s 2ms/step - loss: 1.9529
16/16 [==============================] - 0s 3ms/step - loss: 1.9378
16/16 [==============================] - 0s 3ms/step - loss: 1.9303
16/16 [==============================] - 0s 3ms/step - loss: 1.9259
16/16 [==============================] - 0s 2ms/step - loss: 1.9249

Testing for epoch 29 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1122
16/16 [==============================] - 0s 3ms/step - loss: 1.6924
16/16 [==============================] - 0s 2ms/step - loss: 1.9936
16/16 [==============================] - 0s 2ms/step - loss: 2.0544
16/16 [==============================] - 0s 2ms/step - loss: 2.0540
16/16 [==============================] - 0s 3ms/step - loss: 2.0323
16/16 [==============================] - 0s 3ms/step - loss: 2.0167
16/16 [==============================] - 0s 3ms/step - loss: 2.0090
16/16 [==============================] - 0s 2ms/step - loss: 2.0045
16/16 [==============================] - 0s 2ms/step - loss: 2.0035

Testing for epoch 29 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.1109
16/16 [==============================] - 0s 3ms/step - loss: 1.7113
16/16 [==============================] - 0s 3ms/step - loss: 2.0097
16/16 [==============================] - 0s 2ms/step - loss: 2.0719
16/16 [==============================] - 0s 2ms/step - loss: 2.0732
16/16 [==============================] - 0s 3ms/step - loss: 2.0515
16/16 [==============================] - 0s 3ms/step - loss: 2.0351
16/16 [==============================] - 0s 3ms/step - loss: 2.0270
16/16 [==============================] - 0s 3ms/step - loss: 2.0224
16/16 [==============================] - 0s 2ms/step - loss: 2.0213

Testing for epoch 29 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1089
16/16 [==============================] - 0s 2ms/step - loss: 1.7229
16/16 [==============================] - 0s 1ms/step - loss: 2.0276
16/16 [==============================] - 0s 3ms/step - loss: 2.0895
16/16 [==============================] - 0s 3ms/step - loss: 2.0895
16/16 [==============================] - 0s 2ms/step - loss: 2.0655
16/16 [==============================] - 0s 6ms/step - loss: 2.0486
16/16 [==============================] - 0s 5ms/step - loss: 2.0401
16/16 [==============================] - 0s 9ms/step - loss: 2.0354
16/16 [==============================] - 0s 4ms/step - loss: 2.0343

Testing for epoch 29 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1114
16/16 [==============================] - 0s 2ms/step - loss: 1.6694
16/16 [==============================] - 0s 2ms/step - loss: 1.9604
16/16 [==============================] - 0s 2ms/step - loss: 2.0189
16/16 [==============================] - 0s 5ms/step - loss: 2.0203
16/16 [==============================] - 0s 6ms/step - loss: 1.9993
16/16 [==============================] - 0s 3ms/step - loss: 1.9838
16/16 [==============================] - 0s 5ms/step - loss: 1.9760
16/16 [==============================] - 0s 3ms/step - loss: 1.9715
16/16 [==============================] - 0s 3ms/step - loss: 1.9705
Epoch 30 of 60

Testing for epoch 30 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1114
16/16 [==============================] - 0s 4ms/step - loss: 1.6707
16/16 [==============================] - 0s 5ms/step - loss: 1.9600
16/16 [==============================] - 0s 4ms/step - loss: 2.0144
16/16 [==============================] - 0s 6ms/step - loss: 2.0143
16/16 [==============================] - 0s 2ms/step - loss: 1.9913
16/16 [==============================] - 0s 3ms/step - loss: 1.9751
16/16 [==============================] - 0s 2ms/step - loss: 1.9672
16/16 [==============================] - 0s 4ms/step - loss: 1.9628
16/16 [==============================] - 0s 5ms/step - loss: 1.9617

Testing for epoch 30 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.1087
16/16 [==============================] - 0s 3ms/step - loss: 1.7058
16/16 [==============================] - 0s 3ms/step - loss: 1.9957
16/16 [==============================] - 0s 2ms/step - loss: 2.0492
16/16 [==============================] - 0s 2ms/step - loss: 2.0478
16/16 [==============================] - 0s 3ms/step - loss: 2.0216
16/16 [==============================] - 0s 3ms/step - loss: 2.0033
16/16 [==============================] - 0s 2ms/step - loss: 1.9946
16/16 [==============================] - 0s 4ms/step - loss: 1.9898
16/16 [==============================] - 0s 2ms/step - loss: 1.9887

Testing for epoch 30 index 3:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1084
16/16 [==============================] - 0s 5ms/step - loss: 1.7294
16/16 [==============================] - 0s 3ms/step - loss: 2.0292
16/16 [==============================] - 0s 2ms/step - loss: 2.0837
16/16 [==============================] - 0s 2ms/step - loss: 2.0831
16/16 [==============================] - 0s 2ms/step - loss: 2.0574
16/16 [==============================] - 0s 4ms/step - loss: 2.0388
16/16 [==============================] - 0s 2ms/step - loss: 2.0299
16/16 [==============================] - 0s 5ms/step - loss: 2.0250
16/16 [==============================] - 0s 3ms/step - loss: 2.0239

Testing for epoch 30 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1079
16/16 [==============================] - 0s 3ms/step - loss: 1.7416
16/16 [==============================] - 0s 2ms/step - loss: 2.0390
16/16 [==============================] - 0s 3ms/step - loss: 2.0909
16/16 [==============================] - 0s 3ms/step - loss: 2.0883
16/16 [==============================] - 0s 4ms/step - loss: 2.0603
16/16 [==============================] - 0s 3ms/step - loss: 2.0399
16/16 [==============================] - 0s 5ms/step - loss: 2.0306
16/16 [==============================] - 0s 3ms/step - loss: 2.0256
16/16 [==============================] - 0s 3ms/step - loss: 2.0245

Testing for epoch 30 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.1080
16/16 [==============================] - 0s 5ms/step - loss: 1.7275
16/16 [==============================] - 0s 3ms/step - loss: 2.0255
16/16 [==============================] - 0s 2ms/step - loss: 2.0781
16/16 [==============================] - 0s 3ms/step - loss: 2.0756
16/16 [==============================] - 0s 3ms/step - loss: 2.0489
16/16 [==============================] - 0s 3ms/step - loss: 2.0297
16/16 [==============================] - 0s 2ms/step - loss: 2.0207
16/16 [==============================] - 0s 3ms/step - loss: 2.0158
16/16 [==============================] - 0s 3ms/step - loss: 2.0147
Epoch 31 of 60

Testing for epoch 31 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1054
16/16 [==============================] - 0s 3ms/step - loss: 1.7355
16/16 [==============================] - 0s 2ms/step - loss: 2.0347
16/16 [==============================] - 0s 6ms/step - loss: 2.0875
16/16 [==============================] - 0s 4ms/step - loss: 2.0849
16/16 [==============================] - 0s 3ms/step - loss: 2.0574
16/16 [==============================] - 0s 3ms/step - loss: 2.0373
16/16 [==============================] - 0s 3ms/step - loss: 2.0282
16/16 [==============================] - 0s 4ms/step - loss: 2.0234
16/16 [==============================] - 0s 2ms/step - loss: 2.0223

Testing for epoch 31 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.1085
16/16 [==============================] - 0s 3ms/step - loss: 1.7213
16/16 [==============================] - 0s 3ms/step - loss: 2.0163
16/16 [==============================] - 0s 1ms/step - loss: 2.0664
16/16 [==============================] - 0s 6ms/step - loss: 2.0630
16/16 [==============================] - 0s 2ms/step - loss: 2.0350
16/16 [==============================] - 0s 2ms/step - loss: 2.0145
16/16 [==============================] - 0s 3ms/step - loss: 2.0051
16/16 [==============================] - 0s 2ms/step - loss: 2.0003
16/16 [==============================] - 0s 3ms/step - loss: 1.9991

Testing for epoch 31 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.1052
16/16 [==============================] - 0s 2ms/step - loss: 1.7251
16/16 [==============================] - 0s 2ms/step - loss: 2.0171
16/16 [==============================] - 0s 2ms/step - loss: 2.0655
16/16 [==============================] - 0s 3ms/step - loss: 2.0603
16/16 [==============================] - 0s 7ms/step - loss: 2.0309
16/16 [==============================] - 0s 4ms/step - loss: 2.0113
16/16 [==============================] - 0s 3ms/step - loss: 2.0024
16/16 [==============================] - 0s 3ms/step - loss: 1.9977
16/16 [==============================] - 0s 3ms/step - loss: 1.9966

Testing for epoch 31 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1019
16/16 [==============================] - 0s 2ms/step - loss: 1.7198
16/16 [==============================] - 0s 4ms/step - loss: 2.0178
16/16 [==============================] - 0s 2ms/step - loss: 2.0698
16/16 [==============================] - 0s 2ms/step - loss: 2.0670
16/16 [==============================] - 0s 4ms/step - loss: 2.0408
16/16 [==============================] - 0s 3ms/step - loss: 2.0219
16/16 [==============================] - 0s 3ms/step - loss: 2.0132
16/16 [==============================] - 0s 4ms/step - loss: 2.0085
16/16 [==============================] - 0s 2ms/step - loss: 2.0075

Testing for epoch 31 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1045
16/16 [==============================] - 0s 6ms/step - loss: 1.6971
16/16 [==============================] - 0s 2ms/step - loss: 1.9852
16/16 [==============================] - 0s 2ms/step - loss: 2.0312
16/16 [==============================] - 0s 4ms/step - loss: 2.0247
16/16 [==============================] - 0s 3ms/step - loss: 1.9954
16/16 [==============================] - 0s 2ms/step - loss: 1.9749
16/16 [==============================] - 0s 2ms/step - loss: 1.9656
16/16 [==============================] - 0s 2ms/step - loss: 1.9608
16/16 [==============================] - 0s 2ms/step - loss: 1.9597
Epoch 32 of 60

Testing for epoch 32 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.1018
16/16 [==============================] - 0s 3ms/step - loss: 1.7524
16/16 [==============================] - 0s 3ms/step - loss: 2.0572
16/16 [==============================] - 0s 7ms/step - loss: 2.1069
16/16 [==============================] - 0s 5ms/step - loss: 2.1027
16/16 [==============================] - 0s 3ms/step - loss: 2.0732
16/16 [==============================] - 0s 4ms/step - loss: 2.0516
16/16 [==============================] - 0s 3ms/step - loss: 2.0417
16/16 [==============================] - 0s 3ms/step - loss: 2.0367
16/16 [==============================] - 0s 4ms/step - loss: 2.0356

Testing for epoch 32 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0999
16/16 [==============================] - 0s 3ms/step - loss: 1.7857
16/16 [==============================] - 0s 2ms/step - loss: 2.0992
16/16 [==============================] - 0s 3ms/step - loss: 2.1482
16/16 [==============================] - 0s 2ms/step - loss: 2.1422
16/16 [==============================] - 0s 2ms/step - loss: 2.1096
16/16 [==============================] - 0s 3ms/step - loss: 2.0860
16/16 [==============================] - 0s 3ms/step - loss: 2.0755
16/16 [==============================] - 0s 5ms/step - loss: 2.0702
16/16 [==============================] - 0s 6ms/step - loss: 2.0690

Testing for epoch 32 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1059
16/16 [==============================] - 0s 2ms/step - loss: 1.7318
16/16 [==============================] - 0s 3ms/step - loss: 2.0348
16/16 [==============================] - 0s 1ms/step - loss: 2.0819
16/16 [==============================] - 0s 1ms/step - loss: 2.0774
16/16 [==============================] - 0s 4ms/step - loss: 2.0461
16/16 [==============================] - 0s 3ms/step - loss: 2.0241
16/16 [==============================] - 0s 3ms/step - loss: 2.0143
16/16 [==============================] - 0s 2ms/step - loss: 2.0092
16/16 [==============================] - 0s 5ms/step - loss: 2.0081

Testing for epoch 32 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.1012
16/16 [==============================] - 0s 3ms/step - loss: 1.7792
16/16 [==============================] - 0s 2ms/step - loss: 2.0956
16/16 [==============================] - 0s 3ms/step - loss: 2.1436
16/16 [==============================] - 0s 3ms/step - loss: 2.1387
16/16 [==============================] - 0s 5ms/step - loss: 2.1063
16/16 [==============================] - 0s 3ms/step - loss: 2.0836
16/16 [==============================] - 0s 2ms/step - loss: 2.0733
16/16 [==============================] - 0s 3ms/step - loss: 2.0681
16/16 [==============================] - 0s 2ms/step - loss: 2.0669

Testing for epoch 32 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1016
16/16 [==============================] - 0s 5ms/step - loss: 1.8004
16/16 [==============================] - 0s 6ms/step - loss: 2.1224
16/16 [==============================] - 0s 3ms/step - loss: 2.1713
16/16 [==============================] - 0s 2ms/step - loss: 2.1662
16/16 [==============================] - 0s 6ms/step - loss: 2.1330
16/16 [==============================] - 0s 4ms/step - loss: 2.1094
16/16 [==============================] - 0s 3ms/step - loss: 2.0986
16/16 [==============================] - 0s 1ms/step - loss: 2.0932
16/16 [==============================] - 0s 2ms/step - loss: 2.0920
Epoch 33 of 60

Testing for epoch 33 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1005
16/16 [==============================] - 0s 2ms/step - loss: 1.7759
16/16 [==============================] - 0s 3ms/step - loss: 2.0941
16/16 [==============================] - 0s 2ms/step - loss: 2.1428
16/16 [==============================] - 0s 2ms/step - loss: 2.1392
16/16 [==============================] - 0s 3ms/step - loss: 2.1097
16/16 [==============================] - 0s 2ms/step - loss: 2.0884
16/16 [==============================] - 0s 6ms/step - loss: 2.0786
16/16 [==============================] - 0s 3ms/step - loss: 2.0736
16/16 [==============================] - 0s 2ms/step - loss: 2.0724

Testing for epoch 33 index 2:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1002
16/16 [==============================] - 0s 4ms/step - loss: 1.7343
16/16 [==============================] - 0s 5ms/step - loss: 2.0411
16/16 [==============================] - 0s 4ms/step - loss: 2.0849
16/16 [==============================] - 0s 1ms/step - loss: 2.0803
16/16 [==============================] - 0s 5ms/step - loss: 2.0506
16/16 [==============================] - 0s 3ms/step - loss: 2.0294
16/16 [==============================] - 0s 2ms/step - loss: 2.0199
16/16 [==============================] - 0s 3ms/step - loss: 2.0150
16/16 [==============================] - 0s 6ms/step - loss: 2.0139

Testing for epoch 33 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.1010
16/16 [==============================] - 0s 4ms/step - loss: 1.7638
16/16 [==============================] - 0s 2ms/step - loss: 2.0781
16/16 [==============================] - 0s 2ms/step - loss: 2.1208
16/16 [==============================] - 0s 5ms/step - loss: 2.1140
16/16 [==============================] - 0s 2ms/step - loss: 2.0799
16/16 [==============================] - 0s 3ms/step - loss: 2.0560
16/16 [==============================] - 0s 1ms/step - loss: 2.0456
16/16 [==============================] - 0s 2ms/step - loss: 2.0405
16/16 [==============================] - 0s 3ms/step - loss: 2.0393

Testing for epoch 33 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.0992
16/16 [==============================] - 0s 4ms/step - loss: 1.7667
16/16 [==============================] - 0s 3ms/step - loss: 2.0823
16/16 [==============================] - 0s 2ms/step - loss: 2.1267
16/16 [==============================] - 0s 2ms/step - loss: 2.1230
16/16 [==============================] - 0s 4ms/step - loss: 2.0922
16/16 [==============================] - 0s 2ms/step - loss: 2.0703
16/16 [==============================] - 0s 6ms/step - loss: 2.0605
16/16 [==============================] - 0s 3ms/step - loss: 2.0555
16/16 [==============================] - 0s 4ms/step - loss: 2.0544

Testing for epoch 33 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0995
16/16 [==============================] - 0s 7ms/step - loss: 1.7736
16/16 [==============================] - 0s 2ms/step - loss: 2.0877
16/16 [==============================] - 0s 3ms/step - loss: 2.1287
16/16 [==============================] - 0s 3ms/step - loss: 2.1219
16/16 [==============================] - 0s 4ms/step - loss: 2.0894
16/16 [==============================] - 0s 7ms/step - loss: 2.0662
16/16 [==============================] - 0s 3ms/step - loss: 2.0558
16/16 [==============================] - 0s 2ms/step - loss: 2.0507
16/16 [==============================] - 0s 3ms/step - loss: 2.0496
Epoch 34 of 60

Testing for epoch 34 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.1006
16/16 [==============================] - 0s 4ms/step - loss: 1.7401
16/16 [==============================] - 0s 3ms/step - loss: 2.0487
16/16 [==============================] - 0s 1ms/step - loss: 2.0935
16/16 [==============================] - 0s 3ms/step - loss: 2.0881
16/16 [==============================] - 0s 2ms/step - loss: 2.0573
16/16 [==============================] - 0s 2ms/step - loss: 2.0354
16/16 [==============================] - 0s 2ms/step - loss: 2.0255
16/16 [==============================] - 0s 2ms/step - loss: 2.0206
16/16 [==============================] - 0s 2ms/step - loss: 2.0195

Testing for epoch 34 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.0980
16/16 [==============================] - 0s 6ms/step - loss: 1.7892
16/16 [==============================] - 0s 2ms/step - loss: 2.1039
16/16 [==============================] - 0s 1ms/step - loss: 2.1444
16/16 [==============================] - 0s 3ms/step - loss: 2.1362
16/16 [==============================] - 0s 3ms/step - loss: 2.1021
16/16 [==============================] - 0s 2ms/step - loss: 2.0772
16/16 [==============================] - 0s 2ms/step - loss: 2.0664
16/16 [==============================] - 0s 4ms/step - loss: 2.0611
16/16 [==============================] - 0s 3ms/step - loss: 2.0600

Testing for epoch 34 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0950
16/16 [==============================] - 0s 2ms/step - loss: 1.8045
16/16 [==============================] - 0s 5ms/step - loss: 2.1236
16/16 [==============================] - 0s 1ms/step - loss: 2.1670
16/16 [==============================] - 0s 3ms/step - loss: 2.1604
16/16 [==============================] - 0s 3ms/step - loss: 2.1271
16/16 [==============================] - 0s 3ms/step - loss: 2.1033
16/16 [==============================] - 0s 2ms/step - loss: 2.0928
16/16 [==============================] - 0s 2ms/step - loss: 2.0877
16/16 [==============================] - 0s 4ms/step - loss: 2.0865

Testing for epoch 34 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0972
16/16 [==============================] - 0s 2ms/step - loss: 1.8229
16/16 [==============================] - 0s 4ms/step - loss: 2.1503
16/16 [==============================] - 0s 2ms/step - loss: 2.1925
16/16 [==============================] - 0s 3ms/step - loss: 2.1839
16/16 [==============================] - 0s 2ms/step - loss: 2.1480
16/16 [==============================] - 0s 4ms/step - loss: 2.1219
16/16 [==============================] - 0s 3ms/step - loss: 2.1108
16/16 [==============================] - 0s 3ms/step - loss: 2.1054
16/16 [==============================] - 0s 2ms/step - loss: 2.1042

Testing for epoch 34 index 5:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0953
16/16 [==============================] - 0s 2ms/step - loss: 1.7948
16/16 [==============================] - 0s 5ms/step - loss: 2.1144
16/16 [==============================] - 0s 4ms/step - loss: 2.1561
16/16 [==============================] - 0s 3ms/step - loss: 2.1493
16/16 [==============================] - 0s 4ms/step - loss: 2.1159
16/16 [==============================] - 0s 2ms/step - loss: 2.0910
16/16 [==============================] - 0s 2ms/step - loss: 2.0803
16/16 [==============================] - 0s 3ms/step - loss: 2.0751
16/16 [==============================] - 0s 4ms/step - loss: 2.0740
Epoch 35 of 60

Testing for epoch 35 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.0972
16/16 [==============================] - 0s 1ms/step - loss: 1.8135
16/16 [==============================] - 0s 3ms/step - loss: 2.1352
16/16 [==============================] - 0s 3ms/step - loss: 2.1743
16/16 [==============================] - 0s 4ms/step - loss: 2.1643
16/16 [==============================] - 0s 2ms/step - loss: 2.1271
16/16 [==============================] - 0s 3ms/step - loss: 2.0997
16/16 [==============================] - 0s 5ms/step - loss: 2.0879
16/16 [==============================] - 0s 3ms/step - loss: 2.0823
16/16 [==============================] - 0s 2ms/step - loss: 2.0811

Testing for epoch 35 index 2:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0969
16/16 [==============================] - 0s 2ms/step - loss: 1.8111
16/16 [==============================] - 0s 3ms/step - loss: 2.1322
16/16 [==============================] - 0s 3ms/step - loss: 2.1731
16/16 [==============================] - 0s 6ms/step - loss: 2.1648
16/16 [==============================] - 0s 6ms/step - loss: 2.1287
16/16 [==============================] - 0s 1ms/step - loss: 2.1022
16/16 [==============================] - 0s 2ms/step - loss: 2.0911
16/16 [==============================] - 0s 2ms/step - loss: 2.0858
16/16 [==============================] - 0s 2ms/step - loss: 2.0847

Testing for epoch 35 index 3:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0958
16/16 [==============================] - 0s 2ms/step - loss: 1.7698
16/16 [==============================] - 0s 3ms/step - loss: 2.0844
16/16 [==============================] - 0s 3ms/step - loss: 2.1236
16/16 [==============================] - 0s 3ms/step - loss: 2.1147
16/16 [==============================] - 0s 5ms/step - loss: 2.0794
16/16 [==============================] - 0s 3ms/step - loss: 2.0543
16/16 [==============================] - 0s 2ms/step - loss: 2.0437
16/16 [==============================] - 0s 4ms/step - loss: 2.0386
16/16 [==============================] - 0s 3ms/step - loss: 2.0375

Testing for epoch 35 index 4:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0922
16/16 [==============================] - 0s 2ms/step - loss: 1.8317
16/16 [==============================] - 0s 2ms/step - loss: 2.1621
16/16 [==============================] - 0s 3ms/step - loss: 2.2019
16/16 [==============================] - 0s 2ms/step - loss: 2.1910
16/16 [==============================] - 0s 2ms/step - loss: 2.1507
16/16 [==============================] - 0s 4ms/step - loss: 2.1214
16/16 [==============================] - 0s 2ms/step - loss: 2.1091
16/16 [==============================] - 0s 5ms/step - loss: 2.1033
16/16 [==============================] - 0s 2ms/step - loss: 2.1020

Testing for epoch 35 index 5:
79/79 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.0958
16/16 [==============================] - 0s 1ms/step - loss: 1.8059
16/16 [==============================] - 0s 1ms/step - loss: 2.1364
16/16 [==============================] - 0s 3ms/step - loss: 2.1761
16/16 [==============================] - 0s 3ms/step - loss: 2.1664
16/16 [==============================] - 0s 2ms/step - loss: 2.1298
16/16 [==============================] - 0s 2ms/step - loss: 2.1035
16/16 [==============================] - 0s 5ms/step - loss: 2.0922
16/16 [==============================] - 0s 3ms/step - loss: 2.0868
16/16 [==============================] - 0s 2ms/step - loss: 2.0856
Epoch 36 of 60

Testing for epoch 36 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.0911
16/16 [==============================] - 0s 3ms/step - loss: 1.8563
16/16 [==============================] - 0s 2ms/step - loss: 2.1948
16/16 [==============================] - 0s 2ms/step - loss: 2.2329
16/16 [==============================] - 0s 4ms/step - loss: 2.2215
16/16 [==============================] - 0s 6ms/step - loss: 2.1822
16/16 [==============================] - 0s 3ms/step - loss: 2.1534
16/16 [==============================] - 0s 3ms/step - loss: 2.1413
16/16 [==============================] - 0s 2ms/step - loss: 2.1356
16/16 [==============================] - 0s 4ms/step - loss: 2.1343

Testing for epoch 36 index 2:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0982
16/16 [==============================] - 0s 2ms/step - loss: 1.7691
16/16 [==============================] - 0s 3ms/step - loss: 2.0846
16/16 [==============================] - 0s 4ms/step - loss: 2.1214
16/16 [==============================] - 0s 3ms/step - loss: 2.1133
16/16 [==============================] - 0s 3ms/step - loss: 2.0789
16/16 [==============================] - 0s 3ms/step - loss: 2.0538
16/16 [==============================] - 0s 2ms/step - loss: 2.0434
16/16 [==============================] - 0s 3ms/step - loss: 2.0384
16/16 [==============================] - 0s 2ms/step - loss: 2.0373

Testing for epoch 36 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0922
16/16 [==============================] - 0s 2ms/step - loss: 1.7897
16/16 [==============================] - 0s 2ms/step - loss: 2.1056
16/16 [==============================] - 0s 4ms/step - loss: 2.1385
16/16 [==============================] - 0s 3ms/step - loss: 2.1276
16/16 [==============================] - 0s 3ms/step - loss: 2.0910
16/16 [==============================] - 0s 2ms/step - loss: 2.0636
16/16 [==============================] - 0s 5ms/step - loss: 2.0524
16/16 [==============================] - 0s 4ms/step - loss: 2.0471
16/16 [==============================] - 0s 3ms/step - loss: 2.0460

Testing for epoch 36 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0954
16/16 [==============================] - 0s 3ms/step - loss: 1.8136
16/16 [==============================] - 0s 2ms/step - loss: 2.1457
16/16 [==============================] - 0s 3ms/step - loss: 2.1838
16/16 [==============================] - 0s 3ms/step - loss: 2.1739
16/16 [==============================] - 0s 2ms/step - loss: 2.1369
16/16 [==============================] - 0s 1ms/step - loss: 2.1082
16/16 [==============================] - 0s 5ms/step - loss: 2.0964
16/16 [==============================] - 0s 3ms/step - loss: 2.0910
16/16 [==============================] - 0s 3ms/step - loss: 2.0898

Testing for epoch 36 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0933
16/16 [==============================] - 0s 2ms/step - loss: 1.8650
16/16 [==============================] - 0s 2ms/step - loss: 2.2075
16/16 [==============================] - 0s 2ms/step - loss: 2.2433
16/16 [==============================] - 0s 2ms/step - loss: 2.2291
16/16 [==============================] - 0s 3ms/step - loss: 2.1871
16/16 [==============================] - 0s 2ms/step - loss: 2.1556
16/16 [==============================] - 0s 2ms/step - loss: 2.1428
16/16 [==============================] - 0s 2ms/step - loss: 2.1369
16/16 [==============================] - 0s 2ms/step - loss: 2.1356
Epoch 37 of 60

Testing for epoch 37 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0926
16/16 [==============================] - 0s 4ms/step - loss: 1.8497
16/16 [==============================] - 0s 6ms/step - loss: 2.1942
16/16 [==============================] - 0s 3ms/step - loss: 2.2349
16/16 [==============================] - 0s 1ms/step - loss: 2.2261
16/16 [==============================] - 0s 5ms/step - loss: 2.1894
16/16 [==============================] - 0s 1ms/step - loss: 2.1607
16/16 [==============================] - 0s 2ms/step - loss: 2.1488
16/16 [==============================] - 0s 4ms/step - loss: 2.1432
16/16 [==============================] - 0s 3ms/step - loss: 2.1420

Testing for epoch 37 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0898
16/16 [==============================] - 0s 2ms/step - loss: 1.8976
16/16 [==============================] - 0s 3ms/step - loss: 2.2391
16/16 [==============================] - 0s 2ms/step - loss: 2.2739
16/16 [==============================] - 0s 3ms/step - loss: 2.2604
16/16 [==============================] - 0s 2ms/step - loss: 2.2188
16/16 [==============================] - 0s 3ms/step - loss: 2.1874
16/16 [==============================] - 0s 5ms/step - loss: 2.1746
16/16 [==============================] - 0s 6ms/step - loss: 2.1687
16/16 [==============================] - 0s 3ms/step - loss: 2.1675

Testing for epoch 37 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0915
16/16 [==============================] - 0s 5ms/step - loss: 1.8454
16/16 [==============================] - 0s 4ms/step - loss: 2.1812
16/16 [==============================] - 0s 5ms/step - loss: 2.2183
16/16 [==============================] - 0s 2ms/step - loss: 2.2087
16/16 [==============================] - 0s 6ms/step - loss: 2.1718
16/16 [==============================] - 0s 2ms/step - loss: 2.1427
16/16 [==============================] - 0s 5ms/step - loss: 2.1308
16/16 [==============================] - 0s 2ms/step - loss: 2.1253
16/16 [==============================] - 0s 2ms/step - loss: 2.1241

Testing for epoch 37 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0895
16/16 [==============================] - 0s 5ms/step - loss: 1.8933
16/16 [==============================] - 0s 2ms/step - loss: 2.2379
16/16 [==============================] - 0s 3ms/step - loss: 2.2727
16/16 [==============================] - 0s 4ms/step - loss: 2.2607
16/16 [==============================] - 0s 4ms/step - loss: 2.2207
16/16 [==============================] - 0s 4ms/step - loss: 2.1899
16/16 [==============================] - 0s 4ms/step - loss: 2.1774
16/16 [==============================] - 0s 5ms/step - loss: 2.1716
16/16 [==============================] - 0s 2ms/step - loss: 2.1703

Testing for epoch 37 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0951
16/16 [==============================] - 0s 4ms/step - loss: 1.8258
16/16 [==============================] - 0s 4ms/step - loss: 2.1560
16/16 [==============================] - 0s 1ms/step - loss: 2.1894
16/16 [==============================] - 0s 2ms/step - loss: 2.1774
16/16 [==============================] - 0s 2ms/step - loss: 2.1385
16/16 [==============================] - 0s 2ms/step - loss: 2.1087
16/16 [==============================] - 0s 3ms/step - loss: 2.0969
16/16 [==============================] - 0s 2ms/step - loss: 2.0915
16/16 [==============================] - 0s 6ms/step - loss: 2.0904
Epoch 38 of 60

Testing for epoch 38 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0884
16/16 [==============================] - 0s 4ms/step - loss: 1.8548
16/16 [==============================] - 0s 5ms/step - loss: 2.1912
16/16 [==============================] - 0s 3ms/step - loss: 2.2252
16/16 [==============================] - 0s 4ms/step - loss: 2.2124
16/16 [==============================] - 0s 6ms/step - loss: 2.1726
16/16 [==============================] - 0s 2ms/step - loss: 2.1410
16/16 [==============================] - 0s 2ms/step - loss: 2.1284
16/16 [==============================] - 0s 5ms/step - loss: 2.1226
16/16 [==============================] - 0s 4ms/step - loss: 2.1214

Testing for epoch 38 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0924
16/16 [==============================] - 0s 6ms/step - loss: 1.9036
16/16 [==============================] - 0s 5ms/step - loss: 2.2402
16/16 [==============================] - 0s 3ms/step - loss: 2.2670
16/16 [==============================] - 0s 2ms/step - loss: 2.2493
16/16 [==============================] - 0s 4ms/step - loss: 2.2048
16/16 [==============================] - 0s 2ms/step - loss: 2.1705
16/16 [==============================] - 0s 3ms/step - loss: 2.1571
16/16 [==============================] - 0s 2ms/step - loss: 2.1511
16/16 [==============================] - 0s 3ms/step - loss: 2.1499

Testing for epoch 38 index 3:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0910
16/16 [==============================] - 0s 2ms/step - loss: 1.8982
16/16 [==============================] - 0s 2ms/step - loss: 2.2345
16/16 [==============================] - 0s 2ms/step - loss: 2.2651
16/16 [==============================] - 0s 2ms/step - loss: 2.2507
16/16 [==============================] - 0s 2ms/step - loss: 2.2087
16/16 [==============================] - 0s 2ms/step - loss: 2.1762
16/16 [==============================] - 0s 3ms/step - loss: 2.1632
16/16 [==============================] - 0s 3ms/step - loss: 2.1573
16/16 [==============================] - 0s 4ms/step - loss: 2.1561

Testing for epoch 38 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.0910
16/16 [==============================] - 0s 4ms/step - loss: 1.8448
16/16 [==============================] - 0s 2ms/step - loss: 2.1747
16/16 [==============================] - 0s 1ms/step - loss: 2.2106
16/16 [==============================] - 0s 2ms/step - loss: 2.2010
16/16 [==============================] - 0s 3ms/step - loss: 2.1658
16/16 [==============================] - 0s 3ms/step - loss: 2.1379
16/16 [==============================] - 0s 5ms/step - loss: 2.1266
16/16 [==============================] - 0s 5ms/step - loss: 2.1213
16/16 [==============================] - 0s 4ms/step - loss: 2.1202

Testing for epoch 38 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0894
16/16 [==============================] - 0s 2ms/step - loss: 1.8749
16/16 [==============================] - 0s 6ms/step - loss: 2.2036
16/16 [==============================] - 0s 2ms/step - loss: 2.2291
16/16 [==============================] - 0s 3ms/step - loss: 2.2120
16/16 [==============================] - 0s 2ms/step - loss: 2.1685
16/16 [==============================] - 0s 5ms/step - loss: 2.1352
16/16 [==============================] - 0s 4ms/step - loss: 2.1221
16/16 [==============================] - 0s 3ms/step - loss: 2.1163
16/16 [==============================] - 0s 4ms/step - loss: 2.1150
Epoch 39 of 60

Testing for epoch 39 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0898
16/16 [==============================] - 0s 3ms/step - loss: 1.9104
16/16 [==============================] - 0s 4ms/step - loss: 2.2534
16/16 [==============================] - 0s 4ms/step - loss: 2.2826
16/16 [==============================] - 0s 4ms/step - loss: 2.2665
16/16 [==============================] - 0s 4ms/step - loss: 2.2238
16/16 [==============================] - 0s 3ms/step - loss: 2.1909
16/16 [==============================] - 0s 3ms/step - loss: 2.1779
16/16 [==============================] - 0s 3ms/step - loss: 2.1720
16/16 [==============================] - 0s 4ms/step - loss: 2.1708

Testing for epoch 39 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0890
16/16 [==============================] - 0s 3ms/step - loss: 1.9281
16/16 [==============================] - 0s 3ms/step - loss: 2.2730
16/16 [==============================] - 0s 3ms/step - loss: 2.3030
16/16 [==============================] - 0s 4ms/step - loss: 2.2873
16/16 [==============================] - 0s 4ms/step - loss: 2.2427
16/16 [==============================] - 0s 1ms/step - loss: 2.2083
16/16 [==============================] - 0s 2ms/step - loss: 2.1947
16/16 [==============================] - 0s 2ms/step - loss: 2.1887
16/16 [==============================] - 0s 2ms/step - loss: 2.1874

Testing for epoch 39 index 3:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0900
16/16 [==============================] - 0s 3ms/step - loss: 1.8870
16/16 [==============================] - 0s 3ms/step - loss: 2.2178
16/16 [==============================] - 0s 4ms/step - loss: 2.2431
16/16 [==============================] - 0s 3ms/step - loss: 2.2269
16/16 [==============================] - 0s 1ms/step - loss: 2.1845
16/16 [==============================] - 0s 2ms/step - loss: 2.1514
16/16 [==============================] - 0s 1ms/step - loss: 2.1386
16/16 [==============================] - 0s 2ms/step - loss: 2.1328
16/16 [==============================] - 0s 3ms/step - loss: 2.1316

Testing for epoch 39 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 6ms/step - loss: 0.0888
16/16 [==============================] - 0s 2ms/step - loss: 1.9160
16/16 [==============================] - 0s 4ms/step - loss: 2.2540
16/16 [==============================] - 0s 3ms/step - loss: 2.2778
16/16 [==============================] - 0s 5ms/step - loss: 2.2606
16/16 [==============================] - 0s 4ms/step - loss: 2.2155
16/16 [==============================] - 0s 3ms/step - loss: 2.1801
16/16 [==============================] - 0s 5ms/step - loss: 2.1666
16/16 [==============================] - 0s 3ms/step - loss: 2.1607
16/16 [==============================] - 0s 3ms/step - loss: 2.1595

Testing for epoch 39 index 5:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0865
16/16 [==============================] - 0s 2ms/step - loss: 1.8909
16/16 [==============================] - 0s 2ms/step - loss: 2.2297
16/16 [==============================] - 0s 1ms/step - loss: 2.2554
16/16 [==============================] - 0s 4ms/step - loss: 2.2386
16/16 [==============================] - 0s 2ms/step - loss: 2.1953
16/16 [==============================] - 0s 1ms/step - loss: 2.1612
16/16 [==============================] - 0s 3ms/step - loss: 2.1480
16/16 [==============================] - 0s 4ms/step - loss: 2.1421
16/16 [==============================] - 0s 4ms/step - loss: 2.1409
Epoch 40 of 60

Testing for epoch 40 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.0870
16/16 [==============================] - 0s 1ms/step - loss: 1.8980
16/16 [==============================] - 0s 2ms/step - loss: 2.2406
16/16 [==============================] - 0s 2ms/step - loss: 2.2659
16/16 [==============================] - 0s 1ms/step - loss: 2.2476
16/16 [==============================] - 0s 3ms/step - loss: 2.2028
16/16 [==============================] - 0s 2ms/step - loss: 2.1688
16/16 [==============================] - 0s 4ms/step - loss: 2.1557
16/16 [==============================] - 0s 4ms/step - loss: 2.1499
16/16 [==============================] - 0s 5ms/step - loss: 2.1487

Testing for epoch 40 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0886
16/16 [==============================] - 0s 2ms/step - loss: 1.9522
16/16 [==============================] - 0s 2ms/step - loss: 2.3047
16/16 [==============================] - 0s 4ms/step - loss: 2.3299
16/16 [==============================] - 0s 5ms/step - loss: 2.3126
16/16 [==============================] - 0s 4ms/step - loss: 2.2662
16/16 [==============================] - 0s 2ms/step - loss: 2.2303
16/16 [==============================] - 0s 2ms/step - loss: 2.2164
16/16 [==============================] - 0s 2ms/step - loss: 2.2102
16/16 [==============================] - 0s 1ms/step - loss: 2.2089

[Output condensed: the rest of epoch 40 and epochs 41–56 of 60 repeat the same pattern. Each epoch runs over five mini-batch indices; for every index the log shows one 79/79 prediction pass over the full data set followed by ten 16/16 loss evaluations. Across these epochs the first of the ten losses drifts down from about 0.086 to 0.065, while the remaining nine rise slowly from roughly 1.9–2.3 to about 2.1–2.6.]
Epoch 57 of 60

Testing for epoch 57 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0637
16/16 [==============================] - 0s 5ms/step - loss: 2.2322
16/16 [==============================] - 0s 4ms/step - loss: 2.5742
16/16 [==============================] - 0s 5ms/step - loss: 2.5555
16/16 [==============================] - 0s 3ms/step - loss: 2.5105
16/16 [==============================] - 0s 3ms/step - loss: 2.4434
16/16 [==============================] - 0s 4ms/step - loss: 2.3925
16/16 [==============================] - 0s 3ms/step - loss: 2.3750
16/16 [==============================] - 0s 3ms/step - loss: 2.3680
16/16 [==============================] - 0s 4ms/step - loss: 2.3666

Testing for epoch 57 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0668
16/16 [==============================] - 0s 4ms/step - loss: 2.2124
16/16 [==============================] - 0s 3ms/step - loss: 2.5543
16/16 [==============================] - 0s 2ms/step - loss: 2.5399
16/16 [==============================] - 0s 4ms/step - loss: 2.4971
16/16 [==============================] - 0s 6ms/step - loss: 2.4316
16/16 [==============================] - 0s 2ms/step - loss: 2.3815
16/16 [==============================] - 0s 4ms/step - loss: 2.3642
16/16 [==============================] - 0s 2ms/step - loss: 2.3573
16/16 [==============================] - 0s 3ms/step - loss: 2.3559

Testing for epoch 57 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.0696
16/16 [==============================] - 0s 6ms/step - loss: 2.1721
16/16 [==============================] - 0s 6ms/step - loss: 2.4990
16/16 [==============================] - 0s 2ms/step - loss: 2.4813
16/16 [==============================] - 0s 5ms/step - loss: 2.4392
16/16 [==============================] - 0s 3ms/step - loss: 2.3767
16/16 [==============================] - 0s 5ms/step - loss: 2.3292
16/16 [==============================] - 0s 6ms/step - loss: 2.3126
16/16 [==============================] - 0s 4ms/step - loss: 2.3059
16/16 [==============================] - 0s 2ms/step - loss: 2.3046

Testing for epoch 57 index 4:
79/79 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0646
16/16 [==============================] - 0s 3ms/step - loss: 2.2119
16/16 [==============================] - 0s 2ms/step - loss: 2.5506
16/16 [==============================] - 0s 2ms/step - loss: 2.5316
16/16 [==============================] - 0s 3ms/step - loss: 2.4873
16/16 [==============================] - 0s 3ms/step - loss: 2.4208
16/16 [==============================] - 0s 7ms/step - loss: 2.3702
16/16 [==============================] - 0s 4ms/step - loss: 2.3529
16/16 [==============================] - 0s 4ms/step - loss: 2.3461
16/16 [==============================] - 0s 3ms/step - loss: 2.3447

Testing for epoch 57 index 5:
79/79 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0640
16/16 [==============================] - 0s 1ms/step - loss: 2.2799
16/16 [==============================] - 0s 2ms/step - loss: 2.6299
16/16 [==============================] - 0s 2ms/step - loss: 2.6088
16/16 [==============================] - 0s 2ms/step - loss: 2.5609
16/16 [==============================] - 0s 3ms/step - loss: 2.4911
16/16 [==============================] - 0s 3ms/step - loss: 2.4389
16/16 [==============================] - 0s 5ms/step - loss: 2.4209
16/16 [==============================] - 0s 3ms/step - loss: 2.4138
16/16 [==============================] - 0s 6ms/step - loss: 2.4123
Epoch 58 of 60

Testing for epoch 58 index 1:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0664
16/16 [==============================] - 0s 4ms/step - loss: 2.2084
16/16 [==============================] - 0s 8ms/step - loss: 2.5451
16/16 [==============================] - 0s 2ms/step - loss: 2.5239
16/16 [==============================] - 0s 3ms/step - loss: 2.4772
16/16 [==============================] - 0s 5ms/step - loss: 2.4094
16/16 [==============================] - 0s 3ms/step - loss: 2.3583
16/16 [==============================] - 0s 5ms/step - loss: 2.3410
16/16 [==============================] - 0s 3ms/step - loss: 2.3341
16/16 [==============================] - 0s 4ms/step - loss: 2.3327

Testing for epoch 58 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 4ms/step - loss: 0.0669
16/16 [==============================] - 0s 5ms/step - loss: 2.2197
16/16 [==============================] - 0s 4ms/step - loss: 2.5561
16/16 [==============================] - 0s 4ms/step - loss: 2.5388
16/16 [==============================] - 0s 7ms/step - loss: 2.4969
16/16 [==============================] - 0s 2ms/step - loss: 2.4321
16/16 [==============================] - 0s 5ms/step - loss: 2.3831
16/16 [==============================] - 0s 7ms/step - loss: 2.3663
16/16 [==============================] - 0s 4ms/step - loss: 2.3595
16/16 [==============================] - 0s 4ms/step - loss: 2.3582

Testing for epoch 58 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0669
16/16 [==============================] - 0s 4ms/step - loss: 2.2550
16/16 [==============================] - 0s 4ms/step - loss: 2.5991
16/16 [==============================] - 0s 5ms/step - loss: 2.5808
16/16 [==============================] - 0s 6ms/step - loss: 2.5364
16/16 [==============================] - 0s 6ms/step - loss: 2.4695
16/16 [==============================] - 0s 2ms/step - loss: 2.4187
16/16 [==============================] - 0s 3ms/step - loss: 2.4012
16/16 [==============================] - 0s 3ms/step - loss: 2.3943
16/16 [==============================] - 0s 6ms/step - loss: 2.3929

Testing for epoch 58 index 4:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0657
16/16 [==============================] - 0s 2ms/step - loss: 2.2386
16/16 [==============================] - 0s 5ms/step - loss: 2.5687
16/16 [==============================] - 0s 3ms/step - loss: 2.5429
16/16 [==============================] - 0s 5ms/step - loss: 2.4949
16/16 [==============================] - 0s 5ms/step - loss: 2.4270
16/16 [==============================] - 0s 7ms/step - loss: 2.3759
16/16 [==============================] - 0s 2ms/step - loss: 2.3583
16/16 [==============================] - 0s 2ms/step - loss: 2.3512
16/16 [==============================] - 0s 4ms/step - loss: 2.3498

Testing for epoch 58 index 5:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0652
16/16 [==============================] - 0s 4ms/step - loss: 2.2322
16/16 [==============================] - 0s 3ms/step - loss: 2.5782
16/16 [==============================] - 0s 3ms/step - loss: 2.5634
16/16 [==============================] - 0s 6ms/step - loss: 2.5201
16/16 [==============================] - 0s 5ms/step - loss: 2.4550
16/16 [==============================] - 0s 2ms/step - loss: 2.4058
16/16 [==============================] - 0s 2ms/step - loss: 2.3886
16/16 [==============================] - 0s 3ms/step - loss: 2.3818
16/16 [==============================] - 0s 3ms/step - loss: 2.3804
Epoch 59 of 60

Testing for epoch 59 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0649
16/16 [==============================] - 0s 3ms/step - loss: 2.2533
16/16 [==============================] - 0s 1ms/step - loss: 2.6075
16/16 [==============================] - 0s 2ms/step - loss: 2.5935
16/16 [==============================] - 0s 4ms/step - loss: 2.5503
16/16 [==============================] - 0s 3ms/step - loss: 2.4850
16/16 [==============================] - 0s 1ms/step - loss: 2.4350
16/16 [==============================] - 0s 2ms/step - loss: 2.4176
16/16 [==============================] - 0s 4ms/step - loss: 2.4107
16/16 [==============================] - 0s 2ms/step - loss: 2.4093

Testing for epoch 59 index 2:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0674
16/16 [==============================] - 0s 2ms/step - loss: 2.2017
16/16 [==============================] - 0s 3ms/step - loss: 2.5263
16/16 [==============================] - 0s 3ms/step - loss: 2.5029
16/16 [==============================] - 0s 6ms/step - loss: 2.4571
16/16 [==============================] - 0s 2ms/step - loss: 2.3895
16/16 [==============================] - 0s 8ms/step - loss: 2.3384
16/16 [==============================] - 0s 5ms/step - loss: 2.3209
16/16 [==============================] - 0s 2ms/step - loss: 2.3140
16/16 [==============================] - 0s 3ms/step - loss: 2.3126

Testing for epoch 59 index 3:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0658
16/16 [==============================] - 0s 2ms/step - loss: 2.2221
16/16 [==============================] - 0s 3ms/step - loss: 2.5649
16/16 [==============================] - 0s 2ms/step - loss: 2.5472
16/16 [==============================] - 0s 1ms/step - loss: 2.5030
16/16 [==============================] - 0s 5ms/step - loss: 2.4375
16/16 [==============================] - 0s 3ms/step - loss: 2.3875
16/16 [==============================] - 0s 5ms/step - loss: 2.3700
16/16 [==============================] - 0s 1ms/step - loss: 2.3630
16/16 [==============================] - 0s 3ms/step - loss: 2.3616

Testing for epoch 59 index 4:
79/79 [==============================] - 0s 3ms/step
16/16 [==============================] - 0s 1ms/step - loss: 0.0636
16/16 [==============================] - 0s 2ms/step - loss: 2.2680
16/16 [==============================] - 0s 1ms/step - loss: 2.6172
16/16 [==============================] - 0s 2ms/step - loss: 2.5991
16/16 [==============================] - 0s 3ms/step - loss: 2.5530
16/16 [==============================] - 0s 4ms/step - loss: 2.4840
16/16 [==============================] - 0s 1ms/step - loss: 2.4308
16/16 [==============================] - 0s 2ms/step - loss: 2.4125
16/16 [==============================] - 0s 2ms/step - loss: 2.4052
16/16 [==============================] - 0s 4ms/step - loss: 2.4037

Testing for epoch 59 index 5:
79/79 [==============================] - 0s 2ms/step
16/16 [==============================] - 0s 3ms/step - loss: 0.0645
16/16 [==============================] - 0s 2ms/step - loss: 2.3262
16/16 [==============================] - 0s 2ms/step - loss: 2.6846
16/16 [==============================] - 0s 2ms/step - loss: 2.6643
16/16 [==============================] - 0s 4ms/step - loss: 2.6170
16/16 [==============================] - 0s 4ms/step - loss: 2.5475
16/16 [==============================] - 0s 2ms/step - loss: 2.4943
16/16 [==============================] - 0s 4ms/step - loss: 2.4758
16/16 [==============================] - 0s 3ms/step - loss: 2.4684
16/16 [==============================] - 0s 5ms/step - loss: 2.4669
Epoch 60 of 60

Testing for epoch 60 index 1:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0624
16/16 [==============================] - 0s 4ms/step - loss: 2.2836
16/16 [==============================] - 0s 5ms/step - loss: 2.6286
16/16 [==============================] - 0s 6ms/step - loss: 2.6071
16/16 [==============================] - 0s 5ms/step - loss: 2.5584
16/16 [==============================] - 0s 3ms/step - loss: 2.4869
16/16 [==============================] - 0s 2ms/step - loss: 2.4337
16/16 [==============================] - 0s 2ms/step - loss: 2.4154
16/16 [==============================] - 0s 5ms/step - loss: 2.4082
16/16 [==============================] - 0s 6ms/step - loss: 2.4067

Testing for epoch 60 index 2:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 8ms/step - loss: 0.0674
16/16 [==============================] - 0s 2ms/step - loss: 2.2322
16/16 [==============================] - 0s 2ms/step - loss: 2.5733
16/16 [==============================] - 0s 2ms/step - loss: 2.5542
16/16 [==============================] - 0s 2ms/step - loss: 2.5100
16/16 [==============================] - 0s 3ms/step - loss: 2.4449
16/16 [==============================] - 0s 4ms/step - loss: 2.3954
16/16 [==============================] - 0s 2ms/step - loss: 2.3783
16/16 [==============================] - 0s 2ms/step - loss: 2.3716
16/16 [==============================] - 0s 3ms/step - loss: 2.3703

Testing for epoch 60 index 3:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 5ms/step - loss: 0.0663
16/16 [==============================] - 0s 1ms/step - loss: 2.2388
16/16 [==============================] - 0s 5ms/step - loss: 2.5764
16/16 [==============================] - 0s 3ms/step - loss: 2.5554
16/16 [==============================] - 0s 2ms/step - loss: 2.5087
16/16 [==============================] - 0s 4ms/step - loss: 2.4413
16/16 [==============================] - 0s 2ms/step - loss: 2.3905
16/16 [==============================] - 0s 2ms/step - loss: 2.3728
16/16 [==============================] - 0s 1ms/step - loss: 2.3658
16/16 [==============================] - 0s 2ms/step - loss: 2.3644

Testing for epoch 60 index 4:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 6ms/step - loss: 0.0640
16/16 [==============================] - 0s 2ms/step - loss: 2.2213
16/16 [==============================] - 0s 3ms/step - loss: 2.5594
16/16 [==============================] - 0s 4ms/step - loss: 2.5391
16/16 [==============================] - 0s 2ms/step - loss: 2.4931
16/16 [==============================] - 0s 2ms/step - loss: 2.4266
16/16 [==============================] - 0s 6ms/step - loss: 2.3762
16/16 [==============================] - 0s 3ms/step - loss: 2.3591
16/16 [==============================] - 0s 1ms/step - loss: 2.3523
16/16 [==============================] - 0s 2ms/step - loss: 2.3510

Testing for epoch 60 index 5:
79/79 [==============================] - 0s 1ms/step
16/16 [==============================] - 0s 2ms/step - loss: 0.0643
16/16 [==============================] - 0s 3ms/step - loss: 2.2371
16/16 [==============================] - 0s 2ms/step - loss: 2.5697
16/16 [==============================] - 0s 6ms/step - loss: 2.5466
16/16 [==============================] - 0s 5ms/step - loss: 2.4985
16/16 [==============================] - 0s 3ms/step - loss: 2.4292
16/16 [==============================] - 0s 5ms/step - loss: 2.3773
16/16 [==============================] - 0s 2ms/step - loss: 2.3598
16/16 [==============================] - 0s 2ms/step - loss: 2.3529
16/16 [==============================] - 0s 2ms/step - loss: 2.3515
79/79 [==============================] - 0s 2ms/step
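The 60 epochs shown in the log appear to follow pyod's MO_GAAL convention of training for three passes per stop_epochs, so the default stop_epochs=20 ends at "Epoch 60 of 60"; this is an assumption about the library's internals rather than something confirmed by the log. A minimal sketch of the fit call with stop_epochs written out explicitly:

from pyod.models.mo_gaal import MO_GAAL

# stop_epochs controls the training length; with the default of 20 the log above runs to "Epoch 60 of 60"
clf = MO_GAAL(stop_epochs=20)
clf.fit(_df[['x', 'y', 'fnoise']])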
outlier_MO_GAAL_one = list(clf.labels_)
# pyod uses 0 = inlier, 1 = outlier; convert to the 1 = inlier / -1 = outlier convention of the ground-truth labels
outlier_MO_GAAL_one = list(map(lambda x: 1 if x==0 else -1, outlier_MO_GAAL_one))
_conf = Conf_matrx(outlier_true_one_2, outlier_MO_GAAL_one, tab_bunny)
_conf.conf("MO-GAAL (Liu et al., 2019)")

Accuracy: 0.952
Precision: 0.952
Recall: 1.000
F1 Score: 0.975
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  self.tab = self.tab.append(pd.DataFrame({"Accuracy":[self.acc],"Precision":[self.pre],"Recall":[self.rec],"F1":[self.f1]},index = [name]))
thirteen = twelve.append(_conf.tab)
FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  thirteen = twelve.append(_conf.tab)
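The FutureWarning above is pandas flagging DataFrame.append, which has since been removed from the library. A minimal equivalent of this accumulation step using pandas.concat, with twelve and _conf.tab being the frames from this notebook:

import pandas as pd

# same result as `thirteen = twelve.append(_conf.tab)`, without the deprecated API
thirteen = pd.concat([twelve, _conf.tab])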
thirteen
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076
Isolation Forest (Liu et al., 2008) 0.791051 0.996795 0.783047 0.877086
HBOS (Goldstein and Dengel, 2012) 0.918897 0.958368 0.956358 0.957362
SOS (Janssens et al., 2012) 0.912105 0.954985 0.952581 0.953782
SO-GAAL (Liu et al., 2019) 0.952058 0.952058 1.000000 0.975440
MO-GAAL (Liu et al., 2019) 0.952058 0.952058 1.000000 0.975440

LSCP

1. Is random.seed specified? O

2. Is contamination specified? O

3. Can the number of iterations be set? X

# LSCP: locally selective combination of three pyod base detectors
detectors = [KNN(), LOF(), OCSVM()]
clf = LSCP(detectors, contamination=0.05, random_state=77)
clf.fit(_df[['x', 'y', 'fnoise']])
_df['LSCP_clf'] = clf.labels_
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/pyod/models/lscp.py:382: UserWarning: The number of histogram bins is greater than the number of classifiers, reducing n_bins to n_clf.
  warnings.warn(
/home/csy/anaconda3/envs/temp_csy/lib/python3.8/site-packages/scipy/stats/_stats_py.py:4424: ConstantInputWarning: An input array is constant; the correlation coefficient is not defined.
  warnings.warn(stats.ConstantInputWarning(msg))
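The UserWarning above appears because LSCP's default n_bins (10) is larger than the number of base detectors (3), so pyod reduces it internally. A minimal sketch that avoids the warning by setting n_bins to the detector count, assuming the n_bins constructor argument exposed by recent pyod versions:

from pyod.models.knn import KNN
from pyod.models.lof import LOF
from pyod.models.ocsvm import OCSVM
from pyod.models.lscp import LSCP

detectors = [KNN(), LOF(), OCSVM()]
# keep n_bins <= number of base detectors so LSCP does not have to shrink it
clf = LSCP(detectors, contamination=0.05, n_bins=len(detectors), random_state=77)
clf.fit(_df[['x', 'y', 'fnoise']])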
outlier_LSCP_one = list(clf.labels_)
outlier_LSCP_one = list(map(lambda x: 1 if x==0  else -1,outlier_LSCP_one))
_conf = Conf_matrx(outlier_true_one_2,outlier_LSCP_one,tab_bunny)
_conf.conf("LSCP (Zhao et al., 2019)")

Accuracy: 0.978
Precision: 0.990
Recall: 0.987
F1 Score: 0.989
fourteen = thirteen.append(_conf.tab)
fourteen
Accuracy Precision Recall F1
GODE 0.988414 0.994954 0.992866 0.993909
LOF (Breunig et al., 2000) 0.956053 0.978124 0.975661 0.976891
kNN (Ramaswamy et al., 2000) 0.981622 0.991586 0.989089 0.990336
OCSVM (Sch ̈olkopf et al., 2001) 0.959249 0.979403 0.977759 0.978580
MCD (Hardin and Rocke, 2004) 0.982421 0.992007 0.989509 0.990756
Feature Bagging (Lazarevic and Kumar, 2005) 0.954455 0.977282 0.974822 0.976050
ABOD (Kriegel et al., 2008) 0.979225 0.990324 0.987830 0.989076
Isolation Forest (Liu et al., 2008) 0.791051 0.996795 0.783047 0.877086
HBOS (Goldstein and Dengel, 2012) 0.918897 0.958368 0.956358 0.957362
SOS (Janssens et al., 2012) 0.912105 0.954985 0.952581 0.953782
SO-GAAL (Liu et al., 2019) 0.952058 0.952058 1.000000 0.975440
MO-GAAL (Liu et al., 2019) 0.952058 0.952058 1.000000 0.975440
LSCP (Zhao et al., 2019) 0.978426 0.989903 0.987411 0.988655

Bunny Result

fourteen.round(3)
Accuracy Precision Recall F1
GODE 0.988 0.995 0.993 0.994
LOF (Breunig et al., 2000) 0.956 0.978 0.976 0.977
kNN (Ramaswamy et al., 2000) 0.982 0.992 0.989 0.990
OCSVM (Sch ̈olkopf et al., 2001) 0.959 0.979 0.978 0.979
MCD (Hardin and Rocke, 2004) 0.982 0.992 0.990 0.991
Feature Bagging (Lazarevic and Kumar, 2005) 0.954 0.977 0.975 0.976
ABOD (Kriegel et al., 2008) 0.979 0.990 0.988 0.989
Isolation Forest (Liu et al., 2008) 0.791 0.997 0.783 0.877
HBOS (Goldstein and Dengel, 2012) 0.919 0.958 0.956 0.957
SOS (Janssens et al., 2012) 0.912 0.955 0.953 0.954
SO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
MO-GAAL (Liu et al., 2019) 0.952 0.952 1.000 0.975
LSCP (Zhao et al., 2019) 0.978 0.990 0.987 0.989
bunny_rst = fourteen.round(3)
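As a cross-check on the Conf_matrx helper used throughout, the same four metrics can be recomputed directly with scikit-learn. A minimal sketch, assuming the 1 = inlier / -1 = outlier convention above with inliers (label 1) treated as the positive class, which is consistent with the reported numbers:

from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = outlier_true_one_2   # ground truth: 1 = inlier, -1 = outlier
y_pred = outlier_LSCP_one     # LSCP predictions in the same convention

print(f"Accuracy: {accuracy_score(y_true, y_pred):.3f}")
print(f"Precision: {precision_score(y_true, y_pred, pos_label=1):.3f}")
print(f"Recall: {recall_score(y_true, y_pred, pos_label=1):.3f}")
print(f"F1 Score: {f1_score(y_true, y_pred, pos_label=1):.3f}")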