Reproduced from serverfault.com.

Why am I getting an almost straight line model accuracy curve?

Published 2020-09-20 15:01:49

[Figure: nearly flat model accuracy curves for train and test data]

I've plotted my model's accuracy curve on the train and test data and obtained the curve above, which looks rather unusual. What does this curve indicate? Is it overfitting or underfitting? Can anyone please help me figure out where I am going wrong? I am working on the ABIDE dataset: 871 samples, using the cc400 parcellation, which generated 76636 features.

I have provided the code snippet below:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.callbacks import EarlyStopping

# create model
model = Sequential()

# add model layers (input shape goes on the first layer)
model.add(Dropout(0.2, input_shape=(76636,)))
initializer_relu = tf.keras.initializers.HeUniform()
model.add(Dense(128, activation='relu',
                kernel_initializer=initializer_relu,
                kernel_regularizer=tf.keras.regularizers.l1(0.0001)))
model.add(Dropout(0.2))
model.add(Dense(64, activation='relu',
                kernel_initializer=initializer_relu,
                kernel_regularizer=tf.keras.regularizers.l1(0.0001)))
model.add(Dropout(0.2))
initializer_sigmoid = tf.keras.initializers.GlorotUniform()
model.add(Dense(1, activation='sigmoid',
                kernel_initializer=initializer_sigmoid))

# compile model with binary cross-entropy loss, tracking accuracy
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])

# stop training if validation loss fails to improve for 3 epochs
early_stopping_monitor = EarlyStopping(patience=3)

# train model
history = model.fit(X_train, y_train, validation_data=(X_test, y_test),
                    batch_size=64, epochs=20,
                    callbacks=[early_stopping_monitor])

import matplotlib.pyplot as plt

print(history.history.keys())
# summarize history for accuracy
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='lower right')
plt.show()
Questioner: Faria Zarin Subah 1504027

Answer (Saikrishna dugyala, 2020-12-07):

The accuracy curve is nearly a straight line because the model is not able to learn within 20 epochs. When features have very different ranges of values, the gradients can oscillate back and forth and take a long time before finally finding their way to the global/local minimum. To overcome this, normalize the data: making the different features take on similar ranges of values lets gradient descent converge much more quickly.
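A minimal sketch of that normalization step (standardization to zero mean and unit variance, computed from the training set only so no test statistics leak in; the array names are hypothetical stand-ins for the questioner's `X_train`/`X_test`):

```python
import numpy as np

def standardize(X_train, X_test):
    """Scale features to zero mean and unit variance using
    statistics computed from the training set only."""
    mean = X_train.mean(axis=0)
    std = X_train.std(axis=0)
    std[std == 0] = 1.0  # guard against constant (zero-variance) features
    return (X_train - mean) / std, (X_test - mean) / std

# toy example: two features with very different ranges
X_train = np.array([[1.0, 1000.0],
                    [2.0, 2000.0],
                    [3.0, 3000.0]])
X_test = np.array([[2.0, 1500.0]])

Xtr, Xte = standardize(X_train, X_test)
# each training column now has mean 0 and standard deviation 1
```

In practice `sklearn.preprocessing.StandardScaler` does the same thing (`fit` on the training set, then `transform` both sets).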