Problem with Linear Discriminant Analysis
Posted: Thursday, May 30, 2019, 12:29
Hi,
I'm having trouble with an assignment.
In this assignment you will estimate cognitive states from electroencephalogram (EEG) data. The data matrix X contains 5 selected time windows of EEG activity at 62 electrodes after a visual stimulus was presented on the screen in front of the subject. If the first row of Y is 1, the stimulus was a target stimulus; if the second row of Y is 1, the stimulus was a non-target stimulus.
Train a linear discriminant classifier and compare it with the NCC one.
I'd appreciate any help.
Code:
import pylab as pl
import scipy as sp
from scipy.linalg import eig
from scipy.io import loadmat
from sklearn.model_selection import train_test_split

def load_data(fname):
    # load the data
    data = loadmat(fname)
    # extract images and labels
    X = data['X']
    Y = data['Y']
    # collapse the time-electrode dimensions
    X = sp.reshape(X, (X.shape[0]*X.shape[1], X.shape[2])).T
    # transform the labels to (-1, 1)
    Y = sp.sign((Y[0,:] > 0) - .5)
    return X, Y

X, Y = load_data(fname='bcidata.mat')
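To make sure I understand the label transform in load_data, here is what it does on a toy one-hot Y (the toy matrix is just my own illustration, not the real data):

```python
import numpy as np

# Toy stimulus labels: first row marks targets, second row non-targets.
Y = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1]])

# Same transform as in load_data: map to +1 (target) / -1 (non-target).
labels = np.sign((Y[0, :] > 0) - .5)
print(labels)  # → [ 1. -1.  1. -1.]
```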
Code:
def ncc_fit(X, Y):
    '''
    Train a nearest centroid classifier for N data points in D dimensions
    Input:
        X    N-by-D data matrix
        Y    label vector of length N, labels are -1 or 1
    Output:
        w    weight vector of length D
        b    scalar bias term
    '''
    # class means
    # IMPLEMENT CODE HERE
    mupos = ...
    muneg = ...
    w = ...
    b = (w.dot(mupos) + w.dot(muneg))/2.
    # return the weight vector
    return w, b
X_train, X_test, Y_train, Y_test = train_test_split(X, Y)
w_ncc, b_ncc = ncc_fit(X_train, Y_train)

pl.hist(X_test[Y_test<0, :] @ w_ncc)
pl.hist(X_test[Y_test>0, :] @ w_ncc)
pl.plot([b_ncc, b_ncc], [0, 500], color='k')
pl.xlabel('$Xw_{NCC}$')
pl.legend(('non-target', 'target', '$b_{ncc}$'))
pl.ylim([0, 450])
acc = int((sp.sign(X_test @ w_ncc - b_ncc) == Y_test).mean()*100)
pl.title(f"NCC Acc {acc}%");
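For the NCC part, my current idea is that the weight vector is just the difference of the class means and the bias is the projection of their midpoint. On toy data it looks like this (the name ncc_fit_sketch and the toy arrays are only my own sketch, not the assignment skeleton):

```python
import numpy as np

def ncc_fit_sketch(X, Y):
    # class means: average all rows belonging to each class
    mupos = X[Y > 0, :].mean(axis=0)
    muneg = X[Y < 0, :].mean(axis=0)
    # NCC weight vector: difference of the class means
    w = mupos - muneg
    # bias: projection of the midpoint between the two class means
    b = (w.dot(mupos) + w.dot(muneg)) / 2.
    return w, b

# toy check: two well-separated 2D classes
X_toy = np.array([[2., 0.], [3., 1.], [-2., 0.], [-3., -1.]])
Y_toy = np.array([1., 1., -1., -1.])
w, b = ncc_fit_sketch(X_toy, Y_toy)
print(np.sign(X_toy @ w - b))  # → [ 1.  1. -1. -1.]
```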
Code:
def lda_fit(X, Y):
    '''
    Train a Linear Discriminant Analysis classifier
    Input:
        X    N-by-D data matrix
        Y    label vector of length N, labels are -1 or 1
    Output:
        w    weight vector of length D
        b    scalar bias term
    '''
    # class means
    # IMPLEMENT CODE HERE
    mupos = ...
    muneg = ...
    # D-by-D inter class covariance matrix (signal)
    Sinter = ...
    # D-by-D intra class covariance matrices (noise)
    Sintra = ...
    # solve eigenproblem
    eigvals, eigvecs = sp.linalg.eig(Sinter, Sintra)
    w = eigvecs[:, eigvals.argmax()]
    # bias term
    b = (w.dot(mupos) + w.dot(muneg))/2.
    # return the weight vector
    return w, b
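For the LDA part, my understanding is that Sinter is the outer product of the mean difference (between-class scatter) and Sintra is the sum of the per-class covariances, and then the generalized eigenproblem from the skeleton gives w. Here is a sketch on toy Gaussian blobs (lda_fit_sketch, the sign flip, and the toy data are my own additions, not part of the skeleton):

```python
import numpy as np
from scipy.linalg import eig

def lda_fit_sketch(X, Y):
    # class means
    mupos = X[Y > 0, :].mean(axis=0)
    muneg = X[Y < 0, :].mean(axis=0)
    # between-class scatter: outer product of the mean difference (rank 1)
    d = (mupos - muneg)[:, np.newaxis]
    Sinter = d @ d.T
    # within-class scatter: sum of the per-class covariance matrices
    Sintra = np.cov(X[Y > 0, :].T) + np.cov(X[Y < 0, :].T)
    # generalized eigenproblem; keep eigenvector of the largest eigenvalue
    eigvals, eigvecs = eig(Sinter, Sintra)
    w = np.real(eigvecs[:, np.real(eigvals).argmax()])
    # eigenvector sign is arbitrary; orient w toward the positive class
    if w.dot(mupos - muneg) < 0:
        w = -w
    b = (w.dot(mupos) + w.dot(muneg)) / 2.
    return w, b

# toy check: two Gaussian blobs separated along the first axis
rng = np.random.RandomState(0)
X_toy = np.vstack([rng.randn(100, 2) + [2., 0.],
                   rng.randn(100, 2) + [-2., 0.]])
Y_toy = np.hstack([np.ones(100), -np.ones(100)])
w, b = lda_fit_sketch(X_toy, Y_toy)
acc = (np.sign(X_toy @ w - b) == Y_toy).mean()
```

Does this look like the right way to fill in the scatter matrices, or am I misreading the hints in the skeleton?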
