Problems with Linear Discriminant Analysis Classifier

I wrote two functions for determining a linear discriminant classifier on an EEG data set. The data set consists of preprocessed EEG data of shape 5×62×5322 and stimulus labels of shape 2×5322, recorded during a copy-spelling paradigm with a P300 speller. The data matrix X contains 5 selected time windows of EEG activity at 62 electrodes after a visual stimulus was presented on the screen in front of the subject. If the first row of the label matrix Y is 1, the stimulus was a target stimulus; if the second row of Y is 1, the stimulus was a non-target stimulus. The first function returns the weight vector and the bias term. The second function is a graph class to show the result.

import numpy as np
import scipy as sp
import scipy.linalg

def lda_fit(X, Y):
    # class means
    unique_classes = np.unique(Y)
    mu = np.zeros((len(unique_classes), X.shape[1]))
    for i, name in enumerate(unique_classes):
        mu[i, :] = X[Y == name, :].mean(axis=0)

    mupos = mu[1]
    muneg = mu[0]
    mupos = mupos.reshape(155, 2)
    muneg = muneg.reshape(155, 2)
    Xneu = X[0].reshape(155, 2)

    # D-by-D inter-class covariance matrix (signal)
    Sinter = np.dot((muneg - mupos), (muneg - mupos).T)

    # D-by-D intra-class covariance matrices (noise)
    Sintra = np.dot((Xneu - mupos), (Xneu - mupos).T) + np.dot((Xneu - muneg), (Xneu - muneg).T)

    # solve the generalized eigenproblem
    eigvals, eigvecs = sp.linalg.eig(Sinter, Sintra)
    w = eigvecs[:, eigvals.argmax()]
    # bias term
    b = (w.dot(mupos) + w.dot(muneg)) / 2.
    # return the weight vector and the bias term
    return w, b

I get the following error: ValueError: matmul: Input operand 1 has a mismatch in its core dimension 0, with gufunc signature (n?,k),(k,m?)->(n?,m?) (size 155 is different from 310)

I know it has something to do with the shapes of the matrices, but I'm really stuck.



As you say, it has something to do with the shapes of the matrices. In this case, you have matrices of shape (155, 2), whereas the operation expects them to be of shape (310, 1). Note that 310 = 5 time windows × 62 electrodes, so reshaping to (155, 2) splits each feature vector in half.
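You can reproduce this kind of mismatch in isolation. A minimal, hypothetical example (the array contents are placeholders; only the shapes matter):

import numpy as np

w = np.ones(155)   # e.g. a weight vector derived from the reshaped (155, 2) matrices
x = np.ones(310)   # e.g. a full feature vector: 5 time windows * 62 electrodes
w @ x              # raises ValueError: matmul core dimension mismatch (155 vs. 310)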

Try looking at the shape of each matrix as you transform it (tip: insert a print statement wherever a transformation occurs) and make sure everything stays consistent.
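For instance, here is a minimal sketch of that kind of shape tracing, assuming the data has already been flattened to one 310-dimensional feature vector per stimulus (the zeros array is just a stand-in for your data):

import numpy as np

X = np.zeros((5322, 310))      # stand-in for the flattened EEG data
mu = X.mean(axis=0)
print(mu.shape)                # (310,) -- one value per window/electrode pair
mu2 = mu.reshape(155, 2)
print(mu2.shape)               # (155, 2) -- this is where consistency is lost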

Also, can you provide us with more information, such as exactly where the error occurs? It looks like an issue with one of the dot products.
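In the meantime, here is a minimal sketch of how the fit could be written with consistent shapes. It assumes X has already been flattened to (5322, 310), i.e. one row per stimulus, and that Y is a 1-D label vector; the name lda_fit_fixed and that preprocessing are my assumptions, not necessarily your setup:

import numpy as np
import scipy.linalg

def lda_fit_fixed(X, Y):
    # X: (n_samples, n_features), e.g. (5322, 310); Y: (n_samples,)
    classes = np.unique(Y)
    muneg = X[Y == classes[0]].mean(axis=0)   # (310,)
    mupos = X[Y == classes[1]].mean(axis=0)   # (310,)

    # inter-class scatter (signal): outer product of the mean difference, (310, 310)
    d = mupos - muneg
    Sinter = np.outer(d, d)

    # intra-class scatter (noise): summed over both classes, (310, 310)
    Xneg = X[Y == classes[0]] - muneg
    Xpos = X[Y == classes[1]] - mupos
    Sintra = Xneg.T @ Xneg + Xpos.T @ Xpos

    # generalized eigenproblem: maximize w' Sinter w / w' Sintra w
    eigvals, eigvecs = scipy.linalg.eig(Sinter, Sintra)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])   # (310,)

    # bias: project the midpoint between the class means
    b = w.dot((mupos + muneg) / 2.0)
    return w, b

Every product here pairs a (310,) vector or (310, 310) matrix with matching dimensions, so the core-dimension mismatch cannot occur.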
