Difference between loss and cost function in the specific context of MAE in multiple-regression?
I've often encountered the Mean Absolute Error (MAE) loss function when dealing with regression problems in artificial neural networks, but I'm still slightly confused about the difference between the terms 'loss' function and 'cost' function in this context.
I understand that the 'cost' function is an average of the 'loss' function values, for instance when dealing with mini-batches: the loss is a single value for a single sample in the mini-batch, and the cost is the mean of the losses over the whole mini-batch.
However, consider a multiple-regression problem where the output of the network is a vector of 5 values and the true label is also a vector of 5 values. In this context, would the function still be called a 'loss' function, or would it now be a 'cost' function? We have to compute the absolute error element-wise for each sample, sum it, and take the mean, but we also have to do the same averaging over every sample in the mini-batch.
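For concreteness, here is how I picture the two averaging steps, as a minimal NumPy sketch with made-up values (the batch size of 4 and the predictions are just for illustration):

```python
import numpy as np

# Hypothetical mini-batch: 4 samples, each with 5 regression targets.
y_true = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
                   [0.5, 1.5, 2.5, 3.5, 4.5],
                   [2.0, 2.0, 2.0, 2.0, 2.0],
                   [1.0, 0.0, 1.0, 0.0, 1.0]])
y_pred = y_true + 0.5  # every predicted value is off by 0.5

# Step 1 -- per-sample MAE: average the absolute errors over the
# 5 output values of each sample.  Shape: (4,)
per_sample_mae = np.mean(np.abs(y_pred - y_true), axis=1)

# Step 2 -- mini-batch value: average the per-sample values over
# the whole batch.  Scalar.
batch_mae = per_sample_mae.mean()

print(per_sample_mae)  # [0.5 0.5 0.5 0.5]
print(batch_mae)       # 0.5
```

My uncertainty is about which of these two steps (or both) earns the name 'loss' and which earns the name 'cost'.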
Topic cost-function loss-function neural-network
Category Data Science