We know that in the case of logistic regression, a classification threshold of p=0.5 is generally not an optimal choice when seeking to optimise sensitivity and specificity, usually because the dataset is unbalanced. To solve this problem, one can simply vary the threshold and pick the one that satisfies a certain criterion (e.g. that maximises sensitivity+specificity, or such that sensitivity=specificity, etc.), as in the sketch below.
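To make the binary case concrete, here is a minimal sketch of that threshold search, using a hypothetical imbalanced dataset and scikit-learn; the data and parameter choices are only for illustration, not part of my actual problem:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

# Hypothetical imbalanced binary data, just to illustrate the threshold search.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
p = model.predict_proba(X)[:, 1]

# roc_curve returns TPR (sensitivity) and FPR (1 - specificity) at every candidate threshold.
fpr, tpr, thresholds = roc_curve(y, p)
j = tpr + (1 - fpr)                       # sensitivity + specificity (Youden's J + 1)
best_threshold = thresholds[np.argmax(j)]  # threshold maximising sensitivity + specificity
y_pred = (p >= best_threshold).astype(int)
```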
However, in the case of multinomial logistic regression (with 3 or more classes), I find much less literature on robust decision rules. All software I know classifies using the maximum a posteriori rule, i.e. assigning each observation to the class with the highest posterior probability, but I am not satisfied with this solution, in the same way that p=0.5 is rarely satisfactory in the binary case. I imagine it is much more difficult with 3 or more classes, as one can put more emphasis on the sensitivity/specificity of a particular class, which was not an issue in the two-class case.
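For clarity, this is the default behaviour I mean, again on hypothetical 3-class data (the dataset and settings are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical unbalanced 3-class data.
X, y = make_classification(n_samples=3000, n_classes=3, n_informative=6,
                           weights=[0.8, 0.15, 0.05], random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)

posteriors = clf.predict_proba(X)                     # shape (n_samples, 3)
y_map = clf.classes_[posteriors.argmax(axis=1)]       # maximum a posteriori, same as clf.predict(X)
```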
So let's say I have 3 classes (or N classes) and an unbalanced dataset, and I don't favour any class (i.e. I give equal weight to the sensitivity and specificity of each class): how should I make my decisions from the posterior probabilities returned by the multinomial logistic regression?