
For example, suppose we want to maximize the 3 expressions shown in the image below, subject to some constraints.

To give some context, this is a problem about generating prototypes in unsupervised learning. In our example, we have 3 collinear centroids (representing 3 different classes) positioned at (1,1), (7,1), and (9,1) in the 2D plane, with a line through all of them. The two centroids at the ends of the line are called prototypes, and they carry the information (the vectors of a's and b's).

To determine which class a random point on the plane belongs to, you calculate 3 values: $\frac{a_i}{\text{distance from point to Prototype 1}}+\frac{b_i}{\text{distance from point to Prototype 2}}$ for $i=1,2,3$; the index $i$ of the largest value gives you the class. For example, if the value for $i=2$ is the largest, then the point belongs to the class at (7,1). (We order the classes from left to right, so the 1st class is at (1,1), the 2nd at (7,1), and the 3rd at (9,1).)
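The classification rule above can be sketched in a few lines. This is a minimal sketch: the prototype positions come from the question, but the membership vectors `A` and `B` are hypothetical placeholder values (the real ones would come from solving the optimization problem).

```python
import math

# Hypothetical setup: the two end centroids act as prototypes (from the
# question); the membership vectors A (a_i) and B (b_i) are made-up values
# chosen only to illustrate the rule.
P1, P2 = (1.0, 1.0), (9.0, 1.0)
A = [0.8, 0.5, 0.2]   # a_i: weight on Prototype 1, per class i
B = [0.2, 0.5, 0.8]   # b_i: weight on Prototype 2, per class i

def classify(point):
    """Return the 1-indexed class i with the largest value a_i/d1 + b_i/d2."""
    d1 = math.dist(point, P1)
    d2 = math.dist(point, P2)
    scores = [A[i] / d1 + B[i] / d2 for i in range(3)]
    return max(range(3), key=lambda i: scores[i]) + 1

print(classify((1.05, 1.0)))  # a point very close to Prototype 1
print(classify((8.95, 1.0)))  # a point very close to Prototype 2
```

Near a prototype the term with the small distance dominates, so the class with the largest corresponding weight wins.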

Essentially, what the 3 objective functions say is that points lying arbitrarily close to a centroid must hold a strong membership in that class (the epsilons are very small values). The equality constraints say that the midpoint between each pair of adjacent centroids must hold equal membership in its two adjacent classes. Finally, the last line of constraints just requires that the a's and b's are probabilities.

[Image: the 3 objective functions and the constraints]

  • So $\epsilon$ is just one fixed small positive constant, not to be changed while optimizing? — Aug 25, 2022 at 6:37
  • Yes. We're just considering a point in the vicinity of a centroid. For example, if we consider the centroid $(1,1)$, a point arbitrarily close to it would have a distance $0+\epsilon$. This way, we also don't divide by zero. — Aug 25, 2022 at 14:33

1 Answer


You already realized that, if there were only one objective function, this would be a linear-programming problem in the $a_i, b_i$. But in general, multiple objective functions need not share the same solutions.

So first, you could check whether you are lucky and they have a common optimum. Second, you could consider combining your multiple objective functions into a single one, e.g. by optimizing their sum or something similar. If that doesn't fit your needs either, you might want to have a look at Pareto optimization, which deals with what is achievable in multi-objective optimization.
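The weighted-sum route can be sketched as an ordinary LP. This is a hedged reconstruction: the exact coefficients live in the question's image, so the numbers below are derived from the prose alone (prototypes at (1,1) and (9,1), centroids at (1,1), (7,1), (9,1)), and the value of `eps` and the equal weights are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

eps = 0.01
# Variables: x = [a1, b1, a2, b2, a3, b3]

# Objective i: a_i/d(point_i, P1) + b_i/d(point_i, P2), evaluated at a point
# within eps of centroid i. All distances are along the line y = 1.
obj = np.array([
    [1/eps, 1/8,  0,    0,    0,    0    ],  # near centroid 1 at (1,1)
    [0,     0,    1/6,  1/2,  0,    0    ],  # at centroid 2 at (7,1)
    [0,     0,    0,    0,    1/8,  1/eps],  # near centroid 3 at (9,1)
])
weights = np.array([1.0, 1.0, 1.0])  # assumed equal importance
c = -(weights @ obj)                 # linprog minimizes, so negate

A_eq = np.array([
    [1/3, 1/5, -1/3, -1/5, 0,    0   ],  # midpoint (4,1): classes 1 and 2 tie
    [0,   0,   1/7,  1.0,  -1/7, -1.0],  # midpoint (8,1): classes 2 and 3 tie
    [1,   1,   0,    0,    0,    0   ],  # a1 + b1 = 1
    [0,   0,   1,    1,    0,    0   ],  # a2 + b2 = 1
    [0,   0,   0,    0,    1,    1   ],  # a3 + b3 = 1
])
b_eq = np.array([0, 0, 1, 1, 1])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 6)
print(res.x)
```

Note that with $a_i + b_i = 1$, the two midpoint equalities here force $a_1 = a_2 = a_3$, so the weighted sum only picks the best common value; changing the weights moves the trade-off between the three objectives.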

  • Just to add to @frank's response — one pretty easy-to-use method of multi-objective optimization is the multi-objective genetic algorithm, which is implemented in a range of languages and doesn't require a huge amount of theoretical knowledge. Not saying it's the best thing ever for this task, but it could be a starting point. That said, if you care about all your clusters equally (or you know the weights), using the (weighted) sum of the 3 objective functions as a single one may be good enough. — Aug 25, 2022 at 16:19
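The Pareto idea behind such multi-objective methods can be sketched without any library: sample candidate membership vectors, evaluate all three objectives, and keep only the non-dominated candidates. The numbers follow the question's geometry (prototypes at (1,1) and (9,1)); `eps`, the sampling scheme, and the decision to ignore the midpoint equality constraints for brevity are all assumptions of this sketch.

```python
import random

random.seed(0)
eps = 0.01

def objectives(a):
    """The three membership objectives for a = (a1, a2, a3), with b_i = 1 - a_i."""
    b = [1 - ai for ai in a]
    return (a[0] / eps + b[0] / 8,    # near centroid 1 at (1,1)
            a[1] / 6   + b[1] / 2,    # at centroid 2 at (7,1)
            a[2] / 8   + b[2] / eps)  # near centroid 3 at (9,1)

def dominates(u, v):
    """u dominates v if u >= v in every objective and > in at least one (maximizing)."""
    return all(x >= y for x, y in zip(u, v)) and any(x > y for x, y in zip(u, v))

candidates = [tuple(random.random() for _ in range(3)) for _ in range(200)]
scored = [(a, objectives(a)) for a in candidates]
front = [(a, f) for a, f in scored
         if not any(dominates(g, f) for _, g in scored)]
print(len(front), "non-dominated candidates out of", len(scored))
```

A genetic algorithm such as NSGA-II does essentially this, but evolves the candidate set toward the front instead of sampling blindly.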
