I'm a little new to optimization in Python, particularly with SciPy. If I only have one function, such as this:

def prod(x):
    return -x[0] * x[1]

then I know how to do the maximization:

import numpy as np
from scipy.optimize import LinearConstraint, minimize

linear_constraint = LinearConstraint([[1, 1]], [20], [20])
x0 = np.array([15, 5])
result = minimize(prod, x0, constraints=linear_constraint)
print(result.x)  # [10. 10.]

This maximizes x[0] * x[1] subject to the constraint x[0] + x[1] = 20. The answer here is array([10., 10.]).

So now I wish to do something more sophisticated: not just optimize one function, but several at once. Suppose I have this matrix P of size n x 3:

A     B     C
3     2     7
6     3     4
8     1     5
...

Is there a way to optimize (minimize) P[i] * x, where * denotes the dot product, for all i = 1..n? In short, I wish to optimize:

3*x[0] + 2*x[1] + 7*x[2]
6*x[0] + 3*x[1] + 4*x[2]
8*x[0] + 1*x[1] + 5*x[2]
...

subject to the constraint x[0] + x[1] + x[2] = 1. Does anyone know how to implement this correctly in Python (I'm using SciPy, by the way)? I'm still not sure where to begin. Thanks in advance!

1 Answer

For standard optimization tools, the objective function must return a scalar; this is called a "single objective". There is a branch of optimization that deals with multiple objectives (under names like vector optimization, multi-objective optimization, and multiple-criteria optimization), and there is a wide literature on it. I suggest doing some reading to get familiar with the concepts, ideas, and algorithms involved.
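To make this concrete, one of the simplest scalarizations is a weighted sum: combine the n linear objectives into a single scalar with nonnegative weights and minimize that with SciPy. This is just a sketch, not the only valid approach; the equal weights, the sample P, and the x >= 0 bounds (added so the linear problem is bounded below) are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, minimize

# Sample data: each row of P defines one linear objective P[i] @ x.
P = np.array([[3, 2, 7],
              [6, 3, 4],
              [8, 1, 5]], dtype=float)
n = P.shape[0]

# Weighted-sum scalarization: combine the n objectives into one scalar
# using nonnegative weights (equal weights here, an arbitrary choice).
w = np.ones(n) / n

def objective(x):
    return w @ (P @ x)  # weighted sum of all the P[i] @ x values

# Constraints: x[0] + x[1] + x[2] = 1, plus x >= 0 so the linear
# objective is bounded below on the feasible set.
constraint = LinearConstraint(np.ones((1, 3)), [1], [1])
bounds = Bounds(0, np.inf)

x0 = np.ones(3) / 3
result = minimize(objective, x0, bounds=bounds, constraints=constraint)
print(result.x)
```

With these particular weights and data the solver puts all the weight on the column whose weighted average is smallest; different weight choices trade the objectives off differently, which is exactly the multi-objective aspect the answer mentions.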

4 Comments

I thought about multiplying this n x 3 matrix by the 3 x 1 vector [x0, x1, x2], which gives an n x 1 vector, and then minimizing the L2 norm of that vector (a scalar) instead of trying to minimize multiple functions all at once. Not sure if that's the right way to do it... Thoughts?
I don't know; that is just a very different problem. You stated a multi-objective problem. If you do min ||Px||, you have a single-objective problem.
Hmmm, but why does it have to be a multi-objective problem? Can't I just frame it as a single-objective problem? Computing Px gives me a vector, and if I want to "minimize" that vector, I'll do so by finding the minimum of the L2 norm of the resulting vector.
Sure. You can redefine the objective to anything you want. It would be one of the possible scalarizations.
