$\begingroup$

Briefly, I have the following problem:
$$\min_{\bar x} \ \sum_{i = 0}^n a_i \left( \max\left[\, F_i(\bar x),\ 0 \,\right] \right)^2 \quad \text{s.t.} \quad A \bar x \leq b,$$
where each $F_i(\bar x)$ is a linear function, $a_i > 0$, and $n$ is huge compared to the size of $\bar x$.

It is possible to write an equivalent Quadratic Programming (QP) problem, such as

$$\min_{\bar x,\, G} \ \sum_{i=0}^n a_i \, G_i^2 \quad \text{s.t.} \quad G_i \geq 0, \quad G_i \geq F_i(\bar x), \quad i = 0, \dots, n, \qquad A \bar x \leq b,$$

which can be solved very efficiently with an appropriate numerical method.
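
For concreteness, here is a minimal sketch of this epigraph reformulation in Python with cvxpy, assuming $F_i(\bar x) = B_i \bar x + c_i$; the data `B`, `c`, `a`, `A`, `b` below are hypothetical placeholders, not the actual problem data:

```python
import numpy as np
import cvxpy as cp

# Hypothetical data: F_i(x) = B[i] @ x + c[i], weights a_i > 0, constraints A @ x <= b.
n_terms, dim = 1000, 10
rng = np.random.default_rng(0)
B = rng.standard_normal((n_terms, dim))
c = rng.standard_normal(n_terms)
a = np.ones(n_terms)
A = rng.standard_normal((5, dim))
b = rng.standard_normal(5)

x = cp.Variable(dim)
G = cp.Variable(n_terms)          # one epigraph variable per F_i

objective = cp.Minimize(cp.sum(cp.multiply(a, cp.square(G))))
constraints = [G >= 0, G >= B @ x + c, A @ x <= b]
cp.Problem(objective, constraints).solve()
```

(cvxpy also accepts the max directly, e.g. `cp.square(cp.pos(B @ x + c))`, although the canonicalized problem handed to the solver is essentially this same epigraph form.)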

Unfortunately, in my particular case this conversion does not work: it adds a large number of new constraints, and the otherwise appropriate numerical method fails to converge.

I tried to find another equivalent QP formulation that adds fewer new constraints, but nothing came to mind. Is there another way?

$\endgroup$
  • $\begingroup$ If you are talking about $\min[\max(F( \bar x), 0)]^2$ where $\bar{x}$ is unconstrained, you don't need any software to do it: as $F$ is linear, the answer is zero, and the global minimum is attained at $\bar{x}=0$. If there are additional constraints, you may try to specify them in your question. $\endgroup$ Commented Dec 7, 2012 at 22:27
  • $\begingroup$ @user1551 There are standard linear constraints. I edited the question. $\endgroup$ Commented Dec 7, 2012 at 22:47

2 Answers

$\begingroup$

This is not really an answer. I just want to say that your optimization problem can be converted into a linear programming problem:

$\min F(\bar{x})$ subject to $A\bar{x}\le b$.

If the minimum found is $m$ and the minimizer is $x_0$, then the minimum for your original problem is $\max(m,0)^2$ and the minimizer is $x_0$.

Reason: If $F(x_0)=m<0$, then $\max(F(x_0), 0)^2=\max(m, 0)^2=0$, which is the least possible value of $\max(F(\bar{x}), 0)^2$ over the whole space. Hence $x_0$ is a feasible and global minimizer.

If $F(x_0)=m\ge0$, then $F(\bar{x})\ge F(x_0)\ge0$ for every $\bar{x}\in D=\{\bar{x}: A\bar{x}\le b\}$. Hence $\max(F(\bar{x}), 0) = F(\bar{x})\ge0$ for every $\bar{x}\in D$. Therefore $$\min_{\bar{x}\in D} \max(F(\bar{x}), 0)^2 = \min_{\bar{x}\in D} F(\bar{x})^2 = \left(\min_{\bar{x}\in D} F(\bar{x})\right)^2 = m^2 = \max(m,0)^2.$$
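
A short sketch of this recipe for the single-$F$ case in Python with cvxpy, assuming $F(\bar x) = c^\top \bar x$ and using hypothetical placeholder data:

```python
import numpy as np
import cvxpy as cp

# Hypothetical data for the single-F case: F(x) = c @ x, feasible set A @ x <= b.
dim = 10
rng = np.random.default_rng(0)
c = rng.standard_normal(dim)
A = rng.standard_normal((5, dim))
b = rng.standard_normal(5)

x = cp.Variable(dim)
lp = cp.Problem(cp.Minimize(c @ x), [A @ x <= b])
lp.solve()

m = lp.value                  # minimum of F over {x : A x <= b}
x0 = x.value                  # an LP minimizer
value = max(m, 0.0) ** 2      # optimal value of max(F(x), 0)^2 over the same set
```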

$\endgroup$
  • $\begingroup$ Yes, that's right. I apologize for not being specific enough. Please see the edited question. $\endgroup$ Commented Dec 8, 2012 at 14:14
  • $\begingroup$ The original problem has multiple $F_i$. I don’t think this reformulation works in general. $\endgroup$ Commented Aug 22, 2024 at 15:10
  • $\begingroup$ @goldboy I don’t remember this answer, but according to the timeline, the OP had changed his problem setting after I gave this answer. $\endgroup$ Commented Aug 22, 2024 at 15:34
$\begingroup$

Let $F(x) = Bx$, where the rows of $B$ collect the $F_i$. Test whether $Bx \leq 0,\ Ax \leq b$ is feasible. If so, you are done: any feasible solution is an optimal solution, since the objective is zero there.

If not, go with your epigraph reformulation. Give it to a convex QP solver. You said a particular algorithm didn’t converge. Can you describe the details of that algorithm?
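
A sketch of this two-stage approach in Python with cvxpy, using hypothetical placeholder data and stacking the linear terms as $F(x) = Bx$:

```python
import numpy as np
import cvxpy as cp

# Hypothetical data: stacked linear terms F(x) = B @ x, constraints A @ x <= b.
dim = 10
rng = np.random.default_rng(0)
B = rng.standard_normal((1000, dim))
A = rng.standard_normal((5, dim))
b = rng.standard_normal(5)

x = cp.Variable(dim)

# Stage 1: feasibility test for B x <= 0, A x <= b (constant objective).
feas = cp.Problem(cp.Minimize(0), [B @ x <= 0, A @ x <= b])
feas.solve()

if feas.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE):
    x_opt = x.value   # objective value is 0 at this point, so it is optimal
else:
    # Stage 2: fall back to the epigraph QP (weights a_i omitted for brevity).
    G = cp.Variable(B.shape[0])
    qp = cp.Problem(cp.Minimize(cp.sum_squares(G)),
                    [G >= 0, G >= B @ x, A @ x <= b])
    qp.solve()
    x_opt = x.value
```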

$\endgroup$
