
Matlab offers multiple algorithms for solving linear programs (LPs). For example, Matlab R2012b offers: 'active-set', 'trust-region-reflective', 'interior-point', 'interior-point-convex', 'levenberg-marquardt', 'trust-region-dogleg', 'lm-line-search', or 'sqp'.

But other versions of Matlab support different algorithms.

I would like to run a loop over all algorithms that are supported by the user's Matlab version, and I would like them to be tried in the order that Matlab recommends them.

I would like to implement something like this:

i = 1;
x = [];
while isempty(x)
    options = optimset(options, 'Algorithm', Here_I_need_a_list_of_Algorithms{i});
    x = linprog(f, A, b, Aeq, beq, lb, ub, x0, options);
    i = i + 1;   % try the next algorithm if this one returned empty
end

In 99% of cases this code should be equivalent to

x = linprog(f,A,b,Aeq,beq,lb,ub,x0,options);

but sometimes the algorithm returns an empty array because of numerical problems (exitflag -4). If there is a chance that one of the other algorithms can find a solution, I would like to try them too.
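A variant of the loop above that retries only on the numerical-failure exit flag might look like this (a sketch: the cell array algos and the problem data f, A, b, Aeq, beq, lb, ub, x0 are assumed to be defined, with algos ordered by preference):

```matlab
% Try each algorithm in turn; stop at the first one that converges.
x = [];
for i = 1:numel(algos)
    options = optimset('Algorithm', algos{i});
    [x, ~, exitflag] = linprog(f, A, b, Aeq, beq, lb, ub, x0, options);
    if exitflag == 1
        break;              % converged to a solution
    elseif exitflag ~= -4
        x = [];
        break;              % failed for a non-numerical reason: stop retrying
    end
    x = [];                 % exitflag -4: numerical trouble, try the next algorithm
end
```

Checking exitflag directly distinguishes "numerical trouble, worth retrying" from genuine infeasibility, where retrying other algorithms would just waste time.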

So my question is: Is there a way to automatically get a list of all linprog algorithms supported by the installed Matlab version, ordered as Matlab recommends them?

I think looping through all algorithms can make sense in other scenarios too. For example, when you need very precise results and have a lot of time, you could run them all and then evaluate which gives the best result. Or you could loop through all algorithms to find out which one is best for LPs with a certain structure.

  • I'm not that familiar with Matlab, but a quick check shows only dual-simplex and two interior-point algorithms for me, which I expected (are you really talking about LPs?). Of course most of the nonlinear approaches can solve LPs too, but I would not recommend that. I don't know how good those implementations are, but LPs should not be too troublesome (except for very unstable formulations). I would try the dual-simplex and interior-point algorithms and ignore everything else! (This also simplifies reasoning about accuracy.) Commented Jun 5, 2017 at 0:57
  • @sascha unfortunately dual-simplex isn't implemented in my old Matlab R2012b. But you are right that dual-simplex is the default in the current Matlab R2017a. And you are right that for LPs only large-scale interior-point, medium-scale active set, and medium-scale simplex are useful options, because some of the algorithms mentioned in my original question aren't LP solvers. Commented Jun 5, 2017 at 10:19

1 Answer


There's no automatic way to do this as far as I know. If you really want to do it, the easiest thing to do would be to go to the online documentation, and check through previous versions (online documentation is available for old versions, not just the most recent release), and construct some variables like this:

r2012balgos = {'active-set', 'trust-region-reflective', 'interior-point', 'interior-point-convex', 'levenberg-marquardt', 'trust-region-dogleg', 'lm-line-search', 'sqp'};

...

r2017aalgos = {...};

v = ver('matlab');

switch v.Release
    case '(R2012b)'
        algos = r2012balgos;
    ...
    case '(R2017a)'
        algos = r2017aalgos;
    otherwise
        error('Unsupported Matlab release: %s', v.Release);
end

% loop through each of the algorithms
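That final loop could then be something like the following sketch (algos comes from the switch above; the linprog inputs f, A, b, Aeq, beq, lb, ub, x0 are assumed to be defined by the caller):

```matlab
% Try the algorithms in recommended order until one returns a solution.
x = [];
for i = 1:numel(algos)
    options = optimset('Algorithm', algos{i});
    x = linprog(f, A, b, Aeq, beq, lb, ub, x0, options);
    if ~isempty(x)
        break;   % first algorithm that succeeds wins
    end
end
```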

Seems boring, but it should only take you about 30 minutes.

There's a reason MathWorks doesn't make this as easy as you might hope, though: what you're asking for isn't a great idea.

It is possible to construct artificial problems where one algorithm finds a solution and the others don't. But in practice, if the recommended algorithm doesn't find a solution, that usually doesn't mean you should switch algorithms; it means your problem wasn't well formulated, and you should consider modifying it, perhaps by changing some constraints or reformulating the objective function.

And after all, why stop with just looping through the alternative algorithms? Why not also loop through lots of values for other options such as constraint tolerances, optimality tolerances, maximum number of function evaluations, etc.? These may have just as much likelihood of affecting things as a choice of algorithm. And soon you're running an optimisation algorithm to search through the space of meta-parameters for your original optimisation.

That's not a great plan. It's probably better to just choose one of the recommended algorithms, stick with it, and if things don't work out, focus on improving your formulation of the problem rather than over-tweaking the optimisation itself.


3 Comments

In my algorithm the LPs are formulated in a way that they theoretically must have a solution. But in less than 1% of cases I get exitflag -4: Exiting: cannot converge because the primal residual, dual residual, or upper-bound feasibility is NaN. (Of course I checked that there are no NaNs in the solver input.) The only thing I found about exitflag -4 on the internet is de.mathworks.com/matlabcentral/answers/… where it appears to be a bug in the Matlab solver. Can exitflag -4 be avoided by just changing parameters?
If you have exit flag -4, that means you're getting NaNs somewhere in the optimisation, and it can't continue. Sure, that could possibly be a bug, but it's unlikely and there's nothing in the link you mention to indicate that. More likely is that you've formulated your problem in such a way that the NaN is correct at some points. Maybe changing parameters would mean that those points were avoided. Or you could reformulate your problem so that the NaNs didn't appear.
To help diagnose things, why not try setting the Display option to iter-detailed, so that you can see exactly what's happening at every iteration. I would guess you'll find that the NaN was right, and this will help you to reorganise things so that they don't continue to occur.
