Intel® oneAPI Math Kernel Library
Ask questions and share information with other developers who use Intel® Math Kernel Library.

Reformulating a constrained minimization problem into a constrained nonlinear least square problem

tennican
Beginner
1,087 Views
I'd like to find the values of three parameters, each constrained to be between 0 and 1, which minimize the value of a function. In R, the optim function would use the L-BFGS-B algorithm for this. However, I've read in the forum that MKL doesn't have an equivalent general-purpose optimization function. I've also read that some problems can be reformulated so that the dtrnlspbc_* functions may be used. Does anyone know whether my problem can be reformulated so that I can use this MKL solver?

thanks, Scott
7 Replies
mecej4
Honored Contributor III
If the objective function has the form

F = f^2 + g^2 + h^2 + ...

the answer is yes.
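To make that concrete, here is a small hedged sketch (toy residuals f, g, h that are made up for illustration, not the poster's problem) of the equivalence a least-squares solver exploits: the scalar objective F is exactly the sum of squares of a residual vector, and it is the residual vector that routines such as the dtrnlspbc_* family expect you to evaluate.

```python
def objective(x):
    # Scalar form, F = f^2 + g^2 + h^2, as a general optimizer would see it
    f = x[0] - 0.3
    g = x[1] - 0.7
    h = x[0] * x[1] - 0.2
    return f * f + g * g + h * h

def residuals(x):
    # Vector form (f, g, h) that a nonlinear least-squares solver works on
    return [x[0] - 0.3, x[1] - 0.7, x[0] * x[1] - 0.2]

# Identity: objective(x) == sum of squared residuals, for any x
```

Exposing the residual vector instead of the scalar lets the solver use the Jacobian structure of the individual terms, which is what makes the least-squares reformulation worthwhile.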
tennican
Beginner
Actually, that is the case.
I'd like to find the smoothing parameters which minimize the sum of the squared errors of a triple exponential smoothing fit. So, F is just the sum of the squared differences between the true values and the fitted values. But the fitted values are computed iteratively over the time series by updating the level, trend and season components.

How would I formulate this problem to use the solver?

thanks, Scott
mecej4
Honored Contributor III
The MKL ?trnlspbc_xxxxx routines are the only routines that address this type of problem, as far as I am aware. However, they can only handle bound constraints on your smoothing parameters. Therefore, a constraint such as

(x_1 + x_2 + ... + x_N) / N = x̄

(where there are N observations x_i whose mean x̄ appears in the smoothing expression) cannot be handled directly. You can try to recast your problem into bound-constrained form; if you post details of your smoothing function, I'll try to help.

Should it prove impossible to recast the problem as a bound-constrained least-squares problem, you will need a solver other than the one in MKL. See the very useful Decision Tree for Optimization Software by Professor H. D. Mittelmann.
tennican
Beginner
I don't think this is going to work.
Since the nonlinear least-squares problem is:
min(sum(Fi(x)^2)) where i=1..m and x=(x1,..,xn) and m >= n
where the constraints are on Fi(x)

My problem could be:
min(sum(Ft(alpha,beta,gamma)^2)) where t=1..T and x=(alpha,beta,gamma) and T >= 3
where Ft returns the residual for period t
But my constraints are 0 <= alpha, beta, gamma <= 1, when I would need constraints on F instead.
And, I don't see any way to reformulate the constraints in terms of F.

Plus, the number of periods T=m could be large and I don't think this solver is intended to handle large m.

Thanks for the pointer to Decision Tree site.
I'll find something outside of MKL.
mecej4
Honored Contributor III
> when I need constaints on F instead.

Why do you say that? Your constraints are bounds (-∞, 1) on α and (0, +∞) on β and γ. These should work fine, with something such as 1D20 used in place of ∞.

The MKL routine should handle expressions with a few hundred terms without running into problems.
tennican
Beginner
Nope, the constraints have to be on F because that's the way the method works and my constraints are on X.
There is no way to derive constraints on F from the constraints on X because F isn't that simple.
Anyway, I've implemented constrained multivariate steepest descent which seems to work fine for now.
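For completeness, a minimal sketch of what such a method might look like (illustrative only, assuming a forward-difference gradient and simple clipping of each iterate onto the box; this is not the poster's actual implementation):

```python
def projected_steepest_descent(sse, x0, lb, ub, step=0.01, iters=2000):
    """Minimize sse(x) subject to lb[i] <= x[i] <= ub[i] by steepest
    descent, projecting each iterate back into the box by clipping."""
    x = list(x0)
    n = len(x)
    h = 1e-6
    for _ in range(iters):
        fx = sse(x)
        grad = []
        for i in range(n):
            # Forward-difference estimate of the i-th partial derivative
            xp = list(x)
            xp[i] += h
            grad.append((sse(xp) - fx) / h)
        # Gradient step, then clip each coordinate into [lb[i], ub[i]]
        x = [min(ub[i], max(lb[i], x[i] - step * grad[i])) for i in range(n)]
    return x
```

With a fixed step size this is slow on badly scaled problems, which is consistent with the remark below that steepest descent is inefficient; a line search or a quasi-Newton update would typically converge far faster.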

What I really want is for MKL to implement some general optimization.
In particular, I would like L-BFGS-B.
mecej4
Honored Contributor III
> Nope, the constraints have to be on F because that's the way the method works and my constraints are on X.

I cannot agree or disagree with that statement because it makes no sense to me. Which "method" do you mean here?

Before you made the claim in #4, there was nothing stated to indicate that you had constraints on Fi. In the later posts, I see a statement that such constraints exist, but no supporting evidence or examples of such constraints.

For each equality constraint on a function of a single Fi, you can find one or more equality constraints on that Fi, and that Fi can then be taken out of the objective expression.

L-BFGS is an algorithm for unconstrained optimization.

Steepest descent is a very inefficient algorithm for unconstrained minimization.