Beginner

Function minimization subroutine

I was wondering if MKL has a subroutine like "dtrnlsp", but one that works not with least squares but with an external function, and finds the minimum point of that function. I've searched the manual and couldn't find anything like this.

If someone knows of one, or has even used dtrnlsp with a modified external function to do this, could you please tell me? Thanks.

5 Replies
Beginner

You can use that solver for your purpose. In fact, you have to provide an "external" function anyway. There is a ready-to-run example in MKL\Examples\Solver. The least-squares calculation occurs inside the solver.

Best regards, tj

Moderator

Yes, the \Examples\Solver directory contains 6 examples (C and Fortran API) of nonlinear least-squares problems with and without boundary constraints. See, as an example, ex_nlsqp_bc_c.c and ex_nlsqp_bc_c_x.c.

Beginner

Sorry guys, I wrote that really badly, so let me describe it better now. Suppose I want to find the minimum of an external function provided by me. I want something that just finds the minimum point of a generic function, f(x) = x^2 for example. The nlsqp subroutines only do this with least-squares calculations, and I have to provide the y(x) in || F(x) - y(x) ||. For a generic function I don't have a y(x). Is there a subroutine in MKL that does this?

Valued Contributor I

This one you wrote a bit better :-), but it is still not entirely clear what you do have and what you don't.

If you mean that you have a function but do not know its Jacobian matrix (required for least squares), then MKL comes with the djacobi function, which computes it for you using a finite-difference method.
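To illustrate what a finite-difference Jacobian is (this is a self-contained Python sketch of the central-difference idea, not the MKL djacobi API — the function and names here are illustrative):

```python
def jacobian_fd(F, x, h=1e-6):
    """Central-difference approximation of the Jacobian of F: R^n -> R^m at x."""
    m = len(F(x))
    n = len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp, xm = list(x), list(x)
        xp[j] += h  # perturb the j-th coordinate forward
        xm[j] -= h  # and backward
        Fp, Fm = F(xp), F(xm)
        for i in range(m):
            J[i][j] = (Fp[i] - Fm[i]) / (2 * h)
    return J

# Example: F(x) = (x0^2, x0*x1) has exact Jacobian [[2*x0, 0], [x1, x0]]
F = lambda x: [x[0] ** 2, x[0] * x[1]]
J = jacobian_fd(F, [3.0, 2.0])  # close to [[6, 0], [2, 3]]
```

Central differences give second-order accuracy in the step h; djacobi performs the analogous perturbations internally on behalf of the solver.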

In general, all optimization algorithms provided within MKL are gradient methods, which means you need to calculate the derivative (gradient) in order to obtain the next step and, finally, the solution.
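To make the "gradient method" point concrete, here is a toy gradient-descent loop on f(x) = x^2 (a generic sketch, not an MKL routine): each step moves opposite the derivative, which is exactly the information a gradient method cannot do without.

```python
def grad_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a 1-D function given only its derivative `grad`."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step against the gradient
    return x

# f(x) = x^2 has derivative f'(x) = 2x; the minimum is at x = 0
x_min = grad_descent(lambda x: 2 * x, x0=5.0)
```

With lr=0.1 each step multiplies x by 0.8, so x_min converges geometrically to 0.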

If you mean that you want MKL to minimize a function using only its value(s) (without needing to know the function's gradient, which I guess is what you had in mind), then the answer is no.

There are non-gradient methods capable of finding a solution in that situation, the most popular being genetic algorithms (a class of evolutionary algorithms), but you will need to look for those elsewhere.
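As a flavor of the derivative-free methods mentioned above, here is a minimal (1+1) evolution strategy sketch that uses only function values: mutate the current point, keep the mutant if it is better, and shrink the step size on failure. This is an illustration of the general idea, not code from MKL or any particular library.

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, steps=500, seed=0):
    """Minimize f using only function evaluations (no gradient)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(steps):
        cand = x + rng.gauss(0.0, sigma)  # mutate the current point
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc   # accept improvement
        else:
            sigma *= 0.99      # contract the search radius on failure
    return x

# Minimize f(x) = x^2 starting from x = 5, using only values of f
x_min = one_plus_one_es(lambda x: x * x, 5.0)
```

Such methods trade the fast local convergence of gradient solvers for the ability to work on black-box functions.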

A.

Employee

Hi Renepreto,

MKL provides solvers for nonlinear least-squares problems with and without boundary constraints, but it doesn't provide solvers for general function minimization.

BTW, in some cases it's possible to reformulate the function-minimization problem as a nonlinear least-squares problem; for example, your generic function could be reformulated by taking y(x) == 0.
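For the f(x) = x^2 example this reformulation is exact: minimizing x^2 is the same as minimizing ||F(x) - y(x)||^2 with residual F(x) = x and target y(x) == 0, which is the form a least-squares solver accepts. A one-variable Gauss-Newton sketch in Python (illustrative only, not MKL calls) shows the mechanics:

```python
def gauss_newton_1d(F, dF, x0, steps=20):
    """Minimize F(x)^2 by Gauss-Newton: x <- x - F(x)/F'(x) for scalar F."""
    x = x0
    for _ in range(steps):
        x = x - F(x) / dF(x)
    return x

# minimize f(x) = x^2  <=>  least squares with residual F(x) = x, target y = 0
x_min = gauss_newton_1d(lambda x: x, lambda x: 1.0, x0=5.0)
```

Because F is linear here, a single Gauss-Newton step lands exactly on the minimum at x = 0; for nonlinear residuals the iteration converges rather than terminating in one step.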

Thank you!

--Nikita
