- Intel Community
- Software Development SDKs and Libraries
- Intel® oneAPI Math Kernel Library & Intel® Math Kernel Library
- Function minimization subroutine


renepreto

Beginner

03-02-2010 05:59 AM

Function minimization subroutine

I was wondering whether MKL has a subroutine like "dtrnlsp" that works not with least squares but with an arbitrary external function, and finds the minimum point of that function. I've searched the manual and couldn't find anything like this.

If someone knows of one, or has even used dtrnlsp with a modified external function to do this, could you please tell me? Thanks

5 Replies


Thomas_B_3

Beginner

03-02-2010 06:34 AM

You can use that solver for your purpose. In fact, you **have** to provide an "external" function. There is a ready-to-run example in MKL\Examples\Solver. The least-squares calculation occurs inside the solver.

Best regards, tj


Gennady_F_Intel

Moderator

03-02-2010 08:12 AM

Yes, the \Examples\Solver directory contains 6 examples (C and Fortran API) of nonlinear least-squares problems with and without boundary constraints. See, for example, ex_nlsqp_bc_c.c and ex_nlsqp_bc_c_x.c.

renepreto

Beginner

03-02-2010 09:25 AM

Sorry guys! I wrote that really badly, so I will describe it better now. Suppose that I want to find the minimum of an external function, provided by me. I want something that just finds the minimum point of a generic function, f(x) = x^2 for example. The nlsqp subroutines only find this with least-squares calculations, and I have to provide the y(x) of || F(x) - y(x) ||. For a generic function I don't have the y(x). Is there a subroutine in MKL that does this?


ArturGuzik

Valued Contributor I

03-02-2010 04:25 PM

This one you wrote a bit better :-) but it is still not entirely clear what you do have and what you don't.

If you mean that you have a function but do not know the Jacobian matrix (required for least squares), then MKL comes with the djacobi routine, which computes it for you using a finite-difference method.
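Conceptually, a finite-difference Jacobian works like the sketch below. This is plain Python for illustration only, not MKL's djacobi API; the function name jacobian_fd and its signature are made up here:

```python
def jacobian_fd(F, x, eps=1e-6):
    """Central-difference Jacobian of F: R^n -> R^m.

    Each column j is approximated as (F(x + eps*e_j) - F(x - eps*e_j)) / (2*eps),
    which is the same idea djacobi implements numerically.
    """
    m = len(F(x))
    n = len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x); xp[j] += eps   # perturb coordinate j up
        xm = list(x); xm[j] -= eps   # perturb coordinate j down
        Fp, Fm = F(xp), F(xm)
        for i in range(m):
            J[i][j] = (Fp[i] - Fm[i]) / (2.0 * eps)
    return J

# Example: F(x) = (x0^2, x0*x1) has analytic Jacobian [[2*x0, 0], [x1, x0]]
J = jacobian_fd(lambda x: [x[0] ** 2, x[0] * x[1]], [3.0, 2.0])
```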

In general, all optimization algorithms provided within MKL are gradient methods, which means you need to calculate the derivative (gradient) in order to obtain the next step and, finally, the solution.

If you mean that you want MKL to minimize a function using only its value(s) (without needing to know the function's gradient, which I guess is what you had in mind), then the answer is no.

There are non-gradient methods capable of providing a solution in that situation, the most popular being genetic algorithms (a class of evolutionary algorithms), but you will need to look for those elsewhere.
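To make the "value-only" idea concrete: golden-section search is one simple derivative-free method (simpler than genetic algorithms, and not part of MKL) that minimizes a unimodal function on an interval using nothing but function evaluations. A plain-Python sketch:

```python
def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] using only function values (no gradient).

    Shrinks the bracketing interval by the golden ratio each iteration.
    """
    inv_phi = (5 ** 0.5 - 1) / 2              # 1/phi ~ 0.618
    c = b - inv_phi * (b - a)                  # two interior probe points
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):                        # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                  # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Minimizing f(x) = x^2 on [-4, 5] needs only values of f, no derivative.
x_min = golden_section(lambda x: x * x, -4.0, 5.0)
```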

A.


Nikita_S_Intel

Employee

03-02-2010 09:48 PM

Hi Renepreto,

MKL provides solvers for nonlinear least-squares problems with and without boundary constraints, but it doesn't provide solvers for general function minimization.

BTW, in some cases it's possible to reformulate the function-minimization problem as a nonlinear least-squares problem; for example, your generic function could be reformulated that way (with y(x) == 0).
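For the f(x) = x^2 example from above, that reformulation means setting F(x) = x and y(x) = 0, so the least-squares objective || F(x) - y(x) ||^2 is exactly f(x). The sketch below shows the idea in plain Python with a one-variable Gauss-Newton iteration (the same family of method dtrnlsp uses internally); it is an illustration, not MKL's API, and the name gauss_newton_1d is made up here:

```python
def gauss_newton_1d(F, dF, x, iters=20, tol=1e-12):
    """Minimize F(x)^2 for a scalar residual F via Gauss-Newton.

    For a single residual the Gauss-Newton step x <- x - F(x)/F'(x)
    coincides with Newton's method applied to F(x) = 0.
    """
    for _ in range(iters):
        r, g = F(x), dF(x)
        if abs(g) < tol:                 # degenerate Jacobian, stop
            break
        x -= r / g                       # Gauss-Newton step
        if abs(r) < tol:                 # residual already (near) zero
            break
    return x

# min f(x) = x^2 recast as least squares: F(x) = x, y(x) = 0.
x_min = gauss_newton_1d(lambda x: x, lambda x: 1.0, x=5.0)
```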

Thank you!

--Nikita
