I have been doing some development of neural nets using ifort. My current topic is Levenberg-Marquardt, in which the Jacobian is normally calculated by backpropagation and then used to build a quasi-Hessian; an equation of the form Ax = b is then solved for x, the delta in the network weights.
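To make the setup concrete, the system I solve at each iteration is the standard Levenberg-Marquardt update (as I understand it), with J the Jacobian of the residual vector e with respect to the weights and mu the damping parameter:

```latex
\underbrace{\left(J^{\mathsf{T}} J + \mu I\right)}_{A} \, x \;=\; \underbrace{J^{\mathsf{T}} e}_{b}
```

Here J^T J is the quasi-Hessian and x is the weight update applied to the network.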
I came across the jacobi family of routines the other day and was wondering if they would be useful as an alternative way of calculating the Jacobian. Most textbooks say that the Jacobian can be calculated via backpropagation or finite differences. Does djacobi use finite differences to calculate the Jacobian?
Is one method superior to the other, i.e. backprop vs finite differences? Obviously with neural nets the key factors are the speed and performance of the algorithm. (A sketch of what I mean by finite differences follows below.)
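For reference, this is roughly what I mean by the finite-difference approach, in case djacobi does something along these lines internally. It is a minimal sketch only; eval_residuals is a placeholder for my own network forward-pass routine, not an MKL call:

```fortran
! Minimal sketch of a central-difference Jacobian.
! Assumes a user routine eval_residuals(x, f) that returns the m
! residuals for the current weight vector x (name is hypothetical).
subroutine fd_jacobian(n, m, x, eps, fjac)
  implicit none
  integer, intent(in)    :: n, m        ! n weights, m residuals
  real(8), intent(inout) :: x(n)        ! network weight vector
  real(8), intent(in)    :: eps         ! differencing step size
  real(8), intent(out)   :: fjac(m, n)  ! Jacobian, df_i/dx_j
  real(8) :: fplus(m), fminus(m), xsave
  integer :: j

  do j = 1, n
     xsave = x(j)
     x(j) = xsave + eps
     call eval_residuals(x, fplus)      ! f(x + eps*e_j)
     x(j) = xsave - eps
     call eval_residuals(x, fminus)     ! f(x - eps*e_j)
     fjac(:, j) = (fplus - fminus) / (2.0d0 * eps)
     x(j) = xsave                       ! restore the weight
  end do
end subroutine fd_jacobian
```

If that is essentially what djacobi does, then each Jacobian costs 2n function evaluations, with n the number of weights, which is part of why speed is a concern.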