I have been developing neural networks using ifort. My current topic is Levenberg-Marquardt training, in which the Jacobian is normally calculated by backpropagation and then used to build a quasi-Hessian; an equation of the form Ax = b is then solved for x, the delta in the network weights.
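To spell out the system I mean (the standard Levenberg-Marquardt step; J for the Jacobian of the errors, e for the error vector, and mu for the damping factor are my notation, not anything fixed above):

(J^\top J + \mu I)\, x = J^\top e

so A = J^\top J + \mu I is the quasi-Hessian, b = J^\top e, and x is the delta in the weights.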
I came across the jacobi family of routines the other day and was wondering whether they would be useful as an alternative way of calculating the Jacobian; a sketch of how I imagine the call looking is below. Most textbooks say the Jacobian can be calculated via backpropagation or finite differences. Does djacobi use finite differences to calculate the Jacobian?
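Here is roughly how I would expect the call to look, based on my reading of the MKL trust-region solver interface (a sketch, not tested; net_errors is my placeholder for the routine that runs the forward pass and returns the per-sample errors, and the toy sizes and polynomial residuals are just there so the example is self-contained):

      program jac_demo
      implicit none
      include 'mkl_rci.fi'
      integer, parameter :: n = 3     ! number of weights (toy size)
      integer, parameter :: m = 5     ! number of error terms (toy size)
      double precision :: w(n), fjac(m,n), eps
      integer :: res
      external net_errors

      w   = 0.5d0
      eps = 1.0d-8
      ! djacobi fills fjac(i,j) with d f_i / d w_j numerically
      res = djacobi(net_errors, n, m, fjac, w, eps)
      if (res /= TR_SUCCESS) then
         print *, 'djacobi failed with code ', res
      else
         print *, 'J(1,1) = ', fjac(1,1)
      end if
      end program jac_demo

      subroutine net_errors(m, n, x, f)
      ! Placeholder for the real forward pass: fills f with the
      ! error vector (target - output) for each training sample.
      implicit none
      integer m, n, i
      double precision x(n), f(m)
      do i = 1, m
         f(i) = dble(i) - (x(1) + x(2)*dble(i) + x(3)*dble(i)**2)
      end do
      end subroutine net_errors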
Is one method superior to the other, i.e. backpropagation vs. finite differences? Obviously, with neural nets, speed and the performance of the algorithm are key factors.
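To make the cost question concrete, here is what a hand-rolled central-difference Jacobian would do internally (a sketch, reusing the net_errors placeholder from above): it costs 2*n full forward passes per Jacobian, so it scales with the number of weights, and the differences carry truncation error. Backpropagation, as I understand it, gives the same rows analytically at roughly one backward pass per error term.

      subroutine fd_jacobian(m, n, w, h, fjac)
      ! Central-difference Jacobian: perturb each weight in turn
      ! and difference the resulting error vectors.
      implicit none
      integer, intent(in) :: m, n
      double precision, intent(inout) :: w(n)
      double precision, intent(in) :: h          ! step size
      double precision, intent(out) :: fjac(m, n)
      double precision :: fp(m), fm(m), wj
      integer :: j
      do j = 1, n
         wj = w(j)
         w(j) = wj + h
         call net_errors(m, n, w, fp)    ! forward pass at w_j + h
         w(j) = wj - h
         call net_errors(m, n, w, fm)    ! forward pass at w_j - h
         w(j) = wj                       ! restore the weight
         fjac(:, j) = (fp - fm) / (2.0d0 * h)
      end do
      end subroutine fd_jacobian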