Beginner

Working with Triangular Matrices

In my code, I multiply a square matrix of doubles by a vector whose length matches the matrix dimension. The operation only requires the lower-triangle values of the matrix, including the main diagonal (which is sometimes all ones, sometimes not).

So far, I have used the general matrix MKL function dgemv to perform the full matrix-vector multiplication, zeroing out the upper triangle of the matrix so those elements have no effect on the result vector.

Behind the scenes, what is the difference between using dgemv as described above and using dtrmv for this operation? Is dtrmv faster? Without using parallelism, which MKL function is the fastest for my operation?

Thanks.
4 Replies
Employee

Hello,

dtrmv reads only the triangle you specify (upper or lower) and never touches the other one, so it does not require the unused triangle to be zero. For your case, you can either zero the upper triangle and call dgemv, or leave the upper triangle untouched and call dtrmv. Note the different calling conventions: dtrmv computes x := A*x in place, overwriting the input vector, while dgemv computes y := alpha*A*x + beta*y into a separate vector, so switching between them may produce a different result if the calls are not set up equivalently.

Thanks,
Chao

Beginner

I ran an experiment: I called dgemv on a square matrix with the upper triangle zeroed out, and then called dtrmv on the same data. dtrmv appeared to be slightly faster. I was wondering whether dtrmv multiplies fewer matrix elements than dgemv, which multiplies every element. It appears that it does.
Employee

Hello,

Yes, dtrmv performs fewer multiplications: since the matrix is assumed to be triangular, it only processes the referenced triangle (about n(n+1)/2 elements), while dgemv always processes the full n-by-n matrix. But have you checked whether the result is correct for your case?

Thanks,
Chao

Beginner

Thanks for the background. In my experiment, the results were the same for both methods. I am not using the parallel libraries for the matrix multiplication, so there shouldn't be any discrepancy from threading.

Thanks for the responses.

Blake