Problem with sparse matrices of very large row and column sizes

Apr 27, 2011 at 3:38 AM

I had occasion to create a very large but narrowly banded matrix, on the order of 100k by 100k. I found that member-wise item setting using At() failed due to an internal step where 'rows * columns' is calculated; since both are type int, I was getting an integer overflow. I fixed this for my own use by modifying the source to cast the row and column integers to type long prior to the multiplication, but perhaps it would be worthwhile to include something in the code to handle such cases. I'm sure I'm in the extreme minority using such large matrices (for signal analysis, FWIW), but I'm probably not the only one.
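The overflow described above can be sketched as follows. This is a hypothetical illustration, not the library's actual source (the project is C#; Java is used here, and it has the same 32-bit int / 64-bit long arithmetic): with 100,000 rows and columns, `rows * columns` is 10,000,000,000, which does not fit in 32 bits, so the int product silently wraps around. The method names `linearIndexInt` and `linearIndexLong` are made up for the sketch.

```java
public class IndexOverflow {
    // Buggy version: rows/columns are int, so row * columns is computed
    // in 32-bit arithmetic and wraps around for very large matrices.
    static int linearIndexInt(int row, int column, int columns) {
        return row * columns + column;
    }

    // The fix described in the post: cast to long BEFORE multiplying,
    // so the product is computed in 64-bit arithmetic.
    static long linearIndexLong(int row, int column, int columns) {
        return (long) row * columns + column;
    }

    public static void main(String[] args) {
        int n = 100_000; // 100k-by-100k matrix
        // Last element should map to index n*n - 1 = 9,999,999,999,
        // which exceeds Integer.MAX_VALUE (2,147,483,647).
        System.out.println(linearIndexInt(n - 1, n - 1, n));  // wrapped-around, wrong value
        System.out.println(linearIndexLong(n - 1, n - 1, n)); // 9999999999, correct
    }
}
```

Note that `(long) row * columns` casts before the multiplication; writing `(long) (row * columns)` would cast after the overflow has already happened and would not fix the bug.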

Apr 27, 2011 at 12:04 PM
This discussion has been copied to a work item.
Apr 27, 2011 at 1:52 PM

Thanks for pointing this out. A fix has been checked in.