Think of it like this:
If all of the elements in that matrix are non-zeros, then you have roughly 34 billion elements to store in that array. Since each will be a double precision number, requiring 8 bytes of storage each...
That means 273 gigabytes, just to store the array. Do you have that much contiguous, addressable memory? Unlikely, unless you work for a well-heeled employer. And for anything you do with it, you will probably end up needing to make temporary copies of the array, etc. So expect to need at least 3x that much; let's just call it a terabyte of main memory for kicks.
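As a quick sanity check, the arithmetic above is easy to reproduce in MATLAB itself (the 34e9 element count is taken from the numbers above; substitute your actual dimensions):

```matlab
% Rough memory estimate for a full (dense) double-precision array.
nElements = 34e9;          % approximate element count from above
bytesPerDouble = 8;        % a double occupies 8 bytes
gigabytes = nElements * bytesPerDouble / 1e9     % ~272 GB just to hold it

% Allow roughly 3x that for temporary copies made during computation.
gigabytesWithCopies = 3 * gigabytes              % ~816 GB, call it a terabyte
```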
That does not mean nothing can be done, of course. Computer memory is relatively cheap these days. And you can always get a large SSD drive. Solving your problem by brute force is rarely a great idea though. And by next year, I expect that your problem will have doubled in size.
That means you cannot do anything with such a large array, unless it is perhaps a sparse array. In fact, much of the time, when you are working with such huge matrices, they are often sparse ones. So almost all zeros. MATLAB has sparse arrays just for that purpose, and computations with them are vastly more efficient. So I would strongly consider using sparse storage, IF your array is mostly zero. I do mean seriously mostly zeros. If your array is only 50% non-zero, don't bother. Sparse storage won't gain you anything then.
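To see why sparse storage matters so much, here is a small sketch. The matrix size and density below are arbitrary choices for illustration, not your actual problem:

```matlab
% A mostly-zero matrix: sparse storage costs memory proportional to the
% number of NONZEROS, not to the total number of elements.
A = sprand(1e5, 1e5, 1e-6);   % 1e10 elements, but only ~1e4 nonzeros
nnz(A)                        % count of stored nonzeros
whos A                        % a few hundred KB, versus 80 GB if dense

% Operations on sparse matrices skip the zeros entirely:
x = randn(1e5, 1);
y = A*x;                      % fast, touches only the nonzeros
```

The same matrix stored as a full array would need 1e10 * 8 bytes = 80 gigabytes, which is exactly the trap described above.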
There are other things you can do of course. It may well be possible to reorganize your computations such that this matrix never needed to be computed in the first place. That is a very good idea, IF possible.
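One classic example of that kind of reorganization: if your huge matrix only exists as an intermediate product, exploit associativity so it is never formed. The names and sizes below are hypothetical stand-ins for whatever your real computation looks like:

```matlab
% Suppose you only ever need M*x, where M = A*B, with A of size n-by-k
% and B of size k-by-n, for large n and small k.
n = 20000; k = 50;
A = randn(n, k); B = randn(k, n); x = randn(n, 1);

% Bad: forms the full n-by-n matrix A*B explicitly (3.2 GB at n = 20000).
% y = (A*B)*x;

% Good: mathematically identical result, but nothing larger than
% n-by-k is ever created.
y = A*(B*x);
```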
There are other possibilities, for example tall arrays. (Your array is tall in one sense, but it is pretty squat too.)
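A tall array works on data too large for memory by reading it in chunks from a datastore, with deferred evaluation. A minimal sketch, where 'mydata.csv' and the variable name SomeVariable are placeholders for your own data:

```matlab
% Build a tall array backed by a datastore; nothing is read into
% memory up front.
ds = datastore('mydata.csv');     % placeholder file name
t = tall(ds);

% Operations on t are deferred -- this just records the computation:
colMean = mean(t.SomeVariable);   % SomeVariable is a placeholder column

% gather() triggers the actual chunked pass over the data:
result = gather(colMean);
```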