I still do not see where you have said what you are doing with the matrix. "Downstream processing" is not sufficient information.
There may be good reasons why you need it as a matrix. Or perhaps there are not.
Remember that this matrix is huge. Even in single precision, it will require something like 163 GIGABYTES of memory to store.
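The arithmetic is easy to check for yourself. (The dimensions below are my own guess, back-computed from that 163 GB figure, since you never told us the actual size.)

  % Hypothetical dimensions, inferred only from the ~163 GB figure
  nrows = 202000;
  ncols = 202000;
  bytesSingle = nrows*ncols*4;   % 4 bytes per single element
  bytesDouble = nrows*ncols*8;   % 8 bytes per double element
  fprintf('full single: %.0f GB\n', bytesSingle/1e9)   % ~163 GB
  fprintf('full double: %.0f GB\n', bytesDouble/1e9)   % ~326 GB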
Even if you store it in sparse form, and it really is 90% zero, sparse storage is not supported for single precision. (At least not in R2017b. I have not yet downloaded R2018a, but the R2018a release notes do not indicate support for single sparse arrays.) Therefore you would need to store the matrix as a sparse double precision array.
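You can verify that for yourself in a few seconds:

  % sparse rejects single inputs; only double (and logical) are supported
  try
      S = sparse(single(eye(3)));
  catch err
      disp(err.message)
  end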
The memory required for a sparse double of that size would still be large. MATLAB stores roughly 16 bytes per nonzero element (8 bytes for the double value, plus 8 bytes for its row index), so at 10% density you are looking at something on the order of 65 gigabytes of RAM. And in order to use the matrix in any way, depending on what you do with it, MATLAB might even be forced to make copies of the array. While that might be possible, you would need a lot of RAM and a fast hard drive. An SSD would be useful, because your computer will be doing a lot of memory shuffling.
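Using the same guessed dimensions as above, and your 90% zero estimate:

  % CSC storage: 8 bytes per nonzero value, 8 bytes per nonzero row
  % index, plus 8 bytes per column (plus one) of column pointers
  nrows = 202000;                    % hypothetical, as before
  ncols = 202000;
  nnzEst = 0.1*nrows*ncols;          % 10% nonzero, per your estimate
  bytesSparse = 16*nnzEst + 8*(ncols + 1);
  fprintf('sparse double: %.0f GB\n', bytesSparse/1e9)   % ~65 GB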
Next, while you said that you TRIED a sparse matrix, we don't know how you tried to create it. My guess is that you did not use sparse correctly, or did not build the matrix properly. No matter what, it will require a LOT of memory just to hold the list of nonzero elements and their positions in that final sparse matrix, since the triplet vectors alone cost 24 bytes per nonzero. Then, to make the matrix itself, sparse builds a copy of all that information. So you could easily end up needing something on the order of 100 GIGABYTES of RAM or more just to create the sparse matrix. Again, a lot of memory.
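For what it is worth, the right way to build a big sparse matrix is to accumulate the triplets in preallocated arrays, then call sparse ONCE at the end. A minimal sketch (the sizes and fill rule here are invented, purely to show the shape of the code):

  % Preallocate the (row, col, value) triplets, then ONE sparse call.
  % Never grow a sparse matrix one element at a time with S(i,j) = v --
  % that is catastrophically slow at this scale.
  nnzTotal = 1e6;                % hypothetical nonzero count
  I = zeros(nnzTotal,1);         % row indices
  J = zeros(nnzTotal,1);         % column indices
  V = zeros(nnzTotal,1);         % values
  for k = 1:nnzTotal
      I(k) = randi(1000);        % fill however your data dictates
      J(k) = randi(1000);
      V(k) = rand;
  end
  S = sparse(I, J, V, 1000, 1000);   % one call, one allocation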
You might want to read this link carefully:
In the end, I would suggest that you are trying to process too much data at once for the capabilities you have, both in terms of your memory management skills and in terms of what your current computer can store. Just because you were able to process weekly data like this does not mean you should jump to processing yearly data all at once. And even if that were easily done, you might then decide that to get good accuracy, what you really need is to process 5 or 10 years of data at a time. This is how these things work. "I need MORE DATA" is the common refrain. But can you work more efficiently instead?
So I would strongly suggest that you consider reformulating how you process things.
Perhaps you might generate the array in blocks. For example, you could generate blocks that are 4 weeks in size, saving them out to disk. Save as many such blocks of data as you wish in separate files. Then read them in as you need them, replacing the previous block of data in current memory. Yes, this will require fast disk access speeds, so a large SSD drive will be useful.
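A minimal sketch of that idea, where buildBlock and processBlock are hypothetical stand-ins for your own generation and processing code:

  % Pass 1: generate and save each 4-week block to its own MAT-file
  nBlocks = 13;                            % 13 four-week blocks ~ 1 year
  for b = 1:nBlocks
      blockData = buildBlock(b);           % hypothetical generator
      save(sprintf('block_%02d.mat', b), 'blockData', '-v7.3')
  end

  % Pass 2: process one block at a time, so only one block is ever
  % resident in memory
  for b = 1:nBlocks
      tmp = load(sprintf('block_%02d.mat', b));
      processBlock(tmp.blockData)          % hypothetical processing
      clear tmp
  end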
Perhaps a better way to store this data would be to use a DATASTORE.
This will help MATLAB to do some of the memory management work for you. Again, I don't know what you will do with this array after you create it, since you never told us.
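Whatever that is, a fileDatastore can at least walk those block files for you, reading one at a time on demand. A sketch, reusing the hypothetical processBlock from above:

  % Let the datastore manage which file is in memory
  ds = fileDatastore('block_*.mat', 'ReadFcn', @load);
  while hasdata(ds)
      tmp = read(ds);                  % loads the next block file
      processBlock(tmp.blockData)      % hypothetical, as above
      clear tmp
  end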