I have been struggling with this for quite some time now. I would really appreciate some help if possible.
I have a frequency-dependent function of this form: Z = R1 + 1./( 1./(s*L1) + 1./( R2 + 1./( 1./(s*L2) + 1./( R3 + 1./( 1./(s*L3) + 1./R4 ) ) ) ) ), with s = 1j*2*pi*f and f = 1e6:1e6:4.99e9. All the parameters (R_i, L_i) are set to one (= 1) as a starting point. I also have measured data Z_measured of the same length as Z, defined over the same frequency range f.
My question is: how can I write a MATLAB optimization script that fits Z(f) to Z_measured(f), i.e. finds the values of the parameters (R_i, L_i) that minimize the misfit abs(Z) - abs(Z_measured) over all frequency points (for example, in the least-squares sense)?
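Edit: here is a minimal sketch of one way I imagine this could work, using lsqnonlin from the Optimization Toolbox. The parameter ordering p = [R1 R2 R3 R4 L1 L2 L3] is my own choice, and Z_measured is assumed to already be a vector in the workspace:

```matlab
f = 1e6:1e6:4.99e9;      % frequency grid
s = 1j*2*pi*f;           % complex frequency variable

% Impedance model; p = [R1 R2 R3 R4 L1 L2 L3]
Zmodel = @(p) p(1) + 1./( 1./(s*p(5)) + 1./( p(2) + 1./( 1./(s*p(6)) ...
              + 1./( p(3) + 1./( 1./(s*p(7)) + 1./p(4) ) ) ) ) );

% Residual vector: magnitude mismatch at every frequency point.
% lsqnonlin minimizes the sum of squares of this vector.
resid = @(p) abs(Zmodel(p)) - abs(Z_measured);

p0 = ones(1,7);                    % all R_i, L_i start at 1
lb = zeros(1,7);                   % keep resistances/inductances non-negative
[p_fit, resnorm] = lsqnonlin(resid, p0, lb, []);
```

If the Optimization Toolbox is not available, the same residual can be fed to base-MATLAB fminsearch as a scalar cost, e.g. fminsearch(@(p) sum(resid(p).^2), p0), though that loses the bound constraints.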
Thanks in advance!