You "want" the green line. I want the Buffalo Bills to win the Super bowl. Probably not gonna happen in either case. ;-)
You want to solve a weighted linear least squares problem, but you don't really understand linear least squares. And that is the fundamental problem.
% Weighted linear least squares fit of a straight line (poly1)
opts = fitoptions('Method', 'LinearLeastSquares');
opts.Weights = W;
f = fit(WX, WY, 'poly1', opts);
Linear least squares looks ONLY at errors in the y variable. Large positive or negative residuals, thus points that fall far above or below the line, will have more importance in the fit. They will drag the curve around. In this case, the curve is a straight line.
Now, consider the green line that you "want" to see produced. Do you see a problem in this context? Down near x == 0, ALL of the errors would be above the line, at least if the green curve were the one we expected. Near x == 100, ALL of the errors would be BELOW the line.
In both cases, the line will be pulled into a position that reduces the slope. While you see data that makes you WANT something, this is something the data tells me cannot happen. What you WANT is not relevant, because the tool you are using looks only at errors in the y variable. Sadly, I am pretty sure the tool cannot read your mind, nor does it really care about what you want to see happen. Computers do what they are programmed to do.
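If it helps to see what the fit actually minimized, here is a minimal sketch (assuming f, WX, WY, and W from above, all column vectors). The objective involves only the vertical residuals.
% What fit() minimized above: the weighted sum of squared VERTICAL residuals.
% (A sketch; assumes f, WX, WY, W from the code above.)
resid = WY - f(WX);           % errors measured only in the y direction
wss = sum(W .* resid.^2)      % the quantity the fit made as small as possible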
Worse, we have another problem that is just as serious. Your two variables have wildly different variation.
stdx = std(WX)
stdy = std(WY)
We can also see that in the axis scaling. Forcing the two axes to use the same scaling makes the vast difference obvious.
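For example (a sketch; your existing plot calls may differ):
% Replot the data with equal data units on both axes to expose the
% difference in variation. (Assumes WX and WY from above.)
plot(WX, WY, '.')
axis equal   % same scaling on x and y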
Can the problem be solved to do what you want to see? For that, you are probably thinking in terms of what is called the total least squares problem. And of course, you have weights, so that will force me to actually write code. And since the two variables have hugely different variations, we need to deal with that too. Bah, humbug. I hate writing code. It makes me think.
The trick for total least squares is to use the SVD to do the work; this is sometimes called orthogonal regression. Other people call it principal component regression. As you can see below, I compute the weighted means, subtract them from the data variables, weight the centered matrix using W, and rescale the variables to have unit variance. The SVD does the hard work, though.
% Weighted means of each variable
mux = sum(WX.*W)./sum(W);
muy = sum(WY.*W)./sum(W);
% Center, weight, and rescale each column to unit variance
A = [(WX - mux).*W/stdx, (WY - muy).*W/stdy];
% Economy-size SVD of the prepared data matrix
[U,S,V] = svd(A,0);
We choose the right singular vector with the SMALLER singular value, because that vector points in the direction of least variation in the point cloud, and so it is the normal to the best-fit line. That the two singular values are so close in magnitude tells me this regression is poorly posed, in the sense that the data ended up as a vaguely circular point cloud. Seriously, any line is arguably almost as good as any other in this case.
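You can check that directly (a quick sketch, assuming S from the SVD above):
% Display the singular values. Nearly equal values mean the point cloud
% has almost no preferred direction, so the regression is poorly posed.
sv = diag(S)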
The weighted orthogonal regression line becomes...
V(1,2)*(x - mux)/stdx + V(2,2)*(y - muy)/stdy = 0
I'll use MATLAB to display the equation in a standard form, mainly because I am feeling too lazy to do basic algebra. Just too hot out today.
syms x y
% Solve the implicit line equation for y, displayed to 5 significant digits
y = vpa(solve(V(1,2)*(x - mux)/stdx + V(2,2)*(y - muy)/stdy, y), 5)
424.11*x - 11482.0
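If you prefer to skip the Symbolic Toolbox, the same algebra can be done numerically (a sketch, assuming V, mux, muy, stdx, stdy from above):
% Rearrange V(1,2)*(x-mux)/stdx + V(2,2)*(y-muy)/stdy = 0 into y = m*x + b
slope = -(V(1,2)*stdy) / (V(2,2)*stdx);
intercept = muy - slope*mux;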
Now replot things, and see what happens.
H = fplot(matlabFunction(y),[35,55]);
H.Color = 'g';
H.LineWidth = 3;
So the total least squares regression, using weights and a rescaling of the variables to unit variance, works. Again, it was very close. I could almost have chosen the regression line orthogonal to the one we got. Your data is NOT well posed for regression, since it forms an almost circular point cloud.
The Rolling Stones said it for me. "You can't always get what you want." Of course, you can freely decide the answer is exactly what you want to see. It is a scheme that works well for many politicians these days. :( But today, you got lucky. Que sera, sera... (I know somebody said that, but who? One "day" I'll remember.)