affine3d used for pure rotation causes translational misalignment

Linus Olofsson on 15 Mar 2024
Answered: Linus Olofsson on 16 Mar 2024
Hey guys, thanks for showing interest in helping me!
If it matters, I'm running MATLAB R2023b Update 7. I'm also using "affine3d", which is not recommended since R2022b; MATLAB recommends "affinetform3d" instead, according to https://se.mathworks.com/help/images/ref/affine3d.html#mw_e0c661e6-b991-4445-88b2-485ca10c30d3, but I don't understand the difference, whether switching would somehow help, or how to implement the change in my case.
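My best guess from the documentation is that the only difference is the matrix convention: affine3d expects the transposed (postmultiply, row-vector) form of the 4-by-4 matrix, while affinetform3d takes the premultiply matrix that makehgtform produces directly, but please correct me if I've misread it. A minimal sketch of what I mean, with a made-up example matrix:
% Made-up premultiply matrix, as returned by makehgtform
M = makehgtform('translate', [10 20 30], 'zrotate', deg2rad(5));
tformOld = affine3d(M');     % affine3d wants the transposed (postmultiply) form
tformNew = affinetform3d(M); % affinetform3d takes the premultiply matrix directly (R2022b+)
isequal(tformOld.T', tformNew.A) % both objects describe the same mapping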
I am designing an app for manual volumetric registration of CT images of log segments (manual because the app is also used for labelling tasks). One log is X-rayed in two different states, wet and dry, and I need to align the two. To show you my problem, I've performed the registration for this sample and then re-introduced a known misalignment, which I try to show in this image:
The red line is the z-axis (longitudinal direction) of the log, going from the top end to the butt end, and the green line is the same axis but misaligned. The red X is the point of rotation and the misalignment is -1 degrees in the horizontal plane. In practice, the red line would be the wet scan of the log and the green line the dry scan of the log that I now need to align: basically moving the green line to align with the red.
My problem is that when I use the affine3d function to perform this rotation, I also get a shift along the z-axis. Here are two screenshots from my app (side and top views) before and after performing the rotation. The only important part here is the alignment of the green (wet) image with the pink (dry) image along the z-axis, which here runs left to right.
Before:
After:
This is the code used to perform the rotation that also introduces the translational misalignment. I suspect the problem lies in the "%%% Perform the rotation" part of the code.
Input:
>>angleX
angleX =
0
>>angleY
angleY =
1
This code extracts the point of rotation defined by the user, which in this case is at the red X in the first image (at the coordinates [138 168 62] for a log of size [365 341 542], which is basically size(app.dryVolume)./[2 2 100]).
function AlignWetButtWithDryButtGivenWetTopAndDryTop(app, angleX, angleY)
%%% Rotation can be locked by the user
if app.Lock23Button.Value; beep; return; end
%%% Check requirements
% Center marking(s), preferably at the very top end of the log (red X)
c1 = [str2num(app.WetTopPointStoreEditField.Value) str2num(app.WetTopPointSliceEditField.Value)];
c2 = [str2num(app.DryTopPointStoreEditField.Value) str2num(app.DryTopPointSliceEditField.Value)];
pointOfRotation = mean([c1;c2]); % [138 168 62]
% centre is nan if both c1 and c2 are empty.
if isnan(pointOfRotation); beep; return; end
% UI
app.SaveallknotsButton.BackgroundColor = [0.96 0.96 0.96];
app.SavematchedwhorlButton.BackgroundColor = [0.96 0.96 0.96];
app.PlotLamp.Color = [1 0 0];
drawnow
%%% !!! This if-statement is false for the example posted to MathWorks; however, I'm pretty sure it also contains a bug :) I'll save that for a later time.
%%% If angleX and angleY are not both given as inputs - calculate them
if nargin < 3
% User-marked features (buttons: "3" <- dry, "4" <- wet used at the butt-end of the log)
dry_feature = [str2num(app.WetButtPointStoreEditField.Value) str2num(app.WetButtPointSliceEditField.Value)];
wet_feature = [str2num(app.DryButtPointStoreEditField.Value) str2num(app.DryButtPointSliceEditField.Value)];
if isempty(dry_feature) || isempty(wet_feature); beep; return; end % str2num returns [] for empty fields, so test isempty (exist is always true here)
% Vectors from the center of rotation to the features
dry_vector = dry_feature - pointOfRotation;
wet_vector = wet_feature - pointOfRotation;
% Calculate rotation angles
% For rotation around the X-axis, project onto the YZ plane (set the x-component to 0)
dry_vector_YZ = [0, dry_vector(2), dry_vector(3)];
wet_vector_YZ = [0, wet_vector(2), wet_vector(3)];
angleX = atan2d(norm(cross(dry_vector_YZ, wet_vector_YZ)), dot(dry_vector_YZ, wet_vector_YZ));
% For rotation around the Y-axis, project onto the XZ plane (set the y-component to 0)
dry_vector_XZ = [dry_vector(1), 0, dry_vector(3)];
wet_vector_XZ = [wet_vector(1), 0, wet_vector(3)];
angleY = atan2d(norm(cross(dry_vector_XZ, wet_vector_XZ)), dot(dry_vector_XZ, wet_vector_XZ));
% Calculate the cross product to determine the direction of rotation
cross_product_X = cross(dry_vector_YZ, wet_vector_YZ);
cross_product_Y = cross(dry_vector_XZ, wet_vector_XZ);
% Determine the direction of the rotation.
% If cross_product in x (1) is positive, wet_vector is counterclockwise from dry_vector, and the angle should be negative.
if cross_product_X(1) > 0
angleX = -angleX;
end
if cross_product_Y(2) > 0
angleY = -angleY;
end
end
%%% Perform the rotation
% Create rotation matrices around the X-axis and Y-axis
rotationMatrixX = makehgtform('xrotate', deg2rad(angleX));
rotationMatrixY = makehgtform('yrotate', deg2rad(angleY));
% Combine the rotation matrices
combinedRotationMatrix = rotationMatrixX * rotationMatrixY;
% Translate the volume origin to the point of rotation
translationMatrixToOrigin = makehgtform('translate', -pointOfRotation);
% Combine the translation to the origin with the rotations
affineMatrix = translationMatrixToOrigin * combinedRotationMatrix;
% Translate back to the original center
translationMatrixBackToCenter = makehgtform('translate', pointOfRotation);
% Combine in correct order
finalAffineMatrix = affineMatrix*translationMatrixBackToCenter;
% Convert to affine3d object
finalAffine3D = affine3d(finalAffineMatrix');
% Apply the transformation
[rotated_dry_volume, RB] = imwarp(app.dryVolume, finalAffine3D, 'Interpolation', 'cubic', 'OutputView', imref3d(size(app.dryVolume)));
app.dryVolume = rotated_dry_volume;
% fprintf("Rotated by \n\tangleX = %.1f \n\tangleY = %.1f\n", angleX, angleY)
%%% UI stuff
app.DryButtPointSliceEditField.Value = app.WetButtPointSliceEditField.Value;
app.DryButtPointStoreEditField.Value = app.WetButtPointStoreEditField.Value;
app.Lock23Button.Enable = 1;
maskWhorl(app) % The use of cubic can introduce negative values, which are removed in this masking.
updateView(app)
end
If I can get your help to make this work, I can finally look into using the "RB" above to create traceability and drastically improve image quality for my data. It would be of great help to my research project!
Thank you so much for your time!

Accepted Answer

Matt J on 15 Mar 2024
Edited: Matt J on 15 Mar 2024
I don't really understand how your posted images show that something is wrong. They show that the image has changed after the warping, but nothing more, as far as I can see.
Independently, though, the calculation of the affine matrix looks wrong to me. I think you need,
finalAffineMatrix=translationMatrixToOrigin\combinedRotationMatrix*translationMatrixToOrigin;
Also keep in mind that imwarp thinks your volumes are indexed (y,x,z), not (x,y,z). So, if you derived your pointOfRotation and other geometric quantities assuming the latter, that would be a problem.
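For example, a sketch with a made-up angle and point of rotation (the [2 1 3] swap is only needed if the point was derived in (row,col,slice) order):
pR = [138 168 62];                      % point in (row,col,slice) order, for illustration
pXYZ = pR([2 1 3]);                     % swap to imwarp's (x,y,z) order
R = makehgtform('yrotate', deg2rad(1)); % pure rotation
T = makehgtform('translate', -pXYZ);    % move the point of rotation to the origin
M = T \ R * T;                          % i.e. inv(T)*R*T: to origin, rotate, back
tform = affine3d(M');                   % affine3d expects the transposed matrix
% The point of rotation should map onto itself:
[x, y, z] = transformPointsForward(tform, pXYZ(1), pXYZ(2), pXYZ(3))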
  3 Comments
Linus Olofsson on 15 Mar 2024
Hey Matt, thanks for your quick reply. Addressing your answer and comments:
  • In the green/pink images I show, I've marked in red that a gap is created along the z-direction, which shouldn't exist. Basically, the function unintentionally moves the object along the z-axis in addition to doing the intended rotation. In the side view, the rotation happens around a vertical axis close to the left end, meaning the pink image is rotated slightly into or out of the monitor, so for a 1-degree rotation there should basically be no difference. It's hard to see, but there is a faint outline in the top view that shows that the pink image is rotated and misaligned.
  • How does a backslash fit into this combination of matrices - isn't that for solving linear systems of equations?
  • I looked into using your package, and I have two questions:
  1. Is it possible and ok to use this package for a stand-alone installation of my app exclusively for my current project for research use? If not, then sadly this is not an option for me and my colleagues.
  2. I tried it, and maybe I'm doing something wrong, but I get a discrepancy both with my implementation and yours. Using your implementation I tried:
Using:
dry_vector =
31 149 441
wet_vector =
22 149 441
I tried:
K>> combinedRotationMatrix = quadricFit.vecrot(dry_vector,wet_vector);
K>> discrepancy = wet_vector' - combinedRotationMatrix*dry_vector'
And got instead of zeros:
discrepancy =
-0.0241
-0.1635
-0.4841
Matt J on 15 Mar 2024
Edited: Matt J on 15 Mar 2024
How does a backslash fit into this combination of matrices - isn't that for solving linear systems of equations?
A\B is the same as inv(A)*B
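For instance, with an arbitrary invertible A:
A = rand(4) + 4*eye(4); % strictly diagonally dominant, hence invertible
B = rand(4);
norm(A\B - inv(A)*B)    % the two expressions agree up to roundoff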
I tried it, and maybe I'm doing something wrong, but I get a discrepancy both with my implementation and yours. Using your implementation I tried:
I forgot to mention that the vectors should be normalized,
dry_vector = [31 149 441];
wet_vector = [22 149 441];
combinedRotationMatrix = quadricFit.vecrot(dry_vector,wet_vector);
discrepancy = normalize(wet_vector','n') - combinedRotationMatrix*normalize(dry_vector','n')
discrepancy =
1.0e-14 *
0.1360
0
-0.0222
Is it possible and ok to use this package for a stand-alone installation of my app exclusively for my current project for research use?
I don't know why it wouldn't be okay. It's certainly okay with me...
But regardless, if your own implementation works, there's no reason you shouldn't use that, as long as it passes the test above. Does it?
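If it helps, the test can be wrapped as a small helper to run against whatever candidate rotation you build (checkAlignment is just a made-up name; pass it the 3-by-3 rotation block and the two raw vectors):
% A rotation that aligns the two directions should map the normalized dry vector
% onto the normalized wet vector, so this value should be close to zero:
checkAlignment = @(R, dryVec, wetVec) norm(normalize(wetVec(:),'norm') - R*normalize(dryVec(:),'norm'));
% e.g. with the rotation block of a 4-by-4 homogeneous matrix M from makehgtform:
% checkAlignment(M(1:3,1:3), dry_vector, wet_vector)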


More Answers (1)

Linus Olofsson on 16 Mar 2024
TL;DR:
  • The way I created my affine matrix was incorrect.
  • The dry/wet vectors were never a problem and correctly showed 0 discrepancy.
  • I can now apply my transformation with or without Matt's package equivalently.
To the best of my understanding, my combination of transforms to create the affine matrix was indeed simply wrong... Math, man... My old code "worked" only in the sense that the transformations I apply are so small (anything above 5 degrees would be considered large in my data) that I was tricked into thinking the code worked at all.
Here is my updated and simplified code. Thanks so much, Matt, for your answer and for all the other answers of yours I've learned from throughout the years. Cheers!
%%% Perform the rotation
% Define transform 1:
% Translate the volume origin to the point of rotation; its inverse moves it back.
translationMatrixToOrigin = makehgtform('translate', -pointOfRotation);
% Define transform 2:
% Create a combined rotation matrix around the X-axis and Y-axis
rotationMatrix = makehgtform('xrotate', deg2rad(angleX), 'yrotate', deg2rad(angleY));
% Combine Transforms in the order: inverse of 1 (applied using \), rotation, 1. See: https://se.mathworks.com/help/images/create-composite-2d-affine-transformation.html#CompareComposite2DAffineTransformationsExample-1
affineMatrix = translationMatrixToOrigin \ rotationMatrix * translationMatrixToOrigin;
% Convert to affine3d object
affine3DObject = affine3d(affineMatrix');
% Apply the transformation
[rotated_dry_volume, RB] = imwarp(app.dryVolume, affine3DObject, 'Interpolation', 'cubic', 'OutputView', imref3d(size(app.dryVolume)));
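And for anyone who, like me, wants to move off affine3d: as far as I can tell, the affinetform3d equivalent just drops the transpose, since it takes the premultiply matrix that makehgtform already produces (a sketch only; I have not switched my app over yet):
% Same transform with the recommended affinetform3d (R2022b+): no transpose needed
affineTform = affinetform3d(affineMatrix);
[rotated_dry_volume, RB] = imwarp(app.dryVolume, affineTform, 'Interpolation', 'cubic', 'OutputView', imref3d(size(app.dryVolume)));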
