Putting symbolic array into matrix
Hello,
I am trying to subtract each value in a 1xn array from a symbolic variable so that I can solve for that unknown variable later, but I can't seem to get the symbolic expressions to store correctly in the array.
Here is an example of what I am trying to do,
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
After this, q(i,:) will be in a for loop that will use this symbolic variable.
I keep getting an error, and I don't know how to modify this in a way where I can still solve for q0 later on.
Thanks for the help!!!
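A likely cause (a sketch, not a definitive diagnosis): if q already exists as a double array, MATLAB cannot store symbolic values into it by indexed assignment. Preallocating q as a symbolic array avoids this; the row count n below is a placeholder assumption for the later for loop.

```matlab
syms q0
Vy = [300 500 300];

n = 5;                          % assumed number of rows for the loop (example value)
q = sym(zeros(n, numel(Vy)));   % preallocate as symbolic so sym entries can be assigned

q(1,:) = q0 - Vy.*2.*4./8;      % each entry is q0 minus the scaled Vy value

% later, q0 can still be solved for from any entry, e.g.:
sol = solve(q(1,1) == 0, q0);
```

With the array preallocated via sym(zeros(...)), indexed assignments like q(i,:) inside a loop keep their symbolic type, and solve works on the stored expressions afterwards.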
Accepted Answer
