Putting symbolic array into matrix
Hello,
I am trying to subtract every value in an array (of some 1xn size) from a symbolic variable and then solve for this unknown variable later, but I can't seem to get the symbolic array to nest correctly.
Here is an example of what I am trying to do,
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
After this, q(i,:) will be assigned inside a for loop that uses this symbolic variable.
I keep getting an error and I don't know how to modify this in a way where I can still solve for q0 later on.
Thanks for the help!!!
Accepted Answer
Walter Roberson
29 Apr 2022
That code works in the form you posted.
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
The most common mistake in a case like this would be to initialize q using zeros(), such as
q = zeros(5,3);
If you had initialized q as a double-precision array, then assigning into q(1,:) would fail, because the value being assigned contains a symbolic variable and cannot be converted to double.
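For example, a sketch like this would reproduce that failure (the 5x3 size is just an assumption for illustration):
syms q0
Vy = [300 500 300];
q = zeros(5,3);                % double-precision preallocation
q(1,:) = q0 - (Vy.*2.*4./8);   % errors: the right-hand side still contains q0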
The cure for that would be to use something like
q = zeros(5, 3, 'sym');
for new enough versions of MATLAB, or
q = sym(zeros(5,3));
if your MATLAB is older than that.
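Once q is preallocated as a symbolic matrix, the rest of the workflow described in the question should follow. A minimal sketch, where the loop bounds, the per-row scaling, and the final equation are only assumptions for illustration:
syms q0
Vy = [300 500 300];
q = sym(zeros(2, numel(Vy)));        % symbolic preallocation
for i = 1:size(q,1)
    q(i,:) = q0 - i*(Vy.*2.*4./8);   % each row keeps q0 symbolic
end
sol = solve(q(1,1) == 0, q0)         % later, solve any entry for q0 (here sol = 300)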