Issue
T(i) = Tm(i) + (T(i-1)-Tm(i))**(-tau(i))
Tm and tau are NumPy vectors of the same length that have been previously calculated, and the desire is to create a new vector T. The i is included only to indicate the element index for what is desired.
Is a for loop necessary for this case?
Solution
You might think this would work:
import numpy as np
n = len(Tm)
t = np.empty(n)
t[0] = 0 # or whatever the initial condition is
t[1:] = Tm[1:] + (t[0:n-1] - Tm[1:])**(-tau[1:])
but it doesn't: you can't actually do recursion in NumPy this way, because NumPy evaluates the whole right-hand side (using the current contents of t) before assigning it to the left-hand side.
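To see this concretely, here is a minimal sketch with made-up Tm and tau values (not from the question) showing that the slice assignment never sees the freshly computed earlier elements:
import numpy as np
# Hypothetical example data, purely to illustrate the point.
Tm = np.array([1.0, 2.0, 3.0, 4.0])
tau = np.ones(4)
n = len(Tm)
t = np.zeros(n)  # zeros rather than empty, so the stale values are visible
t[0] = 0.0
# The whole RHS is computed from t as it is *now* (all zeros),
# then assigned in one step -- no recursion takes place.
t[1:] = Tm[1:] + (t[0:n-1] - Tm[1:])**(-tau[1:])
print(t)  # t[2] was computed from the initial t[1] == 0, not from the updated t[1]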
So unless you can come up with a non-recursive version of this formula, you're stuck with an explicit loop:
tt = np.empty(n)
tt[0] = 0.
for i in range(1, n):
    tt[i] = Tm[i] + (tt[i-1] - Tm[i])**(-tau[i])
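If you need this in several places, a small wrapper along these lines can keep the loop and the initial condition together (the name recurse_T, the T0 argument, and the example inputs are just illustrative, not part of the original answer):
import numpy as np
def recurse_T(Tm, tau, T0=0.0):
    """Compute T[i] = Tm[i] + (T[i-1] - Tm[i])**(-tau[i]) with T[0] = T0."""
    n = len(Tm)
    T = np.empty(n)
    T[0] = T0
    for i in range(1, n):
        T[i] = Tm[i] + (T[i-1] - Tm[i])**(-tau[i])
    return T
# Example with the same made-up inputs as above:
Tm = np.array([1.0, 2.0, 3.0, 4.0])
tau = np.ones(4)
print(recurse_T(Tm, tau))  # differs from the broken vectorized attempt above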
Answered By - Andrew Jaffe