Shape Issues #4114
Comments
I am also experiencing this issue. I'll add that if I specify coords on the model and dims for my RVs, I can work around it, but NUTS sampling then seems to get much slower. Happy to provide more information if it would be helpful. |
@martintb Please do. |
So, my model was a bit complicated and messily implemented, but I was able to adapt an example from Eric Ma's very nice tutorial. I have a simple notebook that I'd be happy to share, but I'm not sure of the best way to do so given that I can't attach it to this issue. I generate two datasets with different shapes (1000 points for dataset1, 750 for dataset2).
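The data-generation code was not included above; the following is a hypothetical sketch of what two such datasets might look like for this exponential-decay model (all parameter values, the noise level, and the time range are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_dataset(n_points, A=10.0, tau=50.0, C=1.0, noise=0.5):
    """Simulate noisy exponential-decay 'activity' measurements over time."""
    t = np.linspace(0, 300, n_points)
    activity = A * np.exp(-t / tau) + C + rng.normal(0, noise, size=n_points)
    return t, activity

t1, activity1 = make_dataset(1000)  # dataset1: 1000 points
t2, activity2 = make_dataset(750)   # dataset2: 750 points
```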
The model is defined as:

```python
with pm.Model() as model:
    t_data = pm.Data('t', t1)
    act_data = pm.Data('activity', activity1)

    A = pm.HalfNormal('A', sd=100)
    tau = pm.Exponential('tau', lam=1)
    C = pm.Normal('C', sd=100)
    sd = pm.HalfCauchy('sd', beta=1)

    link = A * np.exp(-t_data / tau) + C
    like = pm.Normal('activity_likelihood', mu=link, sd=sd, observed=act_data)
```

The initial sampling of the data works fine:

```python
with model:
    trace = pm.sample(2000, tune=2000)
    pm.plot_trace(trace)
```

But if I try to reset the data:

```python
with model:
    pm.set_data({'t': t2, 'activity': activity2})
    trace = pm.sample(2000, tune=2000)
    pm.plot_trace(trace)
```

I get a very long error that ends in:

```
~/software/anaconda/miniconda3_200323/envs/analyze/lib/python3.7/site-packages/theano/gradient.py in access_term_cache(node)
1235 "%s.grad returned object of "
1236 "shape %s as gradient term on input %d "
-> 1237 "of shape %s" % (node.op, t_shape, i, i_shape))
1238
1239 if not isinstance(term.type,
ValueError: Elemwise{sub,no_inplace}.grad returned object of shape (1000,) as gradient term on input 0 of shape (750,)
```

Interestingly, I can change the values of the domain variable (time) in order to sample the posterior predictive without getting an error:

```python
tnew = np.arange(0, 1000, 0.5)
with model:
    pm.set_data({'t': tnew})
    ppc = pm.fast_sample_posterior_predictive(idata)
```

Also, I think my comment about the "coords" above was wrong. I mistook sample_posterior_predictive working for the coords "fixing" the problem. |
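For reference, "specifying coords for the model and dims for the RVs" typically looks something like the following in PyMC3. This is an illustrative sketch reusing the example model above; the coordinate name obs_id is an assumption, and, as noted, it does not lift the shape restriction when resampling on new data:

```python
import numpy as np
import pymc3 as pm

# t1 and activity1 are the 1000-point dataset from the comment above
coords = {"obs_id": np.arange(len(t1))}

with pm.Model(coords=coords) as model:
    t_data = pm.Data("t", t1, dims="obs_id")
    act_data = pm.Data("activity", activity1, dims="obs_id")

    A = pm.HalfNormal("A", sd=100)
    tau = pm.Exponential("tau", lam=1)
    C = pm.Normal("C", sd=100)
    sd = pm.HalfCauchy("sd", beta=1)

    link = A * pm.math.exp(-t_data / tau) + C
    pm.Normal("activity_likelihood", mu=link, sd=sd, observed=act_data, dims="obs_id")
```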
Sampling the same model on new data with a different shape currently fails, see issue #3007 (though it took a slight detour for a few comments). Hopefully this can be fixed in the next major release. For now, this needs to be done with model factories in order to recompile the model every time the shape of the data changes. Luciano talks about them in his PyMCon talk, for example, and there are many other similar questions on Discourse. |
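A minimal sketch of the model-factory pattern described above, reusing the exponential-decay example from this thread (the make_model function and its structure are illustrative, not a PyMC API):

```python
import pymc3 as pm

# t1/activity1 and t2/activity2 are the 1000- and 750-point datasets from above

def make_model(t, activity):
    """Build a fresh model (and a freshly compiled graph) for the data passed in."""
    with pm.Model() as model:
        t_data = pm.Data('t', t)
        act_data = pm.Data('activity', activity)

        A = pm.HalfNormal('A', sd=100)
        tau = pm.Exponential('tau', lam=1)
        C = pm.Normal('C', sd=100)
        sd = pm.HalfCauchy('sd', beta=1)

        link = A * pm.math.exp(-t_data / tau) + C
        pm.Normal('activity_likelihood', mu=link, sd=sd, observed=act_data)
    return model

# Instead of calling set_data on one model, build a new model per dataset:
with make_model(t1, activity1):
    trace1 = pm.sample(2000, tune=2000)

with make_model(t2, activity2):
    trace2 = pm.sample(2000, tune=2000)
```

Because each call builds a new model, the shapes baked into the compiled gradient graph always match the data being sampled.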
Thanks @OriolAbril! I'll use the model-factory pattern until I hear that shape changing is supported. It would be nice if the devs could add a sentence or two about this limitation to the Data Container page in the documentation. |
Have there been any updates or fixes for this issue yet? I can't find anything online.
Did you ever solve this? |
All of the above should be fixed in v4, which is currently in beta. All the changes are listed in https://github.com/pymc-devs/pymc/blob/main/RELEASE-NOTES.md. We are also working on a migration guide explaining the dims/shape/size arguments and other changes. |
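For illustration, a hypothetical sketch of how the same workflow is expected to look in v4. pm.MutableData and the sigma keyword reflect the 4.x beta and may change before the final release; t1/activity1 and t2/activity2 are the datasets from earlier in the thread:

```python
import pymc as pm  # PyMC v4 (beta)

with pm.Model() as model:
    # MutableData is intended for values that will be swapped out later
    t_data = pm.MutableData("t", t1)
    act_data = pm.MutableData("activity", activity1)

    A = pm.HalfNormal("A", sigma=100)
    tau = pm.Exponential("tau", lam=1)
    C = pm.Normal("C", sigma=100)
    sd = pm.HalfCauchy("sd", beta=1)

    mu = A * pm.math.exp(-t_data / tau) + C
    pm.Normal("activity_likelihood", mu=mu, sigma=sd, observed=act_data)

    idata1 = pm.sample(2000, tune=2000)

with model:
    # Swap in the 750-point dataset and resample without rebuilding the model
    pm.set_data({"t": t2, "activity": activity2})
    idata2 = pm.sample(2000, tune=2000)
```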
Linking all shape-issue tickets together:
#2326
#4029