Hi all. I've just gotten hddm running on my Windows 10 machine which was a chore and a half!
Now it runs my model (in test mode) just fine:
----
def run_model(id):
    import hddm
    data1 = hddm.load_csv('C:/path_to_data/St1_MFCT_dropped.csv')
    v_reg = {'model': 'v ~ 1 + C(MFCTType)', 'link_func': lambda x: x}
    t_reg = {'model': 't ~ C(Action)', 'link_func': lambda x: x}
    a_reg = {'model': 'a ~ C(Valence)', 'link_func': lambda x: x}
    reg_descr = [t_reg, a_reg, v_reg]
    m = hddm.HDDMRegressor(data1, reg_descr, p_outlier=.05)
    m.find_starting_values()
    m.sample(5, burn=1, dbname='db%i' % id, db='pickle')
    return m
import kabuki
from ipyparallel import Client

v = Client()[:]
jobs = v.map(run_model, range(3))
models = jobs.get()

combined_model2 = kabuki.utils.concat_models(models)
combined_model2.save('C:/path_to_folder/Study1_MFCT_concatenated_models')
combined_model2.print_stats('C:/path_to_folder/Full_stats_report.csv')
----
However, I get an error (identical for every engine):
----
AttributeError Traceback (most recent call last)
~\anaconda3\envs\hddmEnv\lib\site-packages\ipyparallel\serialize\serialize.py in serialize_object(obj, buffer_threshold, item_threshold)
117 buffers.extend(_extract_buffers(cobj, buffer_threshold))
118
--> 119 buffers.insert(0, pickle.dumps(cobj, PICKLE_PROTOCOL))
120 return buffers
121
AttributeError: Can't pickle local object 'run_model.<locals>.<lambda>'
----
After some judicious googling, it appears that Python's built-in pickle module can't serialize lambdas or other locally defined functions, which is exactly what the link_func entries inside run_model are. I've tried using dill instead (both plain 'import dill' and 'import dill as pickle') and substituting 'dill' for the backend, e.g. dbname='db%i' % id, db='dill', but that throws a different error saying pymc doesn't recognise 'dill' as a database backend.
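For what it's worth, the same AttributeError reproduces with plain pickle, entirely outside hddm and ipyparallel (make_link here is just a stand-in for how run_model defines its link_func lambdas, not anything from my actual script):

```python
import pickle

def make_link():
    # a lambda created inside a function body, like the
    # link_func entries built inside run_model
    return lambda x: x

try:
    pickle.dumps(make_link())
except AttributeError as e:
    print(e)  # Can't pickle local object 'make_link.<locals>.<lambda>'
```

So the failure seems to be a general pickle limitation on local functions, not something ipyparallel is doing wrong.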
Any suggestions or helpful tips on how to fix or work around this so I can save my model?
Thanks,
Adam