Hi,
answers inline below.
* MasayoMusic <bigmi...@gmail.com> [2019-01-15]:
> I was introduced to the framework via FastAI, and I can't seem to
> successfully save/load data without running out of memory.
>
> I was attempting something like this, but it seems bcolz_array keeps getting
> larger and takes up more memory as it grows?
>
>
> Thank you.
>
>
> def save_array(fname, generator_array, num_batches, data_type="data"):
>     if data_type == "data":
>         bcolz_array = bcolz.carray(np.zeros([0, img_width, img_height, 3], dtype=np.float32), mode='w', rootdir=fname)
>     else:
>         bcolz_array = bcolz.carray(np.zeros([0, len(labels)], dtype=np.float32), mode='w', rootdir=fname)
How long is 'labels', or more generally: how big is the zeros array you
start from? That might be part of your issue. I haven't used bcolz in
years, but my gut feeling is that you might want to try:
http://bcolz.blosc.org/en/latest/reference.html#bcolz.zeros
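
i.e. roughly something like this (untested on my side, and assuming
img_width/img_height/fname come from your own code), so the container is
created on disk directly instead of wrapping an in-memory np.zeros array:

import bcolz
import numpy as np

# untested: create the on-disk container directly; rootdir and mode
# are forwarded to the underlying carray
bcolz_array = bcolz.zeros((0, img_width, img_height, 3),
                          dtype=np.float32, mode='w', rootdir=fname)
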
>
>     data_dict = {"data": 0, "labels": 1}
>
>     if data_type not in ["data", "labels"]:
>         raise ValueError("data or labels")
>
>     for i in range(num_batches):
>         bcolz_array.append(next(generator_array)[data_dict[data_type]])
>     bcolz_array.flush()
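For comparison, here is roughly how the whole function might look with
bcolz.zeros. This is just a sketch I haven't run: img_width, img_height
and labels are assumed to be defined in the enclosing scope as in your
snippet, and flushing after every batch is only a guess at keeping memory
flat rather than something I've verified:

import bcolz
import numpy as np

def save_array(fname, generator_array, num_batches, data_type="data"):
    # untested sketch; img_width, img_height, labels assumed defined elsewhere
    if data_type not in ["data", "labels"]:
        raise ValueError("data or labels")

    if data_type == "data":
        shape = (0, img_width, img_height, 3)
    else:
        shape = (0, len(labels))

    # create the container on disk directly instead of wrapping np.zeros
    bcolz_array = bcolz.zeros(shape, dtype=np.float32, mode='w', rootdir=fname)

    data_dict = {"data": 0, "labels": 1}
    for i in range(num_batches):
        bcolz_array.append(next(generator_array)[data_dict[data_type]])
        bcolz_array.flush()  # guess: push each batch to disk as you go

    return bcolz_array
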
Maybe you could let us know how it goes and whether that helped at all?
V-