passing multiple DataFrames\Tables between pandas\matlab


Yoel Shapiro

Jan 17, 2016, 10:06:36 AM
to PyData
Hi,

I'm looking for an elegant/lazy way to pass data from MATLAB to Python and back to MATLAB.

HDF5 (via h5py) allows encapsulating the entire MATLAB workspace, but I can't figure out how to read the Tables into DataFrames.

Saving tables to CSV works well, but my MATLAB colleagues use a struct that includes more than ten tables, and we don't want to start managing lists of files.

To clarify, the MATLAB workspace looks something like this:

main_struct
- Table1
...
- Table10
- nested struct 1
- nested struct 2
- 4D array

Dov Grobgeld

Jan 17, 2016, 11:20:05 AM
to pyd...@googlegroups.com
I would suggest figuring out how to use the Python HDF5 module and using it to build the necessary pandas DataFrames (from the tables), plain Python data structures (from the nested structs), and numpy arrays (from the 4D arrays).
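The HDF5 route above might be sketched roughly as follows. This is a toy example, not code for an actual MATLAB .mat file: the file layout, names, and the `collect_tables` helper are all hypothetical, and it assumes h5py and pandas are installed.

```python
import h5py
import numpy as np
import pandas as pd

def collect_tables(path):
    """Walk an HDF5 file; gather 2-D numeric datasets as DataFrames
    and everything else as numpy arrays, keyed by HDF5 path."""
    frames, arrays = {}, {}
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                data = obj[()]
                if getattr(data, "ndim", 0) == 2:
                    frames[name] = pd.DataFrame(data)
                else:
                    arrays[name] = np.asarray(data)
        f.visititems(visit)
    return frames, arrays

# Build a toy file mimicking the nested workspace layout (not a real .mat).
with h5py.File("workspace.h5", "w") as f:
    f.create_dataset("main_struct/Table1", data=np.arange(6.0).reshape(3, 2))
    f.create_dataset("main_struct/nested_struct_1/Table2", data=np.ones((2, 2)))
    f.create_dataset("main_struct/array4d", data=np.zeros((2, 2, 2, 2)))

frames, arrays = collect_tables("workspace.h5")
```

A real MATLAB v7.3 file is more involved (groups, attributes, and object references), but the same walk-and-collect pattern applies.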

Another option is to zip together all your CSV files, and write small libraries for Python and MATLAB respectively to unzip them and rebuild the "workspaces".
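On the Python side, the zip round-trip could look something like this minimal sketch (the function names and the single-level naming scheme are assumptions; a nested struct would need path-like member names):

```python
import zipfile
import pandas as pd

def save_workspace(frames, path):
    """Write a dict of DataFrames into one zip archive of CSV files."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, df in frames.items():
            zf.writestr(name + ".csv", df.to_csv(index=False))

def load_workspace(path):
    """Read the archive back into a dict of DataFrames keyed by member name."""
    frames = {}
    with zipfile.ZipFile(path) as zf:
        for member in zf.namelist():
            if member.endswith(".csv"):
                with zf.open(member) as fh:
                    frames[member[:-4]] = pd.read_csv(fh)
    return frames

tables = {"Table1": pd.DataFrame({"a": [1, 2], "b": [3, 4]})}
save_workspace(tables, "workspace.zip")
restored = load_workspace("workspace.zip")
```

A matching pair of MATLAB functions (e.g. using `zip`/`unzip` and `writetable`/`readtable`) would complete the round trip.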

Regards,
Dov



Yoel Shapiro

Jan 20, 2016, 4:09:56 AM
to PyData
Thanks, Dov.

Zipping could improve the data management; I hadn't thought of that.

Saving a MATLAB table to HDF5 results in a group with pointers into a #refs# group, which contains the actual data as character codes (represented as char arrays).
I'm struggling to import this into Python as a pandas DataFrame.
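For the char-array part specifically, MATLAB chars are 16-bit (UTF-16) code units, so once the referenced arrays of codes are in hand they can be decoded like this. A minimal sketch, detached from any particular .mat file; the helper name is hypothetical:

```python
import numpy as np

def decode_matlab_chars(codes):
    """Decode a MATLAB-style array of uint16 character codes into a
    Python string (MATLAB char data is stored as UTF-16 code units)."""
    return np.asarray(codes).astype("<u2").tobytes().decode("utf-16-le")

# Example: the codes MATLAB would store for the word 'speed'
codes = np.array([115, 112, 101, 101, 100], dtype=np.uint16)
decode_matlab_chars(codes)  # -> 'speed'
```

Dereferencing the #refs# pointers themselves can be done in h5py by indexing the file with the stored object references before applying a decode step like the one above.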

I'll try asking about this on an HDF forum.