Any Guide on Parallel I/O System?


Dale Wang

Oct 1, 2014, 8:40:27 PM
to rbigdatap...@googlegroups.com
Hello Everyone,
  
          I find that pbdR is exactly the project I want, and I am trying to use it in our project. But I have run into a basic problem: how do I store a distributed matrix on disk? I see that the pbdR project provides a package, "pbdNCDF4", which can store a ddmatrix in a file backed by a parallel file system.
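
          From the package description, I imagine a minimal parallel write would look roughly like the sketch below. This is only my guess from the ncdf4-style API that pbdNCDF4 seems to expose; the file name, variable name, and block sizes are made up, and I have not tested it:

library(pbdMPI)
library(pbdNCDF4)

init()

## Placeholder sizes: each rank owns a 100 x 10 block of rows.
nrows_local <- 100
ncols       <- 10
nrows_total <- nrows_local * comm.size()

## Define dimensions and one variable, as in ncdf4.
rdim <- ncdim_def("rows", "count", vals = 1:nrows_total)
cdim <- ncdim_def("cols", "count", vals = 1:ncols)
var  <- ncvar_def("X", "unitless", list(rdim, cdim), missval = -1)

## Collective create: every rank opens the same file, then enables
## parallel access on the variable before writing.
nc <- nc_create_par("X.nc", var)
nc_var_par_access(nc, var)

## Each rank writes a disjoint block of rows (a hyperslab).
x_local <- matrix(rnorm(nrows_local * ncols), nrows_local, ncols)
ncvar_put(nc, var, x_local,
          start = c(comm.rank() * nrows_local + 1, 1),
          count = c(nrows_local, ncols))

nc_close(nc)
finalize()

          If I understand the pbdMPI examples correctly, this would be launched with something like "mpiexec -np 4 Rscript write_X.R".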

          But I come from a Hadoop/HDFS background and know nothing about parallel file systems and file formats. It seems that NetCDF4 is only a file format, and I still need a parallel file system to store the file.
       
          Are there any guides or tutorials on how to set up a parallel I/O system that pbdNCDF4 can use? I searched the Internet and found few courses about parallel I/O in HPC systems. Can somebody recommend any materials on this topic? I know how HDFS and Hadoop work and how MPI programs work, but I know nothing about parallel I/O in HPC. It is a new field to me.

Thank you all very much!
                                                                               Dale Wang

Wei-Chen Chen

Oct 3, 2014, 1:06:35 AM
to rbigdatap...@googlegroups.com
FYI, see the pbdDEMO vignette, Chapter 10.
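
That chapter covers reading and writing NetCDF4 files in parallel. As a rough sketch only (the file, variable, and block sizes are placeholders matching the write example above, and the nc_*_par functions are pbdNCDF4's parallel variants of the ncdf4 API), a read-back might look like:

library(pbdMPI)
library(pbdNCDF4)

init()

## Collective open of the shared file, then enable parallel access.
nc  <- nc_open_par("X.nc")
var <- nc$var[["X"]]
nc_var_par_access(nc, var)

## Each rank reads back only its own block of rows.
nrows_local <- 100
ncols       <- 10
x_local <- ncvar_get(nc, var,
                     start = c(comm.rank() * nrows_local + 1, 1),
                     count = c(nrows_local, ncols))

nc_close(nc)
finalize()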

Dale Wang

Oct 17, 2014, 10:47:27 PM
to rbigdatap...@googlegroups.com

After getting through the pbdR vignettes and some other pieces of slides and web pages, I think I have managed to understand the whole stack of parallel linear algebra libraries in an HPC environment. I drew a figure to illustrate the relationships between the software packages in the stack.
For those who come from the Hadoop platform, I illustrate a reference stack on the right-hand side. Saury (currently named Marlin) is a parallel linear algebra library running on the Spark platform, developed by our lab (GitHub link: https://github.com/PasaLab/marlin).

pbdR covers nearly all of the software in the left stack. That must have taken a great deal of work and a deep understanding of the whole stack.

If there are problems or mistakes in the figure, please tell me. Thank you in advance.
                                        
                                                               Dale



On Friday, October 3, 2014 at 1:06:35 PM UTC+8, Wei-Chen Chen wrote: