[rsf-user] best way to break a very large sgy file (~200 GB) into smaller files and process for rsf


S C

Feb 24, 2023, 7:18:48 AM2/24/23
to rsf-...@lists.sourceforge.net
Kindly, can anyone help me with how to proceed with this very large SEG-Y file, and how to break it into smaller chunks without loss of data?


Regards.

Sergey Fomel

Feb 27, 2023, 10:48:19 AM2/27/23
to S C, rsf-...@lists.sourceforge.net
It is not easy to break a SEGY file. If you manage to convert it to RSF with sfsegyread, you can break the RSF file using sfwindow.

On Fri, Feb 24, 2023 at 6:19 AM S C <satishgch...@gmail.com> wrote:
[quoted message]
_______________________________________________
RSF-user mailing list
RSF-...@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/rsf-user

Sergey Fomel

Feb 27, 2023, 10:57:50 AM2/27/23
to S C, rsf user
To window the first 1000 traces, use 

sfwindow n2=1000 < file.rsf > file1.rsf

To window the next 1000 traces, use

sfwindow n2=1000 f2=1000 < file.rsf > file2.rsf

sfwindow n2=1000 f2=2000 < file.rsf > file3.rsf

etc.

You can do it in a loop in SConstruct.
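The loop above can be sketched in Python (the same logic would go into an SConstruct, with each command wrapped in a Flow() rule). This is a minimal sketch: the file names file.rsf / file1.rsf and the 5.2-million-trace count come from this thread, and the helper name window_offsets is made up for illustration.

```python
def window_offsets(ntraces, chunk):
    """Return (f2, n2) pairs covering ntraces traces in chunks,
    matching sfwindow's f2= (first trace) and n2= (count) parameters."""
    offsets = []
    f2 = 0
    while f2 < ntraces:
        # The last chunk may be shorter than the rest.
        n2 = min(chunk, ntraces - f2)
        offsets.append((f2, n2))
        f2 += chunk
    return offsets

# One sfwindow command per chunk, e.g. for a 5.2-million-trace file:
commands = [f"sfwindow n2={n2} f2={f2} < file.rsf > file{i}.rsf"
            for i, (f2, n2) in enumerate(window_offsets(5200000, 100000),
                                         start=1)]
```

The first generated command is `sfwindow n2=100000 f2=0 < file.rsf > file1.rsf`, matching the pattern shown above.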

To properly process the files, you will need to window the trace header file (the output of tfile= in sfsegyread) in the same way.
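Keeping the data and header windows in lockstep can be sketched as follows. The file names data.rsf and tfile.rsf are hypothetical placeholders, and split_commands is a made-up helper; only the sfwindow n2=/f2= usage is from the thread.

```python
def split_commands(ntraces, chunk, data="data.rsf", headers="tfile.rsf"):
    """Emit paired sfwindow commands so each data chunk and its
    trace-header chunk are cut with identical n2= and f2= values."""
    cmds = []
    i = 1
    for f2 in range(0, ntraces, chunk):
        n2 = min(chunk, ntraces - f2)  # last chunk may be shorter
        cmds.append(f"sfwindow n2={n2} f2={f2} < {data} > data{i}.rsf")
        cmds.append(f"sfwindow n2={n2} f2={f2} < {headers} > tfile{i}.rsf")
        i += 1
    return cmds
```

Because the header file has one record per trace along the same axis, reusing the exact f2/n2 pair keeps headers and traces aligned chunk by chunk.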




On Mon, Feb 27, 2023 at 9:50 AM S C <satishgch...@gmail.com> wrote:
Sir, I converted the SEG-Y to RSF, but I was unable to break that RSF. Kindly give me the command to break 180 GB of RSF into files of 1 lakh (100,000) traces each, because the whole dataset has 52 lakh (5.2 million) traces.

On Mon, Feb 27, 2023, 9:17 PM Sergey Fomel <sergey...@gmail.com> wrote:
[quoted message]

Ajay Pundir

Apr 26, 2023, 12:39:19 AM4/26/23
to Sergey Fomel, rsf user
I understand it is very late to answer this question, as it was asked in February, but it is worth answering since this type of work keeps coming up.

We can break a big SEG-Y file into small files before reading it:
1. Read only the headers.
2. Select a header key on the basis of which the data can be divided into parts.
3. Create as many masks as the number of parts we want to make.
4. Read the data of each chunk using the corresponding mask.

For example, if we have a huge 3D dataset that can be divided into inline swaths, then create masks according to the inline swaths and read only the data belonging to each swath. I use this as a routine.
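The mask-selection step above can be sketched in pure Python. This is only a conceptual sketch of how trace indices get partitioned by a header key such as the inline number; in Madagascar itself the masking would be done with the header-manipulation programs, and the helper name masks_by_key is invented for illustration.

```python
def masks_by_key(key_values, nparts):
    """Partition trace indices into nparts masks by a header key value
    (e.g. inline number), giving each mask an equal slice of the key range."""
    lo, hi = min(key_values), max(key_values)
    # Width of the key range assigned to each part (rounded up).
    span = (hi - lo + nparts) // nparts
    masks = [[] for _ in range(nparts)]
    for trace, val in enumerate(key_values):
        part = min((val - lo) // span, nparts - 1)
        masks[part].append(trace)
    return masks
```

Each mask then drives one pass over the file, so every pass reads only the traces whose inline falls in that swath.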
