[Seisunix] Processing a line from scratch: Seismic Un*x Blog


John Stockwell

May 19, 2009, 1:09:07 PM
to seis...@dix.mines.edu
Dear Seisunix users,

This message is to announce the creation of a new blog called
The Seismic Un*x User. I suggest that this topic of processing
a line from scratch be discussed on that forum.

The address is:

http://theseismicunixuser.blogspot.com

John Stockwell | jo...@dix.Mines.EDU
Center for Wave Phenomena (The Home of Seismic Un*x)
Colorado School of Mines
Golden, CO 80401 | http://www.cwp.mines.edu/cwpcodes
voice: (303) 273-3049

Our book:
Norman Bleistein, Jack K. Cohen, John W. Stockwell Jr., [2001],
Mathematics of multidimensional seismic imaging, migration, and inversion,
(Interdisciplinary Applied Mathematics, V. 13.), Springer-Verlag, New York.

_______________________________________________
seisunix mailing list
seis...@mailman.mines.edu
https://mailman.mines.edu/mailman/listinfo/seisunix
Unsubscribe: seisunix-u...@mailman.mines.edu

Glenn Reynolds

May 25, 2009, 12:38:06 AM
to seis...@dix.mines.edu
Let's assume that the geometry and CDP binning are possible and think about the next step.

Should we look at noise removal, multiple removal or designature first? Or something else?

Let's look at the data and design a processing flow to brute stack, and put it on the blog, etc.


- Glenn


2009/5/20 John Stockwell <jo...@dix.mines.edu>

Glenn Reynolds

May 25, 2009, 1:11:23 AM
to seisunix
Let's assume that the geometry and CDP binning are possible and think about the next step.

Should we look at noise removal, multiple removal or designature first? Or something else? Shot, offset or CMP domain? t-x, f-k or tau-p?

Let's look at the data and design a processing flow, and put it on the blog, etc.



- Glenn


2009/5/20 John Stockwell <jo...@dix.mines.edu>

Mega G. Baitoff

May 25, 2009, 8:36:57 AM
to Glenn Reynolds, seisunix

Hello Glenn,


Monday, May 25, 2009, 11:11:23 AM, you wrote:


> Let's assume that the geometry and CDP binning are possible and think about the next step.
>
> Should we look at noise removal, multiple removal or designature first? Or something else? Shot, offset or CMP domain? t-x, f-k or tau-p?
>
> Let's look at the data and design a processing flow, and put it on the blog, etc.



Typically, I never start multi-trace routines before amplitude corrections and (especially!) deconvolution.


The first step is amplitude recovery. The first sub-step is spherical divergence correction. To use the formula k = T*V^2, we need a rough estimate of stacking velocities. So, as a preliminary step, we construct a supergather from, say, 31 CMPs in the middle of the line and perform a rough velocity estimate.


-- 

Best regards,

 Mega                            mailto:mbai...@yandex.ru

Mega G. Baitoff

May 25, 2009, 1:12:41 PM
to Glenn Reynolds, seisunix
Hello Glenn,

Monday, May 25, 2009, 9:40:48 PM, you wrote:

> On the whole I agree, but I usually get a brute stack out for
> reference as soon as possible. I have one now.

Yeah, I have exactly the same impatience to get the "zero" picture
ASAP. Usually, I never do SDC & SCA before I get the stack. With AGC,
the brutest velocities and no statics.

> I found I needed to do some multi trace work to clean the data in
> order to see it better. I will form a flow and run the data through
> completely once I am happy with it. Too much time on seismic boats!

Can't agree here. I got the stack with almost everything displayed
clearly: weak upper horizons, the bowl-like mid-time horizons and
multi-faulted bottom horizons.

> I think there may be an issue with the geometry file, or else they did
> some standing shots to boost fold on the eastern end.

Haven't checked it yet.

> I think this line will take some iteration. My Apple is working hard.
> NMO and stack only take 5 to 10 seconds, but some of the other stuff
> takes a while. And trying a variable top mute and picking velocities
> is not so easy on SU, but good fun anyway.

I have already processed the line quite a bit, including SDC, SCA, SCD,
three iterations of velocity corrections and two iterations of
autostatics. BUT... everything is in ProMAX, not in SU. :) Soon I'm
going to post some screenshots of my progress. Consider it to be some
kind of "reference" processing.

> I hope more people have suggestions soon.


--
Best regards,
Mega mailto:mbai...@yandex.ru


Gery Herbozo Jimenez

May 25, 2009, 3:48:37 PM
to mbai...@yandex.ru, gdrey...@gmail.com, seis...@dix.mines.edu
 

Wow, very nice indeed! Let's try it in SU, guys!


 


> Date: Mon, 25 May 2009 23:51:30 +0600
> From: mbai...@yandex.ru
> To: gdrey...@gmail.com
> CC: seis...@dix.mines.edu
> Subject: Re: [Seisunix] Processing a line from scratch: Seismic Un*x Blog
>
> Hello Glenn,
>
> Monday, May 25, 2009, 11:20:04 PM, you wrote:
>
> > Great! But you might frighten people if you use Promax for everything.
>
> Oh, come on! They are big brave boys. :)
>
> Here comes the bride (see attached images):
> Stack 0 is brute stack with approx. velocities and no statics
> Stack 1 is stack with refined velocities and no statics
> Stack 2 is stack with original statics + autostatics and refined velocities
> Stack 3 is stack after s.c. decon + autostatics 2 and refined
> velocities


>
>
> --
> Best regards,
> Mega mailto:mbai...@yandex.ru



Mega G. Baitoff

May 25, 2009, 4:22:12 PM
to walid Osman, seisunix

Hello walid,

Tuesday, May 26, 2009, 1:46:08 AM, you wrote:

> Hello Mega
>
> As I see, you have ProMAX here, so I want to ask: can I install ProMAX on a CentOS 5 desktop?

I don't know. Probably yes. See the official system requirements for ProMAX.

Mega G. Baitoff

May 25, 2009, 4:48:43 PM
to David Muerdter, seis...@dix.mines.edu
Hello David,

Tuesday, May 26, 2009, 1:53:15 AM, you wrote:

> My question is operational - I get on The Seismic Un*x User blog
> and see only 2 entries. But I get the posts from Mega and Glenn
> Reynolds from the seismic unix mailing list - and some seem to be
> missing - I assume because the seis...@dix.mines.edu was not
> included as a recipient. Should all these posts go to the blog, or am I missing something?

Oh, this is really wrong. A mailing list is a very old-fashioned way for a group of people to communicate. A blog with topics and discussions is much more convenient. However, I cannot just carbon-copy my every reply to the TSUU blog without _you_ people joining me there. We should have moved there already, but the inertia is too strong.

> That said, most of us don't have access to the commercial ProMax or
> explanations of its modules. So it would be very helpful to have a
> workflow of Mega's processing with expanded notes to understand the
> steps in context to SU modules. This may be asking too much from
> people who have pressure from their regular jobs, but I though I would put it out there.

I'd be glad to show my fundamental way of processing without being specific to any particular software. Actually, I needed no special processing at all to produce my stacks (except decon, of course), since those stacks are done with AGC.

The first step in processing is to obtain a "zero" stack, and we need a "zero" velocity for this. My usual way to guess a velocity is to build a "supergather" (do I need to explain what that means?). With a supergather in hand, I usually perform a direct measurement of several of the most intense hyperbolas (by overlaying the "living" hyperbola on the seismic data) and just read the "t0" and "V" values from the velocity measurement tool. If Seismic Unix does not possess such a tool, I can tell you how to use ten to fifteen picks along a hyperbola to estimate its velocity.

Baitoff Mega

May 26, 2009, 1:17:06 AM
to da...@luminterra.com, seis...@dix.mines.edu
26.05.09, 07:58, "David Muerdter" <da...@luminterra.com>:

Actually, suwind and susort are not enough to make a supergather. Maybe NMOScan does the job of "vertical stacking"; if not, here is how:

1. Bin the offsets. I use the following technique: let XMIN be the minimum nominal distance from the shot to the first receiver in the spread, let XSTEP be the nominal shot-receiver distance increment (that is, the nominal distance between receivers), let X be the real shot-receiver offset, and XBIN the binned offset. Then XBIN = int((X-XMIN)/XSTEP)*XSTEP + XMIN.
2. Sort on XBIN, effectively grouping the traces of the selected CDP range into "locally equal offset bin" groups.
3. Stack the traces in each group and assign them a common CDP number (the central CDP number).
4. Sort the result by primary key CDP, secondary key XBIN.

The result is what I call a "supergather". It has a very good S/N ratio and clear hyperbolas. All this must be done after AGC, of course; no need for spherical divergence correction or anything else. See an example of a supergather at central CDP 1915 in the attachment.

Thanks Mega, 
comments for all below...

First step in processing is to obtain a "zero" stack, and we need a "zero" velocity for this. My usual way to guess a velocity is to build a "supergather" (do I need to explain what that mean?). Having a supergather in hands I usually perform the direct measurement of several most intense hyperbolas (by overlaying the "living" hyperbola and seismic data) and just read the "t0" and "V" readings from the velocity measurement tool. If Seismic Unix does not possess  such a tool, I may tell how to use ten-fifteen picks of a hyperbola to estimate its velocity.
  
As a first stab from a rusty processor (I'm more of a interpreter/modeler) I will try to use my limited knowledge of SU to duplicate your steps (described above) without all the details.  I will try to create a detailed SU flow and adjust parameters during the next week (unfortunately I have some deadlines on my real job).  I would welcome any comments on better ways to approach this line.

Starting from the dataset loaded into SU format with good header values (I will call it L2D.su)

# window CMPs near the middle of the line and sort to CMP to create the super-gather
suwind <L2D.su key=cdp min=middle max=middle+11 |
# simple spherical divergence correction here
sugain tpow=2.0 |
# sort into CMPs
susort cdp offset >L2D_SuperGather.su

Then run the constant-velocity panel script (found in $CWPROOT/src/su/examples/NmoScan).
You must modify the parameters at the beginning of the script, including the input and output file names.
From the resulting panels, pick the times and RMS velocities of the flattened reflections in the super-gather (you might need to interpolate velocities between the constant-velocity panels).
This one velocity estimate can be used for the first brute stack.

Then, for the whole line: divergence correction (sudivcor), NMO correction (sunmo), stack (sustack), AGC (sugain agc=1), and plot (suximage).
On second thought, the divergence correction may not be that important for this brute stack because of the AGC.

That should get to a brute stack, like Mega's zero stack.  Again, I will try to work through the details this week in my spare time.  Any comments or suggestions are more than welcome.

I'm looking forward to discussion of the more sophisticated part of the processing including residual velocity analysis, multiple suppression, and anisotropy, if applicable.

Cheers,  DaveM


SuperGather.png

Glenn Reynolds

May 26, 2009, 5:31:02 AM
to seisunix
I have just done the same thing to make some stacks. I used suvelan and also a CVS flow.

Re the supergather: I originally tried to sort by offset only and then sushw cdp to a single value, so I could use suvelan instead. However, the results with CVS seem better. I double-stacked the traces from the supergather (first by CDP, and then all 11 CDPs) to give one trace per velocity, as for suvelan. Then I displayed using suattributes mode=amp. Script below; hopefully some pics attached. I ran a brute function picked at CMP 1800 from semblance, the same but picked from CVS, and then CVS again with picks every 200 CMPs (from CVS). The additional picks obviously help, but I prefer Mega's approach, where you see the hyperbolic trajectories. I wrote a GUI tool for this with dynamic NMO and CVS in Java (and lost it :-( ).

- glenn



############################################################
#
# CDPs came from sucdpbin with default options - I have:
#
# fldr     231 481 (231 - 481)
# tracf    1 281 (1 - 281)
# ep       32 282 (32 - 282)
# cdp      1142 2422 (1142 - 2422)
#
############################################################
for((cdp=1300;$cdp<2301;cdp+=200))
do
#Big range of velocities
fv=1000
dv=40
lv=5000
#21-traces per location - could also use 11 (or 1)
fcdp=$((cdp-10))
lcdp=$((cdp+10))

# for a group of locations, make 11-trace stacks at a range of velocities and compare
# Apply demultiple and inner-trace mute to make it easier to see the peaks
suwind <filt.su key=cdp min=$fcdp max=$lcdp |sumute mode=1 par=itm.par |supef minlag=.032 >temp.su
rm -f cvs.su
susort<temp.su +cdp +offset >sup.su
for((vel=$fv;$vel<=$lv;vel+=$dv))
do
echo Velocity=$vel
#double stack, first by CDP and then smash the 11-trace stacks to reduce noise
sunmo <sup.su vnmo=$vel |sustack |sustack key=trid>>cvs.su
done
# get some facts
surange<cvs.su

# gain and display as amplitude - display velocities on x-axis for mousepicks
# redirect mousepicks to individual CMP files
sugain<cvs.su agc=1 wagc=1|suattributes mode=amp|suximage perc=99 f2=$fv d2=$dv wbox=400 hbox=990 cmap=hsv2 mpicks=mpicks$cdp.txt

# convert velocity files to SU par files
mkparfile< mpicks$cdp.txt > mpicks$cdp.par string1=tnmo string2=vnmo
done

# You still have to combine the velocity PAR files and put a "cdp=nnn,nnn,nnn,..." header on it in order to use it for stacking


On Tue, May 26, 2009 at 11:58 AM, David Muerdter <da...@luminterra.com> wrote:
> Inertia is hard to change...   email for now, unless most everyone moves to the blog.
stkSembBrute.jpg
stkCvsBrute.jpg
stkCvs200.jpg

Glenn Reynolds

May 26, 2009, 5:54:26 AM
to seisunix
TOP MUTE

SU doesn't have a handy flow for picking top mutes (or for picking in general). But it's not too hard to make a loop in a script that displays shots, saves your picks, and reformats them for use with sumute.

1. Sumute takes a par file with lines like:
     tmute=123,234,345,456,567
     xmute=432,543,654,765,876

Here's a flow that displays every 25th shot, lets you pick the mute points, and saves them to a file, reformatting them as par files. You then use these with sumute. I first split the file into individual shots (only for this picking; the SP files can be deleted later). I run an inner-trace mute so that the percentile gain keys on anything but the ground roll. I then redisplay the record with the mute applied, saving the shots back into a combined SU file.


########################################
#
for((ep=32;$ep<283;ep+=25))
do
echo Muting SP$ep
sumute mode=1 <SP$ep.su par=itm.par|
suxwigb   perc=90 key=offset mpicks=mute$ep.txt title="SP $ep" hbox=800 wbox=1200
mkparfile < mute$ep.txt >mute$ep.par string1=tmute string2=xmute
sumute mode=1 <SP$ep.su par=itm.par|
sumute par=mute$ep.par|
suximage perc=98 title="SP $ep" hbox=800 wbox=1200
done
#
########################################




2. You can't easily interpolate the mute in SU. What does everyone do when the shots look variable? Can a non-zero header be interpolated?
I used unisam and unisam2:
  1. Pick the top mute at uniform SP intervals (I used every 25th SP, from SP 32 to 282).
  2. unisam - convert the picks to uniformly sampled mute offsets (I used -3500 to 3500, every 500 m); this creates a binary file.
  3. Append all the binary files to make one big binary file of mute times (15 offsets x 11 SPs = 165 floats = 660 bytes).
  4. unisam2 - resample the binary file to make an even bigger binary file. I made a mute for every SP.
  5. b2a n1=15 to make rows of mutes, one SP per row.
  6. Use a script to convert these to 2-column text files and run mkparfile on them.

A bit fiddly, but you end up with interpolated shot mutes. Of course, if you pay for ProMax, you get that for free. You can do the same for the inner mute if required (it probably is). Here's the script.

########################################
#
# create and concatenate a binary file of uniformly-sampled mute picks. Uniformly sampled in offset.
rm -f binary.dat
for((ep=32;$ep<283;ep+=25))
do
file=mute$ep.txt
echo Processing $file
gawk '{print $2,$1}' <$file|a2b >temp;unisam xyfile=temp npairs=`wc<$file|gawk '{print $1}'` fxout=-3500 dxout=500 nout=15 >>binary.dat
done

# Uniformly resample mute picks from the picked shots to each shot in the line
unisam2 <binary.dat nx1=15 fx1=-3500 dx1=500 n1=15  nx2=11 fx2=32 dx2=25 n2=251  >temp.bin

# Get the pick times out of the binary array and use them to make individual SP files in 2-column format (tmute,xmute)
rm -f mute*.new
b2a<temp.bin n1=15|
gawk '{for(i=1;i<=NF;i++){
ep= 31+FNR;
file="mute" ep ".new";
print i*500-4000,$i>>file
}
}'

# Convert 2-column format to individual par files
for((ep=32;$ep<283;ep+=1))
do
mkparfile<mute$ep.new >mute$ep.par string2=tmute string1=xmute
done
rm mute*.new

########################################

Simon Crombie

May 26, 2009, 6:56:21 AM
to Mega G. Baitoff, seis...@dix.mines.edu
I would be grateful not to have 5 MB of attachments arriving with e-mails on
this message board. It filled up my mailbox and prevented other mail from being
received.

Thanks

Baitoff Mega

May 26, 2009, 7:30:35 AM
to Simon Crombie, seis...@dix.mines.edu

26.05.09, 14:56, "Simon Crombie" <simonc...@onetel.com>:

> I would be grateful not to have 5MB of attachments arriving with e-mails on
> this message board. It filled up my mailbox and prevented other mail being
> received.
> Thanks

Even free web-mail services have almost unlimited mail space. Maybe using one of them for this mailing list would help? Or even better, tune your mail client not to accept messages with attachments above 1 MB.


John Stockwell

May 26, 2009, 1:04:55 PM
to Simon Crombie, seis...@dix.mines.edu
I set up the blog so that you could post there, not here. Apparently
nobody wants to do that. Please note that messages are not automatically
transferred to the blog; it is up to you to post them there.

John Stockwell | jo...@dix.Mines.EDU



John Stockwell

May 26, 2009, 5:39:20 PM
to Glenn Reynolds, seisunix
All of the SU graphics programs have a picking capability.

For example, if we had some file

data=yourdata.su

suximage < $data mpicks=$mutepicks wbox=1000 hbox=500

The result is an SU image plot. To pick the mute
curve, place the cursor on the desired point and press "s".
When finished picking, press "q".

Then make a parfile in the form of tmute= and xmute= entries:

sort < $mutepicks -n |
mkparfile string1=tmute string2=xmute > $parfile

and apply the mute, making sure that the header field given by
key=KeyValue matches the horizontal scale of the original
plot that you picked from.

sumute < $data par=$parfile key=$key > mute.$data

John Stockwell | jo...@dix.Mines.EDU


