Questions about absorption spectrum


Aniket Bhagwat

Feb 3, 2020, 3:45:08 AM
to trident-project-users
Hello! 

I'm trying to calculate absorption spectra for a few metal ions and the 21 cm line of H I using a Gadget-based simulation. I have a few questions that I can't seem to find solutions to:

- Can I bin the fields in a ray into cells of fixed size? I want to display the ray's temperature and density fields along with the final absorption spectrum in a single plot. Can I bin the fields stored in the ray using the same uniform bins (bin size dlambda) as the absorption spectrum?

- When calculating 21 cm forest spectra, thermal broadening of the lines isn't considered (Eq. 1 of https://arxiv.org/abs/1510.02296). Can I disable the Voigt deposition in Trident? Another way I could do this is by using the temperature and column density fields from the ray, but the fields would need to be binned along the sightline to calculate the spectrum, which is why I asked the first question.

- I'm using a small box with L = 12.5 cMpc, and the dz across the box is too small. I want to loop the sightline inside the same dataset until I achieve a fixed dz. Is this possible with Trident? I don't have access to the simulation parameter file, which makes things more difficult. Is the looping part achievable?

Any inputs/suggestions are highly appreciated! 

Thanking you,
Sincerely

Aniket 

Cameron Hummels

Feb 3, 2020, 11:49:31 AM
to Aniket Bhagwat, trident-project-users
Hi Aniket,

- Can I bin the fields in a ray into cells of fixed size? I want to display the ray's temperature and density fields along with the final absorption spectrum in a single plot. Can I bin the fields stored in the ray using the same uniform bins (bin size dlambda) as the absorption spectrum?

I guess I don't fully understand what you mean by this question.  The wavelength array lives in wavelength space, which does not directly correlate with the "l" array in physical space down the ray.  Are you trying to make a plot/movie like this: https://vimeo.com/116594959 ?  It may be a bit trickier for a particle-based dataset, because the "dl" and "l" arrays, the path lengths of the ray elements and their locations along the ray, are not uniformly distributed for SPH datasets.

As the ray traverses the simulation volume, it isn't sampling the fluids in a uniform way as it would for grid-based datasets.  It's actually identifying what portion of the ray passes in proximity to SPH particles along its length, within their smoothing kernels.  Trident records the fluid quantities for each of those particles in the ray itself (e.g., the temperature, density, and metallicity fields), so the ray data structure actually holds the particles' original fluid quantities, *not* the values along the ray itself.  How much these fluid quantities from the SPH particles *matter* for the ray is encoded in the path length of that ray element, the "dl" array.  So the "dl" array is not a true physical path length; it's a combination of the true path length and the proximity of that path to the SPH particle.  But for calculating column densities, the traditional calculation still works for a given ray element: density * dl = column density.  dl is just where all the magic is hidden about numerically integrating the path of the ray through the smoothing kernel of the particle.
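To make that concrete, here's a minimal numpy sketch of the column density sum over ray elements (the numbers are made up for illustration, not pulled from an actual ray):

```python
import numpy as np

# Hypothetical ray element data: H I number density (cm^-3) and the
# effective path length "dl" (cm) that encodes the kernel-weighted
# overlap of the ray with each SPH particle.
n_HI = np.array([1.0e-4, 5.0e-3, 2.0e-4])  # cm^-3
dl   = np.array([3.0e21, 1.0e20, 5.0e21])  # cm

# Column density contributed by each element, and the sightline total.
N_per_element = n_HI * dl        # cm^-2
N_total = N_per_element.sum()    # ~1.8e18 cm^-2

print(N_total)
```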
 
- When calculating 21 cm forest spectra, thermal broadening of the lines isn't considered (Eq. 1 of https://arxiv.org/abs/1510.02296). Can I disable the Voigt deposition in Trident? Another way I could do this is by using the temperature and column density fields from the ray, but the fields would need to be binned along the sightline to calculate the spectrum, which is why I asked the first question.
 
You are the first person, to my knowledge, who has tried to use Trident to produce predictions of 21 cm data, so things may not work out ideally, but we will see what we can do.  The actual Voigt deposition is done using the "tau_profile" function: https://github.com/trident-project/trident/blob/master/trident/absorption_spectrum/absorption_line.py#L147 .  It's primarily used in absorption_spectrum.py here: https://github.com/trident-project/trident/blob/master/trident/absorption_spectrum/absorption_spectrum.py#L886 .  As you can see, the "thermb" value is what feeds into it to give Doppler (thermal) broadening to each line.  One way to suppress thermal broadening would be to generate a ray for your sightline, manually set the temperature field in that ray to something very small (zero may just break it), and then generate your spectrum from that.
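To see why a tiny temperature works: the thermal b-parameter goes as sqrt(2 k T / m), so dropping T by several orders of magnitude collapses each line toward an unbroadened spike.  A quick numpy check of the scaling with CGS constants (this is just the scaling argument for hydrogen, not Trident code):

```python
import numpy as np

kboltz = 1.380649e-16   # Boltzmann constant, erg/K
m_H    = 1.6735575e-24  # hydrogen atom mass, g

def thermal_b(T):
    """Thermal (Doppler) b-parameter for hydrogen, in cm/s."""
    return np.sqrt(2.0 * kboltz * T / m_H)

# At 1e4 K, b ~ 12.8 km/s; at 1e-2 K, b ~ 0.013 km/s, so the line
# profile is effectively a spike at the line center.
print(thermal_b(1.0e4) / 1e5, thermal_b(1.0e-2) / 1e5)
```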

 
- I'm using a small box with L = 12.5 cMpc, and the dz across the box is too small. I want to loop the sightline inside the same dataset until I achieve a fixed dz. Is this possible with Trident? I don't have access to the simulation parameter file, which makes things more difficult. Is the looping part achievable?
 
You can use the "make_compound_ray" function to generate rays that traverse the simulation volume multiple times: https://trident.readthedocs.io/en/latest/annotated_example.html#compound-lightrays .  But this requires the simulation parameter file.  I suppose you could try to make several sightlines across the box and then stitch them together yourself, but this will be a bit harder to achieve.  You could just manually pick the start_point and end_point of your different lines.  Just remember to look at the final redshift of the last ray element to figure out what to use as the starting redshift for the next sightline, and chain them until you cover your desired z range.  Keep in mind that the "starting" z is the high-z end, and the "ending" z is the low-z end.
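For the redshift bookkeeping when stitching rays yourself: each box crossing consumes roughly dz = L_box * H(z) / c in comoving terms, so you can step the starting redshift down segment by segment.  A back-of-the-envelope sketch with illustrative flat-LCDM parameters (substitute your simulation's actual cosmology; this is not Trident's internal machinery):

```python
import numpy as np

# Illustrative cosmology -- replace with your simulation's values.
H0 = 70.0          # km/s/Mpc
Om, OL = 0.3, 0.7  # flat LCDM
c = 2.99792458e5   # km/s
L_box = 12.5       # comoving Mpc

def Hz(z):
    """Hubble parameter H(z) in km/s/Mpc for flat LCDM."""
    return H0 * np.sqrt(Om * (1.0 + z)**3 + OL)

# Step down from the high-z end; each crossing spans dz ~ L_box * H(z) / c.
z = 0.1
segments = []          # (starting z, ending z) for each sightline
while z > 0.0:
    dz = L_box * Hz(z) / c
    segments.append((z, max(z - dz, 0.0)))
    z -= dz

print(len(segments))   # a few dozen crossings to cover dz = 0.1
```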

I hope this helps!  Good luck!

Cameron
 
--
Cameron Hummels
Computational Astrophysicist
California Institute of Technology

Aniket Bhagwat

Feb 4, 2020, 8:41:51 AM
to Cameron Hummels, trident-project-users
Hi Cameron,

Thanks for your response!
 
I guess I don't fully understand what you mean by this question.  The wavelength array lives in wavelength space, which does not directly correlate with the "l" array in physical space down the ray.  Are you trying to make a plot/movie like this: https://vimeo.com/116594959 ?  It may be a bit trickier for a particle-based dataset, because the "dl" and "l" arrays, the path lengths of the ray elements and their locations along the ray, are not uniformly distributed for SPH datasets.

As the ray traverses the simulation volume, it isn't sampling the fluids in a uniform way as it would for grid-based datasets.  It's actually identifying what portion of the ray passes in proximity to SPH particles along its length, within their smoothing kernels.  Trident records the fluid quantities for each of those particles in the ray itself (e.g., the temperature, density, and metallicity fields), so the ray data structure actually holds the particles' original fluid quantities, *not* the values along the ray itself.  How much these fluid quantities from the SPH particles *matter* for the ray is encoded in the path length of that ray element, the "dl" array.  So the "dl" array is not a true physical path length; it's a combination of the true path length and the proximity of that path to the SPH particle.  But for calculating column densities, the traditional calculation still works for a given ray element: density * dl = column density.  dl is just where all the magic is hidden about numerically integrating the path of the ray through the smoothing kernel of the particle.
 

I want to achieve something very similar to the movie you linked, but for the fields I want the x-axis in bins of Hubble-flow velocity (which Trident uses as velocity space when calculating spectra). I wanted to check whether I can bin the field values along the LOS in velocity space, just as the spectrum (tau) is binned in velocity space with bin width dlambda (dv_bin). This is interchangeable with wavelength space through lambda = lambda_0 (1 + z)(1 + v/c), which is what I intended to say. Does this make sense?
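Concretely, the mapping I have in mind is just applying that relation to a uniform velocity-bin array, e.g. for the 21 cm line (illustrative numbers only):

```python
import numpy as np

c = 2.99792458e5      # km/s
lambda_0 = 21.106     # cm, rest wavelength of the H I 21 cm line
z = 7.0               # illustrative sightline redshift

# Uniform velocity bins (km/s) map to observed wavelength via
# lambda = lambda_0 * (1 + z) * (1 + v/c).
v = np.linspace(-200.0, 200.0, 5)
lam = lambda_0 * (1.0 + z) * (1.0 + v / c)

print(lam)  # centered on lambda_0 * (1 + z)
```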

I guess the underlying question here is: given the field values from the ray, how does the spectrum generator convert these fields to tau (and exp(-tau)) values in wavelength/velocity space with a fixed dlambda value, and can the same process be applied to just the fields like temperature, column density, etc.?
 
You are the first person, to my knowledge, who has tried to use Trident to produce predictions of 21 cm data, so things may not work out ideally, but we will see what we can do.  The actual Voigt deposition is done using the "tau_profile" function: https://github.com/trident-project/trident/blob/master/trident/absorption_spectrum/absorption_line.py#L147 .  It's primarily used in absorption_spectrum.py here: https://github.com/trident-project/trident/blob/master/trident/absorption_spectrum/absorption_spectrum.py#L886 .  As you can see, the "thermb" value is what feeds into it to give Doppler (thermal) broadening to each line.  One way to suppress thermal broadening would be to generate a ray for your sightline, manually set the temperature field in that ray to something very small (zero may just break it), and then generate your spectrum from that.

Oh wow! What do you mean by things may not work out ideally? Under the assumption that the local gas temperature is the spin temperature, calculating the optical depth is as simple as tau = constant * N_HI / T. If I can bin N_HI and T in velocity space, then the spectrum would be a simple exp(-tau) in each bin.
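Schematically, with the binned fields in hand (the prefactor below is a placeholder, not the actual constant from Eq. 1, and the field values are made up):

```python
import numpy as np

# Velocity-binned fields along the sightline (illustrative values).
N_HI = np.array([1.0e18, 5.0e19, 2.0e18])  # H I column density per bin, cm^-2
T    = np.array([1.0e2,  5.0e2,  2.0e2])   # gas temperature (= T_spin), K

C_21 = 1.0e-19  # placeholder prefactor -- substitute the Eq. 1 constant

tau  = C_21 * N_HI / T   # 21 cm optical depth in each bin
flux = np.exp(-tau)      # normalized transmitted flux

print(tau, flux)
```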

I went through the bits you suggested; I was thinking of adding a boolean option to bypass the Voigt deposition.
 
You can use the "make_compound_ray" function to generate rays that traverse the simulation volume multiple times: https://trident.readthedocs.io/en/latest/annotated_example.html#compound-lightrays .  But this requires the simulation parameter file.  I suppose you could try to make several sightlines across the box and then stitch them together yourself, but this will be a bit harder to achieve.  You could just manually pick the start_point and end_point of your different lines.  Just remember to look at the final redshift of the last ray element to figure out what to use as the starting redshift for the next sightline, and chain them until you cover your desired z range.  Keep in mind that the "starting" z is the high-z end, and the "ending" z is the low-z end.

I'm currently trying to get my hands on the parameter file, but just to test what I intend to do, I tried this with the Gizmo data that is linked in the Trident examples. I made a thread about this on the GitHub page (apologies for multiple parallel threads via email and GitHub). I tried using the compound light ray feature on a single snapshot, and it doesn't quite work as I wanted. I kept a single snapshot at z = 0.1 (snap_N128L16_149.hdf5) in the runtime directory and ran:

ray = trident.make_compound_ray('N128L16.param', simulation_type='Gadget', near_redshift=0.0, far_redshift=0.1, max_box_fraction=25)

I got an error saying: YTOutputNotIdentified: Supplied ('snap_N128L16_151.hdf5',) {}, but could not load!

Given that the box size of the simulation at hand is 16 Mpc, 25 loops in the dataset at z = 0.1 should achieve the dz I'm looking for, but Trident expects me to provide the dataset at z = 0. I'm very confused by this. What am I doing wrong? The main reason I'm trying to get the LOS to fit in a single dataset is that the simulation I'm using has an output frequency of dz = 0.28 (for reference, I'm using the Aurora simulations, https://arxiv.org/pdf/1603.00034.pdf), and I'll need to loop inside the same box n times.

Stitching the sightlines manually seems much more difficult. If I can manage to get the compound ray working on the Gizmo snapshots as I'm trying, I think I'll be able to translate the same parameter file to suit my simulations.

I really appreciate all your help!

Aniket Bhagwat

Feb 4, 2020, 8:46:54 AM
to Cameron Hummels, trident-project-users
Edit: In the last bit I meant a redshift of 0.01, not 0.1.

Britton Smith

Feb 4, 2020, 11:33:26 AM
to Aniket Bhagwat, trident-project-users
Hi Aniket,

I am not 100% sure this will help with your first question, but it's possible. If you are working directly with the SpectrumGenerator as documented here, there is a keyword argument for the make_spectrum command called "store_observables" which, if set to True, will create a dictionary called "line_observables_dict" hanging off the SpectrumGenerator object. This is a relatively unknown feature that could use some more documentation, but I believe it will contain a dictionary of arrays of column density, wavelength, equivalent width, etc. for each line that was deposited in the spectrum. Have a look at the docstring for the make_spectrum command; it may contain a little more information.

For the error you're getting in the make_compound_ray command, it's possible a bug may have crept in, because I think this is supposed to work. I'm happy to try to help debug it. This thread is already pretty long and a bit hard to read; I think it might be easier to chat on the Slack channel (info in the Trident docs). If you're able to come on there, one of us should be around.

Britton


Aniket Bhagwat

Feb 4, 2020, 10:28:10 PM
to Britton Smith, trident-project-users
Hi Britton, 

I am not 100% sure this will help with your first question, but it's possible. If you are working directly with the SpectrumGenerator as documented here, there is a keyword argument for the make_spectrum command called "store_observables" which, if set to True, will create a dictionary called "line_observables_dict" hanging off the SpectrumGenerator object. This is a relatively unknown feature that could use some more documentation, but I believe it will contain a dictionary of arrays of column density, wavelength, equivalent width, etc. for each line that was deposited in the spectrum. Have a look at the docstring for the make_spectrum command; it may contain a little more information.

I'll test this out and go through the docstring, thanks! 

For the error you're getting in the make_compound_ray command, it's possible a bug may have crept in, because I think this is supposed to work. I'm happy to try to help debug it. This thread is already pretty long and a bit hard to read; I think it might be easier to chat on the Slack channel (info in the Trident docs). If you're able to come on there, one of us should be around.

The invite link in the docs doesn't seem to be active; could you send me a new one?

Kind Regards, 
Aniket

Britton Smith

Feb 5, 2020, 7:00:39 AM
to Aniket Bhagwat, trident-project-users
Hi Aniket,

I've seen this issue with the Slack invite link on another channel as well. I'm not an admin on the Trident Slack, so I don't think I can invite you, but perhaps someone else can. In the meantime, I've contacted Slack about the issue. Hopefully they can resolve it soon.

Britton