Re: Running AHF through pynbody for RAMSES data

Pontzen, Andrew

Jan 29, 2014, 5:41:02 PM
to pynbod...@googlegroups.com
Hi David,

Looks to me like the problem is not related to AHF versions but rather to the RAMSES reader, which currently doesn't produce an 'eps' (i.e. softening) array. If I recall correctly, this is because it's a little hard to associate a particle with the cell it came from (and it's the cells that implicitly know the softening length).

This could legitimately be raised as a github issue, but in the meantime the workaround would be just to set your own representative 'eps' array before trying to run AHF, e.g.

s['eps'] = pynbody.units.Unit('100 pc')

where 100 pc is a representative resolution. Note that the only reason eps is used here is to pick a cell size for the AHF input parameters.
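As a rough illustration (this is a sketch of the logic visible in the traceback below, not pynbody's exact code, and the helper name `finest_grid` is made up for this example): pynbody derives AHF's finest grid dimension as a power of two from 1/min(eps), capped at 131072.

```python
import numpy as np

# Sketch of the grid-sizing logic visible in the traceback (not pynbody's
# exact code; the helper name is invented for illustration).
def finest_grid(eps_min, cap=131072):
    """Largest power-of-two grid implied by the minimum softening,
    capped so AHF's finest refinement level stays bounded."""
    return int(min(2 ** np.floor(np.log2(1.0 / eps_min)), cap))

# e.g. a softening of 1/4096 of the box implies a 4096 finest grid
finest_grid(1.0 / 4096)
```

So a single representative value is all AHF needs here; it does not affect the physics of the halo finding itself.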

Hope that helps.

Best wishes, Andrew


On 29 Jan 2014, at 22:09, "David Sullivan" <dsulli...@gmail.com> wrote:

Hi all,

I'm trying to run AHF through pynbody (as explained here: http://pynbody.github.io/pynbody/tutorials/halos.html) but get the following error:

  File "hmf.py", line 8, in <module>

    h = pynbody.halo.AHFCatalogue(s)

  File "/PATH/TO/PYTHON/python2.6/site-packages/pynbody/halo.py", line 285, in __init__

    self._run_ahf(sim)

  File "/PATH/TO/PYTHON/python2.6/site-packages/pynbody/halo.py", line 571, in _run_ahf

    1.0 / np.min(sim['eps'])))), 131072])

  File "/PATH/TO/PYTHON/python2.6/site-packages/pynbody/snapshot.py", line 286, in __getitem__

    self._derive_array(i)

  File "/PATH/TO/PYTHON/python2.6/site-packages/pynbody/snapshot.py", line 1357, in _derive_array

    raise KeyError("No derivation rule for "+name)


I'm using AHF v1.0-075, although in the github issues it looks like only versions up to 1.0-067 are supported. Is this still the case? If so, does anyone know where I can get a copy of 1.0-067 (I can only find down to 1.0-071)? Greatly appreciate any help.


Many thanks,

David

David Sullivan

Feb 3, 2014, 12:29:08 PM
to pynbod...@googlegroups.com
Hi Andrew,

Many thanks for your reply; that took care of the above problem. I'm getting some strange particle counts, however:

time    = 0.000000

nbodies = 687402840

ndim    = 62

nsph    = 7530992

ndark   = 0

nstar   = 7531024


This is for a 4 Mpc/h 256^3 (both AMR cells and particles) simulation. Does pynbody perform on-the-fly format conversion of RAMSES data to an AHF-supported format? If so, could there be some issue with the header?

Many thanks,
David

Pontzen, Andrew

Feb 3, 2014, 3:00:50 PM
to David Sullivan, pynbod...@googlegroups.com
I don't think pynbody does any format conversion on behalf of AHF, so I think what you are seeing must be an AHF issue.
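Those fields (time, nbodies, ndim, nsph, ndark, nstar) look like a tipsy header being misread. If there is a tipsy file in play, one quick check from Python (a sketch only: it assumes the plain unpadded tipsy header layout, a double followed by five 32-bit ints; some variants pad the header or use the opposite byte order, which is exactly the kind of thing that produces nonsense values like ndim = 62):

```python
import struct

def read_tipsy_header(path, byte_order='<'):
    """Read a plain tipsy header: time (double) followed by
    nbodies, ndim, nsph, ndark, nstar (32-bit ints).
    Trying '<' vs '>' shows whether byte order is the culprit."""
    fmt = byte_order + 'd5i'
    with open(path, 'rb') as f:
        fields = struct.unpack(fmt, f.read(struct.calcsize(fmt)))
    return dict(zip(('time', 'nbodies', 'ndim', 'nsph', 'ndark', 'nstar'),
                    fields))
```

If the little-endian and big-endian reads both give garbage, the header is probably padded differently rather than byte-swapped.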

Best wishes, Andrew

Rok Roškar

Feb 3, 2014, 6:32:34 PM
to David Sullivan, pynbod...@googlegroups.com
Hi David,

The automatic AHF routines in pynbody expect a tipsy file (I believe). I cut out some of the bits of a script I use to run AHF on RAMSES outputs and put it in a gist here: https://gist.github.com/rokroskar/8794562

I *think* I've ironed out all the unit conversion bugs, at least for my use scenario, but I don't guarantee it will work for any given simulation. Hopefully it's self-explanatory enough that you can adapt it to your needs. 

Rok

David Sullivan

Feb 3, 2014, 7:51:03 PM
to pynbod...@googlegroups.com, David Sullivan, ros...@physik.uzh.ch
Hi both,

Many thanks for your help and for sharing the script, it's greatly appreciated! I'll try running it on my simulations tomorrow and let you know how I get on.

Thanks again for all the help,
David

Alex Fitts

Sep 19, 2014, 3:11:07 PM
to pynbod...@googlegroups.com, dsulli...@gmail.com, ros...@physik.uzh.ch

Hello all,

  I'm trying to do something similar except with a Gadget HDF file.  I keep getting errors that are similar to the one David was encountering.  Do Gadget HDF files have to be converted into Gadget binary format in order to be used by AHF?  Or is there a way to funnel them into AHF after using the pynbody.load function?

Cheers,
Alex Fitts

Andrew Pontzen

Sep 19, 2014, 5:30:29 PM
to Alex Fitts, pynbod...@googlegroups.com
Hi Alex

I don’t know for sure as I’ve never tried, but I suspect AHF does *not* read Gadget HDF files. Maybe someone knows better but if not...

You can convert them first within pynbody. If your HDF file is loaded as snap, you can use:

snap.write(fmt=pynbody.gadget.GadgetSnap,
           filename="converted.gadget")

I seem to remember you will need to make sure all the arrays you need to save have been loaded into memory first. I.e., before issuing snap.write, you’ll want to do something like

snap['pos']; snap['mass']; snap['vel']; snap.gas['rho'] ... etc.
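The reason for touching each array first is pynbody's lazy loading: an array is only read from disk on first access, and a writer can only save what is already in memory. A toy illustration of the pattern (this is just the idea, not pynbody's actual code):

```python
class LazySnap:
    """Toy stand-in for a lazily-loading snapshot (not pynbody's code)."""
    def __init__(self, on_disk):
        self._on_disk = on_disk   # data still sitting on disk
        self._loaded = {}         # arrays actually in memory
    def __getitem__(self, name):
        if name not in self._loaded:          # first access triggers the read
            self._loaded[name] = self._on_disk[name]
        return self._loaded[name]
    def write(self):
        # a writer only sees what has been loaded into memory
        return sorted(self._loaded)

snap = LazySnap({'pos': [0.0], 'vel': [0.0], 'mass': [1.0]})
snap['pos']; snap['vel']; snap['mass']   # touch everything before writing
```

Skipping the "touch" step means the converted file silently omits any array you never accessed.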


Hope this helps

Best wishes

Andrew




Alex Fitts

Sep 21, 2014, 3:36:54 PM
to pynbod...@googlegroups.com, fitts...@gmail.com
Andrew,

Let me back up. After loading the hdf5 files, I cannot even access all of the information in the file. Invoking snap['pos'] or snap['iord'] works fine, but neither snap['vel'] nor snap['mass'] has a 'derivation rule'. Is this possibly a mistake in my config.ini script in the pynbody directory?

In an attempt to temporarily sidestep the hdf5 issue, I have resorted to using binary snapshots. I am confused by the error I am receiving when trying to use the halos function. If I have already run AHF and have the corresponding .AHF_halos, .AHF_profiles etc., how can I get pynbody to notice them? Do the files have to be gzipped with the snapshot? I'm getting the following error when trying to run the halos function:

>>> h=s.halos()
gunzip: snapdir_051/snapshot_051.gz: No such file or directory
sh: ./AHF.in: Permission denied
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/homes/afitts/.local/lib/python2.7/site-packages/pynbody/snapshot/__init__.py", line 793, in halos
    return c(self, *args, **kwargs)
  File "/homes/afitts/.local/lib/python2.7/site-packages/pynbody/halo.py", line 612, in __init__
    glob.glob(sim._filename + '*z*halos*')[0])[:-5]
IndexError: list index out of range

Andrew Pontzen

Sep 21, 2014, 3:39:32 PM
to pynbod...@googlegroups.com
Hi Alex,

I think you’ll have to point me at some sample files so that I can reproduce the errors. If you can’t load velocity or mass arrays, it sounds like you’ve found a bug. Either post a link to the files and script here, or on github, and I’ll try to work out what’s gone wrong.

Cheers, Andrew





Alex Fitts

Sep 21, 2014, 4:20:54 PM
to pynbod...@googlegroups.com

Hey Andrew,

Here is a link to the snapshot and a script that should reproduce the hdf5 loading error: https://www.dropbox.com/s/gl61p6bz1t5jhfe/example.tar?dl=0

Thanks for your help!

-Alex

Andrew Pontzen

Sep 21, 2014, 5:16:04 PM
to Alex Fitts, pynbod...@googlegroups.com
Interestingly, your velocity array, which in HDF files I've seen in the past is called 'Velocity', is instead called 'Velocities'.

Similarly, your mass array is called 'Masses' rather than 'Mass'.

Is this file written using regular Gadget or is it some variant?

We should produce a patch if this is a common variant, but in the meantime you can fix the problem by creating a .pynbodyrc file in your home folder with the following contents:

[gadgethdf-name-mapping]
Velocities: vel
Velocity: xvel
Masses: mass
Mass: xmass
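The effect of that section can be sketched in plain Python (purely illustrative; this mirrors the idea of the config mapping, not pynbody's actual implementation):

```python
# Illustrative only: translate on-disk HDF5 dataset names to pynbody's
# internal array names, mirroring the [gadgethdf-name-mapping] section above.
name_map = {
    'Velocities': 'vel',    # this file's naming
    'Velocity':   'xvel',   # the more common name, parked under an alias
    'Masses':     'mass',
    'Mass':       'xmass',
}

def internal_name(hdf_name):
    # names without a mapping entry pass through unchanged
    return name_map.get(hdf_name, hdf_name)
```

Mapping the unused spellings to dummy names like 'xvel' just keeps them from colliding with the real arrays.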


There is an added complication: 'Mass' was hardcoded into the gadgethdf handler for cases where the mass derives from the header (which is partially true in your case), so you will also need to update your version of pynbody to the new version I just created:

git pull origin gadgethdf_flexibility
python setup.py install

should do it.


Hope this helps. Please let us know which variant of Gadget you are using, and I can prepare a more permanent fix.

Best wishes, Andrew


On 21 Sep 2014, at 21:20, Alex Fitts <fitts...@gmail.com> wrote:


Hey Andrew,

 Here is a link to the snapshot and a script that should reproduce the hdf5 loading error. https://www.dropbox.com/s/gl61p6bz1t5jhfe/example.tar?dl=0 Thanks for your help!

-Alex

Andrew Pontzen

Sep 21, 2014, 5:17:22 PM
to Alex Fitts, pynbod...@googlegroups.com
P.S. You can see the diff here; it's just one line where 'Mass' was hard-coded.




Alex Fitts

Sep 21, 2014, 11:44:57 PM
to pynbod...@googlegroups.com, fitts...@gmail.com
Great! That worked perfectly.  The file was produced using Gadget 3.

Thanks Again,
Alex

Andrew Pontzen

Sep 22, 2014, 5:25:31 AM
to pynbod...@googlegroups.com
Sure, which version of Gadget 3 though? There are a lot of variants. I’m trying to work out if your naming scheme in the HDF file is going to be used by other people, or just by a small group.

Cheers, Andrew




Alex Fitts

Sep 22, 2014, 9:22:08 AM
to pynbod...@googlegroups.com
I believe it's the FIRE variant. Sorry about the confusion.

-Alex

Rick Sarmento

Apr 14, 2015, 8:27:53 PM
to pynbod...@googlegroups.com, dsulli...@gmail.com, ros...@physik.uzh.ch
Hello Rok...

I'm trying to figure out your python script (since I have RAMSES data that I need to run AHF on...), but I can't load it. There's a dangling "else :" clause in the spawn_amiga routine...

 
    else : 
        os.environ['OMP_NUM_THREADS'] = '16'
        os.system("~/bin/amiga_pthread_for_tipsyramses %s.AHF.input"%newfile)


Does that make sense? I used the link below.

I guess I have to convert RAMSES to TIPSY before using "h=s.halos()" ... so I'm kinda stuck. Any help would be appreciated!

Rick 

Rick Sarmento

Apr 14, 2015, 8:29:55 PM
to pynbod...@googlegroups.com, dsulli...@gmail.com, ros...@physik.uzh.ch
FYI -- there is a ramses2gadget utility that comes with AHF, but not much documentation!

R

Rok Roškar

Apr 17, 2015, 2:30:22 AM
to Rick Sarmento, pynbod...@googlegroups.com, dsulli...@gmail.com
Hi Rick,

Yes, you need to work with the RAMSES data as a tipsy output if you want to use AHF. You can do this using, e.g., https://github.com/pynbody/pynbody/blob/master/pynbody/analysis/ramses_util.py#L269