I am doing a project where I need the positions and velocities of galaxy clusters (halos?) at specific snapshots,
and within each galaxy cluster I need each galaxy's (subhalo?) position and velocity.
I wanted to check that my interpretation of the catalog is correct.
For the Cluster:
I believe GroupCM and GroupPos are two different ways of expressing the position, and GroupVel is the velocity.
And how would I correctly identify each cluster's identifier so I can find which subgroups belong to it?
For the Galaxy:
SubhaloCM and SubhaloPos are two different ways of expressing position, and SubhaloVel is the velocity?
Does SubhaloParent refer to the parent cluster, or does SubhaloGrNr? Or are they locations of the parent cluster in two separate tables?
Are all of these in absolute xyz coordinates, or are, say, the galaxies' positions given relative to the center of their cluster?
Thanks!
Craig
Dylan Nelson
10 Jun '19
Hi Craig,
I would use GroupPos and SubhaloPos for the locations of the clusters (halos) and member galaxies, respectively.
First, you would pick a cluster, e.g. by selecting on Group_M_Crit200 (the halo mass). Then you have its position, and also GroupNsubs and GroupFirstSub, which together give you the list (so to speak) of the member galaxies. The first one is the central galaxy (e.g. BCG) itself, while all the remaining ones are satellites.
You can verify you have the right subhalos (member galaxies) by confirming, as you say, that their SubhaloGrNr is equal to the cluster (index/ID) that you selected.
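The selection steps above can be sketched as follows. This is a minimal sketch using small synthetic stand-in arrays; with real data, these fields would instead come from the group catalogs (e.g. via illustris_python's il.groupcat.loadHalos / il.groupcat.loadSubhalos), and the values below are invented for illustration only:

```python
import numpy as np

# Synthetic stand-ins for the group-catalog fields (real data: il.groupcat.loadHalos/loadSubhalos)
Group_M_Crit200 = np.array([500.0, 12000.0, 800.0])         # halo masses
GroupFirstSub   = np.array([0, 2, 7])                        # index of each halo's first (central) subhalo
GroupNsubs      = np.array([2, 5, 3])                        # number of subhalos per halo
SubhaloGrNr     = np.array([0, 0, 1, 1, 1, 1, 1, 2, 2, 2])   # parent halo index of each subhalo

# Pick a cluster, e.g. the most massive one
halo_id = int(np.argmax(Group_M_Crit200))

# Member galaxies form a contiguous block starting at GroupFirstSub
first = GroupFirstSub[halo_id]
members = np.arange(first, first + GroupNsubs[halo_id])
central, satellites = members[0], members[1:]   # first is the central (BCG), rest are satellites

# Cross-check: every member's SubhaloGrNr should point back to the selected halo
assert np.all(SubhaloGrNr[members] == halo_id)
```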
All the positions are in "box coordinates", so to make them relative, you would want e.g. GalaxyPosRel_x = GroupPos_x - SubhaloPos_x. Careful of periodic boundaries if the cluster is near the edge of the box.
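The periodic wrap can be handled with a minimum-image correction. A sketch in NumPy, following the sign convention above (GroupPos - SubhaloPos); box_size is the simulation box side length in the same units as the positions, e.g. 75000 ckpc/h for TNG100:

```python
import numpy as np

def galaxy_pos_rel(group_pos, subhalo_pos, box_size):
    """Minimum-image difference GroupPos - SubhaloPos; negate for the
    galaxy's offset from the cluster center."""
    d = np.asarray(group_pos, dtype=float) - np.asarray(subhalo_pos, dtype=float)
    # Wrap each component into [-box_size/2, box_size/2)
    return (d + box_size / 2.0) % box_size - box_size / 2.0

# Example: a galaxy just across the periodic boundary from the cluster center
# comes out as a small offset (+20 along x), not ~-74980
print(galaxy_pos_rel([10.0, 50.0, 0.0], [74990.0, 100.0, 0.0], 75000.0))
```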
Which data set should I be looking at for this information? Also, I have been following the getting-started tutorial online and I am running into an error with the Illustris-3 data set. I downloaded both the snapshot and the FoF & Subfind [group catalog] files for snapshot 135, but I can't find any fof_subhalo_tab hdf5 files.
I keep running into the error:
" Unable to open file (unable to open file: name = './Illustris_3/output//groups_135/fof_subhalo_tab_135.0.hdf5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)"
when following the example script. Yet I have all the files for snapshot 135. Is there any other supplementary data I need to download for the example files that I am missing? All I see are hdf5 files prefixed with either groups (2 such files) or snap (30 such files). I also was required
Thanks!
Dylan Nelson
13 Jun '19
Hi Craig,
The group catalog is called groups_*.hdf5 for Illustris and fof_subhalo*.hdf5 for TNG, but these files are essentially the same. You can see where in the Python script it searches for both filenames, so as long as you have these files in the path it is looking for, it should work.
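The fallback between the two naming conventions can be sketched like this (a hypothetical reconstruction; the real path construction lives inside illustris_python's groupcat module, and may differ in detail):

```python
import os

def group_catalog_path(base_path, snap_num, chunk_num=0):
    """Return whichever group-catalog filename exists: Illustris or TNG naming."""
    gc_dir = os.path.join(base_path, 'groups_%03d' % snap_num)
    illustris_name = os.path.join(gc_dir, 'groups_%03d.%d.hdf5' % (snap_num, chunk_num))
    tng_name = os.path.join(gc_dir, 'fof_subhalo_tab_%03d.%d.hdf5' % (snap_num, chunk_num))
    if os.path.isfile(illustris_name):
        return illustris_name
    # Fall back to the TNG naming; a missing file then errors at open time
    return tng_name
```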
Craig Cissel
13 Jun '19
Thank you!
I realize now it was just a simple path error. My apologies for the inconvenience.
You've been an amazing help!
Alexander Akoto-Danso
I am having the same issue. I am even running this on the TNG Lab and I still get this same error:
OSError: Unable to open file (unable to open file: name = '/TNG100-1/output/groups_099/fof_subhalo_tab_099.0.hdf5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
Meanwhile, the files exist.
Could you please link up with me or tell me specifically how you solved it? I am also following the example but I keep getting this error.
Dylan Nelson
19 Aug '20
Hi Alexander,
Likely just a path issue. Where did you start Python (in what directory? It should agree with os.getcwd()), and what is basePath?
Alexander Akoto-Danso
19 Aug '20
Hi Dylan,
The directory is '/home/alex/PHD_YEAR_TWO_2019/SIMULATE/MARTINI/TNG100-1/output'. This is where I started Python from. basePath is basePath = '/TNG100-1/output'
Dylan Nelson
19 Aug '20
Hello,
The search path is the concatenation of those two. So either change your working directory to '/home/alex/PHD_YEAR_TWO_2019/SIMULATE/MARTINI/' or change your basePath to '.'
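To see concretely which file the loader ends up opening, one can rebuild the path by hand (a sketch; the filename pattern here is the TNG one taken from the error message above, and the exact construction lives in illustris_python):

```python
import os

base_path = '/TNG100-1/output'   # note: the leading slash makes this an ABSOLUTE path
snap_num = 99

# The group-catalog chunk the loader effectively opens:
candidate = os.path.join(base_path, 'groups_%03d' % snap_num,
                         'fof_subhalo_tab_%03d.0.hdf5' % snap_num)
print(candidate)  # /TNG100-1/output/groups_099/fof_subhalo_tab_099.0.hdf5

# An absolute basePath ignores the working directory entirely; a relative one
# (e.g. '.') is resolved against os.getcwd() when the file is opened.
print(os.path.isabs(base_path))  # True
```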
Antonio Porras
Hi Dylan,
I am having the same error here. I am trying to load the coordinates of the DM particle data from the snapshot (looking for the Coordinates field). When I load the star particle data, it works fine (I get the coordinates for it). I run the following: all_pos = il.snapshot.loadSubset(basedir, snapnum, parttype, fields=['Coordinates']).astype(np.float32) (parttype=4 runs fine, but parttype=1 gives me the following error line).
I am unsure if Alexander Akoto-Danso was able to resolve the issue. Using TNG300-1, I get a similar error line:
Unable to open file (Unable to open file: name = '/fs1/porrasaj/astronomy_research/illustristng/tng300_1/output/snapdir_099/snap_099.600.hdf5', errno = 2, error message = 'no such file or directory', flags = 0, o_flags = 0)
I checked my path and even changed the directory capitalization to match the path that appears in the error. My basedir='/fs1/porrasaj/astronomy_research/illustristng/tng300_1/output', and I also tried changing the basedir to '/fs1/porrasaj/astronomy_research/illustristng' and to '.', which were the suggestions you made above. Still not successful.
Something different here is that the snap_099 files only go up to 599, so the file snap_099.600.hdf5 does not exist, which is what the error indicates. Following this error, I get another error which could be related to the previous one:
IOError Traceback (most recent call last)
<ipython-input-8-822459f635d0> in <module>()
----> 1 all_pos = il.snapshot.loadSubset(basedir, snapnum, parttype, fields=['Coordinates']).astype(np.float32)
/fs1/porrasaj/astronomy_research/illustristng/tng300_1/output/illustris_python/snapshot.pyc in loadSubset(basePath, snapNum, partType, fields, subset, mdi, sq, float32)
74 i = 1
75 while gName not in f:
---> 76 f = h5py.File(snapPath(basePath, snapNum, i), 'r')
77 i += 1
78
/fs1/porrasaj/anaconda2/lib/python2.7/site-packages/h5py/_hl/files.pyc in __init__(self, name, mode, driver, libver, userblock_size, swmr, **kwds)
270
271 fapl = make_fapl(driver, libver, **kwds)
--> 272 fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
273
274 if swmr_support:
/fs1/porrasaj/anaconda2/lib/python2.7/site-packages/h5py/_hl/files.pyc in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
90 if swmr and swmr_support:
91 flags |= h5f.ACC_SWMR_READ
---> 92 fid = h5f.open(name, flags, fapl=fapl)
93 elif mode == 'r+':
94 fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)
h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2696)()
h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2654)()
h5py/h5f.pyx in h5py.h5f.open (/home/ilan/minonda/conda-bld/work/h5py/h5f.c:1942)()
The error points me to a line within the loadSubset function in the snapshot.py script of the illustris_python module: "f = h5py.File(snapPath(basePath, snapNum, i), 'r')". I edited the while loop so that it does not go to snap file 600, but after that I get another error... I feel like I am going down a rabbit hole for something as simple as loading DM coordinate data from the snapshot files.
Dylan Nelson
18 Sep '20
Hi Antonio,
Yes this doesn't make much sense. This works fine for me, and will work e.g. on the Lab.
In [1]: basePath = 'sims.TNG/TNG300-1/output/'
In [2]: snapNum = 99
In [3]: fields = ['Coordinates']
In [4]: partType = 1
In [5]: x = il.snapshot.loadSubset(basePath, snapNum, partType, fields)
Have you downloaded the complete snapshots, i.e. not subsets of them with only the stars?
What is the output of h5ls -r snapdir_099/snap_099.0.hdf5?
You see that this section of loadSubset() from lines 73-77 is simply searching for a file which contains a group named PartType1. You can add a print(gName) to make sure it says "PartType1". If it does, then it must be the case that none of your HDF5 files for that snapshot contain such a group? (This is not true on the original files, so my only thought is that you have downloaded them only partially).
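That search loop behaves roughly like this. This is a standalone sketch using plain dicts in place of open h5py files, since the logic only depends on a group-membership test; the chunk contents below mimic a stars-only download:

```python
# Stand-ins for the per-chunk HDF5 files of one snapshot: each maps group
# names to their contents (a real file would be opened with h5py.File).
chunks = [
    {'Header': {}, 'PartType4': {'Coordinates': '...'}},  # stars-only download
    {'Header': {}, 'PartType4': {'Coordinates': '...'}},
]

def find_chunk_with_group(chunks, g_name):
    """Return the index of the first chunk containing group g_name, else None."""
    for i, f in enumerate(chunks):
        if g_name in f:    # same membership test loadSubset() uses on h5py files
            return i
    # loadSubset() instead keeps incrementing the chunk number, hence the
    # attempt to open a nonexistent file like snap_099.600.hdf5
    return None

print(find_chunk_with_group(chunks, 'PartType4'))  # stars are present
print(find_chunk_with_group(chunks, 'PartType1'))  # no DM in any chunk
```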
Antonio Porras
18 Sep '20
Have you downloaded the complete snapshots, i.e. not subsets of them with only the stars?
This is the answer to my error. I forgot that a while back, when I downloaded the snapshot data, I did so for a particular set of fields for the star particles only, because the entire snapshot data was a lot to store. When I run h5ls -r snapdir_099/snap_099.1.hdf5, I get:
/ Group
/Header Group
/PartType4 Group
/PartType4/Coordinates Dataset {1706501, 3}
/PartType4/GFM_StellarFormationTime Dataset {1706501}
/PartType4/Masses Dataset {1706501}
/PartType4/Velocities Dataset {1706501, 3}
This tells me that the snap_099.1.hdf5 file contains only Coordinates, GFM_StellarFormationTime, Masses, and Velocities for PartType4 (stars) [I think]. I did add the print(gName) statement in the function to make sure, and it prints PartType1. You are right: none of the HDF5 files for snapshot 99 that I downloaded contain dark matter or gas particle data. That is the reason I get the error.
Thank you for your help and timely response.
Alexander Akoto-Danso
23 Sep '20
Hi Antonio,
I would like to link up with you directly so you can give me some pointers. I want to know how you solved this issue. My email address is alexander.akotodanso@gmail.com.
Antonio Porras
Hi Alexander,
Sure thing. Emailing you now.
Antonio