I went through the forums and looked through a lot of the questions regarding getting images, but I was having trouble pinning down a concrete answer.
I have certain galaxies in a particular snapshot that I have been investigating, and I would really like to be able to visualize these systems. I currently use the JupyterLab service for querying/obtaining data from the TNG servers. At this point, I was just going to use this URL I found in the supporting documents (http://www.tng-project.org/api/TNG100-1/snapshots/99/subhalos/397866/skirt/broadband_sdss.fits), where I believe the number 397866 refers to the Subhalo index of the galaxy of interest, and recursively download all of the files onto my server (I'm fine with viewing them locally or in the JupyterLab service).
Are there better ways to accomplish this?
Dylan Nelson
19 Jul '21
Are you interested in the SKIRT synthetic images in particular? If so, and you need just a few, then yes I would suggest as you say above, 'downloading' those of interest into your personal Lab. If you need to access e.g. every single file, then I can possibly add these to the Lab more directly.
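For the "download a few of interest" route, here is a minimal Python sketch, not official tooling: the skirt_url helper mirrors the URL pattern quoted in the question, the api-key request header follows the public API's usual authentication style, and "YOUR_API_KEY" and the output filename are placeholders.

```python
def skirt_url(sim, snap, subhalo_id):
    """URL of the SKIRT broadband image for one subhalo (pattern from the post above)."""
    return (f"http://www.tng-project.org/api/{sim}/snapshots/{snap}"
            f"/subhalos/{subhalo_id}/skirt/broadband_sdss.fits")

def download(url, api_key, filename):
    """Fetch one file, passing the API key in the request header."""
    import requests  # imported here so the URL helper above works without it installed
    r = requests.get(url, headers={"api-key": api_key})
    r.raise_for_status()
    with open(filename, "wb") as f:
        f.write(r.content)

# Example (needs a valid key):
# download(skirt_url("TNG100-1", 99, 397866), "YOUR_API_KEY", "broadband_sdss_397866.fits")
```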
For making images of fields other than stellar light, you can similarly use the visualization tool. Once you have made the visualization you want, you can find the link at the bottom of the page; you can then replace the subhalo ID with other subhalo IDs, just as for the SKIRT images.
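In practice, "replace the subhalo ID" is just a string substitution on the link copied from the page. A short Python sketch (the parameter string is one illustrative example, and the IDs are hypothetical):

```python
# Link copied from the bottom of the visualization page, with the subhalo ID
# replaced by a {subhalo_id} placeholder (the parameters shown are just an example).
VIS_TEMPLATE = ("https://www.tng-project.org/api/TNG100-1/snapshots/79/subhalos/{subhalo_id}"
                "/vis.png?partType=gas&partField=mass&size=20&sizeType=kpc"
                "&rasterPx=1100&fracsType=rHalfMassStars&plotStyle=edged")

subhalo_ids = [442874, 397866]  # hypothetical IDs of interest
urls = [VIS_TEMPLATE.format(subhalo_id=sid) for sid in subhalo_ids]
```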
Hey Dylan,
I am finding that the visualization tool is actually very powerful and might be exactly what I need for my purposes. I have just a few short questions about it:
1. I'm looking at a handful of my ~200-galaxy sample, each around 10^9 Msun in stellar mass, and a sizeable fraction (~20%) of the galaxies I've seen so far appear as a small speck of stars and gas, completely isolated from any other matter (stars, gas, DM) just outside their effective radius. Are those bona fide galaxies? The SubhaloFlag for them is true as well. Example: https://www.tng-project.org/api/TNG100-1/snapshots/79/subhalos/442874/vis.png?partType=gas&partField=mass&size=20&sizeType=kpc&rasterPx=1100&fracsType=rHalfMassStars&plotStyle=edged
2. Some galaxies have a gas vel_los that is basically zero (ranging from -0.1 km/s to 0.1 km/s). Why is that? Ex: https://www.tng-project.org/api/TNG100-1/snapshots/79/subhalos/442874/vis.png?partType=gas&partField=vel_los&size=20&sizeType=kpc&rasterPx=1100&fracsType=rHalfMassStars&plotStyle=edged
3. Would a wget command, replacing the subhalo ID in the URL, allow me to recursively get all of the files I need (probably 1000+)?
Dylan Nelson
19 Jul '21
The first example looks OK to me; it is a very small (M* = 10^9 Msun) satellite galaxy. It could be getting stripped, e.g. Yun+2018. You're right to be cautious, though, especially for such low-mass (marginally resolved) galaxies.
For the second example, the values are not between -0.1 and 0.1 km/s; only the colorbar range is. That range is an automatic guess, but in this case you should override it (by setting e.g. min=-50&max=50).
Yes exactly, you can wget in a loop a number of subhalo IDs of interest. Please run this in serial (i.e. one request at a time).
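A strictly serial loop along those lines might look like the sketch below. This is not official tooling: the vis.png parameters and output filenames are illustrative, and the API key is appended as one more &-separated URL parameter.

```python
import subprocess
import time

def vis_url(subhalo_id, api_key, sim="TNG100-1", snap=99):
    """Image URL for one subhalo; the key is appended as one more &-separated parameter."""
    return (f"https://www.tng-project.org/api/{sim}/snapshots/{snap}"
            f"/subhalos/{subhalo_id}/vis.png?partType=gas&partField=mass"
            f"&size=20&sizeType=kpc&api_key={api_key}")

def fetch_all(subhalo_ids, api_key):
    """Strictly serial: wget blocks until each file is written, so requests never overlap."""
    for sid in subhalo_ids:
        subprocess.run(["wget", "-q", "-O", f"subhalo_{sid}.png",
                        vis_url(sid, api_key)], check=True)
        time.sleep(0.5)  # brief pause between requests, to be gentle on the server
```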
DErrick Carr
19 Jul '21
Thanks, this is all great help! The visualization tool (and some code I was running in the JupyterLab) suddenly became extremely slow; is anything going on with the TNG servers?
Hey Dylan,
So I started working on a wget script to get all of these images, and I keep getting an HTTP Error 403: Forbidden. I add my API key and the token at the end, but if I open an incognito browser and go to the URL, I get a failed session auth as well. Do you know why this is happening? I put a dummy URL at the bottom that works if I am signed in, but fails if not. [I edited my API key for safety; feel free to use yours or another one]
https://www.tng-project.org/api/TNG100-1/snapshots/67/subhalos/584290/vis.png?partType=gas&partField=mass&size=90&sizeType=kpc&rasterPx=1100&fracsType=kpc&axesUnits=kpc&plotStyle=edged&labelZ=True&labelScale=True&labelSim=True&labelHalo=True&colorbars=True?api_key=05014e03f5846bc25bbc789
Hello,
You should never include a "token", or use a link to e.g. "data-eu.tng-project.org"; both of these change over time.
To make such requests, you have two options, either (i) add your API key to the wget command (header) as in the example for downloading data, or (ii) add your API key to the URL as above.
The only issue with the link above is a syntax typo, since ? can appear only once in a URL. You only need to change ?api_key= to &api_key=, to indicate that this is one more (of many) URL parameters.
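A tiny helper makes the rule explicit, assuming you are building the URLs in Python: use ? before the first query parameter and & before every one after it.

```python
def add_api_key(url, api_key):
    """Append api_key as a query parameter: '?' if the URL has none yet, '&' otherwise."""
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}api_key={api_key}"
```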
DErrick Carr
16 Sep '21
Thanks Dylan! Was not aware of that last line, and that change fixed it.