JupyterLab crashes when trying to open a massive snapshot

Pablo Galan de Anta
  • 20 Apr '22

Hi,

When I try to read the snapshot data for subhalo 96762 (in particular, the stars), the code crashes while assigning masses, velocities, and the other variables.

In particular:

import numpy as np
import illustris_python as il

# Header and the subhalo's group-catalogue entry
head = il.groupcat.loadHeader(basePath, SnapNum)
sing = il.groupcat.loadSingle(basePath, SnapNum, subhaloID=subhalo)
#VV  = sing['SubhaloVel']

hh = head['HubbleParam']    # dimensionless Hubble parameter h
scalefact = head['Time']    # scale factor a at this snapshot

# Load the stellar particles belonging to the subhalo
snap = il.snapshot.loadSubhalo(basePath, SnapNum, subhalo, 'stars',
                               fields=['Coordinates', 'Velocities', 'Masses', 'Potential',
                                       'GFM_Metallicity', 'GFM_StellarFormationTime'])

print('Defining the quantities')
potential = snap['Potential'] / scalefact              # -> physical potential
coords = snap['Coordinates'] * scalefact / hh          # ckpc/h -> physical kpc
vels = snap['Velocities'] * np.sqrt(scalefact)         # -> peculiar velocity in km/s
mass = snap['Masses'] * 1.0e10 / hh                    # 10^10 Msun/h -> Msun
Z_sun = np.log10(snap['GFM_Metallicity'] / 0.0127)     # log metallicity relative to solar
StellarTime = snap['GFM_StellarFormationTime']         # scale factor at star formation

The code crashes whenever I try to do any operation on these variables (such as constraining to within 2Re, filtering by mass, or anything else). Is there another way to open the data without exceeding the memory?

Thanks in advance,
Pablo

Dylan Nelson
  • 20 Apr '22

Dear Pablo,

There is a 10 GB memory limit, I believe, so you won't be able to load massive datasets all at once.

To confirm that you are hitting a memory limit, I'd suggest switching to a small simulation for testing (e.g. TNG100-3).

If you are hitting a memory limit, the way forward would be to think about how you can compute what you want without loading all the stars at once.
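
One way to keep the peak memory down, for example, is to load a single field at a time, apply your selection immediately, and free the full array before loading the next field. The following is only a minimal sketch of that idea; it assumes the basePath, SnapNum, subhalo, hh and scalefact variables from your snippet above, and uses an illustrative fixed 20 kpc aperture where your real analysis would use 2Re:

import numpy as np
import illustris_python as il

# Load only the coordinates first and build the spatial mask.
pos = il.snapshot.loadSubhalo(basePath, SnapNum, subhalo, 'stars',
                              fields=['Coordinates'])            # ckpc/h, shape (N, 3)
center = il.groupcat.loadSingle(basePath, SnapNum, subhaloID=subhalo)['SubhaloPos']
r = np.linalg.norm(pos - center, axis=1)                         # comoving radius in ckpc/h
                                                                 # (ignores periodic wrapping)
keep = r * scalefact / hh < 20.0                                 # e.g. within 20 physical kpc
del pos, r

# Load the remaining fields one at a time and keep only the selected stars.
selected = {}
for field in ['Masses', 'Velocities', 'Potential',
              'GFM_Metallicity', 'GFM_StellarFormationTime']:
    data = il.snapshot.loadSubhalo(basePath, SnapNum, subhalo, 'stars',
                                   fields=[field])
    selected[field] = data[keep]
    del data                                                     # free the full array

Assuming the selection keeps only a small fraction of the stars, the peak memory is then roughly one full field plus the small masked copies, rather than all six full arrays at once.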

Pablo Galan de Anta
  • 21 Apr '22

Dear Dylan,

I've found a way to open massive subhaloes without exceeding the memory limit.

Thanks for your help!
