Hi,

I have many IDs and I want to do task 9 for them, but I want one curve (the average curve) in my chart instead of many curves. However, I can't compute the average of my curves...

For example, for part type gas, I want to take the median of all the points at each snapshot and then connect those points into a single curve.
I can do this for one snapshot! As shown in the following code (for example, for two IDs):

# assumes: import numpy as np, import matplotlib.pyplot as plt,
# and the get() helper from the Illustris API tutorial
ids = [109974, 110822]
list1 = []
for id in ids:
    url = "http://www.illustris-project.org/api/Illustris-1/snapshots/68/subhalos/" + str(id)
    sub = get(url)  # get json response of subhalo properties

    # prepare dict to hold result arrays
    fields = ['snap', 'id', 'mass_gas']
    r = {}
    for field in fields:
        r[field] = []

    # walk the descendant tree, recording each field at every snapshot
    while sub['desc_sfid'] != -1:
        for field in fields:
            r[field].append(sub[field])
        # request the full subhalo details of the descendant by following the sublink URL
        sub = get(sub['related']['sublink_descendant'])

    # take the gas mass at one snapshot (list index 1) and convert to log solar masses
    for partType in ['gas']:
        mass_logmsun = np.log10(np.array(r['mass_' + partType][1]) * 1e10 / 0.704)
        list1.append(mass_logmsun)

# average over the IDs and plot a single point at that snapshot
ave = sum(list1) / len(ids)
plt.plot(r['snap'][1], ave, '.', label=partType)
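
For reference, get() here is the small download helper from the Illustris API tutorial. A minimal version, assuming the requests library and with 'YOUR_API_KEY' as a placeholder for a real key, might look like:

import requests

def get(path, params=None):
    # send the API key in the request header; 'YOUR_API_KEY' is a placeholder
    headers = {'api-key': 'YOUR_API_KEY'}
    r = requests.get(path, params=params, headers=headers)
    r.raise_for_status()  # raise an exception on a failed request
    if r.headers['content-type'] == 'application/json':
        return r.json()  # parse and return JSON responses as dicts
    return r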

But I can't generalize it to all the snapshots...
(I want to write a loop that does this for every snapshot.)

Thank you for your help.

Dylan Nelson

13 Jul '20

Hi Maryam,

I'd suggest you review something like a numpy tutorial, which should be helpful later on. I would do something more like:

num_snaps = 100
ids = [109974, 110822]
mass_vs_time = np.zeros( (len(ids), num_snaps), dtype='float32' )

for i, id in enumerate(ids):
    # download data for this subhalo
    mass_vs_time[i,:] = gas_mass_history

avg_mass_vs_time = np.mean(mass_vs_time, axis=0)
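
To make the connection to the API code above explicit, here is one possible way to fill that array (a sketch under assumptions, not the only option): index the columns by snapshot number, mark snapshots where a subhalo has no entry with NaN, and average with np.nanmean. It reuses the get() helper and the field names from the first post; num_snaps = 136 assumes Illustris-1's snapshots 0-135.

num_snaps = 136  # Illustris-1 snapshots run from 0 to 135
ids = [109974, 110822]

# rows are subhalos, columns are snapshot numbers; NaN marks "no entry"
mass_vs_time = np.full((len(ids), num_snaps), np.nan, dtype='float32')

for i, id in enumerate(ids):
    url = "http://www.illustris-project.org/api/Illustris-1/snapshots/68/subhalos/" + str(id)
    sub = get(url)
    # walk the descendant tree, storing the gas mass at each snapshot number
    while sub['desc_sfid'] != -1:
        mass_vs_time[i, sub['snap']] = sub['mass_gas'] * 1e10 / 0.704  # to solar masses
        sub = get(sub['related']['sublink_descendant'])

# average over the subhalos, ignoring snapshots a subhalo is missing from
avg_mass_vs_time = np.nanmean(mass_vs_time, axis=0)

snaps = np.arange(num_snaps)
valid = ~np.isnan(avg_mass_vs_time)
plt.plot(snaps[valid], np.log10(avg_mass_vs_time[valid]), label='gas')

Because np.nanmean skips the NaN entries in each column, subhalos whose histories cover different snapshot ranges still average cleanly.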

Maryam salimi

16 Jul '20

Thank you so much, Dylan. You helped me a lot. Thank you again.

Hi again, Dylan,

If I want to get the standard deviation for the mean curve I drew using the above code, can you help me?
I have used the following code as a continuation of the code above:

total_bins = 10
bins = np.linspace(68, 134, total_bins)
delta = bins[1] - bins[0]

# x: the snapshot numbers corresponding to avg_mass_vs_time
idx = np.digitize(x, bins)

running_median = [np.median(avg_mass_vs_time[idx == k]) for k in range(total_bins)]
plt.plot(bins - delta/2, running_median, color='blue', lw=2, markersize=1)

running_std = [avg_mass_vs_time[idx == k].std() / np.sqrt(len(avg_mass_vs_time[idx == k]))
               for k in range(total_bins)]
plt.errorbar(bins - delta/2, running_median, running_std, color='blue', lw=2.0,
             marker='o', markersize=4, ls='none', mew=1)

But I think it is not right...

Thanks in advance for your help.

Dylan Nelson

16 Jul '20

Hi Maryam,

The numpy std() function also has an axis argument, so you can maybe use it in the same way. Beyond this, I'm afraid I can't help with such detailed aspects.
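
A minimal sketch of that hint, assuming the mass_vs_time and avg_mass_vs_time arrays from the earlier sketch (use np.nanstd instead for the NaN-filled variant): with axis=0, np.std gives the scatter across the subhalos at every snapshot directly, with no binning step.

# standard deviation across the subhalos at each snapshot
std_vs_time = np.std(mass_vs_time, axis=0)

snaps = np.arange(mass_vs_time.shape[1])
plt.errorbar(snaps, avg_mass_vs_time, yerr=std_vs_time, color='blue',
             marker='o', markersize=4, ls='none', mew=1)

(Separately, note that np.digitize returns 1-based bin indices, so the list comprehensions above likely want range(1, total_bins + 1) rather than range(total_bins).)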

Maryam salimi

16 Jul '20

I understand. Thank you very much for your guidance and for spending your time to help. Thank you so much.
