-- ManelErrando - 12 Oct 2009

[[Image:fermi_satellite.png|thumb|270px|right]]

Some hints to start analyzing Fermi data.

The Fermi Collaboration provides documentation, examples and tutorials. A very good description of the Fermi analysis threads, with all the needed steps and examples, can be found [http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/ here]. What follows is complementary information to that guide.

= Getting the data =

Weekly all-sky photon lists are systematically downloaded to PIC. They are stored in /nfs/magic-buffer2/FermiData, organized in directories by release date. There you will find the photon files and the spacecraft files (with information about the exposures).

Alternatively, you can get the Fermi data of the source you are interested in directly from the FSSC [http://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi web]. For the analysis you will need both the photon and the spacecraft data.

= Installation of the Fermi Tools =

The easiest way is to use the ready-to-go, official installation at PIC. All you need to do is define the following environment variables (e.g. in your .bashrc and/or in your job-submission script):

<verbatim>
MAGICSWPATH=/nfs/magic/sw_slc4/magic/pic  # common to other MAGIC software, you may have it already
FERMISWDIR=$MAGICSWPATH/FermiST
FERMIPLATFORM="i686-pc-linux-gnu-libc"
FERMISWVERSION="2.3.4"
FERMISWTAG="v9r15p2-fssc-20090808-${FERMIPLATFORM}${FERMISWVERSION}"
export FERMI_DIR=$FERMISWDIR/ScienceTools-${FERMISWTAG}/$FERMIPLATFORM$FERMISWVERSION
source $FERMI_DIR/fermi-init.sh
</verbatim>
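After sourcing the initialization script, a quick sanity check (plain shell commands, nothing Fermi-specific) is to verify that the tools are found in your PATH:

<verbatim>
$ echo $FERMI_DIR
$ which gtselect
</verbatim>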

Alternatively, you can build a personal installation:
# Get the latest version of the Fermi Science Tools from the [http://fermi.gsfc.nasa.gov/ssc/data/analysis/software/ web] and follow the installation instructions.
# Once installed and compiled, you just need to launch the initialization script every time before you start working:
<verbatim>
$ source $FERMI_DIR/fermi-init.sh
</verbatim>
If you use Mac OS, you might have to add this line to your .bashrc:
<verbatim>
$ export DYLD_INSERT_LIBRARIES=$FERMI_DIR/lib/libf2c.so
</verbatim>

SLC4 users may also be interested in the script /nfs/magic/sw_slc4/magic/pic/FermiST/install_FermiST.sh. Please change the installation paths (and possibly the version) before use.

= Installing additional software =

Fermi data files and outputs are mainly in FITS format. You may want to get [http://heasarc.nasa.gov/lheasoft/ftools/fv/ fv] to view FITS files. [http://hea-www.harvard.edu/RD/ds9/ ds9] is also useful for viewing images, and if you want to manipulate and edit FITS files you will need [http://heasarc.nasa.gov/docs/software/ftools/ftools_menu.html ftools].

= Working with the Fermi Science Tools =

The Fermi Collaboration provides basic tools for the data analysis. Type fhelp tool_name to learn how to use each tool. A list of the provided tools can be found [http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/references.html here]. Most of the tools can be launched in interactive mode, in which case you will be asked to type each of the input parameters. Example:
<verbatim>
$ gtselect
Input FT1 file[events.fits]
Output FT1 file[events-filtered.fits]
RA for new search center (degrees) (0:360) [216.751632]
Dec for new search center (degrees) (-90:90) [23.80001]
radius of new search region (degrees) (0:180) [10]
start time (MET in s) (0:) [256608000]
end time (MET in s) (0:) [266976000]
lower energy limit (MeV) (0:) [100]
upper energy limit (MeV) (0:) [300000]
maximum zenith angle value (degrees) (0:180) [105]
</verbatim>

Alternatively, one can pass the input parameters directly on the command line, which is useful if you want to call the tools from a script:
<verbatim>
$ gtselect events.fits events-filtered.fits 216.751632 23.80001 10 256608000 266976000 100 300000 105
</verbatim>

'''WARNING''': The Fermi Tools produce temporary parameter files, which are stored in $HOME/pfiles/toolname.par. If you run several instances of the same program (e.g. on the farm), they can interfere with each other, so check your output for errors like this:
<verbatim>
error reading from file "/nfs/pic.es/user/k/klepser/pfiles/gtltcube.par"
</verbatim>
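A common workaround is to give each job its own parameter directory before calling any tool. Here is a minimal sketch, assuming the standard HEASoft PFILES mechanism (read-write and read-only directories separated by a semicolon) that the Science Tools use; the temporary path is a placeholder:

<verbatim>
# create a private parameter directory for this job
MYPFILES=$(mktemp -d /tmp/pfiles.XXXXXX)
# read the system .par files from the installation, write local copies to MYPFILES
export PFILES="$MYPFILES;$FERMI_DIR/syspfiles"
# each job now works on its own .par copies and cannot clash with others
</verbatim>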

Next, you can find an example of how to select data from a particular (point-like) source using the PIC installation. There are two scripts involved. The first one, ''template.sh'', configures the environment and does not require changes for particular users or analyses. You do not need to run this script, since it is called by the second one.

<verbatim>
#!/bin/bash
# template.sh
MAGICSWPATH=/nfs/magic/sw_slc4/magic/pic  # common to other MAGIC software, you may have it already
FERMISWDIR=$MAGICSWPATH/FermiST
FERMIPLATFORM="i686-pc-linux-gnu-libc"
FERMISWVERSION="2.3.4"
FERMISWTAG="v9r15p2-fssc-20090808-${FERMIPLATFORM}${FERMISWVERSION}"
export FERMI_DIR=$FERMISWDIR/ScienceTools-${FERMISWTAG}/$FERMIPLATFORM$FERMISWVERSION
source $FERMI_DIR/fermi-init.sh
</verbatim>

The second one, ''selectData.sh'', selects photons within a circle centered at coordinates (''RA,DEC'') and radius ''RADIUS'', between times ''TMIN'' and ''TMAX'' (expressed in seconds after 01-01-2001; 0 means no time cuts) and energies ''EMIN'' and ''EMAX'' (in MeV). In the example, all photons above 100 MeV in a circle of 5 deg around Cygnus X-3 are selected. You may also need to configure the output path (''OPATH'') to point to a local directory of your own.

<verbatim>
#!/bin/bash -f
# selectData.sh
IPATH=/nfs/magic-buffer2/FermiData
OPATH=/nfs/pic.es/user/c/castro/fermi/data/
RA=308.107    # coordinates in degrees (J2000), as gtselect expects
DEC=40.9578
RADIUS=5
EMIN=100
EMAX=200000
ZMAX=105
TMIN=0
TMAX=0
for dir in $(ls $IPATH | grep 200); do
  SCRIPT=./analyze$dir.sh
  cat template.sh > $SCRIPT
  FILEI=$IPATH/$dir/$(ls $IPATH/$dir | grep LAT_allsky)
  FILEO=$OPATH/cygX3_${RADIUS}deg_${dir}_filtered.fits
  # gtselect
  echo "echo Running gtselect" >> $SCRIPT
  echo gtselect evclsmin=3 evclsmax=3 infile=$FILEI outfile=$FILEO ra=$RA dec=$DEC rad=$RADIUS tmin=$TMIN tmax=$TMAX emin=$EMIN emax=$EMAX zmax=$ZMAX >> $SCRIPT
  echo "echo Done" >> $SCRIPT
  echo "" >> $SCRIPT
  qsub -q short $SCRIPT
done
</verbatim>

By running this script, you will generate a set of scripts (one per weekly Fermi data file), which will be sent to the ''short'' queue at PIC. Each of them will read the corresponding weekly list of photons and produce a file containing only the events that pass the selected cuts.

= Likelihood analysis =

A [http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/likelihood_tutorial.html tutorial] on the likelihood analysis is provided on the FSSC web pages. Here is just a summary of the tools you will have to use, in sequential order. In the following examples it is assumed that you have your photon data in a file called events.fits and the spacecraft data in spacecraft.fits. Essentially, the steps are:

# Select the data
# Filter good-quality data
# Make a livetime cube
# Make an exposure map
# Make a model & likelihood analysis

'''HINT''': In case you have many event files, you may at any stage list them in an ASCII file and pass that file, preceded by '@', instead of the event file (i.e. @eventFileList.txt); see the sketch after the list below. Here is what the commands may look like:

* '''[http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/help/gtselect.txt gtselect]''': Once you have your photon data file, you might want to apply cuts in arrival direction, arrival time, estimated energy, etc. using the gtselect tool. Here you also choose the event class cut you want to apply to the events. For point-source analysis, the recommended choice is to select ''diffuse class'' events. Example:
<verbatim>
$ gtselect evclsmin=3 evclsmax=3 events.fits events-filtered.fits 216.751632 23.80001 10 256608000 266976000 100 300000 105
</verbatim>
* '''[http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/help/gtmktime.txt gtmktime]''': In this step the time interval information in the event file is updated using the spacecraft data. Basically, the time intervals with bad data quality, South Atlantic Anomaly crossings, and intervals when the region of interest was outside the field of view are removed. Example:
<verbatim>
$ gtmktime spacecraft.fits (IN_SAA!=T)&&(DATA_QUAL==1) yes events-filtered.fits events-filtered-gti.fits
</verbatim>
* '''[http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/help/gtltcube.txt gtltcube]''': Calculates the integrated livetime and stores it in a FITS file. In case of many event files, you have to create one cube for all of them (see the hint above for how). Example:
<verbatim>
$ gtltcube events-filtered-gti.fits spacecraft.fits expCube.fits 0.025 1
</verbatim>
* '''[http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/help/gtexpmap.txt gtexpmap]''': Creates the exposure maps that will be used to calculate the number of predicted photons from every source in your model. Usually, the exposure map is generated for a region with a radius 10º larger than your region of interest, so that the possible contribution of photons from sources outside the field of view can be taken into account. In case of many event files, you have to create one exposure map for all of them (see the hint above for how). Example:
<verbatim>
$ gtexpmap events-filtered-gti.fits spacecraft.fits expCube.fits expMap.fits P6_V3_DIFFUSE 20 100 100 20
</verbatim>
* '''[http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/help/gtlike.txt gtlike]''': Fits the free parameters in your model to the data. The output of gtlike is an ASCII file with the best-fit values of all the free parameters in the model. Example:
<verbatim>
$ gtlike P6_V3_DIFFUSE expCube.fits src_model.xml UNBINNED NEWMINUIT events-filtered-gti.fits spacecraft.fits expMap.fits
</verbatim>

== The Source Model ==

The final step in the likelihood analysis is to fit a user-defined model of the gamma-ray sources expected in the selected FoV to the data. The model is input as an XML file that may be created by hand or with the modeleditor. A sketch of such a file, describing the FoV of PKS 1424+240, is given after the list below.

The model has two basic ingredients:
* '''Diffuse emission''': The model must contain a description of the expected diffuse gamma-ray emission. By default this is accomplished by adding two diffuse sources: EG_v02 and GAL_v02. EG_v02 accounts for the isotropic extragalactic background, and GAL_v02 describes the galactic component. Both are described by tabulated files provided by the [http://fermi.gsfc.nasa.gov/ssc/data/access/lat/BackgroundModels.html FSSC].
* '''Gamma-ray sources''': The gamma-ray sources in the selected FoV have to be described in the model. A first approach is to check the [http://fermi.gsfc.nasa.gov/ssc/data/access/lat/bright_src_list/ LAT bright source list] for sources near your source of interest. The sources can be point-like or diffuse, and their spectrum and spatial model have to be specified (help [http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/xml_model_defs.html here]). Spectral parameters can be set free or fixed. If free, gtlike will return the best-fit value.
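What follows is a minimal sketch of such a model file, not the exact file from an actual analysis: the parameter limits and normalisations are illustrative, the diffuse file names are those of the P6_V3-era background models, and the PKS 1424+240 coordinates are those of the gtselect example above.

<verbatim>
<?xml version="1.0" ?>
<source_library title="source library">
  <!-- isotropic extragalactic background -->
  <source name="EG_v02" type="DiffuseSource">
    <spectrum file="isotropic_iem_v02.txt" type="FileFunction">
      <parameter free="1" max="10" min="0.01" name="Normalization" scale="1" value="1"/>
    </spectrum>
    <spatialModel type="ConstantValue">
      <parameter free="0" max="10" min="0" name="Value" scale="1" value="1"/>
    </spatialModel>
  </source>
  <!-- galactic diffuse component -->
  <source name="GAL_v02" type="DiffuseSource">
    <spectrum type="ConstantValue">
      <parameter free="1" max="10" min="0" name="Value" scale="1" value="1"/>
    </spectrum>
    <spatialModel file="gll_iem_v02.fit" type="MapCubeFunction">
      <parameter free="0" max="1000" min="0.001" name="Normalization" scale="1" value="1"/>
    </spatialModel>
  </source>
  <!-- the source of interest, modelled as a point-like power law -->
  <source name="PKS 1424+240" type="PointSource">
    <spectrum type="PowerLaw">
      <parameter free="1" max="1000" min="0.001" name="Prefactor" scale="1e-09" value="1"/>
      <parameter free="1" max="-1" min="-5" name="Index" scale="1" value="-2"/>
      <parameter free="0" max="2000" min="30" name="Scale" scale="1" value="100"/>
    </spectrum>
    <spatialModel type="SkyDirFunction">
      <parameter free="0" max="360" min="-360" name="RA" scale="1" value="216.7516"/>
      <parameter free="0" max="90" min="-90" name="DEC" scale="1" value="23.8000"/>
    </spatialModel>
  </source>
</source_library>
</verbatim>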

'''HINT''': You should not try to fit all positions, indices and flux normalisations at once. Instead, start with a simple minimiser (DRMNGB), power-law spectra and fixed locations. Start with the brightest sources and iteratively free their parameters. Later, you may switch to MINUIT and adjust the spectral shapes.

'''HINT''': You can make gtlike write an output model XML file, including the fitted parameters, by adding sfile=newmodel.xml to the gtlike call. This way, you only have to release parameters and reiterate.
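For instance, using the named-parameter form with the example files from above (check fhelp gtlike to confirm the parameter names for your version of the tools):

<verbatim>
$ gtlike sfile=newmodel.xml irfs=P6_V3_DIFFUSE expcube=expCube.fits srcmdl=src_model.xml \
    statistic=UNBINNED optimizer=NEWMINUIT evfile=events-filtered-gti.fits \
    scfile=spacecraft.fits expmap=expMap.fits
# next iteration: free more parameters in newmodel.xml and use it as srcmdl
</verbatim>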

'''HINT''': To simplify the fitting procedure, one should fix the source positions, using the catalog position of each object by default. If some object is suspected to be displaced with respect to the position in the model, one can fit the source position using gtfindsrc and then run gtlike again with the new coordinates.

= Light curve =

There is a tool, gtbin, that easily produces a light curve. The resulting light curve contains all photons in the field of view, binned in time. Even selecting a FoV of the size of the Fermi PSF (approx. 3.5º at 100 MeV), and depending on the source strength, the contamination from diffuse photons may exceed 50%. To produce a light curve that takes into account the model for the selected FoV, one has to manually chop the event data file into time bins and do a full likelihood analysis for every time bin (or write a script that does it), and then compute the integral flux of the source in each time bin from the gtlike output.
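A minimal sketch of such a time-binning loop follows; the bin width, the MET range (taken from the gtselect example above) and the helper script runLike.sh, which would wrap the gtltcube/gtexpmap/gtlike steps for one bin, are hypothetical placeholders:

<verbatim>
#!/bin/bash
# chop the event file into time bins and run a likelihood analysis on each
TSTART=256608000   # MET start of the light curve (s)
TSTOP=266976000    # MET end (s)
DT=604800          # bin width: one week (s)

T0=$TSTART
while [ $T0 -lt $TSTOP ]; do
  T1=$((T0 + DT))
  gtselect infile=events-filtered-gti.fits outfile=bin_${T0}.fits \
    ra=216.751632 dec=23.80001 rad=10 tmin=$T0 tmax=$T1 \
    emin=100 emax=300000 zmax=105
  ./runLike.sh bin_${T0}.fits   # hypothetical per-bin likelihood wrapper
  T0=$T1
done
</verbatim>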

= Energy Spectrum =

The Fermi Tools do not offer a straightforward way of extracting a usable energy spectrum. One way is to cut your data into energy bins with gtselect, and go through the whole likelihood procedure above bin by bin. If you use PowerLaw2 in the model XML files, and adjust the energy range for each bin, you directly get the integral flux in your bin. This analysis of course does not take energy migrations into account, so you may have to do an unfolding at the end, which is not provided by the Fermi Tools. A way to avoid this is to inspect the spectrum, decide upon the function you think is appropriate, and repeat the likelihood analysis on the full dataset, which corresponds to a forward unfolding. This way, you have (non-unfolded) data points and an unfolded function, which is also what the Fermi collaboration sometimes presents.
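A sketch of the energy-binning loop; the bin edges and the runLike.sh wrapper are hypothetical, and the LowerLimit/UpperLimit parameters of the PowerLaw2 source in the model XML have to be changed to match each bin:

<verbatim>
#!/bin/bash
# run the likelihood analysis in independent energy bins
EDGES=(100 300 1000 3000 10000 100000)   # bin edges in MeV (illustrative)

for ((i = 0; i < ${#EDGES[@]} - 1; i++)); do
  E0=${EDGES[$i]}
  E1=${EDGES[$((i+1))]}
  gtselect infile=events-filtered-gti.fits outfile=ebin_${E0}_${E1}.fits \
    ra=216.751632 dec=23.80001 rad=10 tmin=0 tmax=0 \
    emin=$E0 emax=$E1 zmax=105
  # set LowerLimit/UpperLimit of the PowerLaw2 source to E0/E1 in the model,
  # then rerun gtltcube/gtexpmap/gtlike on this bin (hypothetical wrapper)
  ./runLike.sh ebin_${E0}_${E1}.fits $E0 $E1
done
</verbatim>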
