Difference: FermiPyAnalysis (1 vs. 17)

Revision 17 2020-08-21 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 47 to 47
 
$FERMIPIPE/setup_pipeline.sh -p <CONDA_PATH>
Changed:
<
<
For example, to install to your home directory (recommended for milne), use -p $HOME/miniconda3. You can also use this option to create the fermi environment in an existing conda installation, if you have one, provided it doesn't already have an environment named fermi.
>
>
For example, to install to your home directory, use -p $HOME/miniconda3. You can also use this option to create the fermi environment in an existing conda installation, if you have one, provided it doesn't already have an environment named fermi.
 

Configuring Your Analysis

Revision 16 2020-07-30 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 119 to 112
For analysis, you can use many of the commands from the run_analysis.py script, which creates all the objects necessary for GTAnalysis. Most importantly, note the calls to gta.free_source() and gta.delete_sources() that make the default model selection (they live between the calls to gta.setup() and gta.optimize() on one side, and gta.fit() and the various print statements on the other). You can depart from the default by adding or deleting sources and by fixing or freeing parameters to help your fit converge, then use plots to assess how well it worked. Using these calls explicitly in your notebook saves you from having to run the same script multiple times.
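For instance, a minimal interactive session in a notebook might look like the following. This is only a sketch: the config file name, the TS threshold, and the particular free/fixed choices are placeholders rather than the pipeline's defaults.

from fermipy.gtanalysis import GTAnalysis

# Placeholder config name -- use your own <prefix>_config.yml.
gta = GTAnalysis('config.yml', logging={'verbosity': 3})

gta.setup()     # prepare the data products for the ROI
gta.optimize()  # quick iterative fit to get sensible starting values

# Example model selection (adjust to your own analysis):
gta.delete_sources(minmax_ts=[None, 4], exclude=['galdiff', 'isodiff'])  # drop very weak sources
gta.free_sources(distance=3.0, pars='norm')  # free normalizations near the ROI center
gta.free_source('galdiff')                   # free the Galactic diffuse component
gta.free_source('isodiff')                   # free the isotropic diffuse component

fit_results = gta.fit()                      # full likelihood fit
print(fit_results['fit_quality'])

gta.write_roi('fit_model')                   # save the fitted ROI to disk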

Troubleshooting

Changed:
<
<
Fermipy says "WARNING: version mismatch between CFITSIO header (v3.43) and linked library (v3.41)."
This should not cause any issues with the analysis. For more details see here.
>
>
Fermipy says "WARNING
version mismatch between CFITSIO header (v3.43) and linked library (v3.41)." : This should not cause any issues with the analysis. For more details see here.
 
My analysis finished without reporting any errors, but I don't see any plots.
This sometimes occurs. Simply rerun the analysis. It will pick up from where it terminated before and produce the plots.
Line: 127 to 120
  Introduction_to_Fermipy_Analysis_at_Nevis.pdf
Added:
>
>
FermiPy_Analysis_Tips.pptx
 Fermipy documentation

Fermipy Tutorials from Fermi Summer School

Revision 15 2020-07-21 - GwenLaPlante

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 154 to 154
 

META FILEATTACHMENT attachment="Introduction_to_Fermipy_Analysis_at_Nevis.pdf" attr="" comment="" date="1591037562" name="Introduction_to_Fermipy_Analysis_at_Nevis.pdf" path="Introduction_to_Fermipy_Analysis_at_Nevis.pdf" size="2748978" user="AriBrill" version="1"
Added:
>
>
META FILEATTACHMENT attachment="FermiPy_Analysis_Tips.pptx" attr="" comment="" date="1595364712" name="FermiPy_Analysis_Tips.pptx" path="FermiPy_Analysis_Tips.pptx" size="870709" user="GwenLaPlante" version="1"

Revision 14 2020-06-23 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 96 to 96
 To generate the light curve, first run the base analysis. Once that's complete, rerun the analysis with the --lightcurve option. The pipeline can divide the light curve analysis into sections, so that an error encountered in one bin crashes only a portion of the analysis rather than the whole thing. If you use this functionality, the section may be specified either in the pipeline configuration file or on the command line with the --section option, which overrides the value set in the configuration file. After the analysis has run for all sections, the output files must be combined.
Added:
>
>
To set the bin size of your lightcurve (in units of seconds), edit the binsz parameter in the lightcurve section of your Fermipy config file.
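For example, daily bins correspond to binsz: 86400. The snippet below is one way to set this programmatically; it is only a sketch (the file name and bin size are placeholders), and editing the YAML file by hand works just as well.

import yaml

config_file = 'config.yml'  # placeholder -- use your own <prefix>_config.yml

with open(config_file) as f:
    config = yaml.safe_load(f)

# Set daily (86400 s) light curve bins; the value is only an example.
config.setdefault('lightcurve', {})['binsz'] = 86400

with open(config_file, 'w') as f:
    yaml.safe_dump(config, f, default_flow_style=False)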
 Run the analysis for the first section of the light curve:

python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml --lightcurve --section 0
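If you have several sections, a small driver script along the following lines could run them back to back and then combine the results. This is a hypothetical sketch: it assumes num_sections is set at the top level of pipeline_config.yml and simply invokes the same pipeline scripts described on this page.

import os
import subprocess
import yaml

fermipipe = os.environ['FERMIPIPE']
config_path = os.path.join(os.environ['FERMI_ANALYSIS_DIR'], 'pipeline_config.yml')

# Assumes num_sections is a top-level key of the pipeline config.
with open(config_path) as f:
    num_sections = yaml.safe_load(f)['num_sections']

# Run each light curve section in turn (sections are indexed from 0).
for section in range(num_sections):
    subprocess.run(['python', os.path.join(fermipipe, 'run_analysis.py'),
                    config_path, '--lightcurve', '--section', str(section)],
                   check=True)

# Combine the per-section output files.
subprocess.run(['python', os.path.join(fermipipe, 'combine_lightcurve_results.py'),
                config_path], check=True)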

Line: 117 to 119
 For analysis, you can use many of the commands from the run_analysis.py script, which creates all the objects necessary for GTAnalysis. Most importantly, note the calls to gta.free_source() and gta.delete_sources() that make the default model selection (they live between the calls to gta.setup() and gta.optimize() on one side, and gta.fit() and the various print statements on the other). You can depart from the default by adding or deleting sources and by fixing or freeing parameters to help your fit converge, then use plots to assess how well it worked. Using these calls explicitly in your notebook saves you from having to run the same script multiple times.

Troubleshooting

Changed:
<
<
Fermipy says "WARNING
version mismatch between CFITSIO header (v3.43) and linked library (v3.41).": This should not cause any issues with the analysis. For more details see here.
>
>
Fermipy says "WARNING: version mismatch between CFITSIO header (v3.43) and linked library (v3.41)."
This should not cause any issues with the analysis. For more details see here.
 
My analysis finished without reporting any errors, but I don't see any plots.
This sometimes occurs. Simply rerun the analysis. It will pick up from where it terminated before and produce the plots.

Revision 13 2020-06-22 - DeividRibeiro

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 101 to 108
  Below are steps for altering your SSH command to forward a port so that you can use Jupyter running on tehanu from your local computer and browser.
Using Jupyter from tehanu:
  1. Here, ‘port’ will be a four digit number that you choose. It must be the same in all commands where it is used. I recommend choosing something non-standard to lessen the probability that someone else will be logged into tehanu with the same port. Here is an example list of common linux ports that you should avoid: https://www.techbrown.com/cheat-sheet-most-commonly-used-port-numbers-cheat-sheet-linux/
Changed:
<
<
  1. ssh username@tehanu.nevis.columbia.edu -L port:localhost:port
>
>
  1. ssh username@tehanu.nevis.columbia.edu -L port:localhost:port
 
  1. You are now in tehanu with your chosen port being forwarded to ‘localhost.’
  2. Activate your Anaconda environment and navigate to your analysis folder with the .ipynb file. (use fermisetup shown above)
Changed:
<
<
  1. jupyter notebook --no-browser --port=port
>
>
  1. jupyter notebook --no-browser --port=port
 
  1. This will sit for a moment and spit out a bunch of text. Take a look and make sure that the port you specified wasn't already in use. There will be a URL in the printout containing 'localhost'. Copy this into your browser and go there. You should have access to Jupyter in your browser now.
  2. Jupyter might request a token that was also in the output. If so, copy that into the appropriate box to gain access.
For analysis, you can use many of the commands from the run_analysis.py script, which creates all the objects necessary for GTAnalysis. Most importantly, note the calls to gta.free_source() and gta.delete_sources() that make the default model selection (they live between the calls to gta.setup() and gta.optimize() on one side, and gta.fit() and the various print statements on the other). You can depart from the default by adding or deleting sources and by fixing or freeing parameters to help your fit converge, then use plots to assess how well it worked. Using these calls explicitly in your notebook saves you from having to run the same script multiple times.

Revision 12 2020-06-22 - DeividRibeiro

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 105 to 97
  python $FERMIPIPE/combine_lightcurve_results.py $FERMI_ANALYSIS_DIR/pipeline_config.yml
Added:
>
>

Jupyter Notebook

Below are steps for altering your SSH command to forward a port so that you can use Jupyter running on tehanu from your local computer and browser.
Using Jupyter from tehanu:

  1. Here, ‘port’ will be a four digit number that you choose. It must be the same in all commands where it is used. I recommend choosing something non-standard to lessen the probability that someone else will be logged into tehanu with the same port. Here is an example list of common linux ports that you should avoid: https://www.techbrown.com/cheat-sheet-most-commonly-used-port-numbers-cheat-sheet-linux/
  2. ssh username@tehanu.nevis.columbia.edu -L port:localhost:port
  3. You are now in tehanu with your chosen port being forwarded to ‘localhost.’
  4. Activate your Anaconda environment and navigate to your analysis folder with the .ipynb file. (use fermisetup shown above)
  5. jupyter notebook --no-browser --port=port
  6. This will sit for a moment and spit out a bunch of text. Take a look and make sure that the port you specified wasn't already in use. There will be a URL in the printout containing 'localhost'. Copy this into your browser and go there. You should have access to Jupyter in your browser now.
  7. Jupyter might request a token that was also in the output. If so, copy that into the appropriate box to gain access.
For analysis, you can use many of the commands from the run_analysis.py script, which creates all the objects necessary for GTAnalysis. Most importantly, note the calls to gta.free_source() and gta.delete_sources() that make the default model selection (they live between the calls to gta.setup() and gta.optimize() on one side, and gta.fit() and the various print statements on the other). You can depart from the default by adding or deleting sources and by fixing or freeing parameters to help your fit converge, then use plots to assess how well it worked. Using these calls explicitly in your notebook saves you from having to run the same script multiple times.
 

Troubleshooting

Changed:
<
<
Fermipy says "WARNING
version mismatch between CFITSIO header (v3.43) and linked library (v3.41).": This should not cause any issues with the analysis. For more details see here.
>
>
Fermipy says "WARNING
version mismatch between CFITSIO header (v3.43) and linked library (v3.41).": This should not cause any issues with the analysis. For more details see here.
 
My analysis finished without reporting any errors, but I don't see any plots.
This sometimes occurs. Simply rerun the analysis. It will pick up from where it terminated before and produce the plots.

Revision 11 2020-06-09 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 66 to 66
 
fermipy_config
the name of your Fermipy config file; if null, defaults to <prefix>_config.yml
Changed:
<
<
multithread
use multiple processes when calculating TS maps, residual maps and light curves (true/false)
>
>
delete_source
list of names of sources to delete from the model
 
Changed:
<
<
nthread
number of processes to use when multithread is true
>
>
delete_sources
list of dictionaries containing parameters defining sources to delete from the model; isodiff, galdiff, and all custom sources are automatically excluded

free_source
list of names of sources to free in the model fitting

free_sources
list of dictionaries containing parameters defining sources to free in the model fitting
 
Deleted:
<
<
calculate_sed
whether to calculate the SED (true/false)
  The following parameters pertain to the light curve analysis only:

Revision 10 2020-06-01 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 15 to 16
  In addition to initializing the conda environment for the fermi code for your account, this will provide you with several definitions and commands (you can see them by opening with a text editor the file .myprofile in your home directory).
Changed:
<
<
FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
directory where your analysis will occur by default
>
>
FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
directory where your analysis will occur by default
 
Changed:
<
<
fermisetup
set up your environment before running analyses
>
>
fermisetup
set up your environment before running analyses
 
Changed:
<
<
fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"
run an analysis
>
>
fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"
run an analysis
  NOTE: If your home directory is on Milne or another Nevis machine other than tehanu, see the special instructions "Customizing the Installation" below. If you're unsure where your home directory is, type echo $HOME.
Line: 57 to 62
  The pipeline around Fermipy is designed in a similar way. In your analysis directory, you should have a file called pipeline_config.yml, which you can use as a template for your pipeline configuration file.
Changed:
<
<
prefix
the name to be used for the output directory (within the fermi_analysis directory) and as a prefix for your output files; shouldn't contain spaces
>
>
prefix
the name to be used for the output directory (within the fermi_analysis directory) and as a prefix for your output files; shouldn't contain spaces
 
Changed:
<
<
fermipy_config
the name of your Fermipy config file; if null, defaults to <prefix>_config.yml
>
>
fermipy_config
the name of your Fermipy config file; if null, defaults to <prefix>_config.yml
 
Changed:
<
<
multithread
use multiple processes when calculating TS maps, residual maps and light curves (true/false)
>
>
multithread
use multiple processes when calculating TS maps, residual maps and light curves (true/false)
 
Changed:
<
<
nthread
number of processes to use when multithread is true
>
>
nthread
number of processes to use when multithread is true
 
Changed:
<
<
calculate_sed
whether to calculate the SED (true/false)
>
>
calculate_sed
whether to calculate the SED (true/false)
  The following parameters pertain to the light curve analysis only:
Changed:
<
<
num_sections
number of sections into which to split the light curve; if null, do not split

section
the index of the section to analyze if num_sections is not null
>
>
num_sections
number of sections into which to split the light curve; if null, do not split
 
Added:
>
>
section
the index of the section to analyze if num_sections is not null
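Putting these together, a pipeline configuration might look roughly like the following. This is a sketch only; every value is a placeholder, and the pipeline_config.yml template copied into your analysis directory is the authoritative list of keys and defaults.

import yaml

# Illustrative pipeline configuration -- all values here are placeholders.
pipeline_config = {
    'prefix': 'my_source_2020',  # output directory and file prefix (no spaces)
    'fermipy_config': None,      # null -> defaults to <prefix>_config.yml
    'multithread': True,         # parallelize TS maps, residual maps, light curves
    'nthread': 4,                # number of processes when multithread is true
    'calculate_sed': True,       # whether to calculate the SED
    'num_sections': 4,           # split the light curve into 4 sections (null = no split)
    'section': 0,                # index of the section to analyze when num_sections is set
}

with open('pipeline_config.yml', 'w') as f:
    yaml.safe_dump(pipeline_config, f, default_flow_style=False)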
 

Running Your Analysis

Line: 96 to 102
  python $FERMIPIPE/combine_lightcurve_results.py $FERMI_ANALYSIS_DIR/pipeline_config.yml
Deleted:
<
<
 

Troubleshooting

Changed:
<
<
Fermipy says "WARNING: version mismatch between CFITSIO header (v3.43) and linked library (v3.41)."
This should not cause any issues with the analysis. For more details see here.
>
>
Fermipy says "WARNING
version mismatch between CFITSIO header (v3.43) and linked library (v3.41).": This should not cause any issues with the analysis. For more details see here.
 
Changed:
<
<
My analysis finished without reporting any errors, but I don't see any plots.
This sometimes occurs. Simply rerun the analysis. It will pick up from where it terminated before and produce the plots.
>
>
My analysis finished without reporting any errors, but I don't see any plots.
This sometimes occurs. Simply rerun the analysis. It will pick up from where it terminated before and produce the plots.
 

Useful links

Added:
>
>
Introduction_to_Fermipy_Analysis_at_Nevis.pdf
 Fermipy documentation

Fermipy Tutorials from Fermi Summer School

Line: 130 to 137
 

Comments

Added:
>
>
META FILEATTACHMENT attachment="Introduction_to_Fermipy_Analysis_at_Nevis.pdf" attr="" comment="" date="1591037562" name="Introduction_to_Fermipy_Analysis_at_Nevis.pdf" path="Introduction_to_Fermipy_Analysis_at_Nevis.pdf" size="2748978" user="AriBrill" version="1"

Revision 9 2020-06-01 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 15 to 15
  In addition to initializing the conda environment for the fermi code for your account, this will provide you with several definitions and commands (you can see them by opening with a text editor the file .myprofile in your home directory).
Changed:
<
<
FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis: directory where your analysis will occur by default
>
>
FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
directory where your analysis will occur by default
 
Changed:
<
<
fermisetup: set up your environment before running analyses
>
>
fermisetup
set up your environment before running analyses
 
Changed:
<
<
fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &": run an analysis
>
>
fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"
run an analysis
  NOTE: If your home directory is on Milne or another Nevis machine other than tehanu, see the special instructions "Customizing the Installation" below. If you're unsure where your home directory is, type echo $HOME.
Line: 57 to 57
  The pipeline around Fermipy is designed in a similar way. In your analysis directory, you should have a file called pipeline_config.yml, which you can use as a template for your pipeline configuration file.
Changed:
<
<
fermipy_config: the name of your Fermipy config file
>
>
prefix
the name to be used for the output directory (within the fermi_analysis directory) and as a prefix for your output files; shouldn't contain spaces

fermipy_config
the name of your Fermipy config file; if null, defaults to <prefix>_config.yml

multithread
use multiple processes when calculating TS maps, residual maps and light curves (true/false)

nthread
number of processes to use when multithread is true

calculate_sed
whether to calculate the SED (true/false)

The following parameters pertain to the light curve analysis only:

num_sections
number of sections into which to split the light curve; if null, do not split

section
the index of the section to analyze if num_sections is not null
 
Deleted:
<
<
prefix: the name to be used for the output directory (within the fermi_analysis directory) and as a prefix for your output files; shouldn't contain spaces
 

Running Your Analysis

Line: 73 to 86
  The fermianalysis command runs the analysis in the background, using nohup. This means that it will continue to run even if you close your terminal, and that instead of printing the output to the screen, it saves it to a file called nohup.out.
Added:
>
>
To generate the light curve, first run the base analysis. Once that's complete, rerun the analysis with the --lightcurve option. The pipeline can divide the light curve analysis into sections, so that an error encountered in one bin crashes only a portion of the analysis rather than the whole thing. If you use this functionality, the section may be specified either in the pipeline configuration file or on the command line with the --section option, which overrides the value set in the configuration file. After the analysis has run for all sections, the output files must be combined.

Run the analysis for the first section of the light curve:

python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml --lightcurve --section 0

After running the analyses for all light curve sections, combine the output files:

python $FERMIPIPE/combine_lightcurve_results.py $FERMI_ANALYSIS_DIR/pipeline_config.yml

 

Troubleshooting

Fermipy says "WARNING: version mismatch between CFITSIO header (v3.43) and linked library (v3.41)."
This should not cause any issues with the analysis. For more details see here.

Revision 8 2020-05-29 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 31 to 31
  Customizing the Installation
Changed:
<
<
If you'd like to perform your analysis somewhere besides the tehanu data partition, change the FERMI_ANALYSIS_DIR to your preferred directory in your .myprofile file. For example, to perform the analysis in your home directory (recommended if your account is on milne), write:
>
>
If you'd like to perform your analysis somewhere besides the tehanu data partition, change FERMI_ANALYSIS_DIR in your .myprofile file to your preferred directory. For example, to perform the analysis in your home directory (recommended if your account is on milne), the line should read:
 
export FERMI_ANALYSIS_DIR=$HOME/fermi_analysis
Added:
>
>
NOTE: As of Fermipy v0.19, installation via conda is currently broken. See Fermipy issue #337.
 If you need to recreate the conda environment from scratch, use the setup_pipeline script to download and install fermitools and fermipy into an environment named "fermi". By default, the software will be installed to /a/data/tehanu/$USER/miniconda3, where $USER is your username. Run the following command anywhere:
$FERMIPIPE/setup_pipeline.sh
Line: 71 to 73
  The fermianalysis command runs the analysis in the background, using nohup. This means that it will continue to run even if you close your terminal, and that instead of printing the output to the screen, it saves it to a file called nohup.out.
Added:
>
>

Troubleshooting

Fermipy says "WARNING: version mismatch between CFITSIO header (v3.43) and linked library (v3.41)."
This should not cause any issues with the analysis. For more details see here.

My analysis finished without reporting any errors, but I don't see any plots.
This sometimes occurs. Simply rerun the analysis. It will pick up from where it terminated before and produce the plots.
 

Useful links

Fermipy documentation

Revision 7 2020-04-29 - DeividRibeiro

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 9 to 9
 The following instructions will set up an installation of the Fermi Science Tools (fermitools) and Fermipy using conda. Conda is a package manager that will automatically manage all of the dependencies needed to use the software and encapsulate them into an environment, which must be activated before use.

First, set up your environment for using the Fermipy pipeline by running the following commands:

Deleted:
<
<
 
cat /a/home/tehanu/brill/fermipipe/bash_setup.txt >> ~/.myprofile
Changed:
<
<
source ~/.bashrc
>
>
source ~/.bashrc
  In addition to initializing the conda environment for the fermi code for your account, this will provide you with several definitions and commands (you can see them by opening with a text editor the file .myprofile in your home directory).
Line: 30 to 27
 
mkdir $FERMI_ANALYSIS_DIR
cp $FERMIPIPE/config.yml $FERMI_ANALYSIS_DIR
Changed:
<
<
cp $FERMIPIPE/pipeline_config.yml $FERMI_ANALYSIS_DIR
>
>
cp $FERMIPIPE/pipeline_config.yml $FERMI_ANALYSIS_DIR
  Customizing the Installation
Line: 36 to 32
 Customizing the Installation

If you'd like to perform your analysis somewhere besides the tehanu data partition, change the FERMI_ANALYSIS_DIR to your preferred directory in your .myprofile file. For example, to perform the analysis in your home directory (recommended if your account is on milne), write:

Deleted:
<
<
 
Changed:
<
<
export FERMI_ANALYSIS_DIR=$HOME/fermi_analysis
>
>
export FERMI_ANALYSIS_DIR=$HOME/fermi_analysis
  If you need to recreate the conda environment from scratch, use the setup_pipeline script to download and install fermitools and fermipy into an environment named "fermi". By default, the software will be installed to /a/data/tehanu/$USER/miniconda3, where $USER is your username. Run the following command anywhere:
Deleted:
<
<
 
Changed:
<
<
$FERMIPIPE/setup_pipeline.sh
>
>
$FERMIPIPE/setup_pipeline.sh
  The script should take several minutes to run. Ignore any message saying to run source ~/.bashrc again, the script does that for you automatically.
Line: 50 to 42
 The script should take several minutes to run. Ignore any message saying to run source ~/.bashrc again, the script does that for you automatically.

If you'd like to change the installation path of miniconda, run the setup script using the -p flag to change the path, as follows:

Deleted:
<
<
 
Changed:
<
<
$FERMIPIPE/setup_pipeline.sh -p
>
>
$FERMIPIPE/setup_pipeline.sh -p
  For example, to install to your home directory (recommended for milne), use -p $HOME/miniconda3. You can also use this option to create the fermi environment in an existing conda installation, if you have one, provided it doesn't already have an environment named fermi.
Line: 72 to 62
 

Running Your Analysis

First, set your environment and activate the conda environment:

Deleted:
<
<
 
Changed:
<
<
fermisetup
>
>
fermisetup
  Now you can run an analysis using your config files:
Deleted:
<
<
 
Changed:
<
<
fermianalysis
>
>
fermianalysis
  The fermianalysis command runs the analysis in the background, using nohup. This means that it will continue to run even if you close your terminal, and that instead of printing the output to the screen, it saves it to a file called nohup.out.
Line: 95 to 81
  Managing conda environments
Added:
>
>
FermiLAT Summer school presentations:
  1. 2012 complete agenda
    1. Useful slide with good tips on getting analysis to converge - slide 28 is best
  2. 2013 complete agenda
  3. 2014 complete agenda
  4. 2015 complete agenda
  5. 2016 complete agenda
  6. 2017 complete agenda
  7. 2018 complete agenda
    1. This beautiful python notebook explores a full analysis with FermiPY, which is very similar to what run_analysis.py is doing.
  8. 2019 complete agenda
 -- Ari Brill - 2020-02-17

Comments

Revision 6 2020-02-27 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 6 to 6
 

Installation and Setup

Changed:
<
<
1. Set up your environment for using the Fermipy pipeline as follows. Open the file .myprofile in your home directory in a text editor and add the following lines:
>
>
The following instructions will set up an installation of the Fermi Science Tools (fermitools) and Fermipy using conda. Conda is a package manager that will automatically manage all of the dependencies needed to use the software and encapsulate them into an environment, which must be activated before use.

First, set up your environment for using the Fermipy pipeline by running the following commands:

 
Changed:
<
<
export FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
export FERMIPIPE=/a/home/tehanu/brill/fermipipe
alias fermisetup="source $FERMIPIPE/setup_fermi.sh"
alias fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"
>
>
cat /a/home/tehanu/brill/fermipipe/bash_setup.txt >> ~/.myprofile
source ~/.bashrc
 
Changed:
<
<
For these to take effect, either run source ~/.bashrc, or log out and log back in again.
>
>
In addition to initializing the conda environment for the fermi code for your account, this will provide you with several definitions and commands (you can see them by opening with a text editor the file .myprofile in your home directory).
 
Changed:
<
<
NOTE: If your home directory is on Milne or another Nevis machine other than tehanu, see the special instructions "Customizing the Installation" below. If you're unsure where your home directory is, type echo $HOME.
>
>
FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis: directory where your analysis will occur by default
 
Changed:
<
<
2. Install the Fermi Science Tools (fermitools) and Fermipy using conda. Conda is a package manager that will automatically manage all of the dependencies needed to use the software and encapsulate them into an environment, which must be activated before use. The setup_pipeline script will download and install fermitools and fermipy into an environment named "fermi". By default, the software will be installed to /a/data/tehanu/$USER/miniconda3, where $USER is your username. Run the following command anywhere:
>
>
fermisetup: set up your environment before running analyses
 
Changed:
<
<
$FERMIPIPE/setup_pipeline.sh
>
>
fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &": run an analysis
 
Changed:
<
<
The script should take several minutes to run. Ignore any message saying to run source ~/.bashrc again, the script does that for you automatically.
>
>
NOTE: If your home directory is on Milne or another Nevis machine other than tehanu, see the special instructions "Customizing the Installation" below. If you're unsure where your home directory is, type echo $HOME.
 
Changed:
<
<
3. Create a directory for running your analysis, and copy the configuration file templates there:
>
>
Next, create a directory for running your analysis, and copy the configuration file templates there:
 
mkdir $FERMI_ANALYSIS_DIR
Line: 37 to 35
  Customizing the Installation
Changed:
<
<
If you'd like to perform your analysis somewhere besides the tehanu data partition, in step 1, change the FERMI_ANALYSIS_DIR to your preferred directory. For example, to perform the analysis in your home directory (recommended if your account is on milne), write:
>
>
If you'd like to perform your analysis somewhere besides the tehanu data partition, change the FERMI_ANALYSIS_DIR to your preferred directory in your .myprofile file. For example, to perform the analysis in your home directory (recommended if your account is on milne), write:
 
export FERMI_ANALYSIS_DIR=$HOME/fermi_analysis
Changed:
<
<
If you'd like to change the installation path of miniconda, in step 2, run the setup script using the -p flag to change the path, as follows:
>
>
If you need to recreate the conda environment from scratch, use the setup_pipeline script to download and install fermitools and fermipy into an environment named "fermi". By default, the software will be installed to /a/data/tehanu/$USER/miniconda3, where $USER is your username. Run the following command anywhere:

$FERMIPIPE/setup_pipeline.sh

The script should take several minutes to run. Ignore any message saying to run source ~/.bashrc again, the script does that for you automatically.

If you'd like to change the installation path of miniconda, run the setup script using the -p flag to change the path, as follows:

 
$FERMIPIPE/setup_pipeline.sh -p <CONDA_PATH>
Line: 65 to 71
 

Running Your Analysis

Changed:
<
<
Command to set up for running the analysis, including activating the conda environment:
>
>
First, set your environment and activate the conda environment:
 
fermisetup
Changed:
<
<
Command to run the analysis using your config files:
>
>
Now you can run an analysis using your config files:
 
fermianalysis

Revision 5 2020-02-19 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 10 to 10
 
export FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
Changed:
<
<
export FERMIPIPE=/a/home/tehanu/brill/fermi_pipeline
>
>
export FERMIPIPE=/a/home/tehanu/brill/fermipipe
alias fermisetup="source $FERMIPIPE/setup_fermi.sh"
alias fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"

Revision 4 2020-02-19 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 27 to 27
  The script should take several minutes to run. Ignore any message saying to run source ~/.bashrc again, the script does that for you automatically.
Changed:
<
<
3. Create a directory for running your analysis, and copy the configuration templates there:
>
>
3. Create a directory for running your analysis, and copy the configuration file templates there:
 
mkdir $FERMI_ANALYSIS_DIR
Line: 51 to 51
  For example, to install to your home directory (recommended for milne), use -p $HOME/miniconda3. You can also use this option to create the fermi environment in an existing conda installation, if you have one, provided it doesn't already have an environment named fermi.
Changed:
<
<

Configuring an Analysis

>
>

Configuring Your Analysis

 
Changed:
<
<
Text
>
>
When running an analysis with Fermipy, all parameters are set using a configuration file. Different sections contain the parameters for the data selection, likelihood fit, source model, and so on. There are many configuration parameters you can adjust when running an analysis. In your analysis directory, you should have a file called config.yml, which you can use as a template for your Fermipy configuration file. The settings in it are meant for a standard setup, which you can adjust as needed. The Fermipy documentation lists all possible configuration settings.
 
Changed:
<
<

Running an Analysis

>
>
To perform an analysis, you will have to set a minimum of three parameters: the start time, the stop time, and the name of the target to analyze (if your target is not a known Fermi source, you can define the target using coordinates instead). The analysis start and stop times must be provided in Fermi mission elapsed time (MET). NASA provides a handy time converter that you can use to convert between a number of calendar/time formats and MET.
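In the Fermipy configuration these correspond to the tmin, tmax, and target entries of the selection section. MET is counted from 2001-01-01 00:00:00 UTC, so you can also estimate it yourself; the sketch below (using astropy, with placeholder dates) is accurate to within leap-second bookkeeping, and the NASA converter should be used when the exact value matters.

from astropy.time import Time

# Fermi mission elapsed time (MET) is counted from 2001-01-01 00:00:00 UTC.
MET_EPOCH = Time('2001-01-01T00:00:00', scale='utc')

def utc_to_met(utc_string):
    """Approximate Fermi MET (seconds) for a UTC date/time string."""
    return (Time(utc_string, scale='utc') - MET_EPOCH).sec

print(utc_to_met('2020-01-01T00:00:00'))  # e.g. tmin for the selection
print(utc_to_met('2020-07-01T00:00:00'))  # e.g. tmax for the selection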
 
Changed:
<
<
Set up for running the Fermi analysis, including activating the conda environment:
>
>
The pipeline around Fermipy is designed in a similar way. In your analysis directory, you should have a file called pipeline_config.yml, which you can use as a template for your pipeline configuration file.

fermipy_config: the name of your Fermipy config file

prefix: the name to be used for the output directory (within the fermi_analysis directory) and as a prefix for your output files; shouldn't contain spaces

Running Your Analysis

Command to set up for running the analysis, including activating the conda environment:

 
fermisetup
Changed:
<
<
Run the analysis using your config files:
>
>
Command to run the analysis using your config files:
 
fermianalysis
Changed:
<
<
The fermianalysis command runs the analysis in the background, using nohup. This means that it will continue to run even if you close your terminal, and that instead of printing the output to the screen, saves it to a file called nohup.out.
>
>
The fermianalysis command runs the analysis in the background, using nohup. This means that it will continue to run even if you close your terminal, and that instead of printing the output to the screen, it saves it to a file called nohup.out.

Useful links

 
Changed:
<
<

Links

>
>
Fermipy documentation
  Fermipy Tutorials from Fermi Summer School

Revision 3 2020-02-19 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 25 to 25
 $FERMIPIPE/setup_pipeline.sh
Changed:
<
<
The script should take several minutes to run.
>
>
The script should take several minutes to run. Ignore any message saying to run source ~/.bashrc again, the script does that for you automatically.
  3. Create a directory for running your analysis, and copy the configuration templates there:
Line: 57 to 57
 

Running an Analysis

Changed:
<
<
Text
>
>
Set up for running the Fermi analysis, including activating the conda environment:

fermisetup

Run the analysis using your config files:

fermianalysis

The fermianalysis command runs the analysis in the background, using nohup. This means that it will continue to run even if you close your terminal, and that instead of printing the output to the screen, saves it to a file called nohup.out.

 

Links

Changed:
<
<
Fermi Summer School FermiPy Tutorials
>
>
Fermipy Tutorials from Fermi Summer School

NASA mission time converter

Managing conda environments

  -- Ari Brill - 2020-02-17

Revision 2 2020-02-18 - AriBrill

Line: 1 to 1
 
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy

Line: 9 to 9
 1. Set up your environment for using the Fermipy pipeline as follows. Open the file .myprofile in your home directory in a text editor and add the following lines:
Deleted:
<
<
export FERMIPIPE=/a/home/tehanu/brill/fermi_pipeline
 export FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
Added:
>
>
export FERMIPIPE=/a/home/tehanu/brill/fermi_pipeline
alias fermisetup="source $FERMIPIPE/setup_fermi.sh"
alias fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"

For these to take effect, either run source ~/.bashrc, or log out and log back in again.

Changed:
<
<
2. Install the Fermi Science Tools (fermitools) and Fermipy using conda. Conda is a package manager that will automatically manage all of the dependencies needed to use the software and encapsulate them into an environment, which must be activated before use. The following command will download and install fermitools and fermipy into an environment named "fermi", located in /a/data/tehanu/$USER/miniconda3, where $USER is your username:
>
>
NOTE: If your home directory is on Milne or another Nevis machine other than tehanu, see the special instructions "Customizing the Installation" below. If you're unsure where your home directory is, type echo $HOME.

2. Install the Fermi Science Tools (fermitools) and Fermipy using conda. Conda is a package manager that will automatically manage all of the dependencies needed to use the software and encapsulate them into an environment, which must be activated before use. The setup_pipeline script will download and install fermitools and fermipy into an environment named "fermi". By default, the software will be installed to /a/data/tehanu/$USER/miniconda3, where $USER is your username. Run the following command anywhere:

 
$FERMIPIPE/setup_pipeline.sh
Added:
>
>
The script should take several minutes to run.
 3. Create a directory for running your analysis, and copy the configuration templates there:
Line: 31 to 35
 cp $FERMIPIPE/pipeline_config.yml $FERMI_ANALYSIS_DIR
Added:
>
>
Customizing the Installation

If you'd like to perform your analysis somewhere besides the tehanu data partition, in step 1, change the FERMI_ANALYSIS_DIR to your preferred directory. For example, to perform the analysis in your home directory (recommended if your account is on milne), write:

export FERMI_ANALYSIS_DIR=$HOME/fermi_analysis

If you'd like to change the installation path of miniconda, in step 2, run the setup script using the -p flag to change the path, as follows:

$FERMIPIPE/setup_pipeline.sh -p <CONDA_PATH>

For example, to install to your home directory (recommended for milne), use -p $HOME/miniconda3. You can also use this option to create the fermi environment in an existing conda installation, if you have one, provided it doesn't already have an environment named fermi.

 

Configuring an Analysis

Text

Revision 1 2020-02-17 - AriBrill

Line: 1 to 1
Added:
>
>
META TOPICPARENT name="FermiAnalysis"

Fermi Analysis with FermiPy


Installation and Setup

1. Set up your environment for using the Fermipy pipeline as follows. Open the file .myprofile in your home directory in a text editor and add the following lines:

export FERMIPIPE=/a/home/tehanu/brill/fermi_pipeline
export FERMI_ANALYSIS_DIR=/a/data/tehanu/$USER/fermi_analysis
alias fermisetup="source $FERMIPIPE/setup_fermi.sh"
alias fermianalysis="nohup python $FERMIPIPE/run_analysis.py $FERMI_ANALYSIS_DIR/pipeline_config.yml &"

For these to take effect, either run source ~/.bashrc, or log out and log back in again.

2. Install the Fermi Science Tools (fermitools) and Fermipy using conda. Conda is a package manager that will automatically manage all of the dependencies needed to use the software and encapsulate them into an environment, which must be activated before use. The following command will download and install fermitools and fermipy into an environment named "fermi", located in /a/data/tehanu/$USER/miniconda3, where $USER is your username:

$FERMIPIPE/setup_pipeline.sh

3. Create a directory for running your analysis, and copy the configuration templates there:

mkdir $FERMI_ANALYSIS_DIR
cp $FERMIPIPE/config.yml $FERMI_ANALYSIS_DIR
cp $FERMIPIPE/pipeline_config.yml $FERMI_ANALYSIS_DIR

Configuring an Analysis

Text

Running an Analysis

Text

Links

Fermi Summer School FermiPy Tutorials

-- Ari Brill - 2020-02-17

Comments

 