Research Journal: Adam Oppenheimer

Thursday, July 23rd

  • Ran the Ton 599 Base Analysis again with a higher TS requirement (changed it to 64, but it will likely have to be rerun going up to 100)
  • Implemented a way to restrict the BBs used to a given flare
  • Debugged the existing flare plotting algorithms
    • The detection method used sometimes mis-maps, latching onto a small data perturbation as opposed to a noticeable flare
  • Worked on implementing a new flare profile as a piecewise exponential function, as opposed to the sum of reciprocal exponentials currently being used, to make it both easier to work with and possibly more accurate (a sketch of both forms is below)
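A minimal sketch of the two candidate shapes, assuming the current profile is the double-exponential (reciprocal-sum) form with the F0, t0, Tr, Td parameters listed in the July 17 entry; the exact piecewise parameterization here is an illustrative assumption:

    import numpy as np

    def double_exp_profile(t, F0, t0, Tr, Td):
        # Assumed current form: F(t) = 2*F0 / (exp((t0 - t)/Tr) + exp((t - t0)/Td)),
        # with rise time Tr and decay time Td.
        return 2.0 * F0 / (np.exp((t0 - t) / Tr) + np.exp((t - t0) / Td))

    def piecewise_exp_profile(t, F0, t0, Tr, Td):
        # Candidate piecewise form: pure exponential rise up to t0 and pure
        # exponential decay afterwards; here t0 is exactly the peak and F0 the peak flux.
        t = np.asarray(t, dtype=float)
        return np.where(t < t0,
                        F0 * np.exp(-(t0 - t) / Tr),
                        F0 * np.exp(-(t - t0) / Td))

    # Toy comparison on an arbitrary time axis (days):
    t = np.linspace(-10.0, 20.0, 500)
    f_current = double_exp_profile(t, F0=1.0, t0=0.0, Tr=2.0, Td=5.0)
    f_piecewise = piecewise_exp_profile(t, F0=1.0, t0=0.0, Tr=2.0, Td=5.0)

One practical difference: in the piecewise form t0 is exactly the peak time, whereas in the double-exponential form the peak is offset from t0 by an amount set by the asymmetry (see the July 20 entry).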

Wednesday, July 22nd

  • Debugged the initial Base Analysis for Ton 599; the overall fit for the source was poor.
    • The mean displayed in the significance histogram was around 1.20
  • The Ton 599 base analysis was restarted with a higher TS requirement (25 -> 36) and a reduced distance (5.0 -> 4.0)
  • The flare detection algorithm was edited/debugged.
  • The capability to determine and plot the exponential profile for a given flare was worked on
    • There is currently an overflow error: the numbers involved exceed the available floating-point range (they are too large by roughly 10-15 orders of magnitude; a possible workaround is sketched below)
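A possible workaround, sketched under the assumption that the overflow comes from evaluating exp() with times in Fermi MET seconds (so that (t - t0)/Tr reaches hundreds or more): recenter and rescale the time axis before fitting, and clip the exponent as a safety net. Variable names are illustrative.

    import numpy as np

    MAX_EXP = 700.0  # np.exp overflows float64 just above exp(709)

    def safe_exp(x):
        # Clip the exponent so a bad parameter guess cannot overflow float64.
        return np.exp(np.clip(x, -MAX_EXP, MAX_EXP))

    def double_exp_profile(t_days, F0, t0, Tr, Td):
        # Assumed profile form; t_days, t0, Tr, Td all in days, not MET seconds.
        return 2.0 * F0 / (safe_exp((t0 - t_days) / Tr) + safe_exp((t_days - t0) / Td))

    # Recenter/rescale: convert MET seconds to days relative to the start of the
    # light-curve segment so the fitted t0, Tr, Td are O(1)-O(100) instead of O(1e8).
    # t_met = ...  (MET seconds from the light curve)
    # t_days = (t_met - t_met.min()) / 86400.0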

Tuesday, July 21st

  • Fixed some bugs within the flare detection methods and added the option of strictly decreasing flare detection
  • Continued an investigation into the connections between the exponential profile model used and the specific parameters
    • A Desmos graph was produced, which can be shared with anyone on demand
  • Set up the Fermi analysis pipeline on the Linux cluster (see the setup sketch after this list)
  • Started the base analysis for Ton 599 to prepare for flare detection and analysis
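For reference, a minimal sketch of how a fermipy base analysis can be driven from Python; the configuration file name, the freed-source distance, and the output name are illustrative assumptions rather than the exact settings used:

    from fermipy.gtanalysis import GTAnalysis

    # 'config.yaml' is a placeholder for the Ton 599 configuration file
    # (data selection, binning, ROI model, etc.).
    gta = GTAnalysis('config.yaml', logging={'verbosity': 3})
    gta.setup()                      # prepare the data products for the ROI
    gta.optimize()                   # rough fit of all model components
    gta.free_sources(distance=4.0)   # illustrative: free nearby sources before the fit
    gta.fit()                        # full likelihood fit
    gta.write_roi('ton599_base')     # save results for later flare analysis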

Monday, July 20th

  • Wrote up code to take the Bayesian blocks and the baseline Quiescent Flux to determine the time intervals over which there are flares.
    • This is determined by finding the peak of a given flare and then continuing down the flare up until a certain baseline is reached, at which point the flare is terminated
    • The initial peak must be higher than 3 times the QF level (this factor can be modified), and the flare is terminated when the BB flux drops below 1 SD above the QF level (the SD is calculated from the QF data range returned by the find-QF method). The number of SDs required to end the flare is also modifiable
    • The method might clump adjacent peaks together, so the current code will have to be modified to handle this (see the sketch after this list)
  • Began looking into the actual meanings of the parameters associated with the exponential profile function.
    • Determined the conversion from the t0 value to the actual peak time of the flare (the offset between t0 and the peak is set by the asymmetry of the flare itself)
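A sketch of the interval-finding walk described above, together with the t0-to-peak conversion, assuming the double-exponential profile with the F0, t0, Tr, Td parameters listed in the July 17 entry; the array layout (one flux value per Bayesian block) and the helper names are illustrative assumptions:

    import numpy as np

    def find_flare_intervals(bb_flux, qf, qf_sigma, peak_factor=3.0, end_sigma=1.0):
        """Given the flux of each Bayesian block (bb_flux), the quiescent flux level
        (qf) and its standard deviation (qf_sigma), return (start, end) block-index
        pairs for each flare.  A flare is seeded by a block whose flux exceeds
        peak_factor*qf and is walked outward in both directions until the block
        flux drops below qf + end_sigma*qf_sigma."""
        threshold_peak = peak_factor * qf
        threshold_end = qf + end_sigma * qf_sigma
        flares, used = [], np.zeros(len(bb_flux), dtype=bool)
        for i, f in enumerate(bb_flux):
            if used[i] or f < threshold_peak:
                continue
            lo = i
            while lo > 0 and bb_flux[lo - 1] > threshold_end:
                lo -= 1
            hi = i
            while hi < len(bb_flux) - 1 and bb_flux[hi + 1] > threshold_end:
                hi += 1
            used[lo:hi + 1] = True
            flares.append((lo, hi))
        return flares

    def peak_time(t0, Tr, Td):
        # For F(t) = 2*F0 / (exp((t0 - t)/Tr) + exp((t - t0)/Td)) the maximum is at
        # t_peak = t0 + (Tr*Td / (Tr + Td)) * ln(Td / Tr), so t_peak = t0 only for
        # a symmetric flare (Tr == Td).
        return t0 + (Tr * Td / (Tr + Td)) * np.log(Td / Tr)

Because the walk keeps extending while neighbouring blocks stay above the end threshold, two nearby peaks that never dip below QF + 1 SD between them are clumped into a single interval, which is the caveat noted above.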

Friday, July 17th

  • Implemented code that optimizes an exponential profile against given time series data to model a flare
    • The exponential profile used has 4 degrees of freedom: F0, t0, Tr, and Td. These are determined through the scipy.optimize.curve_fit function, which yields the best-fit coefficients (see the fitting sketch after this list)
    • It was tested on some simulated flares and on an actual flare zeroed in on from 3C 279
  • Wrote code that determines the quiescent flux level for a given time signal
    • This code is a modified version of the pseudocode in Meyer et al. (2019) for determining the quiescent flux level
    • There are two major alterations to the code. The first is that the array of test QF values, which is supposed to run from the minimum to the mean of the data, was truncated at a certain point to ensure there are enough data points in the time signal to form an even distribution. This was done by hand today but will be automated in the future
    • The second alteration is that, instead of using the ratio of posterior probabilities from Wolpert (1996) as done in the paper, a short-term substitute was used: measuring how centered the peak of the CDF is for these curves. This was done because the Wolpert (1996) method could not be extended to the signals used here, since not all of the parameters it requires are available. The likely next step, to be investigated further, is an implementation of chi-squared or the full Wolpert method
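A minimal sketch of the profile fit using scipy.optimize.curve_fit, assuming the double-exponential form with the four parameters named above; the starting-guess heuristics are illustrative assumptions:

    import numpy as np
    from scipy.optimize import curve_fit

    def flare_profile(t, F0, t0, Tr, Td):
        # Assumed profile: F(t) = 2*F0 / (exp((t0 - t)/Tr) + exp((t - t0)/Td)).
        return 2.0 * F0 / (np.exp((t0 - t) / Tr) + np.exp((t - t0) / Td))

    def fit_flare(t, flux, flux_err=None):
        """Fit (F0, t0, Tr, Td) to a single flare's light-curve segment."""
        i_max = np.argmax(flux)
        p0 = [flux[i_max], t[i_max], 1.0, 1.0]        # rough starting guesses
        popt, pcov = curve_fit(flare_profile, t, flux, p0=p0,
                               sigma=flux_err, absolute_sigma=flux_err is not None)
        return popt, np.sqrt(np.diag(pcov))           # best-fit values and 1-sigma errors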

Thursday, July 16th

  • Modified, expanded, and finalized the txt file that was produced for the flare data of Ton 599
    • Added notes at the top including the many conversions required for the data, the original form of the data, and the wavelength bands that the data was collected in.
  • Began to look into the scipy.optimize module in preparation for writing a method to fit exponential profiles to a flare

Wednesday, July 15th

  • Created a txt file for all of the information collected about the Dec 2017 Flare of Ton 599
    • Includes observatory, observed time (if applicable), frequency, energy, and spectral density
  • Exported this txt file and sent it around, in addition to completing SEDs for Ton 599 containing both archival and flared data

Tuesday, July 14th

  • Converted the Jansky values and magnitude values to spectral energy density (see the conversion sketch after this list)
  • Took the SED values in the Fermi range from Figure 6 in the paper and plotted them on the SED for Ton 599
    • This finished up the SED for Ton 599 during the flare for now (the flared points and archival data are both plotted on the same graph)
  • Set the prior p0 for the Bayesian Blocks model equal to a standard p0 representing 2 sigma (p0=.05)
  • Used the Bayesian Blocks method on Gwen's monthly light curve for 1ES 1218+304
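A sketch of the flux-density conversions, using the standard 1 Jy = 1e-23 erg cm^-2 s^-1 Hz^-1 and the AB zero point of 3631 Jy; if the magnitudes are on the Vega system (as UVOT magnitudes often are), a band-specific zero point would be needed instead:

    import numpy as np

    JY_TO_CGS = 1.0e-23          # 1 Jy = 1e-23 erg cm^-2 s^-1 Hz^-1

    def jansky_to_nufnu(flux_jy, nu_hz):
        # nuFnu in erg cm^-2 s^-1 from a flux density in Jy at frequency nu.
        return nu_hz * flux_jy * JY_TO_CGS

    def ab_mag_to_nufnu(m_ab, nu_hz):
        # AB magnitude -> flux density in Jy (F_nu = 3631 Jy * 10^(-0.4 m_AB)),
        # then to nuFnu; Vega magnitudes would need a band-dependent zero point.
        flux_jy = 3631.0 * 10.0 ** (-0.4 * m_ab)
        return jansky_to_nufnu(flux_jy, nu_hz)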

Monday, July 13th

  • Collected energy + spectral density information from Prince (2018) and uploaded it into a Jupyter notebook
    • Was able to plot about 3/4 of the data points (will work tomorrow on fixing the last few points, which have conversion issues)

Friday, July 10th

  • Used the 1ES 1215+303 light curve data from the Cutoff folder (both 30 and 3 day bins), plotted it, and used Bayesian Blocks on that data
  • Found the UVOT data associated with the flare for Ton 599 in Dec 2017 and downloaded all of the data in a folder filled with .img files.
  • Fixed the code associated with the first source of Janet's light curve data, and plotted Ari and Reshmi's data on the same plot to compare similarities and differences
    • The plots are nearly identical.
  • Wrote up a draft for the Rationale for the research plan

Thursday, July 9th

  • Took Gwen's 3 Day binned light curve data for 1ES 1215+303 and used the Bayesian Blocks method on it
  • Took Janet's light curve data for the source 3C 279 and used the Bayesian Blocks method on it
  • Created slides for the VERITAS Collaboration for next week about the Bayesian Blocks method and using it on the sources 3C 279 and 1ES 1215+303

Wednesday, July 8th

  • Finished reading the paper on the multiwavelength analysis of Ton 599
  • Continued searching for the data from the flare that occurred in Dec. 2017

Tuesday, July 7th

  • Used the Ton 599 data that was imported into a Jupyter notebook as a txt file and converted it into a spectral plot of nuFnu against energy in GeV
  • Tried to find data for the flare that occurred in Dec. 2017, but was unable to find it through the paper linked below or through the ASDC catalog itself
  • Read Parts 1 and 2 of the journal paper Multi-frequency Variability Study of Ton 599 during the High Activity of 2017

Monday, July 6th

  • Learned about and read into the ASDC SED Builder and how to work it
  • Created an SED for the source Ton599 over the entire timeframe of collected data and exported it as an ASCII file
  • Transferred this ASCII file to the Tehanu Server and opened it in a Jupyter Notebook
  • Tried to create an SED for Ton 599 with a limited time frame around Dec. 2017 (the time of a significant flare), but there was no data present in that time frame

Friday, July 3rd

  • Restricted the data entries from 1ES1215+303 based on their Test Statistic values to increase the significance of the plot
    • Only entries with TS >= 16 were kept (see the filtering sketch after this list)
  • The Bayesian Blocks model was rerun on this restricted data, graphing flux against time
  • The flux histograms were redone (both normal and log versions) using the restricted dataset
  • A plot was made of dN/dE against pivot energy
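A sketch of the TS cut, assuming the light curve columns are loaded as separate arrays (the file name and column layout are illustrative); since the detection significance is roughly sqrt(TS), TS >= 16 corresponds to about a 4-sigma cut:

    import numpy as np

    # Illustrative load; in practice the arrays come from the exported light-curve file.
    # time, flux, flux_err, ts = np.loadtxt('lightcurve.txt', unpack=True)

    def apply_ts_cut(time, flux, flux_err, ts, ts_min=16.0):
        # Keep only bins detected at roughly sqrt(ts_min) sigma or better.
        keep = ts >= ts_min
        return time[keep], flux[keep], flux_err[keep], ts[keep]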

Thursday, July 2nd

  • Experimented with the p0 value on the indicated graph to determine an adequate value for the Bayesian Blocks
  • Created a few histograms of the flux values, both with the actual flux values and a logarithmic version
    • At first glance, the data does not appear to be normally or lognormally distributed
  • Simplified the Bayesian Blocks presentation

Wednesday, July 1st

  • Took Gwen's light curve data on the source 1ES 1215+303 and imported it into a separate jupyter notebook
    • Debugged the code/data to ensure that it would run and graph correctly
  • Modified Ari's code to create Bayesian Blocks in order to use it in conjunction with Gwen's data
    • The end product was a Bayesian Blocks model of Gwen's light curve (a minimal sketch of the call is below)
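For reference, a minimal sketch of the Bayesian Blocks call with astropy on a binned light curve with flux errors ('measures' fitness); the toy data stand in for the real light-curve file, and p0 = 0.05 is the ~2-sigma prior noted in the July 14 entry:

    import numpy as np
    from astropy.stats import bayesian_blocks

    # Toy binned light curve standing in for the real data (t, flux, flux_err
    # would normally be read from the exported light-curve file).
    rng = np.random.default_rng(0)
    t = np.arange(100.0)
    flux_err = np.full_like(t, 0.3)
    flux = 1.0 + 4.0 * np.exp(-0.5 * ((t - 50.0) / 5.0) ** 2) + rng.normal(0.0, flux_err)

    edges = bayesian_blocks(t, flux, flux_err, fitness='measures', p0=0.05)

    # Flux of each block = error-weighted mean of the points falling inside it.
    idx = np.digitize(t, edges[1:-1])
    block_flux = np.array([np.average(flux[idx == k], weights=1.0 / flux_err[idx == k] ** 2)
                           for k in range(len(edges) - 1)])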

Tuesday, June 30th

  • Had a group meeting with Ari, Gwen, Reshmi, and Qi
    • Gave the presentation on Bayesian Blocks
    • Discussed the future path for research
      • Do Bayesian Block analysis of Gwen's source/numbers
      • Eventually might do Bayesian Block analysis of other sources or Fermi analysis on some sources
  • Figured out how to open up Jupyter Notebook Files through the tehanu cluster
  • Copied Gwen's source files + code into my analysis folder
  • Began to play around with Bayesian Blocks in jupyter notebook

Monday, June 29th

  • Took a tutorial in TWiki to understand how to use and create topics/pages
  • Set up the research journal on TWiki
  • Created a PowerPoint summarizing the idea of Bayesian Blocks; the findings come from Scargle et al. 1998
    • Named Week 0 Update.pptx
  • Downloaded a Linux-style terminal environment for Windows (MobaXterm)
  • Looked over ways to manipulate/move directories in Unix
  • Read the Intro to FermiPy presentation by Ari
-- Adam Oppenheimer - 2020-06-29
