Research Journal: Adam Oppenheimer

Monday, August 3rd

  • Changed the Chi-Squared function so that it would be more reflective of the actual data
    • The old values were far too low because the code divided by the expected value instead of the expected value squared
  • Finished the first draft for my presentation for Wednesday
  • Modified the code so that all of the plots are in MJD instead of the original MET
  • Used the equation in the Meyer et al. paper to determine an upper limit on the size of the emitting region (see the sketch after this list)
    • The Doppler factor was taken from the Ton 599 paper (Prince 2018), which uses the Ghisellini 1998 approximation
    • The redshift was taken from SIMBAD
    • The value was around 1e9 cm for Ton 599 and around 3e10 cm for 3C 279
      • The difference for 3C 279 may be due to its different redshift and a Doppler boost that was not updated
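
A minimal sketch of the MET-to-MJD conversion and the emitting-region upper limit mentioned above, assuming the standard Fermi MET epoch (2001-01-01 00:00 UTC = MJD 51910, leap seconds ignored) and the usual causality bound R <= c * delta * dt / (1 + z); the function names and example numbers are illustrative, not the values actually used:

    C_CM_S = 2.998e10     # speed of light [cm/s]
    MJD_REF = 51910.0     # MJD of the Fermi MET epoch (2001-01-01)

    def met_to_mjd(met_s):
        """Convert Fermi Mission Elapsed Time [s] to MJD (leap seconds ignored)."""
        return MJD_REF + met_s / 86400.0

    def emitting_region_limit(dt_var_s, doppler, z):
        """Upper limit [cm] on the emitting-region size from the variability time."""
        return C_CM_S * doppler * dt_var_s / (1.0 + z)

    # Placeholder inputs, not the analysis values:
    print(emitting_region_limit(dt_var_s=3600.0, doppler=10.0, z=0.7))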

Friday, July 31st

  • Modified the code to include the associated Chi-Squared value with each fit and report whether the model is a good fit for the data (see the sketch after this list)
  • Added a constant flux term to the exponential profile to try and ensure better fits for the flares modeled
    • This was a double-edged sword: all of the flared sections were fit well by the curves, but the extra parameter produced many fits that were not physically reasonable given the rest of the data (the curve would chase individual data points instead of capturing the big picture)
  • Worked out many bugs in the analysis of the two flares identified for Ton 599
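
A minimal sketch of the goodness-of-fit check described above, assuming the residuals are weighted by the per-point flux errors; the names are illustrative:

    import numpy as np
    from scipy import stats

    def reduced_chi_squared(flux, flux_err, model_flux, n_params):
        """Reduced chi-squared and p-value for a fitted light-curve model."""
        resid = (flux - model_flux) / flux_err
        chi2 = np.sum(resid**2)             # note: divide by the error *squared*
        dof = len(flux) - n_params          # degrees of freedom
        p_value = stats.chi2.sf(chi2, dof)  # chance of a chi2 at least this large
        return chi2 / dof, p_value

A reduced chi-squared near 1 (equivalently, a p-value that is not tiny) indicates a reasonable fit; values far above 1 flag a poor fit, and values far below 1 suggest overfitting, as seen after adding the constant flux term.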

Thursday, July 30th

  • Prepared for and hosted Journal Club
  • Ran the FermiPy analysis on the two flared regions identified in the Ton 599 monthly light curve (see the sketch after this list)
    • This was a 3 day binned analysis
  • Ran the series of algorithms written for flare analysis on these finer analyses
    • There appears to be a bug in the flare modeling system with the finer binned data that will need to be fixed
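
A minimal sketch of what the 3-day binned FermiPy run might look like; the config file name and source name are placeholders, and the real analysis settings are not recorded in this journal:

    from fermipy.gtanalysis import GTAnalysis

    gta = GTAnalysis('config.yaml', logging={'verbosity': 3})
    gta.setup()      # standard data preparation for the ROI
    gta.optimize()   # fit the ROI model before extracting the light curve
    # Source name must match the name in the ROI model:
    lc = gta.lightcurve('Ton599', binsz=3 * 86400.0)  # 3-day bins (binsz in seconds)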

Wednesday, July 29th

  • Debugged the flare detection method to work with flared regions that are not strictly decreasing after the peak
  • Determined the 'flared zones' for the source Ton599
  • Tried to get the flare plotting to work for those zones, but the curves did not fit properly
    • This is likely due to insufficient data
  • Put work into running the 3-day binned analysis on the specific zones that were isolated
    • Tried to debug some errors, but had trouble finding the root cause

Tuesday, July 28th

  • Worked out the issue with the flare profiling not mapping to the correct flare
    • The issue was with the initial values supplied to the curve_fit method. To solve the problem, I set the initial t0 and F0 values to lie close to the peak so that the fit converges on the correct flare (see the sketch after this list)
  • Solved the issue with the piecewise exponential function
    • Decided against using it in the short term: even though it is simpler, the flare modeling is worse in my opinion, and I prefer the smooth composite profile to the sharp peak.
      • Could be subject to future scrutiny though
  • Consolidated the many methods written surrounding flare analysis into a single concise document
  • Figured out how to import jupyter notebooks into other jupyter notebooks (%run)
  • Ran some of the introductory analysis on the Monthly Light Curve for Ton599
    • The QF is extremely low for this source, which might lead to too many flared regions when detecting flares.
      • Might just jump straight from the monthly binning to mapping the entire curve at 3-day binning
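
A minimal sketch of the p0 fix described above, seeding t0 and F0 at the observed peak; the profile function is a stand-in for whichever flare profile is being fit, and the rise/decay seeds are illustrative:

    import numpy as np
    from scipy.optimize import curve_fit

    def fit_flare(profile, t, flux, flux_err):
        """Fit a flare profile, seeding t0/F0 at the observed peak so the
        optimizer locks onto the intended flare rather than a neighbor."""
        i_peak = np.argmax(flux)
        p0 = [flux[i_peak],  # F0: start at the peak flux
              t[i_peak],     # t0: start at the peak time
              5.0, 5.0]      # Tr, Td: rough rise/decay timescales [days]
        return curve_fit(profile, t, flux, p0=p0, sigma=flux_err,
                         absolute_sigma=True)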

Monday, July 27th

  • Completed the Monthly binned lightcurve analysis for Ton599 after many technical issues
  • Moved the data into a jupyter notebook and plotted the data
  • Ran Bayesian Blocks on the resulting data
  • Began looking into supplying a prior prediction (an initial parameter guess) to the curve_fit model in order to make it fit the data better

Friday, July 24th

  • Finished the base analysis: the mean was very good but the SD was a little too low (0.84)
    • Decided to continue on to the lightcurve analysis
  • Spent a long time trying to debug the lightcurve analysis, especially running multiple sections at a time
    • After eventually getting it to work, I ran the lightcurve analysis but for some reason, none of the sections actually finished, resulting in an empty lightcurve
    • Will need to do further investigation over the weekend into why the individual sections won't return their individual lightcurves
  • Did some debugging on the Flare Plotting/analysis
    • Am having difficulty with the piecewise functions required for the piecewise exponential function; will try to investigate how to solve that specific problem over the weekend
  • Found a journal article for the Journal Club

Thursday, July 23rd

  • Ran the Ton 599 base analysis with a higher TS requirement (changed it to 64, but will likely have to rerun it going up to 100)
  • Implemented a way to restrict the BBs used to a given flare
  • Debugged the existing flare plotting algorithms
    • The detection method sometimes mismaps, latching onto a small data perturbation as opposed to a noticeable flare
  • Worked on implementing a new flare profile as a piecewise exponential function, as opposed to the sum of reciprocal exponentials currently being used, to make it both easier to work with and possibly more accurate (see the sketch below)
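
A minimal sketch of what the piecewise exponential profile could look like, assuming an exponential rise before t0 and an exponential decay after it, with the F0/t0/Tr/Td parameter names used elsewhere in this journal:

    import numpy as np

    def piecewise_exp_profile(t, F0, t0, Tr, Td):
        """Piecewise exponential flare: rise before t0, decay after.
        Continuous at t0 (both branches equal F0) but with a sharp peak."""
        t = np.asarray(t, dtype=float)
        return np.piecewise(t, [t < t0, t >= t0],
                            [lambda t: F0 * np.exp((t - t0) / Tr),
                             lambda t: F0 * np.exp(-(t - t0) / Td)])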

Wednesday, July 22nd

  • Debugged the initial base analysis for Ton 599; the overall fit for the source was poor.
    • The mean displayed in the significance histogram was around 1.20
  • The Ton 599 base analysis was restarted with a higher TS requirement (25 -> 36) and a reduced distance (5.0 -> 4.0)
  • The flare detection algorithm was edited/debugged.
  • The capability to determine and plot the exponential profile for a given flare was worked on
    • There is currently an overflow error: the numbers exceed the bits available to store them, being too large by roughly 10-15 orders of magnitude (see the sketch after this list)
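
One plausible cause of this overflow: feeding raw MET timestamps (of order 1e8 seconds) into the exponential profile, since np.exp overflows float64 for arguments above roughly 709. A common fix, sketched below with placeholder values, is to shift and rescale the time axis before fitting:

    import numpy as np

    # Placeholder MET timestamps of order 1e8 s; real values come from the data.
    t_met = np.linspace(5.55e8, 5.56e8, 100)

    # Work in days relative to the start of the window, so the exponents the
    # profile sees are of order 1-100 instead of order 1e7.
    t_days = (t_met - t_met[0]) / 86400.0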

Tuesday, July 21st

  • Fixed some bugs within the flare detection methods and added the capability of having strictly decreasing flare detection
  • Continued an investigation into the connections between the exponential profile model used and the specific parameters
    • A Desmos graph was produced, which can be shared on demand
  • Set up the Fermi analysis pipeline on the Linux cluster
  • Started the base analysis for Ton 599 to prepare for flare detection and analysis

Monday, July 20th

  • Wrote up code to take the Bayesian blocks and the baseline Quiescent Flux to determine the time intervals over which there are flares.
    • This is determined by finding the peak of a given flare and then continuing down the flare up until a certain baseline is reached, at which point the flare is terminated
    • The initial peak must be higher than 3 times the QF level (modifiable), and the flare is terminated when the BB flux drops below 1 SD above the QF level (the SD is calculated from the QF data range returned by the find-QF method; the number of SDs required to end the flare is also modifiable). See the sketch after this list
    • The method might clump adjacent peaks together, so the current code will need to be modified to handle this
  • Began looking into the actual meanings of the parameters associated with the exponential profile function.
    • Determined the conversion from the t0 value to the actual peak of the flare (the offset between t0 and the true peak is set by the asymmetry of the flare itself)
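
A minimal sketch of the interval-finding logic described above, assuming the Bayesian-block fluxes are in time order; the threshold factors match the description (start above 3x QF, end below QF + 1 SD), both modifiable, and the names are illustrative:

    import numpy as np

    def find_flare_intervals(bb_flux, qf, qf_sd, peak_factor=3.0, end_sds=1.0):
        """Return (start, end) index pairs for flares in a Bayesian-block
        flux series: a flare must exceed peak_factor * qf, and it ends once
        the block flux drops below qf + end_sds * qf_sd."""
        intervals = []
        i, n = 0, len(bb_flux)
        while i < n:
            if bb_flux[i] > peak_factor * qf:  # block above the peak threshold
                start = i
                while i < n and bb_flux[i] >= qf + end_sds * qf_sd:
                    i += 1                     # walk down the flare to the baseline
                intervals.append((start, i))
            else:
                i += 1
        return intervals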

Friday, July 17th

  • Implemented code that optimizes an exponential profile to given time-series data to model a flare (see the sketch after this list)
    • The exponential profile used has 4 free parameters: F0, t0, Tr, and Td. These are determined with the scipy.optimize.curve_fit function, which returns these coefficients
    • It was tested on some simulated flares and on an actual flare zoned in on from 3C 279
  • Wrote a code that is able to determine the quiescent flux level for a given time signal
    • This code is a modified version of the pseudocode in Meyer et al. (2019) for determining the quiescent flux level
    • There are 2 major alterations. First, the array of test QFs, which is supposed to run from the minimum value to the mean value, was truncated at a certain point to ensure that there are enough data points in the time signal to create an even distribution. This was done by hand today but will be automated in the future
    • Second, instead of using the ratio of posterior probabilities from Wolpert (1996) as done in the paper, a short-term substitute was used: measuring how centered the peak of the CDF is for these curves. This was done because the Wolpert (1996) method could not be extended to the signals used, as not all of the necessary parameters were available. The likely next step, to be investigated further, is an implementation of chi-squared or the Wolpert method
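
A minimal sketch of the profile and fit described above, assuming the commonly used two-sided exponential form F(t) = F0 / (exp((t0 - t)/Tr) + exp((t - t0)/Td)); under that form the true maximum sits at t0 + (Tr*Td/(Tr + Td)) * ln(Td/Tr), which is the t0-to-peak conversion noted in the July 20th entry:

    import numpy as np
    from scipy.optimize import curve_fit

    def exp_profile(t, F0, t0, Tr, Td):
        """Two-sided exponential flare profile: rise time Tr, decay time Td."""
        return F0 / (np.exp((t0 - t) / Tr) + np.exp((t - t0) / Td))

    def peak_time(t0, Tr, Td):
        """Time of the actual maximum; the offset from t0 is set by the
        asymmetry Tr/Td of the flare (it is 0 when Tr == Td)."""
        return t0 + (Tr * Td / (Tr + Td)) * np.log(Td / Tr)

    # popt = [F0, t0, Tr, Td]; t, flux, flux_err, and p0 come from the data:
    # popt, pcov = curve_fit(exp_profile, t, flux, p0=p0, sigma=flux_err)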

Thursday, July 16th

  • Modified, expanded, and finalized the txt file that was produced for the flare data of Ton 599
    • Added notes at the top including the many conversions required for the data, the original form of the data, and the wavelength bands that the data was collected in.
  • Began to look into the scipy.optimize module in preparation for writing a method to fit exponential profiles to a flare

Wednesday, July 15th

  • Created a txt file for all of the information collected about the Dec 2017 Flare of Ton 599
    • Includes observatory, observation time (if applicable), frequency, energy, and spectral density
  • Exported this txt file and sent it around, in addition to completing SEDs for Ton 599 containing both archival and flared data

Tuesday, July 14th

  • Converted the Jansky values and magnitude values to spectral energy density (conversions sketched after this list)
  • Took the SED values in the Fermi range from Figure 6 in the paper and plotted them on the SED for Ton 599
    • This finishes the SED for Ton 599 during the flare for now (the flared points and archival data are both plotted on the same graph)
  • Set the prior p0 for the Bayesian Blocks model equal to a standard p0 representing 2 sigma (p0 = 0.05)
  • Used the Bayesian Blocks method on Gwen's monthly light curve for 1ES 1218+304
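
A minimal sketch of the unit conversions described above, assuming the standard definitions 1 Jy = 1e-23 erg s^-1 cm^-2 Hz^-1 and F_nu = F_zero * 10^(-m/2.5); the zero point is band-dependent, and the AB value in the comment is only an example:

    import numpy as np

    JY_CGS = 1e-23  # 1 Jansky in erg s^-1 cm^-2 Hz^-1

    def jansky_to_nufnu(flux_jy, freq_hz):
        """nuFnu [erg s^-1 cm^-2] from a flux density in Jy at frequency nu."""
        return freq_hz * flux_jy * JY_CGS

    def mag_to_nufnu(mag, freq_hz, zero_point_jy):
        """nuFnu from a magnitude, given the band's flux zero point in Jy
        (e.g. ~3631 Jy for AB magnitudes; Vega zero points vary by band)."""
        flux_jy = zero_point_jy * 10.0 ** (-mag / 2.5)
        return jansky_to_nufnu(flux_jy, freq_hz)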

Monday, July 13th

  • Collected energy and spectral density information from the Prince 2018 paper and loaded it into a Jupyter notebook
    • Was able to plot about 3/4 of the data points (will work tomorrow on fixing the last few points, which have conversion issues)

Friday, July 10th

  • Used the 1ES 1215+303 light curve data from the Cutoff folder (both 30 and 3 day bins), plotted it, and used Bayesian Blocks on that data
  • Found the UVOT data associated with the flare for Ton 599 in Dec 2017 and downloaded all of the data in a folder filled with .img files.
  • Fixed the code associated with the first source of Janet's light curve data, and plotted Ari and Reshmi's data on the same plot to compare similarities and differences
    • The plots are nearly identical.
  • Wrote up a draft for the Rationale for the research plan

Thursday, July 9th

  • Took Gwen's 3 Day binned light curve data for 1ES 1215+303 and used the Bayesian Blocks method on it
  • Took Janet's light curve data for the source 3C 279 and used the Bayesian Blocks method on it
  • Created slides for the VERITAS Collaboration for next week about the Bayesian Blocks method and using it on the sources 3C 279 and 1ES 1215+303

Wednesday, July 8th

  • Finished reading the paper on the multiwavelength analysis of Ton 599
  • Continued searching for the data from the flare that occurred in Dec 2017

Tuesday, July 7th

  • Used the Ton 599 data that was imported into a Jupyter notebook as a txt file and converted it into a spectral plot of nuFnu against energy in GeV
  • Tried to find data for the flare that occurred in Dec. 2017, but was unable to find it through the paper linked below or through the ASDC catalog itself
  • Read Parts 1 and 2 of the journal paper Multi-frequency Variability Study of Ton 599 during the High Activity of 2017

Monday, July 6th

  • Learned about and read into the ASDC SED Builder and how to work it
  • Created an SED for the source Ton599 over the entire timeframe of collected data and exported it as an ASCII file
  • Transferred this ASCII file to the Tehanu Server and opened it in a Jupyter Notebook
  • Tried to create a SED for Ton599 with a limited time frame around Dec 2017 (the time of a significant flare), but there was no data present in that timeframe

Friday, July 3rd

  • Restricted the data entries from 1ES 1215+303 based on their Test Statistic values to increase the significance of the plot (see the sketch after this list)
    • Only entries with TS>=16 were kept
  • The Bayesian Blocks model was repeated on this restricted data graphing the flux relative to time
  • The flux histograms were redone(both normal and log version) using the restricted dataset
  • A plot was made plotting dN/dE relative to Pivot Energy
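
A minimal sketch of the TS cut, assuming the light-curve columns are numpy arrays; TS >= 16 corresponds to roughly a 4-sigma detection per bin:

    import numpy as np

    def restrict_by_ts(t, flux, flux_err, ts, ts_min=16.0):
        """Keep only light-curve bins detected with TS >= ts_min."""
        mask = np.asarray(ts) >= ts_min
        return t[mask], flux[mask], flux_err[mask]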

Thursday, July 2nd

  • Experimented with the p0 value on the indicated graph to determine an adequate value for the Bayesian Blocks
  • Created a few histograms of the flux values, both with the actual flux values and a logarithmic version
    • At first glance, the data does not appear to be normally or lognormally distributed
  • Simplified the bayesian blocks presentation

Wednesday, July 1st

  • Took Gwen's light curve data on the source 1ES 1215+303 and imported it into a separate jupyter notebook
    • debugged the code/data to ensure that it would run/graph correctly
  • Modified Ari's code for creating Bayesian Blocks in order to use it with Gwen's data (see the sketch after this list)
    • The end product was a Bayesian Block model of Gwen's light curve
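
A minimal sketch of running Bayesian Blocks on a binned light curve with astropy (which may differ from Ari's implementation), assuming flux measurements with errors (the 'measures' fitness); the array names are illustrative:

    from astropy.stats import bayesian_blocks

    def block_edges(t, flux, flux_err, p0=0.05):
        """Bayesian-block change points for a measured (binned) light curve;
        p0 = 0.05 is the ~2-sigma false-alarm prior adopted later on."""
        return bayesian_blocks(t, flux, flux_err, fitness='measures', p0=p0)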

Tuesday, June 30th

  • Had a group meeting with Ari, Gwen, Reshmi, and Qi
    • Gave the presentation on Bayesian Blocks
    • Discussed the future path for research
      • Do Bayesian Block analysis of Gwen's source/numbers
      • Eventually might do Bayesian Block analysis of other sources or Fermi analysis on some sources
  • Figured out how to open up Jupyter Notebook Files through the tehanu cluster
  • Copied Gwen's source files + code into my analysis folder
  • Began to play around with Bayesian Blocks in jupyter notebook

Monday, June 29th

  • Took a tutorial in TWiki to understand how to use and create topics/pages
  • Set up the research journal on TWiki
  • Created a powerpoint summarizing the idea of Bayesian Blocks; findings come from Scargle et al. 1998
    • Named Week 0 Update.pptx
  • Downloaded a Linux terminal environment (MobaXterm)
  • Looked over ways to manipulate/move directories in Unix
  • Read Intro to FermiPy presentation by Ari
-- Adam Oppenheimer - 2020-06-29
