Implemented code that fits an exponential profile to a given time series in order to model a flare
The exponential profile has four free parameters: F0, t0, Tr, and Td. These are determined through the scipy.optimize.curve_fit function, which returns the best-fit coefficients (see the sketch below)
It was tested on some simulated flares and on an actual flare isolated from the 3C 279 data
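A minimal sketch of the fit, assuming the double-exponential flare shape commonly used for blazar flares (amplitude F0, peak-crossing time t0, rise and decay timescales Tr and Td); the placeholder light curve and initial guesses are illustrative only:

    import numpy as np
    from scipy.optimize import curve_fit

    def flare_profile(t, F0, t0, Tr, Td):
        # Double-exponential flare (assumed functional form): rises on
        # timescale Tr before t0 and decays on timescale Td after it
        return F0 / (np.exp((t0 - t) / Tr) + np.exp((t - t0) / Td))

    # Placeholder light curve standing in for the real time-series data
    t_obs = np.linspace(0.0, 50.0, 200)
    flux = flare_profile(t_obs, 2.0, 25.0, 3.0, 8.0)
    flux = flux + np.random.normal(0.0, 0.05, t_obs.size)

    # Rough initial guesses help curve_fit converge
    p0 = [flux.max(), t_obs[flux.argmax()], 1.0, 1.0]
    popt, pcov = curve_fit(flare_profile, t_obs, flux, p0=p0)
    F0_fit, t0_fit, Tr_fit, Td_fit = popt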
Wrote code that determines the quiescent flux (QF) level for a given time signal
This code is a modified version of the pseudocode presented in Meyer et al. (2019) for determining the quiescent flux level
There are two major alterations to the code. The first is that the array of test QF values, which is supposed to run from the minimum flux value to the mean, was truncated at a certain point to ensure that there are enough data points in the time signal to create an even distribution. This cut was done by hand today but will be automated in the future (see the sketch after these notes)
The second alteration is that, instead of using the ratio of posterior probabilities from Wolpert (1996) as done in the paper, a short-term substitute was used: measuring how centered the peak of the CDF is for each curve. This was done because the Wolpert (1996) method could not be extended to the signals used here, since not all of the parameters it requires are available. This will be investigated further to determine the likely next step, which will probably be an implementation of chi-squared or the Wolpert method
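One plausible way to automate the truncation of the test-QF array, assuming the cut should keep only candidate levels with at least some minimum number of flux measurements at or below them; the helper name, threshold, and criterion are all hypothetical stand-ins for today's by-hand cut:

    import numpy as np

    def candidate_qf_levels(flux, n_levels=100, min_points=20):
        # Test QF values from the minimum to the mean flux, keeping only
        # levels with at least min_points measurements at or below them
        # (hypothetical criterion; the real cut was made by hand)
        levels = np.linspace(flux.min(), flux.mean(), n_levels)
        keep = np.array([(flux <= lv).sum() >= min_points for lv in levels])
        return levels[keep]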
Thursday, July 16th
Modified, expanded, and finalized the txt file that was produced for the flare data of Ton 599
Added notes at the top, including the many conversions required for the data, the original form of the data, and the wavelength bands in which the data was collected
Began to look into the scipy.optimize module in preparation for writing a method to fit exponential profiles to a flare
Wednesday, July 15th
Created a txt file for all of the information collected about the Dec 2017 flare of Ton 599
Includes observatory, observed time (if applicable), frequency, energy, and spectral density
Exported this txt file and sent it around, in addition to completing SEDs for Ton 599 containing both archival and flared data
Tuesday, July 14th
Converted the Jansky values and magnitude values to spectral energy density (a conversion sketch follows these notes)
Took the SED values in the Fermi range from Figure 6 in the paper and plotted them on the SED for Ton 599
This finished up the SED for Ton 599 during the flare for now (the flared points and archival data are both plotted on the same graph)
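A minimal sketch of the unit conversions, assuming fluxes in Jy are scaled by 1 Jy = 1e-23 erg s^-1 cm^-2 Hz^-1 and magnitudes are converted with an AB-style zero point of 3631 Jy; the zero point is an assumption, since each band in the real data may use a different one:

    import numpy as np

    JY_TO_CGS = 1e-23          # 1 Jy in erg s^-1 cm^-2 Hz^-1
    AB_ZEROPOINT_JY = 3631.0   # assumed zero point; band-specific values differ

    def jy_to_nufnu(flux_jy, freq_hz):
        # Spectral energy density nuFnu [erg s^-1 cm^-2] = nu * F_nu
        return freq_hz * flux_jy * JY_TO_CGS

    def mag_to_nufnu(mag, freq_hz, zeropoint_jy=AB_ZEROPOINT_JY):
        # Magnitude -> flux density in Jy, then on to nuFnu
        flux_jy = zeropoint_jy * 10.0 ** (-0.4 * mag)
        return jy_to_nufnu(flux_jy, freq_hz)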
Set the prior p0 for the Bayesian Blocks model equal to a standard value representing 2 sigma (p0 = 0.05)
Used the Bayesian Blocks method on Gwen's monthly light curve for 1ES 1218+304
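A minimal sketch of the Bayesian Blocks call, assuming the astropy implementation and placeholder arrays standing in for the light-curve columns:

    import numpy as np
    from astropy.stats import bayesian_blocks

    # Placeholder monthly light curve (time, flux, flux error)
    t = np.arange(0.0, 60.0)
    flux = np.random.normal(1.0, 0.1, t.size)
    flux_err = np.full(t.size, 0.1)

    # The 'measures' fitness handles flux measurements with errors;
    # p0=0.05 is the ~2-sigma false-alarm probability noted above
    edges = bayesian_blocks(t, flux, flux_err, fitness='measures', p0=0.05)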
Monday, July 13th
Collected energy and spectral density information from Prince (2018) to upload into a Jupyter notebook
Was able to plot about 3/4 of the data points (will work tomorrow on fixing the last few points, which have conversion issues)
Friday, July 10th
Used the 1ES 1215+303 light curve data from the Cutoff folder (both 30 and 3 day bins), plotted it, and used Bayesian Blocks on that data
Found the UVOT data associated with the flare for Ton 599 in Dec 2017 and downloaded all of the data in a folder filled with .img files.
Fixed the code associated with the first source of Janet's light curve data, and plotted Ari and Reshmi's data on the same plot to compare similarities and differences
The plots are nearly identical.
Wrote up a draft for the Rationale for the research plan
Thursday, July 9th
Took Gwen's 3 Day binned light curve data for 1ES 1215+303 and used the Bayesian Blocks method on it
Took Janet's light curve data for the source 3C 279 and used the Bayesian Blocks method on it
Created slides for the VERITAS Collaboration for next week about the Bayesian Blocks method and using it on the sources 3C 279 and 1ES 1215+303
Wednesday, July 8th
Finished reading the paper on the multiwavelength analysis of Ton 599
Continued searching for the data from the flare that occurred in Dec. 2017
Tuesday, July 7th
Took the Ton 599 data that had been imported into a Jupyter notebook as a txt file and converted it into a spectral plot relating nuFnu to the energy in GeV (see the sketch below)
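A minimal plotting sketch, assuming a two-column txt file of energy in GeV and nuFnu values; the file name and column layout are placeholders:

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical two-column file: energy [GeV], nuFnu [erg s^-1 cm^-2]
    energy_gev, nufnu = np.loadtxt("ton599_sed.txt", unpack=True)

    plt.loglog(energy_gev, nufnu, "o")
    plt.xlabel("Energy [GeV]")
    plt.ylabel("nuFnu [erg / s / cm^2]")
    plt.title("Ton 599 SED")
    plt.show()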
Tried to find data for the flare that occurred in Dec. 2017, but was unable to find it through the paper linked below or through the ASDC catalog itself
Read Parts 1 and 2 of the journal paper Multi-frequency Variability Study of Ton 599 during the High Activity of 2017
Learned about the ASDC SED Builder and read into how to work it
Created an SED for the source Ton 599 over the entire timeframe of collected data and exported it as an ASCII file
Transferred this ASCII file to the Tehanu Server and opened it in a Jupyter Notebook
Tried to create an SED for Ton 599 with a limited time frame around Dec. 2017 (the time of a significant flare), but there was no data present in that timeframe
Friday, July 3rd
Restricted the data entries from 1ES 1215+303 based on their Test Statistic (TS) values to increase the significance of the plot
Only entries with TS >= 16 were kept (see the filtering sketch below)
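A minimal sketch of the TS cut, assuming the light curve is loaded as NumPy arrays with a TS column; the file name and column order are placeholders:

    import numpy as np

    # Hypothetical three-column light curve: time, flux, TS
    time, flux, ts = np.loadtxt("1ES1215_lightcurve.txt", unpack=True)

    # Keep only bins with TS >= 16 (sqrt(TS) gives roughly a 4-sigma detection)
    mask = ts >= 16
    time, flux = time[mask], flux[mask]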
The Bayesian Blocks model was rerun on this restricted data, graphing the flux relative to time
The flux histograms were redone (both the normal and log versions) using the restricted dataset
A plot of dN/dE relative to pivot energy was made
Thursday, July 2nd
Experimented with the p0 value on the indicated graph to determine an adequate value for the Bayesian Blocks
Created a few histograms of the flux values, both with the actual flux values and a logarithmic version (see the sketch below)
On first glance, the data does not appear to be normally or lognormally distributed
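A minimal histogram sketch, assuming a flux array already in memory; the placeholder data and bin count are illustrative:

    import numpy as np
    import matplotlib.pyplot as plt

    flux = np.random.exponential(1.0, 500)  # placeholder flux values

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.hist(flux, bins=30)                 # histogram of the raw fluxes
    ax1.set_xlabel("Flux")
    ax2.hist(np.log10(flux), bins=30)       # logarithmic version
    ax2.set_xlabel("log10(Flux)")
    plt.show()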
Simplified the Bayesian Blocks presentation
Wednesday, July 1st
Took Gwen's light curve data on the source 1ES 1215+303 and imported it into a separate Jupyter notebook
Debugged the code and data to ensure that it would run and graph correctly
Modified Ari's code to create Bayesian Blocks in order to use it in conjunction with Gwen's data
The end product was a Bayesian Block model of Gwen's light curve
Tuesday, June 30th
Had a group meeting with Ari, Gwen, Reshmi, and Qi
Gave the presentation on Bayesian Blocks
Discussed the future path for research
Do Bayesian Blocks analysis of Gwen's source/numbers
Eventually might do Bayesian Blocks analysis of other sources or Fermi analysis on some sources
Figured out how to open Jupyter notebook files through the tehanu cluster
Copied Gwen's source files + code into my analysis folder
Began to play around with Bayesian Blocks in a Jupyter notebook
Monday, June 29th
Took a tutorial in TWiki to understand how to use and create topics/pages
Set up the research journal on TWiki
Created a PowerPoint summarizing the idea of Bayesian Blocks; findings come from Scargle (1998)