Kashyap, Vinay L.; Guarcello, Mario G.; Wright, Nicholas J.; Drake, Jeremy J.; Flaccomio, Ettore; Aldcroft, Tom L.; Albacete Colombo, Juan F.; Briggs, Kevin; Damiani, Francesco; Drew, Janet E.; Martin, Eduardo L.; Micela, Giusi; Naylor, Tim; Sciortino, Salvatore
Bibliographic reference
The Astrophysical Journal Supplement Series
Publication date:
November 2023
Citation count
6
Refereed citation count
6
Description
We have devised a predominantly Naive Bayes-based method to classify X-ray sources detected by Chandra in the Cygnus OB2 association into members, foreground objects, and background objects. We employ a variety of X-ray, optical, and infrared characteristics to construct likelihoods using training sets defined by well-measured sources. Combinations of optical photometry from the Sloan Digital Sky Survey (riz) and the Isaac Newton Telescope Photometric Hα Survey (r_I, i_I, Hα), infrared magnitudes from the United Kingdom Infrared Telescope Infrared Deep Sky Survey and the Two-Micron All Sky Survey (JHK), X-ray quantiles and hardness ratios, and estimates of extinction A_V are used to compute the relative probabilities that a given source belongs to one of the classes. Principal component analysis is used to isolate the best axes for separating the classes in the photometric data, and Gaussian component separation is used for X-ray hardness and extinction. Measurement errors are accounted for by modeling them as Gaussians and integrating over likelihoods approximated as quartic polynomials. We evaluate the accuracy of the classification by inspection and reclassify a number of sources based on infrared magnitudes, the presence of disks, and spectral hardness induced by flaring. We also consider systematic errors due to extinction. Of the 7924 X-ray detections, 5501 have a total of 5597 optical/infrared matches, including 78 with multiple counterparts. We find that ≈6100 objects are likely association members, ≈1400 are background objects, and ≈500 are foreground objects, with accuracies of 96%, 93%, and 80%, respectively; the overall classification accuracy is approximately 95%.
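The core step described above — combining per-feature likelihoods under a naive (conditional-independence) assumption to obtain relative class membership probabilities — can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the class names match the abstract, but the Gaussian per-feature likelihoods, the feature names, and the toy model parameters are assumptions standing in for the trained densities, PCA projections, and error integration the authors use.

```python
import math

# Hypothetical sketch: naive Bayes class probabilities from per-feature
# likelihoods. Real likelihoods in the paper come from training sets of
# well-measured sources; here they are illustrative 1-D Gaussians.

CLASSES = ["member", "foreground", "background"]

def gaussian_like(x, mu, sigma):
    """Gaussian likelihood of one feature value given class parameters."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(features, models, priors):
    """Return normalized posterior probabilities for each class.

    features: {feature_name: measured value}
    models:   {class_name: {feature_name: (mu, sigma)}}
    priors:   {class_name: prior probability}
    """
    post = {}
    for cls, feat_models in models.items():
        like = priors[cls]
        for name, x in features.items():
            mu, sigma = feat_models[name]
            like *= gaussian_like(x, mu, sigma)  # naive independence across features
        post[cls] = like
    total = sum(post.values())
    return {cls: p / total for cls, p in post.items()}

# Toy usage: a soft X-ray hardness ratio favors the "member" class here.
models = {
    "member":     {"hardness": (0.0, 1.0)},
    "foreground": {"hardness": (-2.0, 1.0)},
    "background": {"hardness": (3.0, 1.0)},
}
priors = {"member": 0.6, "foreground": 0.1, "background": 0.3}
probs = classify({"hardness": 0.2}, models, priors)
```

The relative probabilities always sum to one, so the classification reduces to picking the class with the largest posterior; in the paper this is done over many more features, with measurement errors integrated over rather than plugged in as point values.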