Assessing the Performance of Emerging and Existing Continuous Monitoring Solutions under a Single-blind Controlled Testing Protocol
Comments
The text lists 10 participating solutions (from 10 different companies), but Table 1 lists a total of 13 technology solutions (10 point-source networks and 3 scanning/imaging systems). This implies that at least one of the 10 companies deployed more than one solution. Could the authors please clarify this? Is there any potential that one solution benefited from receiving data from another solution? For a company that deployed more than one solution, can we be assured that emission reports were prepared independently? Relatedly, is there any potential that a participating company shared data with another company, for example a point-source solution sharing with a scanning/imaging solution?
Page 14 of the text tags Solution C as "the solution with the largest 90% DL" (DL meaning detection limit). It's not apparent why the authors chose to point out the 90% POD value predicted by Solution C's power-curve fit (Figure 2) rather than, say, doing the same for Solution R, whose power fit predicts a 90% POD of around 1350 kg/hr. Nor is it apparent why the several solutions with an estimated 90% POD of infinity were not considered the solutions with the largest 90% DL. Perhaps it's because Solution C's predicted 90% POD of 76.5 kg/hr isn't quite as ridiculous as Solution R's 1350 kg/hr (or the predictions of infinity for other solutions) and was therefore an easier target to single out in the manuscript.
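To make the extrapolation hazard concrete, here is a minimal sketch of fitting a power-law POD model and solving it for the 90% DL. The model form POD(q) = min(1, a*q^b), the data, and the fitted numbers are my own illustrative assumptions, not the authors' parameterization or results; the point is only that a shallow fitted exponent pushes the solved 90% DL far outside the tested range, and a fit that plateaus below 0.9 admits no finite DL at all.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical power-law POD model: POD(q) = min(1, a * q**b).
# This form and every number below are illustrative; they are not
# the authors' parameterization or data.
def pod(q, a, b):
    return np.minimum(1.0, a * np.power(q, b))

# Illustrative detections: release rates (kg/hr) and observed
# detection fractions for a curve that flattens well below 0.9.
rates = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
frac = np.array([0.45, 0.55, 0.60, 0.66, 0.70, 0.73, 0.76])

(a, b), _ = curve_fit(pod, rates, frac, p0=[0.5, 0.2])

# Solving a * q**b = 0.9 for q gives the extrapolated 90% DL.
# With a shallow exponent b this lands far beyond the tested
# rates; if the fitted curve plateaus below 0.9 there is no
# finite solution at all, i.e., a 90% DL of "infinity".
q90 = (0.9 / a) ** (1.0 / b)
print(f"a = {a:.3f}, b = {b:.3f}, extrapolated 90% DL ~ {q90:.0f} kg/hr")

Run on these made-up points, the solved 90% DL lands well above the largest tested rate, which is precisely why a value like 76.5 or 1350 kg/hr (or infinity) should not be read as a meaningful capability estimate.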
To extend my comment on the last line of p. 18 regarding Solution C's 90% POD results for 2024 vs. 2022, I would point out that the authors themselves say the following in Table 2's footnote for Solution C: "Solutions with no observable POD trend with emission rates or whose 90% DL is significantly outside the range of tested rates. These POD curves and DLs should be used with caution." On page 18, despite Solution C's POD curve showing minimal POD trend, the authors violate their own caution by using that curve to project out to an unrealistic 90% POD value and then using the projection to make an unsupported assertion about Solution C's capabilities. By their own footnote, this is an inappropriate use of the data. Underlying this flattening of the POD curve is the assumption that release rate is the dominant factor controlling the probability of detection, but other factors are not considered. In fact, digging deeper into the results, one finds that Solution C appears to be sensitive enough that release rate was not a substantial limitation on detection frequency. Instead, the factor affecting detection frequency, and thus the POD curve, was the number of releases per experiment. This again points to aspects of the ADED protocol that ought to be discussed, or at least acknowledged, by the authors.
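To illustrate the kind of diagnostic that supports this point, below is a minimal sketch comparing per-experiment detection frequency against the number of releases per experiment rather than against release rate. The data layout, column names, and values are entirely hypothetical assumptions of mine, not the ADED schema; only the shape of the check matters.

import pandas as pd

# Hypothetical per-release records for one solution; 'detected'
# is 0/1. Neither the columns nor the values reflect ADED data.
df = pd.DataFrame({
    "experiment": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
    "rate_kg_hr": [0.8, 5.0, 20.0, 2.0, 40.0, 1.0, 3.0, 9.0, 30.0, 15.0],
    "detected":   [1,   1,   0,    1,   1,    1,   0,   1,   0,    1],
})

# Detection frequency per experiment vs. the number of concurrent
# releases in that experiment: if frequency falls as the release
# count rises while showing no trend with rate, the count -- not
# the rate -- is the factor limiting detection.
per_exp = df.groupby("experiment").agg(
    n_releases=("detected", "size"),
    det_freq=("detected", "mean"),
)
print(per_exp)
print("corr(rate, detected):",
      round(df["rate_kg_hr"].corr(df["detected"]), 2))
print("corr(n_releases, det_freq):",
      round(per_exp["n_releases"].corr(per_exp["det_freq"]), 2))

If such a check showed detection frequency tracking the release count rather than the release rate, it would directly undercut a rate-only POD curve for this solution.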