As presented in today's algorithms meeting, we observe a scale factor (SF) strangely far from unity in the (60-70%, 175 < pT < 200 GeV) bin. DL1 does not show this feature. We would like to understand whether it could come from an efficiency that changes quickly with pT, or with some other observable that is not perfectly modeled in the simulation.
For example, if the efficiency varies strongly with pT within this bin, and the pT distribution inside the bin is mismodeled (perhaps because the taggers are run before the full JES calibration is applied), then the bin-averaged efficiencies in data and MC would differ even if the per-pT efficiency agrees, and we would expect a non-unity scale factor.
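The mechanism can be sketched with a toy average: the efficiency shape, the spectra, and all numbers below are purely illustrative assumptions, not taken from the actual calibration. If the efficiency rises steeply across the bin and the "data" pT spectrum is slightly harder than the "MC" one, the bin-averaged efficiencies differ and the SF deviates from one even though the underlying per-pT efficiency is identical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def efficiency(pt):
    # Hypothetical tagger efficiency, rising quickly across the bin
    # (from ~0.40 at 175 GeV to ~0.70 at 200 GeV).
    return 0.40 + 0.012 * (pt - 175.0)

# "MC" pT spectrum inside the bin: falling exponential.
pt_mc = 175.0 + rng.exponential(scale=8.0, size=n)
pt_mc = pt_mc[pt_mc < 200.0]

# "Data" spectrum: slightly harder, mimicking a residual JES-like
# mismodeling of the in-bin pT distribution.
pt_data = 175.0 + rng.exponential(scale=11.0, size=n)
pt_data = pt_data[pt_data < 200.0]

# Bin-averaged efficiencies and the resulting scale factor.
eff_mc = efficiency(pt_mc).mean()
eff_data = efficiency(pt_data).mean()
sf = eff_data / eff_mc
print(f"<eff> MC = {eff_mc:.3f}, <eff> data = {eff_data:.3f}, SF = {sf:.3f}")
```

With these toy numbers the SF comes out a few percent above unity purely from the spectrum difference; the same logic applied with a flat in-bin efficiency would give SF = 1, which is one way to test the hypothesis.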