Fun with HiggsCombine
Nick Amin
September 6, 2018
Replicating AN numbers

⚫ Take the cards associated with the 2016 paper (which were given for the combination) and add
    lumiscale rateParam * * 1.
    theorySysts group = pdf alphas isrvar fsrvar scale
    expSysts group = jes jer isr bb lep lephlt hthlt btag pu
    datadrivenSyst group = TTWSF TTZSF rares fakes fakes_EWK flips
    backgrounds group = TTH TTVV XG
⚫ Two extrapolation scenarios
  • S2 ("least conservative")
    text2workspace.py --channel-masks v0.10_paper_forDenys/card_tttt_srcr.txt --X-nuisance-group-function 'expSysts' 'expr::scaleexpSysts("1/sqrt(@0)",lumiscale[1])' --X-nuisance-group-function 'datadrivenSyst' 'expr::scaledatadrivenSyst("1/sqrt(@0)",lumiscale[1])' --X-nuisance-group-function 'theorySysts' '0.5' --X-nuisance-function 'lumi' '0.4' --X-nuisance-function 'tttt' '0.5' --X-nuisance-group-function 'backgrounds' '0.5'
  • S2NF ("more conservative")
    text2workspace.py --channel-masks v0.10_paper_forDenys/card_tttt_srcr.txt --X-nuisance-group-function 'expSysts' 'expr::scaleexpSysts("max(0.5,1/sqrt(@0))",lumiscale[1])' --X-nuisance-group-function 'datadrivenSyst' 'expr::scaledatadrivenSyst("1/sqrt(@0)",lumiscale[1])' --X-nuisance-group-function 'theorySysts' '0.5' --X-nuisance-function 'lumi' '0.4' --X-nuisance-function 'tttt' '0.5' --X-nuisance-group-function 'backgrounds' '0.5'
⚫ Two energies
  • For 14 TeV, scale the main backgrounds by 14/13 TeV k-factors with the following included in the card before the "lumiscale rateParam * * 1." line:
    scale14tttt rateParam * tttt 1.33 [1.33,1.33]
    scale14fakes rateParam * fakes 1.19 [1.19,1.19]
    scale14tth rateParam * tth 1.24 [1.24,1.24]
    scale14ttw rateParam * ttw 1.16 [1.16,1.16]
    scale14ttz rateParam * ttz 1.21 [1.21,1.21]
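As a quick cross-check of what these expressions do, a throwaway sketch (the loop values are just the lumiscale factors used later in this deck; 3000/35.87 = 83.635):

    # Evaluate the expSysts scaling in the two scenarios at a few lumiscale values.
    for SCALE in 4.1818 8.3635 83.635; do
      awk -v s="$SCALE" 'BEGIN{
        s2   = 1/sqrt(s)                 # S2:   1/sqrt(@0)
        s2nf = (s2 > 0.5) ? s2 : 0.5     # S2NF: max(0.5, 1/sqrt(@0)), floored at 0.5
        printf "lumiscale=%-8.4f S2=%.3f  S2NF=%.3f\n", s, s2, s2nf
      }'
    done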
Replicating AN numbers

⚫ After setting up a given workspace from the previous slide, get the expected significances
  • 300/35.87 = 8.3635; 150/35.87 = 4.1818
    combine -M ProfileLikelihood v0.10_paper_forDenys/card_tttt_srcr.root --significance -t -1 --expectSignal=1 --setParameters lumiscale=8.3635
⚫ Values agree within ~2%, except for S2NF at 14 TeV and 3 ab^-1, which is ~5% off
⚫ When plotting my values vs lumi, not sure what happens with the jumps…

AN values:
    lumi [ifb]   sqrt(s) [TeV]   S2 (least conservative)   S2NF (more conservative)
    150          13              2.029                     2.027
    150          14              2.411                     2.408
    300          13              2.678                     2.636
    300          14              2.817                     2.780
    3000         13              3.962                     3.770
    3000         14              4.443                     4.240

[Plot: my expected significance (sigma) vs lumi [ifb]; curves for S2 (least conservative), S2NF (more conservative), and naive (only lumiscale)]
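For the plot of significance vs lumi, a sketch of the kind of scan loop this corresponds to (assuming the S2 or S2NF workspace from the previous slide; the lumi grid and the output parsing are illustrative, the combine command itself is the one above):

    # Hypothetical scan: expected significance vs lumi, converting lumi [ifb]
    # to lumiscale via lumi/35.87 and reusing the combine command above.
    for LUMI in 150 300 500 1000 1500 2000 2500 3000 3500; do
      SCALE=$(awk -v l="$LUMI" 'BEGIN{printf "%.4f", l/35.87}')
      combine -M ProfileLikelihood v0.10_paper_forDenys/card_tttt_srcr.root \
          --significance -t -1 --expectSignal=1 \
          --setParameters lumiscale=$SCALE \
        | grep "Significance:" \
        | awk -v l="$LUMI" '{print l, $2}'   # -> "lumi sigma" pairs to plot
    done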
Side note about our lumi scaling

⚫ We can add a lumi scaling rate parameter (at the bottom/end of the card!) and constrain it with the syntax
    lumiscale rateParam * * 1.0 [1.0,1.0]
  • This does not change the result of running combine on a card, but doing
    lumiscale rateParam * * 2.0 [2.0,2.0]
    gives us the exact value we would get by (naively) re-running the looper, scaling every event by 2
⚫ Also, the range can be explicitly set outside the card during the combine command invocation
  • With
    lumiscale rateParam * * 1.0 [1.0,1.0]
    in the card, doing
    combine -M ProfileLikelihood v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt --significance -t -1 --expectSignal=1 --setParameters lumiscale=2 --setParameterRanges lumiscale=2,2
    will give the same thing, but now we don't need to edit the card. Can just always keep the lumiscale 1 line in the card and modify the command later to scale.
  • Note that the parameter range needs to be fixed, or else lumiscale will be a freely floating nuisance and the significance will be off
⚫ And this can be restricted to a process like ttH:
    lumiscale rateParam * tth 1.0 [1.0,1.0]
  • Makes it easier to do the y_t scan now, for example (see the sketch below)
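A hedged sketch of how this could be used for a simple y_t-style scan without editing the card each time (the scan values and the card path are assumptions; the card is assumed to end with the tth-only rateParam line above):

    # Assumed card content (last line): lumiscale rateParam * tth 1.0 [1.0,1.0]
    # Fix the ttH rate to each scan value from the command line.
    for KTTH in 0.5 1.0 1.5 2.0; do
      combine -M ProfileLikelihood cards_with_tth_rateParam/card_tttt_srcr.txt \
          --significance -t -1 --expectSignal=1 \
          --setParameters lumiscale=$KTTH \
          --setParameterRanges lumiscale=$KTTH,$KTTH \
        | grep "Significance:"
    done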
Backup
Overview

⚫ Freya is working on tttt projections for the HL-LHC
  • AN: http://cms.cern.ch/iCMS/jsp/openfile.jsp?tp=draft&files=AN2018_209_v3.pdf
⚫ Normally, we re-loop on the trees with an overall scale factor on the event weights to make new root files → new card .txt → significance with lumi x scaled to y
⚫ Apparently you can do this with
    lumiscale rateParam * * 1.
  • at the bottom of the card (note: the top doesn't work!)
⚫ Working directory on UAF: ~namin/2018/fourtop/all/FTAnalysis/analysis/limits/extraptest
  • Taking 2016 MC/data cards scaled to 75.0 ifb using the re-loop method (v1.00_2016_75p0_v1_try2/card_tttt_srcr.txt) and to 150 ifb (v1.00_2016_150p0_v1_try2/card_tttt_srcr.txt), using MC fakes
  • Command: combine -M ProfileLikelihood v1.00_2016_150p0_v1_try2/card_tttt_srcr.txt --significance -t -1 --expectSignal=1
  • Expected significance of 2.025 σ with 75 ifb, 2.692 σ with 150 ifb
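Since the line has to go at the bottom, the edit is just an append; a minimal sketch (producing the "_freya" copy this way is my assumption, the rateParam syntax is the one above):

    # Copy the 75 ifb card and append the lumiscale rateParam at the END of it.
    cp v1.00_2016_75p0_v1_try2/card_tttt_srcr.txt v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt
    echo "lumiscale rateParam * * 1." >> v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt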
⚫ Adding the lumiscale line to the end of the text file and re-running limits should in principle give the exact same significance (because the default value is 1). However,
    combine -M ProfileLikelihood v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt --significance -t -1 --expectSignal=1
  • Expected significance of 1.970 σ with 75 ifb, 2.633 σ with 150 ifb
  • Numbers are 2.2-2.7% lower than the previous slide.
⚫ Can override the parameter with an extra flag like --setParameters lumiscale=1.066
  • Docs: https://cms-hcomb.gitbooks.io/combine/content/part2/settinguptheanalysis.html#beyond-simple-datacards
⚫ Turns out 1.066 is what I need to reproduce the original numbers from the previous slide (found by tuning by hand)
⚫ I don't understand. But OK. Let's say I need a ~7% "correction", so if I use 2*1.066=2.132 with the 75 ifb card, do I get close to the 150 ifb card? --setParameters lumiscale=2.132
  • results in 2.702 σ (only ~0.3% higher than the 150 ifb card)
  • Pretty close, especially given I had to apply a ~7% "correction" to lumiscale for some reason
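Writing the arithmetic out explicitly, a throwaway check of the quoted percentages (numbers taken from this slide and the previous one):

    # Quick check of the quoted percentages and the 2x cross-check.
    awk 'BEGIN{
      printf "75 ifb:  (2.025-1.970)/2.025 = %.1f%% lower\n", 100*(2.025-1.970)/2.025
      printf "150 ifb: (2.692-2.633)/2.692 = %.1f%% lower\n", 100*(2.692-2.633)/2.692
      printf "2*1.066 = %.3f; (2.702-2.692)/2.692 = %.2f%% higher\n", 2*1.066, 100*(2.702-2.692)/2.692
    }'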
⚫ OK, let me restrict the lumiscale rateParam to just one process (ttH) and keep the value of 1. So again, this should give the original number (2.025)
    lumiscale rateParam * tth 1.
  • → 1.761 σ with 75 ifb
  • Though, I guess if this worked, this would provide an easier way to scale ttH (for the y_t scan)
⚫ Ah, got it. The rateParam is a floating parameter (on top of r!) in the fit, so if we constrain it with the syntax
    lumiscale rateParam * * 1.0 [1.0,1.0]
  • we get back 2.025! And now with
    lumiscale rateParam * * 2.0 [2.0,2.0]
  • in the 75 ifb card, we get 2.692 σ, matching exactly the naively-scaled (via looper) 150 ifb card!
  • Also, the range can be explicitly set outside the card. With
    lumiscale rateParam * * 1.0 [1.0,1.0]
    in the card, doing
    combine -M ProfileLikelihood v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt --significance -t -1 --expectSignal=1 --setParameters lumiscale=2 --setParameterRanges lumiscale=2,2
    will also yield 2.692 σ, but now we don't need to edit the card. Can just always keep the lumiscale 1 line in the card and modify the command later to scale.