
Page 1: Efficiencies and backgrounds re-evaluation


Efficiencies and backgrounds re-evaluation

ANKARA CM 2/4/2009

D.Autiero, S.Dusini

Page 2: Efficiencies and backgrounds re-evaluation


The re-evaluation of the OPERA backgrounds and efficiencies has been requested by the scientific committees, who would like to know what we have learnt with one year of real data. For years we have been showing efficiencies and backgrounds estimated at the time of the proposal or during the year after its submission.

This work can be based on the following handles which were not available in the past:

Availability of real data and the possibility of measuring backgrounds and efficiencies directly on them

New MC production with the final geometry of the experiment and a detailed simulation of the events at the emulsion level, with the same reconstruction as for data

Possibility of validating the MC simulation with real data in order to have a credible knowledge of the variables and the effects of the cuts

Improvements in the analysis: a more sophisticated analysis including the merging of scanning data and electronic-detector data (many things at the proposal level were estimated almost at the generator level) and taking into account the correlations among the different steps. New analysis approaches with optimized cuts, aiming at maximizing the sensitivity of the experiment. Multivariable likelihood approaches (see the sketch below).
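As a rough illustration of the last point only, here is a minimal sketch of a multivariable likelihood discriminant. The variables, PDFs, and Gaussian stand-ins are hypothetical placeholders, not the actual OPERA analysis:

```python
from scipy.stats import norm

def likelihood_ratio(event, sig_pdfs, bkg_pdfs):
    """Naive product over the analysis variables of
    P(x_i | signal) / P(x_i | background); in practice the PDFs
    would be normalized MC histograms validated on data."""
    ratio = 1.0
    for x, sig_pdf, bkg_pdf in zip(event, sig_pdfs, bkg_pdfs):
        ratio *= sig_pdf(x) / max(bkg_pdf(x), 1e-12)
    return ratio

# Hypothetical usage with Gaussian stand-ins for two variables:
sig = [norm(1.0, 0.5).pdf, norm(2.0, 1.0).pdf]
bkg = [norm(0.0, 0.5).pdf, norm(0.5, 1.0).pdf]
print(likelihood_ratio([0.8, 1.5], sig, bkg))
```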

Page 3: Efficiencies and backgrounds re-evaluation


Going from the conservative cuts of the proposal to these last two points is possible only if we show that we have the variables used for the analysis under control (MC vs data), so that we can squeeze their discriminating power without creating artefacts and without believing in corners of the phase space that do not really exist in the real data.

In OPERA there is a factor 10 between the number of tau interactions in the bricks and the number of candidates selected at the end of the analysis. We should see how this could be optimized, given the experience from real data.

The re-evaluation implies:

New MC production with state of the art knowledge

Data ED and MC reconstruction with final version

Reconstruction of MC emulsion data with OpEmurec

Availability of emulsion data and merging with ED data by using OpEmurec

In particular there are requirements on the quality and the kind of samples of emulsion data which should be made available for the analysis

Goal: complete the re-evaluation work by the end of the year at the latest.

Page 4: Efficiencies and backgrounds re-evaluation


Data availability:

OpEmurec is almost in its final shape. Successful first reconstruction test down to the vertex; a dedicated meeting was held last week on the first steps (alignment). [Luca]

After Mitzunami we got a sample event of Japanese data. An interface was built in order to put it into the standard DB format, and this event has been re-reconstructed offline by Cristiano using SySal. A second event was provided last week to complete the testing. Tools have been provided to Nagoya in order to handle the publication of the bricks.

So we are also moving towards the full integration of the data from Japan. [Elisabetta]

It is a first-priority job to get OpEmurec interfaced also to the MC emulsion data (+ background from real data), in order to perform full MC studies in the same way as for real data. [Luca]

The new MC production was started at CCIN2P3 at the time of the last PC. Data reprocessing will follow. [Elisabetta + Stefano]

Page 5: Efficiencies and backgrounds re-evaluation


Bulk efficiencies:

Trigger
Target event selection
Brick finding
CS tagging
Geometrical efficiency
Vertex location
B2B connection
Long and short decays

Page 6: Efficiencies and backgrounds re-evaluation


Trigger efficiencies:

During 2008 a cut at 10 hits was left in the DAQ manager. It penalizes the tau→e QE events. The situation has to be improved for 2009 with a smarter cut based on pulse height, relaxing the 10-hit requirement.

A special run has been taken with a 4-hit threshold: a factor 2 larger data flow, random coincidences with the veto (affordable), timeouts of the DAQ manager (being investigated).

Re-evaluation with MC of the trigger efficiency, optimization of the cuts. (T. Brugiere)

Caveat: in the data there are spurious hits and cross-talk not reproduced by the MC; this can increase the real efficiency. Try to estimate this bias.

Checks on data (also useful for other aspects of this re-evaluation): check with a sample of rock muons that the number of p.e. is well reproduced (data/MC); check on these tracks the inefficiency, i.e. how often hits are missing. (Cecile, final results)

Implementation by Stefano, in OpFilter, of a package simulating on MC data the trigger conditions applied in the DAQ and removing the hits that do not satisfy them (see the sketch below).
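A minimal sketch of what such a trigger emulation could look like, not the actual OpFilter code; the hit structure, the pulse-height variable, and the cut values are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Hit:
    pe: float  # pulse height in photoelectrons (assumed attribute)

def passes_trigger(hits, min_hits=10, min_pe_sum=None):
    """Emulate a DAQ trigger condition on an MC event: either the
    2008-style cut at `min_hits` hits, or (if `min_pe_sum` is given)
    a relaxed hit count combined with a pulse-height cut, in the
    spirit of the smarter cut proposed for 2009."""
    if min_pe_sum is not None:
        return len(hits) >= 4 and sum(h.pe for h in hits) >= min_pe_sum
    return len(hits) >= min_hits

def filter_mc_events(events, **trigger_kwargs):
    """Keep only the MC events that would have fired the trigger
    (a simplification of removing the hits that fail the conditions)."""
    return [ev for ev in events if passes_trigger(ev, **trigger_kwargs)]
```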

Page 7: Efficiencies and backgrounds re-evaluation


Fiducial volume selection efficiencies: this efficiency was never taken into account so far (2007-2008).

The algorithm of Alessandro/Tiem is mature now and can be used for automatic selection on real data. It has a few-per-mille inefficiency on low-energy NC events with respect to visual scanning. These events are borderline and partially doubtful.

Re-evaluation of these efficiencies with MC samples, using the final version of the algorithm (OpCarac)

Show with data vs MC that we have the background from external interactions under control

Page 8: Efficiencies and backgrounds re-evaluation


Brick finding and CS tagging

Getting data from second bricks to complete the efficiency estimation from real data. At the moment the statistics of 2nd bricks is still poor (data from EU only).

Last data:

First bricks raw eff: 467/715 = 65.3%
Second bricks raw eff: 29/65 = 44.6%

Removal of interactions in dead materials: 715 - 3.8% = 688

1st brick corrected eff: 467/688 / (1 - 6.8%) = 72.8% (72% MC)

Correlation with the 2nd brick:
Unfound raw after the first brick: 100% - 65.3%
Unfound related to BF inefficiency (findable in the 2nd brick): 100% - 72.8%

Second brick corrected eff: 29/65 * (100 - 65.3) / (100 - 72.8) = 56.9%

1st brick BF eff = 72.8% ± 1.7%
2nd brick BF eff = 56.9% ± 6.1%

Total efficiency 1st + 2nd: 72.8% + (100% - 72.8%) * 56.9% = 88.3% ± 5% (80% MC)
The uncertainty on the 2nd bricks is still too large (the arithmetic is reproduced in the sketch below).
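A short sketch reproducing the efficiency arithmetic of this page; only the numbers quoted above are used, and the 6.8% correction is applied exactly as in the formula as written:

```python
# Counts from this page.
n_events, n_found_1st = 715, 467
n_2nd_scanned, n_2nd_found = 65, 29
f_dead = 0.038    # interactions in dead materials
f_corr = 0.068    # additional correction applied above (1 - 6.8%)

n_fiducial = n_events * (1 - f_dead)                  # 715 - 3.8% ~ 688
eff_1st_raw = n_found_1st / n_events                  # 65.3%
eff_1st = n_found_1st / n_fiducial / (1 - f_corr)     # 72.8%

# 2nd-brick efficiency, corrected for the fraction of unfound events
# that are genuinely findable in the 2nd brick:
eff_2nd = (n_2nd_found / n_2nd_scanned) * (1 - eff_1st_raw) / (1 - eff_1st)

eff_total = eff_1st + (1 - eff_1st) * eff_2nd         # ~88.3%
print(f"{eff_1st:.1%}  {eff_2nd:.1%}  {eff_total:.1%}")
```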

Page 9: Efficiencies and backgrounds re-evaluation


Comments and warnings:

1) Predictions were made some time ago for these events. With the latest version of the reconstruction, for about 1/3 of them (9 events) the brick is now predicted directly as the first brick; the reconstruction problems have been fixed.

2) For 8 events the second brick suffers from the underestimation of the tracking errors. This has to be fixed in order to predict the lateral probability correctly.

3) For 8 events the second brick was in another wall with large probability (30-40%); this looks higher (by about a factor 2) than the average prediction for second bricks. It is a fluctuation in this first sample.

4) Too few NC events

We need a larger sample to draw more solid conclusions.

We should get a large sample from the last couple of weeks of extractions

Page 10: Efficiencies and backgrounds re-evaluation


We should have a clearer view with the increase in statistics of the last weeks (gather all possible second brick results, also from Japan)

Re-evaluation with the latest version of the reconstruction on all events

A deeper comparison per event category is needed (QE, NC, CC, electromagnetic-like)

So far the algorithms have been kept untouched; the main work has concerned the debugging of the reconstruction. On the basis of these results it will be possible to produce a second version.

BF efficiency was optimized for tau events (the efficiency is about 5% larger than for standard numu NC or CC events). In order to check it we should look for events which are tau-like: CC QE events with low-energy muons, and events with dominant electromagnetic activity, like tau→e and tau→rho.

First priority: increase the efficiency as much as possible. There are ideas about that on the BF side, and also on the possibility of concentrating the efforts on a tau-signal-enriched sample. We should first advance with the analysis of the event sample and understand what goes wrong (reconstruction, 2nd bricks, …).

Page 11: Efficiencies and backgrounds re-evaluation


Re-evaluation campaign (brick finding + CS tagging part):

Integrate in the simulation the CS tagging, absent in the past: acceptance effect with respect to the total surface behind the bricks (7.4% uncovered area), plus CS base-track efficiencies measured from real data (implement also the 3/4 requirement; see the sketch below)

Complete efficiency evaluation from real data (convolution of BF + CS)

Evaluate the CS tagging efficiency from full MC (including the CS intrinsic efficiency as a function of the angle measured in real data, provided by Giovanni for base tracks)
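A hedged sketch of how the CS tagging efficiency could be composed from the geometric acceptance and the base-track efficiency, reading "3/4" as a 3-out-of-4 emulsion-layer requirement (an assumption); the per-layer efficiency value below is a placeholder for the angle-dependent efficiency measured from data:

```python
from math import comb

def p_base_track(p, k_min=3, n=4):
    """P(at least k_min of the n emulsion layers yield a microtrack),
    assuming independent layers with per-layer efficiency p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

acceptance = 1 - 0.074   # 7.4% of the area behind the bricks is uncovered
p_layer = 0.90           # hypothetical per-layer microtrack efficiency
eff_cs = acceptance * p_base_track(p_layer)
print(f"CS tagging efficiency ~ {eff_cs:.1%}")
```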

Page 12: Efficiencies and backgrounds re-evaluation


Better evaluation of the errors in the tracking, important for the probability maps

Re-evaluate the MC BF efficiency per category of events, including the interplay of the CS tagging efficiency, and compare to data per category: DIS, QE, NC, NC with e.m. component.

Possible tau enriched sample to maximize the efficiency

Estimate the possible bias of the CS tagging inefficiency (~25%) on the muon matching: are all muons found at the vertex, even if not followed by scan-back? Even a residual inefficiency of 2% could change the charm background a lot (see the sketch below).
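A back-of-the-envelope sketch of the last point: a charm CC event fakes a tau topology only if the primary muon is missed, so the surviving charm background scales linearly with the muon-matching inefficiency. The event count is an invented placeholder:

```python
# Hypothetical number of charm CC events feeding the decay search;
# the real number would come from the charm production rate and statistics.
n_charm_cc = 1000.0

for eps in (0.001, 0.02):  # residual muon-matching inefficiency
    fakes = n_charm_cc * eps
    print(f"inefficiency {eps:.1%} -> ~{fakes:.0f} charm events with a missed muon")
```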

Page 13: Efficiencies and backgrounds re-evaluation


Geometrical efficiency

At the proposal level it was simply assumed that 1 mm along the border was inefficient at the level of the emulsion scanning: a 3.6% loss.

Giovanni thinks that the dimension of the inefficient region is actually closer to 0.5 mm.

We also have to take into account the fact that the lead plates seem to have a different geometry with respect to the nominal one: 137.9 g/plate vs 142.57 g/plate (nominal), a loss of 3.3%. Where is the missing mass? The reduction in the surface of the plates (124.96 x 99.35 vs 125.5 x 100: -1.16%) does not justify the effect. In the MC production we also applied a thickness reduction of 20 microns. (Stefano)

We concluded at the last PC meeting that the thickness reduction of 20 microns was real (BAM and lead production presentations).

Update also the target mass and the number of interactions (in progress).

Evaluate with the full MC the real loss of efficiency for events at the borders (need to include the final lead geometry in the MC). Emulsion dimensions: 124.6 x 99.0 (see the cross-check below).
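A quick numerical cross-check of the missing-mass argument, assuming a nominal plate thickness of 1 mm (so that 20 microns corresponds to a 2% reduction); only the numbers quoted above are used:

```python
m_meas, m_nom = 137.9, 142.57          # g/plate
mass_deficit = 1 - m_meas / m_nom      # ~3.3% missing mass

surf_meas = 124.96 * 99.35             # measured plate surface, mm^2
surf_nom = 125.5 * 100.0               # nominal plate surface, mm^2
surf_deficit = 1 - surf_meas / surf_nom    # ~1.1%: not enough by itself

thick_deficit = 0.020 / 1.0            # 20 microns on an assumed 1 mm plate
combined = 1 - (1 - surf_deficit) * (1 - thick_deficit)  # ~3.1%

print(f"mass {mass_deficit:.1%}, surface {surf_deficit:.1%}, "
      f"surface + thickness {combined:.1%}")
```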

Page 14: Efficiencies and backgrounds re-evaluation


B2B connection

Now we should have experience from real data. The statistics is still poor: in Europe we have 32 events with frontal connection, and very few events with lateral connection.

Re-evaluate with the full MC

Increase the statistics with data measurements

Page 15: Efficiencies and backgrounds re-evaluation


Vertex localization (Giovanni)

This is another point where people expect some return of experience from real data. At the moment the statistics is poor and can provide just a range for the efficiency:

CC: 84-94%
NC: 71-92%

The lower limit of the range is computed assuming that the events still pending are not localized (see the sketch at the end of this page).

Improve the statistics with data

Launch the evaluation of the efficiency (quite relevant for NC-like events) with full MC, performing a detailed reconstruction

Study of the multiplicity distributions at the primary vertex

How to prove that we understand the efficiency for events with electromagnetic showers?

How to prove that we understand the efficiency for secondary vertices? Try a K0s sample with one track detected in the CS; the rates are well known.
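A tiny sketch of how the efficiency range above is formed: the lower bound counts the pending events as failures, the upper bound as successes. The counts are hypothetical, chosen only to reproduce a CC-like range:

```python
def efficiency_range(located, pending, total):
    """Bracket the location efficiency given unresolved (pending) events."""
    return located / total, (located + pending) / total

# Hypothetical counts, not the real sample.
lo, hi = efficiency_range(located=84, pending=10, total=100)
print(f"location efficiency between {lo:.0%} and {hi:.0%}")  # 84% and 94%
```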

Page 16: Efficiencies and backgrounds re-evaluation


How to prove that we understand the efficiency for NC events? Fake NC samples, obtained by removing the muon.

How to prove that we understand the efficiency for events with electromagnetic showers? Tagging in the CS of known showers from photon conversions.

How to prove that we understand the efficiency for secondary vertices? Try a K0s sample with one track detected in the CS; the rates are well known.

Page 17: Efficiencies and backgrounds re-evaluation


Track slopes at the primary vertex for CC events. File provided by Giovanni with 196 vertices, 166 CC.

[Plots: slope zx and slope zy distributions, with the beam angle and tracks at large angle indicated.]

Page 18: Efficiencies and backgrounds re-evaluation


Track multiplicity at the primary vertex for numu CC

Single-track events: 24%
Expected contribution from QE+RES: 10%
To be quantified: losses of tracks for DIS (efficiency at large angles, range for soft particles)

Page 19: Efficiencies and backgrounds re-evaluation


Muon slope in the xz plane

[Plots: data vs MC comparison.]

Page 20: Efficiencies and backgrounds re-evaluation


Page 21: Efficiencies and backgrounds re-evaluation


Multiplicity at the primary vertex

Data vs MC.

MC: QE+RES (10%) not included, but taken into account in the normalization. Protons < 300 MeV/c (mainly from nuclear rescattering) removed.
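A small sketch of the normalization convention just stated: the MC sample contains only DIS, so its histogram is scaled to the data total minus the expected 10% QE+RES contribution. The bin contents are invented placeholders:

```python
import numpy as np

# Placeholder multiplicity histograms (bin i = i+1 tracks at the vertex).
data_counts = np.array([48.0, 60.0, 45.0, 25.0, 12.0, 6.0])
mc_dis_counts = np.array([30.0, 70.0, 55.0, 30.0, 14.0, 7.0])  # DIS-only MC

f_qe_res = 0.10  # expected QE+RES fraction, absent from the MC sample
scale = data_counts.sum() * (1.0 - f_qe_res) / mc_dis_counts.sum()
mc_normalized = mc_dis_counts * scale
print(mc_normalized.round(1))
```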

Page 22: Efficiencies and backgrounds re-evaluation


Page 23: Efficiencies and backgrounds re-evaluation


All tracks: slope in the zx plane

[Plots: data vs MC comparison; angular efficiency.]

Page 24: Efficiencies and backgrounds re-evaluation

Distributions already spoiled by the deficit of tracks seen at the multiplicity level.