Robust exploration in linear quadratic reinforcement learning

Jack Umenberger
Department of Information Technology
Uppsala University, Sweden
[email protected]

Mina Ferizbegovic
School of Electrical Engineering and Computer Science
KTH, Sweden
[email protected]

Thomas B. Schön
Department of Information Technology
Uppsala University, Sweden
[email protected]

Håkan Hjalmarsson
School of Electrical Engineering and Computer Science
KTH, Sweden
[email protected]

33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada.

Abstract

This paper concerns the problem of learning control policies for an unknown linear dynamical system to minimize a quadratic cost function. We present a method, based on convex optimization, that accomplishes this task robustly: i.e., we minimize the worst-case cost, accounting for system uncertainty given the observed data. The method balances exploitation and exploration, exciting the system in such a way as to reduce uncertainty in the model parameters to which the worst-case cost is most sensitive. Numerical simulations and application to a hardware-in-the-loop servo-mechanism demonstrate the approach, with appreciable performance and robustness gains over alternative methods observed in both.

1 Introduction

Learning to make decisions in an uncertain and dynamic environment is a task of fundamental importance in a number of domains. Though it has been the subject of intense research activity since the formulation of the ‘dual control problem’ in the 1960s [17], the recent success of reinforcement learning (RL), particularly in games [33, 37], has inspired a resurgence of interest in the topic. Problems of this nature require decisions to be made with respect to two objectives. First, there is a goal to be achieved, typically quantified as a reward function to be maximized. Second, due to the inherent uncertainty, there is a need to gather information about the environment, often referred to as ‘learning’ via ‘exploration’. These two objectives are often competing, a fact known as the exploration/exploitation trade-off in RL, and as the ‘dual effect’ (of decision) in control.

It is important to recognize that the second objective (exploration) matters only insofar as it facilitates the first (maximizing reward); there is no intrinsic value in reducing uncertainty. As a consequence, exploration should be targeted or application specific; it should not excite the system arbitrarily, but rather in such a way that the information gathered is useful for achieving the goal. Furthermore, in many real-world applications, it is desirable that exploration does not compromise the safe and reliable operation of the system.

This paper is concerned with control of uncertain linear dynamical systems, with the goal of maximizing (minimizing) rewards (costs) that are a quadratic function of states and actions; cf. §2 for a detailed problem formulation. We derive methods to synthesize control policies that balance the
exploration/exploitation trade-off by performing robust, targeted exploration: robust in the sense that we optimize for worst-case performance given uncertainty in our knowledge of the system, and targeted in the sense that the policy excites the system so as to reduce uncertainty in a way that specifically minimizes the worst-case cost. To this end, this paper makes the following specific contributions. We derive a high-probability bound on the spectral norm of the system parameter estimation error, in a form that is applicable to both robust control synthesis and design of targeted exploration; cf. §3. We also derive a convex approximation of the worst-case (w.r.t. parameter uncertainty) infinite-horizon linear quadratic regulator (LQR) problem; cf. §4.2. We then combine these two developments to present an approximate solution, via convex semidefinite programming (SDP), to the problem of minimizing the worst-case quadratic costs for an uncertain linear dynamical system; cf. §4. For brevity, we will refer to this as a ‘robust reinforcement learning’ (RRL) problem.
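As a minimal sketch of this objective (using the notation defined in §2 below, and with $\Theta$ standing in for the high-probability uncertainty set constructed in §3; the symbols here are illustrative shorthand rather than the paper's formal statement), the RRL problem can be summarized as the min-max program

$$\min_{\phi} \; \max_{(A,B)\in\Theta} \; \mathbb{E}\!\left[\sum_{t=1}^{T} x_t^\top Q x_t + u_t^\top R u_t\right] \quad \text{s.t.} \quad x_{t+1} = A x_t + B u_t + w_t, \;\; u_t = \phi(\{x_{1:t}, u_{1:t-1}\}),$$

where the expectation is over the process noise $w_t$; §4.2 treats a convex approximation of the inner (worst-case) problem in its infinite-horizon form.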

1.1 Related work

Inspired, perhaps in part, by the success of RL in games [33, 37], there has been a flurry of recent research activity in the analysis and design of RL methods for linear dynamical systems with quadratic rewards. Works such as [1, 23, 15] employ the so-called ‘optimism in the face of uncertainty’ (OFU) principle, which selects control actions assuming that the true system behaves as the ‘best-case’ model in the uncertain set. This leads to optimal regret but requires the solution of intractable non-convex optimization problems. Alternatively, the works of [35, 3, 4] employ Thompson sampling, which optimizes the control action for a system drawn randomly from the posterior distribution over the set of uncertain models, given data. The work of [30] eschews uncertainty quantification, and demonstrates that so-called certainty equivalent control attains optimal regret. There has also been considerable interest in ‘model-free’ methods [39] for direct policy optimization [16, 29], as well as partially model-free methods based on spectral filtering [22, 21]. Unlike the present paper, none of the works above consider robustness, which is essential for implementation on physical systems. Robustness is studied in the so-called ‘coarse-ID’ family of methods, cf. [12, 13, 11]. In [11], sample complexity bounds are derived for LQR with unknown linear dynamics. This approach is extended to adaptive LQR in [12]; however, unlike the present paper, the policies are not optimized for exploration and exploitation jointly; exploration is effectively random. Also of relevance is the field of so-called ‘safe RL’ [19], in which one seeks to respect certain safety constraints during exploration and/or policy optimization [20, 2], as well as ‘risk-sensitive RL’, in which the search for a policy also considers the variance of the reward [32, 14]. Other works seek to incorporate notions of robustness commonly encountered in control theory, e.g. stability [34, 7, 10]. In closing, we mention that related problems of simultaneous learning and control have a long history in control theory, beginning with the study of ‘dual control’ [17, 18] in the 1960s. Many of these formulations relied on a dynamic programming (DP) solution and, as such, were applicable only in special cases [6, 8]. Nevertheless, these early efforts [9] established the importance of balancing ‘probing’ (exploration) with ‘caution’ (robustness). For subsequent developments from the field of control theory, cf. e.g. [24, 25, 5].

2 Problem statement

In this section we describe in detail the problem addressed in this paper. Notation is as follows: $A^\top$ denotes the transpose of a matrix $A$. $x_{1:n}$ is shorthand for the sequence $\{x_t\}_{t=1}^{n}$. $\lambda_{\max}(A)$ denotes the maximum eigenvalue of a matrix $A$. $\otimes$ denotes the Kronecker product. $\mathrm{vec}(A)$ stacks the columns of $A$ to form a vector. $\mathbb{S}^n_+$ ($\mathbb{S}^n_{++}$) denotes the cone(s) of $n \times n$ symmetric positive semidefinite (definite) matrices. w.p. means ‘with probability’. $\chi^2_n(p)$ denotes the value of the Chi-squared distribution with $n$ degrees of freedom and probability $p$. blkdiag is the block diagonal operator.

Dynamics and cost function We are concerned with control of linear time-invariant systems

$x_{t+1} = A x_t + B u_t + w_t, \quad w_t \sim \mathcal{N}(0, \sigma_w^2 I_{n_x}), \quad x_0 = 0, \qquad (1)$

where $x_t \in \mathbb{R}^{n_x}$, $u_t \in \mathbb{R}^{n_u}$ and $w_t \in \mathbb{R}^{n_x}$ denote the state (which is assumed to be observed directly, without noise), input and process noise, respectively, at time $t$. The objective is to design a feedback control policy $u_t = \phi(\{x_{1:t}, u_{1:t-1}\})$ so as to minimize the cost function $\sum_{t=1}^{T} c(x_t, u_t)$, where $c(x_t, u_t) = x_t^\top Q x_t + u_t^\top R u_t$ for user-specified positive semidefinite matrices $Q$ and $R$. When the parameters of the true system, denoted $\{A_{\mathrm{tr}}, B_{\mathrm{tr}}\}$, are known this is exactly the finite-horizon LQR problem, the optimal solution of which is well-known. We assume that $\{A_{\mathrm{tr}}, B_{\mathrm{tr}}\}$ are unknown.
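To make the nominal baseline concrete: when $\{A_{\mathrm{tr}}, B_{\mathrm{tr}}\}$ are known, the finite-horizon LQR gains follow from the standard backward Riccati recursion. The short Python sketch below illustrates this known-parameter case only; the system matrices, horizon, and noise level are assumed for illustration and are not taken from the paper, and the robust treatment of unknown parameters is the subject of §3 and §4.

# Minimal sketch of the nominal finite-horizon LQR problem of Section 2,
# assuming (A_tr, B_tr) are known. All numerical values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_x, n_u, T, sigma_w = 2, 1, 50, 0.1

A_tr = np.array([[1.0, 0.1],
                 [0.0, 1.0]])        # assumed example dynamics
B_tr = np.array([[0.0],
                 [0.1]])
Q = np.eye(n_x)                      # state cost weight
R = 0.1 * np.eye(n_u)                # input cost weight

# Backward Riccati recursion: gains K_t such that u_t = K_t x_t is optimal
# for the finite-horizon LQR problem with known parameters.
P = Q.copy()
gains = []
for _ in range(T):
    K = -np.linalg.solve(R + B_tr.T @ P @ B_tr, B_tr.T @ P @ A_tr)
    P = Q + A_tr.T @ P @ (A_tr + B_tr @ K)
    gains.append(K)
gains.reverse()                      # order gains from t = 0 to T-1

# Roll out the closed loop under Gaussian process noise and accumulate the cost.
x = np.zeros(n_x)
cost = 0.0
for t in range(T):
    u = gains[t] @ x
    cost += x @ Q @ x + u @ R @ u
    x = A_tr @ x + B_tr @ u + sigma_w * rng.standard_normal(n_x)
print(f"accumulated quadratic cost over horizon T={T}: {cost:.3f}")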

[Figure: illustration of the episodic interaction with the system, plotting accumulated cost and gathered information against time t. The horizon is divided into epochs of lengths T1, T2, T3, ..., TN, beginning at times 0, t1, t2, t3, ..., tN-1, with a fixed feedback policy applied within each epoch (apply K1, apply K2, ...).]
7tnfUe9573fN7894/ev/s/av/7/5/N29sfu6on92oZH7Xa302v/wfsdvA/Q==</latexit> apply KN<latexit sha1_base64="d6k+bvvjhTQ9YJyir/wosYKQFHw=">AAAkUnicpVpLc9y4ER7v5rFRNomdHHNhZaQq22tpJeWQlKu2ai05a60tu2zLlpV4rCkMiZlBxJcIcB6m+VNyTf5QLvkrOaUbIGfIblrZtabKHg6+rxtAA93oBjVKQ6XN7u5/bnz2+U9++rOff/GLjV9++atf/+bmrd+e6iTPfPnaT8IkOxsJLUMVy9dGmVCepZkU0SiUb0YXh4i/mclMqyR+ZZapfBeJSazGyhcGmoY3b4k0DZfe5iBNQuUvh882hzf7uzu79uPxh73qod+rPs+Ht/p3BkHi55GMjR8Krd/u7abmXSEyo/xQlhuDXMtU+BdiIt/CYywiqd8VduyltwUtgTdOMvgXG8+2NiUKEWm9jEbAjISZaophYxf2NjfjP78rVJzmRsa+62ich55JPDSEF6hM+gbmHijhZwrG6vlTkQnfgLk2Nja2vAOhodFp3tr+wZ+NQSznfhJFIg6KQTbDpSmLwVTDyOTdYnsgskwsYaIw1nJg4Y+AZVtVClpStKoICRItwQjF+cDIhcnTAn4QAmyJEKRxLqNR8ZLCYODy7d67FeGkPC/6e5SVxFKDkqoT+z0aF0ArN8hAE20S6K8YyFSrMImppkgsHMHvGAlIdw1mWKslnWl/mmfV3HwRAp2qNGKkQvVegE8AbySyFpnqi5MMjLEP/YdybAanMjP9vUGmJlP3Y1j092kXKAMGtlK16melVXAbx34PZZyOO52isI1DJr8WpUJytl7Mv8BgtlroTGTrZcIfRNrhlfgpEzfgA3KtwDB5R7BLtOZ4Aw9G2yaCZ03WiuwvwhiFF22S/TamqAEq4CfxbM22v+jo0tC8bww/kRBHoYUaAZw/e9+aBTRVa1YvOLV7JuKLlohtYDJtoXCE5ob/0WpEn4WyCiLbVk3i9TT0JGY7NUKXhmjvU++y+xwBeUkhNLXOfSYykRXARWKUiZ0QGcFUxeOG6x2Vw2IATWZJrT0184QQmRNlRNvLslsxdQVfxAlaSs6AdKliZe4NUvzi3DTxIU6+Yu2LFE4dsABdHwCcs5yoSSSoWCCMaIz3IYtySSDDBuEpX0FkaGmA9GoqDesBZhbh0SFM8YC5zho7oFge+xCpShhSV3vgl/XjIPC1/ggJ52ZUGMiVvivZszq2tsnEZFLE4LTaOLJ3yKIvpBG4SnDQPKMYPKHgY6YUOyqGqzjjMzsHvlERrG/xqsWitCxhtIyztGYs6L9kzjHKgwku7eMVDbYT07awq4GGW7BVXEE5EzNZrFN75CcpCziRau5LtnMAHzVwtnsA9xv4YQceNPDnHXjTj7/rwCcN/FEHPm0FCzrBKRrmqGs3jhH5rguZIPKoC/EROay3KzeF3XQnQ3q4ARQuRYrQfjcU5Jh0nVy1Mf1ktqjDy7BYLOwuwtzzqdL+zqdnndbB6s25Y3aoiQM5hkBf3P+GBrxLQMrim/ssfxxriZvihGWe3e1SNQ51/MGcGQKXhMqktAlaxgxj08f6mA+5FwZgIB9zyEwGkNx7+DvXWOfUNUST/XcoCix7FOby/9MjFQtLh2oJqhrRKcHHA/A6O7Ckxjgxn/5B42QqVoP/QSpw7EzFakIfU4F77mSpjYx+XK1Ds3d9VSY+hXqtxbCuveUdCXAYk8RKxJ42+Xj86Rs/hl0dkG6/jyUWTSUcwCTVVTUS0e11GaFrY5ZYXLK9Z5OM+ribYr1CK7UWIYKDmqmwp/fl+V12uhismd5jOlYXTywxhAQvnmDFN5AhLQUBtSrK4piFzciH8lFE6FkTAY2ryAQI97BoKUJw4gfhZIc6/Wi8rtBwHb+H/MzWi4SXqarDgRE5RfMALJyfr+uDtGMMYpEmcxsl0JavGQEMHcNhtFvaHfxQaZOpUY53G9eo2SEe2ZMb1pEubKaSrAJ4zSsdyg7kVJsIEh/c+MwKcpbaA8alr9ZcH1j6DEloxYKnFo0qq3JSVLb84Hi2gCOmT1KDtr/LkjmbU7qsjycdSyu2ZGLLptiSZ0LLwFpzjZMMqIEueD2eR0ubDNLWS+vQ5NjAWkD5eO6mNrSgPa4RTfIIN+krmmVIoe3FUiOUvebTHquobPoIm1nlHHG9oi4agpFVpN7bC7qvIVk2WRJ6nz6HucSqtCze0O7HkKh3ZNMu3zllWwaOEFs2BDLDKyu7PQhDLJr5Kom1o3lZHAzntIqB1kcl5QqTNcoeomeFHXBsvgJRM+nLai0eDJs3G3Q4lnJwJcX2gd+ERToLIQSv+wrZGTFCwkGbwFIKOL+xrnpO41AFuZFUP64czioOuzJzi4NOl8Wvmj7lQjCgmXaW5Knrzx4S7uYDr0c67zksU4zqsQ2LBwfODx5e13XBb8IkN5pHD0AUZI7UDtgM7OK2fbrDLigq2Il3xCnIFV38ZaXrIg1zEF0Mi69Y+APHqrBthuVQw0NdcC93Jjl+8fLHZWdbnr1eV7EHxvXixEhN0oVVTd3enAk4uoOsNz2GrJFmMA5+QY3kml+ygjUxInSFM7t/Abu51w94sk5VScedilSSLHfFr8PrExYM6plBuhsmQPGwgcw+jyrWENcvotdiCFpTrM1A8hOfUypV53hXOaYKq+7O8YWEhwQVyNiXXiYnLhmnQ4zWIvUYQTLyPZGmWbJQkT0niEEvUnx9UHxAKr4RyaDWspZ84gHuWUdD46Jquq6uqD3jiRakNzM8WNny2XliTRliaGGQmxr6jb3jquxiF/lwKuKJ9JKxB3mNwgt6TXudufH8lSqGYzF00DHNgEOolxz0N7YLxRLfOrETHRKY+uw645f7kdP2Ufiiwp6woHARXkJvT9ZR/pInY+D/0PcZy6qgfbRYJUZnHXcH8cy+RWkUClDEmcSfCm2Uf51CLm+ks66KSRWbuN2MPAfTIkphIRtJXXW56VZ+5Wn0Vs23x0VnITFXeioy035B8qY8L7b3ytv9vXv9fXpTjxJAxLNzSg9PxIIEt2zM0nFI1Vs3vu30OkrdeWwt4hg0x9CMYUfZvL/e8p5KM02Ca1QoNhHH8UwLmy6CX/JtWTPgEcp/xpj7NWGeZNps+0JLFt/UQkJk2zwa7n99NHRT2OziKJt4WW0Q5mUWQ8TZZm+O/JUrWebhMXvFmGe6RiG8wS6E7rv607w3zTNwkaERRJhOxZAuZiptzHZgWrprt5Pjk+tcf+B64yVv9XjlxW82ri9RR2P+RjZqok/Z4jbRZ+z+tIkec6ssmji/dW6iLMlfNlFWys2b6JzXj+0qsS0aVV7TJdZUi1qoXzZxyXPfJsxuqy+aKI/gcKotbJKNp9uQWQual2uYV+Qo3hr9FXo6eB3FdCB9G6FmdMfUwCl9ITp2GcSjIX/dXqU/GCS7ULeXLcjNKrQrcdYXzaHm1ZIljRiJxAVbOvrnd1cegw3WJ9sDstrqd8SaF1VIsG9N1pSvAj4vaI4TpWW
b6JrY8QyQzmysWHPv25YOtWPGHDum+wxv9vfon9Lwh9P9nb0/7uy/2O9/e1D9mc0Xvd/3/tC73dvr/an3be+o97z3uuf35r1/9P7Z+1f/3/3/bt7Y/NxRP7tRyfyu1/psfvk/nkPBGQ==</latexit> Figure 1: Cartoon depiction of the

problem addressed in this paper.The goal is to design N policies,{Ki}N

i=1, so as to minimize theworst-case cost (blue area) over thetime horizon [0, T ]; cf. §2.

Modeling and data As $\{A_{\mathrm{tr}}, B_{\mathrm{tr}}\}$ are unknown, all knowledge about the true system dynamics must be inferred from observed data, $\mathcal{D}_n := \{x_t, u_t\}_{t=1}^n$. We assume that $\sigma_w$ is known, or has been estimated, and that we have access to initial data, denoted (with slight notational abuse) $\mathcal{D}_0$, obtained, e.g., during a preliminary experiment. For the model (1), parameter uncertainty can be quantified as:

Proposition 2.1. Given observed data $\mathcal{D}_n$ from (1), and a uniform prior over the parameters $\theta = \mathrm{vec}([A\ B])$, i.e., $p(\theta) \propto 1$, the posterior distribution $p(\theta|\mathcal{D}_n)$ is given by $\mathcal{N}(\mu_\theta, \Sigma_\theta)$, where
$$\mu_\theta = \mathrm{vec}\big([\hat{A}\ \hat{B}]\big) = \arg\min_{\theta \in \mathbb{R}^{n_x^2 + n_x n_u}} \sum_{t=1}^{n-1} \big| x_{t+1} - [x_t^\top\ u_t^\top] \otimes I_{n_x}\, \theta \big|^2,$$
i.e., the ordinary least squares estimator, and $\Sigma_\theta^{-1} = \frac{1}{\sigma_w^2} \sum_{t=1}^{n-1} \begin{bmatrix} x_t \\ u_t \end{bmatrix} \begin{bmatrix} x_t \\ u_t \end{bmatrix}^\top \otimes I_{n_x}$.

Proof: cf. §A.1.1. The uniform prior, $p(\theta) \propto 1$, sometimes called an improper prior, is used as an uninformative prior, signifying that we have no prior knowledge of $\theta$, i.e., all values are equally likely. Based on Proposition 2.1 we can define a high-probability credibility region by:
$$\Theta_e(\mathcal{D}_n) := \{\theta : (\theta - \mu_\theta)^\top \Sigma_\theta^{-1} (\theta - \mu_\theta) \le c_\delta\}, \qquad (2)$$
where $c_\delta = \chi^2_{n_x^2 + n_x n_u}(\delta)$ for $0 < \delta < 1$. Then $\theta_{\mathrm{tr}} = \mathrm{vec}([A_{\mathrm{tr}}\ B_{\mathrm{tr}}]) \in \Theta_e(\mathcal{D}_n)$ w.p. $1 - \delta$.
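To make Proposition 2.1 concrete, the following is a minimal numerical sketch of the ordinary least squares estimate and the posterior precision entering (2); the function and variable names are our own (not the authors' code), it assumes a single trajectory stored row-wise, and it interprets $\chi^2_{n_x^2+n_x n_u}(\delta)$ as the upper $\delta$-quantile.

```python
# Minimal sketch of Proposition 2.1 and the credibility region (2).
# Assumptions (not from the paper): X holds x_1..x_n row-wise, U holds u_1..u_n row-wise.
import numpy as np
from scipy.stats import chi2

def ols_posterior(X, U, sigma_w):
    nx = X.shape[1]
    Z = np.hstack([X[:-1], U[:-1]])             # regressors z_t = [x_t; u_t], t = 1..n-1
    Y = X[1:]                                   # targets x_{t+1}
    AB_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T       # [A_hat B_hat], shape (nx, nx+nu)
    Sigma_theta_inv = np.kron(Z.T @ Z, np.eye(nx)) / sigma_w**2   # posterior precision
    return AB_hat, Sigma_theta_inv

def credibility_threshold(nx, nu, delta):
    # c_delta in (2); chi^2_{nx^2 + nx*nu}(delta) read as the upper delta-quantile (assumption)
    return chi2.ppf(1.0 - delta, df=nx**2 + nx * nu)
```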

Policies Though not necessarily optimal, we will restrict our attention to static-gain policies of the form $u_t = K x_t + \Sigma^{1/2} e_t$, where $e_t \sim \mathcal{N}(0, I)$ represents random excitation for the purpose of learning. Static-gain policies are popular in practice, due to simplicity of synthesis and implementation [12, 13, 11], and encompass many common control strategies, e.g., proportional-derivative (PD) control. A policy comprises $K \in \mathbb{R}^{n_u \times n_x}$ and $\Sigma \in \mathbb{S}^{n_u}_+$, and is denoted $\mathcal{K} = \{K, \Sigma\}$. Let $\{t_i\}_{i=0}^N \in \mathbb{N}$, with $0 = t_0 \le t_1 \le \cdots \le t_N = T$, partition the time horizon $T$ into $N$ intervals. The $i$th interval is of length $T_i := t_i - t_{i-1}$. We will then design $N$ policies, $\{\mathcal{K}_i\}_{i=1}^N$, such that $\mathcal{K}_i = \{K_i, \Sigma_i\}$ is deployed during the $i$th interval, $t \in [t_{i-1}, t_i]$. For convenience, we define the function $\mathcal{I} : \mathbb{R}_+ \mapsto \mathbb{N}$ given by $\mathcal{I}(t) := \arg\min_{i \in \mathbb{N}} \{i : t \le t_i\}$, which maps time $t$ to the index $i = \mathcal{I}(t)$ of the policy to be deployed. We also make use of the notation $u_t = \mathcal{K}(x_t)$ as shorthand for $u_t = K x_t + \Sigma^{1/2} e_t$.

Worst-case dynamics We are now in a position to define the optimization problem that we wish to solve in this paper. In the absence of knowledge of the true dynamics, $\{A_{\mathrm{tr}}, B_{\mathrm{tr}}\}$, given initial data $\mathcal{D}_0$, we wish to find a sequence of policies $\{\mathcal{K}_i\}_{i=1}^N$ that minimize the expected cost $\sum_{t=1}^T c(x_t, u_t)$, assuming that, at time $t$, the system evolves according to the worst-case dynamics within the high-probability credibility region $\Theta_e(\mathcal{D}_t)$, i.e.,
$$\min_{\{\mathcal{K}_i\}_{i=1}^N} \mathbb{E}\left[ \sum_{t=0}^{T} \sup_{\{A_t, B_t\} \in \Theta_e(\mathcal{D}_t)} c(x_t, u_t) \right], \quad \text{s.t. } x_{t+1} = A_t x_t + B_t u_t + w_t,\ u_t = \mathcal{K}_{\mathcal{I}(t)}(x_t), \qquad (3)$$
where the expectation is w.r.t. $w_t \sim \mathcal{N}(0, \sigma_w^2 I_{n_x})$ and $e_t \sim \mathcal{N}(0, I_{n_u})$. We choose to optimize for the worst-case dynamics so as to bound, with high probability, the cost of applying the policies to the unknown true system. In principle, problems such as (3) can be solved via dynamic programming (DP) [17]. However, such DP-based solutions require gridding to obtain finite state-action spaces, and are hence computationally intractable for systems of even modest dimension [6]; cf. also [31, §IV] for a discussion of RL methods for finite state-action spaces. In what follows, we will present an approximate solution to this problem, which we refer to as a 'robust reinforcement learning' (RRL) problem, that retains the continuous state-action space formulation and is based on convex optimization. To facilitate such a solution, we require a refined means of quantifying system uncertainty, which we present next.


3 Modeling uncertainty for robust control

In this paper, we adopt a model-based approach to control, in which quantifying uncertainty in the estimates of the system dynamics is of central importance. From Proposition 2.1 the posterior distribution over parameters is Gaussian, which allows us to construct an 'ellipsoidal' credibility region $\Theta_e$, centered about the ordinary least squares estimates of the model parameters, as in (2).

To allow for an exact convex formulation of the control problem involving the worst-case dynamics, cf. §4.2, it is desirable to work with a credibility region that bounds uncertainty in terms of the spectral properties of the parameter error matrix $[\hat{A} - A_{\mathrm{tr}},\ \hat{B} - B_{\mathrm{tr}}]$, where $\{\hat{A}, \hat{B}\}$ are the ordinary least squares estimates, i.e., $\mathrm{vec}\big([\hat{A}\ \hat{B}]\big) = \mu_\theta$, cf. Proposition 2.1. To this end, we will work with models of the form $\mathcal{M}(\mathcal{D}) := \{\hat{A}, \hat{B}, D\}$, where $D \in \mathbb{S}^{n_x + n_u}$ specifies the following region, in parameter space, centered about $\{\hat{A}, \hat{B}\}$:
$$\Theta_m(\mathcal{M}) := \{A, B : X^\top D X \preceq I,\ X = [A - \hat{A},\ B - \hat{B}]^\top\}. \qquad (4)$$

The following lemma, cf. §A.1.2 for proof, suggests a specific means of constructing $D$, so as to ensure that $\Theta_m$ defines a high-probability credibility region:

Lemma 3.1. Given data $\mathcal{D}_n$ from (1), and $0 < \delta < 1$, let $D = \frac{1}{\sigma_w^2 c_\delta} \sum_{t=1}^{n-1} \begin{bmatrix} x_t \\ u_t \end{bmatrix} \begin{bmatrix} x_t \\ u_t \end{bmatrix}^\top$, with $c_\delta = \chi^2_{n_x^2 + n_x n_u}(\delta)$. Then $[A_{\mathrm{tr}},\ B_{\mathrm{tr}}] \in \Theta_m(\mathcal{M})$ w.p. $1 - \delta$.
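As an illustration of Lemma 3.1, here is a minimal sketch (our own names and data layout, not the authors' code) of how the 'spectral' uncertainty matrix $D$ could be assembled from a single trajectory:

```python
# Sketch of Lemma 3.1: D = (1/(sigma_w^2 c_delta)) * sum_t [x_t; u_t][x_t; u_t]^T.
# Assumption: X, U hold the trajectory row-wise; chi^2(delta) read as the upper delta-quantile.
import numpy as np
from scipy.stats import chi2

def spectral_uncertainty(X, U, sigma_w, delta):
    nx, nu = X.shape[1], U.shape[1]
    Z = np.hstack([X[:-1], U[:-1]])                       # z_t = [x_t; u_t], t = 1..n-1
    c_delta = chi2.ppf(1.0 - delta, df=nx**2 + nx * nu)
    return (Z.T @ Z) / (sigma_w**2 * c_delta)             # D defining Theta_m(M), cf. (4)
```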

For convenience, we will make use of the following shorthand notation: $\mathcal{M}(\mathcal{D}_{t_i}) = \{\hat{A}_i, \hat{B}_i, D_i\}$.

Credibility regions of the form (4), i.e., bounds on the spectral properties of the estimation error, have appeared in recent works on data-driven and adaptive control, cf. e.g., [11, Proposition 2.4], which makes use of results from high-dimensional statistics [40]. The construction in [11, Proposition 2.4] requires $\{x_{t+1}, x_t, u_t\}$ to be independent, and as such is not directly applicable to time series data without subsampling to attain uncorrelated samples (though more complicated extensions to circumvent this limitation have been suggested [38]). Lemma 3.1 is directly applicable to correlated time series data, and provides a credibility region that is well suited to the RRL problem, cf. §4.3.

4 Convex approximation to robust reinforcement learning problem

Equipped with the high-probability bound on the spectral properties of the parameter estimation error presented in Lemma 3.1, we now proceed with the main contribution of this paper: a convex approximation to the 'robust reinforcement learning' (RRL) problem in (3).

4.1 Steady-state approximation of cost

In pursuit of a more tractable formulation, we first introduce the following approximation of (3):
$$\sum_{i=1}^N \sup_{\{A,B\} \in \Theta_m(\mathcal{M}(\mathcal{D}_{t_i}))} \mathbb{E}\left[ \sum_{t=t_{i-1}}^{t_i} c(x_t, u_t) \right], \quad \text{s.t. } x_{t+1} = A x_t + B u_t + w_t,\ u_t = \mathcal{K}_i(x_t). \qquad (5)$$

Observe that (5) has introduced two approximations to (3). First, in (5) we only update the 'worst-case' model at the beginning of each epoch, when we deploy a new policy, rather than at each time step as in (3). This introduces some conservatism, as model uncertainty will generally decrease as more data is collected, but results in a simpler control synthesis problem. Second, we select the worst-case model from the 'spectral' credibility region $\Theta_m$ as defined in (4), rather than the 'ellipsoidal' region $\Theta_e$ defined in (2). Again, this introduces some conservatism, as $\Theta_e \subseteq \Theta_m$, cf. §A.1.2, but permits convex optimization of the worst-case cost, cf. §4.2. For convenience, we denote
$$J_\tau(x_1, \mathcal{K}, \Theta_m(\mathcal{M})) := \sup_{\{A,B\} \in \Theta_m(\mathcal{M})} \sum_{t=1}^{\tau} c(x_t, u_t), \quad \text{s.t. } x_{t+1} = A x_t + B u_t + w_t,\ u_t = \mathcal{K}(x_t).$$


Next, we approximate the cost between epochs with the infinite-horizon cost, scaled appropriately for the epoch duration; i.e., between the $(i-1)$th and $i$th epoch we approximate the cost as
$$J_{T_i}(x_{t_i}, \mathcal{K}_i, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}}))) \approx T_i \times \Big\{ J_\infty(\mathcal{K}_i, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}}))) := \lim_{\tau \to \infty} \tfrac{1}{\tau} J_\tau(0, \mathcal{K}_i, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}}))) \Big\}. \qquad (6)$$
This approximation is accurate when the epoch duration $T_i$ is sufficiently long relative to the time required for the state to reach the stationary distribution. Substituting (6) into (5), the cost function that we seek to minimize becomes
$$\mathbb{E}\left[ \sum_{i=1}^N T_i \times J_\infty\big(\mathcal{K}_i, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}}))\big) \right]. \qquad (7)$$
The expectation in (7) is w.r.t. $w_t$ and $e_t$, as $\mathcal{D}_{t_i}$ depends on the random variables $x_{1:t_i}$ and $u_{1:t_i}$, which evolve according to the worst-case dynamics in (5).

4.2 Optimization of worst-case cost

The previous subsection introduced an approximation of our 'ideal' problem (3), based on the worst-case infinite-horizon cost, cf. (7). In this subsection we present a convex approach to the optimization of $J_\infty(\mathcal{K}, \Theta_m(\mathcal{M}))$ w.r.t. $\mathcal{K}$, given $\mathcal{M}$. The infinite-horizon cost can be expressed as
$$\lim_{\tau \to \infty} \frac{1}{\tau} \mathbb{E}\left[ \sum_{t=1}^{\tau} x_t^\top Q x_t + u_t^\top R u_t \right] = \mathrm{tr}\left( \begin{bmatrix} Q & 0 \\ 0 & R \end{bmatrix} \lim_{\tau \to \infty} \frac{1}{\tau} \sum_{t=1}^{\tau} \mathbb{E}\left[ \begin{bmatrix} x_t \\ u_t \end{bmatrix} \begin{bmatrix} x_t \\ u_t \end{bmatrix}^\top \right] \right). \qquad (8)$$

Under the feedback policy $\mathcal{K}$, the covariance appearing on the RHS of (8) can be expressed as
$$\mathbb{E}\left[ \lim_{\tau \to \infty} \frac{1}{\tau} \sum_{t=1}^{\tau} \begin{bmatrix} x_t \\ K x_t + \Sigma^{1/2} e_t \end{bmatrix} \begin{bmatrix} x_t \\ K x_t + \Sigma^{1/2} e_t \end{bmatrix}^\top \right] = \begin{bmatrix} W & W K^\top \\ K W & K W K^\top + \Sigma \end{bmatrix}, \qquad (9)$$
where $W = \mathbb{E}[x_t x_t^\top]$ denotes the stationary state covariance. For known $A$ and $B$, $W$ is given by the (minimum-trace) solution to the Lyapunov inequality
$$W \succeq (A + BK) W (A + BK)^\top + B \Sigma B^\top + \sigma_w^2 I_{n_x}, \qquad (10)$$

i.e., $\arg\min_W \mathrm{tr}\, W$ s.t. (10). To optimize $J_\infty(\mathcal{K}, \Theta_m(\mathcal{M}))$ via convex optimization, there are two challenges to overcome: (i) non-convexity of jointly searching for $K$ and $W$, satisfying (10) and minimizing (8); (ii) computing $W$ for the worst-case $\{A, B\} \in \Theta_m(\mathcal{M})$, rather than known $\{A, B\}$.
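Before tackling these two challenges, it is worth keeping the nominal case in mind: for a fixed stabilizing policy and known $\{A, B\}$, the minimum-trace $W$ satisfying (10) is obtained by solving the corresponding discrete Lyapunov equation with equality. A minimal sketch (our own names, using SciPy), shown only as a point of reference:

```python
# Stationary state covariance for known (A, B) and a fixed stabilizing policy {K, Sigma}:
# W = (A + B K) W (A + B K)^T + B Sigma B^T + sigma_w^2 I, i.e., (10) with equality.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def stationary_state_covariance(A, B, K, Sigma, sigma_w):
    A_cl = A + B @ K                                          # closed-loop dynamics
    Q_cl = B @ Sigma @ B.T + sigma_w**2 * np.eye(A.shape[0])  # excitation + process noise
    return solve_discrete_lyapunov(A_cl, Q_cl)                # solves W = A_cl W A_cl^T + Q_cl
```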

Let us begin with the first challenge: nonconvexity. To facilitate a convex formulation of the RRL problem (7), we write (10) as
$$W \succeq [A\ B] \begin{bmatrix} W & W K^\top \\ K W & K W K^\top + \Sigma \end{bmatrix} [A\ B]^\top + \sigma_w^2 I_{n_x}, \qquad (11)$$
and introduce the change of variables $Z = W K^\top$ and $Y = K W K^\top + \Sigma$, collated in the variable $\Xi = \begin{bmatrix} W & Z \\ Z^\top & Y \end{bmatrix}$. With this change of variables, minimizing (8) subject to (11) is a convex program.

Now, we turn to the second challenge: computation of the stationary state covariance under the worst-case dynamics. As a sufficient condition, we require (11) to hold for all $\{A, B\} \in \Theta_m(\mathcal{M})$. In particular, we define the following approximation of $J_\infty(\mathcal{K}, \mathcal{M})$:
$$\bar{J}_\infty(\mathcal{K}, \mathcal{M}) := \min_{W \in \mathbb{S}^{n_x}_{++}} \mathrm{tr}\left( \begin{bmatrix} Q & 0 \\ 0 & R \end{bmatrix} \begin{bmatrix} W & W K^\top \\ K W & K W K^\top + \Sigma \end{bmatrix} \right), \quad \text{s.t. (11) holds } \forall\, \{A, B\} \in \Theta_m(\mathcal{M}). \qquad (12)$$

Lemma 4.1. Consider the worst-case cost $J_\infty(\mathcal{K}, \mathcal{M})$, cf. (6), and the approximation $\bar{J}_\infty(\mathcal{K}, \mathcal{M})$, cf. (12). Then $\bar{J}_\infty(\mathcal{K}, \mathcal{M}) \ge J_\infty(\mathcal{K}, \mathcal{M})$.

Proof: cf. §A.1.3. To optimize $\bar{J}_\infty(\mathcal{K}, \mathcal{M})$, as defined in (12), we make use of the following result from [28]:


Theorem 4.1. The data matrices $(A, B, C, P, F, G, H)$ satisfy, for all $X$ with $I - X^\top P X \succeq 0$, the robust fractional quadratic matrix inequality
$$\begin{bmatrix} H & F + G X \\ (F + G X)^\top & C + X^\top B + B^\top X + X^\top A X \end{bmatrix} \succeq 0, \quad \text{iff} \quad \begin{bmatrix} H & F & G \\ F^\top & C - \lambda I & B^\top \\ G^\top & B & A + \lambda P \end{bmatrix} \succeq 0, \qquad (13)$$
for some $\lambda \ge 0$.

To put (11) in a form to which Theorem 4.1 is applicable, we make use of the nominal parameters $\hat{A}$ and $\hat{B}$. With $X$ defined as in (4), such that $[A\ B] = [\hat{A}\ \hat{B}] - X^\top$, we can express (11) as
$$\Psi := W - [\hat{A}\ \hat{B}]\Xi[\hat{A}\ \hat{B}]^\top + X^\top \Xi [\hat{A}\ \hat{B}]^\top + [\hat{A}\ \hat{B}]\Xi X - X^\top \Xi X \succeq \sigma_w^2 I_{n_x} \iff \begin{bmatrix} I & \sigma_w I \\ \sigma_w I & \Psi \end{bmatrix} \succeq 0,$$
where the 'iff' follows from the Schur complement. Given this equivalent representation, by Theorem 4.1, (11) holds for all $X^\top D X \preceq I$ (i.e., all $\{A, B\} \in \Theta_m(\mathcal{M})$) iff
$$S(\lambda, \Xi, \hat{A}, \hat{B}, D) := \begin{bmatrix} I & \sigma_w I & 0 \\ \sigma_w I & W - [\hat{A}\ \hat{B}]\Xi[\hat{A}\ \hat{B}]^\top - \lambda I & [\hat{A}\ \hat{B}]\Xi \\ 0 & \Xi[\hat{A}\ \hat{B}]^\top & \lambda D - \Xi \end{bmatrix} \succeq 0, \qquad (14)$$

which is simply (13) with the substitutions $A = -\Xi$, $B = \Xi[\hat{A}\ \hat{B}]^\top$, $C = W - [\hat{A}\ \hat{B}]\Xi[\hat{A}\ \hat{B}]^\top$, $F = \sigma_w I$, $G = 0$, and $P = D$. We now have the following result, cf. §A.1.4 for proof.

Theorem 4.2. The solution to $\min_{\mathcal{K}} \bar{J}_\infty(\mathcal{K}, \Theta_m(\mathcal{M}))$, cf. (12), is given by the SDP
$$\min_{\lambda, \Xi}\ \mathrm{tr}\left( \mathrm{blkdiag}(Q, R)\, \Xi \right), \quad \text{s.t. } S(\lambda, \Xi, \hat{A}, \hat{B}, D) \succeq 0,\ \lambda \ge 0, \qquad (15)$$
with the optimal policy given by $\mathcal{K} = \{Z^\top W^{-1},\ Y - Z^\top W^{-1} Z\}$.

Note that as $\min_{\mathcal{K}} \bar{J}_\infty(\mathcal{K}, \Theta_m(\mathcal{M}))$ is purely an 'exploitation' problem, $\Sigma \to 0$ in the above SDP; in general, $\Sigma \neq 0$ in the RRL setting (i.e., (7)), where exploration is beneficial.
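For concreteness, here is a minimal sketch of how the SDP (15) could be posed with an off-the-shelf modeling layer such as cvxpy; the names are our own and this is an illustration of the construction (14)–(15), not the authors' implementation.

```python
# Sketch of the SDP (15): min tr(blkdiag(Q,R) Xi) s.t. S(lam, Xi, A_hat, B_hat, D) >= 0, lam >= 0,
# with Xi = [[W, Z], [Z^T, Y]], and policy recovery K = Z^T W^{-1}, Sigma = Y - Z^T W^{-1} Z.
import numpy as np
import cvxpy as cp

def robust_policy_sdp(A_hat, B_hat, D, Q, R, sigma_w):
    nx, nu = B_hat.shape
    AB = np.hstack([A_hat, B_hat])                     # nominal [A_hat B_hat]
    Xi = cp.Variable((nx + nu, nx + nu), PSD=True)     # joint covariance variable
    lam = cp.Variable(nonneg=True)                     # multiplier
    W = Xi[:nx, :nx]
    S = cp.bmat([                                      # LMI S(lam, Xi, A_hat, B_hat, D), cf. (14)
        [np.eye(nx),               sigma_w * np.eye(nx),                   np.zeros((nx, nx + nu))],
        [sigma_w * np.eye(nx),     W - AB @ Xi @ AB.T - lam * np.eye(nx),  AB @ Xi],
        [np.zeros((nx + nu, nx)),  Xi @ AB.T,                              lam * D - Xi],
    ])
    QR = np.block([[Q, np.zeros((nx, nu))], [np.zeros((nu, nx)), R]])
    cp.Problem(cp.Minimize(cp.trace(QR @ Xi)), [S >> 0]).solve()
    W_v, Z_v, Y_v = Xi.value[:nx, :nx], Xi.value[:nx, nx:], Xi.value[nx:, nx:]
    K = np.linalg.solve(W_v, Z_v).T                    # K = Z^T W^{-1}
    Sigma = Y_v - Z_v.T @ np.linalg.solve(W_v, Z_v)    # Sigma = Y - Z^T W^{-1} Z
    return K, Sigma
```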

4.3 Approximate uncertainty propagation

Let us now return to the RRL problem (7). Given a model $\mathcal{M}$, §4.2 furnished us with a convex method to minimize the worst-case cost. However, at time $t = 0$, we have access only to data $\mathcal{D}_0$, and therefore only to $\mathcal{M}(\mathcal{D}_0)$. To optimize (7) we need to approximate the models $\{\mathcal{M}(\mathcal{D}_{t_i})\}_{i=1}^{N-1}$ based on the future data, $\{\mathcal{D}_{t_i}\}_{i=1}^{N-1}$, that we expect to see. To this end, we denote the approximate model, at time $t = t_j$ given data $\mathcal{D}_{t_i}$, by $\mathcal{M}_j(\mathcal{D}_{t_i}) := \{\hat{A}_{j|i}, \hat{B}_{j|i}, D_{j|i}\} \approx \mathbb{E}\big[\mathcal{M}(\mathcal{D}_{t_j})\,|\,\mathcal{D}_{t_i}\big]$. We now describe specific choices for $\hat{A}_{j|i}$, $\hat{B}_{j|i}$, and $D_{j|i}$, beginning with the latter.

Recall that the uncertainty matrix $D$ at the $i$th epoch is denoted $D_i$. The uncertainty matrix at the $(i+1)$th epoch is then given by $D_{i+1} = D_i + \frac{1}{\sigma_w^2 c_\delta} \sum_{t=t_i}^{t_{i+1}} \begin{bmatrix} x_t \\ u_t \end{bmatrix} \begin{bmatrix} x_t \\ u_t \end{bmatrix}^\top$. We approximate the empirical covariance matrix in this expression with the worst-case state covariance $W_i$ as follows:
$$\mathbb{E}\left[ \sum_{t=t_i}^{t_{i+1}} \begin{bmatrix} x_t \\ u_t \end{bmatrix} \begin{bmatrix} x_t \\ u_t \end{bmatrix}^\top \right] \approx T_{i+1} \begin{bmatrix} W_i & W_i K_i^\top \\ K_i W_i & K_i W_i K_i^\top + \Sigma_i \end{bmatrix} = T_{i+1} \Xi_i. \qquad (16)$$

This approximation makes use of the same calculation appearing in (9). The equality makes use of the change of variables introduced in §4.2. Note that in the proof of Theorem 4.2, cf. §A.1.4, it was shown that $\Xi = \begin{bmatrix} W & W K^\top \\ K W & K W K^\top + \Sigma \end{bmatrix}$ when $\Xi$ is the solution of (15).

Next, we turn our attention to approximating the effect of future data on the nominal parameter estimates $\{\hat{A}, \hat{B}\}$. Updating these (ordinary least squares) estimates based on the expected value of future observations involves difficult integrals that must be approximated numerically [27, §5]. To preserve convexity in our formulation, we approximate future nominal parameter estimates with the current estimates, i.e., given data $\mathcal{D}_{t_i}$ we set $\hat{A}_{j|i} = \hat{A}_i$ and $\hat{B}_{j|i} = \hat{B}_i$. To summarize, our approximate model at epoch $j$ is given by $\mathcal{M}_j(\mathcal{D}_{t_i}) = \{\hat{A}_i, \hat{B}_i,\ D_i + \frac{1}{\sigma_w^2 c_\delta} \sum_{k=i+1}^{j} T_{k+1} \Xi_k\}$.
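A minimal sketch (our own names) of this propagation step, adding the planned contribution of each future epoch to the current uncertainty matrix:

```python
# Approximate uncertainty propagation, cf. end of §4.3: the (unknown) future empirical
# covariance of a look-ahead epoch is replaced by its planned worst-case counterpart T_k * Xi_k.
def propagate_uncertainty(D_i, planned, sigma_w, c_delta):
    """planned: list of (T_k, Xi_k) pairs for the future epochs being looked ahead over."""
    D = D_i.copy()
    for T_k, Xi_k in planned:
        D = D + (T_k / (sigma_w**2 * c_delta)) * Xi_k
    return D
```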


4.4 Final convex program and receding horizon application

We are now in a position to present a convex approximation to our original problem (3). By substituting $\bar{J}_\infty(\cdot, \cdot)$ for $J_\infty(\cdot, \cdot)$, and $\mathcal{M}_i(\mathcal{D}_0)$ for $\mathcal{M}(\mathcal{D}_{t_i})$ in (7), we attain the cost function $\sum_{i=1}^N T_i \times \bar{J}_\infty\big(\mathcal{K}_i, \Theta_m(\mathcal{M}_{i-1}(\mathcal{D}_0))\big)$. Consider the $i$th term in this sum, which can be optimized via the SDP (15), with $D = D_{i-1|0}$. Notice two important facts: 1. for fixed multiplier $\lambda$, the uncertainty $D_{i-1|0}$ enters linearly in the constraint $S(\cdot) \succeq 0$, cf. (15); 2. $D_{i-1|0}$ is linear in the decision variables $\{\Xi_k\}_{k=1}^{i-1}$, cf. the end of §4.3. Therefore, the constraint $S(\cdot) \succeq 0$ remains linear in the decision variables, which means that the cost function derived by substituting $\mathcal{M}_i(\mathcal{D}_0)$ into (7) can be optimized as an SDP, cf. (17) below.

Hitherto, we have considered the problem of minimizing the expected cost over time horizon $T$ given initial data $\mathcal{D}_0$. In practical applications, we employ a receding horizon strategy; i.e., at the $i$th epoch, given data $\mathcal{D}_{t_{i-1}}$, we find a sequence of policies $\{\mathcal{K}_j\}_{j=i}^{i+h}$ that minimize the approximate $h$-step-ahead expected cost
$$J(i, h, \{\mathcal{K}_j\}_{j=i}^{i+h}, \mathcal{D}_{t_{i-1}}) := T_i \bar{J}_\infty(\mathcal{K}_i, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}}))) + \sum_{j=i+1}^{i+h} T_j\, \bar{J}_\infty(\mathcal{K}_j, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}}))),$$
and then apply $\mathcal{K}_i$ during the $i$th epoch. At the beginning of the $(i+1)$th epoch, we repeat the process; cf. Algorithm 1. The problem $\min_{\{\mathcal{K}_j\}_{j=i}^{i+h}} J(i, h, \{\mathcal{K}_j\}_{j=i}^{i+h}, \mathcal{D}_{t_{i-1}})$ can be solved as the SDP:

$$\min_{\lambda_i \ge 0,\ \{\Xi_j\}_{j=i}^{i+h}}\ \sum_{j=i}^{i+h} \mathrm{tr}\left( \mathrm{blkdiag}(Q, R)\, \Xi_j \right), \quad \text{s.t. } S(\lambda_i, \Xi_i, \hat{A}_i, \hat{B}_i, D_i) \succeq 0,\ \Xi_j \succeq 0\ \forall j, \qquad (17a)$$
$$S\left( \lambda_j, \Xi_j, \hat{A}_i, \hat{B}_i,\ D_i + \frac{1}{\sigma_w^2 c_\delta} \sum_{k=i+1}^{j} T_{k+1} \Xi_k \right) \succeq 0 \quad \text{for } j = i+1, \ldots, i+h. \qquad (17b)$$

Selecting multipliers For optimization of $\bar{J}_\infty(\mathcal{K}, \mathcal{M})$ given a model $\mathcal{M}$, i.e., (15), the simultaneous search for the policy $\mathcal{K}$ and multiplier $\lambda$ is convex, as $D$ is fixed. However, in the RRL setting, '$D$' is a function of the decision variables $\Xi_i$, cf. (17b), and so the multipliers $\{\lambda_j\}_{j=i+1}^{i+h} \in \mathbb{R}^{h-1}_+$ must be specified in advance. We propose the following method of selecting the multipliers: given $\mathcal{D}_{t_{i-1}}$, solve $\mathcal{K} = \arg\min_{\mathcal{K}} \bar{J}_\infty(\mathcal{K}, \Theta_m(\mathcal{M}(\mathcal{D}_{t_{i-1}})))$ via the SDP (15). Then, compute the cost $J(i, h, \{\mathcal{K}\}_{j=i}^{i+h}, \mathcal{D}_{t_{i-1}})$ by solving (17), but with the policies fixed to $\mathcal{K}$, and the multipliers $\{\lambda_j\}_{j=i}^{i+h} \in \mathbb{R}^{h}_+$ as free decision variables. In other words, we approximate the worst-case cost of deploying $\mathcal{K}$ for $h$ epochs into the future. Then, we use the multipliers found during the calculation of this cost for control policy synthesis at the $i$th epoch.

Computational complexity The proposed method can be implemented via semidefinite programming (SDP), for which computational complexity is well understood. In particular, the cost of solving the SDP (15) scales as $O(\max\{m^3, m n^3, m^2 n^2\})$ [26], where $m = \frac{1}{2} n_x(n_x+1) + \frac{1}{2} n_u(n_u+1) + n_x n_u + 1$ denotes the dimensionality of the decision variables, and $n = 3 n_x + n_u$ is the dimensionality of the LMI $S \succeq 0$. The cost of solving the SDP (17) is then given, approximately, by the cost of (15) multiplied by the horizon $h$.
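As a worked instance (our own arithmetic, not stated in the paper): for the simulation in §5, with $n_x = 3$ and $n_u = 2$, we get $m = \tfrac{1}{2}\cdot 3\cdot 4 + \tfrac{1}{2}\cdot 2\cdot 3 + 3\cdot 2 + 1 = 16$ decision variables (the $5 \times 5$ symmetric $\Xi$ plus the multiplier $\lambda$) and an LMI of dimension $n = 3\cdot 3 + 2 = 11$, so each solve of (15), and hence each of the $h$ blocks making up (17), is a very small SDP.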

5 Experimental results

Numerical simulations In this section, we consider the RRL problem with parameters
$$A_{\mathrm{tr}} = \begin{bmatrix} 1.1 & 0.5 & 0 \\ 0 & 0.9 & 0.1 \\ 0 & -0.2 & 0.8 \end{bmatrix}, \quad B_{\mathrm{tr}} = \begin{bmatrix} 0 & 1 \\ 0.1 & 0 \\ 0 & 2 \end{bmatrix}, \quad Q = I, \quad R = \mathrm{blkdiag}(0.1, 1), \quad \sigma_w = 0.5.$$

We partition the time horizon $T = 10^3$ into $N = 10$ equally spaced intervals, each of length $T_i = 100$. For robustness, we set $\delta = 0.05$. Each experimental trial consists of the following procedure.


Algorithm 1 Receding horizon application to true system

1: Input: initial data $\mathcal{D}_0$, confidence $\delta$, LQR cost matrices $Q$ and $R$, epochs $\{t_i\}_{i=1}^N$.
2: for $i = 1 : N$ do
3:   Compute/update nominal model $\mathcal{M}(\mathcal{D}_{t_{i-1}})$.
4:   Solve convex program (17).
5:   Recover policy $\mathcal{K}_i$: $K_i = Z_i^\top W_i^{-1}$ and $\Sigma_i = Y_i - Z_i^\top W_i^{-1} Z_i$.
6:   Apply policy to true system for $t_{i-1} < t \le t_i$, which evolves according to (1) with $u_t = K_i x_t + \Sigma_i^{1/2} e_t$.
7:   Form $\mathcal{D}_{t_i} = \mathcal{D}_{t_{i-1}} \cup \{x_{t_{i-1}:t_i}, u_{t_{i-1}:t_i}\}$ based on newly observed data.
8: end for
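To illustrate the control flow only, the following rough Python skeleton mirrors steps 2–8 of Algorithm 1; the three callables are hypothetical placeholders (our own names) for the model update, the SDP (17), and interaction with the true system.

```python
# Rough skeleton of Algorithm 1 (illustration only; the callables are placeholders, not
# functions defined in the paper).
def receding_horizon_rrl(D0, epochs, fit_nominal_model, solve_rrl_sdp, apply_policy):
    """epochs: list of (t_start, t_end) pairs; D0: initial dataset (e.g. a list of (x, u) pairs)."""
    data = D0
    for (t_start, t_end) in epochs:
        A_hat, B_hat, D = fit_nominal_model(data)               # step 3: nominal model M(D_{t_{i-1}})
        K_i, Sigma_i = solve_rrl_sdp(A_hat, B_hat, D)           # steps 4-5: solve (17), recover K_i, Sigma_i
        new_data = apply_policy(K_i, Sigma_i, t_start, t_end)   # step 6: u_t = K_i x_t + Sigma_i^{1/2} e_t
        data = data + new_data                                  # step 7: append newly observed data
    return data
```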

Initial data $\mathcal{D}_0$ is obtained by driving the system forward 6 time steps, excited by $u_t \sim \mathcal{N}(0, I)$. This open-loop experiment is repeated 100 times, such that $\mathcal{D}_0 = \{x^i_{1:6}, u^i_{1:6}\}_{i=1}^{100}$. We then apply five methods: i. rrl - the method proposed in §4.4, with look-ahead horizon $h = 10$; ii. nom - applying the 'nominal' robust policy $\mathcal{K}_i = \arg\min_{\mathcal{K}} \bar{J}_\infty(\mathcal{K}, \Theta_m(\mathcal{M}(\mathcal{D}_{t_i})))$, i.e., a pure greedy exploitation policy, with no explicit exploration; iii. greedy - first obtaining a nominal robustly stabilizing policy as with nom, but then optimizing (i.e., increasing, if possible) the exploration variance $\Sigma$ until the greedy policy and the rrl policy have the same theoretical worst-case cost at the current epoch; this is a greedy exploration policy; iv. ts - Thompson sampling [4]; v. rbst - the robust adaptive-control synthesis method proposed in [13]. We perform 100 of these trials and plot the results in Figure 2.

Figure 2: Results for the experiments described in §5. (a) Cost and 'information' (a scalar measure of uncertainty defined in §5) when the system evolves according to the worst-case dynamics; the trace denotes the median, and the shaded region spans from the 10th to the 90th percentile. (b) Cost and information when policies are applied to the true system. (c) Sum of costs (over all time steps) for the worst-case dynamics (left) and the true system (right). Note that greedy is abbreviated as grdy.

In Figure 2(a) we plot the cost at each epoch when the system evolves according to the worst-case dynamics. We also plot the information, defined as $1/\lambda_{\max}(D_i^{-1})$ at the $i$th epoch, which is the inverse of the 2-norm of the parameter error, cf. (4). This is a scalar measure of uncertainty: the larger the information, the more certain our estimate of the system (in an absolute sense). ts is omitted from these results as the closed-loop behavior diverges (i.e., attains infinite worst-case cost) in 96% of the trials conducted. In Figure 2(b) we plot the cost, and information, at each epoch when the policies are applied to the true system. Figure 2(c) plots the total cost (sum of costs over all epochs). We make the following observations. Concerning worst-case performance, nom attains the lowest cost at the initial epoch, as it does no explicit exploration. However, methods that incorporate exploration achieve better performance at subsequent epochs (and in terms of total cost) due to greater reduction in uncertainty. Of these methods, the proposed rrl performs best, optimally balancing exploration with exploitation; we emphasize that this balance of exploration/exploitation occurs automatically. Furthermore, observe that greedy actually achieves higher information (lower absolute uncertainty) relative to rrl, yet attains higher cost. This suggests that rrl is reducing the


uncertainty in a structured way, targeting uncertainty reduction in the parameters that 'matter most for control'. Results on the true system are qualitatively similar, with rrl attaining better performance than all other methods except ts. We note that Thompson sampling performs well when the policy happens to stabilize the system; however, as stability is not a consideration during ts synthesis, this cannot be guaranteed.

Hardware-in-the-loop experiment In this section, we consider the RRL problem for a hardware-in-the-loop simulation comprising the interconnection of a physical servo mechanism (Quanser QUBE 2) and a synthetic (simulated) LTI dynamical system. Control of servomechanisms is a ubiquitous task in practice (e.g., robotics); furthermore, the planar servo can be modeled reasonably well (globally) as a linear system. As such, this setup represents a good compromise between the complexities of a real-world system (backlash, friction, unmodeled dynamics, disturbances, etc.) and a system that approximately satisfies the assumptions of the method; cf. Appendix A.2 for full details of the experimental setup. An experimental trial consisted of the following procedure. Initial data was obtained by simulating the system for 0.5 seconds, under closed-loop feedback control (cf. Appendix A.2) with data sampled at 500 Hz, to give 250 initial data points. We then applied methods rrl (with horizon $h = 5$) and greedy as described in §5. The total control horizon was $T = 1250$ (2.5 seconds at 500 Hz) and was divided into $N = 5$ intervals, each of duration 0.5 seconds. We performed 5 of these experimental trials and plot the results in Figure 3. In Figure 3(b) and (a) we plot the total cost (the sum of the costs at each epoch) and the cost at each epoch, respectively, for each method, and observe significantly better performance from rrl in both cases. Additional plots decomposing the cost into that associated with the physical and synthetic system are available in Appendix A.2. We also applied ts, and observed that the resulting policy was unable to stabilize the system; cf. Figure 3(c), which demonstrates divergence of the angular position of the servomotor under ts.

Figure 3: Results for the hardware-in-the-loop experiment in §5. (a) Median costs at each epoch; the shaded region covers the best/worst costs at each epoch. (b) Total costs (sum of costs at each epoch). (c) Angular position of the servo motor ($x_1$) under feedback control with the proposed method and Thompson sampling; the latter results in divergence (uncontrolled revolutions of the servo motor).

6 Conclusion

We have presented an algorithm for robust, targeted exploration in RL for linear systems with quadratic rewards. Policies are robust in the sense that stability of the closed-loop system is guaranteed, with high probability, during learning, and targeted in the sense that uncertainty is reduced so as to improve performance of the controller on the specific task at hand. Roughly speaking, the policy prioritizes uncertainty reduction in the parameters that 'matter most for control'. The search for a policy is formulated as a convex program; solving to global optimality then automatically gives the optimal tradeoff between exploration and exploitation, in the worst-case setting.

Acknowledgments

This research was financially supported by the project NewLEADS - New Directions in Learning Dynamical Systems (contract number: 621-2016-06079), funded by the Swedish Research Council,


and by the project ASSEMBLE (contract number: RIT15-0012), funded by the Swedish Foundation for Strategic Research (SSF).

References

[1] Y. Abbasi-Yadkori and C. Szepesvári. Regret bounds for the adaptive control of linear quadratic systems. In Proceedings of the 24th Annual Conference on Learning Theory, pages 1–26, 2011.

[2] P. Abbeel and A. Y. Ng. Exploration and apprenticeship learning in reinforcement learning. In Proceedings of the 22nd International Conference on Machine Learning, pages 1–8. ACM, 2005.

[3] M. Abeille and A. Lazaric. Thompson sampling for linear-quadratic control problems. In International Conference on Artificial Intelligence and Statistics (AISTATS), 2017.

[4] M. Abeille and A. Lazaric. Improved regret bounds for Thompson sampling in linear quadratic control problems. In International Conference on Machine Learning, pages 1–9, 2018.

[5] M. Annergren, C. A. Larsson, H. Hjalmarsson, X. Bombois, and B. Wahlberg. Application-oriented input design in system identification: Optimal input design for control [applications of control]. IEEE Control Systems Magazine, 37(2):31–56, 2017.

[6] K. J. Åström and B. Wittenmark. Problems of identification and control. Journal of Mathematical Analysis and Applications, 34(1):90–113, 1971.

[7] A. Aswani, H. Gonzalez, S. S. Sastry, and C. Tomlin. Provably safe and robust learning-based model predictive control. Automatica, 49(5):1216–1226, 2013.

[8] Y. Bar-Shalom. Stochastic dynamic programming: Caution and probing. IEEE Transactions on Automatic Control, 26(5):1184–1195, 1981.

[9] Y. Bar-Shalom and E. Tse. Caution, probing, and the value of information in the control of uncertain systems. In Annals of Economic and Social Measurement, Volume 5, Number 3, pages 323–337. NBER, 1976.

[10] F. Berkenkamp, M. Turchetta, A. Schoellig, and A. Krause. Safe model-based reinforcement learning with stability guarantees. In Advances in Neural Information Processing Systems (NIPS), 2017.

[11] S. Dean, H. Mania, N. Matni, B. Recht, and S. Tu. On the sample complexity of the linear quadratic regulator. To appear in Foundations of Computational Mathematics (arXiv:1710.01688), 2017.

[12] S. Dean, H. Mania, N. Matni, B. Recht, and S. Tu. Regret bounds for robust adaptive control of the linear quadratic regulator. In Advances in Neural Information Processing Systems, pages 4192–4201, 2018.

[13] S. Dean, S. Tu, N. Matni, and B. Recht. Safely learning to control the constrained linear quadratic regulator. To appear at American Control Conference (arXiv:1809.10121), 2019.

[14] S. Depeweg, J. M. Hernández-Lobato, F. Doshi-Velez, and S. Udluft. Decomposition of uncertainty in Bayesian deep learning for efficient and risk-sensitive learning. arXiv:1710.07283, 2017.

[15] M. K. S. Faradonbeh, A. Tewari, and G. Michailidis. Finite time adaptive stabilization of LQ systems. IEEE Transactions on Automatic Control, 2019. Accepted.

[16] M. Fazel, R. Ge, S. M. Kakade, and M. Mesbahi. Global convergence of policy gradient methods for the linear quadratic regulator. arXiv preprint arXiv:1801.05039, 2018.

[17] A. Feldbaum. Dual control theory. I. Avtomatika i Telemekhanika, 21(9):1240–1249, 1960.

[18] A. Feldbaum. Dual control theory. II. Avtomatika i Telemekhanika, 21(11):1453–1464, 1960.

[19] J. García and F. Fernández. A comprehensive survey on safe reinforcement learning. Journal of Machine Learning Research, 16(1):1437–1480, 2015.

[20] P. Geibel and F. Wysotzki. Risk-sensitive reinforcement learning applied to control under constraints. Journal of Artificial Intelligence Research, 24:81–108, 2005.

[21] E. Hazan, H. Lee, K. Singh, C. Zhang, and Y. Zhang. Spectral filtering for general linear dynamical systems. In Advances in Neural Information Processing Systems, pages 4634–4643, 2018.

[22] E. Hazan, K. Singh, and C. Zhang. Learning linear dynamical systems via spectral filtering. In Advances in Neural Information Processing Systems, pages 6702–6712, 2017.

[23] M. Ibrahimi, A. Javanmard, and B. V. Roy. Efficient reinforcement learning for high dimensional linear quadratic systems. In Advances in Neural Information Processing Systems, pages 2636–2644, 2012.

[24] C. Larsson, A. Ebadat, C. Rojas, X. Bombois, and H. Hjalmarsson. An application-oriented approach to dual control with excitation for closed-loop identification. European Journal of Control, 29:1–16, May 2016.

[25] C. Larsson, C. Rojas, X. Bombois, and H. Hjalmarsson. Experimental evaluation of model predictive control with excitation (MPC-X) on an industrial depropanizer. Journal of Process Control, 31:1–16, Jul 2015.

[26] Z. Liu and L. Vandenberghe. Interior-point method for nuclear norm approximation with application to system identification. SIAM Journal on Matrix Analysis and Applications, 31(3):1235–1256, 2009.

[27] M. S. Lobo and S. Boyd. Policies for simultaneous estimation and optimization. In Proceedings of the 1999 American Control Conference (Cat. No. 99CH36251), volume 2, pages 958–964. IEEE, 1999.

[28] Z.-Q. Luo, J. F. Sturm, and S. Zhang. Multivariate nonnegative quadratic mappings. SIAM Journal on Optimization, 14(4):1140–1162, 2004.

[29] D. Malik, A. Pananjady, K. Bhatia, K. Khamaru, P. L. Bartlett, and M. J. Wainwright. Derivative-free methods for policy optimization: Guarantees for linear quadratic systems. arXiv preprint arXiv:1812.08305, 2018.

[30] H. Mania, S. Tu, and B. Recht. Certainty equivalent control of LQR is efficient. arXiv preprint arXiv:1902.07826, 2019.

[31] N. Matni, A. Proutiere, A. Rantzer, and S. Tu. From self-tuning regulators to reinforcement learning and back again. arXiv preprint arXiv:1906.11392, 2019.

[32] O. Mihatsch and R. Neuneier. Risk-sensitive reinforcement learning. Machine Learning, 49(2-3):267–290, 2002.

[33] V. Mnih, K. Kavukcuoglu, D. Silver, A. A. Rusu, J. Veness, M. G. Bellemare, A. Graves, M. Riedmiller, A. K. Fidjeland, G. Ostrovski, et al. Human-level control through deep reinforcement learning. Nature, 518(7540):529, 2015.

[34] C. J. Ostafew, A. P. Schoellig, and T. D. Barfoot. Robust constrained learning-based NMPC enabling reliable mobile robot path tracking. The International Journal of Robotics Research, 35(13):1547–1563, 2016.

[35] Y. Ouyang, M. Gagrani, and R. Jain. Learning-based control of unknown linear systems with Thompson sampling. Submitted to IEEE Transactions on Automatic Control (arXiv:1709.04047), 2017.

[36] K. B. Petersen and M. S. Pedersen. The Matrix Cookbook, 2012.

[37] D. Silver, A. Huang, C. J. Maddison, A. Guez, L. Sifre, G. Van Den Driessche, J. Schrittwieser, I. Antonoglou, V. Panneershelvam, M. Lanctot, et al. Mastering the game of Go with deep neural networks and tree search. Nature, 529(7587):484, 2016.

[38] M. Simchowitz, H. Mania, S. Tu, M. I. Jordan, and B. Recht. Learning without mixing: Towards a sharp analysis of linear system identification. Conference on Learning Theory (arXiv:1802.08334), 2018.

[39] S. Tu and B. Recht. The gap between model-based and model-free methods on the linear quadratic regulator: An asymptotic viewpoint. To appear in Conference on Learning Theory (arXiv:1812.03565), 2019.

[40] M. J. Wainwright. High-Dimensional Statistics: A Non-Asymptotic Viewpoint, volume 48. Cambridge University Press, 2019.