Chapter 1: Introduction to Control Systems

Definition: A System is an arrangement, set, or collection of things connected or related in such a manner as to form an entirety or a whole. By this definition, anything can be a system, but we're only interested in a particular class of systems.

System Classifications:

1. Time-Variant and Time-Invariant Systems:

A Time-Varying System is one whose parameters are functions of the time variable t; if it is described by an algebraic or differential equation, the coefficients of that equation are functions of t. A Time-Invariant System, on the other hand, is one whose parameters are constant; if it is described by an algebraic or differential equation, the coefficients of that equation are constants.

Note: In reality, all systems are Time-Varying. In this course, we are interested in Time-Invariant Systems.
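For a concrete check, a time-invariant system's response to a delayed input is simply its original response, delayed. The following is a minimal sketch, assuming a hypothetical first-order system H(s) = 1/(s + 2) and zero initial conditions:

```python
import numpy as np
from scipy import signal

# Minimal time-invariance check, assuming a hypothetical first-order system
# y' + 2y = u, i.e. H(s) = 1/(s + 2), with zero initial conditions.
sys = signal.TransferFunction([1.0], [1.0, 2.0])

t = np.linspace(0, 10, 1001)
u = np.sin(t)                                              # original input
shift = 2.0                                                # delay in seconds
u_delayed = np.where(t >= shift, np.sin(t - shift), 0.0)   # u(t - 2)

_, y, _ = signal.lsim(sys, u, t)
_, y_delayed, _ = signal.lsim(sys, u_delayed, t)

# For a time-invariant system, the response to the delayed input equals the
# delayed response (compared only where both are defined).
n = int(shift / (t[1] - t[0]))
print(np.allclose(y_delayed[n:], y[:-n], atol=1e-3))       # -> True
```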

2. Continuous and Discrete Systems:

Continuous Systems are those systems that can be described in terms of continuous-time signals, whereas Discrete Systems can be described in terms of discrete-time signals.

Note: In reality, all systems are continuous. In this course, we are interested in Continuous Systems.
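Although we work with continuous systems, a continuous model is often sampled for computation. A minimal sketch, assuming the same hypothetical first-order system and an assumed sample period of 0.1 s, uses scipy.signal.cont2discrete to obtain an equivalent discrete-time description:

```python
from scipy import signal

# Sampling the same hypothetical continuous system H(s) = 1/(s + 2) with a
# zero-order hold gives an equivalent discrete-time description in z.
num, den = [1.0], [1.0, 2.0]
dt = 0.1  # assumed sample period in seconds

numd, dend, _ = signal.cont2discrete((num, den), dt, method='zoh')
print(numd, dend)  # discrete-time transfer-function coefficients
```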

3. Causal and Non-Causal Systems:

Causal Systems, also known as Physical or Non-Anticipative Systems, are those systems where the output y(t) at some specific instant t0 depends only on the input x(t) for values of t less than or equal to t0. These systems therefore have outputs and internal states that depend only on the current and previous input values. The idea that the output at any time depends only on past and present values of the input is the property commonly referred to as Causality. Non-Causal Systems, also known as Non-Physical, Acausal, or Anticipative Systems, are those systems that have some dependence on input values from the future, in addition to possible dependence on past or current input values. Systems that depend solely on future input values are called Anti-Causal Systems. Note that some textbooks define Anti-Causal Systems as those that depend solely on future and present input values or, more simply, as systems that do not depend on past input values.

In this course, we're interested in Causal Systems.
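As a small illustration, a trailing moving average is causal (it uses only present and past samples), while a centered moving average is non-causal (it also uses a future sample). The sketch below uses an arbitrary input sequence:

```python
import numpy as np

x = np.arange(10, dtype=float)  # an arbitrary sample input sequence

# Causal 3-point average: y[n] uses only x[n], x[n-1], x[n-2]
# (edges simply average the samples available so far).
y_causal = np.array([x[max(n - 2, 0):n + 1].mean() for n in range(len(x))])

# Non-causal (centered) average: y[n] also uses the future sample x[n+1].
y_noncausal = np.array([x[max(n - 1, 0):n + 2].mean() for n in range(len(x))])

print(y_causal)
print(y_noncausal)
```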

4. Lumped and Distributed Systems:

Lumped Systems are those systems in which the dependent variables of interest are functions of time alone. In general, this means solving a set of ordinary differential equations.

Distributed Systems are those systems in which all dependent variables are functions of time and one or more spatial variables. In this case, this means solving a set of partial differential equations.

For example, consider the following two systems. The first is a distributed system: an infinitely thin string, supported at both ends, whose dependent variable, the vertical position y(x,t), is indexed continuously in both space and time. The second, a series of beads connected by massless string segments and constrained to move vertically, can be thought of as a lumped system, perhaps an approximation to the continuous string.

For electrical systems, consider the difference between a lumped RLC network and a transmission line.

In this course, we are interested in Lumped Systems.
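To make the bead-string idea concrete, the sketch below builds the lumped (ordinary-differential-equation) model of N beads on a taut string with fixed ends; the mass, tension, and spacing values are assumed for illustration, not taken from the notes:

```python
import numpy as np

# Lumped approximation of the taut string: N beads of mass m, spacing h,
# under tension T, fixed at both ends (all values assumed for illustration).
N, m, T, h = 10, 0.01, 5.0, 0.1

# m * y_i'' = (T/h) * (y_{i+1} - 2*y_i + y_{i-1})   ->   y'' = A @ y
A = (T / (m * h)) * (np.diag(-2.0 * np.ones(N))
                     + np.diag(np.ones(N - 1), 1)
                     + np.diag(np.ones(N - 1), -1))

# Natural frequencies (rad/s) of the lumped model; a set of ordinary
# differential equations replaces the wave (partial differential) equation.
omega = np.sqrt(-np.linalg.eigvalsh(A))
print(np.sort(omega))
```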

    Page 10 of 23

5. Linear and Non-Linear Systems:

If all the initial conditions are zero, the system response is called the Zero-State-Response. If all the inputs or excitations are zero, the system response is called the Zero-Input-Response. For a system to be Linear, two conditions must be satisfied:

i. The Total Response must be decomposable into the sum of the Zero-State-Response and the Zero-Input-Response. In other words, we should be able to separate the Zero-State-Response completely from the Zero-Input-Response.

ii. The response must satisfy the Homogeneity and Superposition properties, i.e. F[ag(t) + bh(t)] = aF[g(t)] + bF[h(t)], where g(t) and h(t) are two independent input functions, a and b are two independent constants, and F is the response of the system (a numerical check of this property is sketched below).

Non-Linear Systems are those systems that are not Linear. In this course, we are interested in Linear Systems.
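The superposition property can be verified numerically. The sketch below assumes a hypothetical LTI example H(s) = 1/(s + 2) with zero initial conditions, so the total response equals the Zero-State-Response:

```python
import numpy as np
from scipy import signal

# Numerical check of superposition for a hypothetical LTI example
# H(s) = 1/(s + 2) with zero initial conditions.
sys = signal.TransferFunction([1.0], [1.0, 2.0])
t = np.linspace(0, 10, 1001)
g = np.sin(t)                # first input
h = np.exp(-0.3 * t)         # second input
a, b = 2.0, -1.5             # arbitrary constants

_, y_sum, _ = signal.lsim(sys, a * g + b * h, t)
_, y_g, _ = signal.lsim(sys, g, t)
_, y_h, _ = signal.lsim(sys, h, t)

# F[a*g + b*h] should equal a*F[g] + b*F[h] for a linear system.
print(np.allclose(y_sum, a * y_g + b * y_h))   # -> True
```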

6. Dynamic and Static Systems:

Dynamic Systems are governed by a set of differential or difference equations, whereas Static Systems, also known as Memoryless Systems, are governed by algebraic equations. The value of any state of a Dynamic System at any instant of time depends on the history of the system, whereas the value of any state of a Static System at any instant depends only on the value of the input at that instant.

In this course, we are interested in Dynamic Systems.
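As a brief contrast, the sketch below (with example systems assumed for illustration) computes the step response of a static system y(t) = 2x(t), which depends only on the instantaneous input, and of a dynamic system y' + y = x, whose output depends on the input history through its state:

```python
import numpy as np
from scipy import signal

t = np.linspace(0, 5, 501)
u = np.ones_like(t)   # step input

# Static (memoryless) system, e.g. y(t) = 2*x(t): the output at each instant
# depends only on the input at that instant.
y_static = 2.0 * u

# Dynamic system y' + y = x: the output depends on the input history
# through the state, so it evolves over time even for a constant input.
_, y_dynamic, _ = signal.lsim(signal.TransferFunction([1.0], [1.0, 1.0]), u, t)

print(y_static[-1], y_dynamic[-1])   # 2.0 vs about 0.993 (approaching 1)
```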

7. Deterministic and Probabilistic Systems:

In this course, we are interested in Deterministic Systems.

Therefore, in this course, we are interested in Dynamic, Deterministic, Linear, Continuous, Time-Invariant, Lumped, and Causal Systems.

Control System Terminologies:

1. Definition: A Control System is an arrangement of physical components connected or related in such a manner as to command, direct, or regulate itself or another system (the keyword in this definition is regulate, and regulation simply means controlling).

2. Definition: Input is the excitation applied from an external energy source (the initial charge on a capacitor, for example, is not an input).

3. Definition: The Transfer Function is defined as the ratio of the Laplace Transform of the Output (the Zero-State-Response) to the Laplace Transform of the Input; therefore, all initial conditions are zero. There are two types of transfer functions: i. the Open-Loop Transfer Function, and ii. the Closed-Loop Transfer Function. In Chapters 1, 2, 4, 5, and 6, we analyze and design the system using the closed-loop transfer function; in the remaining chapters, we analyze and design the closed-loop system using the open-loop transfer function.
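As a sketch of how the two relate, assume a hypothetical open-loop transfer function G(s) = 10/(s(s + 2)) with unity negative feedback H(s) = 1; the closed-loop transfer function is then G/(1 + GH), computed below with polynomial arithmetic:

```python
import numpy as np
from scipy import signal

# Hypothetical open-loop transfer function G(s) = 10 / (s*(s + 2)),
# with unity negative feedback H(s) = 1.
ng, dg = [10.0], [1.0, 2.0, 0.0]
nh, dh = [1.0], [1.0]

# Closed-loop transfer function T(s) = G / (1 + G*H).
num_T = np.polymul(ng, dh)
den_T = np.polyadd(np.polymul(dg, dh), np.polymul(ng, nh))
T = signal.TransferFunction(num_T, den_T)
print(T)   # 10 / (s^2 + 2s + 10)
```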

4. Definition: Output is the actual response of the system.

5. Definition: Control Action is the quantity responsible for activating the system to produce the output; it is also known as the Actuating Signal, and as the error signal in the case of unity negative feedback.

6. Definition: Feedback is that property of a Closed-Loop System which permits the system output (or some other control variable, not necessarily the principal system output) to be compared with the system input (or with an input to some other internally situated component or subsystem, not necessarily the principal system input), so that an appropriate control action may be formed as some function of the input and the output.

Feedback Characteristics:

Feedback Advantages:
1. It increases accuracy.
2. It reduces sensitivity (see the numerical sketch after this list).
3. It reduces the effects of non-linearity and distortion (we will show this for distortion, but not for non-linearity).
4. It increases bandwidth (Definition: Bandwidth is the difference between the high cut-off frequency and the low cut-off frequency).

Feedback Disadvantages: It increases the tendency toward oscillation and instability.
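The sensitivity advantage can be seen with a simple gain example (gain values assumed for illustration): with a large loop gain, the closed-loop gain is approximately 1/H and barely changes even when the forward gain changes a lot:

```python
# Sensitivity reduction by feedback (gain values assumed for illustration):
# a forward gain A with unity negative feedback gives closed-loop gain A/(1 + A).
def closed_loop_gain(A, H=1.0):
    return A / (1.0 + A * H)

A_nominal, A_perturbed = 1000.0, 500.0   # the forward gain drops by 50%

print(closed_loop_gain(A_nominal))    # ~0.9990
print(closed_loop_gain(A_perturbed))  # ~0.9980, only about a 0.1% change
```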

Typical Feedback Control Systems:

1. A Servomechanism is a power-amplifying feedback control system in which the controlled variable is usually a mechanical position, velocity, or acceleration, and the output usually follows the input.

2. A Regulator is a feedback control system in which the output is maintained at a constant level.

Types of Control Systems: 1. Man-Made, 2. Natural, and 3. Man-Made and Natural.

Classification of Control Systems:

1. Open-Loop Control Systems: An Open-Loop Control System is one in which the control action is independent of the output.

2. Closed-Loop Control Systems: A Closed-Loop Control System, or Feedback Control System, is one in which the control action is somehow dependent on the output.
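A minimal sketch of the difference, assuming a hypothetical plant G(s) = 1/(s + 1) whose gain is actually 20% low: the open-loop command cannot correct the resulting error, while a closed-loop control action u = K(r - y), which depends on the output, largely removes it:

```python
import numpy as np
from scipy import signal

# Hypothetical plant G(s) = 1/(s + 1) whose gain is actually 20% low (0.8).
G_err = signal.TransferFunction([0.8], [1.0, 1.0])
t = np.linspace(0, 10, 1001)

# Open loop: the control action is just the reference, independent of the output,
# so the gain error passes straight through to the output.
_, y_open = signal.step(G_err, T=t)

# Closed loop: the control action u = K*(r - y) depends on the output.
# With K = 20, the closed-loop transfer function is 16 / (s + 17).
K = 20.0
T_cl = signal.TransferFunction([K * 0.8], [1.0, 1.0 + K * 0.8])
_, y_closed = signal.step(T_cl, T=t)

print(y_open[-1], y_closed[-1])   # about 0.80 vs about 0.94 for a unit reference
```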
