In [[control theory]], a '''controller''' is a device that monitors and physically alters the operating conditions of a given [[dynamical system]]. Controllers were historically implemented using mechanical, hydraulic, pneumatic or electronic techniques, often in combination, but are now most commonly realized as microprocessors or computers.<ref name="The Principal Goal of Control">{{cite book|last=Salgado|first=Graham C. Goodwin, Stefan F. Graebe, Mario E.|title=Control System Design|year=2001|publisher=Prentice Hall|location=Upper Saddle River, N. J.|isbn=0139586539|pages=21}}</ref>
<ref>{{cite book
|title=A History of Control Engineering 1930-1955
|last=Bennett
|first= S.
|authorlink=
|coauthors=
|year=1993 |publisher =Peter Peregrinus Ltd. On behalf of the Institution of Electrical Engineers
|location= London
|isbn= 0-86341-280-7|pages=
| postscript = <!--None-->}}
</ref>  Typical applications of controllers are to hold settings for temperature, pressure, flow or speed.
 
== Input and Control Variables ==
 
A system can be described either as a [[MIMO]] system, having multiple inputs and outputs and therefore requiring more than one controller, or as a [[Single-Input and Single-Output|SISO]] system, having a single input and a single output and hence only a single controller. Depending on the set-up of the physical (or non-physical) system, adjusting the system's input variable (assuming the system is SISO) will affect the operating parameter, otherwise known as the controlled output variable. Upon receiving the error signal that marks the disparity between the desired value (setpoint) and the actual output value, the controller attempts to regulate the controlled output by either attenuating or amplifying the input signal to the plant so that the output returns to the setpoint. For example, a simple feedback control system, such as the one shown on the right, generates an error signal defined mathematically as the difference between the setpoint value and the output value, ''r'' − ''y''. [[File:Simple_feedback_control_loop2.svg|thumb|A simple feedback control loop: the error signal is received by controller C, which then either attenuates or amplifies the input signal to the plant.]] This signal describes the magnitude by which the output deviates from the setpoint. It is subsequently sent to the controller C, which interprets and adjusts for the discrepancy. If the plant is a physical one, the inputs to the system are regulated by means of [[actuators]].
 
A thermostat on a heater is an example of on-off (all-or-none) control. A temperature sensor turns the heat source on when the temperature falls below the set point and turns it off when the set point is reached. The magnitude of the difference between the set point and the measured temperature is not used, and there is no adjustment to the rate at which heat is added other than all or none.
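The all-or-none behaviour described above can be sketched in a few lines of Python. The room model below (heating rate and heat-loss coefficient) is an illustrative assumption, not something specified in this article:

```python
def on_off_control(temp, setpoint):
    """All-or-none control: heat source fully on below the set point, off otherwise."""
    return temp < setpoint

def simulate(setpoint=20.0, outside=5.0, steps=30):
    """Toy room model: the heater adds a fixed amount of heat each step,
    and the room loses heat in proportion to the indoor-outdoor difference."""
    temp = outside
    history = []
    for _ in range(steps):
        heater = on_off_control(temp, setpoint)
        temp += (3.0 if heater else 0.0) - 0.1 * (temp - outside)
        history.append(round(temp, 2))
    return history
```

Running the simulation shows the characteristic behaviour of on-off control: the temperature climbs to the set point, then oscillates around it rather than settling exactly on it.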
 
A familiar example of feedback control is cruise control on an automobile. Here speed is the '''measured variable'''.  The operator (driver) adjusts the desired speed '''set point''' (e.g. 100 km/h) and the controller monitors the speed sensor and compares the measured speed to the set point.  Any deviations, such as changes in grade, drag, wind speed or even using a different grade of fuel (for example an ethanol blend) are corrected by the controller making a compensating adjustment to the fuel valve open position, which is the '''manipulated variable'''.  The controller makes adjustments having information only about the error (magnitude, rate of change or cumulative error), although adjustments known as ''tuning'' are used to achieve stable control. The operation of such controllers is the subject of [[control theory]].
 
{| class="wikitable"
|-
! System !! Controlled Outputs Include !! Controller !! Desired Performance Includes
|-
| Aircraft || Course, pitch, roll, yaw || Autopilot || Maintain flight path on a safe and smooth trajectory
|-
| Furnace || Temperature || Temperature controller || Follow warm-up temperature profile, then maintain temperature
|-
| Waste treatment || pH value of effluent || pH controller || Neutralize effluent to specified accuracy
|-
| Automobile || Speed || Cruise controller || Attain, then maintain selected speed without undue fuel consumption
|}
 
The notion of controllers can be extended to more complex systems.  In the natural world, individual organisms also appear to be equipped with controllers that assure the [[homeostasis]] necessary for survival of each individual.  Both human-made and natural systems exhibit collective behaviors amongst individuals in which the controllers seek some form of equilibrium.
 
== Types of control ==
 
In [[control theory]] there are two basic types of control: [[feedback]] and [[Feed forward (control)|feed-forward]].  The input to a feedback controller is the same quantity it is trying to control: the controlled variable is "fed back" into the controller.  The thermostat of a house is an example of a feedback controller. This controller relies on measuring the controlled variable, in this case the temperature of the house, and then adjusting the output, namely whether the heater is on or off. However, feedback control usually results in intermediate periods where the controlled variable is not at the desired set-point. With the thermostat example, if the door of the house were opened on a cold day, the house would cool down. After it fell below the desired temperature (set-point), the heater would kick on, but there would be a period when the house was colder than desired.
 
Feed-forward control can avoid the slowness of feedback control. With feed-forward control, the disturbances are measured and accounted for before they have time to affect the system. In the house example, a feed-forward system may measure the fact that the door is opened and automatically turn on the heater before the house can get too cold. The difficulty with feed-forward control is that the effect of the disturbances on the system must be accurately predicted, and there must not be any unmeasured disturbances. For instance, if a window were opened that was not being measured, the feed-forward-controlled thermostat might still let the house cool down.
 
To achieve the benefits of feedback control (controlling unknown disturbances and not having to know exactly how a system will respond to disturbances) ''and'' the benefits of feed-forward control  (responding to disturbances before they can affect the system), there are combinations of feedback and feed-forward that can be used.
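The combination described above can be sketched for the house example. This is a minimal illustration; the controller gain, bias and feed-forward boost below are arbitrary assumed values, not figures from this article:

```python
def combined_control(temp, setpoint, door_open, Kc=2.0, u0=1.0, ff=3.0):
    """Heater output = bias + feedback term + feed-forward term.
    The feedback term reacts to the error only after the room has
    already deviated; the feed-forward term acts immediately on the
    measured disturbance (the door being open), before the room cools."""
    error = setpoint - temp                   # feedback: setpoint minus measurement
    anticipatory = ff if door_open else 0.0   # feed-forward on the measured disturbance
    return u0 + Kc * error + anticipatory
```

Note that opening the door raises the heater output even while the error is still zero, which is exactly the anticipatory behaviour feedback alone cannot provide.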
 
Some examples of where feedback and feed-forward control can be used together are dead-time compensation and inverse response compensation. '''Dead-time compensation''' is used to control devices that take a long time to respond to a change in input, for example a change in the composition of flow through a long pipe. A dead-time compensation control uses an element (also called a [[Smith predictor]]) to predict how changes made now by the controller will affect the controlled variable in the future. The controlled variable is also measured and used in feedback control. '''Inverse response compensation''' involves controlling systems in which a change at first affects the measured variable one way but later affects it in the opposite way. An example is eating candy: at first it provides a burst of energy, but later it leaves you tired. As can be imagined, such a system is difficult to control with feedback alone, so a predictive feed-forward element is necessary to anticipate the reverse effect that a change will have in the future.
 
== Types of controllers ==
 
Most control systems in the past were implemented using mechanical devices or solid-state electronics. Pneumatics were often utilized to transmit information and control signals using pressure. However, most modern [[industrial control system]]s now rely on computers for the '''industrial controller''', since complex control algorithms are far easier to implement in software than in mechanical hardware.
 
For feedback controllers there are a few simple types. The simplest is like the thermostat: it just turns the heat on if the temperature falls below a certain value and off if it exceeds a certain value ([[on-off control]]).
 
Another simple type of controller is a [[proportional control|proportional controller]]. With this type of controller, the controller output (control action) is proportional to the error in the measured variable.  
 
In feedback control, it is standard to define the error as the difference between the desired value (setpoint) <math>y_s</math> and the current (measured) value <math>y</math>. If the error is large, then the control action is large. Mathematically:
 
:<math>u(t) = K_c \, e(t) + u_0</math>
 
where
:<math>u(t)</math> represents the control action (controller output),
:<math>e(t)=y_s(t)-y(t)</math> represents the error,
:<math>K_c</math> represents the controller's gain, and
:<math>u_0</math> represents the steady-state control action (bias) necessary to maintain the variable at steady state when there is no error.
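The formula above transcribes directly into code. This is a minimal sketch; the names follow the definitions in the text:

```python
def proportional_control(y, y_s, Kc, u0):
    """u(t) = Kc * e(t) + u0, with the error e(t) = y_s(t) - y(t)."""
    e = y_s - y          # error: setpoint minus measurement
    return Kc * e + u0   # control action proportional to the error, plus bias
```

For example, with a setpoint of 100, a measurement of 95, a gain of 0.5 and a bias of 10, the control action is 0.5 × 5 + 10 = 12.5.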
 
It is important that the control action <math>u</math> counteracts the change in the controlled variable <math>y</math> (negative feedback). There are then two cases depending on the sign of the process gain.
 
In the first case the process gain is positive, so an increase in the controlled variable (measurement) <math>y</math> requires a decrease in the control action <math>u</math> (reverse-acting control). In this case the controller gain <math>K_c</math> is positive, because the standard definition of the error already contains a negative sign for <math>y</math>.
 
In the second case the process gain is negative, so an increase in the controlled variable (measurement) <math>y</math> requires an increase in the control action <math>u</math> (direct-acting control). In this case the controller gain <math>K_c</math> is negative.
 
A typical example of a reverse-acting system is control of temperature (<math>y</math>) by use of steam (<math>u</math>). In this case the process gain is positive, so if the temperature increases, the steam flow must be decreased to maintain the desired temperature.  Conversely, a typical example of a direct-acting system is control of temperature using cooling water. In this case the process gain is negative, so if the temperature increases, the cooling water flow must be increased to maintain the desired temperature.
 
Although proportional control is simple to understand, it has drawbacks. The largest problem is that for most systems it will never entirely remove the error. When the error is zero, the controller provides only the steady-state control action <math>u_0</math>, so the system settles back to its original steady state, which is generally not the new set point. To get the system to operate near the new set point, the controller gain <math>K_c</math> must be very large, so that the controller produces the required output when only a very small error is present. Large gains, however, can lead to system instability or can require physical impossibilities such as infinitely large valves.
 
Alternatives to proportional control are proportional-integral (PI) control and [[PID controller|proportional-integral-derivative (PID) control]].  PID control is commonly used to implement closed-loop control.
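A PID controller in the standard textbook form can be sketched as follows. The integral term accumulates past error (removing the steady-state offset of proportional-only control), and the derivative term reacts to the error's rate of change. The parameter names <math>T_i</math> (integral time) and <math>T_d</math> (derivative time) follow common convention and are assumptions here, not notation taken from this article:

```python
class PIDController:
    """Standard-form PID: u = u0 + Kc * (e + integral(e)/Ti + Td * de/dt)."""

    def __init__(self, Kc, Ti, Td, u0=0.0):
        self.Kc, self.Ti, self.Td, self.u0 = Kc, Ti, Td, u0
        self.integral = 0.0      # accumulated error (integral term)
        self.prev_error = None   # previous error (for the derivative term)

    def update(self, y, y_s, dt):
        """Compute the control action from measurement y and setpoint y_s."""
        e = y_s - y
        self.integral += e * dt
        deriv = 0.0 if self.prev_error is None else (e - self.prev_error) / dt
        self.prev_error = e
        return self.u0 + self.Kc * (e + self.integral / self.Ti + self.Td * deriv)
```

Unlike the proportional controller, the integral term keeps growing as long as any error persists, so the output continues to change until the error is driven to zero.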
 
[[Open-loop controller|Open-loop control]] can be used in systems sufficiently well-characterized as to predict what outputs will necessarily achieve the desired states.  For example, the rotational velocity of an [[electric motor]] may be well enough characterized for the supplied [[voltage]] to make feedback unnecessary.
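The motor example can be sketched in one line: the required voltage is computed from a known model of the motor, with no measurement fed back. The constant <code>k_v</code> (speed per volt) is an assumed illustrative model parameter:

```python
def open_loop_voltage(desired_rpm, k_v=100.0):
    """Open-loop control: compute the supply voltage from an assumed
    linear motor model (k_v rpm per volt), with no feedback measurement."""
    return desired_rpm / k_v
```

If the real motor deviates from the model, for instance under load, the open-loop controller has no way to notice or correct the error.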
 
The drawback of [[Open-loop controller|open-loop control]] is that it requires perfect knowledge of the system (i.e. one knows exactly what inputs to give in order to get the desired output), and it assumes there are no disturbances to the system.
 
== See also ==
* [[Automation]]
* [[BELBIC]]
* [[CoDeSys]]
* [[Control theory]]
* [[PID controller]]
* [[Process control]]
* [[EICASLAB]]
 
==References==
{{reflist}}
 
[[Category:Automation]]
[[Category:Control theory|Controller]]
[[Category:Cybernetics]]
