
Monthly Archives: July 2011

Instrument Parameters


Units and Standards

In order to avoid confusion and to obtain consistent results, a common set of units and standards is followed by all countries. Each instrument used is given a separate symbol, which makes it easier to identify and to use in process control drawings. These lists were developed by the Instrument Society of America (ISA) and are used worldwide.

The units used for the measurement of different variables fall mainly under two categories. One is the International System, SI (Système International d'Unités), and the other is the English system. The problem is that the latter is followed by only a few countries, including the USA, while the former is followed by most other countries.
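A quick sketch of the difference in practice, converting two common English-system readings into SI (the function names are my own illustrations, not from any standard library):

```python
# Hypothetical helpers converting English-system instrument readings to SI.

def psi_to_kpa(psi):
    """Convert pressure from pounds per square inch to kilopascals."""
    return psi * 6.894757  # 1 psi = 6.894757 kPa

def fahrenheit_to_celsius(f):
    """Convert temperature from degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

print(round(psi_to_kpa(14.7), 1))    # atmospheric pressure, about 101.4 kPa
print(fahrenheit_to_celsius(212.0))  # boiling point of water, 100.0 degC
```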

Parameters

There are some parameters that must be checked during a process. They are all explained below.

  • Accuracy – It is defined as the difference between the indicated value and the actual value. The actual value may be a known standard, and accuracy is obtained by comparing it with the obtained value. If the difference is small, accuracy is high, and vice versa. Accuracy depends on several other parameters, such as hysteresis, linearity, sensitivity, offset, and drift. It is usually expressed as a percentage of span, a percentage of reading, or an absolute value. Standard values are set by the government so that a common reference is maintained.
  • Reading accuracy is the deviation from true at the point where the reading is taken, expressed as a percentage. Absolute accuracy of an instrument is the deviation from true expressed as a number, not as a percentage.
  • Span – It can be defined as the range of an instrument from the minimum to the maximum scale value. In the case of a thermometer whose scale goes from −40°C to 100°C, the span is 140°C. As said before, accuracy is often defined as a percentage of span: it is the deviation from true expressed as a percentage of the span.
  • Precision – It may be defined as the limits within which a signal can be read. For example, on an analog scale graduated in divisions of 0.2 psi, the position of the needle could be estimated to within 0.02 psi. Thus the precision of the instrument is 0.02 psi.
  • Range – It can be defined as the region between the lowest and highest readings the instrument can measure. A thermometer with a scale from −40°C to 100°C has a range of −40°C to 100°C.
  • Reproducibility – It can be defined as the ability of an instrument to produce the same output repeatedly when reading the same input repeatedly, under the same conditions.
  • Sensitivity – Also called the transfer function of a process, it is the ratio of the change in the output of an instrument to the corresponding change in the measured variable. For a good instrument or process, the sensitivity should be high, producing larger output amplitudes.
  • Offset – Offset is the reading of an instrument with zero input.
  • Drift – Drift is the change in the reading of an instrument for a fixed variable over time.
  • Hysteresis – It can be defined as the difference in the readings obtained when an instrument approaches a signal from opposite directions. That is, the value recorded at midscale as the instrument moves up from zero will differ from the value recorded at midscale as it moves down from full scale. The reason is the appearance of stresses inside the instrument material due to the change of its original shape between the zero reading and the full-scale reading.
Hysteresis

  • Resolution – It is the smallest difference in a variable to which the instrument will respond.
  • Repeatability – It is a measure of the closeness of agreement between a number of readings (10 to 12) taken consecutively of a variable, before the variable has time to change. The average reading is calculated, and the spread in the values of the readings is determined.
  • Linearity – It can be defined as a measure of the proportionality between the actual value of the variable being measured and the output of the instrument over its operating range.
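Several of the parameters above are simple calculations, and can be illustrated with a short sketch using the thermometer figures from the text (the function names and sample readings are hypothetical):

```python
# Illustrative calculations for span, accuracy, and hysteresis, using the
# thermometer from the text (scale from -40 degC to 100 degC). The sample
# readings are made up for the example.

def span(low, high):
    """Span: the range from the minimum to the maximum scale value."""
    return high - low

def accuracy_percent_of_span(indicated, actual, low, high):
    """Accuracy as the deviation from true, expressed as a percentage of span."""
    return abs(indicated - actual) / span(low, high) * 100.0

def hysteresis_percent_of_span(upscale, downscale, low, high):
    """Hysteresis: the difference between readings taken approaching the same
    input from opposite directions, as a percentage of span."""
    return abs(upscale - downscale) / span(low, high) * 100.0

print(span(-40.0, 100.0))                                              # 140.0
print(round(accuracy_percent_of_span(25.7, 25.0, -40.0, 100.0), 2))    # 0.5
print(round(hysteresis_percent_of_span(50.4, 49.7, -40.0, 100.0), 2))  # 0.5
```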

Instrumentation System


The basic need of instrumentation in a process is to obtain the best and largest amount of information so as to successfully complete the process. With reference to instrumentation, completing the process successfully basically means maximum efficiency with minimum production expense and the desired output quality.

The information obtained from these processes may be very simple and may mostly involve direct measurement. But as the process becomes more complex, direct measurement may become impracticable, and so indirect methods must be used. These methods involve a derived relationship between the measured quantity and the result that is needed.

Most of the indirect methods involve electrical techniques, as they are fast and simple to process. The output from such methods is also easier to link to computers.

The obtained information may not necessarily be the direct value of a measured quantity. The value obtained may be the variation of that value with respect to other parameters, a signal corresponding to an end limit, or a specific value shown by an indicating hand over a suitable scale. Thus, one instrument may be needed to perform the required operations individually, or several at a time.

When it comes to industrial measurements, the measurands are all physical variables that determine the flow of energy in these dynamical units. They can be classified as follows:

1. Through or per-variables

Through variables can be measured at a single point in space. Some of the most commonly measured variables of this type are force, momentum, flow, charge, current, volume, and so on.

2. Across or trans-variables

Trans-variables need a reference point and a measuring point. Some of the measured variables are displacement, velocity, pressure, temperature, level, and voltage.

Instrumentation Systems

Based on industrial applications, instrumentation systems can be broadly classified into two types: automatic and manual. The former works automatically without any assistance, and the latter needs the assistance of an operator. Viewed from the system design side, instruments are classified into self-operated and power-operated types.

Whatever the performance of an instrument, there are some basic building blocks behind its functioning. The correct combination of these blocks in a measurement system helps convert a process condition into a suitable indication.

These blocks are also called functional units and are present in all instrumentation systems.

Altogether, instrumentation systems can be classified into two types. They are:

1. Analog Instrumentation System

The block diagram is shown below.

An analog instrumentation system includes three functional units. They are

Analog Instrumentation System
  • The Primary Element/Transducer

The input unit receives the quantity whose value is to be measured and converts it into a proportional incremental electrical signal, such as a voltage, current, resistance change, inductance, or even capacitance. Thus, the changed variable contains the information of the measured variable. Such a functional element or device is called a transducer.

  • The Secondary Element/Signal Processing Unit

The output of the transducer is provided to the input of the signal processing unit. This unit amplifies the weak transducer output, and the signal is filtered and modified to a form acceptable to the output unit. Thus this unit may contain devices such as amplifiers, filters, analog-to-digital converters, and so on.

  • The Final Element/Output Unit

The output from the signal processing unit is fed to the input of the output unit. The output unit measures the signal and indicates the value to the reader. The indication may be through an indicating instrument, a CRO, a digital computer, and so on.
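The three functional units above can be chained as plain functions. The following is a minimal sketch, assuming a temperature transducer that produces 10 mV per °C and an amplifier gain of 100 (all numbers are illustrative, not from the text):

```python
# Minimal sketch of the three functional units of an analog instrumentation
# system. The transducer sensitivity and amplifier gain are assumed values.

def transducer(temperature_c):
    """Primary element: convert temperature to a proportional voltage
    (assume 10 mV per degree Celsius)."""
    return temperature_c * 0.010  # volts

def signal_processing(voltage):
    """Secondary element: amplify the weak transducer output (gain of 100)."""
    return voltage * 100.0

def output_unit(voltage):
    """Final element: scale the processed signal (now 1 V per degree Celsius)
    back to an indicated temperature."""
    return voltage / 1.0

# The chain converts a process condition into a suitable indication.
indicated = output_unit(signal_processing(transducer(25.0)))
print(indicated)  # the indicated value tracks the measured 25 degC
```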

2. Digital Instrumentation System

All the functional units that were used in an analog system are also used here. The basic operations in a digital system include handling analog signals, making the measurements, converting and handling digital data, programming, and control. The block diagram and functional units are given below.

Digital Instrumentation System
  • Transducer

All the physical input parameters, such as temperature, pressure, displacement, velocity, acceleration, and so on, are converted into proportional electrical signals.

  • Signal Conditioning Unit

The working of this unit is exactly the same as that of the signal processing unit in an analog instrumentation system. It also includes all the balancing circuits and calibrating elements.

  • Scanner/Multiplexer

Multiple analog signals are received by this device and are sequentially provided to a measuring instrument.

  • Signal Converter

It is used to convert an analog signal to a form acceptable to the analog-to-digital converter.

  • Analog-to-Digital (A-D) Converter

The analog signal is converted into a proportional digital signal. The output of an A-D converter is given to a digital display.

  • Auxiliary Equipment

All the system programming and digital data processing functions are carried out by this unit. The auxiliary equipment may be a single computer or may be a collection of individual instruments. Some of its basic functions include linearizing and limit comparison.

  • Digital Recorder

It is usually a CRO or a computer.
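The analog-to-digital conversion step at the heart of this system can be sketched as an ideal n-bit converter (the 0–5 V full scale and 8-bit resolution here are assumptions for the example, not values from the text):

```python
# Sketch of an ideal n-bit analog-to-digital converter quantizing a
# 0-5 V input. Full scale and bit count are illustrative assumptions.

def adc(voltage, full_scale=5.0, bits=8):
    """Return the digital code for a voltage on an ideal n-bit converter."""
    levels = 2 ** bits                                     # 256 codes for 8 bits
    code = int(voltage / full_scale * (levels - 1) + 0.5)  # round to nearest
    return max(0, min(levels - 1, code))                   # clamp to valid codes

print(adc(0.0))  # 0   (zero scale)
print(adc(2.5))  # 128 (midscale)
print(adc(5.0))  # 255 (full scale)
```

The digital code, proportional to the analog input, is what gets passed on to the display and the auxiliary equipment.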