
INTRODUCTION

This chapter defines measurement terms, describes basic measurement units, and discusses instruments and indicators. It also covers the international system of units (SI) and introduces the concept of error in measurement.


Presentation Transcript


  1. INTRODUCTION: MEASUREMENT, STANDARDS AND UNITS

  2. Chapter Objectives • To define key measurement terms • To describe basic measurement units and relate them to derived units • To characterize instruments • To differentiate between instruments and indicators

  3. DEFINITION & MEASUREMENT

  4. Definition • Instrumentation: The technology of measurement, which serves not only science but all branches of engineering, medicine, and almost every human endeavor. • Electronic Instrumentation: The application of measurement technology in electronics-related fields. • Instrument: A device or mechanism used to determine the present value of the quantity under measurement. • Measurement: The process of determining the amount, degree, or capacity of a quantity by comparison (direct or indirect) with an accepted standard of the system of units being used. • Accuracy: The degree of exactness (closeness) of a measurement compared to the expected (desired) value. • Resolution: The smallest change in a measured variable to which an instrument will respond.

  5. Definition • Precision: A measure of the consistency or repeatability of measurements, i.e. how little successive readings differ. (Precision is the consistency of the instrument output for a given value of input.) • Expected value: The design value, i.e. the most probable value that calculations indicate one should expect to measure. • Error: The deviation of the measured value from the expected (true) value. • Sensitivity: The ratio of the change in the output (response) of the instrument to the change in the input (measured variable).

  6. Measurement and Measurand • Measurement: The process of comparing an unknown quantity with an accepted standard quantity; determining the amount, degree, or capacity by comparison (direct or indirect) with the accepted standards of the system of units being used. • Measurand: The physical quantity being measured. Common measurands include displacement, strain, vibration, pressure, flow, temperature, force, and torque.

  7. Measurand • Displacement: A vector representing a change in position of a body or point with respect to a reference. • Strain: The relative deformation of elastic, plastic, and fluid materials under applied forces. • Vibration: Oscillatory motion, which can be described in terms of amplitude (size), frequency (rate of oscillation), and phase (timing of the oscillation relative to a fixed time). • Pressure: The ratio of a force acting on a surface to the area of the surface. • Flow: A stream of molten or liquefied material, which can be measured in terms of speed and quantity. • Temperature: A measure of the relative warmth or coolness of an object compared to a reference value. • Force: A quantity that changes the motion, size, or shape of a body. • Torque: The tendency of a force to rotate the body to which it is applied.

  8. Base Units The International System of Units (abbreviated SI, from the French le Système international d'unités) is the world's most widely used system of measurement, in both everyday commerce and science. The SI was developed in 1960 from the older metre-kilogram-second (MKS) system. The seven base units: Length – meter (m) Mass – kilogram (kg) Time – second (s) Electric current – ampere (A) Temperature – kelvin (K) Luminous intensity – candela (cd) Amount of substance – mole (mol)

  9. Derived Units Electric charge – coulomb (C) Electric potential difference – volt (V) Electric resistance – ohm (Ω) Electric capacitance – farad (F) Electric inductance – henry (H) Energy – joule (J) Force – newton (N) Magnetic flux – weber (Wb) Power – watt (W)
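
Each derived unit above decomposes into a product of base units (e.g. 1 N = 1 kg·m/s², 1 V = 1 W/A). A minimal sketch, not from the slides, that represents units as dictionaries of base-unit exponents and checks these relationships:

```python
# Illustrative sketch: derived SI units written as exponents of the base
# units (m, kg, s, A). The names and the divide() helper are our own.
NEWTON = {"kg": 1, "m": 1, "s": -2}            # N = kg·m/s²
JOULE  = {"kg": 1, "m": 2, "s": -2}            # J = N·m
WATT   = {"kg": 1, "m": 2, "s": -3}            # W = J/s
AMPERE = {"A": 1}
VOLT   = {"kg": 1, "m": 2, "s": -3, "A": -1}   # V = W/A

def divide(u, v):
    """Divide two units by subtracting base-unit exponents."""
    out = dict(u)
    for base, exp in v.items():
        out[base] = out.get(base, 0) - exp
        if out[base] == 0:
            del out[base]
    return out

assert divide(WATT, AMPERE) == VOLT    # V = W/A
assert divide(JOULE, NEWTON) == {"m": 1}  # J / N = m
```

The exponent-dictionary trick is the same bookkeeping used in dimensional-analysis checks of any measurement formula.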

  10. ERROR IN MEASUREMENT

  11. Direct Analysis Error is the degree to which a measurement departs from the expected value. It can be expressed as: • Absolute error: e = Yn − Xn • Percentage of error Accuracy can then be calculated from the error, where e = absolute error, Yn = expected value, and Xn = measured value.
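
The two error expressions above can be sketched directly in code (a hedged example, not from the slides; the 50 V expected / 49 V measured numbers are our own):

```python
def absolute_error(expected, measured):
    """e = Yn - Xn, taken as a magnitude: |Yn - Xn|."""
    return abs(expected - measured)

def percent_error(expected, measured):
    """%E = |Yn - Xn| / |Yn| * 100."""
    return absolute_error(expected, measured) / abs(expected) * 100

# Example: expected 50.0 V, measured 49.0 V
e = absolute_error(50.0, 49.0)    # 1.0 V absolute error
pct = percent_error(50.0, 49.0)   # 2.0 % error
```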

  12. Formula • Percentage of error: %E = |(Yn − Xn) / Yn| × 100 • Relative accuracy: A = 1 − |(Yn − Xn) / Yn| • Percentage of accuracy: a = 100% − %E = A × 100% where %E = percentage of error, e = absolute error, a = percentage of accuracy, A = relative accuracy, Yn = expected value, and Xn = measured value.
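
The accuracy formulas follow the same pattern; a short sketch (our own example values, assuming the formulas as reconstructed above):

```python
def relative_accuracy(expected, measured):
    """A = 1 - |(Yn - Xn) / Yn|."""
    return 1 - abs((expected - measured) / expected)

def percent_accuracy(expected, measured):
    """a = 100% - %E = A * 100."""
    return relative_accuracy(expected, measured) * 100

# Example: expected 100.0, measured 98.0
A = relative_accuracy(100.0, 98.0)   # 0.98
a = percent_accuracy(100.0, 98.0)    # 98.0 %
```

Note that a and %E always sum to 100%, so computing either one gives the other.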

  13. Statistical Analysis Statistical analysis can be used to determine the uncertainty of test results. The analysis requires a large number of measurements (data) to be taken. • Arithmetic mean: x̄ = (x1 + x2 + … + xn) / n, where xn is the nth reading and n is the total number of readings. • Deviation from the mean: dn = xn − x̄, the deviation of the nth reading from the arithmetic mean. • Average deviation: Dav = (|d1| + |d2| + … + |dn|) / n. It indicates the precision of the instrument used; a lower average deviation indicates a highly precise instrument. • Standard deviation: σ = √(Σ dn² / (n − 1)). A small standard deviation means the measurements are tightly clustered, i.e. more repeatable.
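
The four statistics above can be sketched as plain functions (the five voltage readings are an invented example):

```python
import math

def arithmetic_mean(data):
    """x-bar = (x1 + ... + xn) / n."""
    return sum(data) / len(data)

def deviations(data):
    """dn = xn - x-bar, one deviation per reading."""
    m = arithmetic_mean(data)
    return [x - m for x in data]

def average_deviation(data):
    """Dav = (|d1| + ... + |dn|) / n."""
    return sum(abs(d) for d in deviations(data)) / len(data)

def standard_deviation(data):
    """Sample standard deviation: sqrt(sum(dn^2) / (n - 1))."""
    n = len(data)
    return math.sqrt(sum(d * d for d in deviations(data)) / (n - 1))

readings = [49.7, 50.1, 50.2, 49.6, 49.9]   # hypothetical voltmeter readings
mean = arithmetic_mean(readings)            # 49.9
dav = average_deviation(readings)           # 0.2
```

The n − 1 denominator (rather than n) is the usual sample form, used because the mean itself is estimated from the same data.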

  14. Source of Error Errors in measurement fall broadly into three categories: • Gross errors • Systematic errors • Random errors Gross Errors Caused by human mistakes, for example: • Improper or incorrect installation or use of a measurement instrument. • Failure to eliminate parallax when reading or recording the measurement. Gross errors cannot be remedied mathematically.

  15. Systematic Errors Caused by the instrument or its environment. Three types: • Instrumental errors • Environmental errors • Observational errors Systematic errors produce a constant, uniform deviation. Random Errors Occur when repeated measurements of the same quantity give results that differ in magnitude or sign. Their effect can be minimized by taking the measurement many times, and they can be handled mathematically (statistically).
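
Why does repeating the measurement help against random error but not systematic error? Because zero-mean random deviations tend to cancel when averaged. A small simulation (our own, with an assumed true value of 50.0 and Gaussian noise of standard deviation 0.5):

```python
import random

random.seed(0)          # fixed seed so the sketch is reproducible
TRUE_VALUE = 50.0

def measure():
    """One simulated reading: true value plus zero-mean random noise."""
    return TRUE_VALUE + random.gauss(0, 0.5)

single = measure()
averaged = sum(measure() for _ in range(1000)) / 1000
# 'averaged' lands much closer to TRUE_VALUE than a typical single
# reading: the random deviations largely cancel over 1000 samples.
```

A systematic offset (say, every reading reads 0.3 V high) would survive this averaging untouched, which is why it must instead be found and corrected by calibration.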

  16. CLASSIFICATION OF INSTRUMENTS

  17. Absolute vs. Secondary • Absolute: Give the magnitude of the quantity under measurement in terms of the physical constants of the instrument. • Secondary: Give the magnitude of the quantity under measurement only from observation of the instrument's output. Most instruments used in practice are secondary.

  18. Operation Type • Deflection: Only one input (the measurand) is required. The output reading is based on the deflection from the instrument's initial condition, and the measured value depends on the calibration of the instrument. • Null: Requires two inputs, the measurand and a balance input, and must have a feedback operation that compares the measurand with a standard value. Null-type instruments are more accurate and sensitive than deflection-type instruments.

  19. Signal Type • Analog: Produces a signal that varies in a continuous way, taking an infinite number of values in any given range. • Digital: Produces a signal that varies in discrete steps, taking only a finite number of different values in a given range.
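
The "discrete steps" of a digital instrument can be illustrated by quantization, i.e. snapping a continuous value to the nearest step (a hedged sketch; the step size of 0.01 is an arbitrary choice):

```python
def quantize(value, step):
    """Snap a continuous value to the nearest multiple of 'step',
    mimicking how a digital instrument reports in finite increments."""
    return round(value / step) * step

analog_reading = 3.14159            # continuous (infinitely many values)
digital_reading = quantize(analog_reading, 0.01)   # one of finitely many steps
```

In a range of 0–10 with a step of 0.01 there are only 1001 possible digital readings, whereas the analog signal can take any value in between; the step size is one way of stating the instrument's resolution.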

  20. INSTRUMENT ELEMENTS

  21. Model • The key element is the sensor, which converts the physical variable into a signal variable. • The signal variable can be displayed, recorded, or integrated into a secondary instrument system. • The signal variable may also be used as an input signal to a control system.

  22. Block Diagram

  23. Block Diagram (Simplified)

  24. Subsystems • Transducers • Power Supply • Signal Conditioning Circuits • Filter / Amplifier • Data Processors • Process Controller • Command Generator • Recorder

  25. Elements of Electronic Instrumentation • Transducers: Devices that convert a change in a physical quantity into a change in electrical signal magnitude. • Power Supply: Provides the energy to drive the transducers. • Signal Conditioning Circuits: Electronic circuits that manipulate and convert the output of the transducers into a more usable electrical signal.

  26. Elements of Electronic Instrumentation (cont.) • Amplifiers: Amplify low-voltage signals from the transducers or signal conditioning circuits. • Recorders: Display the measurement for easy reading and interpretation. • Data Processors: Can be a microprocessor or microcontroller.

  27. Elements of Electronic Instrumentation (cont.) • Process Controllers: Monitor and adjust a quantity to a specified level or value. • Command Generators: Provide a control voltage representing the desired value of the parameter in a given process.

  28. APPLICATION AREA

  29. APPLICATION AREA • Engineering Analysis • Process Control • Monitoring • Automation

  30. APPLICATION AREA • Engineering Analysis: To validate new designs of structures, components, or systems through theoretical and experimental approaches. • Process Control: Monitoring provides real-time data that allow an operator to respond; automation provides real-time feedback data directly to the control system.
