Lesson 11 Calibration
Engineers turned first to parts of the body for measuring instruments. However, these were not positive, fixed dimensions or standards. Measurements must be standard to mean the same thing to everyone.
In 1793, the French government adopted a new system of standards called the metric system. In 1824, the English Parliament legalized a new standard yard which had been made in 1760. It was a brass bar containing a gold button near each end.
What is Calibration?
• The act of comparing an instrument's measuring accuracy to a known standard
• The process of adjusting an instrument to establish a set of values for correct operation
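To make the idea concrete, here is a minimal sketch of a two-point calibration in Python. It assumes a hypothetical instrument whose raw readings are compared against two known reference standards; the gain and offset derived from that comparison are then used to correct later readings. The reading values and reference points below are illustrative placeholders, not data from any particular instrument.

```python
# Two-point calibration sketch: compare raw instrument readings taken at two
# known reference standards, then derive a linear correction (gain and offset).
# All numbers below are illustrative, not real instrument data.

def fit_two_point(raw_low, raw_high, ref_low, ref_high):
    """Return (gain, offset) so that corrected = gain * raw + offset."""
    gain = (ref_high - ref_low) / (raw_high - raw_low)
    offset = ref_low - gain * raw_low
    return gain, offset

def calibrate(raw, gain, offset):
    """Apply the linear correction to a raw reading."""
    return gain * raw + offset

# Example: the instrument reads 2.1 at a 0.0-unit standard
# and 98.7 at a 100.0-unit standard.
gain, offset = fit_two_point(raw_low=2.1, raw_high=98.7, ref_low=0.0, ref_high=100.0)

raw_reading = 51.3
print(f"raw = {raw_reading}, corrected = {calibrate(raw_reading, gain, offset):.2f}")
```

The same comparison-then-adjustment pattern applies whether the correction is a simple offset, a linear fit as above, or a more elaborate curve.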
Going as far back in time as Noah's ark, the lack of a yardstick was not a serious drawback. Because most measuring was done by one craftsman completing one job at a time, rather than making a number of articles piecemeal to be assembled later, it didn't make much difference how accurate the measuring sticks were or even how long they were. Generally, it doesn't make much difference how long a mile, yard, or inch is, or how heavy a pound or ounce is. What is really important is that everyone means the same thing when referring to each unit of measurement. Measurements must be standard to mean the same thing to everyone.

The cubit of Noah's time was the length of a man's forearm, or the distance from the tip of the elbow to the end of his middle finger. Many times this was useful, because it was readily available, convenient, and couldn't be mislaid. However, it was not a positive, fixed dimension or a standard. While the cubit is no longer used as a unit of measurement, there are many customary standards that originated in about the same way. Our foot-rule started out as the length of a man's foot, so in the early days of history the foot varied in length, sometimes by as much as 3 or 4 inches.

Once the ancients started using arms and feet for measuring distance, it was only natural that they also thought of using fingers, hands, and legs. They also may have discovered that some surprising ratios existed in body measurements. What is now called an inch originally was the width of a man's thumb. It also was the length of the forefinger from the tip to the first joint. Twelve times that distance made a foot. Three times the length of the foot was the distance from the tip of a man's nose to the end of his outstretched arm, a distance that very closely approximates what is called the yard. Two yards equaled a fathom which, thousands of years ago, was the distance across a man's outstretched arms. Half a yard was the 18-inch cubit, and half a cubit was called a span, the distance across the hand from the tip of the thumb to the tip of the little finger when the fingers were spread out as far as possible. A hand was half a span. For thousands of years, this was the way people measured comparatively short distances.

Each succeeding civilization added its bit to mankind's knowledge, building an accumulation of measuring standards and techniques. Some contributed weight measures. Others showed us how to measure time. Still others gave us methods for surveying big areas of land and establishing boundaries.

In techniques for measuring weights, the Babylonians made important improvements upon the invention of the balance. Instead of just comparing the weights of two objects, they compared the weight of each object with a set of stones kept just for that purpose. In the ruins of their cities, archaeologists have found some of these stones, finely shaped and polished. It is believed that these were the world's first weight standards. The Babylonians used different stones for weighing different commodities. In modern English history, the same basis has been used for weight measurements. For the horseman, the "stone" weight was 14 pounds. In weighing wool, the stone was 16 pounds. For the butcher and fishmonger, the stone was 8 pounds. The only legal stone weight in the imperial system was 14 pounds.

The Egyptians and the Greeks used a wheat seed as the smallest unit of weight, a standard that was very uniform and accurate for the times. The grain is still in limited use as a standard weight.
However, wheat seeds are no longer actually put in the pan of the balance scale. Instead, a weight that is practically the same as that of an average grain of wheat is arbitrarily assigned to the grain.

The Arabs established a small weight standard for gold, silver, and precious stones, which very often were a part of trade or barter deals. To weigh these small, valuable quantities, they used as a weight standard a small bean called a karob. This was the origin of the word carat, which jewelers still use to express the weight of gems and precious metals.

In trading between tribes and nations, many of these methods for measuring weights and distances gradually became intermixed, particularly by the Romans, who spread this knowledge throughout the known world at that time while also adding some standards of their own. As the Roman soldiers marched, they kept track of the distance they traveled by counting paces. A pace was the distance covered from the time one foot touched the ground until that same foot touched the ground again, or the length of a double step.
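As a brief summary of the history above, the body-based length relationships the lesson describes can be collected in a small Python sketch. The ratios simply restate what the text says (inch, foot, yard, fathom, cubit, span, hand); they are illustrative approximations from the lesson, not modern legal definitions.

```python
# Approximate length ratios as described in the lesson, expressed in inches.
# These restate the text's body-based relationships; they are illustrative,
# not modern definitions (e.g., today's "hand" is fixed at 4 inches).
UNITS_IN_INCHES = {
    "inch":   1.0,        # width of a man's thumb
    "foot":   12.0,       # twelve thumb-widths
    "yard":   3 * 12.0,   # three feet, nose to outstretched fingertip
    "fathom": 2 * 36.0,   # two yards, across the outstretched arms
    "cubit":  18.0,       # half a yard, elbow to middle fingertip
    "span":   9.0,        # half a cubit, thumb tip to little-finger tip
    "hand":   4.5,        # half a span (per the lesson's account)
}

def convert(value, from_unit, to_unit):
    """Convert between the historical units using the ratios above."""
    return value * UNITS_IN_INCHES[from_unit] / UNITS_IN_INCHES[to_unit]

# Example: how many spans make up a fathom? 72 / 9 = 8.
print(convert(1, "fathom", "span"))
```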