

  Measurement, Test and Calibration
  How to calibrate Dial Indicator

Author Topic:   How to calibrate Dial Indicator
brossbach
unregistered
posted 24 April 2001 08:22 AM
Can someone show me or explain, step by step, how to calibrate a dial indicator the correct way? I'm having trouble figuring out how to calibrate correctly. I need help!


Jerry Eldred
Forum Wizard

Posts: 136
From:
Registered: Dec 1999

posted 24 April 2001 09:04 AM
I can give you the basics here. If you will contact me via email, I can probably scrounge up a procedure.

There are differing opinions as to whether dial indicators even need calibration, since they are relative indicators (delta measurements only, not absolute length measurement). Even the Air Force has a variety of recommendations. One of these is just to check the repeatability and smooth action of the gears, as they don't "drift" - they only get worn or broken.

To calibrate you need a gage stand, a surface plate, and appropriately sized gage blocks. The dial indicator is mounted on the gage stand on the surface plate, a zero is set on the dial, then gage blocks are inserted and readings are taken to verify that the indicator is functioning properly. Check for smooth, easy movement of the measuring rod on the indicator.

That's an over-simplified version of the method. If you email me, I can look to see if I have a procedure. Send me the manufacturer and model of the indicators you are calibrating.
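To put rough numbers on that basic gage-block check, here is a minimal Python sketch. It is not from the post above; the 0.001 in graduation, block sizes, and readings are made-up illustration values. It simply compares each indicator reading against the gage block nominal and flags anything more than one graduation out.

# Hypothetical gage-block check for an (assumed) 0.001 in graduation dial indicator.
# Block sizes and readings are illustration values only, not real data.
GRADUATION = 0.001  # inches per graduation

# (gage block nominal, indicator reading) pairs taken after zeroing on the stand
readings = [
    (0.100, 0.1005),
    (0.200, 0.2000),
    (0.300, 0.2995),
]

for nominal, indicated in readings:
    error = indicated - nominal
    verdict = "PASS" if abs(error) <= GRADUATION else "FAIL"
    print(f"{nominal:.3f} in block: error {error:+.4f} in -> {verdict}")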



AJLenarz
Forum Contributor

Posts: 25
From:Princeton, IN, USA
Registered: Feb 2001

posted 24 April 2001 09:43 AM
Brossbach,

I don't consider myself a gage calibration guru. Ultimately you need to develop a calibration procedure that fits your company's needs. Below is just an example of a calibration procedure I have observed elsewhere.

Vertically operating dial indicators will be checked for accuracy and repeatability using a precision bench micrometer in a horizontal position (a special holding fixture may be constructed for the purpose). Horizontal test-type indicators may be checked for accuracy and repeatability using a micrometer height gage or gage block stack-ups. In addition, each indicator will be inspected for smoothness of operation, spindle looseness, proper spring tension, and backlash when reversing direction. The following steps must be taken in inspecting dial indicators:

1) Check for sticking. Move the spindle slowly by hand from the rest position to the maximum limit of travel and back.
2) Check for spindle looseness by pushing the spindle back and forth in a direction perpendicular to its axis. Check for rack pin side play by attempting to rotate the spindle, and record any deflection.
3) Note whether the return spring pressure is excessive.
4) Dial indicator: Set the indicator up in a precision bench micrometer in a horizontal position so that the spindle is in line with the movable anvil and bears on this anvil. Advance the micrometer head to different points and note whether the changes in the indicator are the same as shown on the micrometer head. Reverse direction and check at different points in the other direction. Note whether there is any backlash and record the results on the calibration record.

Test indicator: Mount the indicator on a surface gage or height gage and zero it on the micrometer height master. Rotate the micrometer head to different points and note whether the changes in the indicator are the same as shown on the micrometer head. Reverse the direction of the micrometer head and check at different points in the opposite direction. Note whether there is any backlash and record the results on the calibration record.

Acceptance limits: Each indicator must meet the following requirements. A) The spindle is not to stick at any point across the entire range of the indicator. B) Spindle play and rack pin side play must not exceed 0.0002 inch deflection in the dial reading. C) There is to be no excessive spring pressure.

Accuracy of calibration will be within plus or minus one unit of graduation at any point covering the entire range of the indicator. The indicator is to repeat to zero at rest. On indicators graduated to 0.0001 inch or less the acceptable tolerance for this requirement is one graduation. On indicators where the graduations are 0.0005 inch or more the tolerance is 0.0002 inch.
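As a rough illustration of how those limits could be applied on a check sheet, here is a short Python sketch of my own. The handling of graduations between 0.0001 in and 0.0005 in is my assumption, since the post does not spell that case out.

def repeat_to_zero_tolerance(graduation_in):
    """Allowed repeat-to-zero tolerance in inches, per the limits quoted above."""
    if graduation_in <= 0.0001:
        return graduation_in   # one graduation
    if graduation_in >= 0.0005:
        return 0.0002          # fixed 0.0002 in
    return graduation_in       # assumption for in-between graduations

def accuracy_ok(error_in, graduation_in):
    """Accuracy: within +/- one graduation at any point over the range."""
    return abs(error_in) <= graduation_in

# Example: a 0.0005 in graduation indicator
print(repeat_to_zero_tolerance(0.0005))   # 0.0002
print(accuracy_ok(0.0004, 0.0005))        # True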


DICKIE
Forum Contributor

Posts: 46
From:Romulus, MI, USA
Registered: Feb 2001

posted 24 April 2001 01:43 PM
I agree with what everyone has said, but would like to add that for the first 2-1/2 revolutions of indicator travel, measurements should be taken every quarter turn to check pinion gear and transfer gear accuracy. Then measurements can be taken at several intervals up to full scale; this verifies the accuracy of the rack. This holds true for all dial gages (calipers, bore gages, etc.).
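To see what that spacing works out to, here is a small Python sketch of my own; the 0.100 in per revolution and 1.000 in total range are assumed values, not something stated in the post, and the coarse interval choice is arbitrary.

# Assumed indicator geometry: 0.100 in per revolution, 1.000 in total range.
PER_REV = 0.100
FULL_SCALE = 1.000

# Quarter-turn test points through the first 2-1/2 revolutions
fine_points = [round(i * PER_REV / 4, 3) for i in range(1, int(2.5 * 4) + 1)]

# A few coarser points out to full scale to check the rack
coarse_points = [0.400, 0.600, 0.800, FULL_SCALE]

print("quarter-turn points:", fine_points)   # 0.025, 0.050, ... 0.250
print("full-scale points:  ", coarse_points)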


Ryan Wilde
Forum Contributor

Posts: 20
From:Mineola, NY, USA
Registered: Feb 2001

posted 25 April 2001 05:12 PM
You might want to get ahold of a copy of ANSI B89.1.10M, which states all of the accuracy, repeatability, and hysteresis (backlash) tolerances. The height stand method works well, but fails to provide hysteresis (backlash) data, which may be fine as long as you aren't performing runout-type measurements that require ascending and descending readings.

Ryan
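If you do need ascending/descending data, one simple reduction is to pair readings taken at the same nominal point going up and coming back down. The Python sketch below is my own illustration with made-up numbers; it is not drawn from ANSI B89.1.10M.

# Made-up ascending (inward) and descending (outward) readings, in inches,
# taken at the same nominal points.
ascending  = {0.012: 0.0120, 0.024: 0.0241, 0.036: 0.0360}
descending = {0.012: 0.0122, 0.024: 0.0243, 0.036: 0.0361}

for point in sorted(ascending):
    hysteresis = descending[point] - ascending[point]
    print(f"{point:.3f} in: hysteresis {hysteresis:+.4f} in")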


Jeremy S
unregistered
posted 09 May 2001 11:25 PM
DIAL / DIGITAL INDICATORS (Spindle Type)

Calibration Frequency: 12 months
(can be modified according to stability, purpose and usage)

1.0 Scope:
This method describes the calibration of spindle-type dial/digital indicators.
Measuring range: 0 to 5 inches (0 to 125 mm).

2.0 References: This document is based on the NAVAIR procedure.

3.0 Definitions: TI: Test Instrument
NAVAIR: Naval Air Systems Command

4.0 General Requirements:
Environment:
• Temperature: Change should not exceed 2 deg. F/h (1 deg. C/h).
68 ± 1 deg. F (20 ± 0.5 deg. C) is required for tolerances smaller
than ± 0.0001 inch (± 0.0025 mm).
• Humidity: No excessive humidity.
• Air quality: N/A.

Stabilization:
• Stabilize equipment and standard at ambient temperature.
A minimum of one hour is recommended.

Preliminary Operations:
• Clean the TI.
• Inspect the TI for damage such as nicks or burrs.
• Ensure that the mechanical action of the gaging mechanism is smooth, with no
evidence of sticking or binding.
• Ensure that the gaging tip is not loose.
• For dial indicators having a revolution counter, ensure that the counter reads
within ± 0.5 division of its zero position when the dial is zeroed.
• Ensure that the indicator is perpendicular to the standard.

Standards and Calibrating Equipment:
• The standards and equipment used must have a valid calibration certificate.

5.0 Equipment: The following equipment is considered a minimal requirement and any
equivalent equipment may be used.

1. Grade 0 gage blocks
(for a resolution greater than 0.0001 in. (0.002 mm), Grade 2 gage blocks
could be used),
or an indicator calibrator, uncertainty ± 0.00005 inch (0.001 mm),
or a supermicrometer, uncertainty ± 0.00005 inch (0.001 mm).
2. Magnifying glass or microscope.
3. Indicator fixture or support (if required).
4. Grade A / Class 1 surface plate (if required).

6.0 Calibration Process: Use only the portion of the calibration method applicable
to the TI, and use manufacturer specifications and limits when available.

Item 1 - Graduations
Acceptance limits: Good contrast.
Test method: Visual.

Item 2 - Wear of gaging tip
Acceptance limits: No flat on the spherical or ball gaging tip.
Test method: Magnifying glass or microscope.

Item 3 - Repeatability: verify at 25, 50, and 75 percent of full scale (5 readings).
Acceptance limits: Manufacturer's specifications, or Digital: ± 1 graduation, Dial: ± 0.2 graduation.
Test method: Gage block, indicator calibrator, or supermicrometer.
(A worked sketch follows after this table.)

Item 4 - Linearity: verify at a minimum of 4 positions (8 positions is recommended), in the inward and outward directions.
Acceptance limits: Manufacturer's specifications, or ± 1 or ± 2 graduations.
Test method: Gage block, indicator calibrator, or supermicrometer.
Zeroing - Digital: zero the TI with a pre-load of 5 to 10 percent of range.
Zeroing - Dial: zero the TI with a pre-load of 1 turn of the pointer on the dial.
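For item 3, here is a minimal Python sketch of my own. The readings are invented, and taking the spread of the five readings against 0.2 graduation is just one way to read the dial-indicator limit above.

GRADUATION = 0.0005  # assumed dial graduation, inches

# Five invented readings taken at 50 percent of full scale
readings = [0.0250, 0.0250, 0.0251, 0.0250, 0.0250]

spread = max(readings) - min(readings)
limit = 0.2 * GRADUATION  # dial indicator: +/- 0.2 graduation (one interpretation)
print(f"spread {spread:.4f} in, limit {limit:.4f} in ->",
      "PASS" if spread <= limit else "FAIL")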

Example of a dial indicator:

Resolution: 0.0005   Dial reading: 0-20   Range: 0.050

Test points (inward): 0.012, 0.024, 0.036, 0.048
Test points (outward): 0.036, 0.024, 0.012 and 0.000
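To show how those inward/outward points might be reduced against a ± 1 graduation limit, here is a short Python sketch of my own; the indicator readings are invented for illustration.

GRADUATION = 0.0005  # resolution from the example above, inches

# nominal point -> invented indicator reading
inward  = {0.012: 0.0120, 0.024: 0.0243, 0.036: 0.0360, 0.048: 0.0480}
outward = {0.036: 0.0362, 0.024: 0.0240, 0.012: 0.0118, 0.000: 0.0001}

def check(points, label):
    for nominal, reading in points.items():
        error = reading - nominal
        verdict = "PASS" if abs(error) <= GRADUATION else "FAIL"
        print(f"{label} {nominal:.3f} in: error {error:+.4f} in -> {verdict}")

check(inward, "inward")
check(outward, "outward")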

7.0 Notes:

1. Record readings and any maintenance performed, such as servicing, adjustment,
repairs or modifications.

