MarkJoel said:
Claes,
Thanks for the information, but it is not what I was looking for.
I am trying to create something like a publication history, but for issues, events, phone conversations, etc. that deviate from the project scope. This document would basically be used internally and not released to a customer. QA and management could use it to track quality issues, for example checklist items not completed, with the specific reason why and what needs to be incorporated into the next document release.
Does this help?
It is basically a cover-your-butt type of document, so that things can be tracked not only from a project perspective but also from a document or information-developer perspective.
First, let me echo Claes's welcome.
I agree with Claes that the closest mention in ISO of the system you propose is:
b) to review and update as necessary and re-approve documents,
Next, help me clarify the basic issue of this "errata" sheet you and your organization want to prepare.
First, if there is a possibility the document will be revised and reissued in the future, these "points" which arise can simply be attached to the in-house copy as annotations, for reference when the document is reviewed for revision.
That review period can be short (minutes or hours) or long (months or years), depending on the criticality and nature of the "points" which are discovered. (A copy of the annotations can also go into the system discussed in the next paragraph.)
If the document will probably NEVER be revised and re-issued, you have two choices:
- issue an "errata" sheet as an addendum to the original document when such errata may have an effect on the use or effect of the document (calculation errors, misplaced decimals, illegible graphics, etc.)
- send the errata sheet (or copies of the annotations) to a special internal file maintained by the Quality department, which will examine and parse the information to identify trends in errata. Those trends may be addressed with new or modified training, as "continual improvement," to reduce the number and severity of such errata in future documents.
Let me emphasize that distributing raw errata to document authors without filtering for trends makes it difficult for those authors to distill the information into personal "rules" for future authoring. The task with the most value for your organization, as I see it, is determining which errata are "systemic" and which are rare "outliers" with no probative value for creating or modifying a training system.
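The filtering step above, separating systemic errata from outliers, amounts to a simple tally by category. A minimal sketch (the log entries, category names, and threshold are all illustrative assumptions, not anything your organization actually uses):

```python
from collections import Counter

# Hypothetical errata log: (document_id, category) pairs, as a QA
# department might accumulate them in its internal file.
errata_log = [
    ("DOC-101", "illegible graphic"),
    ("DOC-102", "misplaced decimal"),
    ("DOC-103", "illegible graphic"),
    ("DOC-104", "illegible graphic"),
    ("DOC-105", "calculation error"),
]

def split_systemic_outliers(log, threshold=3):
    """Count errata by category. Categories occurring at least
    `threshold` times are treated as systemic (candidates for
    training changes); the rest are treated as outliers."""
    counts = Counter(category for _, category in log)
    systemic = {c: n for c, n in counts.items() if n >= threshold}
    outliers = {c: n for c, n in counts.items() if n < threshold}
    return systemic, outliers

systemic, outliers = split_systemic_outliers(errata_log)
print(systemic)   # {'illegible graphic': 3}
print(outliers)   # {'misplaced decimal': 1, 'calculation error': 1}
```

The threshold is a policy decision, not a technical one: set it low and you chase noise, set it high and you miss emerging patterns.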
The "reasons" you give in your first post
- lack of time
- no spell check performed - customer request
- graphics not readable
only describe one erratum (error): "graphics not readable." The others have no value per se. "Lack of time" means nothing unless it is tied to a specific error committed because of rushing (VERY difficult to make that connection, by the way). "No spell check performed" means nothing unless it is tied to spelling errors that change the intended meaning or render the document unintelligible.
For example, we could transpose the homonyms for "site" in the following sentence and it would still be intelligible to most people, thus not destroying the value of a document:
Correct: At first sight, the web site seemed an ideal source to cite in the research paper.
Incorrect, but still intelligible (if not credible): At first cite, the web sight seemed an ideal source to site in the research paper.
(Although a literalist might quibble if the sentence were taken out of context as this one is.)
I can make the case that the author of the second sentence could have meant: At the first citation (the noun form of the verb "cite," which is often shortened to "cite"), the sight I saw on that particular web page would be an ideal source to site (put) in the research paper.
By the way: CYA is not a Quality function. The point is to use the discovery and detection of errors as a springboard for improvement.