Software Process Implementation Problems

CATERAF

Hallo,

I am having a rather difficult time implementing ISO 9001 clauses 7.3.2 and 7.3.3 with our software development team and am seeking some guidance.
Just as a bit of background, we are a small company of <15 (4 of which are software) and do design, development, production, etc.

We've just started using the program Team Foundation Server as a way to help us achieve ISO so the software team don't need to generate word/wiki documents and can spend more time coding (read: not documenting; they have an allergy to documentation).

However! What I thought would be smooth sailing really isn't..

We first got the software team to generate their requirements. This is something they have never done (other than a general requirement of 'the customer needs xx product') but they started (hoorah!). However, they then realised that they've got a huge load of requirements to meet, so rather than documenting them individually they're trying to shortcut the process by 'grouping' them into the work they want to do. This is okay with me as long as they're capturing what is required in the product (and reviewing it, etc.).

I then asked how we were going to make sure their design meets the requirements. Their plan is to do two global design documents that are based on the requirements but don't show traceability to the requirements (e.g., they won't show that paragraph #20 in the design meets requirement #6). My understanding of 7.3.3 is that the design must meet the input requirements. Does it need to be to this level of traceability though?
They implied that they will write the design guidelines with the requirements in mind, and then, as they work on the requirement and change it through different states ('new' to 'in progress' to 'done') they'll be taking into consideration the tasks implemented, requirements, design guidelines and tests. Would this count as verification against the design and development inputs?

I don't feel like it is, because it's all just 'inferred', but they are so adamant about doing the bare bones of documentation. Their reasoning is that they're a small team (a four-man team) and it's not feasible to do extensive documentation... (which may be true; I'm not sure how much to expect from them as a small team?).

As I'm not trained in ISO/QA but am learning as I go I feel like it's a case of the blind leading the blind and don't want to lead them off down the wrong path.

Any advice about getting their compliance and about how to fulfill this design dilemma would be greatly appreciated please! :)
 

sagai

Quite Involved in Discussions
There is a guidance document for ISO 9001 implementation in software companies; it is called ISO 90003. If you are looking purely for compliance, then I suggest having a read of it.
Cheers ! :bigwave:
 

John Broomfield

Leader
Super Moderator
CATERAF,

Before we discuss design inputs per 7.3.2 and design outputs per 7.3.3 did you engage the software design team in planning the design including its architecture and its validation?

Some of the design inputs influence the design itself (the architecture, coding and user interface) and other design inputs provide the basis for verification and validation. Obviously the designers/coders need to understand these inputs so they can accept them and use them or clearly reject them in the first design review after collating the design inputs.

Knowing how the design output (mostly the software itself but including the user manual) is going to be validated from the planning stage allows the design team to think and act preventively as they create.

Stop emphasizing documentation. You may need to return to the social activity of planning to engage the team in seeing the big picture and understanding the needs of users and others who should benefit from their work. Once they understand this let the team decide what must be documented.

John
 
pldey42

I think you've created a conflict situation. According to the original post, you adopted "Team Foundation Server as a way to help us achieve ISO so the software team don't need to generate word/wiki documents and can spend more time coding," yet you're trying to impose documentation - which, contrary to common practice, isn't mandated in ISO 9001.

For 7.3.2 the requirements must be determined and recorded. While that's often done with documented requirements specs (documents) this is not the only way. Agile, for example (a methodology that TFS claims to support with templates) uses "Stories" to capture requirements.

For 7.3.3 you need the outputs of design and development - the code, mainly - to be in a form suitable for verification against the Stories, and TFS appears to support that, e.g. with code reviews and "Test Cases."

More at https://msdn.microsoft.com/en-US/library/vstudio/dd380647
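As a lightweight illustration of what "verification against the Stories" can look like in practice (the story IDs and test names below are invented for the sketch; this is not the TFS API, just the underlying idea), traceability can be as simple as a recorded mapping from stories to the test cases that verify them, checked for coverage:

```python
# Hypothetical story-to-test mapping; the IDs and names are illustrative only.
story_tests = {
    "STORY-6": ["test_login_rejects_bad_password", "test_login_locks_after_retries"],
    "STORY-7": ["test_report_exports_csv"],
    "STORY-8": [],  # no verifying test recorded yet
}

# Coverage check: every story should have at least one test providing
# objective evidence that the output meets that input requirement.
uncovered = [story for story, tests in story_tests.items() if not tests]
print(uncovered)  # → ['STORY-8']
```

In a tool like TFS the same linkage would live in work-item links rather than a script, but the audit question it answers is identical: which requirements have no verification evidence?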

It all needs to be under version control and with configuration management for product versions and variants, but I guess that's all part of TFS.

While I've not used either TFS or Agile, it seems to me that you could map a lifecycle methodology that TFS supports (sounds like either Agile or SCRUM might sit well with your developers) into 7.3.2 and 7.3.3 and meet the intent and the letter of the standard whilst avoiding the dreaded word "documentation."

If you go for formal certification you'll need an auditor sufficiently familiar with incremental, iterative lifecycle methods to understand the mapping.

Hope this helps,
Pat
 
CATERAF

Thanks for all the responses!

I missed some key information in my first post it seems!

Firstly, I've been trying for an 'improvement' approach rather than a 'document it for ISO' one, but the team keep reinforcing their own idea that ISO = documentation (e.g., I suggest that requirements are a good idea because of x, y, z, and then a team member speaks up with 'so basically we need to document it'). They agree that the process is useful, but anything that reduces their code output (such as time writing requirements or testing code) is a problem, because they think that 'output' is what matters most, not meeting customer requirements. Thankfully our manager said 'we are following the process' and that we are about 'meeting customer requirements', so that's been helpful.

Secondly, we're using an agile/scrum (bit from both really) methodology and use user stories for our requirements.

I think my real problem is that I'm not clear on what they must document, so they're not clear either, understandably... (which leads me to the next bit)

For 7.3.2 the requirements must be determined and recorded. While that's often done with documented requirements specs (documents) this is not the only way. Agile, for example (a methodology that TFS claims to support with templates) uses "Stories" to capture requirements.

Yes, this is what we're using. We are using an Agile methodology and we have written our requirements in a story-like way. I left it up to one of our software team to decide how they'd like to write the requirements and they chose stories. This is all in TFS to make it easier for them.

For 7.3.3 you need the outputs of design and development - the code, mainly - to be in a form suitable for verification against the Stories, and TFS appears to support that, e.g. with code reviews and "Test Cases."

Thanks -- this is one area where I was very grey. I didn't know what the 'output' was. It could be either the broad design documents, or the code, or...
My understanding was that the design is both the charts and drawings that they draw before the code, and the code itself. Perhaps not...

So, if I understand correctly, what you're saying is that as long as we can show that the code meets the requirements, we would fulfil the clause? I hope so, because that's what we're on track for.

As a side note, John said "Some of the design inputs influence the design itself (the architecture, coding and user interface) and other design inputs provide the basis for verification and validation. Obviously the designers/coders need to understand these inputs".

Are you suggesting that the design inputs are separate from the requirements?

Thanks for all the advice... anything you have is much appreciated. I'm feeling out of my depth!
 

John Broomfield

Leader
Super Moderator
CATERAF,

All of the design inputs are requirements and these requirements must be analyzed for conflicts and feasibility.

This Wikipedia page "has some issues", but you may agree it offers some insights into the analysis of design requirements:

https://en.m.wikipedia.org/wiki/Requirements_analysis

Perhaps the team can first agree upon the criteria for deciding what must be documented?

John
 
pldey42

"Thankfully our manager said 'we are following the process'"

Hmm ... but the risk is that if the engineers do not see the process as useful, they'll subvert it. I once saw a manager pretend to deal with changing customer requirements and endless delays due to bug fixing by saying "We freeze the design." Which was unrealistic because it had to change to accommodate changed requirements and fix design errors. So the engineers changed the code and left the design frozen, and out of date. By over-controlling, the manager actually lost control because the frozen design gave him a false sense of stability.

So for every document you insist upon, there has to be a need that they buy into. That's why I would map their implementation of SCRUM or Agile into ISO 9001 and not do it the other way around.

If the software they're writing involves uncertainty, the incremental model of Agile and SCRUM is valuable, if harder to manage because there's no waterfall. If that's a problem for management and deadlines, invite the engineers to help find a solution using Agile or SCRUM ideas as appropriate. As John says, defining a process is about figuring out a way to work together such that engineering and management needs are satisfied.

Outputs from design include the code (of course) and any intermediates like architecture, design, stories (I knew them as "use cases" but I think they're similar), entity relationship diagrams, state diagrams, class hierarchies, whatever suits. And, of course, user manuals, makefiles, scripts, installers, change notes, test specifications, test results, review records or code inspection records, etc.

Showing that the code meets the requirements is the essence of 7.3.5 verification. It's often done with tests that are specified in test specs with clear reconciliation to stories.
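For instance (a hedged sketch; the story ID, function, and figures are invented for illustration, not taken from any real project), the reconciliation between a test and the story it verifies can be carried right in the test's name and docstring:

```python
import unittest


def apply_discount(price, percent):
    """Invented function under test: apply a percentage discount to a price."""
    return round(price * (1 - percent / 100), 2)


class TestStory12Discounts(unittest.TestCase):
    """Verifies STORY-12: 'As a customer, I get a percentage discount at checkout.'"""

    def test_story12_ten_percent_discount(self):
        # Acceptance criterion from the story: 10% off 100.00 leaves 90.00.
        self.assertEqual(apply_discount(100.00, 10), 90.00)


if __name__ == "__main__":
    unittest.main()
```

The test record (pass/fail, date, build) then doubles as the 7.3.5 verification record, with the story reference making the reconciliation auditable.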

You also need 7.3.6 validation, which is often pilot testing involving the customer, which is about making sure it works (regardless of the verification to specification) because there could be errors in the specs. For example, in validation I'd involve real users, including the clumsy ones who do things nobody thought of when writing the specs.

My understanding is that with Agile, a lot of this can be done incrementally, so as to avoid nasty surprises in acceptance testing, having run most of the tests in earlier increments.

One social engineering technique that might help is to involve the software engineering opinion leader(s) in defining the process, and a critically-thinking engineer in the audit process.

Hope this helps
Pat
 
CATERAF

Thanks for all the replies.

Hmm ... but the risk is that if the engineers do not see the process as useful, they'll subvert it.

So for every document you insist upon, there has to be a need that they buy into. That's why I would map their implementation of SCRUM or Agile into ISO 9001 and not do it the other way around.

I do agree that the engineers need to see the process is useful. I think that half the software team are on board and agree that it's necessary and they've been doing as they're asked and seem fine with that. (Well, there's some resistance but they'd do it if they were 'formally' asked .. they're just given wriggle room).

It's the software manager/project manager who isn't really getting it, because he doesn't like a 'formal' approach to project management (he will follow scrum/agile if it suits him, else he won't). He said he will take an 'informal approach' because that is 'how he works'. Unfortunately the rest of the team need 'formal' direction, otherwise they're floundering around not knowing what to do, or they wriggle out of tasks, or they finish a task and then go on to whatever they think they'd like to do next. For example, I am doing some work with them, so am part of their 'team'. But, without the project manager holding a meeting or detailing the work, I had no work to do this week for this particular project. It was only when one of the software team did the managing with the 'project manager' that I got some useful, 'formal' tasks.

What do you do about this, though? Our project manager is on board with the process, but he doesn't want to project manage because he doesn't like it much, he isn't qualified to do it (no training in managing people), and he's time-pressured and considers code the most important part of the process. Getting someone else to do the project management is what he'd like, but there isn't anyone else...


Outputs from design include the code (of course) and any intermediates like architecture, design, stories (I knew them as "use cases" but I think they're similar), entity relationship diagrams, state diagrams, class hierarchies, whatever suits. And, of course, user manuals, makefiles, scripts, installers, change notes, test specifications, test results, review records or code inspection records, etc.

Showing that the code meets the requirements is the essence of 7.3.5 verification. It's often done with tests that are specified in test specs with clear reconciliation to stories.

Thanks for this - this is what I thought. I'll keep that in mind as we go through the process. I think right now just getting them to follow part of the process is a start and we're going to have to tackle this one step at a time.

This Wikipedia page "has some issues", but you may agree it offers some insights into the analysis of design requirements:

https://en.m.wikipedia.org/wiki/Requirements_analysis

Perhaps the team can first agree upon the criteria for deciding what must be documented?

Thanks John -- this would work, except that when I've asked them previously, their criteria for what must be documented is pretty much 'nothing unless ISO says we must'... very hard when I'm not an ISO expert and don't know exactly what I'm asking them to document either... hmm!

Thanks for the article too -- so true! We've experienced lots of it already...
 
pldey42

Difficult. But at least this manager is honest. In my experience they usually cover up their managerial deficiencies and formally direct the team down the wrong path, then blame them for failure. So, honesty's a good foundation.

I think your "one step at a time" approach is the way to go, working with the manager to formalise what works, leaving the rest on the shelf until it can be incorporated to everyone's satisfaction.

"Wait for the pain" is one approach. When managers are in pain they'll change.

Here's an example: I once was tasked with writing a code of practice for C++ programming. I wrote it, and a critical mass of opinion-leading engineers bought into it. We agreed that the only way to enforce it was with code reviews - but the managers would not allow them, regarding review activities as not adding value. So we shelved our nice Fagan review process.

Until the project hit major pain. System testing revealed it was full of bugs, unsurprisingly. So there was a mad bug hunting phase. Many people had nothing to do - so they let me form a team to go bug hunting by reviewing code, against our code of practice. We found bugs galore and reported them to the coders, who were delighted and able to fix them. We kept records on the time we spent and the number of bugs we found and compared the data with similar information on the productivity of bug hunting with testing. Not surprisingly, we verified what the literature predicts: code reviews are an efficient way of finding bugs. This analysis enabled us to persuade managers to put reviews of samples of code into the process.
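The comparison we made can be sketched with a back-of-the-envelope calculation (the effort and defect counts below are invented placeholders; only the method, defects found per hour spent, is the point):

```python
# Invented figures for illustration only - substitute your own records.
review_hours, review_defects = 40, 120     # time spent reviewing code, bugs found
testing_hours, testing_defects = 200, 150  # time spent system testing, bugs found

review_rate = review_defects / review_hours      # defects found per review hour
testing_rate = testing_defects / testing_hours   # defects found per testing hour

print(f"review: {review_rate:.1f} defects/h, testing: {testing_rate:.2f} defects/h")
# → review: 3.0 defects/h, testing: 0.75 defects/h
```

With numbers like these in hand, the argument to management stops being about quality dogma and becomes a simple cost comparison.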

I would suggest a similar approach here. As you implied, define what you can of the process and as problems occur, talk together about the right process fix, and add it in.

Have you asked what people are expected to do when the informal process gives no guidance? Are there just too many cooks to coordinate? Does the manager need time for him or herself to think before giving formal instructions? Is the manager also the architect or chief designer? If so, could management duties be delegated to someone more suited, if junior?

For education, would the budget stretch to a book? There are several Agile books cheaply available. (Personally, I found Steve McConnell's "Rapid Development" invaluable, but it was written in 1996 and may have been overtaken by Agile.)

When there's time pressure, allowing people to wander around without direction is wasteful; some form of co-ordination would surely help. How about a daily meeting, 15 minutes, where the team self-organize? (Which is no doubt something like Agile.)

Or a weekly meeting, 60 minutes, reviewing the last week's successes and failures and planning tasks for the following week?

It seems to me you have something going for you - a manager who at least communicates and may be willing to work with you. (For many in this situation, the relationship between management and quality is dysfunctional.) Perhaps this is a learning-by-doing-together opportunity, as you said, step by step - and focusing upon what each person is good at.

Just 2c
Pat
 