
Monday, August 15, 2011

Testing is indispensable. Or is it?

I added a set of rules (questions, really) to my wiki, concerning product testing as a measurement of product quality. Please check it out and spread the word: Article on PlanetProject.

Tuesday, July 28, 2009

Why it is stupid to write specifications and leave out background or commentary information

Follow me on Twitter: @rolfgoetz

I wrote a set of rules over at PlanetProject that explain why it is a good idea to specify requirements (or designs) giving commentary or background information, and how to do it.

The article is based on a rather tiring discussion with a hired requirements specification writer in one of my projects. She seemed to get defensive as soon as I suggested she should supply more 'context' for the readers of her request-for-proposal document.

More education, please; this will bring us closer to the 'professional' bit of 'professional business analyst'.

Read the article here. Comments or additions welcome! (It's a wiki...)


Refurbished: Non-Functional Requirements and Levels of Specification

Follow me on Twitter: @rolfgoetz

After an interesting and insightful discussion with Tom Gilb and Sven Biedermann about one of my latest PlanetProject articles, I decided to rework it. It is a how-to for a good requirements hierarchy. A good requirements hierarchy is an important prerequisite for a more conscious and logical design or architecture process, because real requirements drive design, not designs in requirements clothing. (Thanks, Tom, for yet another clarification!)

It seems that, in trying to write as concisely as possible, I drifted away from good writing practice and got sloppy with my wording. I also found out that there is a philosophical, and hence essential, difference in how the process of 'requirements decomposition' can be seen.
  1. One school of thought describes requirements decomposition as a process to help us select and evaluate appropriate designs.
  2. The other school describes requirements decomposition as being a form of the design process.
Personally, I subscribe to the second meaning, because of a belief of mine: mankind is very used to solution-thinking but very new to problem-thinking (Darwin weighs in). So most forms of thinking, including requirements decomposition, are outputs of a solution-finding or design process. Or, as Sven put it, requirements decomposition is a shortcut to designing, where 'designing' takes on the meaning suggested by number 1 above.

However, it can be very useful to assume there actually is an essential and clear distinction between requirements decomposition and design processes. The point here is: you need to define it that way, and then all is well :-) I didn't define it properly, and thus gave rise to many written arguments being exchanged.

Anyway, I hope the article now sheds even more light on the constant quarrel about so-called 'non-functional requirements': what they are, what they are for, and why they are so sparse in common requirement specification documents.

Most requirement specification documents I see these days have lots of required functions and few (if any) requirements for other attributes of the system, like all the -ilities. AND many analysts (including me from time to time) confuse non-functional with scalar. Read the article on PlanetProject to learn more.


Sunday, July 26, 2009

Intro to Statistical Process Control (SPC)

While exploring the web for Deming material I stumbled upon this site by Steve Horn. It introduces Statistical Process Control (SPC).
There are a number of pages on its various aspects.

The articles are quick to read, easy to understand, and written to the point.

I'd like to add that in Deming's 14 Points, Point 3 needs a special interpretation for software or (IT) project people.
The point says: 
3. Cease dependence on inspection to achieve quality. Eliminate the need for inspection on a mass basis by building quality into the product in the first place.

Deming was talking about (mass) production. In that field, "inspection" means roughly the same as "testing" in our profession.
In fact, inspections (of requirements, designs, code and test cases) are among the most effective activities you can do in software, for building quality into the product in the first place.

BTW, if you worry about your software development organisation NOT really testing their products, it is a very wise idea to first introduce a couple of inspection stages, for this is both more efficient (economical) and more effective (fewer defects). It is also a sensible way to introduce learning.
Testing is about finding defects; inspections are about pointing out systematic errors and giving people a real chance to prevent them in the future.

Here's one quote from Steve I like in particular:
Confident people are often assumed to be knowledgeable. If you want to be confident the easiest way is to miss out the 'Study' phase (of the Plan-Do-Study-Act-Cycle) altogether and never question whether your ideas are really effective. This may make you confident but it will not make you right.

(words in brackets are mine)


Tuesday, May 05, 2009

Atomic Functional Requirements

The discussion on shall vs. must at Tyner Blain, on which I have blogged recently, prompted me to wrap up what I have learned about atomic functional requirements over the years.
See 3 principles and 4 rules on PlanetProject. Feel free to make responsible changes.
The article explains how to write atomic functional system requirements so that the spec is easy to read and ambiguity is kept to a minimum.
Note that it is NOT about the even more important (non-)functional user requirements, in the sense of what a stakeholder expects from the system, which problems are to be solved, what the inherent qualities of the solution should be, and so on. I do not intend to argue that every spec must include atomic functional requirements, but sometimes the setup demands it. Don't forget to establish the context first.
Now enjoy the article on how to write atomic functional requirements. Thanks for your comments.

Monday, April 06, 2009

Rules for Checking Use Cases

A couple of months ago I assumed the role of quality manager for one of my projects. I learned that document quality control requires a set of defined rules to check documents against. Because the documents to be quality-controlled were 90% use cases, I defined a set of 8 rules for checking use cases, working with authors and checkers as well as subject matter experts (read: customers). You can find them on Planet Project.
For all readers who face the challenge of finding a BA job, or at least a (second) job where BA skills are applicable, there's an interesting discussion going on at the modernAnalyst forum. I suggested moving to a QA perspective. This idea fits with the Use Case Checking Rules above.

Wednesday, February 11, 2009

7 Habits of Highly Effective Risk Managers

I'm proud to announce that Raven, author of the wonderful Raven's Brain weblog, has published another guest post of mine.
It is on effective risk management, titled "7 Habits of Highly Effective Risk Managers".

As you might expect from my material, it is a set of rules that risk managers can easily apply to their jobs. It might also help if you are NOT the risk manager in your endeavour but want to help someone who is.

Here's the condensed version:

R1: Be responsible.
R2: Analyse in depth.
R3: Use hard facts.
R4: Do more than one thing.
R5: It's not fire and forget.
R6: Integrate.
R7: Don't change the scales of measurement.

In the blog post itself I explain every rule/habit in detail. Thanks to Dick Karpinski for nitpicking, and thanks to Mr. Covey for the title.
Enjoy!

Monday, September 08, 2008

Basic Structure for Writing Acceptance Criteria / Scenarios

Name: Basic Structure for Writing Acceptance Criteria / Scenarios
Type: Rules
Status: final
Version: 2008-09-08
Sources: Dan North's Behavior Driven Development (RSpec). I was introduced to this by Chris Matts.

Gist: Specify scenarios in a human (and machine) readable, yet compact way.

R1: Use the following sentence structure:
GIVEN <context>
WHEN <event>
THEN <expected behavior>

Example:
GIVEN the user has proper credentials
WHEN he selects an order for deletion
THEN the system accepts the selection


Example:
GIVEN the system is running under stress profile XY
WHEN interface A sends a service request
THEN the system responds to A in 3 seconds or less


R2: If you need to specify more complex conditions, use AND, OR, NOT:
GIVEN <context 1>
OR <context 2>
WHEN <event 1>
AND <event 2>
THEN <behavior 1>
NOT <behavior 2>
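
If you want to see how such a scenario can become an executable check, here is a minimal Python sketch. It is hand-rolled (no BDD framework), and OrderService with its two methods is made up for the first example above.

class OrderService:
    """Toy stand-in for the system under test (hypothetical, not a real API)."""

    def __init__(self):
        self.selected_order = None

    def authenticate(self, user, password):
        # GIVEN the user has proper credentials
        return (user, password) == ("alice", "secret")

    def select_for_deletion(self, order_id):
        # WHEN he selects an order for deletion
        self.selected_order = order_id
        # THEN the system accepts the selection
        return True


def test_delete_selection_is_accepted():
    service = OrderService()
    # GIVEN
    assert service.authenticate("alice", "secret")
    # WHEN
    accepted = service.select_for_deletion(order_id=42)
    # THEN
    assert accepted


if __name__ == "__main__":
    test_delete_selection_is_accepted()
    print("scenario passed")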

4 Strategies Towards Good Requirements

Name: 4 Strategies Towards Good Requirements
Type: Rules
Status: final
Version: 2008-09-08

Gist: Armed with the 3 Most Powerful Requirements Principles, what should you do? The following 4 rules can help you adhere to them.

R1: You should get to the point.
- find the priority stakeholders and use their point of view
- don't write requirements that arbitrarily constrain the scope
- separate strategic requirements from basics, and from solutions or designs
- ask 'why' a couple of times (root-cause analysis)

R2: You should quantify requirements. Don't be put off; it's less work than you think once you have tried it.
- every quality requirement can be expressed in numbers (by definition)
- every functional requirement is associated with at least one quality requirement (just ask 'how good?')

R3: You should give your requirements time to evolve; nobody really knows them completely and consistently right now.
- you will find the real requirements through frequent delivery of your product to the stakeholders, i.e. through feedback
- don't get trapped by the idea that someone knows all the requirements just because you are specifying an existing system; the world kept turning in the meantime, yielding 2-10% requirements creep per month.

R4: You should analyse the value of your requirements and then focus on the ones with the best ROI.
- the real requirements are the profitable ones
- it is economical to focus on the top ten requirements (at most)

Tuesday, July 29, 2008

Why and How to State the Credibility of Estimations

Name: Why and How to State the Credibility of Estimations
Type: Principles
Status: final
Version: 2008-09-08
Sources: T. Gilb's work on Impact Estimation Tables, see http://www.gilb.com/; my own experience

Gist: Have you seen presentations proposing a project or other endeavour, with the purpose of convincing someone that the project is the right thing to do? More often than not these presentations include shiny numbers showing relatively low cost and relatively high benefit (of course). But how often does the presenter say anything about the credibility of those numbers?
I believe every decision maker needs some assessment of credibility. If you don't tell him how believable your evidence is, the decision maker might turn to other sources, like "what's this guy's success record?", "I don't believe her, so I will resort to a very tight budget in order not to risk too much", etc.

The same problem applies to any estimate you prepare and present. These Principles explain why you should always give credibility information, and the Rules explain how you can do it.

" An estimate that is 100% certain is a commitment." -- Unknown

Credibility information DEFINED AS any evidence or number that expresses how certain we believe a statement is

Principles
P1: Estimates are assumptions about the future; therefore they must be uncertain.
"Unfortunately, tomorrow never gets here." - Leo Babauta

P2: Predictions are a difficult matter. The farther away the state we predict, the more difficult and the riskier it gets.

P3: Credibility information is a measure of how certain we are about a predicted state.

P4: Decision makers need some assessment of the risk involved in the decision. What can happen in the worst case? (It is equally valid to ask for the best case, but seriously, the best case happens far too seldom to take it into account)

"Most people really like to be able to sleep at night." -- Heuser's Rule

P5: The clearer you state uncertainty,
* the easier it is to distinguish between different strategies
* the more likely other people believe you
* the clearer you yourself see the risk involved
* the better you can argue about a strategy
* the easier it is to learn something about the risk involved and to do something about it
* the clearer you give or take responsibility for an endeavour
Note: While a decision maker who sees uncertainty numbers obviously accepts the risk involved, there are decision makers who don't like that idea at all. I guess this is one reason why credibility information isn't requested very often.

P6: If credibility information is missing, we by default assume the number or the source is not credible at all.
Note: Maybe the presenter just forgot to provide the information, and then it shouldn't be a problem. We can send him to get it.

Rules
R1: Any number that expresses some future state should be accompanied by credibility information.

R2: If you predict a value on a scale, like "end date", "targeted budget", "performance improvement", or "maintenance cost", give a range of that number.
Notes:
* It's useful to do this in a plus/minus X% fashion. The percentage is a clear and comparable signal to the audience.
* It is not mandatory to give the best/worst case estimates an equal range. How about "we are sure to complete this in 10 weeks, minus 1 plus 2"?

* In Rapid Development, Steve McConnell actually suggests that you communicate estimates as a range that gets smaller over time, with the larger number first. "Six to four months" sounds strange, but if you say the smaller number first, people tend to forget the larger one.
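
A tiny illustration of R2 in Python, with made-up numbers (the 10-week, minus 1 plus 2 example from the note above):

def estimate_range(most_likely, minus, plus):
    """Return (best case, most likely, worst case) for an asymmetric estimate."""
    return most_likely - minus, most_likely, most_likely + plus

best, likely, worst = estimate_range(10, minus=1, plus=2)
print(f"{likely} weeks, range {best}-{worst} weeks "
      f"(-{1 / 10:.0%} / +{2 / 10:.0%})")
# Output: 10 weeks, range 9-12 weeks (-10% / +20%)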

R3: If you provide evidence, say where you have it from, or who said so, and what facts (numbers, history, written documentation) lead to this conclusion.
Note: Do this in the backup slides for example. At least you should be able to pull it out of the drawer if requested.

R4: Consider using a safety margin, like a factor of 2 (bridge builders) or a factor of 4 (spacecraft engineers).
Notes:
* The margin is an expression of how much risk you are willing to take. Thus, it is a way of controlling risk.
* Use margins whenever you find a number or source is not very credible (say, < 0.4), when you don't have any historical data, or when there's no economical way of achieving higher credibility.
* Safety margins do not necessarily increase real costs, only planned costs.

R5: Always adjust your cost/benefit-ratios by a credibility factor.
Notes:
A simple scale would be {0.0 - guess, no real facts available, 0.5 - some facts or past experience available, 1.0 - scientific proof available}
A more sophisticated scale would be
{0.0 Wild guess, no credibility
0.1 We know it has been done somewhere (outside the company)
0.2 We have one measurement somewhere (from outside the company)
0.3 There are several measurements in the estimated range (outside the company)
0.4 The measurements are relevant to our case because <fact1, fact2> with credibility <x, y> (don't get trapped in recursion here)
0.5 The method of measurement is considered reliable by <whom>
0.6 We have used the method in-house
0.7 We have reliable measurements in-house
0.8 Reliable in-house measurements correlate to independent external measurements
0.9 We have used the idea on this project and measured it
1.0 Perfect credibility, we have rock-solid, contract-guaranteed, long-term, credible experience with this idea on this project, and the results are unlikely to disappear}
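
Here is a minimal Python sketch of R5, with made-up numbers: the claimed benefit is scaled by a credibility factor from the scale above before the benefit/cost ratios are compared.

def adjusted_benefit_cost_ratio(benefit, cost, credibility):
    """Scale the claimed benefit by how believable the evidence is."""
    return (benefit * credibility) / cost

# Hypothetical strategies: (name, claimed benefit, cost, credibility factor)
estimates = [
    ("Strategy A", 200_000, 50_000, 0.2),  # one measurement from outside the company
    ("Strategy B", 120_000, 40_000, 0.7),  # reliable in-house measurements
]

for name, benefit, cost, credibility in estimates:
    raw = benefit / cost
    adjusted = adjusted_benefit_cost_ratio(benefit, cost, credibility)
    print(f"{name}: raw ratio {raw:.1f}, credibility-adjusted ratio {adjusted:.1f}")

# Strategy A looks better on raw numbers (4.0 vs 3.0) but worse once
# credibility is taken into account (0.8 vs 2.1).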

Friday, July 04, 2008

7 Useful Viewpoints for Task Planning Oriented Towards Quick Results

Name: 7 Useful Viewpoints for Task Planning Oriented Towards Quick Results
Type: Rules
Status: final
Version: 2008-07-04

Gist: While planning tasks or activities, and at the same time focusing on quick results, it is a good idea to describe each task by means of different viewpoints. Here's my suggestion, derived from Tom Gilb's work, the BABOK and my own experience.

R1: Each task should be described in terms of
Purpose. Why are you doing this? This is the single most important category, because it may lead you to the understanding that this is not the right task.
Entry. What do you need in order to begin and complete the task? This prevents you from doing things without the proper basis, which would most likely cause unnecessary re-work.
Process. Which steps do you follow? This is a little plan for your task. It may be only one step, but most of the time there is more than one, even for small tasks. Splitting tasks into steps allows you to return to previous steps in case you learn something new.
Exit. What are the results of the task? What will be the benefit? This is what counts, and it can give you the beautiful chance of realizing benefits 'along the way' of some greater set of tasks.
Stakeholders. With whom will you have to speak? Whose requirements will you have to take into account? This prevents re-work caused by ignoring influential people.
Methods/Techniques. How can you do it? Note anything that may help you complete the task.
Time-Limit. How long do you plan to work on it? If you need substantially longer, you will have a clear indicator that something is wrong. Short deadlines help you focus (80/20 principle).
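
Here is a minimal Python sketch of R1 as a data structure, in case you want to keep the seven viewpoints for each task in a machine-readable plan (the Task class and the sample task are invented for illustration):

from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    purpose: str              # Why are you doing this?
    entry: List[str]          # What do you need to begin and complete it?
    process: List[str]        # The steps you follow
    exit: List[str]           # Results and benefits of the task
    stakeholders: List[str]   # Whose requirements to take into account
    methods: List[str]        # Anything that may help you complete the task
    time_limit_hours: float   # How long you plan to work on it

interview = Task(
    purpose="Understand why order entry takes too long",
    entry=["list of key stakeholders"],
    process=["prepare questions", "run interview", "write up findings"],
    exit=["prioritised list of pain points"],
    stakeholders=["order entry clerks", "sales manager"],
    methods=["5 Whys", "open questions"],
    time_limit_hours=4.0,
)
print(interview.purpose)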

Thursday, July 03, 2008

7 Rules for High Effectiveness

Name: 7 Rules for High Effectiveness
Type: Rules
Status: final
Version: 2008-07-03

Gist: Effective problem solvers develop a mindset of effectiveness. Here are the 7 Habits of Highly Effective People, from Stephen R. Covey, Simon & Schuster, Inc., 1989.

R1: Be Proactive - take the initiative
R2: Visualize the end from the start. Know where you're going; it is better to do this quantitatively.
Note: A good technique for doing R2 and R3 is the Kepner-Tregoe method, which is quite well described here. Better still is using Impact Estimation Tables (because of their much better handling of quantitative information).
R3: List priorities; it is better, still, to do this quantitatively.
R4: Think WIN/WIN
R5: Understand - listen, listen, listen / learn, learn, learn
R6: Synergize - make the whole more than the sum of the parts
R7: Seek Personal Renewal in various areas:
- Physical: exercise, nutrition, stress management
- Mental: reading, thinking
- Spiritual: value clarification, meditation
- Social/Emotional: empathy, self-esteem

Friday, May 30, 2008

6 Ways to Propagate Non-Functional Requirements

Name: 6 Ways to Propagate Non-Functional Requirements
Type: Rules
Status: final
Version: 2008-05-30

Gist: Most BAs and a lot of other project people would immediately agree that "non-functional requirements (NFRs) are important in almost any software development." However, most often these people would also agree with sentences like "but it is hard to do", "we have a special situation here, we can't do it", "I haven't managed to convince the others", or the exceptionally honest "I don't really know where to start" or "Why should we invest in NFRs?". I want to show how to convince others to REALLY do non-functional requirements, and how to REALLY start with it yourself, if that is your concern.

R1: Don't do it alone. At least get some like-minded chap to talk to regularly. Who are the opinion leaders? Who are the early/quick adopters in your area? Does it help to get somebody else on board? How about somebody from higher up the food chain? From my own experience I'd say it's likely that you know what goes wrong and what to do about it, but that someone else is better at marketing ideas.
Note: I saw people hype an unimpressive topic so much that everybody started talking about it. (I'm not talking about 'offshoring' or 'SOA' here, but smaller, company-internal things.) However, there is a certain risk you then cannot control it anymore :-)

R2: Use all available official means, e.g. corporate idea management, any SPI programme, your regular talk with your superior, or the chair you hold on any related board. Which department is responsible?
Note: You don't want to fight department borders here, especially in large organizations. Maybe it's better to get a job at this other department.

R3: Get outside help. "A Prophet Hath No Honour in His Own Country". Maybe some external expert can spark the fire within your organization. These people specialize in quick conviction! Get them in and find a way to make more senior management listen to them.
Note: I could recommend the Gilb family, but that would be surreptitious advertising. However, Tom Gilb's book 'Competitive Engineering' is full of examples, his website full of papers that could help you out.

R4: Take baby steps (OK, that one shows up in nearly all how-to guides nowadays). Can you 'just do a pilot' on some project? Or start with a few requirements you really feel comfortable with. Advance with the ones most commonly used, like customer satisfaction, availability, or maintainability.
Note: While there are good maintainability metrics to apply on systems, it seems nobody can show a measurable relationship between these metrics and, say, the future cost of maintenance. Maybe that is because the metrics focus on the code too much (analyzability), and maintenance has to be seen as a more holistic process.

R5: Let (bad) examples speak for you. Where are the bad examples? I'm sure you will find some quickly. Can you show that missing or bad NFRs are a root cause of some problem? Does anyone else think so (see R1)? Really get into a few examples so you know them inside out.

Note: I suggest you use examples which show that it does not help the user a single bit if these requirements are met. Example: some detailed requirement saying it should take x seconds from the button click until something is persistently saved. I bet the user is not really interested in saving in 8 out of 10 cases. He cares about the time he needs to accomplish some task greater than saving, or about some way to reuse work.

R6: Talk about it. Once you have started your endeavour to push NFRs, maybe with a pilot, make a public commitment to your goal and keep updating an audience on your progress. Talk about your progress at the coffee machine, in the cafeteria or the casino, in order to keep yourself focused and the audience listening.

Friday, May 16, 2008

8 Useful Tips to Pick an Analyst

Name: 8 Useful Tips to Pick an Analyst
Type: Rules
Status: final
Version: 2008-05-16

Gist: Every now and then I run into the task of picking a business analyst or systems analyst for a project or customer. Sometimes this starts with writing a job advertisement, sometimes with a colleague asking if I could take part in a job interview with some prospect. What should you look for?

R1: Clarify what tasks the candidate will have. (OK, this one is a no-brainer.)

R2: Clarify what the challenge in this particular project or this particular job will be.

R3: Clarify what (analyst-specific) professional skills will be needed, now, and in the future, in this particular job.

R4: Clarify what soft skills will be needed, now, and in the future, in this particular job.

R5: Make sure the candidate knows these things (and has a sound approach to dealing with them):

  • Stakeholders seem to change their minds on a regular basis. (I think they don't; it just looks like a change.)
  • Goals are seldom properly understood.
  • Demand and Supply speak two different sociolects.
  • Stakeholders (among other people...) have trouble communicating about the future, i.e. the benefits or the system-to-be.
  • Other team members, including fellow analysts, architects, and QA-folk, deserve proper appreciation.
  • A problem is a solution to a problem is a solution to... (also see the means-and-ends post)
  • Many people (the prospect, too? Would be a bad sign for an analyst.) have a well developed answer reflex, so finding out what the problem really is can be quite a challenge.
  • Any required product quality can be described in a meaningful, quantified way. (Refer to Tom Gilb's extensive and highly useful work on the subject.)

R6: Clarify what testing skills will be needed. (An analyst who does not know how to design acceptance criteria?)

R7: Look for someone who uses words like 'now and then', 'generally', 'frequently', 'some', 'somewhat', 'arguable', 'among other', 'on the other hand', 'also', 'able', 'allow'.

Note: The language gives you hints on the prospect's experience level. Experts tend to have many answers.

R8: Avoid people who use words like 'steady', 'always', 'every', 'all', 'absolute', 'totally', 'without doubt', 'nothing', 'only', 'entirely', 'without exception', 'must', 'have to'.

Monday, May 05, 2008

9 Rules for Requirements as Means of Communication

Name: 9 Rules for Requirements as Means of Communication
Type: Rules
Status: final
Version: 2008-05-05

Gist: What are the requirements for? From time to time I run into a project where the business analyst in charge (or requirements manager, or whatever they call him) does not know the answer. These projects do requirements because some governing company rule says they should. To my mind, requirements are all about transferring ideas from one person to another, or from one group of people to another group. Here's what a BA should think about concerning communication.

R1) Always identify the readers. Make sure you sort out primary and secondary readership. Go for the group that benefits from the requirements NOW, if in doubt.

R2) Don't forget to ask your primary readers how they want to read requirements. Spoken or written prose, pictures, tables? Do they know UML? What's the common language of all your readers? Go check the SOPHIST's material on language and requirements if 'plain natural language' is the answer to the latter question.

R3) Always check if you can reduce the need to write things down by including people in requirement workshops with your stakeholders.

R4) Challenge arguments like 'we need the requirements after the project's finished, for maintenance'. Will there be an organization maintaining the system? Will they use your requirements?
Note: I know this rule hurts. However, I haven't seen a single specification that is still useful years (or even months) after the project's end, if this spec isn't maintained as well.

R5) Don't overestimate your memory. If you will be on a project for several months you yourself will need some aid for remembering that darn detail.

R6) Adjust your requirements style to what you're specifying. Is your system user interface-heavy? Database-strong? Functions? Constraints? Highly interactive? Real-time? Product family?

R7) Think of it: can the team (of designers, programmers, testers, users) be allowed to ask for the information they need, instead of you providing them with 'one-size-fits-all' requirements?

R8) Make sure everybody understands that perfect communication is not possible. The best thing you can do is speak with everybody a lot. The worst thing you can do is write a specification from beginning to end and throw that piece of paper over the fence (also see BUFR).

R9) You don't need requirements at all? You cannot not communicate! (Watzlawick)

Friday, April 25, 2008

12 Helpful Rules for Refactoring a System

Name: 12 Helpful Rules for Refactoring a System
Type: Rules
Status: final
Version: 2008-04-25

Gist: to provoke second thoughts if you are about to take part in a refactoring. Refactorings seem to come up more frequently these days, maybe because even the most stolid business people have started to talk about 'agile'. Like all such buzzwords, there's a lot of potential misunderstanding hiding in the word itself.

Refactoring DEFINED AS modifying something without changing its externally observable behaviour with the purpose of making maintenance easier. Note that you can refactor many things, ranging from systems to architectures to design to code. However, relevant literature suggests that Refactoring is meant to relate to Design and Implementation. See Martin Fowler's Website. Also see Code Refactoring.

R1) Whenever you hear the word refactoring, run! (Just joking...)

R2) Precisely understand what the various stakeholders think a refactoring is. You'll be surprised. Correct the wording if needed.

R3) Precisely understand what purpose the refactoring has, from each of the stakeholder's point of view. Align expectations before you start.

R4) Assume a great deal of resistance from the people who pay. For them, refactoring is another word for "Give me $$$, and I'll give you what you already have!" Find out who will pay for maintenance.

R5) By all means reduce the risk of ending up with a system that has new bugs after the refactoring. This can be done, for example, by pinning the current behaviour with rigorous test-driven or even behaviour-driven development; any change in behaviour should be treated as a bug. (A small sketch follows this list.)

R6) Challenge the business case (if any...) of the refactoring. At the very least, do a proper cost-benefit analysis. For example, the system may not live long enough to amortize the refactoring.

R7) Prepare for the situation that nobody can provide exact and dependable information on how future budgets will be affected by the refactoring.

R8) Back off from refactoring if significant portions of the code will be both refactored and worked over due to new requirements. Do one thing at a time and bear in mind R5.

R9) Prepare for 3 different points of view if you ask 2 people which specific refactorings should be done.

R10) Think of short-term and long-term effects.

R11) The improvement must be both quantified beforehand and measured afterwards, at least for non-local changes to the system.


R12) Be suspicious if someone suggests only one or two specific refactoring techniques will suffice to reach a project-level goal.
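
As promised under R5, here is a minimal Python sketch of pinning behaviour before a refactoring; the price calculation is a made-up example, not taken from any real project.

def legacy_total(items):
    # Existing, hard-to-read implementation whose behaviour must be preserved.
    t = 0
    for i in items:
        t = t + i[0] * i[1]
    return t

def refactored_total(items):
    # Cleaner implementation; externally observable behaviour must not change.
    return sum(price * qty for price, qty in items)

def test_behaviour_is_preserved():
    cases = [[], [(10.0, 2)], [(10.0, 2), (3.5, 4)]]
    for items in cases:
        # Any difference here is a bug introduced by the refactoring (R5).
        assert refactored_total(items) == legacy_total(items)

if __name__ == "__main__":
    test_behaviour_is_preserved()
    print("behaviour preserved")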

Friday, April 11, 2008

11 Rules for Prioritizing Features in a Customer-Oriented Organization

Name: 11 Rules for Prioritizing Features in a Customer-Oriented Organization
Type: Rules
Status: final
Version: 2008-04-11

Gist: While prioritizing features, people quite often take a ranking approach in order to balance different stakeholders' perspectives. This may actually prevent satisfied stakeholders (or a short time to market), and may lead to a mediocre release plan. Why? The math used in most ranking approaches is wrong. Here are 11 rules for proper prioritization.

Source: http://tynerblain.com/blog/2008/04/09/improved-prioritization, thanks to Scott Sehlhorst for yet another great idea. Go there if you need a more visual explanation. I added my bits and pieces.

Prerequisites:
E1: You have a list of features gathered from various stakeholders.

E2: Your stakeholders are of different importance; maybe one represents the customer of your product, another is development/manufacturing, yet another is some governing technical department. Read about Finding Key Stakeholders.

E3: Each stakeholder has ranked all features, with smaller numbers for the more important ones.

Rules:
R0: Make sure you have a proper understanding of the goals (read Specifying Goals and Decomposing Goals) of the project or product. Make sure you understand your set of stakeholders in the light of the goals.

R1: Ignoring E2 above, the simplest approach would be to add up the ranks for each feature and, voila, the features that are ranked best on average win.

R2: Not ignoring E2, don't make the mistake of giving each stakeholder a weight and multiplying the respective ranks by that weight before you sum the rank numbers. This only works if you get lucky with the numbers.

R3: You can use the weight-and-multiply approach from R2, but then you first have to reverse the ranking numbers from E3, giving large numbers to the more important features. (A small sketch follows.)
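
Here is a minimal Python sketch of R2/R3 with made-up ranks: three features, two stakeholders, and the customer weighted twice as heavily as development.

features = ["F1", "F2", "F3"]
ranks = {                       # 1 = most important, as in E3
    "customer":    {"F1": 1, "F2": 2, "F3": 3},
    "development": {"F1": 3, "F2": 2, "F3": 1},
}
weights = {"customer": 2, "development": 1}
n = len(features)

# R3: reverse the ranks first, so that a larger number means more important;
# only then does "multiply by the stakeholder weight and pick the highest
# total" work in the intuitive direction.
scores = {
    f: sum(weights[s] * (n + 1 - ranks[s][f]) for s in ranks)
    for f in features
}

for feature, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(feature, score)
# F1 scores 7 (2*3 + 1*1), F2 scores 6, F3 scores 5,
# so the heavily weighted customer's top pick wins.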

R4: You should think hard about what the stakeholders' weights represent. If you have one stakeholder who is more important than the others, it's paramount to make him happy, or at least not to upset him. Do this by delivering his important features first, and only after that think of features for the other stakeholders.
Note: Take the example from E2. I bet the purpose of your business is to make the customer happy.

R5: Make sure you do a Kano analysis in order to find features nobody thought of in the first place. This may be your advantage over competitors.

R6: If confronted with a large (> 20 items) list of features, let the stakeholders break it down into importance classes first. It's virtually impossible for the average stakeholder to truly rank so many items.

R7: Don't give in if your stakeholders say they need to know the costs before they can rank the features. You want to know what they want to have! You can always work out how to make things cheaper, especially if your features are real (business) requirements, not solutions disguised as requirements. Bear in mind: "If you don't know the value of a feature, it does not make sense to ask what it costs." (Tom DeMarco)

R8: If your features are Use Cases, go here.

R9: If all of the above seems too simple, turn to Karl Wiegers (First Things First), Alan Davis (The Art of Requirements Triage), Don Firesmith (Prioritizing Requirements), Lena Karlsson (An Experiment on Exhaustive Pair-Wise Comparisons versus Planning Game Partitioning), or QFD.

R10: Read Mike Cohn (Agile Estimating and Planning).

Friday, April 04, 2008

7 Rules for Explaining Requirements to Colleagues 'Downstream'

Name: 7 Rules for Explaining Requirements to Colleagues 'Downstream'
Type: Rules
Status: final
Version: 2008-04-04

Gist: As a business/systems analyst, you will inevitably be confronted with the need to tell somebody about the requirements. The challenge is to transfer your knowledge effectively.

colleague downstream DEFINED AS someone who takes your requirements as an input for her work, i.e. architects, developers, designers, testers, technical authors.

R1: Include your audience as early and as frequently as possible. Don't fall for the BUFR principle, even if you are part of an organization that follows it. YOU can always do things a little differently, i.e. use unofficial channels: CC people, invite them to your stakeholder meetings, talk to them at the coffee machine.

R2: In general, use feedback cycles. It is not sufficient if your audience nods or exclaims 'understood!'.

R3: Explain the requirements, and let the others do a design right on the spot. The idea is not to come up with a good design, but to see whether you have effectively transported your ideas. Do the same with testers: let them lay out their test plans on the spot. Expand this idea by doing proven, highly efficient JAD sessions.

R4: Don't be misled by R3; it may not be sufficient. If you also have to explain your requirements to someone who is not available in person (offshoring?), consider a formal review as well.

R5: There's a classic but sometimes forgotten technique to make sure 'the other side' has understood: you write the business requirements, they write the system requirements. Again, this is not sufficient.

R6: Make sure your 'downstream colleagues' deal with the subject matter for many years. (I.e. stretch projects out as long as you can ;-) This way they become subject matter experts (SMEs). Weird, but efficient. However, beware of the risk that the new SMEs end up running the show, not your business stakeholders.

R7: Make sure YOU understand that you cannot not communicate (see Watzlawick). However, the reverse is also true: you can't really communicate. You will present your model of the knowledge, and the receiver will inevitably end up with a different one.

Friday, March 28, 2008

BA's weaponry: Standards on Business Analysis

Name: BA's weaponry: Standards on Business Analysis
Type: citation
Status: draft
Version: 2008-03-28

Gist: to give a list of standards that are available and can be useful for business analysts

ISO/IEC 42010:2007, Recommended practice for architectural description of software-intensive systems

ISO/IEC TR 19759:2005, Guide to the Software Engineering Body of Knowledge (SWEBOK)

BABOK Version 1.6, The Business Analysis Body of Knowledge, The IIBA 2007

ISO/IEC 15288:2002, System life cycle processes

Product Development Body of Knowledge (PNBOK)

Enterprise Unified Process, Enterprise Business Modeling
Enterprise Unified Process, Enterprise Architecture

You can also try web searches with these keywords:

Business analysis standards it process ieee

Enjoy!

Friday, March 14, 2008

How to figure out what the problem really is

Name: How to figure out what the problem really is
Type: Principles, Process, Rules
Status: final
Version: 2008-03-27
Source: Don Gause & Gerald Weinberg, Are Your Lights On?; plus two thoughts of my own

Problem DEFINED AS difference between things as desired and things as conceived
Note: Don's and Gerald's original definition said 'perceived' instead of my 'conceived'. I changed this due to a somewhat Buddhism-oriented mindset of mine. Thanks to my friend Sven for reminding me.

Principles
P1) Each solution is the source of the next problem.
P2) Moral issues tend to melt in the heat of a juicy problem.
P3a) You can never be sure you have a correct problem definition, even after the problem is solved.
P3b) You can never be sure you have a correct problem definition, but don't ever stop trying to get one.
P4) The trickiest part of certain problems is just recognizing their existence.
P5) There are problem solvers and solution problemers.
P6) In spite of appearances, people seldom know what they want until you give them what they ask for.
P7) The fish is always last to see the water.
P8) In the valley of the problem solvers, the problem creator is king.

Process
S1) Ask: Who has the problem? Then, for each party, ask: What is the essence of your problem?
S2) Think of at least three things that might be wrong with your understanding of the problem. If you can't think of at least three things that might be wrong with your understanding of the problem, you don't understand the problem.
S3) Test your problem definition on a foreigner, someone blind, or a child, or make yourself foreign, blind, or childlike.
S4) Generate points of view. Every point of view will produce new misfits between problem and solution.
S5) As you wander the weary path of problem definition, check back home once in a while to see if you haven't lost your way.
S6) Once you have a problem statement in words, play with the words until the statement is in everyone's head.
S7) Ask: Where does the problem come from? Who sent this problem? What is she trying to do to me? The source of the problem is most often within you.
S8) Ask yourself: Do I really want a solution?
S9) Ask Why 5 times. Normally this leads you to the essence of the problem.
S10) If a person is in a position to do something about a problem, but doesn't have the problem, then do something so he does.
S12) Try blaming yourself for a change - even for a moment.

Rules
R1) Don't take someone's solution method for a problem definition, especially if it's your own solution method.
R2) Don't solve other people's problems when they can solve them perfectly well themselves. If it's their problem, make it their problem.
R3) If people really have their lights on, a little reminder may be more effective than your complicated solution.
R4) If you solve someone's problem too readily, he'll never believe you've solved his real problem.
R5) Don't leap to conclusions, but don't ignore your first impression.
R6) We never have enough time to do it right, but we always have enough time to do it over.
R7) We never have enough time to consider whether we want it, but we always have enough time to regret it.