Ensure Product Quality with These Review Process Best Practices
https://www.jamasoftware.com/blog/ensure-product-quality-with-these-review-process-best-practices/
Thu, 05 May 2022

Review Process Best Practices

Introduction

Reviews play a key role in successful product and systems development, helping to ensure the new project meets stakeholder, market, and compliance requirements. Peer and approval review processes enable organizations to both iterate and innovate quickly while providing a dedicated process to apply appropriate rigor for final reviews. In addition, integrating item workflow with approval reviews can eliminate manual processes and reduce human error.

In this blog post, we examine a generic approach to reviews and review coverage, independent of the application used. In the second half of this post, we’ll look into the way Jama Connect® can be used to support the review processes described in the first half.

Defining the Scope of a Review

The reviews discussed in this post are informal (aka ‘peer’) reviews, leading up to formal (aka ‘approval’) reviews.

For medical device manufacturers, the first part of this post can also be applied to their existing (quality) document management system, where formal reviews and sign-offs are recorded for regulatory bodies like the FDA (Food and Drug Administration).

Reviews Play a Key Role in Successful Product Development.

Reviews are an essential part of any product quality process, comparable to testing the product itself. Document reviews take place in the engineering phases of the product lifecycle, when there are not yet any product parts that can be subjected to tests.

Like testing, a review by itself can never catch 100% of the issues in the item under review. And as with testing, a review process is defined at multiple levels, each with a different focus, attention, or goal, to achieve the highest degree of coverage possible.

Review Coverage

There are different ways to ensure your reviews get enough overlapping coverage to catch the issues:

  1. The reviewers you invite
  2. The focus you give each reviewer
  3. The goal you set per review

Related: Leveraging Peer and Approval Workflows to Optimize Your Peer and Approval Process


Who Should Be Included In A Review:

You do not write documents for your own benefit. Documents are a means of knowledge transfer from the author to the next person in the product lifecycle. The information within a document should therefore be understandable to its intended audience (readers) and users (appliers).

Because each document corresponds to a level of the systems engineering V-model, it is easy to determine which process roles need to be invited as reviewers.


At any given level within the applied information architecture, at least two adjacent levels need to be invited to make sure those levels understand the information provided through the documents: the next level downstream (decomposition and detailing) and the same level across (testing).

It is also important to verify that the author has correctly understood the documentation provided by the process role in the preceding product engineering phase (i.e., the upstream level).

Inviting colleagues in the same field, or at the same product level on other product lines (i.e., fellow subject matter experts, or SMEs), is a good way to ensure reviewers understand and evaluate the document against company standards, for a consistent quality of deliverables. Involving SMEs from outside their own project also spreads specific product knowledge throughout the company.

Because of their different process roles, each reviewer will naturally focus on the information they need to understand in order to address it within the process they manage and maintain.

For example:

The System Requirements Specification (SRS) should be reviewed by the person(s) responsible for writing:

  1. The stakeholder/customer requirements (upstream; correct interpretation and coverage of that level’s ‘asks’)
  2. The Subsystem Requirements Specification (downstream; is it understandable, unambiguous, and specific enough to be able to ‘answer’ the ‘ask’ that the SRS has for them)
  3. The System Acceptance Test Plan (across; is it understandable, unambiguous, and specific enough to be able to be tested)
  4. SMEs on topics described and/or referenced in your document (quality, sanity, and completeness)

General Guideline: There should be at minimum three, but preferably four reviewers in any review.
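As a minimal illustration, the upstream/downstream/across guideline above could be sketched in code. The level names and return structure here are hypothetical, invented for the example; this is not part of any Jama Connect API.

```python
# Illustrative sketch only: the V-model level names and relations below are
# hypothetical, meant to mirror the upstream/downstream/across guideline.

V_MODEL = [
    "Stakeholder Requirements",
    "System Requirements",
    "Subsystem Requirements",
    "Component Requirements",
]

def reviewers_for(level: str) -> dict:
    """Return the process roles to invite for a document at `level`."""
    i = V_MODEL.index(level)
    invitees = {"sme": "Fellow subject matter experts"}
    if i > 0:                      # upstream: checks interpretation and coverage
        invitees["upstream"] = f"Author of {V_MODEL[i - 1]}"
    if i < len(V_MODEL) - 1:       # downstream: checks decomposability
        invitees["downstream"] = f"Author of {V_MODEL[i + 1]}"
    invitees["across"] = f"Author of the {level} test plan"
    return invitees
```

For a mid-level document such as the SRS, this yields four invitee roles, in line with the "at minimum three, preferably four reviewers" guideline.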

Tips for Conducting a Successful Review:

  • When parts of the system will be developed and provided by a third party (e.g., subcontractor), include that subcontractor.
  • When reviewing the product needs, or tests to validate these, of a specific customer, include that customer (or meaningful representatives).
  • Although the various test levels may appear unrelated, it is still worthwhile to invite testers from those other levels, as they bring different insights from defining similar tests at their own level.

Assign Focus Areas to Each Reviewer

Even though reviewers are invited based on their process role relative to the document under review, it is also important to assign focus areas to each reviewer. Otherwise, all reviewers may comment on the same spelling error, which is usually only a minor inconvenience that takes focus away from the important issues.

Simply stating each reviewer’s expected contribution will already achieve such a focus. For example:

  • A reviewer invited because of their upstream relation to your document should be assigned to look at the correct interpretation and coverage of their provided input.
  • A reviewer invited because of their downstream relation to your document should be assigned to check if they understand your document and if it is unambiguous and specific enough for them to further decompose and detail.
  • A reviewer invited because of their ‘across’ relation to your document should be assigned to check if they understand your document and if it is unambiguous and specific enough for them to define test cases and/or test approaches.
  • A reviewer invited because they’re an SME should be assigned to ensure the quality, sanity, and completeness of your document.
  • Finally, assign only one reviewer to also check for grammar and spelling errors. This simple assignment ensures all other reviewers won’t remark on them, since someone else is already tasked with that, and it keeps everyone focused on their own assigned areas.
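These focus-area assignments can be captured in a small sketch. The relation names and focus strings are assumptions for illustration; the point is that exactly one reviewer receives the additional grammar/spelling duty.

```python
# Hypothetical sketch: map each reviewer's process relation to a focus area,
# and give exactly one reviewer the extra grammar/spelling assignment.

FOCUS_BY_RELATION = {
    "upstream":   "Correct interpretation and coverage of provided input",
    "downstream": "Clarity and specificity for further decomposition",
    "across":     "Clarity and specificity for defining test cases",
    "sme":        "Quality, sanity, and completeness",
}

def assign_focus(reviewers: dict) -> dict:
    """reviewers maps name -> relation; returns name -> list of focus areas."""
    assignments = {
        name: [FOCUS_BY_RELATION[relation]]
        for name, relation in reviewers.items()
    }
    # Only the first reviewer also checks grammar and spelling, so the
    # others stay focused on their own assigned areas.
    first = next(iter(assignments))
    assignments[first].append("Grammar and spelling")
    return assignments
```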

Informal Reviews Leading up to Formal Reviews

Not every review carries the same weight. A review without a formal context doesn’t require the involvement of authorized (senior) colleagues, or managers, to formally sign off on the document.

Reviewers tasked with the formal sign-off of your document usually have a different focus than reviewers tasked with evaluating the quality and completeness of its content. Combining these two types of reviewers in one review leaves each role questioning its own contribution while the other role addresses the issues it has found. Therefore, it’s advisable to take a two-level approach to reviewing your documents:

  1. Evaluate the content for any product or service issues, or inconsistencies.
  2. Evaluate the content for any business-context (i.e., legal, contractual) aspects.

The review process and approach described above focus on the quality and completeness of the content, so that the formal approval review becomes nothing more than an administrative necessity: the subsequent approval of the document is a mere formality to be rubber-stamped.

“Review Center is facilitating communication. It has ensured a shared view of the world and agreement from all stakeholders. There are no surprises anymore. Jama Connect enables us to review documents and make decisions easily with everyone coming to a shared conclusion.” – Craig Grocott, Head of Systems Engineering, Teledyne e2v


Related Customer Story: With Jama Connect®, TELEDYNE e2v Improves Communication and Reduces Risk


Review Process in Jama Connect

The review process and approach described above are independent of any application used to support them.

Jama Connect supports this review process and approach, and can even provide additional focus.

Divide and Conquer

Atomic Nature of Jama Connect Items

All Items, Components, Sets, and Folders are atomic.

Although Jama Connect doesn’t really have the concept of ‘documents’, most customers use Sets and Folders to represent the content and (chapter/paragraph) structure of their documents. Just as documents use chapters and paragraphs to group and structure information by topic and by level or priority of information, Jama Connect uses Folders, and a folder structure within a Set, in a similar way.

Because everything in your Jama Connect project’s structure is atomic, you can select a Set and generate a document, or start a review of that Set, and all child elements underneath.

Utilizing This Atomic Nature of Items

The same ability exists when you select a Folder within that Set, allowing you to send only the chapters and/or paragraphs on specific topics from your total document to your subject matter experts (SMEs), without them having to go through the entire document. Each specific topic can be sent for review to the relevant (group of) SME(s) to get the most out of finding issues and correcting the (technical) content of each part of your document.

Once all reviews on all specific topics are concluded, you can move to formally approve the entirety of that document.


Related: Review ROI Calculator – Improve Review Process


Rolling Reviews

A “rolling” review is a review whose content changes with each revision: every new revision re-collects the requirements to be included. Using this methodology, each review is much smaller in scope and can typically be completed faster.

Rolling reviews are a standard part of the Engagement Workbook nowadays. This mechanism actually combines a number of the review approaches discussed above into one:

  • Divide and conquer
  • Peer reviews leading up to approval reviews

It’s centered on three facts: all Items are atomic, each Item Type has the same status workflow, and Filters define the content of your review. Each new version you create of your review re-collects all Items that comply with the filter you’ve set up and baselines them.

This allows you to review each Item, or a subset of Items, separately, and to collect all Items that have reached a status indicating they are mature enough (i.e., the content has the quality, sanity, and completeness your organization strives for). With those Items, you can then organize an approval review for the entire ‘document’.
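A rough sketch of those rolling-review mechanics follows. The item statuses and the stand-in for the filter are invented for the example; this is not the Jama Connect API.

```python
# Illustrative sketch of a rolling review cycle. Statuses and the filter
# predicate are assumptions for the example, not Jama Connect API calls.

from dataclasses import dataclass

@dataclass
class Item:
    id: int
    status: str  # e.g., "Draft", "In Review", "Accepted"

def next_review_revision(items, status_filter={"Draft", "In Review"}):
    """Re-collect the items matching the filter and 'baseline' them
    (freeze a snapshot of which item ids are in this revision)."""
    in_scope = [i for i in items if i.status in status_filter]
    baseline = tuple(i.id for i in in_scope)
    return in_scope, baseline

def ready_for_approval(items, mature_status="Accepted"):
    """Items mature enough to roll up into the approval review
    of the entire 'document'."""
    return [i for i in items if i.status == mature_status]
```

Each new revision of the rolling review re-runs `next_review_revision`, while `ready_for_approval` gathers the items whose status says they are done.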

Jama Connect Review Process

When initiating a review in Jama Connect, the steps included support this generic review process:

  1. Include the linked Items – upstream, downstream, and across – so your review invitees can evaluate the traceability.
  2. When using the ‘Rolling Review’-approach, select the corresponding filter.
  3. Invite at least three, but preferably four, colleagues in accordance with the contributions you can expect from their engineering role.
  4. Rewrite the standard invitation text of the email to assign focus areas to the invitees.

Related: Best Practices for Jama Connect Review Center


Additional Review Activities

Collaborative Review Meetings

Jama Connect allows organizations to run reviews online, which enables reviewers to determine when and where to spend their time participating in that review. However, much is to be learned from review meetings, where comments of a reviewer spark new insights and subsequent questions from another reviewer.

If the Moderator organizes a Review Meeting (a collective get-together to discuss the review results and the issues found with all review participants), keep the focus on the issues found and avoid discussions that take longer than a few back-and-forths; the goal of the meeting is to process as many issues as possible in the (short) time available.

These discussions are important, so write down their topics and allow time to go into them later. Ensure the meeting timeframe of the review session has a section for the actual review and a section for discussions.

Simply accept all comments about grammar and spelling errors and ask the author to correct them after the session.

One thing to consider:
Abbreviations, terms, and definitions and how they’re used throughout your document do matter and should not be considered grammar or spelling errors!

Preparing For a Review Session

Insist that everybody comes into the review session prepared, i.e., they’ve read the review comments of the other reviewers, made notes, and have their responses ready. Unprepared participants end up reading and evaluating reviewers’ comments during the session, responding to a comment only after the rest of the reviewers have already moved on to the next issue.

Being prepared means the meeting can have short, to-the-point, and decisive discussions during the session, while still allowing you to process as many issues as possible.

In Conclusion

Defining the steps for approaching reviews and review coverage will help teams bring the scope of the review process into a more precise focus. By using an iterative and collaborative approach for reviewing requirements and other artifacts in real-time, organizations can improve stakeholder alignment, reduce review cycles, and ease the path to compliance.

Leveraging Peer and Approval Workflows to Optimize Your Peer and Approval Process
https://www.jamasoftware.com/blog/leveraging-peer-and-approval-workflows-to-optimize-your-peer-and-approval-process/
Tue, 05 Oct 2021

Reviews play a key role in successful product and systems development, helping to ensure the new project meets stakeholder, market, and compliance requirements. Peer and approval review processes enable organizations to both iterate and innovate quickly, while providing a dedicated process to apply appropriate rigor for final reviews. In addition, integrating item workflow with approval reviews can eliminate manual processes and reduce human error.

In our most recent Ask Jama webinar, “Leveraging Peer and Approval Workflows to Optimize Your Review and Approval Process,” we talked about how to:  

  • Enable item transitions to be automatically triggered by a finalized review – reducing errors, clicks, and time needed to manually transition review items  
  • Improve Part 11 compliance by allowing organizations to set a locked status to review items that have been finalized  
  • Configure your own smart defaults for reviews by moving all settings to the central organization administration 

Below is a recording of the webinar and an abbreviated transcript.


 

 

Thank you all for joining us today. My name is Julie Goodner, I’m one of the Senior Product Managers here at Jama Software. I’ve been with Jama for roughly two and a half years, working in many areas of the application, most recently in the Review Center. My drive, really, is to make our product easy and functional for our customers. I am always available to hear your thoughts and ideas, so if you want to reach out to me at any time, please feel free to do so. And today I will be highlighting the work the team has completed in Review Center by adding these new Peer and Approval templates.

In today’s webinar, we will be discussing how leveraging these new Peer and Approval review templates help connect with your Items Workflow to create efficiency and reduce errors in your project, and how including Workflow with your Approval review will eliminate manual steps and reduce errors. And lastly, we’ll be going through all these settings and features in Jama Connect. I’ll give a full rundown, how to do this, and where to do it. This is the overview of today’s demonstration. 

First, I will be the org admin going into the Review Center settings and showing you the new Peer and Approval templates along with some features that we have added into the Default template, which was the way you’re used to doing it in the past. And then from there, I will take you into Workflow and show you how Workflow and Approval templates merge together to create a cohesive experience for you when you’re finalizing the Approval review. Next, I’ll become a moderator and look at the new settings in Review Center and see the Approval and Peer. And then I’m going to walk you through what it looks like to transition a Peer review into an Approval review, and then see your items finalized at the end. 


RELATED: Check out our upcoming webinars by visiting our events page! 


So in Review Workflow, in Settings, the Approval review with Workflow reduces errors, clicks, and time needed to manually transition reviewed and approved items in your project. How this works is to enable the new Approval review, like I just stated, and we’ll walk through that again, review template and configure the Approval workflow for your items. In addition, this improves Part 11 compliance by allowing organizations to set a lock status to a reviewed item that has been finalized, no longer having to go back manually and locking those items after your review has been approved. And for efficiency, we have moved all the review setups into the admin section. And by doing this, your org admin can now set up smart defaults to eliminate confusion for your moderator when they create that review. 

Once the moderator now sees these new templates, they want to know what the difference is between probably the Peer and Approval review. Well, how we look at it as a Peer review is used to collaborate with your team, refine your requirements, and get them into the spot that they’re really ready for that final approval. Once they’re ready, the moderator then can transition the Peer review to an Approval review. They can invite all the stakeholders and, when completed and signed off, they can finalize that Approval review, which will then automatically trigger the Workflow items. So they don’t again, have to do a batch update or go into each item, transition, put locks on them, et cetera. All this is done automatically for you, now, with this new Approval review with Workflow. When the review is transitioned, it maintains all the previous comments and signatures, again for audit, or if you need to look back in historical facts. Peer review and Approval review transitions are all captured in the activity stream and version history. 

This greatly improves the moderator’s visibility into participants’ progress, and workflow transitions apply to review items when finalizing that review. And I just want to call this out: a review’s settings no longer change if the settings in the admin are updated while the review is in flight. So if you’re working on a review and your org admin changes something, for whatever reason, your review will maintain its exact settings as they were when you started or edited that review. If you do need to take on the new settings, you will have to create a new review.


RELATED: Introducing Jama Software’s New Hands-On Workshops (H.O.W. with Jama Connect®)


Alright, so let’s get into the demo. I’ve opened my project, I’m now the org admin. I’m going to go into my admin section, and we just got the 862 release, which will give you these new settings. First, I’m going to go into the Review Center. As you’ll notice, we have added the electronic signature settings and the optional settings into this main area. So again, your admin can now create, like I said before, those smart defaults. And these will carry over into your Workflow or your Review wizard when they create that review. But let’s really talk about the Peer and Approval. First, let’s get into Peer. You’ll notice all the settings are the exact same as they are in the Default or Non-template review. We suggest that you let the moderators override, but if you don’t want to, you can simply turn this off. What that means is a moderator can override any of the settings that your org admin has set up. 

You can have electronic signatures, but we don’t think you need them for a Peer review. But if your company does, you can simply keep those on. And we always suggest having the comments show up in the single item view, but again, you can turn those off as well. We have them private, or you can make them public, and so on and so forth. Once I’m done with that one, I’m going to go into my Approval review. Again, the settings are the same. These settings cannot be overridden by your moderator. So once they’re set up by your org admin, they will be the settings in your wizard. If you do need to adjust them, you will have to contact your org admin and have them update whatever settings you need, and then you can carry forward with your reviews.

To learn more about leveraging peer and approval workflows to optimize your reviews, watch the full webinar here.



Webinar Extra: More Answers About Effective Requirements Reviews
https://www.jamasoftware.com/blog/best-practices-for-reviewing-requirements/
Wed, 18 Dec 2019

You have questions, we have answers.

Ahead of our recent webinar, “Ask Jama: Tips and Tricks for More Effective Reviews,” we asked those who registered to quiz us with their toughest queries around reviewing requirements.

In response, we received hundreds of questions ranging from dealing with unresponsive reviewers to figuring out how requirements can best be reviewed collaboratively to getting the most out of Jama Connect™ Review Center.

Our experts who led the webinar, Erin Wilson, Senior Consultant, and Joel Hutchinson, Senior Product Manager, tried to answer as many questions as possible following their presentation.

Despite all the answers Erin and Joel provided during the webinar, they weren’t able to get to every submission. So we tracked Erin and Joel down following the webinar, locked them in a room, and grilled them about some of the questions that were still outstanding. You’ll find the output of those questions and answers below, and be sure to download our guide for more best practices around reviews.

Q: How can I review a large number of requirements effectively without consuming a lot of stakeholders’ time?

Erin: Choose the right participants and be thoughtful about the number of people you will invite into a review. Too many, and you risk not having enough time to incorporate the right feedback. Too few, and you risk not receiving enough feedback or missing critical stakeholder input. As far as the number of requirements, just performance-wise, we usually recommend 250 items in the review or less. I would like to see much less. I prefer to do things in more of an iterative fashion, where you’re sending things in and out of review as they become ready. I do understand that a lot of customers have to send something that resembles a document for review, and that might entail having many more requirements in there. I would hope that would be the exception rather than the norm. What do you think, Joel?

Joel: It’s a chicken and egg question. What’s the balance of too much content and not enough content? What’s the balance of too many people and not enough people? There’s no right or wrong answer.

I think there’s another consideration as well, in terms of electronic signatures, which is: how do you want to aggregate your signatures? Every review has the ability to configure electronic signatures. Let’s say I’ve got 250 requirements that I want approved. Do I need to show all their signatures in the same spot? Do I need to make sure that the same people reviewed all 250? That could influence whether you set up multiple reviews or one review. We have the ability to export the signatures that are made on the review, so if that’s enough, and you could stitch those documents together, great. If it’s not, and you need it all to be shown as the same general time frame that everything was approved, then you may want to lump them together.

Learn how much time and resources our customers save by using Jama Connect Review Center in this infographic.

Q: What are a few things I can do right now to conduct more effective reviews?

Erin: You can read our best practices for moderators and participants. Set clear goals: make sure you tell people the type of feedback that you’re looking for, so that people don’t go down a rabbit hole. Are you looking for feasibility? Something grammatically correct? What is it that you’re looking for? Again, just picking the right participants is huge, and then making sure the team is prepared, that they have been trained, and that they understand why we’re moving to the Review Center and why we’re doing this.

Joel: I think using the tags on comments is an instant improvement. When you come into a review and you have a question on something, ask a question. If you’ve got an issue, raise an issue. That’s true in meetings, as well. If you have an issue, come out and say it. It’s the same thing in an online medium.

Q: How are we most likely to catch critical errors in requirements reviews?

Joel: Make sure you bring in the right people first off.

Erin: Making sure that people are focused on what they’re supposed to be reviewing for. If they are supposed to be reviewing as a mechanical engineer, make sure they’re looking at it through the lens of a mechanical engineer. Collaboration is huge, too. And again, if you see something that seems a little odd, don’t be afraid to say something. Your team should feel empowered to be able to speak up.

Joel: I think this is one of those questions where, just from a data perspective, smaller reviews are better. There are studies that have shown that, if you have to go through multiple pages of content, your attention span’s going to wane. Try to keep things down to a level that’s manageable, and then bring in the right people to actually unearth those critical errors. Smaller, more frequent reviews, I think would lend towards the ability to look at things more critically.

I think with a lot of things it’s a balance. When I was doing this in industry, we would mix the two: there were certain things that we had to do by certain milestones. That’s forced, right? Those are bigger reviews. You have to do smaller, more collaborative reviews along the way, otherwise you’re never going to be ready for those big reviews. The same thing happens in a virtual environment. Really, we rely on the organization to know their product development process and what the right pace is. Those are the types of things that you should be thinking about. You should be thinking about when we have to actually sign this stuff and move it, and what we need to do to get ready for that.

Get an overview of the recent updates to Jama Connect Review Center on our blog.

Q: How do we get reviewers to move their requirements forward once approved?

Erin: It’s important to have the team understand what the process is: What now? What if? What happens here? And what happens when this review is done, and the requirements need to be transferred to some kind of end state. And then, after approval or acceptance of the requirements, you would want a way for participants and even non-participants to go back into the project and say, “Hey, now we’ve got all of these approved user needs. These are all ready for me to take action on.” We can set up filters, where we say, “Here’s all of our approved user needs, but show me just those approved user needs that are missing validation.” Maybe the validation team would know, “Hey, this is our bucket of requirements to go work on.” Stuff like that.

Q: How do you set a realistic review date, so you don’t have to keep opening the review?

Joel: There’s a balance of, when do I want people to actually look at this stuff? You set a date two months in advance. When is somebody going to look at it? Probably a week before it’s due.

That said, the reason we suggest a week is that usually, that’s the timeframe where somebody will actually think about needing to do something.

Erin: The expectations have to be clear. You have to be very clear, and this is where a Jama champion can come into play as well. You’ve got these Jama cheerleaders, so to speak, who are helping to coach team members along, and making sure that you’re monitoring the status, and monitoring the progress that people are making.

Whether you set the review for tomorrow or a week from now, you have to make sure that the expectation is understood, and that there’s something that’s going to happen at the end of that day.

If you continue to keep updating, and updating, and updating, because this one person is a holdout, then you’re going to keep doing that forever.

Joel: That’s also where the stats page comes in. As a moderator, you’re trying to drive the conversation and give people the environment to be successful. You can tell if somebody’s not participating. That’s where the stats page comes in. You could go take a look and see if there is an item, or is there something that we’re talking about that nobody understands and nobody’s reviewing? That’s something to ask questions about. Is there a person that just won’t play the part? That’s something else to ask questions about. Ultimately, the decisions of what you do with those, that information, that’s up to the team.

See how Jama Connect Review Center improves collaboration and increases efficiency in the approval process by watching our demo video.

Q: How can teams manage reviews with team members spread out across the globe, such as in the US, Europe, and China?

Erin: I recommend setting some kind of a cadence for reviewing items and giving feedback. When a moderator is providing feedback and making changes to the reviewed content, I would recommend that they get into the habit of publishing the revisions of the review at the same time, each day or every other day. That way, people can prioritize their own work and they can expect when a new version of the review is going to be published.

Jama Connect is inherently a really great collaboration tool. We just have to take a human approach, where we help people understand what’s going to come next and how to set those expectations.

Q: Can you talk a little bit about exporting review data from Jama Connect Review Center?

Joel: This is something we’ve been focusing on a lot recently, and we’ve been focusing on it for a particular reason. We want to be able to show that Review Center could be where you put your electronic signatures, and as a result of doing that, we’ve bought into certain things.

One is that we need to be able to get information out of the system that took place in the review. A review is a special place. It’s where all the collaboration happens. Depending on how much of that collaboration you want to share out, we have different types of exports for you.

We have an activity history export. That’s the audit trail. It’s just the facts: everything that’s happened in the course of all the versions of a review gets exported. That’s something that you could put in your document repository, you can send it to somebody. You can send that to Word or create a PDF.

In addition to that, we have an export that looks at a particular version of a review. It’s like, “Give me the content. Give me the people that signed and when they signed. Give me the roles that those people took as part of signing that thing, and then all of the comments that happened.” And then we put those comments out in almost like a blog post — a threaded style — so that you can follow along quickly and understand what took place in that review.
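The threaded, blog-post-style comment export Joel describes could look roughly like this sketch. The comment structure (`id`, `parent`, `author`, `text`) is assumed for illustration and is not the actual export format.

```python
# Sketch of rendering review comments in a threaded, blog-post style.
# The comment record structure is an assumption for illustration only.

def render_thread(comments, parent=None, depth=0):
    """comments: list of dicts with 'id', 'parent', 'author', 'text'.
    Returns indented lines, with each reply nested under its parent."""
    lines = []
    for c in (c for c in comments if c["parent"] == parent):
        lines.append("  " * depth + f"{c['author']}: {c['text']}")
        # Recurse to pull in any replies to this comment.
        lines.extend(render_thread(comments, parent=c["id"], depth=depth + 1))
    return lines
```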

Q: What’s the one thing that people will take away from this webinar if they watch the full thing?

Joel: When a customer moves to Jama Connect in the first place, they need to have a heart-to-heart and say, "What are we trying to do? How often should we talk?" That's really where moving to an online medium for some of this content becomes important. You can't have all the conversations in Jama Connect, but you can have many of them within the platform. How do you want to have conversations? How do you want to approve of stuff? When do you want to approve?

No tool is going to tell you the best way to manage your company. You have to think about that stuff before you move in. There’s a lot of choices, but once you make those choices, then the idea is that the conversation gets easier.

Learn how to get the most from Jama Connect Review Center by downloading our best practices guide.

]]>
Streamlining Requirements Reviews: Best Practices for Moderators, Reviewers, and Approvers https://www.jamasoftware.com/blog/best-practices-for-streamlining-requirements-reviews/ Tue, 19 Nov 2019 12:30:00 +0000 https://www.jamasoftware.com/?p=36349

Reviews play a key role in successful product and systems development. They help ensure a new product will meet stakeholder, market, and compliance requirements.
Unfortunately, not all teams recognize the importance of implementing a solution for a formal review process: According to a research report from Engineering.com, almost a third of teams have no requirements management system in place and instead rely on informal collaboration and reviews via email and shared spreadsheets.

Outdated review processes — involving long email chains, shared spreadsheets, and lengthy meetings — stifle collaboration, increase miscommunication, and result in team misalignment. This often leads to long review cycles, versioning issues, and an abundance of unnecessary meetings.

While a breakdown in communication can happen at any point in the product development lifecycle, reviewing and approving requirements is a particularly important process for ensuring all stakeholders are aligned. In fact, collaboration and clear communication during the review process have tangible benefits that impact speed to market, product quality, and your bottom line.

And while every review is different, there are generally three primary roles that exist in a product review: Moderators, Approvers, and Reviewers. In Jama Connect™ Review Center, each of these roles can be formally assigned to mirror best practices and ensure everyone understands the scope of their responsibilities.

Formal and informal reviews may necessitate different things from each of these roles, but we’ve compiled a list of best practices by role to make reviews go quickly and smoothly.

See the development time savings and efficiencies that our customers are experiencing with Review Center in this infographic.

Best Practices for Requirements Reviews by Role

As the Moderator, you are ultimately responsible for facilitating the review and incorporating the feedback from Approvers and Reviewers.

Best Practices for Moderators:

Provide thorough guidance. What type of feedback are you seeking? That the requirements are valid and correct? Or that the requirements are feasible? Or that the requirements are written well with proper grammar and syntax? Be sure to include specific focus and instructions in the review invite so all participants know exactly what to provide.

Balance the number of participants. Think carefully about the number of people you invite to a review. Too many and you’ll never have time to incorporate all the feedback. Too few and you may not receive enough feedback or miss critical stakeholder opinions.

Incorporate all feedback. If you have thoughts, feedback, or ideas related to a requirement, add comments for transparency so all participants can see the feedback.

Revise! It’s ok to publish lots of revisions during a review. Just make sure that all participants are looking at the latest revision so they can easily compare differences across revisions.

Close reviews when they are complete. Reviews finish when 1) you have enough feedback or 2) the deadline is reached. If you have enough feedback prior to the deadline, make sure you close the review.

Best Practices for Reviewers and Approvers:

If you are taking the role of Approver or Reviewer, your primary responsibility is to provide feedback.

Focus your feedback based on the Moderator’s instructions. What did the Moderator request you to review? Technical feasibility? Validation of requirement needs? Grammar and syntax? If you’re unsure, ask your Moderator.

Highlight important feedback. When adding feedback, highlighting text helps others know that you are focusing feedback on that specific piece of the requirement.

Categorize your comments for clarity. Indicate if your feedback is a question, proposed change, or issue.

Clearly communicate when you are finished. Make sure you clearly communicate that you are finished providing feedback so the moderator will know you are done. Keep in mind that you don’t need to comment on every item – you can abstain from providing feedback on certain items in the review.

Register for our upcoming webinar, “Ask Jama: Tips and Tricks for More Effective Reviews.”

The Benefits of Conducting Reviews in Jama Connect™

Jama Connect Review Center allows teams to:

  • Assign roles such as Moderators, Approvers, and Reviewers
  • Send product requirements for review
  • Define what’s required
  • Invite relevant stakeholders to participate, collaborate, and iterate on resolving issues
  • Approve agreed-upon requirements

Facilitating the improvement of collaboration and communication during reviews is resulting in major returns for Jama Software customers.

Take RBC Medical (now known as Vantage Medtech), for example, which now saves an average of $150,000 per project after moving from semi-manual processes to conducting reviews in Jama Connect. Now that RBC Medical has a centralized place to manage and collaborate on reviews, they've all but eliminated the need for lengthy, in-person meetings and back-and-forth emailing, making reviews more efficient and scalable.

But cost savings isn't the only positive business outcome to result from an improved review process. MediSync estimates that Jama Connect Review Center has saved the team 80% of planning time that previously would have been wasted on meetings, sorting through versioned documents and emails, and consolidating feedback during review cycles.

Another customer, global healthcare leader Grifols, says that Review Center has helped it shorten review cycles from three months to fewer than 30 days, while reducing budget overruns. It estimates savings of over 80 hours per project in medical device development.

By simplifying the revision and approval process, Review Center streamlines reviews and facilitates team collaboration. With the ability to easily provide feedback where required, stakeholders and participants can move quickly and efficiently through reviews and on to the next stage of product development.

To learn more about best practices for moving through reviews quickly and seamlessly, download the Jama Software Guide to Review Center Best Practices.

]]>
How Online Reviews Affect A Product’s Perceived Value https://www.jamasoftware.com/blog/how-online-reviews-affect-a-products-perceived-value/ Tue, 01 Aug 2017 16:08:21 +0000 https://www.jamasoftware.com/?p=24540

Today, most consumers research products online before purchasing. It doesn't matter if the product is B2C or B2B, or even if there's only a single competitor in the marketplace. You can bet customers are searching for opinions on your product, as well as alternative options.

“We are seeing the growing power of a customer in driving perceptions of brands/products, and reviews is just one way,” Tom Collinger, Executive Director of the Spiegel Research Center at Northwestern University, wrote in an email to Jama Software.

Recent research has highlighted the ways in which consumers perceive online reviews and how they inform purchasing decisions. And from that has come insights into how companies can think about online reviews — including why negative ones aren’t all bad — as well as ideas for future-proofing from backlash.

Value of Online Reviews

Researchers at the Institute of Cognitive Neuroscience at University College London, for instance, recently looked at how online reviews influence the perception of a product. The study first had 18 participants rate a range of Amazon items based only on image and description. The subjects were then asked to score the products a second time, but were instead shown the image along with aggregated user reviews, which displayed the average score and total number of reviews.

Turns out the subjects’ opinions were very much swayed by reviews, as their second round of ratings fell somewhere between their original score and the average. As Science Daily notes, “Crucially, when products had a large number of reviewers, participants were more inclined to give ratings that lined up with the review score, particularly if they lacked confidence in their initial appraisals, while they were less influenced by ratings that came from a small number of reviewers.”

As the study showed, people leaned more toward group consensus when their confidence about a product’s overall quality was low and the pool of reviews was large. Anecdotally, this seems to track. If you see a product online with over a thousand five-star reviews versus one with just two five-star reviews, you’re probably more likely to trust the one with more reviews, since it appears more credible.
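The pattern the study describes can be sketched as a simple weighted average, where the weight placed on the crowd average grows with the number of reviews and shrinks with the rater's initial confidence. This is an illustrative model only: the function name, the constant `k`, and the specific functional form are assumptions for the sketch, not the study's actual methodology.

```python
def updated_rating(initial, crowd_avg, n_reviews, confidence, k=10.0):
    """Illustrative social-influence model: the revised rating falls
    between the rater's initial score and the crowd average.

    The weight on the crowd grows with the number of reviews and
    shrinks with the rater's confidence (0.0 to 1.0) -- both effects
    reported by the study; this particular formula is an assumption.
    """
    w = (n_reviews / (n_reviews + k)) * (1.0 - confidence)
    return (1.0 - w) * initial + w * crowd_avg

# A low-confidence rater facing 1,000 reviews moves most of the way
# toward the crowd average...
print(updated_rating(initial=2.0, crowd_avg=4.5, n_reviews=1000, confidence=0.2))
# ...while a confident rater facing only 2 reviews barely moves at all.
print(updated_rating(initial=2.0, crowd_avg=4.5, n_reviews=2, confidence=0.9))
```

In both cases the second rating lands between the original score and the average, just as the study observed; only the distance moved changes.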

Importance of Average Scores

Not that products with tons of 5-star reviews receive a blanket pass. Another recent study on the power of online reviews, this time conducted by Northwestern University’s Spiegel Research Center, in conjunction with the platform PowerReviews, analyzed millions of customer experiences from online retailers.

Northwestern discovered that products with near-perfect scores can appear almost too good to be true. "Across product categories, we found that purchase likelihood typically peaks at ratings in the 4.0 – 4.7 range, and then begins to decrease as ratings approach 5.0," the report states. So while you shouldn't seek out negative reviews, a few can lend your product authenticity.

Beyond that, Northwestern also detailed some other ways in which negative reviews can help a product. According to the study, a product displaying at least five reviews online, positive or negative, has a purchase likelihood 270% greater than a product with zero reviews.

Importance of Early Reviews

The study also discovered that nearly all increases in purchase likelihood of a product from online reviews occurred within the initial 10 reviews posted, with the first half of those being the most influential.

Of course, not all websites display reviews the same way. Some sort by review quality over chronological order, for instance. For the purposes of this research, the first five were found to be the most important regardless of how they were presented. “Our analysis describes the first five the consumer sees,” Collinger wrote in an email to Jama Software. “This is independent of the way in which they are listed by the retailer.”

This is why it's so important for a company to get a product right at release. If a new offering gets savaged by a wave of online reviews initially, regardless of how they're sorted on a website, those will be the first and most influential opinions people read when considering a product. And that reputation can stick. Patching the problem in future releases will help, but then you're counting on people updating their reviews later on, which is no sure thing.

Good Products Come From Good Processes

When managing online reviews, the Northwestern study urged companies to focus on the first five reviews, embrace critical reviews, and follow up on purchases via email to encourage consumers to write reviews. In fact, reviews from actual, verified buyers — as opposed to those who post reviews anonymously — are more likely to be positive than negative. The study also recommended making it easy for buyers to post reviews on a company's website, regardless of their device or platform.

From a business standpoint, one other fundamental way to get online reviews working in your favor is by releasing the best product possible. That starts with a solid development process, with plenty of quality safeguards in place.

Best practices like implementing test management early and often, for instance, will reduce the number of defects or bugs that are likely to show up in your final release. Often, it’s exactly those types of design misfires that’ll get a product maligned by early online reviews.

And while ensuring the quality of a final product is a key factor of success with online reviews, according to Collinger, it doesn’t stop there. “Getting the entire customer experience right is the very simple solution to leverage this growing customer influence,” he wrote.

]]>
Review Center: Adding Items to Review https://www.jamasoftware.com/blog/review-center-adding-items-to-review/ Thu, 21 Feb 2013 06:03:19 +0000 https://www.jamasoftware.com/?p=2752  

How can I add items to a review after it started?

There will come a time when you forget to add an item to a review. Rather than starting over, it would be great to be able to add items to a review that is already in progress. While there are some inherent risks to adding items to a review in progress, we also understand that it is sometimes necessary.

The key to adding items to a review already in progress starts with how you initiate the review. For example, if you start a review by clicking on a container like Project, Component, Set, or Folder, you will be able to add additional items to that review. However, if you start a review based on a selection of individual items you will not be able to add items to that review once it is in progress.

Invest two minutes and watch this short video as we break down the steps required to add items to a review already in progress. If you have any follow-up questions, either post a comment below or access support via support.jamasoftware.com.

 

 

]]>
Peer Reviews: Two Eyes Aren’t Enough, Part 2 https://www.jamasoftware.com/blog/peer-reviews-two-eyes-arent-enough-part-2/ Mon, 22 Oct 2012 19:57:04 +0000 https://www.jamasoftware.com/?p=3159 Here are some additional recommendations about how to make peer reviews of your requirements documents as effective as possible.

Don’t Overwhelm the Reviewers

Many BAs wait until their requirements specification is complete before they present it to some reviewers. Busy developers, testers, project managers, and users have difficulty finding the time to scrutinize a large document on short notice. It’s far more effective to review an evolving document incrementally. Give reviewers just a few pages at a time, preferably soon after an elicitation activity. Nearly anyone should be able to find thirty minutes to review a small quantity of information once in a while.

Incremental, informal reviews will uncover a lot of problems. Expect to make multiple review passes through the requirements over time. Each review cycle will reveal errors that the reviewers didn’t spot the first time. As the requirements continue to change and grow, people might need to revisit portions of the requirements documentation that they examined in an earlier draft.

You don’t need to hold a meeting for each peer review. Sometimes it works fine just to ask one or a few reviewers to look something over and tell you what they see. Combine these sorts of informal individual reviews, which I call peer deskchecks, with formal inspections of documents that are nearly done.

Build a Collaborative Partnership with Project Stakeholders

Explain to users why their input is so critical to ensuring the quality of the requirements and how it contributes to the quality of the ultimate software product. Make them understand that their review effort is a vital contribution, not an idle exercise. Begin forging this collaboration early in the project so these participants realize they’re valued members of the team.

On many projects, my software development teams at Kodak identified product champions, key customer representatives who worked closely with the business analysts. We negotiated the exact responsibilities with each product champion. But one responsibility was not optional: to review requirements specifications and evaluate prototypes. Without user review, we couldn’t tell whether we’d accurately captured the voice of the customer. We were pleased that all of our product champions accepted this responsibility. They provided great value through their reviews.

Invite the Right Reviewers

Determine early in the project what perspectives you need represented in your requirements reviews and who can provide these perspectives. Figure 1 illustrates the essential points of view a requirements review should take into account. Particularly consider getting the participation of the following:

• Customers who provided requirements input.

• Developers who will have to design and implement the requirements.

• Testers who will have to verify that the requirements were properly implemented.

Work products must be reviewed in a context, not in isolation. The reviewers must ascertain whether the work product meets its own specification. The top-level requirements documentation has no specification or reference document, so you need customers or others who provided requirements input to review the deliverable. Also, you might invite another BA to participate who’s adroit at spotting poorly written or missing requirements. The downstream “victims” of the requirements specification can check to see whether it will satisfy their needs. And if your product connects in some way to any other products, have representatives of those other components make sure the pieces will fit together properly.

Rather than having all these different reviewers just read through the document, consider using perspective-based reading, in which each reviewer examines the deliverable from the point of view of a specific document consumer. For example, a user seeks to determine whether the documented requirements would in fact let him achieve his business objectives. A developer checks to see whether the document contains the information he needs to design and implement a solution. A tester considers whether the requirements are precise and detailed enough to be verifiable. These different points of view will reveal different types of problems.

Have Reviewers Examine Appropriate Deliverables

It might not be reasonable to expect all your user representatives to effectively review a detailed software requirements specification. They should certainly understand use cases, though, as use cases ought to be written from the user’s point of view. Make sure your reviewers can comprehend the requirements documents and diagrams well enough to validate them. If the requirements documents are too technical for the reviewers to follow, you’re wasting their time.

Design for Reviewability

Present the information in a specification in forms that make it easy for reviewers to understand it and to examine it for problems. There are many ways to communicate besides natural language text. If your eyes glaze over when reading a long list of textual requirements, maybe a diagram or a table would be an effective alternative. Remember that a requirements specification is a communication tool. If your requirements deliverables don’t speak to their intended audiences, the deliverables need further work.

Inspect All Requirements Deliverables

Informal reviews certainly are helpful, but more systematic inspections will find more defects. Inspections and other group reviews also are a way to force the issue of getting reviewers to actually look at the work product. Inspections are a powerful technique for spotting ambiguous requirements. During an inspection, one inspector (not the author) serves as the reader. The reader presents his interpretation of each requirement to the other inspectors. If his interpretation doesn’t match their own understanding, perhaps the team has detected an ambiguity, a statement that can be interpreted in more than one way. Individual informal reviews often overlook ambiguities because an ambiguous requirement can make sense to each reader, even if it means something different to each of them.

Emphasize Finding Major Errors

The greatest leverage from a review comes from finding major errors of commission and omission. These are the defects that can help you avoid extensive—and expensive—rework much later in the project. Ambiguous and erroneous requirements send developers and testers in the wrong direction. Missing requirements are among the hardest errors to detect. They’re invisible, so inspectors don’t see them during their individual preparation. Because they don’t exist, the inspection reader won’t describe them.

Fixing typographical and grammatical errors is useful because any changes that enhance effective communication are valuable. However, this should be done before sending the document out for broad review, perhaps by having a single skilled editor go through it initially. Otherwise, reviewers can trip on these superficial errors and fail to spot the big defects that lie underneath. When I see an issues log from a review that contains mostly cosmetic and spelling mistakes, I worry that perhaps the reviewers overlooked major problems.

No business analyst can get the requirements right on his own. Get a little help from your friends to make sure that what you’ve written will satisfy customer needs and will let the rest of the development team do a first-class job.

Check out Peer Reviews: Two Eyes Aren’t Enough, Part 1.

Jama Software has partnered with Karl Wiegers to share licensed content from his books and articles on our web site via a series of blog posts, whitepapers and webinars.  Karl Wiegers is an independent consultant and not an employee of Jama.  He can be reached at http://www.processimpact.com.  Enjoy these free requirements management resources.

 

]]>
Peer Reviews: Two Eyes Aren’t Enough, Part 1 https://www.jamasoftware.com/blog/peer-reviews-two-eyes-arent-enough-part-1/ Fri, 19 Oct 2012 17:14:48 +0000 https://www.jamasoftware.com/?p=3157 In my view, the most powerful quality practice available to the software industry today is inspection of requirements documentation. A peer review is an activity in which someone other than the author of a work product examines that product to find defects and improvement opportunities. An inspection is a type of formal, rigorous team peer review that can discover more subtle problems than individual reviewers might spot on their own. Removing errors in the requirements saves many times the cost of the inspection because of the rapidly escalating cost of remedying defects that are found later in the project.

Unfortunately, inspections and other peer reviews of requirements documents often aren’t held when intended, and those that are performed don’t always go well. Reviewers take only a superficial look (if they look at all), and they miss many of the major defects that lurk in the specification. Conducting an ineffective review can give the team unjustified confidence in the work product’s quality.

This pair of articles, adapted from my book More about Software Requirements, presents several ways to make your requirements reviews more effective and to encourage prospective reviewers to participate in them. For an in-depth exploration of different peer review approaches and how to make them work for you, see my book Peer Reviews in Software: A Practical Guide (Addison-Wesley, 2002).

Note that, although I refer to requirements documents in these articles, the same principles also apply whether you’re storing your requirements in a spreadsheet, a database, a requirements management tool, or on note cards. No matter what their form, you really ought to have multiple pairs of eyes examine your project’s requirements for problems.

Educate the Reviewers

Suppose someone hands you a document and asks for your input. Your instinctive approach probably is to begin reading at the top of page one and see whether any problems jump out at you as you go. Maybe you’re not too sure what to look for. As you continue reading, you begin to get tired and bored. Your own work backlog tugs at your conscience. So you flip through the rest of the pages, tell the author about the minor issues you found, and move on with your life. There’s a better way.

Don’t expect that reviewers will automatically know what to do. If you’re a business analyst who’s relying on input from others to validate and verify your requirements, you need to educate your reviewers. Ideally, you can get them to take some training on how to perform peer reviews, such as my eLearning course called “Software Inspections and Peer Reviews” (http://www.processimpact.com/elearning.shtml#sipr). If not, at least tell them about the purpose of reviews, the various sorts of review approaches, and what you expect of the reviewers. This is particularly important if you plan to hold inspections, which involve a more structured process than do the more familiar informal reviews.

Interact Respectfully

Peer reviews are at least as much a social interaction as they are a technical practice. I believe that asking someone to tell you about mistakes in your work is a learned behavior, not an instinctive behavior. Authors need to learn how to request and accept constructive input about mistakes they’ve made. If you’re holding inspections or team reviews that involve a meeting, make sure the participants understand how to collaborate effectively and constructively. Many organizations hold ineffective reviews because participants are uncomfortable and they aren’t sure how to behave in a review meeting.

One state government agency that hired me as a consultant told me that their review participants refer to the review as "going into the shark tank" — not an appealing prospect. If you've been burned by a caustic review in the past, you'll be reluctant to participate in such a stressful and ego-busting experience again. Therefore, it's important to make reviews constructive experiences in which all participants feel that they're contributing to improving the team's collective work products.

People need to be considerate when they provide feedback to the author of the work product being reviewed. I like to simply make observations about the product, rather than telling the author he did something wrong. If you’re the author, you might react differently if a reviewer accusingly says, “You forgot to fill in section 3.4 of the SRS template,” instead of simply commenting: “I didn’t see anything entered in section 3.4 here” or “Section 3.4 seems to be missing.” Be thoughtful about how you phrase your comments. Perhaps you’re a reviewer today, but you might be the author the next time.

Focus on Specific Problems

Tell the reviewers what kind of input you’re seeking in each situation so they can focus their attention on giving you useful feedback. Give them tips on how to study and analyze a requirements specification. For example, you might invite certain reviewers to start reading at some point other than the beginning of the document. After all, developers won’t read the entire document sequentially during construction. This is a way to get fresh eyes looking at various sections, rather than having all of the reviewers peter out partway through the document and miss a lot of errors in the latter pages. Some people suggest reading the document from back to front, section by section, to see whether the questions they come up with are all answered earlier in the document.

Give the reviewers a checklist of typical requirements errors so they can focus their examination on those points. You can find checklists for reviewing software requirements specifications and use case documents at http://www.processimpact.com/pr_goodies.shtml. You might suggest that the reviewers make multiple passes through the document, perhaps during separate review cycles of a growing document. They can look for different types of problems from the checklist during each pass. Alternatively, ask different reviewers to use separate parts of the checklist to broaden the review coverage. Eventually requirements reviewers will internalize the review checklists and know what sorts of problems to look for.
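The suggestion above, asking different reviewers to cover separate parts of the checklist to broaden coverage, can be sketched as a simple round-robin assignment. This is purely illustrative; the checklist sections and reviewer names below are hypothetical, not taken from any particular checklist.

```python
from itertools import cycle

def assign_checklist(sections, reviewers):
    """Round-robin each checklist section to a reviewer so that,
    collectively, the team covers the whole checklist with no
    single person responsible for every category of error."""
    assignments = {r: [] for r in reviewers}
    for section, reviewer in zip(sections, cycle(reviewers)):
        assignments[reviewer].append(section)
    return assignments

# Hypothetical checklist sections and reviewers
sections = ["Completeness", "Ambiguity", "Verifiability",
            "Consistency", "Traceability"]
reviewers = ["Ana", "Ben", "Chen"]
print(assign_checklist(sections, reviewers))
```

On a later review pass you might rotate the reviewer order so each person internalizes a different part of the checklist over time.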

In part two of this article I’ll share some additional tips for making your requirements reviews as effective as possible.

Check out Peer Reviews: Two Eyes Aren’t Enough, Part 2.

Jama Software has partnered with Karl Wiegers to share licensed content from his books and articles on our web site via a series of blog posts, whitepapers and webinars.  Karl Wiegers is an independent consultant and not an employee of Jama.  He can be reached at http://www.processimpact.com.  Enjoy these free requirements management resources.

 

]]>