5 User Acceptance Testing Best Practices for Your Salesforce Project

From a Quality Assurance Analyst at Traction on Demand

I like to see things done well, so maybe it’s no coincidence that I’ve found a career in quality assurance. Being analytical and detail-oriented means that I pick up on things other people miss, which has been a major strength in my work as a Quality Assurance Analyst at Traction on Demand. When you’re able to make a difference by using your strengths to help your team, it feels great to contribute to a larger goal. But it’s not all warm fuzzies. There’s a lot to consider in quality assurance.

When adopting anything new—such as tools, platforms, automations, processes or systems—it is vital to have the key stakeholders evaluate them. Empower them to provide timely feedback about the overall user experience and whether their business needs are being met. This stage of the project lifecycle is commonly referred to as user acceptance testing (UAT). If a project’s UAT phase is not planned and executed well, it can cost thousands of dollars to fix critical defects just prior to or—eek!—after going live with a new feature, system or product.

With that in mind, here are five effective user acceptance testing best practices for your Salesforce project:

1. Assemble an Awesome User Acceptance Testing Team

UAT should be led by “super users” within an organization. These users will learn to use the system before organization-wide training begins and will become mentors to others later on, so it’s important to choose UAT participants who are not only highly motivated but also good at thinking critically and teaching others. If some users must be involved but are low in motivation or skill, think of ways to support them.

Generally, you want keen people on board so that valuable feedback reaches the development team and the overall acceptance of the system is not negatively biased.

2. Choose the Appropriate UAT Feedback Tools

Any feedback from UAT must be documented in an objective manner. Establish a set of guidelines for clearly describing issues so that when the information is passed on, there is no ambiguity for the developer analyzing it and working out a fix.
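
As an illustration, such guidelines can often be boiled down to a handful of required fields. The sketch below is one hypothetical way to capture them as a simple Python structure; the class and field names are examples I’ve chosen, not a standard:

    from dataclasses import dataclass, field

    @dataclass
    class UATIssueReport:
        """An objective, unambiguous UAT issue report."""
        summary: str                # one-line description of the problem
        steps_to_reproduce: list    # exact clicks and inputs, in order
        expected_result: str        # what the user story says should happen
        actual_result: str          # what actually happened
        environment: str            # e.g. sandbox name, browser, test date
        severity: str               # e.g. "blocker", "major", "minor"
        attachments: list = field(default_factory=list)  # screenshot links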

For medium to enterprise-sized organizations, I suggest using Jira. Jira’s integrated ticket management system allows organizations to automate and streamline many of the manual tasks that UAT teams perform, such as reporting and tracking issues. For smaller organizations on a budget, you might consider Smartsheet or a Google Sheets spreadsheet as a means to allow multiple users to work at the same time while sharing UAT-related information and feedback online.
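
For teams on Jira, issue reporting can even be scripted against Jira’s REST API. This is a minimal sketch, assuming Jira Cloud’s REST API v2, the Python requests library, and a hypothetical instance URL, project key and API token; adjust these for your own setup:

    import requests

    JIRA_URL = "https://your-domain.atlassian.net"     # hypothetical instance
    AUTH = ("uat.lead@example.com", "your-api-token")  # email + API token

    def report_uat_issue(summary: str, description: str) -> str:
        """File a UAT defect as a Jira issue and return its key."""
        payload = {
            "fields": {
                "project": {"key": "UAT"},    # hypothetical project key
                "summary": summary,
                "description": description,
                "issuetype": {"name": "Bug"},
            }
        }
        resp = requests.post(f"{JIRA_URL}/rest/api/2/issue",
                             json=payload, auth=AUTH)
        resp.raise_for_status()
        return resp.json()["key"]  # e.g. "UAT-42"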

3. Take Time to Train Your UAT Testers

UAT testers should be given clear expectations about the purpose of user acceptance testing. Setting these expectations may encourage participants to explore and take the time to learn what they are testing—and that is a good thing! Testing in this ad hoc, exploratory way makes them likely to find more bugs and issues.

Every UAT tester should be given a description of each piece of functionality that they have been assigned to evaluate and test, along with background information on how it should work. These details can be captured as “user stories”: a set of elements that capture business requirements in plain language understood by everyone (see the sketch after this list):

  • As a (user): Salesforce marketing user
  • I want to (take this action): launch a wizard that lets me select a date-filtered report
  • So that (I get this result): I can view a report graph, which shows me the total number of email campaign responses that were received in the month of May
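
To make the template concrete, here is a minimal sketch of how a user story could be captured as a small data structure and rendered into the plain-language sentence above; the UserStory class and its fields are illustrative, not part of any Salesforce tooling:

    from dataclasses import dataclass

    @dataclass
    class UserStory:
        user: str    # who is taking the action
        action: str  # what they want to do
        result: str  # the business outcome they expect

        def render(self) -> str:
            """Render the story in the standard plain-language form."""
            return (f"As a {self.user}, I want to {self.action}, "
                    f"so that {self.result}.")

    story = UserStory(
        user="Salesforce marketing user",
        action="launch a wizard that lets me select a date-filtered report",
        result=("I can view a report graph showing the total number of "
                "email campaign responses received in the month of May"),
    )
    print(story.render())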

When it comes to documenting any issues and reporting them, the UAT testers should be given a reference guide or “cheat sheet” to keep the process consistent.

4. Anticipate Every Possible Outcome—But Don’t Be Too Strict

What happens if a UAT tester deviates from evaluating the documented user stories mentioned above? Will the whole system crash and the world implode? This is highly unlikely. In fact, the more user stories UAT testers validate through a variety of “test case scenarios,” the greater the chance of issues being discovered and fixed before the system goes live.

That being said, there needs to be a balance between writing test cases with sufficient detail and allowing UAT testers the freedom and flexibility to test the system as if they were using it in their day-to-day routine. This can bring usability and functionality issues to the surface that may otherwise have gone unnoticed had the UAT tester followed a rigid test case focused only on expected outcomes.

5. Keep the Channels of Communication Open

It is important to have good communication channels between UAT testers and the development team. Set expectations upfront as part of UAT training and reiterate them before testing begins. For example, if not all the configurations have been implemented in the UAT environment, alert your UAT testers ahead of time so they are not surprised by “broken” functionality and do not incorrectly report it as an issue.

Have regular touchpoints (e.g. daily review sessions) with every UAT tester to keep track of their progress, and generally talk to your UAT testers! Think of ways to motivate them. It could be as simple as spending a few minutes with each of them to discuss how to make the UAT experience better, or perhaps you could add incentives, such as providing snacks or a free lunch for their participation.

UAT can, of course, become a lot more complicated than what I’ve outlined here, but keeping these five best practices in mind will help keep you on the path of quality assurance excellence!

If you’ve made it this far but are still unsure about how to implement such a process within your organization, don’t hesitate to get in touch with the experts on our team.

Written by Jonathan Uy, Quality Assurance Analyst at Traction on Demand.
