Introduction
The client is an organization providing cryptocurrency payment services and personal wallets. Our project is a Salesforce Sales Cloud implementation and customization that streamlines the client’s business processes (lead management, lead conversion, opportunity closing and new customer onboarding, account management, support, and campaigns) and replaces a bundle of third-party systems with Salesforce.
Challenge
- The project was already in an active phase of development when we joined.
- There had been no testing documentation and no QA on the project before.
- Unstable product quality due to frequently changing requirements.
- Many integrations with third-party services complicated the testing process.
Goals
- Cover the existing features with test cases while, at the same time, building test coverage for new features in development.
- Create smoke and regression test sets and establish regular testing processes to improve product quality.
- Come up with a solution for testing integrations with third-party services that are not accessible from the QA environment.
- Take an active part in the verification and validation of new features throughout development.
- Provide regular reporting for stakeholders.
- Make the development and testing process transparent and clear.
Solutions
At the client’s request, we chose TestRail as our main test management tool. Test cases are grouped by Salesforce Objects (modules) and by the features implemented for or related to those Objects (submodules). TestRail’s integration with Jira also gives developers easier access to test cases, makes test results visible on specific tickets, and helps us control test case coverage.
We also came up with a way to imitate the data that comes from different instances, so we could test whether a feature reacts correctly to data that originates outside the organization.
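As an illustration of that stubbing idea, here is a minimal sketch in Python (the project’s integrations actually live in Salesforce, and the service, method names, and statuses below are hypothetical): a stand-in client returns the payload a third-party service would normally send, so the feature’s reaction can be verified without access to the real integration from the QA environment.

```python
import unittest
from unittest.mock import Mock


class OnboardingChecker:
    """Hypothetical feature that depends on data from an external KYC provider."""

    def __init__(self, kyc_client):
        self.kyc_client = kyc_client

    def can_activate_wallet(self, account_id):
        # The real data would come from a third-party instance that the QA
        # environment cannot reach, so the client is injected and stubbed.
        return self.kyc_client.get_kyc_status(account_id) == "APPROVED"


class OnboardingCheckerTest(unittest.TestCase):
    def test_reacts_correctly_to_external_data(self):
        # Imitate the payload the external service would return.
        kyc_client = Mock()
        kyc_client.get_kyc_status.return_value = "APPROVED"
        checker = OnboardingChecker(kyc_client)
        self.assertTrue(checker.can_activate_wallet("ACC-001"))

        # The negative path, with data we could rarely trigger on demand
        # against the real service.
        kyc_client.get_kyc_status.return_value = "REJECTED"
        self.assertFalse(checker.can_activate_wallet("ACC-001"))


if __name__ == "__main__":
    unittest.main()
```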
For test cases, we decided to cover features in the chronological order in which they had been developed, sprint by sprint, while giving priority to coverage of new features. Once we had enough testing documentation, we created a pre-deployment smoke test set, followed soon after by a regression test set. Later we discovered that, despite clean pre-deployment results, some issues still appeared in production because the integration solutions in the QA environment differ from those in production. To address this, we created a production smoke test set that covers the integrations and the features of greatest value to business processes.
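As a side note, scoped runs like the pre-deployment smoke set can also be created through TestRail’s REST API. The sketch below shows the idea; the URL, project and suite IDs, case IDs, and credentials are placeholders rather than the project’s real values.

```python
import requests

# Placeholders only: the real TestRail URL, IDs, and credentials differ.
TESTRAIL_URL = "https://example.testrail.io"
PROJECT_ID = 1
SUITE_ID = 1
SMOKE_CASE_IDS = [101, 102, 103]


def create_smoke_run(session, name):
    # add_run with include_all=False restricts the run to the listed cases,
    # which turns a fixed smoke set into a ready-to-execute test run.
    response = session.post(
        f"{TESTRAIL_URL}/index.php?/api/v2/add_run/{PROJECT_ID}",
        json={
            "suite_id": SUITE_ID,
            "name": name,
            "include_all": False,
            "case_ids": SMOKE_CASE_IDS,
        },
    )
    response.raise_for_status()
    return response.json()["id"]


if __name__ == "__main__":
    session = requests.Session()
    session.auth = ("qa@example.com", "testrail-api-key")  # email + API key
    run_id = create_smoke_run(session, "Pre-deployment smoke")
    print(f"Created TestRail run {run_id}")
```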
The QA team also takes an active part in all processes throughout development:
- Clarifying requirements;
- Participating in grooming sessions;
- Writing acceptance criteria for tickets with high priority;
- Writing and updating test cases and test sets;
- Preparing release notes for the new features included in each deployment;
- Preparing regular test reports after completing testing in the QA and production environments.
Conclusions
As of now, all features are covered by test cases, and the testing documentation is updated every sprint. The pre-deployment smoke test set is updated monthly (every two sprints). We run regression testing for one module per sprint (small modules and their submodules are tested together). The overall level of QA on the project has increased significantly, which is confirmed by the client and users. There are plans to implement autotests for frequently repeated checks, which will speed up the testing process.
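To give a flavor of those planned autotests, below is a rough sketch of one frequently repeated check written with pytest and the simple-salesforce library; the credentials, the smoke marker, and the exact query are assumptions for illustration, not part of the project’s current suite.

```python
import pytest
from simple_salesforce import Salesforce


@pytest.fixture(scope="session")
def sf():
    # Placeholder credentials; domain="test" points at a sandbox org.
    return Salesforce(
        username="qa@example.com",
        password="password",
        security_token="security-token",
        domain="test",
    )


@pytest.mark.smoke
def test_converted_leads_are_linked_to_accounts(sf):
    # A typical repeated check after lead conversion: every converted lead
    # should reference the Account created during conversion.
    result = sf.query(
        "SELECT Id, ConvertedAccountId FROM Lead WHERE IsConverted = TRUE LIMIT 50"
    )
    for lead in result["records"]:
        assert lead["ConvertedAccountId"], f"Lead {lead['Id']} has no linked Account"
```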