I participated in a very interesting conversation that dealt with the question of how technical teams should properly interpret and manage compliance demands, including but not limited to market licenses, legal requirements, regulations, and external and internal technical standards. The discussion surfaced a number of challenges, which proved to require:
- improving cooperation and collaboration across multiple teams, including Legal;
- a steering committee with the correct stakeholders to decide how the company will interpret and manage the imposed external regulations and technical standards;
- a method of managing and making visible a “working body of compliance demands” that embodies the company’s official mandate on both the external and internal guidelines the company, as a whole, should follow;
- a method for the ongoing management of the company’s interpreted compliance demands; and
- elevating software testing to assist in this effort, and building a supporting collaboration between change management and internal auditing/quality assurance.
Considering the above challenges, one may ask: how can test automation support the compliance demands process?
The Big Picture
As mentioned in the overview, this is not an effort that can be managed solely by quality assurance, software testing, or test automation. So let’s look at the big picture and identify where automation fits into the overall process.
In short, the user stories for this scenario are quite simple.
1.0 As an organization, I would like the company to have visibility into the current market licenses, regulations, technical standards, and policies the company should follow.
1.1 As a member of the technical team (PO, developer, tester, architect, etc.), I would like an automatic alert and report for any new and/or changed regulations.
1.2 As a tester, I would like tests (functional, unit, API, etc.) that help verify the technical aspects of the compliance demands are being adhered to.
As the test automation developer, you would add the following support to your test framework:
- A `compliance_checker` class
- A method in the class to check for new compliances. If the compliances are new, alert and report with the new compliance information.
- A method in the class to check for modified compliances. If the compliances have been modified, alert and report with the modified compliance information.
- A method in the class to check for deleted compliances. If the compliances have been deleted, alert and report with the deleted compliance information.
- Processing of the compliance test set that contains the tests verifying the finalized solution, functionality, internal code, etc. adhere to the technical aspects of the compliance demands. If any test fails, update it with a failed status, a failure message, and, if available, a screenshot. (My preference is that any failing test case that verifies at the GUI level provides a screenshot of each step in the process.) As part of the normal automated support, alert and report the failed tests.
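A minimal sketch of such a checker in Python, assuming each compliance entry in the repository carries an `id` and a `version` field (both names are illustrative, not mandated by any particular tool):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Compliance:
    """One compliance demand as stored in the company repository."""
    id: str
    title: str
    version: int

class ComplianceChecker:
    """Diffs the previous snapshot of compliance demands against the current one."""

    def __init__(self, previous, current):
        self.previous = {c.id: c for c in previous}
        self.current = {c.id: c for c in current}

    def new_compliances(self):
        """Entries present now but absent from the previous snapshot."""
        return [c for cid, c in self.current.items() if cid not in self.previous]

    def modified_compliances(self):
        """Entries whose version number changed between snapshots."""
        return [c for cid, c in self.current.items()
                if cid in self.previous and c.version != self.previous[cid].version]

    def deleted_compliances(self):
        """Entries that were removed since the previous snapshot."""
        return [c for cid, c in self.previous.items() if cid not in self.current]
```

Each of the three methods maps to one of the alert-and-report bullets above; the actual alerting and reporting would hang off their return values.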
In order to successfully set up test automation support for verifying the compliance demands, we require assistance from internal auditors/quality assurance to create a repository containing the information the company is required to follow. Yes, this is bigger than software testing! Consider the following options for capturing and managing this information:
- Spreadsheets: Creating a database using spreadsheets is always the simplest way to create and manage this type of data. However, it lacks features such as email notifications and activity logs that people rely on to keep up with the changing information.
- ALM (Application Lifecycle Management): A system such as this allows you to create the compliance demands while leveraging features such as filtering, reporting, traceability, management, and email alerts to manage the data. It is also easier, in my humble opinion, for automation to hook into the system to gain access to the information, provide feedback if required, and maintain traceability to the tests that perform the ongoing verification.
- Jira: People have basically been forcing Jira to act like an ALM-type system, and it is an awesome solution. The traceability features are not natively there, but one can perhaps build a level of traceability using the reporting features of automation. One can also easily leverage features such as Confluence and dashboards to help manage the alerts, and Jira has an API that automation can use to retrieve and report back information.
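To illustrate the Jira option, here is a sketch of a polling helper built on Jira's standard REST search endpoint. The server URL, the project key `COMP`, and the `compliance` label are assumptions for this example and would need to match your own instance:

```python
import json
import urllib.parse
import urllib.request

JIRA_URL = "https://jira.example.com"  # hypothetical server, substitute your own

def changed_compliance_jql(days: int = 1) -> str:
    """JQL selecting compliance issues updated within the last `days` days."""
    return f"project = COMP AND labels = compliance AND updated >= -{days}d"

def fetch_changed_compliances(days: int = 1) -> list:
    """Query Jira's /rest/api/2/search endpoint and return the matching issues."""
    query = urllib.parse.urlencode(
        {"jql": changed_compliance_jql(days), "fields": "summary,updated"}
    )
    # A real call would also need authentication headers for your Jira instance.
    with urllib.request.urlopen(f"{JIRA_URL}/rest/api/2/search?{query}") as resp:
        return json.load(resp)["issues"]
```

The returned issues can then feed the same alert-and-report path as the rest of the framework.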
In the language of agile, the PGO (product governance officer) is responsible for managing the compliance user stories. In this case, the PGO is actually the policy and procedure makers within the company; your company might call this person an internal auditor or quality assurance analyst. We must respect how they wish to control this information, as long as it is managed in some manner that allows the automated tests to do three things:
- Interrogate for added entries
- Interrogate for modified entries
- Interrogate for deleted entries
- NOTE: A last-minute thought here. Depending on the system used to manage the compliance demands, one could perhaps use a pipeline process to alert and/or kick off automated tests if changes are made to the compliance demands.
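Building on that note, a minimal sketch of such a pipeline gate: a scheduled CI job runs it, and a nonzero exit code (the convention most CI systems use to flag a step) signals that the compliance suite and alerts should be kicked off. The snapshot-of-IDs format is an assumption for illustration:

```python
def compliance_gate(previous_ids: set, current_ids: set) -> int:
    """Return a process exit code: 1 if the compliance repository changed, else 0.

    A scheduled CI pipeline step can run this and use the nonzero exit status
    to trigger the compliance test suite and alerts.
    """
    added = sorted(current_ids - previous_ids)
    removed = sorted(previous_ids - current_ids)
    if added or removed:
        print(f"compliance changes detected: added={added} removed={removed}")
        return 1
    return 0
```

The script would typically end with `sys.exit(compliance_gate(old_snapshot, new_snapshot))` so the pipeline can react to the result.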
If the reports yield changes, then the POs (product owners) can properly create and prioritize user stories so the development teams (including testers) can properly develop and test according to these changes.
The solution you are developing (web application, web API, services, etc.) will dictate the actual types of tests you should develop to verify the compliance demands are met. My recommendation is that when automating these tests, the actual tests should be separated from your test framework. We can cover the reasons why in another article, but for now, the tests should verify the compliances are met.
If you are using a solution that supports traceability, I would leverage it. Linking tests to the actual compliances creates visibility, demonstrates that the company as a whole is indeed adhering to both internal and external compliance demands, and shows the health of the developed solutions released to production.
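If your tooling has no built-in traceability, a lightweight substitute is to record the link in the tests themselves. This framework-agnostic Python sketch (the decorator name and compliance IDs are made up for illustration) keeps a registry that reports can later dump:

```python
# Maps test name -> list of compliance IDs that the test verifies.
TRACEABILITY: dict = {}

def verifies(*compliance_ids):
    """Decorator that records which compliance demands a test covers."""
    def wrap(test_fn):
        TRACEABILITY[test_fn.__name__] = list(compliance_ids)
        return test_fn
    return wrap

@verifies("COMP-12", "COMP-30")
def test_data_retention_limit():
    retention_days = 30  # would come from the system under test
    assert retention_days <= 30  # e.g. COMP-12: retain personal data <= 30 days
```

Dumping `TRACEABILITY` alongside test results gives a simple report of which compliances are covered, and by which tests.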
Alerts can be used to bring to the technical staff's attention that the compliance demands have changed in some way and the team needs to do one of the following two things:
- Resolve a bug that has broken the system's adherence to the technical compliance demands
- Update code and tests to meet the changed technical compliance demands
A number of reports can be developed to support this effort, targeted at different stakeholders. Consider the following sample reports and their audiences:
- Added and changed compliances
- Failed compliance tests (for POs, developers, and testers)
- System health of the compliance demands
Again, this is an idea for thinking outside the box about how test automation can support managing an internal process as well as system verification. Have fun with it!