Creating workflows for each automated test in the Dictionary Manager module

@kdaud I would hesitate to call this “each test”, as a workflow per test will quickly become unwieldy. The Dictionary Manager currently has 9 feature files containing 30 specs, and either 9 or 30 workflows feels like the wrong number. I would instead suggest that we break the tests down (where it makes sense to do so) by the test personas we already have defined.

This can easily be done with tags and npm scripts. See the docs for the cypress-cucumber-preprocessor and note that, as an npm script, it would be something like:

"test:integration-basic-user": "cypress-tags run -e TAGS='@basic-user'"

That said, there are some real challenges to running the current e2e tests against an actual instance, something we work around in CI by spinning up a “clean” instance of the OCL backend for each run. Getting the tests working against a “less clean” environment will probably require the most work, but it would be valuable to be able to run these tests against, e.g., the QA version of the Dictionary Manager.
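
Mechanically, pointing the same tagged run at another deployment is just a matter of overriding `baseUrl`; the real work is making the specs tolerate whatever data already exists there. A sketch, with a placeholder URL standing in for the actual QA address:

```sh
# CYPRESS_BASE_URL overrides the baseUrl from the Cypress config for this run only.
# https://qa.example.org is a placeholder, not the real QA instance.
CYPRESS_BASE_URL='https://qa.example.org' \
  cypress-tags run -e TAGS='@basic-user'
```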

To that end, I would suggest that it’s best to run these workflow-oriented tests in Bamboo rather than GitHub Actions. That way, we can ensure that they run only after the QA instance has been updated to the latest version of the code.
