We introduced new processes and new tools to align with other teams outside CS as well.
*If the release is not explained in DEMOD, Tech and CS need to arrange a post-release call
**The Beacon and Demo team timeline is explained later in this document
When a release is confirmed and announced to the rest of the team, the Post-Release Testing team will assign a group (or groups) to test that particular release. The assigned group will then schedule a post-release call with the Product Owner (if the feature was not explained or released prior to DEMOD).
The group will then do the testing, and feedback will be delivered according to the release grading.
If there is no blocker or issue that makes the release unusable, the group will file the feedback on Google Currents (Product Release Post) and notify the Beacon and DEMOD teams that the release is good to go for both the Demo Platform and the Helpscout documentation.
Below is the link to access the CS Post-Release Internal Testing Sheet:
https://docs.google.com/spreadsheets/d/1oHDTemzswhE0j6SVMOJs1-u0w2moZPeJZ3j_ldVf91Y/edit?usp=sharing
We are trying to accommodate the Tech team's tight timeline. Since releases mostly go straight into Production and we cannot test them in the Development environment, we need a packed but reasonable timeline for delivering the Post-Release Testing feedback.
Post-Release Testing should take at most 3 days to complete; other processes, such as adding the release to the beacon and the demo platform, follow their own timelines.
We divide releases into 3 grades, and these grades determine how long we have to deliver the post-testing result:
High
This grade also applies to urgent releases. High-grade releases are prioritized, and all testing output (feedback and the first beacon draft) will be delivered within a maximum of 24 hours. At least 2 working groups will be assigned, and possibly more depending on the nature of the release (platform-wide, affecting many products, or changing the current flow).
Medium
A medium-grade release has an extended window of up to 48 hours after the release; the beacon and demo platform implementation will be available within 1 week after the release. We will allocate at least 1 group, and a maximum of 2 groups on request.
Low
A low-grade release has the lowest priority and will be delivered within a maximum of 72 hours after the release. We will allocate 1 group to do the QCA for this grade.
The timelines above include the Tech team announcement and the post-release call. Any special-case release, or special/urgent request from the Tech team, will also be prioritized.
Urgency
Regardless of grade, all testing must be completed within 3 working days!
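The grade-to-timeline rules above can be summarized as a small lookup. The sketch below is illustrative only (the `GRADE_SLA` structure and `feedback_deadline` function are hypothetical names, not part of any existing tool):

```python
from datetime import datetime, timedelta

# Hypothetical summary of the grading table above: feedback deadline
# (hours after the release announcement) and minimum CS groups assigned.
GRADE_SLA = {
    "high":   {"feedback_hours": 24, "min_groups": 2},
    "medium": {"feedback_hours": 48, "min_groups": 1},
    "low":    {"feedback_hours": 72, "min_groups": 1},
}

def feedback_deadline(grade: str, announced_at: datetime) -> datetime:
    """Return the latest time feedback should be posted for a release."""
    hours = GRADE_SLA[grade.lower()]["feedback_hours"]
    return announced_at + timedelta(hours=hours)

# Example: a medium-grade release announced at 09:00 is due 48 hours later.
release_time = datetime(2021, 11, 15, 9, 0)
print(feedback_deadline("medium", release_time))  # 2021-11-17 09:00:00
```

Note that this simple sketch counts calendar hours; the 3-working-day hard limit above still applies on top of it.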
We test in the production environment, so please limit testing to these events only:
CS Test Event (DO NOT TOUCH) (823)
Jublia Test Event (1555)
Self-Serve Test Event (1885)
You can also test on your own test event, if you have one.
In this new post-release flow, we want to increase the Tech team's involvement by having them complete a checklist when releasing, which helps us better record and maintain the post-release documents:
Announce the product release on Google Currents and in the group chat
Include the GitLab main ID for easier identification
Check whether the release relates to any past product request; if so, add the Product Request Currents link to the Product Release Notes
Add the release grading (Low, Medium, or High) and any specific notes (e.g. limitations, special conditions or configurations, triggers, and other related notes)
Schedule a post-release call with the assigned CS group (if the release is not, or not yet, covered in DEMOD)
CS will post the testing results and findings on the CS Post-Release Testing Sheet as well as on Google Currents (in the same post as the product release)
Tech will check the CS comments and tick the Tech Acknowledge checkboxes on the CS Post-Release Testing Sheet
Tech comments can also be left on Google Currents (CS will document them back in the CS Post-Release Testing Sheet)
As discussed, each release comes with a grade from the Tech team, based on the technical complexity during development and on the related products affected by the new release.
These grades are:
High
Major releases with wide impact.
Medium
Less major, but still requiring manual setup, with some impact on other products.
Low
Minor releases, UI improvements, or other minor changes.
We cannot easily judge which category a release falls into, so the grading comes from Tech during development: they will tag one of these grades on the new release when announcing it to the team.
Previously, we divided the CS team into 2 teams as below:
Internal Release Testing Team
All releases that affect the workflow of CS and the rest of the internal teams.
External Release Testing Team
All releases that will affect the end-users or client-facing releases.
We intend to renew this flow because it cannot keep up with the Tech team's releases: some releases do not fall cleanly into the internal or external category. We will no longer divide the team, and will instead rely on the CS Roster to assign CS members to releases.
Group A | Group B  | Group C | Group D | Group E
Julian  | Adnan    | Agustin | Matthew | Harry
Jia     | Harry    | Zabdi   | Jia     | Adnan
Sherlly | Princess | Julian  |         |
We divide the team into groups so it is easier to assign personnel for post-release testing.
We assign groups based on the complexity of the release: ideally 1 group per release, but if the grade is high (or there is a specific note from Tech that the release is huge) we will assign more than 1 group (up to 6 people).
Julian and Agustin will be responsible for the whole Post-Release Testing flow, from assigning the testing group to following up on updates on Google Currents.
Scenarios:
Tech releases a feature on 11/11/2021 with a low grade: we assign only 1 group, Group A (as the first group)
Tech releases a feature on 15/11/2021 with a medium grade: we assign only 1 group, Group B (as the 2nd group)
Tech releases 2 features on the same day, 18/11/2021, graded medium and low: we assign the earlier release to Group C and the later release to Group D
Tech releases a major product on 25/11/2021 with a high grade: we assign Group E + Group A (a total of 2 groups to cover the high grade)
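The roster rotation in the scenarios above can be sketched as a simple round-robin over the five groups, with a second consecutive group added for high-grade releases. This is only an illustrative sketch (the `make_assigner` helper is a hypothetical name, not an existing tool):

```python
from itertools import cycle

# Groups as listed in the roster above.
GROUPS = ["Group A", "Group B", "Group C", "Group D", "Group E"]

def make_assigner():
    """Return a function that assigns the next group(s) in rotation.

    Low/medium releases get 1 group; high-grade releases get 2
    consecutive groups, matching the scenarios above.
    """
    rotation = cycle(GROUPS)
    def assign(grade: str) -> list:
        n = 2 if grade.lower() == "high" else 1
        return [next(rotation) for _ in range(n)]
    return assign

assign = make_assigner()
print(assign("low"))     # ['Group A']
print(assign("medium"))  # ['Group B']
print(assign("medium"))  # ['Group C']
print(assign("low"))     # ['Group D']
print(assign("high"))    # ['Group E', 'Group A']
```

In practice, the high-grade case may take more than 2 groups (up to 6 people) when Tech flags a release as especially large; that judgment stays with the flow owners.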
We have prepared a place for CS to store testing assets (screenshots and video recordings) to better explain issues found while testing the new release; please use the link below for the Testing Asset Folder.
Steps to file the assets:
Open the Testing Asset Folder
Releases are categorized by the month of release; click on the month matching the release
Create a new folder with the GitLab issue as the folder name
Put any asset like a screenshot or screen recording inside the folder
Put the folder link into the CS Post-Release Testing Sheet
Format for Google Currents Post
Product release title:
GitLab ID:
Group:
What's changing:
CS feedback (maybe with some screenshots):
Product request title:
GitLab ID:
Group:
Feedback / Suggestion:
Why we need to improve this:
Product release title:
What's changing:
CS feedback (maybe with some screenshots):
Link to the product release:
To better implement releases into the Beacon and the Demo Platform, we created a flow with a quick timeline to accommodate the Beacon team and our Demo team.
The Beacon team, under Customer Education, has already updated its own beacon-creation process; please refer to this document for more information:
https://docs.google.com/document/d/1MX9TDJsP3Ty_SsTIrpCjaUDrAtglo33EJWPMkdOLAiY/edit
After the release is tested and no more fixes are required, we can implement it on the Demo Platform.
CS in charge of the implementation of the Demo Platform:
Zabdi
With a maximum of 3 days after testing to apply the release on the Demo Platform, we divide the implementation timeline based on how busy the CS in charge of this process is:
Peak period (busy with many projects at the same time): implemented within a maximum of 72 hours after testing is done.
Non-peak period (less busy): implemented within a maximum of 24 hours after testing is done.
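The two Demo Platform implementation windows above can be expressed as a small deadline helper, in the same illustrative style as before (the `demo_platform_deadline` function and `peak` flag are hypothetical names):

```python
from datetime import datetime, timedelta

def demo_platform_deadline(testing_done_at: datetime, peak: bool) -> datetime:
    """Deadline for applying a tested release to the Demo Platform.

    Peak period: up to 72 hours after testing is done.
    Non-peak period: up to 24 hours after testing is done.
    """
    hours = 72 if peak else 24
    return testing_done_at + timedelta(hours=hours)

done = datetime(2021, 11, 18, 17, 0)
print(demo_platform_deadline(done, peak=True))   # 2021-11-21 17:00:00
print(demo_platform_deadline(done, peak=False))  # 2021-11-19 17:00:00
```

Both windows stay inside the overall maximum of 3 days after testing stated above.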