By Larry Barker
Our journey to automated application deployment continues! In our previous article, we set out to craft an automated release pipeline using Build Manager and AWS CloudFormation - one capable of spinning up a new Domino hosting environment with the press of a button. Now it's time to expand upon this process and include application-specific steps. We can do so by selecting an app design from the Template Registry and prepping it with typical Build Steps before deploying it into our freshly created AWS infrastructure for testing (as seen in Fig 1B). This way, developers will never have to worry about redundant and tedious setup tasks after each fresh deployment!
With any managed release, it's essential to consider all of the options for ensuring an optimal configuration. There are plenty of paths to take and various approaches that should be carefully weighed – ultimately leading us toward a successful outcome!
Versions, Approval Mechanisms and Notifications
Initially, we need to determine how we are going to manage the version integrity of our application release now that it has left development and is ready to be provisioned into other environments, in this case ‘TEST’.
There are a few ways in which we can approach this with Build Manager:
Set up virtual representations of our target environments within Template Registry to silo the application’s version availability;
Configure workflow approvals for stakeholders or gatekeepers for each environment;
Specify teams or roles to receive application movement notifications and related communications;
Assign roles/individuals responsible for application promotions;
Lock down the target servers appropriate for use with each build path/environment;
Enable auto-version selection for source templates during promotions, i.e. the latest version of the application approved for use within that environment will be selected;
Or various other combinations that incorporate some or all of the above.
Build Process Approach
The critical question is: What Build Manager release steps are required to achieve the desired outcome for any of our possible approaches? Our end goal is to transition our latest template version into the on-demand, AWS-provisioned ‘TEST’ environment for which we configured a pipeline earlier.
One method would be to first perform any of the typical build steps that we would use to prepare the template for a new environment (see Fig. 2: green). These may be actions such as setting template or element properties, agent attributes, signing, design audits, or any other step available within Build Manager.
We could then use deployment-oriented steps/commands (see Fig. 2: red) to transfer the new template to our on-demand EC2 instance and reference it from within the Domino One-touch Setup JSON file to be created at the time of launch. Finally, we could use the Build Manager ‘Create Test Data’ step to copy in our standard test data set.
It is important to note that typically Build Manager might use a ‘Refresh DB Design’ step (see Fig. 3) to deploy and refresh inheriting NSFs on the target servers. Because we are setting up on-demand purpose-built servers and environments as part of the pipeline, we will use Domino One-touch setup for this step.
Configuring Build Steps
Now that we have determined that we will be using Domino One-Touch setup rather than a Build Manager ‘Refresh DB Design’ step to create our application at the time of deployment, our new pipeline needs a few extra Build Steps to achieve this.
Specifically, we have three ‘Run Command’ steps and a ‘Copy DB’ step that, used in combination and in the correct sequence, will result in our application and template being in place on the new ‘TEST’ server. After that, we can use a ‘Create Test Data’ step to drop in our desired standardized test data set or any other documents.
Build Step: Copy DB
The first deployment action will be to drop a copy of the desired template version locally within a ‘Deployment’ folder after any build-related steps have been completed.
In this case we may stipulate our ‘Deployment’ folder and use a Build Manager variable to dynamically apply a standardized naming convention to our application, including applying an NTF (template) extension. This will enable the deployment steps that follow to have Domino One-touch setup create an NSF based on the newly prepared template.
Build Step: Run Command – Provision AWS Stack
Based on our earlier post, this step does not require reconfiguring or any changes to the CloudFormation YAML file. We include it here because the timing of this command step is critical: it sets up the AWS infrastructure required for the remaining steps to complete successfully.
Essentially, we are deploying the AWS EC2 instance, Elastic IP, and Route 53 DNS entries that our Domino server will use.
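For reference, the command behind this step can be as simple as a single AWS CLI call. The sketch below is illustrative only: it assumes the AWS CLI is available where the ‘Run Command’ step executes, and the stack name, template file, and parameter are placeholders rather than values from our actual pipeline.

# Provision the environment defined in our CloudFormation template
# (stack name, template file, and parameters are placeholders).
aws cloudformation create-stack \
  --stack-name domino-test01 \
  --template-body file://domino-test-environment.yaml \
  --parameters ParameterKey=InstanceType,ParameterValue=t3.large

# Optionally wait until the EC2 instance, Elastic IP, and Route 53 records exist
# before the pipeline moves on to the transfer and One-touch steps.
aws cloudformation wait stack-create-complete --stack-name domino-test01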
Build Step: Run Command – Transfer Template to TEST01
Once we have collected all of the templates that make up our application in a central collection point, we can use another ‘Run Command’ step to copy them up to our EC2 instance, where they will be used in later steps by One-Touch setup to create our respective NSFs.
Another important aspect to consider within this step is what approach to use in order to provision the JSON file used by the One-touch setup to configure the Domino server and initial application settings. As an example, you may choose to deploy this file along with the newly released templates if it is application specific. However, if your JSON file is more generic and you find it may be used across several application pipelines, then perhaps you should choose to bake the file into a standard AWS EC2 AMI that can be reused as part of the initial EC2 CloudFormation stack provisioning.
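As a rough illustration, and assuming the new EC2 instance is reachable over SSH, this ‘Run Command’ step might boil down to something like the following. The key file, host name, user, and data directory are placeholders, and your transfer mechanism (scp, S3, a network share, etc.) may well differ.

# Copy the prepared template(s) from the local Deployment folder to the new server
# (key file, host, user, and target path are placeholders for illustration).
scp -i buildmgr-deploy.pem Deployment/*.ntf notes@test01.example.com:/local/notesdata/

# If the One-touch JSON file is application-specific, it can travel with the templates;
# otherwise it may already be baked into the AMI used by the CloudFormation stack.
scp -i buildmgr-deploy.pem Deployment/test-setup.json notes@test01.example.com:/local/notesdata/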
Build Step: Run Command – Domino One-Touch Setup
Whether you choose to deploy your One-touch JSON configuration file at the time of release with your application or as part of the AWS AMI, it will be this configuration file that we will use to designate our environment variables and how we want Domino to create our new applications.
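As a simple sketch of how such a run command could trigger setup on the new instance (assuming a Linux install, with the paths and file names below as placeholders; check the One-touch setup documentation for your Domino release for the exact invocation options):

# Run on the new TEST01 instance, e.g. via ssh from the Build Manager run command.
# Point the first server start at the uploaded One-touch JSON file.
export SetupAutoConfigure=1
export SetupAutoConfigureParams=/local/notesdata/test-setup.json

# The first server start reads the JSON and performs setup, including the
# appConfiguration section that creates our NSFs from the deployed templates.
/opt/hcl/domino/bin/server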
If we look at our earlier JSON configuration file, we concentrated on the input variables used to configure the Domino server preferences. This time around, we want to include ‘appConfiguration’ variables/objects to have One-touch setup create new NSFs based on the deployed templates that we copied up to the EC2 instance in the previous build step.
As an example, if we wanted to:
Create a ‘test.nsf’ based on our deployed ‘test.ntf’;
Assign a ‘test user’ initial ACL entry;
Create an initial configuration document based on our DBConfig form, along with some settings entries;
we could build upon our earlier JSON script with something like the following:
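The sketch below shows only the ‘appConfiguration’ portion that would sit alongside the existing ‘serverSetup’ section from the earlier article. The organization name, ACL level, and configuration items are placeholders, and the exact field names should be verified against the HCL One-touch setup documentation for your Domino release.

{
  "appConfiguration": {
    "databases": [
      {
        "action": "create",
        "filePath": "test.nsf",
        "title": "Test Application",
        "templatePath": "test.ntf",
        "signUsingAdminp": true,
        "ACL": {
          "ACLEntries": [
            {
              "name": "test user/YourOrg",
              "level": "manager",
              "type": "person"
            }
          ]
        },
        "documents": [
          {
            "action": "create",
            "computeWithForm": true,
            "items": {
              "Form": "DBConfig",
              "AppEnvironment": "TEST",
              "EnableDebugLogging": "1"
            }
          }
        ]
      }
    ]
  }
}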
The key takeaway is that you can use One-touch Domino setup files containing input variables and objects to provision newly deployed templates from Build Manager as part of a standardized release. Additionally, there is nothing to stop you from exploring more advanced options like using tags and variables along with command-line search and replace actions to dynamically place the input variables within the JSON configuration file upon release. But that is a topic for another day!
Build Step: Create Test Data
Once all of the steps to initialize a new NSF have been completed, you may wish to copy in a standardized set of test data. The 'Create Test Data' build step makes this process easy and straightforward. Not only does it allow you to populate your new NSF with quality assurance test data, but it also saves a great deal of time and effort compared to adding each element manually, quickly bootstrapping your newly initialized NSF with the data essential for testing application features.
When it comes to selecting the source data, there are several options available. Perhaps the easiest and most commonly used approach is to pull source data from other Domino servers or network shares. Of course, this can be refined by targeting views or even selecting specific documents via a formula. An overview of the basics is outlined in the example below.
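For instance, a selection formula along these lines could restrict the copied documents to a particular form and status (the form and field names are purely illustrative):

SELECT Form = "Customer" & Status = "Approved"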
Aside from sending release notifications to any stakeholders or release managers, this will typically be one of our final steps. Notifications remain a crucial part of any major release: they ensure that all relevant parties are informed of progress and allow immediate feedback should any issues arise during the release process.
With Build Manager, you have the opportunity to customize your Domino application releases to meet specific goals, like an on-demand, statically configured AWS stack and Domino ‘TEST’ server. Whether it's a unique environment setup, testing and validation requirements, or integration with different components of a larger system, there are more than twenty powerful build steps available to support any number of configurations. For even greater customization and flexibility, Build Manager also provides a few advanced build steps that go beyond the out-of-the-box functionality, such as launching custom agents, running command-line instructions, or issuing console commands. It's important to remember that Build Manager allows you to create specialized build pipelines by mixing and matching from all of these options.
If you have any questions, or want to discuss any aspect of using Build Manager to deploy HCL Domino applications, please email us at techsupport@teamstudio.com