Did you start building your integration layer with TIBCO BusinessWorks 6? Do you use BW6 for exposing your new APIs? Do you work agile? Are you looking for Continuous Integration / Continuous Delivery for your new BW6 components?
Well, if you answered any of these questions with “yes” then continue reading.
TIBCO, like other integration vendors, is keen to facilitate Continuous Integration and Continuous Delivery (CI/CD), and has provided a tool that helps you standardize your BW6 build. In this article I will explain the steps required to create your CI/CD pipeline and walk you through automated build, deployment and testing, so you can proceed to automate your release process.
First, you will need to download the TIBCO Maven plugin and install it in your development environment.
I did my walkthrough using a specific version, “TIB_BW_Maven_Plugin_1.1.0”, but of course you can explore the latest version.
As soon as you unzip the downloaded plugin, you will find install.bat. Run it and provide your TIBCO BW6 home folder, and the installation process will start. Afterwards you will find a new option in the package context menu in the studio to generate a Maven POM for your application. Clicking this option starts a wizard that takes you through the generation process, asking you to provide some information about the server, domain, AppSpace and AppNode. The outcome of this process is a new Maven folder in your project, containing the parent POM file that refers to your original projects. The wizard also creates a new POM file for both the application module and the package.
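To give an idea of what the wizard produces, the generated parent POM looks roughly like the sketch below. The group ID, artifact IDs and module paths are placeholders for illustration; the actual coordinates, plugin declarations and deployment properties are filled in by the wizard based on your answers.

```xml
<!-- maven/pom.xml: sketch of the parent POM generated by the BW6 Maven wizard -->
<!-- (names and paths are illustrative, not the wizard's exact output) -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.bw6</groupId>
  <artifactId>myapp.parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <modules>
    <!-- the BW6 application module (your process and resource code) -->
    <module>../myapp.module</module>
    <!-- the BW6 application (packages the deployable EAR) -->
    <module>../myapp.application</module>
  </modules>
</project>
```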
Commit your newly generated folder and the new POM files.
Once this is done you don’t need TIBCO BW6 anymore, but you will need a reference to the jar file(s) in an artifact repository. In my case I used the JFrog Artifactory community edition to store both the TIBCO plugin jar files and, later, the artifacts produced by the build step.
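For Maven to resolve the TIBCO plugin jars from Artifactory (and later publish build artifacts to it), your settings.xml needs to point at the repository. A minimal sketch, assuming a default local Artifactory installation; the URL, repository name and credentials are assumptions you must replace with your own configuration:

```xml
<!-- ~/.m2/settings.xml (fragment) - URL and repository id are assumptions
     for a default local Artifactory; use the values from your own setup -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <servers>
    <server>
      <id>artifactory</id>
      <username>your-user</username>
      <password>your-password</password>
    </server>
  </servers>
  <profiles>
    <profile>
      <id>artifactory</id>
      <repositories>
        <repository>
          <id>artifactory</id>
          <url>http://localhost:8081/artifactory/libs-release</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>artifactory</activeProfile>
  </activeProfiles>
</settings>
```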
To set up the CI/CD pipeline, I used Jenkins as the build and deployment tool together with Maven and the OSS JFrog Artifactory. This is the process, step by step:
- Install and configure Maven in your build server:
- Downloading and installing Maven is easy: download the desired version from the Apache Maven site. In my case I used “apache-maven-3.3.9”.
- Make sure you have the following system variables defined:
- JAVA_HOME pointing to your java home.
- M2_HOME pointing to where you unzipped maven
- Check your installation by running the command “mvn -version”. This should print the Maven version along with the Java version it is using.
- Install an Artifactory server: here you can choose whichever artifact repository you feel comfortable with. In my case I tried both the local file system as artifact storage and the JFrog community edition; both worked fine. Note down the configuration, because you will need it to update your Maven settings.xml file.
- Install Jenkins: if Jenkins is not installed on your build server, you need to do so now. Make sure you have the Maven plugin installed, install the plugin for the Artifactory server you are going to use, and don’t forget the plugin for your source-control system.
- Create a Maven job for deployment: this is the most important step to get the job done.
- Go to Jenkins home.
- Create a new item.
- Give it a name.
- Choose the job type “Maven Project”.
- Click OK, which takes you to the configuration screen.
- Choose your source management. In my case it was SVN, but you could use any source management system supported by Jenkins or one of the plugins. Specify the source repository where Jenkins will create the workspace and maintain your sources.
- Build trigger: here I used the post-commit trigger from SVN, set up with the configurable “Poll SCM” option, so the job starts automatically with each commit.
- Build step: in this step I provided the relative path to the parent POM file that was created and committed along with the code.
- Goal: in order to get Maven to work correctly you need to specify a goal. In my case the goal was “deploy”, which means the application is deployed directly to the specified environment as soon as the build succeeds. You could also use other goals here, such as “package”, which only builds the EAR file; the EAR can then be stored in the Artifactory server and used for deployment later.
- Post steps: in my case the requirement was to run my pre-stored SoapUI project to make sure the deployed application is running and responding correctly.
- Finally, I used the post-build action to store the binary files for release to the next environment. Here again I used JFrog Artifactory to store the EAR file and the required jars for later use.
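For reference, the freestyle Maven job configured above can also be sketched as a declarative Jenkins pipeline. This is only an illustration of the same steps; the tool name, POM path and SoapUI project file are assumptions you would replace with your own values:

```groovy
// Jenkinsfile sketch of the job above - POM path, tool name and
// SoapUI project file are placeholders, not values from a real setup
pipeline {
    agent any
    // Maven installation name as configured under Global Tool Configuration
    tools { maven 'apache-maven-3.3.9' }
    // poll SVN periodically, matching the "Poll SCM" trigger above
    triggers { pollSCM('H/5 * * * *') }
    stages {
        stage('Checkout') {
            steps { checkout scm }
        }
        stage('Build and deploy') {
            // the "deploy" goal builds the EAR, publishes it to Artifactory
            // and deploys it to the configured BW6 domain/AppSpace
            steps { sh 'mvn -f myapp.parent/pom.xml clean deploy' }
        }
        stage('Smoke test') {
            // run the pre-stored SoapUI project against the deployed application
            steps { sh 'testrunner.sh soapui/myapp-smoke-tests.xml' }
        }
    }
}
```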
The above steps and explanation apply to a development environment. The ability to continuously integrate newly committed code and have it continuously deployed to our development environment is a big step forward. It is the first step in our automation pipeline, leading to fully automated deployments all the way up to the production environment.
I have set up this solution for one of our clients, and it worked really well. Since then the client has come back with new requirements. One of the requests was to use Microsoft’s VSTS as the deployment tool in their deployment framework, so currently I am setting up VSTS to deploy TIBCO BW6 applications. Once this is a success I will share the steps and results with you.
For now, the initial results have been very promising.
Wish me luck with my new challenge.