One of the coolest features of CLM 4.0.4 (and, personally, the most awaited), which I wanted to blog about, is the Simulation Dependency Build. This feature lets you run a Dependency Build without physically building anything, while still generating all the metadata involved in the build.
What does “simulation” mean and what are the benefits?
Simulating a Dependency Build means that the build runs through all of its steps without generating any outputs; the output-generation steps of the Dependency Build process are simply skipped. Because every other step completes normally, all the metadata associated with the Dependency Build process is still generated. The latter is important to bear in mind: it is for this reason that a simulated build is no different from a non-simulated one with respect to (a) dependency resolution and (b) the metadata of built files (the build maps). Therefore, a simulation build will affect what subsequent builds (whether simulated or not) consider in need of rebuilding.
As a side note, to reinforce the previous point: do not confuse this feature with the Preview build feature. Preview mode just shows what would be built, while Simulation does execute the build.
This approach brings a set of benefits:
- The first advantage is the build machine resource savings you obtain from running a build process without the output-generation steps.
- The feature also allows builds to run on non-z/OS platforms, which is another great advantage of this “simulation” mode: you can exercise Dependency Builds on a different platform.
Note: there is one restriction: if the “Link-Edit” option is used on a Translator, then the build must run on z/OS.
The solution diagram shown next has no real differences from a “regular” Dependency Build, but there are some additional considerations, described below:
Build Server: installation considerations
Any supported platform can act as the Build Server, subject to the “Link-Edit” restriction described earlier. Let’s briefly go through the Build Server installation considerations:
- On this machine you need to install the System Z Build Toolkit (or the i Toolkit if you were to simulate i Builds). If it’s a non-Z platform, you can copy the “buildtoolkit” folder from your Z Build Server.
- The Build Agent has to be installed and running.
- For non-Z platforms, you will need to copy “ibmjzos.jar” into the ext folder of the JDK you are going to use. This file can be obtained from inside a z/OS JDK at http://www-03.ibm.com/systems/z/os/zos/tools/java/products/sdk7_31.html
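As a minimal sketch, the JDK preparation step for a non-Z Build Server could look like the following (the JDK location, the download location of “ibmjzos.jar”, and the ext folder path are hypothetical; adjust them to your own installation):

```shell
# Hypothetical paths -- adjust to your environment.
JDK_HOME=/opt/ibm/java              # the JDK the Build Agent will use
JZOS_JAR=/tmp/ibmjzos.jar           # obtained from inside a z/OS JDK

# The extension folder of a JDK is typically jre/lib/ext
cp "$JZOS_JAR" "$JDK_HOME/jre/lib/ext/"

# Verify the jar is now visible in the ext folder
ls -l "$JDK_HOME/jre/lib/ext/ibmjzos.jar"
```

Whichever JDK you point the Build Agent at, make sure it is the same one that receives the jar, or the simulated build will fail to load the z/OS support classes.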
RTC Server: logical considerations
There are some server-side considerations to take into account for the definitions used in combination with this feature.
Considerations for the Build Definition:
- The load directory has to be server specific, which means that it has to be valid for the platform the Build Server is running on.
- Keep the resource prefix at its original value; do not change it even when running on a different platform, as otherwise there will be divergences when comparing the registries of what was built (remember, the generated metadata is good!).
- “Trust outputs” must be enabled, or the build will throw an error.
Considerations for the Build Engine definition when the Build Server is a non-Z platform:
- The environment variables needed for Z platforms have to be set. I will cover these later in this post with an example.
Simulation with Mortgage sample and a non-Z Build Server
In this section of the post I want to show the usage of this feature with the Mortgage sample, using a Linux machine as the Build Server. Some of the information may sound redundant, but if you are trying this for the first time, I think it can be useful for a full understanding.
The Build Server I am using is a Linux machine (a distributed machine, not zLinux). On this machine I performed the following steps:
- Downloaded a platform-specific Build Toolkit. In this case, I downloaded the zip format and uncompressed it to the desired location on my machine.
- From the Toolkit I installed and configured the Rational Build Agent in this machine. You can follow the directions in the InfoCenter topic.
- Installed a JDK (you can reuse an existing compatible one), and copied the “ibmjzos.jar” file into the ext folder.
- Copied the “buildtoolkit” folder from my Z Build Server. Note that I didn’t replace the folder from my Linux toolkit installation, though that could be another option, as the copied folder just adds the Z-specific libraries we will need.
- Created a password file for the build user. You can do this using the JBE command located in the “buildsystem/buildengine” folder of the toolkit installation as follows: ./jbe.sh -createPasswordFile sim.password
Here, “sim.password” is the name I wanted to give the generated file.
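Putting the machine-side steps together, a rough sketch on the Linux Build Server could look like the following (the host name, user, and all paths are hypothetical; adjust them to your environment):

```shell
# Hypothetical locations -- adjust to your environment.
TOOLKIT_HOME=/opt/rtc-buildtoolkit      # where the downloaded zip was uncompressed
ZOS_HOST=zserver.example.com            # the existing Z Build Server
ZOS_TOOLKIT=/usr/lpp/jazz/buildtoolkit  # toolkit location on the Z machine (assumed)

# Bring over the Z-specific "buildtoolkit" folder from the Z Build Server,
# keeping it alongside (not replacing) the Linux toolkit installation
scp -r "builduser@$ZOS_HOST:$ZOS_TOOLKIT" "$TOOLKIT_HOME/zos-buildtoolkit"

# Create the password file for the build user
cd "$TOOLKIT_HOME/buildsystem/buildengine"
./jbe.sh -createPasswordFile sim.password
```

This is only a sketch of the steps described above, not a definitive recipe; in particular, the toolkit path on the Z side and the transfer method (scp here) will depend on how your z/OS system is set up.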
- Created a new Build Engine in JKE Banking for the installed agent; in my example I called it “jke.rba.engine.dev.sim.lin”. A set of properties needs to be added to this Build Engine, as it refers to a non-Z platform machine. You can find the details in this wiki page section; in my test it looked like the following:
Once these configuration steps were completed, I was ready to run a Simulation Dependency Build using my Linux machine. I made a change to the “JKEMPMT.cbl” file in the MortgageApplication-JKEMPMT project, delivered it to the stream, and made the following build request:
Once the build has completed, the result appears like any other build result, with the addition of “(Simulation)” as part of the label. In every other respect it is a regular result; for example, in the External Links section I can find the Build Map of the modified file and the link to the Build Report:
One particular difference, however, is the “simulationRegistry.xml” file that appears in the Downloads tab. The format of this file is similar to that of a build report, and it is incremental between simulation runs; it keeps track of the outputs that would have been generated by the build execution, information that is used to resolve physical dependencies.
In this post I wanted to talk about the Simulation Dependency Build feature and show a small example of how I used it on a Linux box, in the hope it can serve others who want to use or investigate the feature.
I wanted to finish the post with some comments on additional topics that you may find interesting …
Simulation builds and impact on other EE features
When combining simulated builds with promotion, you must check the “Skip timestamp check when build outputs are promoted” option on the Promotion Definition to avoid any potential mismatch between the timestamps stored in the simulation registry and the real ones of the binaries present in the system (from non-simulated builds).
Combining simulation and non-simulation builds
When running a simulation build after a non-simulation one, it is recommended to run it with “Build changed items only” disabled, to avoid the simulation registry being out of date because the non-simulation build created new outputs.
I think this is a really cool feature that can be used to update the dependency metadata while saving resources, either by running the build server on an alternate platform or simply through the steps it skips. It also allows you to self-enable and further understand how Dependency Build works, without depending on a Z platform. Finally, this feature can be a great advantage in migration scenarios from other solutions to RTC; more information on that usage is to come, stay tuned!