Replacing ExecuteCommand with ExecuteHTTPRequest in IBM PA V12
The following is a technical note discussing the replacement of ExecuteCommand with ExecuteHTTPRequest in the migration from IBM Planning Analytics server version 11 to version 12.
Background
The Difference
Challenges
The move away from ExecuteCommand in IBM PA V12 poses real challenges, particularly for organizations with years of accumulated automation built on the function. For these companies, the replacement of ExecuteCommand marks a fundamental break that requires re-architecting an entire automation strategy.
The immediate challenge for most implementations is the complete removal of direct operating-system access from TI processes. ExecuteCommand’s ability to spawn PowerShell scripts, run batch files, or execute shell commands let developers extend the capabilities of Planning Analytics almost arbitrarily. In V12’s containerized architecture, direct OS access is fundamentally incompatible with the security and isolation requirements of modern cloud deployments.

Organizations must now identify every instance where a TI process calls an external script and determine, case by case, whether the functionality can be replicated with REST APIs or whether it requires external orchestration tools. The immediate question for every PA developer is: what does this script actually do? During migration, many organizations will discover undocumented PowerShell scripts written a decade ago, batch files that process proprietary data formats, and shell commands that interact with legacy systems that have no REST APIs. Each represents a unique migration challenge with no drop-in solution.

Beyond ExecuteCommand itself, the deprecation creates a cascade effect: TM1RunTI, a peripheral executable often invoked through ExecuteCommand, will also be deprecated.
While ExecuteCommand can be used in countless ways, we found three common uses when examining our implementations:
- Data Directory Backup: No longer required in V12, as it is handled natively through PAW Administration.
- RunTI: A relatively easy migration path, using RunProcess for parallel processing within an instance, or ExecuteHTTPRequest with the tm1.ExecuteWithReturn endpoint to kick off a process in another instance (see the sketch after this list).
- Sending emails from a TI process to notify users of successful process completion, errors that need intervention, data exports, or data validation concerns.
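For the cross-instance case, a minimal sketch is below. The host, database, and process names are placeholders, and the authorization header is an assumption that depends on how your deployment issues credentials; tm1.ExecuteWithReturn itself is a standard TM1 REST API action.

```
# Sketch: trigger a process in another instance over the TM1 REST API.
# <pa-host>, <database>, <process>, and the credential value are placeholders.
vURL = 'https://<pa-host>/api/v1/Databases(''<database>'')/Processes(''<process>'')/tm1.ExecuteWithReturn';
vBody = '{"Parameters":[{"Name":"pRegion","Value":"EMEA"}]}';
ExecuteHTTPRequest('POST', vURL,
  '-h Authorization: <deployment-specific credentials>',
  '-h Content-Type: application/json',
  '-d ' | vBody );
# The response reports the process status (e.g. CompletedSuccessfully) and error log file.
vStatus = HttpResponseGetStatusCode();
```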
Email and Notification Systems
Email Example Migration
In version 11 and prior, sending a simple email update required less than 20 lines of PowerShell, using an IBM-provided or IT-configured SMTP server, plus a TI process wrapper to run the script with your arguments.
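As a rough illustration of that pattern (the script path, recipient, and parameter names here are hypothetical, not from a real implementation), the wrapper amounted to little more than:

```
# V11-style sketch: a TI wrapper that shells out to a PowerShell mailer.
# Script path, recipient, and the pSubject/pBody process parameters are hypothetical.
vScript = '\\fileserver\scripts\SendMail.ps1';
vCmd = 'powershell.exe -ExecutionPolicy Bypass -File "' | vScript |
       '" -To "admin@example.com" -Subject "' | pSubject | '" -Body "' | pBody | '"';
# Second argument 1 = wait for the command to complete before continuing.
ExecuteCommand(vCmd, 1);
```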

The entire implementation could be built and tested in under an hour. Any developer familiar with basic scripting could modify the recipient list, adjust the subject line, or add attachments. The SMTP server handled authentication, retry logic, and delivery: proven infrastructure that required no ongoing maintenance from the PA team. This was the gold standard for "it just works" automation.
Option 1: Modern Email API Services
Let’s be clear about what this means in practice. The ‘pure’ V12 approach involves replacing SMTP with a REST-based email service like SendGrid, Mailgun, or the Microsoft Graph API. Third-party tools like SendGrid and Mailgun typically charge by usage, while some organizations will have access to the Microsoft Graph API through their existing M365 services. Challenges to this approach include cost, vendor lock-in, credential management, increased complexity, and integration with existing network policies on accepted domains for inbound email. For comparison with the simple ExecuteCommand solution above, here is what a solution leveraging the Microsoft Graph API looks like in the new ExecuteHTTPRequest ecosystem; sketches of both processes follow the breakdown below.

The implementation now requires two separate TI processes working in tandem. The first process must:
- POST credentials to Microsoft’s OAuth endpoint
- Parse the JSON response (with no native JSON support in TI)
- Extract and cache the bearer token
- Handle token expiration and refresh logic
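A minimal sketch of the first process, assuming a client-credentials grant. The tenant and client placeholders are assumptions, the token is extracted with raw string scanning (since TI has no JSON parser), and caching and refresh are reduced to a comment:

```
# Process 1 (sketch): request an OAuth bearer token from Microsoft's identity platform.
# <tenant-id>, <client-id>, and <client-secret> are placeholders.
vTokenURL = 'https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token';
vBody = 'grant_type=client_credentials&client_id=<client-id>' |
        '&client_secret=<client-secret>&scope=https://graph.microsoft.com/.default';
ExecuteHTTPRequest('POST', vTokenURL,
  '-h Content-Type: application/x-www-form-urlencoded',
  '-d ' | vBody );
vJson = HttpResponseGetBody();
# No native JSON support in TI, so scrape the token out of the raw response.
vKey = '"access_token":"';
nStart = Scan(vKey, vJson) + Long(vKey);
vRest = SubSt(vJson, nStart, Long(vJson));
vToken = SubSt(vRest, 1, Scan('"', vRest) - 1);
# ...write vToken and its expiry to a control cube for the second process...
```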
The second process will then:
- Retrieve the cached token
- Construct a properly formatted JSON email object
- POST to the Graph API endpoint
- Parse the response to confirm delivery
- Implement retry logic for transient failures
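And a matching sketch of the second process. The sender UPN, recipient, and message fields are placeholders; the 202 status check reflects Graph’s documented success code for sendMail, and retry logic is again left as a comment:

```
# Process 2 (sketch): send mail through the Microsoft Graph sendMail endpoint.
# vToken is read from wherever process 1 cached it; <sender-upn> is a placeholder.
vMailURL = 'https://graph.microsoft.com/v1.0/users/<sender-upn>/sendMail';
vPayload = '{"message":{"subject":"Nightly load complete",' |
  '"body":{"contentType":"Text","content":"The nightly load finished without errors."},' |
  '"toRecipients":[{"emailAddress":{"address":"admin@example.com"}}]}}';
ExecuteHTTPRequest('POST', vMailURL,
  '-h Authorization: Bearer ' | vToken,
  '-h Content-Type: application/json',
  '-d ' | vPayload );
vStatus = HttpResponseGetStatusCode();
# Graph returns 202 Accepted on success; anything else should feed retry/alert logic.
```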
What had once been 20 lines of PowerShell and a simple wrapper now likely becomes 100+ lines of TI code spread across multiple TI processes, new control objects, and OAuth complexity that most PA developers have never had to manage. If it breaks at 2 a.m. during month-end close, support teams are debugging JSON parsing errors and OAuth token refresh failures instead of simply checking whether the SMTP server is up.
Option 2: External Middleware Service
A more pragmatic approach for many organizations is to deploy custom or low-code middleware that accepts simple REST calls from PA and handles email delivery internally. This shifts complexity out of TI processes and into a dedicated service layer, implemented either with a low-code platform like Microsoft Power Automate or Azure Logic Apps, or as a custom-built microservice.
The low-code approach uses a visual workflow designer to create an HTTP trigger that accepts simple JSON payloads with recipient, subject, and body fields. The workflow handles authentication and the SMTP connection or REST-based email service in the background, while exposing a webhook for PA to call. From PA’s perspective, the TI process simply posts minimal JSON to the webhook URL, as sketched after the list below. This comes with potential considerations:
- Platform licensing costs (Power Automate: $15/user/month)
- Webhook URL security and rotation policies
- Flow monitoring and alerting infrastructure
- Knowledge transfer and ongoing maintenance
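From the TI side, the entire integration reduces to something like the following; the webhook URL and JSON field names are placeholders defined by whatever flow or service you build:

```
# Sketch: post a minimal notification payload to a middleware webhook.
# The URL and JSON field names are placeholders set by your flow or service.
vWebhook = 'https://<your-webhook-url>';
vPayload = '{"to":"admin@example.com","subject":"Load complete",' |
           '"body":"The nightly load finished without errors."}';
ExecuteHTTPRequest('POST', vWebhook,
  '-h Content-Type: application/json',
  '-d ' | vPayload );
```

The same call shape applies to the custom microservice option described next; only the endpoint and authentication details change.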
Organizations with development resources can instead build a lightweight REST API to accomplish the same tasks: a small custom service exposing an endpoint that PA can call. The service handles SMTP connections to your existing internal mail infrastructure, providing a simple API for PA to interface with, and deploys onto your existing AWS/Azure/on-prem infrastructure. This approach keeps the PA side simple while keeping SMTP internal and avoiding external vendor costs. However, it comes with its own considerations, like:
- Development effort
- Deployment infrastructure (containerized service on your cloud, on prem)
- Network configuration (connecting PA to your middleware)
- Authentication strategy (OAuth vs. API keys vs. mutual TLS)
- Monitoring and logging
- Maintenance overhead
The Cost of an Email
What was once a simple PowerShell script and TI wrapper has expanded into a multi-layered architecture requiring decisions about vendors, authentication mechanisms, infrastructure, and ongoing costs. A typical organization sending automated emails now faces:
- Development time to design, implement, and test the new approach
- Potential ongoing costs for external email API services or Power Automate licensing
- Infrastructure for new services to deploy and monitor
- Credential management for new secrets to secure and rotate
- Dependency management for external services that can fail independently of PA
The Middle Ground Dilemma
Strategic Benefits
Conclusion
More than a function replacement, the transition from ExecuteCommand to ExecuteHTTPRequest represents a fundamental evolution reflecting Planning Analytics’ maturation into a cloud-native platform. While migration requires significant effort and careful planning, organizations that invest in proper implementation will find their PA environments more secure, maintainable, and aligned with modern integration patterns. The scope of this change should not be underestimated. Organizations treating this as a standard version upgrade will be met with significant friction, while organizations approaching this as an automation modernization project will be much better positioned for success.