The following is a technical note discussing the replacement of ExecuteCommand with ExecuteHTTPRequest in the migration from IBM Planning Analytics server version 11 to version 12.
Background
Those who have worked with IBM PA for any amount of time know that ExecuteCommand has been the Swiss Army knife TM1 developers have used to accomplish tasks that TI processes cannot handle natively. Need to send an email? Spawn a PowerShell script. Need to process a data file that TI does not support out of the box? Write a script and call it from your TI process. ExecuteCommand has solved countless problems for developers when OS-level operations were required to automate the data pipeline in enterprise systems. Longtime enterprise users of Planning Analytics often have dozens of scripts woven into their nightly and weekly chores. The fundamental question for every organization on Planning Analytics is: can your existing automation be replicated through HTTP calls, or does it require OS-level access that no REST API can provide?
The Difference
In V12, the ExecuteCommand function is no longer supported and there is no one-for-one replacement. The function that most closely resembles the functionality is ExecuteHTTPRequest. While ExecuteCommand was an escape hatch that let you run arbitrary code on the server, giving you unlimited flexibility at the cost of security risks, OS dependencies, and debugging headaches, the new ExecuteHTTPRequest is the opposite: a narrowly scoped, deliberately constrained, purpose-built function for REST API communication. No intermediate files. No PowerShell wrappers. No spawning additional processes. The request is handled, the response is cached in memory, and you parse it directly in your TI script.
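To make the contrast concrete, here is a minimal TI sketch of both patterns. The ExecuteHTTPRequest flag syntax and the HttpResponseGetStatusCode/HttpResponseGetBody helpers follow IBM’s V12 documentation as we understand it (the status code is handled as a string here); the URL and script path are placeholders, so verify the details against your release.

```
# V11 and prior: shell out to the OS from the TI process
ExecuteCommand( 'powershell.exe -File D:\TM1\scripts\DoSomething.ps1', 1 );

# V12: a scoped HTTP call whose response stays inside the TI context
ExecuteHttpRequest( 'GET', 'https://api.example.com/v1/status',
    '-h Accept: application/json' );
sStatus = HttpResponseGetStatusCode();
sBody   = HttpResponseGetBody();
LogOutput( 'INFO', 'Returned HTTP ' | sStatus | ' with body: ' | sBody );
```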
Challenges
The move away from ExecuteCommand in IBM PA V12 poses real challenges, particularly for organizations with years of accumulated automation built on the function. For these companies, the replacement of ExecuteCommand marks a fundamental break that requires re-architecting an entire automation strategy.
The immediate challenge for most implementations is the complete removal of direct operating system access from TI processes. ExecuteCommand’s ability to spawn PowerShell scripts, run batch files, or execute shell commands gave developers a way to extend the capabilities of Planning Analytics. In V12’s containerized architecture, direct OS access is fundamentally incompatible with the security and isolation requirements of modern cloud deployments. Organizations must now identify every instance where a TI process calls an external script and determine, case by case, whether the functionality can be replicated with REST APIs or whether it requires external orchestration tools. The immediate question for every PA developer is: what does this script actually do? During migration, many organizations will discover undocumented PowerShell scripts written a decade ago, batch files that process proprietary data formats, and shell commands that interact with legacy systems that have no REST APIs. Each represents a unique migration challenge with no drop-in solution.

Beyond ExecuteCommand itself, the deprecation creates a cascade effect: TM1RunTI, a peripheral EXE often invoked with ExecuteCommand, will also be deprecated.
While ExecuteCommand can be used in countless ways, we found three common uses when examining our implementations:
- Data Directory Backup: No longer required in V12, as it is handled natively through PAW Administration.
- RunTI: A relatively easy migration path using RunProcess for parallel processing within an instance, or using ExecuteHTTPRequest with the tm1.ExecuteWithReturn endpoint to kick off a process in another instance (see the sketch after this list).
- Sending emails from a TI process to notify users of successful process completion, errors that need intervention, data exports, or data validation concerns.
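For the RunTI case, a minimal sketch of both replacement paths follows. The process name, parameter, host, and credentials are hypothetical, and the exact REST URL shape (including any database segment) depends on your V12 deployment.

```
# Within the same instance: RunProcess replaces TM1RunTI for async execution
RunProcess( 'Load Sales', 'pRegion', 'EMEA' );

# Against another instance: POST to its REST API via ExecuteHTTPRequest
sUrl = 'https://pa.example.com/api/v1/Processes(''Load Sales'')/tm1.ExecuteWithReturn';
ExecuteHttpRequest( 'POST', sUrl,
    '-h Content-Type: application/json',
    '-h Authorization: Basic <base64-credentials>',
    '-d { "Parameters": [ { "Name": "pRegion", "Value": "EMEA" } ] }' );
```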
Below, we will focus on the available solutions for email integration as we move to the modernized V12 TM1 server.
Email and Notification Systems
Emailing from TI processes has historically been implemented with a PowerShell script that interfaces with SMTP servers. Migrating this functionality to V12 requires integrating with and maintaining a dedicated email API such as SendGrid or Office 365/Graph. Some organizations are considering other external notification services that can communicate more directly via HTTP requests. Each approach may require new infrastructure, new authentication mechanisms, and significant development effort. In the following sections, we will illustrate the technical changes required by the migration, as well as the architectural complexity that replaces what was once a straightforward automation.
Email Example Migration
In version 11 and prior, sending a simple email update required less than 20 lines of PowerShell, an IBM-provided or IT-department-configured SMTP server, and a TI process wrapper to run the script with your arguments.
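The wrapper was typically no more than the sketch below; the paths, server, and addresses are illustrative, and the PowerShell script itself was usually a short Send-MailMessage call against the internal SMTP relay.

```
# V11 pattern: a thin TI wrapper around a PowerShell mail script
sTo      = 'finance-team@example.com';
sSubject = 'Nightly load complete';
sBody    = 'All chores finished without errors.';

sCommand = 'powershell.exe -File D:\TM1\scripts\SendMail.ps1'
         | ' -To "' | sTo | '" -Subject "' | sSubject | '" -Body "' | sBody | '"';

# Second argument 1 = wait for the script to finish before continuing
ExecuteCommand( sCommand, 1 );
```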
The entire implementation could be built and tested in under an hour. Any developer familiar with basic scripting could modify the recipient list, adjust the subject line, or add attachments. The SMTP server handled authentication, retry logic, and delivery: proven infrastructure that required no ongoing maintenance from the PA team. This was the gold standard for "it just works" automation.
In V12, this straightforward approach becomes significantly more complex. Organizations face several viable migration paths, each with different trade-offs.
Option 1: Modern Email API Services
Let’s be clear about what this means in practice. The ‘pure’ V12 approach involves replacing SMTP with a REST-based email service like SendGrid, MailGun, or the Microsoft Graph API. Third-party tools like SendGrid and MailGun typically charge by usage, while some organizations will have access to the Microsoft Graph API through their existing M365 services. Challenges to this approach include cost, vendor lock-in, credential management, increased complexity, and integration with existing network policies on accepted domains for inbound emails. For comparison with the simple ExecuteCommand solution above, here is what a solution that leverages the Microsoft Graph API looks like in the new ExecuteHTTPRequest ecosystem.
The implementation now requires two separate TI processes working in tandem. The first process must (see the sketch after this list):
- POST credentials to Microsoft’s OAuth endpoint
- Parse the JSON response (with no native JSON support in TI)
- Extract and cache the bearer token
- Handle token expiration and refresh logic
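A minimal sketch of that token process, assuming a client-credentials flow: the tenant, client, and control-cube names are placeholders, and the hand-rolled string scraping stands in for the JSON parsing TI lacks.

```
# Process 1 (sketch): request a bearer token via client credentials
sUrl  = 'https://login.microsoftonline.com/your-tenant-id/oauth2/v2.0/token';
sForm = 'client_id=your-client-id'
      | '&client_secret=your-client-secret'
      | '&scope=https://graph.microsoft.com/.default'
      | '&grant_type=client_credentials';

ExecuteHttpRequest( 'POST', sUrl,
    '-h Content-Type: application/x-www-form-urlencoded',
    '-d ' | sForm );

# Status code handled as a string; confirm the return type for your release
IF( HttpResponseGetStatusCode() @<> '200' );
    LogOutput( 'ERROR', 'Token request failed: ' | HttpResponseGetBody() );
    ProcessError;
ENDIF;

# No native JSON support: scrape the token out of the response body
sResp  = HttpResponseGetBody();
nPos   = SCAN( '"access_token":"', sResp );
sToken = SUBST( sResp, nPos + 16, LONG( sResp ) - nPos - 15 );
sToken = SUBST( sToken, 1, SCAN( '"', sToken ) - 1 );

# Cache for the send process; expiry/refresh handling omitted for brevity
CellPutS( sToken, '}APIConfig', 'GraphToken', 'Value' );
```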
While the second process (sketched after this list) will:
- Retrieve the cached token
- Construct a properly formatted JSON email object
- POST to the Graph API endpoint
- Parse the response to confirm delivery
- Implement retry logic for transient failures
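And a matching sketch of the send step. The sender address, recipient, and control-cube names are again placeholders; Graph’s sendMail endpoint returns 202 Accepted when the message is queued.

```
# Process 2 (sketch): send mail through Microsoft Graph
sToken = CellGetS( '}APIConfig', 'GraphToken', 'Value' );
sUrl   = 'https://graph.microsoft.com/v1.0/users/noreply@example.com/sendMail';

# Hand-built JSON payload, one more thing to keep syntactically valid
sJson = '{ "message": { "subject": "Nightly load complete",'
      | ' "body": { "contentType": "Text", "content": "All loads finished without errors." },'
      | ' "toRecipients": [ { "emailAddress": { "address": "finance-team@example.com" } } ] } }';

ExecuteHttpRequest( 'POST', sUrl,
    '-h Content-Type: application/json',
    '-h Authorization: Bearer ' | sToken,
    '-d ' | sJson );

IF( HttpResponseGetStatusCode() @<> '202' );
    # Retry/backoff logic for transient failures would go here
    LogOutput( 'ERROR', 'sendMail failed: ' | HttpResponseGetBody() );
    ProcessError;
ENDIF;
```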
What had once been 20 lines of PowerShell and a simple wrapper now becomes 100+ lines of TI code spread across multiple processes, new control objects, and OAuth complexity that most PA developers have never had to manage. If it breaks at 2am during month-end close, support teams are now debugging JSON parsing errors and OAuth token refresh failures instead of simply checking whether the SMTP server is up.
Option 2: External Middleware Service
A more pragmatic approach for many organizations is to deploy custom or low-code middleware services that can accept simple REST calls from PA and handle email services internally. This shifts complexity out of TI processes and into a dedicated service layer. This can be implemented with either a low-code platform like Microsoft Power Automate or Azure Logic Apps, or through custom built microservices.
The low-code approach involves using a visual workflow designer to create an HTTP trigger that accepts simple JSON payloads with recipient, subject, and body fields. This workflow handles the authentication and the SMTP connection or REST-based email service in the background, while exposing a webhook to call from PA. From PA’s perspective, the TI process simply posts minimal JSON to the webhook URL, as the sketch after this list shows. This comes with potential considerations:
- Platform Licensing Costs (Power Automate: $15/user/month)
- Webhook URL security and rotation policies
- Flow monitoring and alerting infrastructure
- Knowledge transfer, ongoing maintenance
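The PA side then collapses to a few lines. The webhook URL and field names below are placeholders; in practice they are whatever your flow defines.

```
# Post a minimal payload to a hypothetical Power Automate / Logic Apps webhook
sWebhook = 'https://example.logic.azure.com/workflows/your-flow-id/invoke';
sJson = '{ "to": "finance-team@example.com",'
      | ' "subject": "Nightly load complete",'
      | ' "body": "All loads finished without errors." }';

ExecuteHttpRequest( 'POST', sWebhook,
    '-h Content-Type: application/json',
    '-d ' | sJson );
```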
Organizations with development resources can instead build a lightweight REST API to accomplish the same tasks. This approach involves building and deploying a small custom service that exposes a REST endpoint PA can call, handles SMTP connections to your existing internal mail infrastructure, and runs on your existing AWS/Azure/on-prem infrastructure. It keeps the PA side simple while keeping SMTP internal and avoiding external vendor costs. However, it comes with its own considerations:
- Development effort
- Deployment infrastructure (containerized service on your cloud, on prem)
- Network configuration (connecting PA to your middleware)
- Authentication strategy (OAuth vs. API keys vs. mutual TLS)
- Monitoring and logging
- Maintenance overhead
In summary, the middleware approach trades TI complexity for infrastructure complexity. The work is not eliminated; it is relocated to a team and toolset better equipped to manage it.
The Cost of an Email
What was once a simple PowerShell script and TI wrapper has expanded into a multi-layered architecture requiring decisions about vendors, authentication mechanisms, infrastructure, and ongoing costs. A typical organization sending automated emails now faces:
- Development time to design, implement, and test the new approach
- Potential ongoing costs for external email API services or Power Automate licensing
- New services to deploy and monitor
- New secrets to secure and rotate
- External dependencies that can fail independently of PA
This single use case, automated email, illustrates the broader migration challenge. What was once a trivial automation with ExecuteCommand now requires architectural decisions, external dependencies, and ongoing operational overhead. Multiply this complexity across every use of ExecuteCommand in a PA implementation, and the scope of V12 migration becomes clear.
The Middle Ground Dilemma
The architectural magnitude of this change has led some organizations to seek middle-ground approaches to migration issues. Hubert Heijkers, STSM and Program Director with the IBM PA team, has created an open source ExecuteCommand service that acts as middleware, accepting ExecuteHTTPRequest calls from V12 and executing OS commands on behalf of Planning Analytics; the project is published on GitHub. From an architecture perspective, this moves ExecuteCommand outside of PA’s container instead of eliminating it. While this is a viable bridge that lets you keep your existing scripts, it still requires deploying and maintaining another service with OS-level command execution access, which defeats the isolation benefits of containerized PA: you are still executing arbitrary OS commands, just initiated via HTTP instead of directly on the server. For organizations with strict timelines this allows a quick move to V12, but it only buys time to re-architect gradually instead of all at once.
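Purely for illustration, a bridge call from TI might look like the sketch below. The endpoint, host, and payload shape are hypothetical; the project’s own documentation defines the real contract.

```
# Hypothetical call shape for an ExecuteCommand bridge service; consult the
# project's documentation for the actual endpoint and payload contract
ExecuteHttpRequest( 'POST', 'https://cmd-bridge.internal.example.com/execute',
    '-h Content-Type: application/json',
    '-d { "command": "powershell.exe -File D:\\TM1\\scripts\\SendMail.ps1" }' );
```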
Strategic Benefits
Nevertheless, the introduction of ExecuteHTTPRequest in V12 also brings certain advantages. While migration presents real challenges, the architectural shift away from ExecuteCommand addresses long-standing security and operational concerns that linger in existing PA implementations. One of the most significant vulnerabilities in Planning Analytics has been the ability to execute arbitrary code on the server with OS-level access; this is being replaced with constrained, well-defined HTTP requests that align with modern security principles. Additionally, ExecuteHTTPRequest is inherently platform agnostic, so it operates consistently across Windows, Linux, containers, and cloud providers without OS-specific scripts to maintain. Debugging and observability also improve: HTTP operations occur within PA’s execution context, with structured response handling (status codes, headers, bodies), rather than in external processes that run outside TM1’s logging framework. The architectural discipline that ExecuteHTTPRequest enforces positions PA well in an API-driven landscape. These improvements do not eliminate implementation complexity, but they do provide tangible value to organizations willing to invest in a proper implementation.
Conclusion
More than a function replacement, the transition from ExecuteCommand to ExecuteHTTPRequest represents a fundamental evolution reflecting Planning Analytics’ maturation into a cloud-native platform. While migration requires significant effort and careful planning, organizations that invest in proper implementation will find their PA environments more secure, maintainable, and aligned with modern integration patterns. The scope of this change should not be underestimated. Organizations treating this as a standard version upgrade will be met with significant friction, while organizations approaching this as an automation modernization project will be much better positioned for success.