ACG Business Analytics Blog

IMPORTANT: Your TM1 System May Stop Working on Nov 25, 2016

Posted by Kevin Nichols on Sun, Sep, 11, 2016 @ 09:43 AM

Important Notice for All IBM Cognos TM1 Users:
 
There is a possibility that your TM1 system may stop working on November 25, 2016.
The existing TM1 security certificates used in the Secure Sockets Layer (SSL) communications between TM1 clients and servers are scheduled to expire on November 24, 2016. If your system is affected, your users will not be able to connect to your TM1 Server or use TM1 Web, TM1 Perspectives, etc. on November 25, 2016 unless you take appropriate action.
 
Is Your System Impacted?
If you are running TM1 version 10.x, your system is most likely impacted, as SSL is turned ON by default in the 10.x releases. If you are running an older version of TM1, we recommend a system review to ensure continuity of service.
 
What should you do?
IBM has created a new certificate that replaces the soon-to-expire one and provides even stronger encryption (2048-bit vs. 1024-bit). The new certificate has an expiration date of August 25, 2022 and needs to be installed in place of the existing one. The details for converting your TM1 environment to the new SSL certificate can be found in the following IBM Tech Note:
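The Tech Note has the authoritative steps, but for orientation, the SSL-related settings live in the server's tm1s.cfg file. A minimal sketch of what they typically look like follows; the file paths and certificate names below are illustrative assumptions and will vary by installation, so verify them against the Tech Note:

```ini
# tm1s.cfg – SSL-related parameters (illustrative values only)
UseSSL=T
# Point these at the new 2048-bit certificate files shipped by IBM;
# the file names and paths below are examples, not defaults to copy.
SSLCertificateFile=C:\Program Files\ibm\cognos\tm1_64\bin64\ssl\tm1svrcert2048.pem
SSLCertAuthority=C:\Program Files\ibm\cognos\tm1_64\bin64\ssl\tm1ca.pem
SSLCertificateID=tm1svrcert2048
```

Remember that TM1 clients (Perspectives, Architect, TM1 Web) have their own certificate stores that must be updated as well, as described in the Tech Note.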
 
 
Please contact ACG at support@acgi.com if you have questions, concerns or would like to discuss. We are happy to help review your TM1 system, confirm all updates are made properly and ensure TM1 system availability.

Object Security Integration with IBM Cognos TM1

Posted by James Doren on Sat, Jun, 27, 2015 @ 10:40 AM

One of the key security-related selling points within IBM Cognos TM1 is the tool’s ability to assign different security levels to different objects within the TM1 model for any given user or group.  Properly handling object security interaction within a TM1 model can provide an unparalleled level of control over the system, user base, and data.  As we have seen at many clients, it will improve the reliability of your data by ensuring that only the users qualified to change data at a given intersection point will be doing so, and allows for unique configurations by personalizing the security rights for each individual group. This added flexibility helps users get the most out of their TM1 model, so understanding exactly how it works is very important.

The security rights can be set up to gain precise control over what users can and cannot see, add, delete, or change.  There are four different levels of object security that pertain to a user’s ability to manipulate data:

  • Cube Security
  • Dimension Security
  • Element Security
  • Cell Level Security

For the following examples, we will assume a single user is assigned to a single group.  This way, when we talk about the ‘User’, we understand that this user is part of a TM1 security group.  Let’s look at these different levels of object security in their simplest form.  Say, for example, a user is assigned READ access to a cube called ‘Planning Cube A’. If this is the sole security assignment, then the user will have READ access on all intersections of the cube. If a user is assigned READ access to a dimension called ‘Time’, then the user will have READ access to all intersections involving the ‘Time’ dimension.  Now let’s say that the only security assigned to a user is element-level security on the ‘Time’ dimension.  For example, the user is assigned READ access for the element ‘2015’ within the ‘Time’ dimension, which gives the user READ access ONLY to the element ‘2015’.  The same concept applies to cell-level security.

Simple enough, right?  Well, the tricky part lies within the way the security settings for these four different objects interact with each other when multiple levels of object security are assigned.  Let us look at the example below:

You have a cube called ‘Planning Cube A’ with the following dimensions:

  1. Version
  2. Territory
  3. Account
  4. Year
  5. Measure

You assign a user READ access to the ‘Planning Cube A’ cube, but you also assign the same user WRITE access to all of the elements in this cube.

What do you think would happen in this example? In this case, the READ access of the cube will override the WRITE access of the elements, allowing the user to view the data within the cube but not allowing them to update any of the data. 

Let us look at another example, assuming we are talking about the same cube with the same dimensions as the last example: 

You assign a user WRITE access to the ‘Planning Cube A’ cube, WRITE access to all of the elements within all of the dimensions EXCEPT for the ‘Version’ dimension where you assign the user READ access to all of the elements within the ‘Version’ dimension. 

In this case, the elements in the ‘Version’ dimension identify every intersection in the cube, and since the user is assigned READ access to all elements in this dimension, the user is unable to update data within the cube. 

As stated by IBM, “When groups have security rights for a cube, those rights apply to all dimensions in the cube, unless you further restrict access for specific dimensions or elements”.
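The “most restrictive wins” behavior in both examples can be sketched in a few lines of Python. This is a simplified model of our own, not IBM code: it ignores cell security and unassigned defaults, but it reproduces the two outcomes above.

```python
# Simplified model of TM1 object-security interaction: the effective
# right at a cell intersection is the MOST RESTRICTIVE of the cube
# right and the element rights that identify that intersection.
# (Illustrative only; real TM1 also layers in dimension and cell
# security and has its own default-inheritance rules.)

RANK = {"NONE": 0, "READ": 1, "WRITE": 2}

def effective_access(cube_right, element_rights):
    """Return the most restrictive of the cube right and the
    element rights at one intersection."""
    rights = [cube_right] + list(element_rights)
    return min(rights, key=lambda r: RANK[r])

# Example 1: READ on the cube overrides WRITE on every element.
print(effective_access("READ", ["WRITE", "WRITE"]))   # READ

# Example 2: WRITE on the cube, but READ on the 'Version' element.
print(effective_access("WRITE", ["READ", "WRITE"]))   # READ
```

Running it shows why the user in the second example cannot update data: every intersection involves some ‘Version’ element, and that READ caps the effective right.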

Why this type of interaction is useful

This type of object security interaction is essential for having full control over your users and groups.  By setting security access at different object levels, you can pinpoint and assign the security settings for a given user exactly how you want them to be.  Imagine this scenario, referring to the same cube/dimensions as the prior examples:

You wish to have all groups be able to view and read cube data for all territories specified in the ‘Territory’ dimension.  You also want each group to be able to update cube data ONLY for their own territory and ONLY for ‘2015’. 

By properly manipulating the different levels of object security, and utilizing the way they interact with each other, this can be accomplished quite easily with no coding and no additions to the security model. 

To achieve this, we would take the following steps:

  1. Assign each Territory group WRITE access to the ‘Planning Cube A’ cube
  2. Assign each Territory group READ access to all territories (element level security) within the ‘Territory’ dimension that do not apply to that group
    • For example, assign the ‘North’ territory group read access to all elements except for ‘North’ in the Territory dimension
  3. Assign each group READ access to all elements except for the ‘2015’ element in the year dimension (element level security)

With this security scheme set up, users will be able to view all cube data, but will only be able to update cube data related to the territories applicable to them. 
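The three steps above can be sketched as a small script that generates the assignments each territory group needs. The dimension, element, and group names are illustrative, and the grant notation is our own shorthand, not TM1 syntax:

```python
# Generate the element-security grants for one territory group under
# the scheme described above: WRITE on the cube, READ on every other
# group's territory, and READ on every year except 2015.
# All names here are made up for illustration.

TERRITORIES = ["North", "South", "East", "West"]
YEARS = ["2013", "2014", "2015"]

def build_assignments(group_territory):
    """Return the grants for the group owning `group_territory`.
    Elements left unlisted fall back to the cube's WRITE access."""
    grants = {"cube:Planning Cube A": "WRITE"}
    for t in TERRITORIES:
        # Step 2: READ on every territory element except the group's own.
        if t != group_territory:
            grants[f"Territory:{t}"] = "READ"
    for y in YEARS:
        # Step 3: READ on every year element except the writable '2015'.
        if y != "2015":
            grants[f"Year:{y}"] = "READ"
    return grants

print(build_assignments("North"))
```

Because the group’s own territory and ‘2015’ carry no READ restriction, those intersections inherit the cube’s WRITE access, which is exactly the desired behavior.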

While these are just a few examples of how the interaction between object security works, there are many great applications that can come out of fully understanding how the relationships work.  In summary, if a cube has READ access specified for a group, then even if the elements or cells are specified as WRITE access, users in the group will not be able to update any cube data.  However, if a cube is given WRITE access for a specified group, all of the dimensions by default take the access of the cube security assignments.  It is in this way, a sort of top-down approach, that specific intersections and elements can be restricted for a particular group. 

Topics: TM1 Technology, IBM Cognos, IBM Cognos TM1

Trends in Enterprise Performance Management...

Posted by Rob Harag on Sun, Jun, 07, 2015 @ 06:13 PM

Enterprise Performance Management (EPM) systems are becoming more powerful, user friendly and more accessible - that is the core theme of an article published recently by CFO Magazine. Advances in technology allow EPM systems to process larger quantities of data on demand and thus increase their analytical capabilities. A wider choice of specialized products, and new features such as compatibility with Excel, predictive analytics, cloud computing and mobile access, are transforming the way users interact with the applications and thus help advance many traditional functions and business processes.

While IBM, SAP and Oracle command about 70% of the market, there are plenty of sophisticated systems to choose from in the remaining 30%. Reversing a prior trend of industry consolidation, more companies are entering the market and providing interesting capabilities that are often tailored to a specific market or industry segment. Examples of such focus areas include budgeting, forecasting, profitability optimization and scenario modeling.

Gartner categorizes EPM solutions into two main buckets: Office of Finance Processes and Strategy Processes. Most of today's solutions, however, aim to combine financial and operational data across the company and provide a fully integrated view of the company's performance. Such increased transparency and insight into underlying trends and drivers facilitates a higher focus on key factors affecting financial performance. This in turn helps to more effectively execute a company's business strategy.

There are three primary drivers that are currently changing the way people and companies interact with and deploy EPM solutions:

Speed – with the increased use of in-memory computing, systems are nimbler and more flexible, completing calculations in real time and handling the ever-increasing quantity of data. The improved performance facilitates a number of benefits such as usability, collaboration and integration. These capabilities are further changing the role of Finance, helping it become a more effective facilitator of change, a trusted advisor to the business and a driver of value.

Usability – there have been great advances in the user friendliness and usability of the tools as vendors continue to make significant investments in more intuitive and easier-to-understand user interfaces. Dashboarding, visualization, integration with Excel, web-based capabilities and templates are some of the key focus areas. Cloud computing plays an important part in increasing adoption through better flexibility of use. Finally, mobile computing facilitates more distributed data collection and reporting, with the ability to drill down directly on the mobile device, typically an iPad.

Integration – EPM systems are becoming increasingly integrated, with solutions expanding far beyond traditional financial planning and reporting. Operational planning, sales performance management and strategic planning are only a few examples of such new use cases. Integration of these systems and processes provides higher-quality output and facilitates collaboration across groups and departments, which yields more efficiency and improves performance. Predictive analytics, while still in its early stages, is further pushing the frontier on the type of insight and depth included in financial information involved in decision-making.

Topics: Business Forecasting, Performance Management, Financial Planning and Analysis

The Future of IBM Cognos TM1

Posted by Jim Wood on Wed, May, 13, 2015 @ 09:46 AM

IBM Cognos TM1 version 10 has now been around for a good while and has already gone through several minor and major updates. The move from version 10.1 to 10.2 itself was a significant change, even though it was within the same release stream. The key updates included the move from .Net to Java as the main platform and support for multi-threading queries.

Despite all these changes and updates, the underlying framework and structures have remained the same. Aside from the addition of CAFÉ, which delivered significant optimization of performance and flexibility for users over a wide area network, most interfaces have not changed for a number of years.

So the question has to be: What’s next for TM1?

IBM has been aware of some customer frustrations with the TM1 interfaces for quite some time. Performance Modeler and Application Web server were a step in the right direction, but they were aimed more at the Cognos Enterprise Planning market for ease of transition. Interfaces such as Architect, which are still heavily relied upon, haven’t seen a major update since the start of version 9.

It seems that IBM plans to address these issues in the next release through a tool called “Prism”, which is currently in Alpha phase. It is not entirely clear what the new interface will look like, as IBM is playing its cards close to its chest; however, rumor has it that it will be a significant upgrade and will close the gap to some of the other analytics and visualization platforms. What has been confirmed thus far is that Prism will replace Architect and Perspectives.

IBM provided an overview of Prism at their IBM Vision 2015 conference in Orlando in May 2015.

Is Prism going to be a massive leap forward for TM1?

Hopefully yes. Even though only a small amount of information is available, it is clear that IBM is making a heavy push to further enhance the system and is determined to strengthen its position in the market by committing resources. This enhancement to the front end of the tool would be a welcome and refreshing improvement that will go a long way in streamlining overall usability. Time will tell if Prism will live up to its promise, but the development seems to be moving in the right direction.

What do I need to do to make sure I’m aware of what Prism will bring?

With the demonstration at IBM Vision 2015 in May, chances are a lot more information will be available in the near term as to the scope and look / feel of the system. You can always contact your IBM account representative / business partner and ask to be included on communication as new information becomes available. For those that want to be really close to the updates, you can sign up as a Beta tester and gain a better understanding of what is coming and if / how it will impact your application and vision for the system.

Topics: TM1 Technology, IBM Cognos, IBM Cognos TM1, Performance Management

4 Reasons Why Financial Planning & Analysis Is Not Effective

Posted by Rob Harag on Fri, May, 01, 2015 @ 09:40 AM

Considering how important the Financial Planning and Analysis (FP&A) function is in providing critical business insight to company CFOs and CEOs, it is surprising how low its capabilities are still regarded amongst business executives. According to a recent study published by APQC, only 40% of the 130 executives surveyed described the FP&A function as effective. Given all the talk and focus in the industry on business analytics and insight, as well as the availability of numerous systems and solutions from niche players and mainstream vendors, this is a remarkably low rating. Based on our observation and experience, there are four primary factors that drive this outcome:

1. Continuously raising the bar on expectations

The demand for meaningful business insight and analysis has been growing continually over the past years. Economic uncertainty, regulatory scrutiny, increasing competition, changing weather patterns or geopolitical pressures keep creating new and more urgent demand for information that FP&A needs to satisfy. Even as the FP&A capabilities, processes and technology improve, the FP&A groups are under increasing pressure to respond to more and more ad-hoc requests from many different groups and stakeholders. As a result, they spend most of their time scrambling to respond to requests under rapid fire and often do not have the ability to step back and develop capabilities that would bring them to the next level. What was working well yesterday is not good enough today.

2. Playing catch-up with technology

Today’s finance and planning infrastructure is typically complex as a result of aggressive growth and multiple mergers and acquisitions. Oftentimes the reporting and analytics rely on information from multiple legacy or department-level systems. Excel typically plays a significant role in arriving at the end product. To further complicate matters, the demand for business-driver detail and modeling requires information residing in transactional systems outside finance. Even though companies continue to invest in and integrate the various systems, ongoing business evolution continues to fuel more changes at a rapid pace, thus creating a moving target. As a result, Finance spends most of its time pulling information together from various systems and compiling reports or explaining variances at a high level (vs. budget) rather than analyzing data and providing value-add insight.

3. Slowly adopting advanced planning techniques

The majority of companies still rely on a fairly static process for budgeting, forecasting and reporting. While some advanced techniques such as rolling forecasts or driver-based planning are slowly creeping in, they are not the prevailing methodology yet. It is not uncommon for a large organization to spend six months preparing the annual budget, only for the budget to be obsolete shortly after it has been completed. The forecast is oftentimes the result of a finance process (YTD Actuals plus Budget) and includes little real-time insight from the business. Reporting tends to be backward looking, focused on results to date and explaining variances to plan or forecast. As such, financial planning is not sufficiently flexible and aligned with business strategy, and is slow to adapt to changes in business conditions.

4. Advanced Analytics still in early stages

While predictive analytics, demand planning and other capabilities are all the buzz these days, they are at a very early stage of adoption. Even where they exist, they are typically not integrated with the overall financial performance management framework. They are usually performed by a dedicated group of people, many times statisticians or similar functions, in isolation and focused on a specific area / problem. The result is then used as an input into the planning process for further processing. Tremendous upside exists in integrating these forward-looking capabilities into financial planning from both a systems and a procedural perspective; however, it will take some time until this practice becomes mainstream.

Topics: Business Forecasting

Should I be Developing TM1 Applications Using Performance Modeler?

Posted by Jim Wood on Wed, Jun, 04, 2014 @ 12:44 PM

Before the advent of IBM Cognos TM1 10, any company or individual looking to complete development within TM1 used the tried-and-tested but dated development tool sets. When IBM introduced TM1 10 back in 2011, they introduced a whole new, web-based development tool set called Performance Modeler. Partnered with the rebranding of what was previously known as Contributor (now known as Application Server), this wasn’t just IBM bringing new options into play; it was the introduction of a whole new development and deployment methodology. This new methodology is based around using the Application Web portal as the front end for gathering and reporting on planning information, with Performance Modeler used to get you there.

With any new methodology there is always a level of resistance, and this was the case within the TM1 community. While the old tool set was not the best, it was very well known. We have seen recently, however, with IBM pursuing this methodology, that more and more people, especially those new to TM1, are following IBM’s new pathway. So with more people starting to use Performance Modeler, and with IBM continuing to develop it further, all TM1 users and developers need to start looking at it as a serious project implementation option.

So the question has to be: should you be looking to use Performance Modeler?

Performance Modeler is easy to use.

Performance Modeler has plenty of new wizards built in to make previously difficult tasks a lot easier. There is a built-in wizard (called guided import) for building cubes and dimensions, and even for importing data. Adding calculations has been made much easier. On top of making calculations easier to build, IBM has also added many new functions to Performance Modeler; a perfect example is the Lag function, which makes time-based balances easier to build and maintain.

Performance Modeler makes building Process Workflow easy.

Through its close partnership with Application Server, Performance Modeler makes building the parts required for a Workflow implementation easier. Within Performance Modeler you can create the hierarchy required for managing your workflow, and it also handles dimensions differently so that any cube built is optimized to work well within the workflow framework.

Performance Modeler offers you a range of new options.

Performance Modeler offers development options that were not possible using the old tool sets. For example, removing a dimension from a cube is now a simple drag-and-drop task. Previously this would have involved rebuilding the cube.

Performance Modeler is the way forward.

Performance Modeler is a key part of the IBM Cognos TM1 development road map. IBM sees this new development methodology as central to making TM1 an integral part of its FP&A strategy going forward. As it progresses, more features and improvements will be added.

Now that we’re past the early development stages, it’s an ideal time to take a look at Performance Modeler and see what benefits it can bring to you.

Topics: IBM Cognos TM1, Business Forecasting, Performance Modeler

Integrating Resource Management with Cognos Command Center

Posted by Peter Edwards on Tue, Jan, 28, 2014 @ 04:46 PM

For companies with large data stores, the benefits of performing analytics are impossible to deny: an IBM study revealed that enterprises which conduct analytics have up to 1.6x the revenue growth, 2x the EBITDA growth and 2.5x the stock price appreciation of those that don’t. [1] However, as data volumes continue to escalate, the cost of providing analytics climbs as well. 90% of companies with 1,000 or more employees access several distinct data sources to complete their accounting close, and 43% use four or more. In the finance industry, professionals spend only 22% of their time performing actual analysis – the rest is spent waiting for data or verifying its authenticity. [2]

Cognos Command Center

Because large datasets lead to longer downtimes for maintenance, most IT departments build automation to manage systems. These custom-built systems tend to be fragile, require updates when source systems are updated or added, and rely on the knowledge of a few key members of the team. Issues that do arise require substantial involvement of IT personnel, and if critical members of the team are absent, they can cause massive delays.

IBM Cognos Command Center is a new product offered by IBM that facilitates database management by consolidating all of your resources into one dashboard. Cognos Command Center is designed to be usable by business professionals without extensive technical knowledge, and provides enterprises with a robust alternative to custom data warehouse management.

Why should you use IBM Cognos Command Center? Here are a few reasons:

Command Center is Intuitive

Cognos Command Center comes packed with plugins so you can snap-in functions that would normally require custom coding and maintenance from IT. Perform file operations, secure data transfers, and native database functions without writing one line of code. Save money on IT resources, and better utilize your existing employees by allowing them to spend less time on routine tasks. Command Center takes the guesswork out of process failures by alerting you immediately by email, and translating commonly encountered errors into plain English. IT can even add procedures to deal with recoverable errors automatically and silently.

Command Center is Secure

Manage your servers from anywhere in the world via the web or your mobile device. Many complex systems involve the use of stored login credentials, but most companies do a poor job of securing this data, representing a major security risk from both internal and external sources and accounting for 13% of all breaches according to Verizon’s 2013 Data Breach Investigations Report. [3] Cognos Command Center encrypts sensitive data, and users can run processes that utilize stored login credentials without ever being able to read them. In the event of failure or intrusion, any change, no matter how small, can be traced to a single user.

Command Center is Flexible

IBM Cognos Command Center uses global variables to aid in synchronization of data between servers. Manage your workflow centrally, and update parameters on every server you own with a single click. Never lose hours of uptime to rolling back data because of a forgotten parameter again! Command Center integrates workflows for heterogeneous environments into one management system, and its plugin-based system means functionality can quickly be added for new storage systems as they are introduced.

Command Center is Supported

IBM Cognos Command Center is the newest member of the IBM Cognos Business Intelligence family. IBM was ranked a clear “leader” in Gartner’s 2013 Business Intelligence & Analytics Magic Quadrant. The Cognos platform’s ability to serve as a complete BI solution has been bolstered by IBM’s recent high profile acquisitions. IBM Cognos received high marks from clients regarding road map and future vision, product quality, and ability to integrate with information infrastructure. [4] The addition of Command Center to the Cognos family represents IBM’s ongoing commitment to deliver the most full-featured business intelligence platform available.

Topics: IBM Cognos TM1, IBM Cognos Command Center, Star Analytics

How Do TM1 Financial Systems Ensure Clean Data?

Posted by Peter Edwards on Wed, Jan, 08, 2014 @ 03:44 PM

Summary

Almost half of new business initiatives fail because of poor data, according to a recent article on the EPM Channel website. While manual processes can temporarily fix issues, IBM TM1 technology uses a series of checks and balances to ensure that businesses can build a database on clean data, saving employees time and avoiding costly mistakes.

How Do TM1 Financial Systems Ensure Clean Data?

Almost half of new business initiatives fail because of poor data, according to a recent article on the EPM Channel website. This trend fuels mistakes in everything from bill payments to shipping, and often means that your best employees spend more time organizing data instead of analyzing it.

This is a long-standing pain that many businesses incur with dirty data. With Big Data, businesses have large amounts of data, so the quality of that data comes into question. Organizations are often looking for ways to improve their data because they have too many manual processes in place or their data is being cleaned in a non-automated fashion. If your data is clean and organized, you’ll streamline processes and align the organization on one set of numbers. As the EPM Channel article points out, it’s best to do this at the start, saving employee resources and avoiding costly problems.

There are systems that can help. TM1 financial solutions enforce what’s called referential integrity in data. Many times, when pulling data to put into TM1, there are problems in supporting systems that must be fixed. For instance, if you’re pulling data from a financial system that’s doing consolidations, there could be numbers at an account level that do not necessarily total to the parent account that they should roll up into. That’s because the system allows users to store two different numbers. This means there are a number of child accounts that fail to add up to the appropriate total because the database doesn’t enforce referential integrity.

Most relational databases and ERP applications don’t necessarily enforce these rules. They try to put business rules in place, but fundamentally the technology allows users to enter data in different places, creating conflicting information. When pulling data into the TM1 financial systems software, the detail-level account data rolls up into the parent account, creating that figure. This method ensures that the parent figure isn’t entered separately and avoids any situation where the child accounts don’t add up to the parent number. Therefore, users who leverage TM1 as their database are much more likely to have quality, clean data.
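The roll-up behavior can be illustrated with a short sketch (the account names and values here are made up): the parent figure is always derived from its children, never stored separately, so the two can never disagree.

```python
# Minimal model of referential integrity via consolidation: parent
# account values are computed from child accounts on demand, never
# stored alongside them. Account tree and values are illustrative.

children = {"Total Revenue": ["Product Revenue", "Service Revenue"]}
leaf_values = {"Product Revenue": 750.0, "Service Revenue": 250.0}

def consolidated(account):
    """Value of an account: stored at the leaf, derived at the parent."""
    if account in children:
        # Parents have no stored number of their own to get out of sync.
        return sum(consolidated(c) for c in children[account])
    return leaf_values[account]

print(consolidated("Total Revenue"))  # 1000.0 -- always the sum of its children
```

In a relational system that stores both the parent and child figures, nothing stops them from drifting apart; deriving the parent, as above, makes the mismatch impossible by construction.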

To learn more about TM1 financial solutions or to see a TM1 demo, contact ACG.

Contact ACG

Topics: TM1 Technology, IBM Cognos TM1, Business Forecasting, Clean Data, Performance Management

Seven Ways IBM Cognos TM1 Aids Forecasting

Posted by Peter Edwards on Fri, Dec, 06, 2013 @ 04:53 PM

Summary

Does your company suffer from common forecasting problems? In a white paper on the subject, the IBM Beyond Budgeting Round Table illustrates how the IBM Cognos TM1 business intelligence solution can help finance managers identify common symptoms and implement solutions that lead to a healthier organization.

Seven Ways IBM Cognos TM1 Aids Forecasting

Whether companies realize it or not, they likely suffer from at least one symptom of forecasting illness. Misconceptions and antiquated forecasting processes lead to decreased accuracy, quality and profitability for businesses. To successfully treat these forecasting illnesses, finance managers must identify common symptoms and implement solutions that lead to a healthier organization.

In a white paper titled “Seven Symptoms of Forecasting Illness” the IBM Beyond Budgeting Round Table illustrates how the IBM Cognos TM1 business intelligence solution can help identify and cure these pervasive ailments.

1) Semantic confusion: A company might show signs of semantic confusion if the organization finds it difficult to deal with unexpected or undesirable forecasts. This symptom can manifest as a blurred line between the forecast and the company’s goal.

Cure: To address this symptom, companies can use the IBM Cognos TM1 business intelligence solution to allow managers to create, maintain, and reference multiple forecast scenarios easily and efficiently.

2)  Visual impairment: A company may suffer from visual impairment if it is obsessed with the year-end forecast numbers, or if it is surprised by new developments at the beginning of the fiscal year.

Cure: Companies should focus on building a comprehensive company forecast strategy. BP is a great example of the power of combined forecasting. BP brought all of the company’s stakeholders together in a large auditorium and efficiently completed a comprehensive company forecast.

In some companies, salespeople are either pushing their numbers to count toward next year’s sales goals or holding their numbers to this year to benefit themselves under their compensation plans. You really need to separate compensation from the forecasting process to take this manipulation out of the process and get truer information.

3)  Systemic overload: Systemic overload is characterized by an increasing need to add greater detail and additional analysis in the misguided view that more data is always better in the forecasting process.

Cure: Don’t get stuck in the fine details. With IBM’s TM1 financial planning solution, companies can quickly build scenarios that reflect potential realities while achieving a balance between initial data input and subsequent analysis.

For best practices, you should only be planning the primary financial line items in your business. You should identify 20-50 of these line items that are most important in your business. If you go into too much detail in your budget, you are going to spend all your time configuring the budget and not enough time doing the analysis of it afterward.

4)  Prosperity syndrome: If a company’s forecast reflects optimistic growth regardless of negative industry trends or current economic conditions, it may be suffering from prosperity syndrome.

Cure: To alleviate this symptom, companies should consider current market conditions, examine expected competitor strategies and rely on accurate data provided by TM1 financial systems.

5)  Lack of coordination: If an organization has multiple sources, or silos, of data, it can’t create reliable, consistent forecasts.

Cure: In order to have a good forecasting process, you also need good data organization in the company. This can be achieved by adopting a single forecasting system throughout the entire organization and feeding data into the forecasting system from other enterprise systems like the General Ledger or the Human Resource Management systems.
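As a rough sketch of that feed-in pattern (the field names and records below are invented; a real TM1 integration would typically use TurboIntegrator processes rather than hand-written code), normalizing two source feeds into one actuals table might look like:

```python
# Hedged sketch of the "single forecasting system" idea: map feeds from
# separate source systems (General Ledger, HR) onto one common shape that
# the forecasting model reads. Field names and records are invented.

gl_feed = [  # as exported from the General Ledger
    {"acct": "6000", "desc": "Rent", "period": "2013-10", "amt": 75_000},
]
hr_feed = [  # as exported from the HR system
    {"emp_id": "E17", "month": "2013-10", "salary": 8_500},
]

def load_actuals(gl_rows, hr_rows):
    """Map both feeds onto one (account, period, amount) shape."""
    actuals = []
    for r in gl_rows:
        actuals.append({"account": r["desc"], "period": r["period"], "amount": r["amt"]})
    for r in hr_rows:
        actuals.append({"account": "Salaries", "period": r["month"], "amount": r["salary"]})
    return actuals

for row in load_actuals(gl_feed, hr_feed):
    print(row["account"], row["period"], row["amount"])
```

Because every downstream forecast reads from the one normalized table, the silo problem disappears: there is a single version of the actuals for everyone to plan against.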

6) Asocial behavior: If an organization routinely manipulates or alters its forecasts, even if doing so is not in the best interest of the company, it may suffer from asocial behavior.

Cure: Companies should reward managers and employees based on the value they provide, rather than rewarding “sandbaggers” for abusing the system. This can be achieved by eliminating links between incentive compensation and forecasts.

7)  Lack of executive sponsorship: If individual departments are attempting to improve the forecasting process, but others are split over the importance of the task, an organization may suffer from a lack of executive sponsorship.

Cure: For a company to be successful in its financial planning and reporting efforts, the push has to come from the top down. These processes touch multiple parts of the organization, so they must be designated a key strategic initiative of the entire company.

All companies suffer from at least one type of forecasting illness. These factors are a reminder that every business needs to constantly examine its core processes and culture to ensure it is using the most accurate forecasting processes and tools. By implementing a data analysis tool like the IBM Cognos TM1 business intelligence solution, organizations can achieve a high level of efficiency and promote healthier forecasting.

Free Whitepaper: 7 Symptoms of Forecasting Illness

Topics: Data Analysis Tool, Business Intelligence, IBM Cognos TM1

4 Ways IBM Cognos TM1 BI Solution Can Cure Forecasting Problems

Posted by Peter Edwards on Mon, Nov, 18, 2013 @ 03:45 PM

Summary
Does your company suffer from common forecasting problems? In a white paper on the subject, the IBM Beyond Budgeting Round Table illustrates how the IBM Cognos TM1 business intelligence solution can help finance managers identify common symptoms and implement solutions that lead to a healthier organization.

Whether companies realize it or not, they likely suffer from at least one symptom of forecasting illness. Misconceptions and antiquated forecasting processes lead to decreased accuracy, quality and profitability for businesses. To successfully treat these forecasting illnesses, finance managers must identify common symptoms and implement solutions that lead to a healthier organization.

In a white paper titled “Seven Symptoms of Forecasting Illness,” the IBM Beyond Budgeting Round Table illustrates how the IBM Cognos TM1 business intelligence solution can help identify and cure these pervasive ailments. Here are three of the top forecasting issues, plus a bonus issue that the ACGI experts often see.

1) Semantic confusion: A company might show signs of semantic confusion if the organization finds it difficult to deal with unexpected or undesirable forecasts. This symptom can manifest as a blurred line between the forecast and the company’s goal.
Cure: To address this symptom, companies can use the IBM Cognos TM1 business intelligence solution to allow managers to create, maintain, and reference multiple forecast scenarios easily and efficiently.

2)  Visual impairment: A company may suffer from visual impairment if it is obsessed with the year-end forecast numbers, or if it is surprised by new developments at the beginning of the fiscal year.
Cure: Companies should focus on building a comprehensive, company-wide forecast strategy. BP is a great example of the power of combined forecasting: the company brought all of its stakeholders together in a large auditorium and efficiently completed a single, comprehensive forecast.

In some companies, salespeople either push sales into next year so they count toward next year's goals, or pull them into this year to benefit under their compensation plans. You really need to separate compensation from the forecasting process to take this manipulation out and get truer information.

3) Lack of coordination: If an organization has multiple sources, or silos, of data, it can’t create reliable, consistent forecasts.
Cure: In order to have a good forecasting process, you also need good data organization in the company. This can be achieved by adopting a single forecasting system throughout the entire organization and feeding data into the forecasting system from other enterprise systems like the General Ledger or the Human Resource Management systems.

4)  Lack of executive sponsorship: If individual departments are attempting to improve the forecasting process, but others are split over the importance of the task, an organization may suffer from a lack of executive sponsorship.
Cure: For a company to be successful in its financial planning and reporting efforts, the push has to come from the top down. These processes touch multiple parts of the organization, so they must be designated a key strategic initiative of the entire company.

All companies suffer from at least one type of forecasting illness. These factors are a reminder that every business needs to constantly examine its core processes and culture to ensure it is using the most accurate forecasting processes and tools. By implementing a data analysis tool like the IBM Cognos TM1 business intelligence solution, organizations can achieve a high level of efficiency and promote healthier forecasting.

To learn more about these and other key forecasting issues companies are facing, download the white paper, “Seven Symptoms of Forecasting Illness.”


Free Whitepaper: 7 Symptoms of Forecasting Illness

Topics: IBM Cognos TM1, Business Forecasting, Performance Management

Contact ACGI

Tel: (973) 898-0012

Follow ACG