ACG Business Analytics Blog

Working with Virtual Dimensions in IBM Planning Analytics 2.0

Posted by Theo Chen on Mon, Apr 23, 2018 @ 02:01 PM

The ability to create virtual dimensions from attribute-based hierarchies is a key update in IBM Planning Analytics 2.0 and a huge source of value. It is a major differentiator for IBM PA compared to similar platforms, giving users even greater flexibility to analyze data on the fly using user-defined parameters. With virtual hierarchies, companies can design solutions that are more efficient yet still flexible enough to include attributes for thorough analysis.

What are Virtual Dimensions?

Virtual dimensions are dimensions created on demand in an existing IBM Planning Analytics system by selected end users (system administrators). They are not part of the core system structure / design; rather, they are created ad hoc as needed, based on attributes of existing dimension elements. Attributes can be created whenever needed while the system is in use, and there is no practical limit to how many attributes can be defined for any single member.

Once a virtual dimension is created, it can be used just like any other dimension: it can be selected for reporting, brought into a cross-tab view for analysis, or used as a target for entering plan or forecast data.

What Virtual Dimensions are NOT

Virtual dimensions in IBM PA are NOT the same as alternate rollups / hierarchies or sorting / filtering by attributes. We find that the concept is a bit hard to grasp initially, and people default to the capabilities they already know. Unlike filtering and alternate hierarchies, virtual dimensions go deeper and provide a more thorough way to analyze the data.

Example:

We have a sales reporting application with Customer and Business Unit as the two key dimensions. Other dimensions include Account, Time, etc. The Customer dimension is organized by Industry – Customers roll up to Sectors, Sectors roll up to Industries, and Industries roll up to Total Customer.

Let’s say I want to understand my sales by size of company (Enterprise vs. Mid-Market vs. Small and Medium Enterprise, or SME) in addition to the industry rollup. I do not have “Company Size” defined as a dimension, so prior to this capability I would have the following options:

  • Redesign the cube to include “Size” as a dimension – depending on the size of the overall model that could be a significant undertaking and could take a lot of time
  • Build an alternate rollup of Customer by size – in this case I would have to choose either the Industry or the Size rollup for reporting and analysis; I could not use both and would still not get the desired cross-view
  • Embed “Size” inside the existing “Industry” rollup – this would be extremely inefficient, as I would have to repeat the Size rollups within each Industry / Segment and deal with conflicting element values; it is undesirable from a maintenance perspective, and I would still not get the same flexibility for analysis

Enter “Virtual Hierarchies” – with IBM Planning Analytics 2.0, I can simply create a new “Attribute” and label each Customer as an Enterprise, Mid-Market firm or SME. Once that is done, I turn this Attribute into a Virtual Hierarchy and it shows up in my “Set” menu. From there, I can simply drag the “Size” hierarchy into the cross-tab view and see the breakdown of customers by size and industry in a simple cross-tab. I am able to drill down or pivot to analyze the data, and even input into the virtual intersections to post adjustments or forecast sales. And of course element security still works at the leaf level. Extremely powerful…
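For administrators who prefer to script this step rather than use the interface, the same structure can be built in a TurboIntegrator process. The sketch below is illustrative only: the customer names are hypothetical and a standard Customer dimension is assumed. The hierarchy functions shown (HierarchyCreate, HierarchyElementInsert, HierarchyElementComponentAdd) are part of the Planning Analytics 2.0 TurboIntegrator function set.

    # Illustrative sketch only – element names are hypothetical
    # Create the Size attribute and tag two existing customers
    AttrInsert('Customer', '', 'Size', 'S');
    AttrPutS('Enterprise', 'Customer', 'Acme Corp', 'Size');
    AttrPutS('SME', 'Customer', 'Widget Co', 'Size');

    # Create the Size hierarchy inside the Customer dimension
    HierarchyCreate('Customer', 'Size');

    # Add a consolidation per size band and roll the existing leaf
    # customers up underneath (leaves are shared across hierarchies)
    HierarchyElementInsert('Customer', 'Size', '', 'Enterprise Customers', 'C');
    HierarchyElementInsert('Customer', 'Size', '', 'SME Customers', 'C');
    HierarchyElementComponentAdd('Customer', 'Size', 'Enterprise Customers', 'Acme Corp', 1);
    HierarchyElementComponentAdd('Customer', 'Size', 'SME Customers', 'Widget Co', 1);

In practice this logic would loop over the Customer dimension and read each element’s Size attribute rather than hard-coding names.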

Key Benefits

The key benefits of this capability include the following:

  • Memory (RAM) savings, as fewer core dimensions are required and the core model is therefore smaller
  • Better overall performance and usability of the model with less clutter / complexity
  • Much greater flexibility to adjust existing models and get deeper insight
  • Ability to conform to model standards while keeping any customization local

What to Look Out For

Some points to be aware of when working with virtual hierarchies: they can currently only be built with two levels through the interface – TurboIntegrator scripting is required for deeper hierarchies with more levels. Also, elements in a virtual hierarchy are not elements of the main hierarchy itself – each element in the virtual hierarchy has to be unique from every element in the dimension. Finally, there is no security (yet) for virtual elements, although that should be addressed soon.
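On the first point, a minimal sketch of how TurboIntegrator can nest consolidations to go beyond two levels, continuing the hypothetical Size hierarchy from the example above:

    # Hypothetical three-level rollup: Total by Size -> size band -> customer
    HierarchyElementInsert('Customer', 'Size', '', 'Total by Size', 'C');
    HierarchyElementComponentAdd('Customer', 'Size', 'Total by Size', 'Enterprise Customers', 1);
    HierarchyElementComponentAdd('Customer', 'Size', 'Total by Size', 'SME Customers', 1);

Each HierarchyElementComponentAdd call adds one parent-child edge, so a rollup of any depth can be built by chaining consolidations.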

The impact of Virtual Hierarchies will be different for existing vs net new applications. For existing models, it will add flexibility through the ability to expand structures without going through a potentially substantial application redesign. For new solutions, it creates an opportunity to design models with more simplicity and rely on Virtual Dimensions to provide scalability and flexibility in the future.

Give us a call to discuss how this great new capability can add value to your platform and review options to provide more insight and analytical power.

Topics: TM1 Technology, IBM Cognos TM1, Performance Management, Financial Planning and Analysis, IBM Planning Analytics

IBM Cognos Controller 10.3.1 Released

Posted by Rob Harag on Sun, Dec 03, 2017 @ 10:35 AM

The latest version of IBM Cognos Controller is now available and features integration with IBM Planning Analytics Workspace. This provides tighter integration, opens up possibilities for better insight and analytics, and puts more power in the hands of users.

IBM Planning Analytics is a powerful interface that facilitates reporting, visualization, analytics and planning / forecasting. With better integration to Controller, it can further serve as a common interface for both the Record-to-Report and the Report-Analyze-Model-Plan cycles.

A high-level overview of the integration capabilities can be viewed at the link below:

https://ibm.box.com/s/nscuagvqlfo37m5sn3ckm7hb6lo2y75f

IBM Cognos Controller is a robust application to support Financial Consolidation and Close. It is a finance-owned solution, managed and maintained by Finance, that offers out-of-the-box consolidation capabilities without the need for any programming or scripting. With its pre-built capabilities to manage the close process, including workflow, review and reporting, it vastly simplifies and streamlines the month-end process and provides a robust foundation that improves control and compliance.

IBM Cognos Controller is available with equal functionality for on-premise as well as cloud-based deployment. It offers over 260 pre-built and packaged reports to facilitate ease of adoption and maximize ROI.

Some additional capabilities of the Controller 10.3.1 release include the following:

  • Improved data import in Controller Web
  • More efficient remediation of import errors in real time during the import
  • Drill from data entry forms to the source level detail in the import file
  • A number of other popular customer enhancement requests
  • Limited-use license for Planning Analytics Local & Workspace
  • Support for Click-to-Run

The publicly accessible Knowledge Center site is https://www.ibm.com/support/knowledgecenter/SS9S6B_10.3.1


Topics: IBM Cognos Controller, Financial Consolidation and Close

IMPORTANT: Your TM1 System May Stop Working on Nov 25, 2016

Posted by Kevin Nichols on Sun, Sep 11, 2016 @ 09:43 AM

Important Notice for All IBM Cognos TM1 Users:
 
There is a possibility that your TM1 system may stop working on November 25, 2016.
The existing TM1 security certificates used in the Secure Sockets Layer (SSL) communications between TM1 clients and servers are scheduled to expire on November 24, 2016. If your system is affected, your users will not be able to connect to your TM1 Server and use TM1 Web, TM1 Perspectives, etc. on November 25, 2016 unless you take appropriate action.
 
Is Your System Impacted?
If you are running TM1 version 10.x, your system is most likely impacted, as the SSL settings are turned ON by default in the 10.x versions. If you are running an older version of TM1, we recommend performing a system review to ensure continuity of service.
 
What Should You Do?
IBM has created a new certificate that will replace the soon-to-expire one and provides even stronger encryption (2048-bit vs. 1024-bit). This new certificate has an expiration date of August 25, 2022 and needs to be installed in place of the existing one. The details for converting your TM1 environment to the new SSL certificate can be found in the following IBM Tech Note:
 
 
Please contact ACG at support@acgi.com if you have questions, concerns or would like to discuss. We are happy to help review your TM1 system, confirm all updates are made properly and ensure TM1 system availability.

Object Security Integration with IBM Cognos TM1

Posted by James Doren on Sat, Jun 27, 2015 @ 10:40 AM

One of the key security-related selling points within IBM Cognos TM1 is the tool’s ability to assign different security levels to different objects within the TM1 model for any given user or group.  Properly handling object security interaction within a TM1 model can provide an unparalleled level of control over the system, user base, and data.  As we have seen at many clients, it improves the reliability of your data by ensuring that only users qualified to change data at a given intersection point do so, and it allows for unique configurations by personalizing the security rights for each individual group. This added flexibility helps users get the most out of their TM1 model, so understanding exactly how it works is very important.

Security rights can be set up to gain the most control possible over what users can and cannot see, add, delete, or change.  There are four different levels of object security that pertain to a user’s ability to manipulate data:

  • Cube Security
  • Dimension Security
  • Element Security
  • Cell Level Security

For the examples following, we will assume a single user is assigned to a single group.  This way, when we talk about the ‘User’, we understand that this user is part of a TM1 security group.  Let’s look at these different levels of object security in the simplest form.  Say for example, a user is assigned READ access to a cube called ‘Planning Cube A’. If this is the sole security assignment, then the user will have READ access on all intersections of the cube. If a user is assigned READ access to a dimension called ‘Time’, then the user will have read access to all intersections involving the ‘Time’ dimension.  Now let’s say that the only security assigned to a user is an Element level security on the ‘Time’ dimension.  For example, the user is assigned READ access for the element ‘2015’ within the ‘Time’ dimension, which gives the user READ access ONLY to the element ‘2015’.  The same concept applies to cell-level security. 
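These assignments are typically made through the security menus, but they ultimately live in TM1 control cubes, so they can also be scripted. A minimal TurboIntegrator sketch, assuming a hypothetical group called ‘Planners’ and that the element security control cube for ‘Time’ has already been created:

    # Illustrative sketch – group name is hypothetical
    # Cube security: READ on 'Planning Cube A' for group 'Planners'
    CellPutS('READ', '}CubeSecurity', 'Planning Cube A', 'Planners');

    # Element security: READ on element '2015' of the 'Time' dimension
    # (requires the }ElementSecurity_Time control cube to exist)
    CellPutS('READ', '}ElementSecurity_Time', '2015', 'Planners');

    # Apply the changes without restarting the server
    SecurityRefresh;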

Simple enough, right?  Well, the tricky part lies within the way the security settings for these four different objects interact with each other when multiple levels of object security are assigned.  Let us look at the example below:

You have a cube called ‘Planning Cube A’ with the following dimensions:

  1. Version
  2. Territory
  3. Account
  4. Year
  5. Measure

You assign a user READ access to the ‘Planning Cube A’ cube, but you also assign the same user WRITE access to all of the elements in this cube.

What do you think would happen in this example? In this case, the READ access of the cube will override the WRITE access of the elements, allowing the user to view the data within the cube but not allowing them to update any of the data. 

Let us look at another example, assuming we are talking about the same cube with the same dimensions as the last example: 

You assign a user WRITE access to the ‘Planning Cube A’ cube, WRITE access to all of the elements within all of the dimensions EXCEPT for the ‘Version’ dimension where you assign the user READ access to all of the elements within the ‘Version’ dimension. 

In this case, the elements in the ‘Version’ dimension identify every intersection in the cube, and since the user is assigned READ access to all elements in this dimension, the user is unable to update data within the cube. 

As stated by IBM, “When groups have security rights for a cube, those rights apply to all dimensions in the cube, unless you further restrict access for specific dimensions or elements”.

Why this type of interaction is useful

This type of object security interaction is essential for having full control over your users and groups.  By being able to set security access at different object levels, you can pinpoint and assign the security settings for a given user exactly how you want them to be.  Imagine this scenario, referring to the same cube / dimensions as the prior examples:

You wish to have all groups be able to view and read cube data for all territories specified in the ‘Territory’ dimension.  You also want each group to be able to update cube data ONLY for their own territory and ONLY for ‘2015’. 

By properly manipulating the different levels of object security, and utilizing the way they interact with each other, this can be accomplished quite easily with no coding and no additions to the security model. 

To achieve this, we would take the following steps (a scripted sketch follows the list):

  1. Assign each Territory group WRITE access to the ‘Planning Cube A’ cube
  2. Assign each Territory group READ access to all territories (element level security) within the ‘Territory’ dimension that do not apply to that group
    • For example, assign the ‘North’ territory group READ access to all elements except for ‘North’ in the Territory dimension
  3. Assign each group READ access to all elements except for the ‘2015’ element in the year dimension (element level security)
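A minimal TurboIntegrator sketch of these three steps for a single group, assuming the security groups are named after the territory elements (‘North’, ‘South’, etc.) and the element security control cubes already exist:

    # Illustrative sketch – group and element names are assumptions
    sGroup = 'North';

    # Step 1: WRITE access on the cube
    CellPutS('WRITE', '}CubeSecurity', 'Planning Cube A', sGroup);

    # Step 2: READ on every territory element except the group's own
    i = 1;
    WHILE(i <= DIMSIZ('Territory'));
      sTerr = DIMNM('Territory', i);
      IF(sTerr @<> sGroup);
        CellPutS('READ', '}ElementSecurity_Territory', sTerr, sGroup);
      ENDIF;
      i = i + 1;
    END;

    # Step 3: READ on every year except '2015'
    i = 1;
    WHILE(i <= DIMSIZ('Year'));
      sYear = DIMNM('Year', i);
      IF(sYear @<> '2015');
        CellPutS('READ', '}ElementSecurity_Year', sYear, sGroup);
      ENDIF;
      i = i + 1;
    END;

    SecurityRefresh;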

With this security scheme set up, users will be able to view all cube data, but will only be able to update the data for their own territory, and only for ‘2015’.

While these are just a few examples of how the interaction between object security works, there are many great applications that can come out of fully understanding how the relationships work.  In summary, if a cube has READ access specified for a group, then even if the elements or cells are specified as WRITE access, users in the group will not be able to update any cube data.  However, if a cube is given WRITE access for a specified group, all of the dimensions by default take the access of the cube security assignments.  It is in this way, a sort of top-down approach, that specific intersections and elements can be restricted for a particular group. 

Topics: TM1 Technology, IBM Cognos, IBM Cognos TM1

Trends in Enterprise Performance Management...

Posted by Rob Harag on Sun, Jun 07, 2015 @ 06:13 PM

Enterprise Performance Management (EPM) systems are becoming more powerful, user friendly and more accessible - that is the core theme of an article published recently by CFO Magazine. Advances in technology allow EPM systems to process larger quantities of data on demand and thus increase their analytical capabilities. A wider choice of specialized products and new features such as Excel compatibility, predictive analytics, cloud computing and mobile access are transforming the way users interact with the applications, helping to advance many traditional functions and business processes.

While IBM, SAP and Oracle command about 70% of the market, there are plenty of sophisticated systems to choose from in the remaining 30%. Reversing a prior trend of industry consolidation, more companies are entering the market, providing interesting capabilities that are often tailored to a specific market or industry segment. Examples of such focus areas include budgeting, forecasting, profitability optimization and scenario modeling.

Gartner categorizes EPM solutions into two main buckets: Office of Finance Processes and Strategy Processes. Most of today's solutions, however, aim to combine financial and operational data across the company and provide a fully integrated view of the company's performance. Such increased transparency and insight into underlying trends and drivers facilitates a sharper focus on the key factors affecting financial performance, which in turn helps companies execute their business strategy more effectively.

There are three primary drivers that are currently changing the way people and companies interact with and deploy EPM solutions:

Speed – with the increased use of in-memory computing, systems are nimble and flexible enough to complete calculations in real time and can handle ever-increasing quantities of data. The improved performance yields a number of benefits in usability, collaboration, integration and more. These capabilities are changing the role of Finance, helping it become a more effective facilitator of change, a trusted advisor to the business and a driver of value.

Usability – there have been great advances in the user friendliness and usability of the tools, as vendors continue to make significant investments in more intuitive and easier-to-understand user interfaces. Dashboarding, visualization, integration with Excel, web-based capabilities and templates are some of the key focus areas. Cloud computing plays an important part in increasing adoption through better flexibility of use. Finally, mobile computing facilitates more distributed data collection and reporting, with the ability to drill down directly on the mobile device, typically an iPad.

Integration – EPM systems are becoming increasingly integrated, with solutions expanding far beyond traditional financial planning and reporting. Operational planning, sales performance management and strategic planning are only a few examples of such new use cases. Integrating these systems and processes produces higher-quality output and facilitates collaboration across groups and departments, which yields more efficiency and improves performance. Predictive analytics, while still in its early stages, is further pushing the frontier on the type of insight and depth included in the financial information involved in decision-making.

Topics: Business Forecasting, Performance Management, Financial Planning and Analysis

The Future of IBM Cognos TM1

Posted by Jim Wood on Wed, May 13, 2015 @ 09:46 AM

IBM Cognos TM1 version 10 has now been around for a good while and has already gone through several minor and major updates. The move from version 10.1 to 10.2 itself was a significant change, even though it was within the same release stream. The key updates included the move from .Net to Java as the main platform and support for multi-threading queries.

Despite all these changes and updates, the underlying framework and structures have remained the same. Even with the addition of CAFE, which delivered significant performance optimization and flexibility for users over a wide area network, most interfaces have not changed in a number of years.

So the question has to be: What’s next for TM1?

IBM has been aware of customer frustrations with the TM1 interfaces for quite some time. Performance Modeler and the Application Web server were a step in the right direction, but they were aimed more at the Cognos Enterprise Planning market for ease of transition. Interfaces such as Architect, which are still heavily relied upon, haven’t seen a major update since the start of version 9.

It seems that IBM plans to address these issues in the next release through a tool called “Prism”, which is currently in the Alpha phase. It is not entirely clear what the new interface will look like, as IBM is playing its cards close to its chest; however, rumor has it that it will be a significant upgrade and will close the gap to some of the other analytics and visualization platforms. What has been confirmed thus far is that Prism will replace Architect and Perspectives.

IBM provided an overview of Prism at their IBM Vision 2015 conference in Orlando in May 2015.

Is Prism going to be a massive leap forward for TM1?

Hopefully yes. Even though only a small amount of information is available, it is clear that IBM is making a heavy push to further enhance the system and is determined to strengthen its position on the market by committing resources. This enhancement to the front-end of the tool would be a welcome and refreshing improvement that will go a long way in streamlining the overall usability. Time will tell if Prism will live up to its promise but the development seems to be moving in the right direction.

What do I need to do to make sure I’m aware of what Prism will bring?

With the demonstration at IBM Vision 2015 in May, chances are a lot more information will be available in the near term about the scope and look / feel of the system. You can always contact your IBM account representative / business partner and ask to be included on communications as new information becomes available. For those who want to stay really close to the updates, you can sign up as a Beta tester and gain a better understanding of what is coming and if / how it will impact your application and vision for the system.

Topics: TM1 Technology, IBM Cognos, IBM Cognos TM1, Performance Management

4 Reasons Why Financial Planning & Analysis Is Not Effective

Posted by Rob Harag on Fri, May 01, 2015 @ 09:40 AM

Considering how important the Financial Planning and Analysis (FP&A) function is in providing critical business insight to company CFOs and CEOs, it is surprising how low its capabilities are still regarded amongst business executives. According to a recent study published by APQC, only 40% of 130 executives surveyed described the FP&A function as effective. Given all the talk and focus in the industry on business analytics and insight, as well as the availability of numerous systems and solutions from niche players and mainstream vendors, this is a remarkably poor rating. Based on our observation and experience, four primary factors drive this outcome:

1. Continuously raising the bar on expectations

The demand for meaningful business insight and analysis has been growing continually over the past years. Economic uncertainty, regulatory scrutiny, increasing competition, changing weather patterns or geopolitical pressures keep creating new and more urgent demand for information that FP&A needs to satisfy. Even as the FP&A capabilities, processes and technology improve, the FP&A groups are under increasing pressure to respond to more and more ad-hoc requests from many different groups and stakeholders. As a result, they spend most of their time scrambling to respond to requests under rapid fire and often do not have the ability to step back and develop capabilities that would bring them to the next level. What was working well yesterday is not good enough today.

2. Playing catch-up with technology

Today’s finance and planning infrastructure is typically complex as a result of aggressive growth and multiple mergers and acquisitions. Oftentimes the reporting and analytics rely on information from multiple legacy or department-level systems, and Excel typically plays a significant role in arriving at the end product. To further complicate matters, the demand for business-driver detail and modeling requires information residing in transactional systems outside finance. Even though companies continue to invest in integrating the various systems, ongoing business evolution continues to fuel changes at a rapid pace, creating a moving target. As a result, Finance spends most of its time pulling information together from various systems, compiling reports and explaining variances at a high level (vs. budget) rather than analyzing data and providing value-add insight.

3. Slowly adopting advanced planning techniques

The majority of companies still rely on a fairly static process for budgeting, forecasting and reporting. While advanced techniques such as rolling forecasts or driver-based planning are slowly creeping in, they are not the prevailing methodology yet. It is not uncommon for a large organization to spend six months preparing the annual budget, only for the budget to be obsolete shortly after it has been completed. The forecast is oftentimes the result of a pure finance process (YTD Actuals plus Budget) and includes little real-time insight from the business. Reporting tends to be backward looking, focused on results to date and explaining variances to plan or forecast. As such, financial planning is not sufficiently flexible or aligned with business strategy, and is slow to adapt to changes in business conditions.

4. Advanced Analytics still in early stages

While predictive analytics, demand planning and other such capabilities are all the buzz these days, they are at a very early stage of adoption. Even where they exist, they are typically not integrated with the overall financial performance management framework. They are usually performed in isolation by a dedicated group of people, many times statisticians or similar functions, focused on a specific area / problem, and the result is then used as an input into the planning process for further processing. Tremendous upside exists in integrating these forward-looking capabilities into financial planning from both a systems and a process perspective; however, it will take some time until this practice becomes mainstream.

Topics: Business Forecasting

Should I be Developing TM1 Applications Using Performance Modeler?

Posted by Jim Wood on Wed, Jun 04, 2014 @ 12:44 PM

Before the advent of IBM Cognos TM1 10, any company or individual looking to complete development within TM1 used the tried-and-tested but dated development tool sets. When IBM introduced TM1 10 back in 2011, they introduced a whole new, web-based development tool set called Performance Modeler. Partnered with the rebranding of what was previously known as Contributor (now known as Application Server), this wasn’t just IBM bringing new options into play; it was the introduction of a whole new development and deployment methodology, based around using the Application Web portal as the front end for gathering and reporting on planning information, with Performance Modeler used to get you there.

With any new methodology there is always a level of resistance, and this was the case within the TM1 community. While the old tool set was not the best, it was very well known. We have seen recently, however, with IBM pursuing this methodology, that more and more people, especially those new to TM1, are following IBM’s new pathway. So with more people starting to use Performance Modeler and IBM continuing to develop it further, all TM1 users / developers need to start looking at it as a serious project implementation option.

So the question has to be: should you be looking to use Performance Modeler?

Performance Modeler is easy to use.

Performance Modeler has plenty of new wizards built in to make previously difficult tasks a lot easier. There is a built-in wizard (called guided import) for building cubes and dimensions, and even for importing data. Adding calculations has been made much easier, and on top of that IBM has added many new functions to Performance Modeler; a perfect example is the Lag function, which makes time-based balances easier to build and maintain.

Performance Modeler makes building Process Workflow easy.

With its perfect partnership with Application Server, Performance Modeler makes building the parts required for a Workflow implementation easier. Within Performance Modeler you can create the required hierarchy for managing your workflow and it also handles dimensions differently so that any cube built is optimized to work well within the workflow framework.

Performance Modeler offers you a range of new options.

Performance Modeler has added development options that were not possible using the old tool sets. For example removing a dimension from within a cube is now a simple drag and drop task. Previously this would have involved rebuilding the cube.

Performance Modeler is the way forward.

Performance Modeler is a key part of the IBM Cognos TM1 development road map. IBM sees this new development methodology as central to making TM1 an integral part of its FP&A strategy going forward. As it progresses, more features and improvements will be added.

Now that we’re past the early development stages, it’s an ideal time to take a look at Performance Modeler and see what benefits it can bring to you.

Topics: IBM Cognos TM1, Business Forecasting, Performance Modeler

Integrating Resource Management with Cognos Command Center

Posted by Peter Edwards on Tue, Jan 28, 2014 @ 04:46 PM

For companies with large data stores, the benefits of performing analytics are impossible to deny: an IBM study revealed that enterprises which conduct analytics have up to 1.6x the revenue growth, 2x the EBITDA growth and 2.5x the stock price appreciation of those that don’t. [1] However, as data volumes continue to escalate, the cost of providing analytics climbs as well. 90% of companies with 1,000 or more employees access several distinct data sources to complete their accounting close, and 43% use four or more. In the finance industry, professionals spend only 22% of their time performing actual analysis – the rest is spent waiting for data or verifying its authenticity. [2]

Cognos Command Center

Because large datasets lead to longer downtimes for maintenance, most IT departments build automation to manage their systems. These custom-built systems tend to be fragile, require updates when source systems are updated or added, and rely on the knowledge of a few key members of the team. Issues that do arise require substantial involvement of IT personnel, and if critical members of the team are absent, they can cause massive delays.

IBM Cognos Command Center is a new product offered by IBM that facilitates database management by consolidating all of your resources into one dashboard. Cognos Command Center is designed to be usable by business professionals without extensive technical knowledge, and provides enterprises with a robust alternative to custom data warehouse management.

Why should you use IBM Cognos Command Center? Here are a few reasons:

Command Center is Intuitive

Cognos Command Center comes packed with plugins so you can snap-in functions that would normally require custom coding and maintenance from IT. Perform file operations, secure data transfers, and native database functions without writing one line of code. Save money on IT resources, and better utilize your existing employees by allowing them to spend less time on routine tasks. Command Center takes the guesswork out of process failures by alerting you immediately by email, and translating commonly encountered errors into plain English. IT can even add procedures to deal with recoverable errors automatically and silently.

Command Center is Secure

Manage your servers from anywhere in the world via the web or your mobile device. Many complex systems involve the use of stored login credentials, but most companies do a poor job securing this data, representing a major security risk from both internal and external sources and accounting for 13% of all breaches according to Verizon’s 2013 Data Breach Investigations Report. [3] Cognos Command Center encrypts sensitive data, and users can run processes that utilize stored login credentials without ever being able to read them. In the event of failure or intrusion, any change, no matter how small, can be traced to a single user.

Command Center is Flexible

IBM Cognos Command Center uses global variables to aid in synchronization of data between servers. Manage your workflow centrally, and update parameters on every server you own with a single click. Never lose hours of uptime to rolling back data because of a forgotten parameter again! Command Center integrates workflows for heterogeneous environments into one management system, and its plugin-based system means functionality can quickly be added for new storage systems as they are introduced.

Command Center is Supported

IBM Cognos Command Center is the newest member of the IBM Cognos Business Intelligence family. IBM was ranked a clear “leader” in Gartner’s 2013 Business Intelligence & Analytics Magic Quadrant. The Cognos platform’s ability to serve as a complete BI solution has been bolstered by IBM’s recent high profile acquisitions. IBM Cognos received high marks from clients regarding road map and future vision, product quality, and ability to integrate with information infrastructure. [4] The addition of Command Center to the Cognos family represents IBM’s ongoing commitment to deliver the most full-featured business intelligence platform available.

Topics: IBM Cognos TM1, IBM Cognos Command Center, Star Analytics

How Do TM1 Financial Systems Ensure Clean Data?

Posted by Peter Edwards on Wed, Jan 08, 2014 @ 03:44 PM

Summary

Almost half of new business initiatives fail because of poor data, according to a recent article on the EPM Channel website. While manual processes can temporarily fix issues, the IBM TM1 technology uses a series of checks and balances to ensure that businesses can build a database on clean data, saving employees time and avoiding costly mistakes.


This trend fuels mistakes in everything from bill payments to shipping, and often means that your best employees spend more time organizing data than analyzing it.

This is a long-standing pain point many businesses incur with dirty data. With Big Data, businesses hold large amounts of data, so the quality of that data comes into question. Organizations often look for ways to improve their data because they have too many manual processes in place or their data is cleaned in a non-automated fashion. If your data is clean and organized, you’ll streamline processes and align the organization on one set of numbers. As the EPM Channel article points out, it’s best to do this at the start, saving employee resources and avoiding costly problems.

There are systems that can help. TM1 financial solutions enforce what’s called referential integrity in data. Many times, when pulling data to put into TM1, there are problems in supporting systems that must be fixed. For instance, if you’re pulling data from a financial system that’s doing consolidations, there could be numbers at an account level that do not necessarily total to the parent account that they should roll up into. That’s because the system allows users to store two different numbers. This means there are a number of child accounts that fail to add up to the appropriate total because the database doesn’t enforce referential integrity.

Most relational databases or ERP applications don’t necessarily enforce these rules. They try to put business rules in place, but fundamentally the technology allows users to enter data in different places, creating conflicting information. When data is pulled into the TM1 financial systems software, the detail-level account data rolls up into the parent account, creating that figure. This method ensures that the parent figure isn’t entered separately and avoids situations where the child accounts don’t add up to the parent number. Therefore, users who leverage TM1 as their database are much more likely to have quality, clean data.
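To make the mechanism concrete, here is a minimal TurboIntegrator sketch of how such a rollup is defined; the account names are hypothetical. Because the parent is a consolidation (type ‘C’), TM1 always derives its value from the children, so a separate, conflicting parent figure simply cannot be stored:

    # Illustrative sketch – account names are hypothetical
    DimensionElementInsert('Account', '', 'Total Revenue', 'C');
    DimensionElementInsert('Account', '', 'Product Revenue', 'N');
    DimensionElementInsert('Account', '', 'Service Revenue', 'N');
    DimensionElementComponentAdd('Account', 'Total Revenue', 'Product Revenue', 1);
    DimensionElementComponentAdd('Account', 'Total Revenue', 'Service Revenue', 1);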

To learn more about TM1 financial solutions or to see a TM1 demo, contact ACG.


Topics: TM1 Technology, IBM Cognos TM1, Business Forecasting, Clean Data, Performance Management

Contact ACGI

Tel: (973) 898-0012
