Avoiding Interdimensional Irrelevance in EPM Cloud: A Smarter Design Approach

When designing an EPM application like Oracle FCC (Financial Consolidation and Close), it’s tempting to try to fit all data into a single cube. We have several system dimensions like Movement and Data Source to play with, along with the four Custom dimensions. But forcing data into places it doesn’t belong can lead to a tangled mess of interdimensional irrelevance, hurting both performance and usability.

What Is Interdimensional Irrelevance?

Interdimensional irrelevance occurs when dimensions intersect in ways that don’t make logical or business sense. This leads to sparse intersections, bloated cube sizes, and confusing user experiences. For example, trying to report on a statistical driver against a legal entity that doesn’t use it creates meaningless intersections that slow down processing and clutter reports.

Our Design Challenge

We faced a situation where certain data elements, while important, didn’t naturally fit into the FCC hierarchy. These were supplemental metrics and drivers that were useful for analysis but didn’t belong in the core consolidation structure. Initially, we considered shoehorning these members into the existing hierarchy, but this quickly proved problematic:

  • Adding non-consolidation data to FCC can introduce unnecessary complexity.
  • Sparse data intersections may slow down calculations and retrieval.
  • Mixing supplemental and core financial data risks confusion and misinterpretation.
  • Additional supplemental requirements might create what has been coined a “dumpster dimension.”

The Solution: A Supplemental Application

To maintain clarity and performance, I would argue that offloading these supplemental data elements into a separate Planning FreeForm application is a better move. In the on-premises days, we would spin up little analytic cubes all over the place to hold data that really didn’t make sense in a larger cube. I don’t see why we wouldn’t do something similar for EPM Enterprise Cloud customers. In my eyes, the benefits are:

  • Preserve the integrity of the FCC hierarchy by keeping it focused on core financial data.
  • Optimize performance by reducing sparsity and irrelevant intersections.
  • Enable targeted analysis in the supplemental cube without compromising the main application.
  • Stitch the reporting together in Narrative Reporting and/or ad-hoc analysis with Smart View.

This approach gives us the flexibility to design each cube for its specific purpose, while still allowing for integration where needed by pushing data through data maps or Data Integration.

Key Takeaways

  • Don’t force-fit data into hierarchies where it doesn’t belong.
  • Use supplemental applications to isolate non-core data.
  • Design with both performance and user experience in mind.
  • Interdimensional relevance should be a guiding principle in Essbase architecture.

When we respect the boundaries of dimensional logic, we can create cleaner, faster, and more maintainable solutions.

25.11 EPM Updates for Data Integration and EDM

In the November 2025 update for Oracle EPM, we will see some additional changes to the Data Integration Actions menu. A new “Other” actions folder will be added with the Report Execution and System Maintenance Tasks options. This is another quality-of-life update that brings us one step closer to parity between Data Integration and Data Management. To put it plainly, if you’re not using the Data Integration UI in Oracle EPM, you should get familiar with it.

EDM didn’t have any new updates this month, but Oracle has a new video on Consolidation Requests. Consolidation requests allow multiple in-flight requests to be combined into a single request for approval by a change governance committee, which can simplify approval by a change board. Requests can only be consolidated from the same view in EDM, and if part of a consolidation request needs to be pushed back, the consolidation request can be discarded. For more information, see the video here: https://youtu.be/vujzO5bQsi4

Who moved my Data Integration menu? Embracing change with Setup and Configure

In Dr. Spencer Johnson’s 1998 bestseller, Who Moved My Cheese?, four characters navigate a maze in search of cheese. The book’s main themes are that change is inevitable and that we must anticipate, adapt, and embrace it to be successful in work and life.

Fast forward to the 25.10 Oracle EPM update, and we find ourselves in a similar maze. This time, the “cheese” is the Data Integration Actions menu items. And yes, they are about to be moved.

The Data Integration home page has undergone a subtle but powerful transformation. The familiar Actions menu has been reorganized into two new dropdowns: Setup and Configure.

The Setup menu is where you define the structure of your data environment. Think of it as mapping your maze before you start running:

  • Applications: Define your integration targets and sources.
  • Locations: Create and maintain locations for mapping.
  • Period Mapping: Align time-based data across systems.
  • Category Mapping: Manage application scenarios.
  • Query: Set up and modify data source queries.

Once your maze is mapped, it’s time to optimize your tools and security. This is where the Configure menu comes in:

  • System Settings: Control the behavior of your integration engine.
  • Security Settings: Safeguard access and permissions.
  • Agent: Manage the EPM Integration Agent settings.
  • Download Agent: Get the EPM Integration Agent software.

Just like the characters in Who Moved My Cheese? learned to adapt to their new reality, this menu redesign helps users adapt to their data environment more efficiently. By grouping actions based on context, users can find what they need faster and act with greater confidence. Those of us who have switched to the Data Integration UI will need a little time to get used to it, but I think this is a small quality-of-life change that we will come to appreciate.

This update applies across business processes including Account Reconciliation, Planning, Tax Reporting, and more.

In the end, the cheese will always move. The question is: will you move with it?

Strategic Deployment Models for Oracle EDM: From Metadata Steward to Master Creator

By now, most of the world knows what EDM is and what it does. Even though EDM has been out for several years at this point, I believe its strategic potential is being overlooked. Too often, organizations treat EDM as a tactical metadata tool tied solely to their EPM applications, rather than recognizing it as a foundational investment in enterprise-wide data governance. We play games with EPM Enterprise licenses to try to keep node counts under 5,000, but that really undervalues the impact EDM could have.

It has been designed to be much more than a connector; it’s a platform for harmonizing metadata across business domains, enabling alignment, auditability, and agility. When deployed thoughtfully, EDM becomes a metadata authority that can support Finance, HR, Supply Chain, and beyond. But that vision only materializes when companies stop thinking of EDM as a bolt-on and start treating it as a core pillar of their enterprise architecture.

EDM can be leveraged not just as a catalog of data elements, but as a strategic asset for downstream reporting and analysis tools. How you deploy EDM can dramatically shape its impact. This post explores three strategic deployment models for EDM:

  1. As the originator of new metadata records
  2. As a metadata steward downstream from source systems
  3. As a metadata harmonizer across different business units

EDM as the Primary Metadata Creator

In this model, EDM is the primary source for creating new metadata records such as cost centers, products, legal entities, or reporting hierarchies. Business users or administrators initiate requests directly in EDM, and once approved, metadata is pushed downstream to consuming systems. This could be called a “hub and spoke” model, where EDM is the controller for all metadata.

This deployment scenario is ideal for:

  • Organizations with centralized governance
  • Enterprises looking to remove “shadow” systems and rogue metadata creation
  • Use cases requiring strict audit trails and approval workflows

EDM’s request workflow ensures intentional and controlled metadata changes, aligning with organizational policies. Approval processes with multiple stages can reinforce robust data governance, maintaining consistency and compliance across systems. Additionally, EDM’s REST APIs can enable automated integration with downstream applications.
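To make the REST automation idea concrete, here is a minimal Python sketch of building a call that asks EDM to export a dimension. The endpoint path, payload keys, and pod URL are illustrative assumptions, not the documented EDM REST contract; check the Oracle EDM REST API reference for the real shapes before using anything like this.

```python
import base64
import json
import urllib.request

# Placeholder pod URL -- substitute your own EDM service URL.
EDM_BASE = "https://example-edm.ocs.oraclecloud.com"

def build_export_request(dimension: str, location: str, user: str, password: str):
    """Build an HTTP request that would ask EDM to export a dimension.
    The /epm/rest/v1/... path below is an assumed, illustrative path."""
    url = f"{EDM_BASE}/epm/rest/v1/dimensions/{dimension}/export"
    payload = json.dumps({"location": location}).encode("utf-8")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, data=payload, method="POST")
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Content-Type", "application/json")
    return req

# Sending would be urllib.request.urlopen(req); omitted here so the
# sketch stays runnable without a live pod.
req = build_export_request("CostCenter", "fcc_dim_export", "svc_user", "secret")
print(req.full_url)
```

An external scheduler or CI job could wrap a call like this to push approved metadata downstream on a cadence.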

EDM as a Metadata Steward

In this deployment scenario, EDM receives metadata from upstream systems (such as CRM, ERP, or MDM platforms), and acts as a governance checkpoint. It matches incoming records to existing nodes, merges duplicates, and applies survivorship rules to determine which properties to retain.

Ideal for:

  • Enterprises with decentralized metadata creation
  • Organizations integrating multiple source systems
  • M&A scenarios requiring metadata harmonization

EDM has key features that can help with these scenarios, like the Matching Workbench for deduplication along with merge logic and survivorship rules. Matching and deduplication rely on a logical tag for each node in EDM called a data source, which provides a foundation for Matching and Deduplication rules by defining the scope of metadata to be analyzed.
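For readers new to survivorship rules, here is a conceptual plain-Python sketch of the idea, entirely separate from how EDM implements it: records from multiple sources are collapsed onto one node, and each property survives from the most trusted source that supplies a value. The source names and priority scheme are made up for illustration.

```python
# Conceptual sketch only -- EDM's Matching Workbench and survivorship
# rules do this natively; this just illustrates the principle.

# Lower number = more trusted source for property survivorship.
SOURCE_PRIORITY = {"ERP": 1, "CRM": 2, "Spreadsheet": 3}

def merge_records(records):
    """Collapse records sharing a key; each property survives from the
    highest-priority source that supplies a non-empty value."""
    merged = {}
    # Visit trusted sources first so their property values win.
    for rec in sorted(records, key=lambda r: SOURCE_PRIORITY[r["source"]]):
        node = merged.setdefault(rec["key"], {})
        for prop, value in rec.items():
            if prop in ("key", "source") or value in (None, ""):
                continue
            node.setdefault(prop, value)  # keep first (most trusted) value
    return merged

incoming = [
    {"key": "CC100", "source": "CRM", "name": "Sales East", "owner": "Pat"},
    {"key": "CC100", "source": "ERP", "name": "Sales - East", "owner": ""},
]
result = merge_records(incoming)
print(result)  # ERP wins the name; CRM fills in the owner ERP left blank
```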

The key benefit of this method is that existing upstream applications continue to own key business dimensions, while EDM provides a central hub to consolidate and distribute those dimensions to downstream applications.

EDM as a Federated Metadata Hub

In this hybrid model, EDM acts as a metadata exchange platform across multiple domains like Finance, HR, or Supply Chain, each with its own governance model. EDM doesn’t own all metadata but facilitates alignment and synchronization.

This deployment method is ideal for:

  • Large enterprises with domain-specific governance
  • Multi-cloud or multi-ERP environments
  • Organizations with regional autonomy but global reporting needs

EDM supports domain-specific modeling for Finance, HR, Supply Chain, and beyond, allowing each unit to maintain its own governance structure while participating in enterprise-wide metadata harmonization. Features like subscription requests facilitate cross-domain alignment by automatically propagating approved changes to related hierarchies, ensuring consistency without manual intervention. EDM’s security model and approval workflows help decentralized teams manage metadata collaboratively while preserving accountability.

This model enables business units to continue operating with autonomy while providing governance, which is ideal for balancing agility and control.

Which deployment strategy you choose should take into consideration your organization’s maturity in data governance. Do your end users know enough about the business to submit their own requests directly into EDM? Is there a deliberate approval workflow for changes to your chart of accounts? What are your compliance requirements and audit needs around metadata changes? What is the priority for your business (e.g., speed vs. control)?

Oracle EDM isn’t just a bolt-on EPM module; it’s a strategic enabler of enterprise agility, compliance, and insight. The key is choosing the correct deployment scenario that matches your business needs. Those business needs don’t stop at your Planning or Consolidation applications. That’s why EDM should be considered as a tool to be used across the enterprise. There is a reason it’s called Enterprise Data Management after all.

EDM 25.09 Update – Request Monitoring Dashboard

In the September 2025 update (25.09), Oracle is adding a Request Monitoring Dashboard to EDM! Designed to enhance visibility and control over change requests, this dashboard empowers administrators, data stewards, and integration leads to streamline workflows and improve data quality across the enterprise.

The Request Monitoring Dashboard is a centralized interface that allows users to track and analyze open requests throughout their lifecycle. Whether you’re managing metadata changes, hierarchy updates, or complex multi-domain governance processes, this dashboard offers real-time insights into request activity, aging, bottlenecks, and contributor performance.

Key Features:

  • Lifecycle Tracking: Monitor requests by type, priority, workflow stage, and assigned contributors.
  • Custom Filters: Apply and save filters to focus on specific request attributes.
  • Dashboards:
    • Open Requests: View volume and distribution.
    • Active Owners: Identify who’s driving change.
    • Aging and Exceptions: Spot delays and anomalies.
  • Drilldowns & Drill-Across: Dive deep into request details or pivot to related metrics.
  • Export Capability: Download request activity for offline analysis or stakeholder sharing.

[Image: Sample Request Monitoring Dashboard, courtesy of Oracle, displaying open requests, active owners, aging, and exceptions. Features include request count by stage, open request distribution by application, and a snapshot of outstanding requests.]

Why It Matters:

Managing change requests efficiently is critical to maintaining data integrity and operational agility. The dashboard helps teams:

  • Reduce request cycle time
  • Identify and resolve workflow bottlenecks
  • Improve exception handling
  • Enhance collaboration across business units

The Request Monitoring Dashboard isn’t just a new feature—it’s a strategic tool for proactive governance. By surfacing actionable insights and enabling smarter oversight, the Oracle EDM dev team continues to raise the bar for enterprise data management.

To find out more about this release, see the August 21 Oracle EPM Event by Rahul Kamath and Matt Lontchar here: https://community.oracle.com/customerconnect/events/606792-epm-whats-new-and-whats-coming-in-oracle-enterprise-data-management-edm-cloud

The EDM 25.09 features list can be found here: https://docs.oracle.com/en/cloud/saas/readiness/epm/2025/edm-sep25/25sep-edmcs-wn-f40991.htm

EPM Cloud 25.08 Updates – TLS, JRE, ARC Pipelines, Oh My

The EPM updates for 25.08 were released and we have an update to the TLS changes. Oracle has decided to continue supporting TLS 1.2 indefinitely, but only with ciphers deemed to be strong. The extension of support for TLS 1.2 gives Oracle and its customers a welcome bit of flexibility. Oracle has also released a document on how to test with the latest TLS ciphers here: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/tsepm/cloud_epm_test_tls_ciphers.html

EPM Automate is switching to Java 17 instead of Java 8. Windows users rejoice! With the EPM Automate “update” command, EPM Automate will download and install the Java 17 runtime environment as part of the update process. Linux/UNIX and Mac users will need to update their user-installed Java version to continue using EPM Automate 25.08 and after. Java 8 was released over ten years ago, so it’s good to see a newer version being implemented. Linux/UNIX/macOS users can find instructions for updating their Java version here: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/cepma/installing_epm_automate_linux_unix.html
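If you manage several Linux or Mac machines that run EPM Automate, a quick sanity-check script can flag any box still on Java 8. This is a hedged sketch of my own, not an Oracle utility; it assumes `java` is on the PATH and handles both the legacy "1.8.0_xxx" and modern "17.0.x" version formats.

```python
import re
import subprocess

def parse_major_version(version_output: str) -> int:
    """Pull the major Java version out of `java -version` output.
    Handles legacy "1.8.0_xxx" and modern "17.0.x" formats."""
    match = re.search(r'version "([^"]+)"', version_output)
    if not match:
        raise ValueError("no version string found")
    parts = match.group(1).split(".")
    return int(parts[1]) if parts[0] == "1" else int(parts[0])

def installed_java_ok(minimum: int = 17) -> bool:
    """Shell out to the local JRE; note `java -version` prints to stderr."""
    out = subprocess.run(["java", "-version"], capture_output=True, text=True)
    return parse_major_version(out.stderr) >= minimum

# Example strings in the shape each JRE generation prints:
print(parse_major_version('java version "1.8.0_391"'))   # 8
print(parse_major_version('openjdk version "17.0.9"'))   # 17
```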

Account Reconciliation Cloud is getting Data Integration Pipelines with this update. Pipelines will be available on ARC pods with the following job types:

  • Create Reconciliation
  • Generate Report for Account Reconciliation
  • Import Attribute Values
  • Import Balances
  • Import Pre-Mapped Balances
  • Import Pre-Mapped Transactions
  • Import Rates
  • Run Auto Match
  • Run Auto Alert
  • Set Period Status

This should mean we can include ARC jobs in any EPM Data Integration Pipeline, even across pods (similar to how we can run EDM exports across pods with Pipeline).

Before we go, I just wanted to take a moment to celebrate the deprecation of the Data Management/Data Integration job schedules. If you have braved the pain of that scheduler, your scheduled jobs need to be converted to the EPM Platform Job Scheduler before the 25.09 update. There is a System Maintenance Task job in Data Management called “Migrate Schedules to Platform Job Scheduler” to help with that effort. The EPM Job Scheduler isn’t available in PCM and ARC, unfortunately. If you use either of these business processes, scheduling externally with EPM Automate or REST calls is probably your best bet (and likely what most other customers are using anyway).
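An external schedule usually boils down to chaining a few EPM Automate commands from cron or Task Scheduler. Here is a minimal Python sketch of that pattern; the `login`/`rundatarule`/`logout` sequence is the common EPM Automate idiom, but the pod URL, credential file, and rule and period names below are made-up placeholders you would replace with your own.

```python
import subprocess

# Placeholders -- substitute your own values.
EPM_AUTOMATE = "epmautomate"   # or the full path to epmautomate.sh
POD_URL = "https://example-pcm.ocs.oraclecloud.com"

def build_job(user: str, password_file: str, rule: str, start: str, end: str):
    """Return the command sequence an external scheduler would run
    to load data into a pod that lacks the EPM Job Scheduler."""
    return [
        [EPM_AUTOMATE, "login", user, password_file, POD_URL],
        [EPM_AUTOMATE, "rundatarule", rule, start, end, "REPLACE", "STORE_DATA"],
        [EPM_AUTOMATE, "logout"],
    ]

def run_job(commands):
    for cmd in commands:
        subprocess.run(cmd, check=True)  # check=True stops the chain on failure

cmds = build_job("svc_batch", "password.epw", "ARC_BALANCES", "Jul-25", "Jul-25")
print(len(cmds))  # login, rundatarule, logout
```

Point a cron entry at a script like this and you have a schedule the Data Management scheduler never needed to own.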

Oracle EPM AI features deliver on promises from long ago

I have accepted the fact that I am getting old (or is it “more experienced”?). At this point, I have been working on and around Oracle EPM products for almost twenty years. In the early 2000s, I was getting data from Hyperion Enterprise before we installed Essbase to do reporting. I dove into Essbase and began learning as much as I could. Once I reached a point where I felt I had done everything I could at my position as an administrator, I moved into consulting in 2010 to continue developing myself and learning more. As part of that, I took some training on OBIEE to help support customers with BI installs.

My point is, for 15+ years (maybe 20) EPM and BI users have heard about the promise of self-service BI: empowering users to analyze and visualize data independently. I remember hearing this in my OBIEE training and it was exciting to think about users digging into the data to answer business questions.

The thing with BI products is that there has to be someone technical to connect all of the data sources on the back end. It takes a special someone to figure out the right strings to pull to get all of those data sources normalized and linked up so that end users can do their reporting and analysis. It may be my bias as an implementer, but I don’t know how far users go past the initial dashboards that get created. I certainly hope it’s more common than I have seen.

As I sat in the Kscope Sunday Symposium presentations by Oracle product management and heard about all of the AI features coming to Oracle EPM, it dawned on me that all of the amazing things that I imagined 15 years ago will soon be possible and more accessible than ever. Users will soon be able to chat with the AI built into Oracle EPM products and get visualizations fed back to them. To recycle an old sales pitch, analysis at the speed of thought is about to be real.

I am looking forward to seeing the developments in Oracle EPM products and I’m excited to see what our customers do with them. You can find the current AI features available in Oracle EPM products here: https://docs.oracle.com/en/cloud/saas/fusion-ai/aiafl/epm-features-with-ai.html. That list is about to get much longer. These are exciting times we live in.

New EPM Data Integration Features in 25.07 – Smart Split and More

Oracle updates for the 25.07 patch just recently came out and there are a couple of great features for Data Integration in the mix this month.

First, a new application role called “Data Integration – Administrator” is rolling out. This access role will grant a user access to all activities in Data Integration. This means a user will be able to create/manage integrations, execute and monitor pipelines, and perform data and metadata extraction and transformation from on-premises sources using the EPM Integration Agent. The new role is a fantastic addition that allows a user to manage your integrations without giving them Service Administrator permissions on the rest of the application. This applies to pretty much all EPM business processes, including ARC, EPCM, FCC, Planning, PCM, and Tax Reporting.

The second update is the addition of the Smart Split feature in Pipeline. Basically, Essbase has a governor and it gets mad when you try to push too much data into it. The solution up to now has been to split a large-volume data integration into multiple smaller slices of data to get around the limit. Going forward, we can set up a large integration like normal with one big data load rule. Then, in Pipeline we can add an “Integration with Smart Split” job which will split the files for us based on the Split Dimension specified. This allows the system to bypass the governor by submitting smaller data slices without requiring the creation of multiple integrations. Smart Split will be available in EPCM, FCC, Planning, and Tax Reporting.
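Conceptually, splitting by dimension is easy to picture: take one big load file and slice it into one chunk per member of the split dimension. This little Python sketch illustrates the idea only; it is not Oracle's implementation, and the load file columns here are invented for the example.

```python
import csv
import io
from collections import defaultdict

def split_by_dimension(csv_text: str, split_column: str):
    """Slice one large load file into per-member row groups keyed by the
    split dimension -- conceptually what Smart Split automates in Pipeline."""
    reader = csv.DictReader(io.StringIO(csv_text))
    slices = defaultdict(list)
    for row in reader:
        slices[row[split_column]].append(row)
    return dict(slices)

# A toy load file; real integrations would be far larger.
load_file = """Entity,Account,Amount
E100,Sales,1000
E200,Sales,2500
E100,COGS,400
"""
chunks = split_by_dimension(load_file, "Entity")
print(sorted(chunks))       # ['E100', 'E200']
print(len(chunks["E100"]))  # 2
```

Each chunk would then be submitted as its own smaller load, staying under the governor without maintaining multiple integrations by hand.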

Check out the Proactive Support Blog update here for more information on all of the 25.07 updates: https://blogs.oracle.com/proactivesupportepm/post/oracle-planning-july-2025-cloud-updates

EPM Data Integration Copy Features

Just catching up after Kscope and I was going to write a quick blog post about the EPM Data Integration Copy Integration and Copy Pipeline features, but I was scooped.

These features were released in the 25.04 updates (April 2025), but I haven’t had a chance to use them yet and actually forgot it was a thing until Mike Casey talked about it at Kscope.

My friend Trey Daniel just happened to post about it five days ago on LinkedIn. Please check out his post on the subject here: https://www.linkedin.com/pulse/cross-pod-migrations-selected-oracle-epm-cloud-data-trey-daniel-mba-yglgc

One thing I didn’t realize was that it is possible to copy integrations and pipelines to other pods by using the connection feature. The EPM community is alive and well thanks to those who share their knowledge.

UPDATE: TLS 1.2 Deprecation Testing

After the 25.06 update was released, I did a quick test of a Windows 10 VM with Smart View and EPM Automate. The concern was that TLS 1.3 is supported only on Windows Server 2022 and Windows 11 and that our customers on older versions of Windows may have issues.

The test consisted of a Hyper-V Windows 10 Enterprise Evaluation VM with MS Office 365 installed. Using a test pod with the Vision Planning sample app installed, I tried to get in and start testing around 5:30 PM CDT (22:30 UTC) but the update wasn’t pushed yet. I tried a couple of times to run the “epmautomate rundailymaintenance” command to force the update, but no luck. After 6:00 PM CDT, I tried the rundailymaintenance again and it worked.

My Smart View ad-hoc template retrieved data just fine. Similarly, EPM Automate logged in after the update and told me it needed an upgrade. I ran the upgrade command and logged out. Even after the upgrade, EPM Automate logged in just fine.

Looks like a big nothing burger, which is the best result for us all. This was a test of end-user tools, so I would still recommend that everyone out there in EPM land thoroughly test after this update just to make sure everything is good.