With the first calendar quarter of 2026 in the books, Oracle is ready to resume the monthly update schedule for EPM applications. Since we haven’t had a normal update schedule for a few months, the EDM development team has a backlog of updates that will finally be pushed automatically this month. The full feature descriptions are on Oracle’s site; you can find that link at the bottom if you haven’t seen it already.
Spotlight Features:
In the EDM list of spotlight features from Oracle we see the following:
A generative AI assistant is now available within Universal application registration to quickly create and configure properties for node types based on sample data files.
Hierarchy viewpoints can now be optionally visualized in an organization chart format rather than the standard tree format used by default.
Global connections can now be defined to Microsoft Azure Blob Storage in order to share data with external applications and processes from a centralized storage location in a Microsoft Azure cloud environment.
Gen AI Assistant:
This sounds like a very cool feature to help speed up the development process. My take is that EDM build phases can sometimes be a little tedious with all of the clicks needed to wire things up correctly, so this is a good first step toward streamlining that process. This is obviously a first iteration, so I’m excited for a future where you can just point EDM at an FCC or Planning application, have the AI assistant read the dimensionality, and then choose which dimensions to set up in EDM. On top of that, just imagine saying to an AI assistant, “I have this ERP system; here’s a file with my segments. Here’s the URL for my EPM application. Build two EDM apps for metadata management and create a mapping viewpoint to manage the data integration maps as well.”
Viewpoints in Organization Chart Format:
I have seen this functionality, and I am used to seeing things in the default tree visualization, so that makes more sense to my brain. I’ll have to play around with the organization chart view to see how useful it will be for me.
Azure Blob Storage connection:
The Azure Blob Storage connection is a great addition. This especially makes sense as more and more organizations see the benefit of using EDM for non-financial domains. At a prior employer, we were mastering employee roles that were fed into Azure Blob Storage, which eventually fed HR systems. The ability to send an extract directly to Azure Blob Storage will streamline that process and make some of the automations created using EPM Automate and WinSCP unnecessary. Simpler integrations are a win in my book.
Other Updates:
New Validations for FCC applications:
There are some new validations coming to FCC applications for the Account dimension. These will be enabled by default on new EDM FCC applications but need to be manually enabled on existing applications. They will replace some custom validations that customers have created. It’s great to see features that customers have asked for in the Cloud Customer Connect Idea Lab coming to fruition.
Address verification:
When importing addresses, customers with Oracle’s address verification service can ensure that good addresses are loaded into EDM from the beginning. Data cleanliness is very important when mastering Customer or Supplier dimensions, so this is a great addition to the product.
OAuth2 Authentication for Oracle Cloud ERP and Financials Connections:
OAuth token-based authentication is preferred by most IT shops since it doesn’t require maintaining a password. It’s great news that token authentication is being implemented more and more across the Oracle stack.
Of course, these are just my opinions. If you have a different perspective, I’d love to hear it. Sharing knowledge is one of my core values and, as the saying goes, a rising tide lifts all boats.
The first car I ever bought was a 1964 Chrysler Newport. It was a strange car for a 16-year-old in 1996, but it was amazing. The smell of the interior. The feeling of that big block 361 cubic inch wedge V8. The satisfying “clunk” when pushing the buttons to change the gear. The “Forward Look” years at Chrysler design led to space-age choices like pushbutton gear selections on their automatic transmissions, as well as futuristic styling and gauges that glowed green. That car met an unfortunate demise when a guy swerved to miss someone who ran a stop sign, and he caved in the rear quarter panel. I think my beautiful tank became a demolition derby car after that.
My first car was just like this. Push-button selector on the dash.
As some may know, I started working on Essbase 20 years ago. As an Essbase administrator, I loved to automate much of my tasks, even using Hyperion Application Link (HAL) to alert me when new files appeared to be loaded during month-end. As the build phase wound down and I moved into support and maintenance, I found time on my hands during the workday. My love of old cars and lack of knowledge about them led me to an internet forum for hot-rodding misfits who pride themselves on upholding the tradition of hot rods and custom cars – as long as they are pre-1964. You see, 1964 was the birth year of the Mustang, and the years that followed became the muscle car era. These hooligans did not mess around with muscle cars, nor VW bugs, but that’s a different story.
I tried to restore old cars over the years. I bought a 1964 Chrysler Newport convertible with the intention of restoration, but that project was much bigger than I was able to tackle at the time. My next project was a 1951 Plymouth Cambridge. My goal was to chop the top and change the suspension using airbags to get it low. My Mopar (Dodge, Chrysler, Plymouth) habit kept me choosing odd cars that really didn’t have any support in the aftermarket, so I ended up learning how to fabricate and weld. I started with floor pans on the Newport convertible and eventually moved on to welding the top of the Plymouth.
Unfortunately, the Plymouth project eventually stalled and I gave up. Around this time, I started working as a consultant. My wife and I decided that we should move closer to a major airport to make my work travel easier and we settled on the Dallas/Fort Worth area. With an impending move and sale of our home in small town Iowa, I pretty much gave my car project away and had an auction to sell off most of my other car parts.
That house listing came around the time the entire housing market collapsed. Our house was listed for over a year. Without a project car, I started to get restless and needed some outlet. I needed another project, but I was going to finish this one. I wanted something a little easier that didn’t need to be as nice as a custom car. That led me to build a hot rod. It could be a little rough around the edges since that just adds to the character. I didn’t want to mess with chopping a top again, so that led me to looking at roadsters. In January of 2012, I bought a Moleskine journal and started documenting the process just to have an outlet for the things that were in my brain. I sketched what I thought the end result might look like and how I wanted to set up my suspension.
And so, it begins…
At this point, I had two daughters and was expecting to move to Texas, so I couldn’t really make any major purchases unless they were a great deal. Before the move, I acquired a full 1940 Ford front suspension: axle, spindles, brakes, drums, wishbone, and front spring. This is a lot of components for $275 and quite a steal. Then, I needed a frame for whatever car I decided to build, so I bought 24 feet of rectangular steel tubing in Iowa as well shortly before the move.
Of course, the house was finally sold and we moved to Texas around August of 2012. By this point, I had pretty much settled on a 1926-1927 Model T roadster. If you know hot rods, you know that 1932 through 1934 Fords are very desirable, but you have to pay to play with those. Model As are nice, but I didn’t like the 1928-29 cowl and I figured a 1930-31 A would be out of my price range. Model Ts aren’t the most likely selection for a hot rod, but there were a few of them in the 1950s and 60s.
Then came the new homeowner stuff. Painting, setting up kids’ rooms, and getting settled in. In April of 2013, I went to a large swap meet hoping to find a fiberglass body. I walked about 7 miles that day up and down the rows and found my body – an original steel roadster in rough shape. Missing part of the subframe that the body attaches to, minus one door, and plenty of rust. But, I am thrifty and the asking price was $500. After agreeing to $350, I loaded the body onto my trailer and headed home.
The body was obviously very rusty, and I was missing some critical components. I took it completely apart panel by panel so that I could do rust removal via electrolysis. In the meantime, I bought some lumber to build a rudimentary frame table and began laying out my frame. In 2013, I found another body in South Dakota on eBay that had nice doors and a solid bottom half, but the top was beat up – the opposite of my body. So, I rented a Dodge Caravan with the fold-flat seats and drove to fetch that body.
During the frame build, I found a 1966 Dodge 361 cubic inch big block for sale in Oklahoma. This is the same type of engine that I had in my first car. I was able to grab that as well as a 1963 push button automatic from Dallas. I found a 1969 Roadrunner rear axle that had a 3.23 gear ratio in it and found some rear brakes from a guy who lives close. I bought an original driveshaft on eBay that I had to cut apart and shorten as well as many shipments from Speedway Motors and Summit Racing.
Eventually, I had pieced together something that looked like a car. I drove it around the neighborhood a couple of times to get a feel for it, but there were some issues to be fixed. I had to get a different torque converter for the transmission as it didn’t drive correctly, which required buying a later parts transmission and swapping out the input shaft. The three Holley 94 carburetors made the engine run way too rich, so I switched to a four-barrel carb to make things easier. At this time, I decided to paint the car, so it all came apart in 2021.
I primed the car and started body work in my garage, which is messy business. A family down the block moved out and set their leather loveseat, clawed by their cats, out on the curb. That couch made its way into my garage and was skinned and put away until I was ready for the interior. Once the body was almost done in primer, I realized that trying to perfect this 100-year-old body and paint it shiny would take a long time, so I chose epoxy primer as a topcoat.
I would casually check Facebook Marketplace hoping for a deal on an upholstery sewing machine and found an amazing deal a couple of hours away. I grabbed cash and came back with an Adler walking-foot sewing machine, an industrial machine great for sewing leather and other thick fabrics. Around January, I had the body back together and wired again and decided to work on the upholstery. It was a steep learning curve, but the upholstery came out good enough for this old buggy.
So, over 13 years after my original idea began, I now officially have a running and driving 1960s inspired hot rod. This is the longest project I have ever tackled and without the patience of my wife and family it would not have been possible.
Oddly enough, the lessons I learned turning wrenches in the garage kind of map to how I approach Oracle EPM projects.
1. It All Starts With a Clear Vision
Every successful EPM implementation begins the same way a successful car build does: with a vision.
When I started my roadster, I knew what I wanted the end state to look like. Not just cosmetically, but mechanically, structurally, and emotionally. I could see the car finished, even on days when it was nothing more than a bare frame. And of course, as any true hot rodder will tell you, I sat in it and made engine noises every chance I got.
EPM projects require that same clarity. I like to start with the end in mind. This helps us make sure we are building the features necessary to enable the reporting or dashboard that is the end result. A well-defined future state, with data flows, integrations, and user experience all baked in, is what keeps the team aligned. Without it, both cars and projects drift into scope creep, wasted effort, and frustration.
2. Budgeting: Reality Meets the Ideal
Anyone who’s ever built a custom car knows the truth: You will spend more than you expect.
Not because of mismanagement, but because as you get deeper into the process, you see opportunities to improve things you hadn’t originally considered. The same is true in EPM.
Budgeting is about:
identifying what’s essential,
understanding what’s optional,
and planning for the unexpected.
Sometimes, that chrome windshield that you originally didn’t want starts to make sense when you see the features that it provides. And, while you’ve got things apart, you may as well make it as nice as you can. I’m sure EPM consultants can relate to that in their projects.
3. Change Management Matters, In the Garage and in the Office
When you work on a car for over a decade, technology evolves. Parts that didn’t exist when I started became the new standard, like the electric parking brake I installed after seeing how nice the one in my wife’s Honda is. I can’t say that my vision changed, but I changed as time went along.
To avoid rework, you must learn to communicate and adapt.
Change management in EPM projects isn’t just formal documentation; it’s helping stakeholders understand why changes are needed, how they support the long-term vision, and what the impacts will be. Sometimes that involves moving the budget a little. Sometimes it means that users might need to change their business process a little.
Whether it’s a new parking brake system or a redesigned planning process, people need time and clarity to adjust.
4. Resource Constraints Are Real
My hot rod project had two ever-present constraints:
Time – The hours you want to spend are never the hours you actually have.
Ability – Some tasks stretch your skills; sometimes you have to learn a lot before you can even begin to develop a skill.
Oracle EPM projects follow the same pattern. Teams juggle:
competing priorities
limited SME availability
skill gaps
integration dependencies
You don’t succeed by pretending constraints don’t exist. Teams can succeed by planning for those constraints and setting the schedule around them.
5. Waiting for the Right Tools
I had the leather from that roadside couch for over two years before I found the right deal on the right sewing machine. Sometimes waiting is the smartest move; forcing progress with the wrong tools usually leads to expensive cleanup. My wife’s Project Runway household sewing machine wasn’t going to cut it when sewing through up to four layers of leather.
EPM programs experience similar bottlenecks:
waiting for upstream system modernization
waiting for data governance decisions
waiting for cloud capabilities to mature
waiting for internal skill development
Patience isn’t the opposite of progress. Sometimes it is progress.
6. Agility: Adjusting Priorities Without Losing the End State
When you’re 13 years into any project, life happens. Family, work, budget shifts, and other priorities interrupt even the best-laid plans. What kept the project moving forward was the ability to adjust short-term priorities while keeping the long-term vision intact. That’s textbook agile thinking.
In any project, we should focus on:
breaking down the vision into flexible increments,
delivering value continuously,
and being ready to pivot without compromising the destination.
Agility keeps the journey alive.
7. Sticking With It: The Power of Completion
There’s nothing like turning the key on a car you built with your own hands. I drove around just today checking some rear suspension changes and clocked mile number 35. The sound, the vibration, the smells; it’s deeply rewarding.
But the moment that surprised me most? The reactions from people on the road. The thumbs up. The smiles. The nods of approval.
That feedback loop makes every late night, busted knuckle, and sliced hand feel worthwhile.
EPM projects are no different. When users finally experience the system with faster reporting, cleaner data, and simpler processes, their satisfaction validates the effort. Their reactions are the equivalent of those thumbs up on the highway.
It reminds you that completion isn’t just a milestone. It’s a celebration. And just maybe, completion doesn’t mean that it’s actually done; it could just be Phase 1.
When designing an EPM application like Oracle FCC (Financial Consolidation and Close), it’s tempting to try to fit all data into a single cube. We have several system dimensions like Movement and Data Source to play with along with the four Custom dimensions. But forcing data into places it doesn’t belong can lead to a tangled mess of interdimensional irrelevance, hurting both performance and usability.
What Is Interdimensional Irrelevance?
Interdimensional irrelevance occurs when dimensions intersect in ways that don’t make logical or business sense. This leads to sparse intersections, bloated cube sizes, and confusing user experiences. For example, trying to report on a statistical driver against a legal entity that doesn’t use it creates meaningless intersections that slow down processing and clutter reports.
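To make the idea concrete, here is a minimal sketch of the kind of pre-load check that catches these intersections. Everything in it is hypothetical: the member names, the scoping table, and the check itself are illustrations of the concept, not a feature of FCC or Essbase.

```python
# Hypothetical sketch: flag meaningless Account/Entity intersections
# before they land in the cube. All member names are made up.

# Which entities each statistical driver actually applies to
DRIVER_SCOPE = {
    "Stat_Headcount": {"US_Ops", "EU_Ops"},
    "Stat_SquareFeet": {"US_Ops"},
}

def find_irrelevant_rows(rows):
    """Return rows where a statistical driver hits an out-of-scope entity."""
    flagged = []
    for account, entity, value in rows:
        scope = DRIVER_SCOPE.get(account)
        if scope is not None and entity not in scope:
            flagged.append((account, entity, value))
    return flagged

rows = [
    ("Stat_Headcount", "US_Ops", 120),
    ("Stat_SquareFeet", "EU_Ops", 5000),  # EU_Ops doesn't track square feet
    ("Revenue", "EU_Ops", 1_000_000),     # non-driver accounts pass through
]
print(find_irrelevant_rows(rows))  # → [('Stat_SquareFeet', 'EU_Ops', 5000)]
```

Catching those rows before the load is exactly the sparsity you avoid by keeping supplemental drivers out of the core consolidation cube in the first place.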
Our Design Challenge
We faced a situation where certain data elements, while important, didn’t naturally fit into the FCC hierarchy. These were supplemental metrics and drivers that were useful for analysis but didn’t belong in the core consolidation structure. Initially, we considered shoehorning these members into the existing hierarchy, but this quickly proved problematic:
Adding non-consolidation data to FCC can introduce unnecessary complexity.
Sparse data intersections may slow down calculations and retrieval.
Mixing supplemental and core financial data risks confusion and misinterpretation.
Additional supplemental requirements might create what has been coined a “dumpster dimension.”
The Solution: A Supplemental Application
To maintain clarity and performance, I would argue that offloading these supplemental data elements into a separate Planning FreeForm application is a better move. In the on-premises days, we would spin up little analytic cubes all over the place to hold data that really didn’t make sense in a larger cube. I don’t see why we wouldn’t do something similar with EPM Enterprise Cloud customers as well. In my eyes the benefits are:
Preserve the integrity of the FCC hierarchy by keeping it focused on core financial data.
Optimize performance by reducing sparsity and irrelevant intersections.
Enable targeted analysis in the supplemental cube without compromising the main application.
Stitch the reporting together in Narrative Reporting and/or ad-hoc analysis with Smart View.
This approach gives the flexibility to design each cube for its specific purpose, while still allowing for integration where needed by pushing data through data maps or integrations.
Key Takeaways
Don’t force-fit data into hierarchies where it doesn’t belong.
Use supplemental applications to isolate non-core data.
Design with both performance and user experience in mind.
Interdimensional relevance should be a guiding principle in Essbase architecture.
When we respect the boundaries of dimensional logic, we can create cleaner, faster, and more maintainable solutions.
In the November 2025 update for Oracle EPM, we will see some additional changes to the Data Integration Actions menu. A new “Other” actions folder will be added with the Report Execution and System Maintenance Tasks options. This is another quality-of-life update that brings us one step closer to parity between Data Integration and Data Management. To put it plainly: if you’re not using the Data Integration UI in Oracle EPM, you should get familiar with it.
EDM didn’t have any new updates this month, but Oracle has a new video on Consolidation Requests. Consolidation requests allow multiple in-flight requests to be combined into a single request for approval by a change governance committee. Requests can only be consolidated from the same view in EDM, and consolidating them can simplify approval by a change board. If part of a consolidated request needs to be pushed back, the consolidation request can be discarded. For more information, see the video here: https://youtu.be/vujzO5bQsi4
In Dr. Spencer Johnson’s 1998 bestseller, Who Moved My Cheese?, four characters navigate a maze in search of cheese. The book’s main themes are that change is inevitable and that we must anticipate, adapt, and embrace it to be successful in work and life.
Fast forward to the 25.10 Oracle EPM update, and we find ourselves in a similar maze. This time, the “cheese” is the Data Integration Actions menu items. And yes, they are about to be moved.
The Data Integration home page has undergone a subtle but powerful transformation. The familiar Actions menu has been reorganized into two new dropdowns: Setup and Configure.
The Setup menu is where you define the structure of your data environment. Think of it as mapping your maze before you start running:
Applications: Define your integration targets and sources.
Locations: Create and maintain locations for mapping.
Period Mapping: Align time-based data across systems.
Category Mapping: Manage application scenarios.
Query: Set up and modify data source queries.
Once your maze is mapped, it’s time to optimize your tools and security. This is where the Configure menu comes in:
System Settings: Control the behavior of your integration engine.
Security Settings: Safeguard access and permissions.
Agent: Manage the EPM Integration Agent settings.
Download Agent: Get the EPM Integration Agent software.
Just like the characters in Who Moved My Cheese? learned to adapt to their new reality, this menu redesign helps users adapt to their data environment more efficiently. By grouping actions based on context, users can find what they need faster and act with greater confidence. Those of us who have already switched to the Data Integration UI will take a little while to get used to it, but I think this is a small quality-of-life change that we will come to appreciate.
This update applies across business processes including Account Reconciliation, Planning, Tax Reporting, and more.
In the end, the cheese will always move. The question is: will you move with it?
By now, most of the world knows what EDM is and what it does. Even though EDM has been out for several years at this point, I believe its strategic potential is being overlooked. Too often, organizations treat EDM as a tactical metadata tool tied solely to their EPM applications, rather than recognizing it as a foundational investment in enterprise-wide data governance. We play games with EPM Enterprise licenses to try to keep the node counts under 5,000, but that really undervalues the impact EDM could have.
It has been designed to be much more than a connector; it’s a platform for harmonizing metadata across business domains, enabling alignment, auditability, and agility. When deployed thoughtfully, EDM becomes a metadata authority that can support Finance, HR, Supply Chain, and beyond. But that vision only materializes when companies stop thinking of EDM as a bolt-on and start treating it as a core pillar of their enterprise architecture.
EDM can be leveraged not just as a catalog of data elements, but as a strategic asset for downstream reporting and analysis tools. How you deploy EDM can dramatically shape its impact. This post explores three strategic deployment models for EDM:
As the originator of new metadata records
As a metadata steward downstream from source systems
As a metadata harmonizer across different business units
EDM as the Primary Metadata Creator
In this model, EDM is the primary source for creating new metadata records such as cost centers, products, legal entities, or reporting hierarchies. Business users or administrators initiate requests directly in EDM, and once approved, metadata is pushed downstream to consuming systems. This could be called “hub and spoke” where EDM is the controller for all metadata.
This deployment scenario is ideal for:
Organizations with centralized governance
Enterprises looking to remove “shadow” systems and rogue metadata creation
Use cases requiring strict audit trails and approval workflows
EDM’s request workflow ensures intentional and controlled metadata changes, aligning with organizational policies. Approval processes with multiple stages can reinforce robust data governance, maintaining consistency and compliance across systems. Additionally, EDM’s REST APIs can enable automated integration with downstream applications.
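As a rough illustration of what that automation could look like, the sketch below builds (but does not send) an authenticated REST call. The endpoint path, payload fields, pod URL, and credentials are all assumptions for illustration; check the EDM REST API reference for the real endpoints and request shapes before wiring anything up.

```python
import base64
import json
import urllib.request

def build_job_request(base_url, username, password, payload):
    """Build (but don't send) an authenticated POST for an EDM job.

    The path below is illustrative only; consult the EDM REST API
    reference for the actual endpoints and payload fields.
    """
    url = f"{base_url}/epm/rest/v1/jobs"  # hypothetical path
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_job_request(
    "https://example.oraclecloud.com",  # hypothetical pod URL
    "svc_edm", "secret",
    {"jobType": "EXPORT_DIMENSION", "dimension": "Account"},  # made-up payload
)
# urllib.request.urlopen(req) would submit it; here we just inspect it.
print(req.full_url)
```

The point isn't the specific call; it's that once metadata changes are approved in EDM, a script like this can push them downstream on a schedule instead of a human clicking through exports.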
EDM as a Metadata Steward
In this deployment scenario, EDM receives metadata from upstream systems (such as CRM, ERP, or MDM platforms), and acts as a governance checkpoint. It matches incoming records to existing nodes, merges duplicates, and applies survivorship rules to determine which properties to retain.
Ideal for:
Enterprises with decentralized metadata creation
Organizations integrating multiple source systems
M&A scenarios requiring metadata harmonization
EDM has key features that can help with these scenarios, like the Matching Workbench for deduplication along with merge logic and survivorship rules. Matching and deduplication rely on a logical tag for each node in EDM called a data source. The data source provides a foundation for matching and deduplication rules by defining the scope of metadata to be analyzed.
The key benefit to this method is to allow existing upstream applications to continue to own key business dimensions, but provide a central hub to consolidate and distribute those dimensions to downstream applications.
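To illustrate the survivorship idea in miniature (this is a conceptual sketch, not EDM's actual matching engine; the source names, priorities, and records are all made up): when two sources describe the same node, each property survives from the highest-priority source that supplies it.

```python
# Conceptual sketch of merge-with-survivorship logic. Not EDM's
# implementation; sources, priorities, and records are hypothetical.

SOURCE_PRIORITY = {"ERP": 1, "CRM": 2}  # lower number wins

def merge_records(records):
    """Merge records keyed by node name, applying survivorship by source."""
    merged = {}
    for rec in sorted(records, key=lambda r: SOURCE_PRIORITY[r["source"]]):
        node = merged.setdefault(rec["name"], {})
        for prop, value in rec["properties"].items():
            node.setdefault(prop, value)  # first (highest-priority) wins
    return merged

records = [
    {"name": "CC_1000", "source": "CRM",
     "properties": {"Description": "Sales - West", "Owner": "jdoe"}},
    {"name": "CC_1000", "source": "ERP",
     "properties": {"Description": "Sales West Region"}},
]
print(merge_records(records))
# ERP's description survives; CRM fills in the Owner it alone provides.
```

That is the essence of survivorship: the winning source doesn't blank out properties it doesn't carry, it only wins the ones it does.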
EDM as a Federated Metadata Hub
In this hybrid model, EDM acts as a metadata exchange platform across multiple domains like Finance, HR, or Supply Chain, each with its own governance model. EDM doesn’t own all metadata but facilitates alignment and synchronization.
This deployment method is ideal for:
Large enterprises with domain-specific governance
Multi-cloud or multi-ERP environments
Organizations with regional autonomy but global reporting needs
EDM supports domain-specific modeling for Finance, HR, Supply Chain, and beyond, allowing each unit to maintain its own governance structure while participating in enterprise-wide metadata harmonization. Features like subscription requests facilitate cross-domain alignment by automatically propagating approved changes to related hierarchies, ensuring consistency without manual intervention. EDM’s security model and approval workflows help decentralized teams manage metadata collaboratively while preserving accountability.
This model enables business units to continue operating with autonomy while providing governance, which is ideal for balancing agility and control.
Which deployment strategy you choose should take into consideration your organization’s maturity in data governance. Do your end users know enough about the business to submit their own requests directly into EDM? Is there a deliberate approval workflow for changes to your chart of accounts? What are your compliance requirements and audit needs around metadata changes? What is the priority for your business (e.g., speed vs. control)?
Oracle EDM isn’t just a bolt-on EPM module; it’s a strategic enabler of enterprise agility, compliance, and insight. The key is choosing the correct deployment scenario that matches your business needs. Those business needs don’t stop at your Planning or Consolidation applications. That’s why EDM should be considered as a tool to be used across the enterprise. There is a reason it’s called Enterprise Data Management after all.
In the September 2025 update (25.09), Oracle is adding a Request Monitoring Dashboard to EDM! Designed to enhance visibility and control over change requests, this dashboard empowers administrators, data stewards, and integration leads to streamline workflows and improve data quality across the enterprise.
The Request Monitoring Dashboard is a centralized interface that allows users to track and analyze open requests throughout their lifecycle. Whether you’re managing metadata changes, hierarchy updates, or complex multi-domain governance processes, this dashboard offers real-time insights into request activity, aging, bottlenecks, and contributor performance.
Key Features:
Lifecycle Tracking: Monitor requests by type, priority, workflow stage, and assigned contributors.
Custom Filters: Apply and save filters to focus on specific request attributes.
Dashboards:
Open Requests: View volume and distribution.
Active Owners: Identify who’s driving change.
Aging and Exceptions: Spot delays and anomalies.
Drilldowns & Drill-Across: Dive deep into request details or pivot to related metrics.
Export Capability: Download request activity for offline analysis or stakeholder sharing.
Sample Request Monitoring Dashboard, image courtesy of Oracle.
Why It Matters:
Managing change requests efficiently is critical to maintaining data integrity and operational agility. The dashboard helps teams:
Reduce request cycle time
Identify and resolve workflow bottlenecks
Improve exception handling
Enhance collaboration across business units
The Request Monitoring Dashboard isn’t just a new feature; it’s a strategic tool for proactive governance. By surfacing actionable insights and enabling smarter oversight, the Oracle EDM dev team continues to raise the bar for enterprise data management.
The EPM updates for 25.08 were released and we have an update to the TLS changes. Oracle has decided to continue supporting TLS 1.2 indefinitely, but only with ciphers deemed to be strong. The extension of support for TLS 1.2 gives Oracle and its customers a welcome bit of flexibility. Oracle has also released a document on how to test with the latest TLS ciphers here: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/tsepm/cloud_epm_test_tls_ciphers.html
EPM Automate is switching from Java 8 to Java 17. Windows users, rejoice! With the EPM Automate “update” command, EPM Automate will download and install the Java 17 runtime environment as part of the update process. Linux/UNIX and Mac users will need to update their user-installed Java version to continue using EPM Automate 25.08 and later. Java 8 was released over ten years ago, so it’s good to see a newer version being implemented. Linux/UNIX/macOS users can go here to find out how to update their Java version: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/cepma/installing_epm_automate_linux_unix.html
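For those on Linux/UNIX or macOS, a quick sanity check of the installed Java major version before upgrading can save a failed run. This little sketch (my own, not part of EPM Automate) handles both the old `1.x` style and the modern numbering:

```python
# Quick sanity check (a sketch, not part of EPM Automate) that the
# user-installed Java meets the new minimum before upgrading to 25.08+.

MIN_MAJOR = 17

def java_major(version_string):
    """Extract the major version from strings like '17.0.9' or '1.8.0_361'."""
    parts = version_string.split(".")
    # Pre-Java 9 versions report as 1.x (e.g. 1.8.0 is Java 8)
    return int(parts[1]) if parts[0] == "1" else int(parts[0])

for v in ("1.8.0_361", "11.0.21", "17.0.9"):
    ok = java_major(v) >= MIN_MAJOR
    print(f"Java {v}: {'OK' if ok else 'needs upgrade'}")
```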
Account Reconciliation Cloud is getting Data Integration Pipelines with this update. Pipelines will be available on ARC pods with the following job types:
Create Reconciliation
Generate Report for Account Reconciliation
Import Attribute Values
Import Balances
Import Pre-Mapped Balances
Import Pre-Mapped Transactions
Import Rates
Run Auto Match
Run Auto Alert
Set Period Status
This should mean that we can define ARC jobs in any EPM Data Integration Pipeline, even across pods (similar to how we can run EDM exports across pods with Pipelines).
Before we go, I just wanted to take a moment to celebrate the deprecation of the Data Management/Data Integration job schedules. If anyone out there has braved the pain of that scheduler, those scheduled jobs need to be converted to the EPM Platform Job Scheduler before the 25.09 update. There is a System Maintenance Task job in Data Management called “Migrate Schedules to Platform Job Scheduler” to help with that effort. Unfortunately, the EPM Job Scheduler isn’t available in PCM and ARC. If you use either of these business processes, scheduling externally using EPM Automate or REST calls is probably your best bet (and likely what most other customers are doing anyway).
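For those PCM and ARC shops, an external scheduler wrapper can be as simple as the sketch below. The `login` and `logout` commands are real EPM Automate commands; the job command, service account, password file, and URL are placeholders to swap for your own (check the EPM Automate command reference for the commands your business process supports).

```python
# Sketch of an external scheduler wrapper around EPM Automate for a
# business process without the platform Job Scheduler (e.g. PCM or ARC).
# "login" and "logout" are real commands; the job command, credentials,
# and URL below are placeholders to adapt.
import subprocess

EPMAUTOMATE = "epmautomate"  # assumes the CLI is on PATH

def run_steps(steps, runner=subprocess.run):
    """Run EPM Automate commands in order, stopping on the first failure."""
    for step in steps:
        result = runner([EPMAUTOMATE, *step])
        if result.returncode != 0:
            raise RuntimeError(f"step failed: {' '.join(step)}")

steps = [
    ["login", "svc_user", "password_file.epw", "https://example.oraclecloud.com"],
    ["runAutoMatch"],  # placeholder job; check the EPM Automate reference
    ["logout"],
]
# run_steps(steps)  # uncomment to execute against a real pod
```

Drop a script like this into cron or your enterprise scheduler and you get the same effect the platform Job Scheduler provides elsewhere, with the bonus that the schedule lives alongside your other automations.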
I have accepted the fact that I am getting old (or is it “more experienced”?). At this point, I have been working on and around Oracle EPM products for almost twenty years. In the early 2000s, I was getting data from Hyperion Enterprise before we installed Essbase to do reporting. I dove into Essbase and began learning as much as I could. Once I reached a point where I felt I had done everything I could at my position as an administrator, I moved into consulting in 2010 to continue developing myself and learning more. As part of that, I took some training on OBIEE to help support customers with BI installs.
My point is that for 15+ years (maybe 20), EPM and BI users have heard about the promise of self-service BI: empowering users to analyze and visualize data independently. I remember hearing this in my OBIEE training, and it was exciting to think about users digging into the data to answer business questions.
The thing with BI products is that there has to be someone technical to connect all of the data sources on the back end. It takes a special someone to figure out the right strings to pull to get all of those data sources normalized and linked up so that end users can do their reporting and analysis. It may be my bias as an implementer, but I don’t know how far users go past the initial dashboards that get created. I certainly hope it’s more common than I have seen.
As I sat in the Kscope Sunday Symposium presentations by Oracle product management and heard about all of the AI features coming to Oracle EPM, it dawned on me that all of the amazing things that I imagined 15 years ago will soon be possible and more accessible than ever. Users will soon be able to chat with the AI built into Oracle EPM products and get visualizations fed back to them. To recycle an old sales pitch, analysis at the speed of thought is about to be real.
I am looking forward to seeing the developments in Oracle EPM products and I’m excited to see what our customers do with them. You can find the current AI features available in Oracle EPM products here: https://docs.oracle.com/en/cloud/saas/fusion-ai/aiafl/epm-features-with-ai.html. That list is about to get much longer. These are exciting times we live in.
Oracle updates for the 25.07 patch just recently came out and there are a couple of great features for Data Integration in the mix this month.
First, a new application role called “Data Integration – Administrator” is rolling out. This access role will grant a user access to all activities in Data Integration. This means a user will be able to create/manage integrations, execute and monitor pipelines, and perform data and metadata extraction and transformation from on-premises sources using the EPM Integration Agent. The new role is a fantastic addition that allows a user to manage your integrations without giving them Service Administrator permissions on the rest of the application. This applies to pretty much all EPM business processes, including ARC, EPCM, FCC, Planning, PCM, and Tax Reporting.
The second update is the addition of the Smart Split feature in Pipeline. Basically, Essbase has a governor and it gets mad when you try to push too much data into it. The solution up to now has been to split a large-volume data integration into multiple smaller slices of data to get around the limit. Going forward, we can set up a large integration like normal with one big data load rule. Then, in Pipeline we can add an “Integration with Smart Split” job, which will split the files for us based on the Split Dimension specified. This allows the system to stay under the governor by submitting smaller data slices without requiring the creation of multiple integrations. Smart Split will be available in EPCM, FCC, Planning, and Tax Reporting.
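To make the idea concrete, here is a hedged sketch of what splitting one large load file by a single dimension looks like conceptually. The column names and slice logic are invented for illustration; how Smart Split actually partitions the data inside Pipeline is Oracle's business.

```python
# Conceptual illustration of Smart Split: break one large data load into
# smaller slices keyed on a split dimension, so each slice stays under the
# Essbase data-load governor. Column names are invented for illustration.
from collections import defaultdict


def split_by_dimension(rows, split_dim):
    """Group data rows into one slice per member of the split dimension."""
    slices = defaultdict(list)
    for row in rows:
        slices[row[split_dim]].append(row)
    return dict(slices)


if __name__ == "__main__":
    # Tiny stand-in for a large load file; imagine millions of rows.
    data = [
        {"Entity": "US", "Account": "4000", "Amount": "100"},
        {"Entity": "US", "Account": "5000", "Amount": "250"},
        {"Entity": "UK", "Account": "4000", "Amount": "75"},
    ]
    for member, rows in split_by_dimension(data, "Entity").items():
        print(member, len(rows))  # each slice would be loaded separately
```

The practical win is exactly what the update describes: one data load rule to maintain, with the slicing handled at run time instead of being baked into a pile of near-duplicate integrations.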