Challenges of SAP Integrated Business Planning in the Cloud

Posted by Peter Clancy on 24-Nov-2016 14:27:47


Implementing a cloud solution today, even in a cloud-aware organisation, is still a daunting task. There are many requirements to satisfy to ensure that a product is safe and secure to use, as well as to ensure it delivers the business benefits that are required.

This blog focusses on the challenges of SAP’s Integrated Business Planning (IBP) solution, a cloud-based planning application, and offers pointers for developing your own cloud migration strategy. The IT industry is still seeing strong growth in cloud-based solutions across the board, so it is important to consider the challenges a company would face in implementing and using IBP operationally – this blog identifies the six top challenges and looks at how IBP, or SAP, addresses them.

IBP is currently only available in the cloud, and I am not aware of any plans to make the application available on-premise. To implement IBP, an organisation first needs to be ready for cloud-based technology – having a solid cloud strategy and a consistent, forward-looking architecture to ensure that any products chosen now remain fit for purpose in the future. Having this strategy in place ensures that you address any challenges arising from cloud-based applications in an organised and structured way, keeping alignment between organisational goals and the reality of any implementation.

There are several challenges an organisation implementing IBP will face because it is delivered in the cloud, and they can be grouped under the following headings, each of which I will discuss below (in no order of priority):

  • Software Updates
  • Loss of Control of Software and Services
  • Security
  • Master Data Replication and the Batch Schedule
  • Integration
  • Lack of Customisation

Software Updates

A primary challenge that an organisation must accept with cloud-based software is also one of the benefits of being on a single platform within the cloud – enforced application updates. When planning for software updates in SAP IBP, there are two cloud products with on-premise agents involved (HCP integration service and SDI, described in a later section), local Excel Add-In updates, and anything in a company’s current on-premise landscape that is integrated.

Figure 1: An example software landscape

From Figure 1 above, you can see that there are several components of software involved in an SAP IBP landscape.

For SAP IBP itself, updates are quarterly – resulting in quite frequent impacts on the landscape. This can affect HCP integration service interfaces, the HCP integration service Agent and local Excel Add-In updates on distributed devices.

SAP IBP Customers are notified by SAP when an update is about to occur and given a window of a few weeks in which to opt for a date – so there is some control over when updates can occur for a customer.

Our experience has shown that after each update, a regression test is required to confirm that all parts of the application are still functioning – so developing test scripts, and ideally an automated testing tool and environment, is recommended to highlight any issues and allow SAP to resolve them as soon as they are identified. A minimal example of a post-update smoke test is sketched below.
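As a starting point, even a lightweight script run against a non-production tenant straight after an upgrade can catch broken interfaces early. The sketch below is illustrative only: the tenant URL, OData service paths and credentials are hypothetical placeholders, and a real regression suite would also need to cover the Excel Add-In and the key planning views.

```python
# Minimal post-update smoke test sketch (illustrative only).
# The host, service paths and credentials below are hypothetical placeholders;
# substitute the services and key figures used in your own planning areas.
import requests

TENANT = "https://my-ibp-tenant.example.com"           # hypothetical tenant URL
CHECKS = [
    "/sap/opu/odata/EXAMPLE_PLANNING_SRV/$metadata",    # hypothetical service
    "/sap/opu/odata/EXAMPLE_MASTERDATA_SRV/$metadata",  # hypothetical service
]

def run_smoke_tests(session: requests.Session) -> bool:
    """Return True when every checked service still answers after the upgrade."""
    ok = True
    for path in CHECKS:
        resp = session.get(TENANT + path, timeout=30)
        passed = resp.status_code == 200
        print(f"{'PASS' if passed else 'FAIL'} {path} -> HTTP {resp.status_code}")
        ok = ok and passed
    return ok

if __name__ == "__main__":
    with requests.Session() as s:
        s.auth = ("SMOKE_TEST_USER", "********")        # placeholder credentials
        raise SystemExit(0 if run_smoke_tests(s) else 1)
```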

A client has the option of a two-tier landscape (Development & QA on a single instance, with a separate Production instance) or a more traditional three-tier landscape (Separate Development, QA & Production instances). Any updates to these environments are staggered by SAP, so that a customer has time to perform testing and work with SAP in case of any issues. If a client does multi-tracking with project releases and support tracks, then this can further complicate the landscape.

It is worth mentioning at this stage that SAP IBP is a single-tenant model per productive instance, allowing complete data isolation from other clients that SAP may have in any given instance.

Attention should be given to the integration points to make sure that HCP integration service interfaces still function; in the case of an HCP integration service update, this may also require an agent update on-premise. These updates are simple, and to date have not caused any impact or required a major outage – however, a design should be put in place to cater for this single point of failure.

Note also that HCP integration service updates will be on a different cycle (as will all other software in the landscape), so outage periods should be synchronised with SAP IBP updates to reduce the impact on business services.

Automated deployment of the Excel Add-In should be catered for after it has been tested – this testing of the Excel Add-In should cover all the delivery channels a company has (Windows 7, 8 or 10, and tablet or mobile if in use). The latest release of the Excel Add-In is usually backwards compatible with a few previous releases, although you may miss out on new functionality if you don’t roll it out. A simple per-client version check, such as the one sketched below, can help verify a rollout.
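A minimal sketch of such a check, assuming your deployment tool records the installed add-in version under a registry key. The key, value name and minimum version here are hypothetical placeholders, not the add-in's actual install location, and the script is Windows-only.

```python
# Sketch of a per-client Excel Add-In version check (Windows only, illustrative).
# The registry path and value name are hypothetical placeholders - point them at
# wherever your deployment tool records the installed add-in version.
import sys
import winreg

ADDIN_KEY = r"SOFTWARE\ExampleCompany\IBPExcelAddIn"   # hypothetical key
MINIMUM_VERSION = (6, 2, 0)                            # hypothetical baseline

def installed_version() -> tuple:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADDIN_KEY) as key:
        value, _ = winreg.QueryValueEx(key, "Version")  # e.g. "6.2.1"
        return tuple(int(part) for part in value.split("."))

if __name__ == "__main__":
    try:
        version = installed_version()
    except FileNotFoundError:
        print("Add-In not installed")
        sys.exit(1)
    print("Installed:", ".".join(map(str, version)))
    sys.exit(0 if version >= MINIMUM_VERSION else 1)
```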

Loss of Control of Software and Services

Organisations have grown and developed their own competencies in IT technology. Some organisations are better than others, but all have capability whether they have built it in-house, relied upon outsourcing arrangements or have partnered with third parties to maintain and deliver their systems.

It is this capability to maintain their own systems that gives them a competitive advantage and high levels of service, with systems that are completely under their control.

When considering cloud applications, this model changes, and consequently there can be significant internal hurdles to running business-critical applications in the cloud. Architectural hurdles such as security, data privacy and data location concerns mix with a loss of control of the service, and a fear that the service will not be as responsive as self-managed custom applications and support.

A significant challenge for an organisation is handing over the IT function for the software to a third party and the subsequent loss of control of software and services.

This can become even more complex in a hybrid environment that is part cloud and part on-premise, where lines of responsibility and operational factors can become blurred around communications outages, response time issues, or integration problem solving.

Whilst cloud provisioning reduces overall IT requirements and the need for specialised support staff within the company (a significant benefit), it comes at the cost of losing direct control of the software. Companies are being asked to trust the service provider to run certain parts of their business, and this can cause concerns over business continuity and IT service levels.

A sound strategy for software updates, together with a solid testing strategy covering integration, the browser applications, the Excel Add-In and any extensions made, will help alleviate this issue and bring change into the business in a stable and sustainable fashion.

Security

Security should be a top concern when considering any cloud-based application. Broken down, security concerns consist of protection of:

  • Authentication and authorisation
  • Data in motion
  • Data at rest

Users who require access to the business data must authenticate themselves and their identity must be verified by SAP Cloud Identity or an on-premise identity provider, depending on the actual system landscape.

This area is catered for in SAP IBP by assigning Business Roles to a Business User. This is relatively standard practice now; the roles grant access to the Fiori tiles and functions.

Security of data in motion – or integration data – should be handled by the technology. There are three ways to integrate with IBP:

  • File based upload / download (test only)
  • HCP integration service (formerly known as HCI Data Service)
  • SAP HANA Smart Data Integration

SAP IBP handles this well using standard communication mechanisms – all forms of connectivity for data in motion use HTTPS encryption with a standard algorithm and security provider. A quick way to confirm this from your own network is sketched below.
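As a simple illustration, the following sketch connects to the tenant endpoint, verifies the certificate against the system trust store and prints the negotiated TLS protocol and cipher. The hostname is a hypothetical placeholder.

```python
# Quick sketch to confirm the tenant endpoint negotiates TLS and to inspect the
# certificate presented (the hostname is a hypothetical placeholder).
import socket
import ssl

HOST = "my-ibp-tenant.example.com"   # hypothetical tenant hostname

context = ssl.create_default_context()           # verifies cert against system CAs
with socket.create_connection((HOST, 443), timeout=10) as raw:
    with context.wrap_socket(raw, server_hostname=HOST) as tls:
        print("Protocol:", tls.version())        # e.g. TLSv1.2
        print("Cipher:  ", tls.cipher())
        cert = tls.getpeercert()
        print("Issuer:  ", dict(item[0] for item in cert["issuer"]))
        print("Expires: ", cert["notAfter"])
```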

A simplified view of security of data at rest covers two aspects: ensuring data is secure in the cloud provider’s data centre (DC), and securing any local data.

For DC protection, SAP IBP benefits greatly from being an SAP product. Its hardware sits in well-established DCs that are now tuned for cloud provisioning. In terms of what you would expect from a DC today, SAP IBP has all the main features, complying with ISO 27001 and multiple other business continuity requirements: scalability through web dispatcher farms, abstraction of the network topology from the outside world, a redundant storage system that performs regular backups, multiple firewalls and zones that divide the network into protected segments, and a lot more.

Safe to say the data is isolated and protected in the DCs, with each client having their own SAP IBP virtual instance.

Somewhat newer to the SAP world is the decentralised, or local, data that users will store on their desktops or mobile devices. The model of SAP IBP is to allow planning data to be downloaded and manipulated in an Excel spreadsheet. This raises several challenges for organisations in governing how this decentralised / local data is protected, and how to handle any issues that arise from it.

Guidelines from SAP specifically state that local Excel sheets containing business planning data, and personal data in general, should be secured by the security mechanisms of Microsoft Office (for example password protection) and the client operating system (such as hard drive encryption). A simple compliance sweep of a shared folder is sketched below.
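A rough heuristic sketch of such a sweep, under the assumption that workbooks saved with Office's "Encrypt with Password" option are stored as OLE compound files rather than plain ZIP archives, so an .xlsx that still opens as a ZIP has no file-level encryption. The share path is a hypothetical placeholder, and sheet protection without encryption is not detected.

```python
# Rough compliance sweep (illustrative): flag planning workbooks in a shared
# folder that are NOT encrypted. Workbooks saved with "Encrypt with Password"
# are stored as OLE compound files rather than ZIP archives, so a plain .xlsx
# that still opens as a ZIP has no file-level encryption. Sheet protection
# without encryption is not detected by this heuristic.
import zipfile
from pathlib import Path

SHARE = Path(r"\\fileserver\planning")   # hypothetical shared folder

for workbook in SHARE.rglob("*.xlsx"):
    if zipfile.is_zipfile(workbook):
        print(f"NOT ENCRYPTED: {workbook}")
    else:
        print(f"encrypted (or not a plain xlsx): {workbook}")
```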

Using Excel within the organisation for this business-critical work also introduces some virus risk. Because these files will be shared throughout the organisation (for example in the S&OP process), the chance of an infection spreading increases – however, with well-organised virus protection this can be countered.

Master Data Replication and the Batch Schedule

Master data and batch jobs have been a challenging area in every project I have been involved in with global companies. Solutions from all parts of the business need to work together, and this involves selecting large sets of data, converting that data and moving it between systems with the associated processing lead times – a batch schedule.

In SAP Advanced Planning & Optimisation (APO), batch job schedules of 8-16 hours are not uncommon for optimisations – and this is one of the things that SAP IBP seeks to address; being able to simulate different optimisations in SAP IBP on HANA is a huge step forward. For organisations using the Optimiser, the supply Optimiser currently used in SAP IBP is the same one that APO has and works in the same way. It will therefore have the same runtimes: if your organisation currently has a 10-hour SNP optimisation, then like for like the solve time should be the same in SAP IBP, with significant savings in the read and write steps because SAP IBP runs on HANA.

SAP IBP will be retrieving data from legacy systems that are not in-memory – these systems will be the bottleneck in an SAP IBP implementation. In an organisation’s ecosystem, applications need to work together, so retrieving and sending master and transactional data and making updates back to execution systems will still need to occur – SAP IBP, and integration to and from it, will need to fit into the existing batch schedule.

Complications arise because SAP IBP does not currently raise events, so batch job integration is still an immature area.

An organisation will need to determine how SAP IBP fits into its operational landscape, build the batch schedule into the plans of the implementation project and adjust its processes accordingly. This design is best done up front to reduce issues around go-live; a simple way of checking whether the integration chain fits the available window is sketched below.
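The following conceptual sketch walks a chain of integration and planning jobs with estimated runtimes and dependencies, and reports whether the end-to-end run fits a nightly window. All job names, durations and dependencies are hypothetical placeholders for your own schedule.

```python
# Conceptual sketch of fitting SAP IBP integration into an existing batch
# schedule: jobs, estimated durations (minutes) and dependencies are all
# hypothetical. The earliest finish of the chain shows whether the nightly
# window is long enough. Because SAP IBP does not raise events, downstream
# steps in practice poll for completion rather than being triggered.
from functools import lru_cache

DURATION = {                     # hypothetical runtimes in minutes
    "extract_ecc_masterdata": 45,
    "extract_ecc_transactions": 90,
    "hcp_is_load_to_ibp": 60,
    "ibp_supply_optimiser": 240,
    "hcp_is_results_to_ecc": 30,
}
DEPENDS_ON = {                   # hypothetical dependencies
    "hcp_is_load_to_ibp": ["extract_ecc_masterdata", "extract_ecc_transactions"],
    "ibp_supply_optimiser": ["hcp_is_load_to_ibp"],
    "hcp_is_results_to_ecc": ["ibp_supply_optimiser"],
}

@lru_cache(maxsize=None)
def earliest_finish(job: str) -> int:
    """Minutes after window start at which this job can be complete."""
    start = max((earliest_finish(dep) for dep in DEPENDS_ON.get(job, [])), default=0)
    return start + DURATION[job]

window_minutes = 8 * 60          # hypothetical 8-hour nightly window
total = max(earliest_finish(job) for job in DURATION)
print(f"End-to-end runtime: {total} min; window: {window_minutes} min")
print("Fits the window" if total <= window_minutes else "Does NOT fit the window")
```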

Integration

There are three integration technologies in place for SAP IBP. For Response, we use SDI (HANA Smart Data Integration), and for all other IBP applications we use HANA Cloud Platform (HCP) integration service (formerly known as HANA Cloud Integration, Data Services). See Figure 2 for a diagrammatic view of the components of SAP IBP, the basic data flows and the cloud touchpoints with on-premise systems.


Figure 2: SAP IBP Integration Components

Outside of these two integration technologies we can use a .csv file for upload and download – however for the purposes of this section we are not covering this as it’s not intended for productive use.

"Standard" SAP IBP Integration: HCP integration 

HCP integration service is the standard tool for integrating all data with SAP IBP, except for Response. For Control Tower, S&OP, Demand, Inventory and Supply, there is no other supported productive way of doing this. Whilst not specifically part of SAP IBP, HCP integration service is the main productive way of getting information into and out of SAP IBP, so it inherently has to be reviewed as part of the complete SAP IBP toolset.

HCP integration service is a middleware tool for on-premise Extract, Transform, Load (ETL) style integrations. It enables simplified connectivity and mapping from on-premise SAP systems to SAP IBP, using standard extraction mechanisms, with mapping between source and target data structures. It simplifies connectivity through an on-premise agent that handles connections into the cloud.

The core of this technology is built on what I would call a lightweight client of SAP Business Objects Data Services (BODS) - a stable and widely used ETL tool that SAP customers have been using to integrate volume based data into BW and other SAP products for some years now.

In terms of integration, SAP IBP offers "out of the box" templates in what is now called HCP integration service. These templates are designed to accelerate implementation as part of SAP’s Rapid Deployment Solution (SAP RDS).

The main challenges I have seen so far with HCP integration service (Data Services) are:

HCP integration service does not enable any re-use of what is built. It does not provide for the development of a library of routines and functions that can be used across several interfaces.

The SAP-provided templates are often not complete, so they require enhancement, and depending on the planning area requirements this can be substantial.

Complex filtering and routing of interface data require careful thought, as the tool's capabilities in this area are limited. Filtering logic is built using custom code and cannot be shared, and the implementation of filtering is quite poor for any sort of complex requirement (say a filtering cockpit) – see the sketch after this list.

There is no clear message around how HCP integration service sits with the HCP integration service Process Orchestration (PO) toolset.
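On the filtering point above, the sketch below is a conceptual illustration (plain Python, not HCP integration service code) of the kind of table-driven, reusable "filtering cockpit" I have in mind: a centrally maintained rule table compiled into a predicate that could be shared across interfaces. The field names and rules are hypothetical; in HCP integration service the equivalent logic tends to end up as per-interface custom code.

```python
# Conceptual sketch of a table-driven "filtering cockpit" - the kind of reusable,
# centrally maintained filtering that is hard to build in HCP integration service.
# Field names and rules are hypothetical.
from typing import Callable, Dict, Iterable, List

Record = Dict[str, str]

# Rule table: field -> allowed values (in practice this would be maintained by
# the business in a configuration table rather than in code).
FILTER_RULES: Dict[str, List[str]] = {
    "PLANT": ["DE01", "FR02"],          # hypothetical plant codes
    "PRODUCT_TYPE": ["FERT"],           # hypothetical: finished goods only
}

def build_filter(rules: Dict[str, List[str]]) -> Callable[[Record], bool]:
    """Compile the rule table into a predicate reusable across interfaces."""
    def accept(record: Record) -> bool:
        return all(record.get(field) in allowed for field, allowed in rules.items())
    return accept

def apply_filter(records: Iterable[Record], rules: Dict[str, List[str]]) -> List[Record]:
    accept = build_filter(rules)
    return [r for r in records if accept(r)]

# Example usage with hypothetical extract rows
rows = [
    {"PLANT": "DE01", "PRODUCT_TYPE": "FERT", "PRODUCT": "P-100"},
    {"PLANT": "US03", "PRODUCT_TYPE": "FERT", "PRODUCT": "P-200"},
]
print(apply_filter(rows, FILTER_RULES))   # only the DE01 row passes
```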

Perhaps SDI will become the integration toolset of the future – for me, why have three tools when you can have one?

For me there is another concern in this area – HCP integration service covers quite a breadth of technology, integrating through its own set of adapters: SOAP, OData, RFC and file interfaces through to custom adapters. HCP integration service, Data Services and Process Orchestration overlap to quite a degree, and there is far more documentation and help available for Process Orchestration than for its Data Services counterpart – in fact, as of today, Data Services is not even mentioned in HCP integration service.

This tool will more than likely be merged into HCP integration service Process Orchestration, and in the future this could lead to some rework, or at least a porting of interfaces. Having developed on both, I can say that from a development paradigm HCI PI is the better tool, with more capability and features to enable true development (re-use, for starters).

On the plus side, and quite a major advantage, HCP integration service simplifies connectivity substantially. A company must install what’s called the Agent inside its DMZ (just behind the firewall), and the Agent handles all communication over standard ports. There is no need for a company to handle reverse proxies or set up complex networking rules to ensure that it is protected – the tool communicates using standard internet protocols and encrypts data across the line over HTTPS.

HANA Smart Data Integration: Response 

HANA Smart Data Integration (SDI) is the tool used by SAP for real-time, or near-real-time, updates with SAP IBP. SDI is capable of batch replication as well, although as far as we are currently aware there are no plans to replace HCP integration service with SDI.

Currently with SDI and the SAP IBP Response architecture, integration is via batch output to a file and is not real-time. This is being addressed in future releases, although it’s not clear at this stage when this will be.

Unlike the other existing modules in SAP IBP, which use time series to store planning inputs and results, the response part of Response and Supply uses data structures that model orders. Whereas the other modules are focussed on medium- and long-term strategic and tactical planning, Response is an operational tool that must consider existing individual sales, production and purchase orders and generate new planned orders. To ensure consistency with the backend execution system, it also has to replicate master data such as materials, customers, plants, work centres, bills of material and routings. Because planned orders are being produced, data is required in near real time from ECC, and accuracy of data means accuracy of orders – so it is important to have the latest information. The sketch below contrasts the two data shapes.
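To make the distinction concrete, here is a conceptual illustration (not SAP code) of a bucketed time-series key-figure cell, as used by the other IBP applications, against an order-style record of the kind Response has to model. All field names are hypothetical.

```python
# Conceptual illustration (not SAP code) of the two data shapes discussed above:
# a time-series key-figure cell versus an order-style record used by Response.
# Field names are hypothetical.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class KeyFigureCell:
    """One bucketed planning value, e.g. consensus demand for a product/week."""
    key_figure: str          # e.g. "CONSENSUSDEMAND"
    product: str
    location: str
    period_start: date       # weekly or monthly bucket
    quantity: float

@dataclass
class PlannedOrder:
    """One discrete order, as Response has to model it."""
    order_number: str
    product: str
    source_location: str
    destination_location: str
    quantity: float
    availability_date: date  # order-level dates instead of time buckets
    pegged_sales_order: Optional[str] = None

demand = KeyFigureCell("CONSENSUSDEMAND", "P-100", "DE01", date(2017, 1, 2), 1200.0)
order = PlannedOrder("PO-000123", "P-100", "DE01", "FR02", 400.0, date(2017, 1, 5))
print(demand, order, sep="\n")
```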

Why SDI and not HCP integration service? For Response, HCP integration service is not the right tool for the task – the latency of sending data would simply be too great – so SAP has used SDI as the integration tool instead. SDI works on an event-triggered basis when connected to ECC, enabling up-to-the-minute or quicker integration.

SDI uses integration models and change documents to flag delta changes and to determine what should be sent to SAP IBP – this is very close to the Core Interface (CIF) methodology for APO, which works quite well.

For the current release of Response, SDI will support batch integration only, through downloading data to files for SDI to pick up and pass to and from SAP IBP. Delta and near-real-time replication will come in a later release (for background, see notes 2289945 and 2316969, which give the upload and download programs for SDI file-based integration with SAP IBP). A conceptual sketch of the delta principle follows below.
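The sketch below illustrates, in plain Python rather than SAP code, the delta principle behind change-document style replication: compare the previous extract snapshot with the current one and forward only the new, changed or deleted records. The file names and key column are hypothetical.

```python
# Conceptual sketch (not SAP code) of the delta principle behind change-document
# style replication: compare the previous extract snapshot with the current one
# and forward only new, changed or deleted records. File names and the key
# column are hypothetical.
import csv

KEY = "MATERIAL"   # hypothetical key column of the extract

def load_snapshot(path: str) -> dict:
    with open(path, newline="", encoding="utf-8") as f:
        return {row[KEY]: row for row in csv.DictReader(f)}

def compute_delta(previous: dict, current: dict) -> list:
    delta = []
    for key, row in current.items():
        if key not in previous:
            delta.append({"action": "INSERT", **row})
        elif row != previous[key]:
            delta.append({"action": "UPDATE", **row})
    for key, row in previous.items():
        if key not in current:
            delta.append({"action": "DELETE", **row})
    return delta

if __name__ == "__main__":
    old = load_snapshot("materials_previous.csv")   # hypothetical extract files
    new = load_snapshot("materials_current.csv")
    for change in compute_delta(old, new):
        print(change)
```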

The following object types can be sent back to the source system:

  • Planned orders
  • Purchase requisitions
  • Stock transfer requisitions
  • Sales orders (confirmations only)

Again, data is sent through SDI; integration into ECC currently uses a file-based mechanism, and near real-time is not yet supported.

Due to the near real-time requirements, SDI will require monitoring and operational excellence to make sure that business operations are not affected, and this can be tough to do on a new application with new integration technology.

Lack of Customisation

Organisations are used to changing systems at will to make an exact fit for the way that they work. Historically, this comes from internal IT having to develop these systems, but now the industry is changing – processes within industries are being standardised, and the software surrounding these processes is following this best practice and these ways of working. This standardisation and industrialisation mean that moving to more standard software is a reality.

However, even in a "standard" process, companies will want to change the way tools run to fit their specific business requirements.

In terms of integration, and as stated above, clients have access to HCP integration service (HCI DS), which can be fully customised to suit almost any ETL situation, and to SDI for the coming near real-time requirements.

For SAP IBP customisations, it should be stated up front that most requirements in SAP IBP can be met using configuration. New master data fields and key figures can generally be added with relative ease to enhance the solution – the solution is very open and flexible in this area.

For the detailed customisations that standard SAP development currently offers – say BAdIs, custom programs, user exits and so on – there is no extension framework in SAP IBP. The entire technical layer is hidden from the user, and whilst there is a way of making a change to the system for a client (so-called L-Code), this can only be done by SAP themselves, is not recommended even by them, and is expensive to implement.

An easy place to look for customisation is the Optimiser used for supply planning. The optimisers in APO and SAP IBP share a significant code set and logic. In APO this software comes with an API that has inputs (e.g. master data, requirements) and outputs (e.g. orders back to APO).

In APO, via a BAdI, you can intercept and change these inputs – say to exclude certain requirements from optimisation, change optimiser costs dynamically, and so on. These changes enable specific business requirements, and in some cases where volume is concerned are essential to getting the optimiser to run in an acceptable timeframe. The sketch below illustrates the kind of pre-processing this makes possible.
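The following is a conceptual illustration in plain Python, not ABAP or a real BAdI implementation, of the sort of input interception described above: dropping certain requirements and scaling selected costs before the solve. All field names, thresholds and rules are hypothetical, and SAP IBP currently offers no equivalent hook.

```python
# Conceptual illustration (plain Python, not ABAP or a real BAdI) of optimiser
# input interception: dropping certain requirements and scaling selected costs
# before the solve. All field names and rules are hypothetical.
from typing import Dict, List

def preprocess_requirements(requirements: List[Dict]) -> List[Dict]:
    """Exclude low-priority requirements below a cut-off quantity."""
    return [r for r in requirements
            if not (r["priority"] > 3 and r["quantity"] < 10)]

def preprocess_costs(costs: Dict[str, float], peak_season: bool) -> Dict[str, float]:
    """Dynamically raise the non-delivery cost in peak season to force fulfilment."""
    adjusted = dict(costs)
    if peak_season:
        adjusted["non_delivery"] = adjusted["non_delivery"] * 5
    return adjusted

requirements = [
    {"product": "P-100", "quantity": 500, "priority": 1},
    {"product": "P-200", "quantity": 4, "priority": 5},   # dropped by the rule
]
costs = {"non_delivery": 100.0, "storage": 2.0}

print(preprocess_requirements(requirements))
print(preprocess_costs(costs, peak_season=True))
```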

In SAP IBP, you do not have the ability to do this – meaning the requirements that we have now and the efficient running of the optimiser are simply not compatible in some of the more advanced implementations.

Having talked to SAP about this, it is an acknowledged topic, but nothing is in the pipeline for custom development – strange in the open-source world we have today.

The net result for a client is that there is much less opportunity to customise the SAP IBP solution. The business will need to compromise on requirements and perform much more in-depth due diligence during selection (and specifically for the Optimiser, an evaluation of SAP IBP for Supply as a suitable replacement would need to be done now). However, it should be recognised that the use cases for the optimiser in SAP IBP and APO SNP are different. APO SNP is intended as an operational planning tool, whereas in SAP IBP the Optimiser is aimed at producing a constrained supply plan balanced with demand for an S&OP process. The former often requires more complex and customised logic to produce an executable plan.

When considering operational planning (say SAP IBP for demand feeding SNP), we have developed a hybrid approach of using APO as the planning engine to keep the business requirements intact and performance at an acceptable level whilst utilising the front end of SAP IBP for its flexibility and scenario based planning. I think this will be a theme moving forward in SAP IBP for the next version or so.

Conclusion

SAP IBP delivers significant business benefit to companies with mission-critical supply chains. As with any new product built from the ground up, we should remember that there are development priorities, and SAP has put focus into the right areas.

Delivering a product that is secure, reliable and scalable, and staging delivery of functionality to the market incrementally, makes sense. These qualities must come first, because if any of them were to fail it would have huge implications for the product and the companies that use it.

When implementing in the cloud, there will be an increase in security concerns and integration challenges that are not under the organisation’s control. To cater for this, an organisation must adapt to control what it can, working with SAP and other parties to govern the areas that are not under its control and to ensure the safety of its own ecosystem.

When outsourcing software, an organisation hands control of that software to the provider, accepting that in exchange for reduced internal capability requirements and a constantly evolving product, the cost is vigilance over software updates and regression testing, and less flexibility in customising the product.

Click here for a full list of Olivehorse blogs.

Peter Clancy

Technical Director, Olivehorse Consulting                                                                                                           

Ask us about any SAP IBP queries you have or about our range of IBP services
