
Licensing, Sizing and Architecting BW on HANA

I’ve had more than a few questions on BW on HANA Licensing and Sizing, and it seems that there isn’t anything authoritative in the public domain. So here we go, but before we start…


Architecting BW on HANA systems requires some care. First, database usage, number of indexes and aggregates, use of database compression, reorgs and non-Unicode systems all cause a variance in compression in the HANA DB. The best way to size a HANA DB is to do a migration.

In addition, you may choose to use the cold data concept, to archive/delete prior to migration or to use Sybase IQ for NLS. All of these will vary the amount that you need. And don’t forget growth – you need to plan for data volume growth, and M&A activities or projects which may increase data volumes.

If you get this wrong with SAP HANA, then you may buy the wrong hardware. I’ve worked with customers who bought 3-4x too much, and customers who bought 3-4x too little, so please get expert advice.

In addition, be careful when architecting HANA systems: do you need Dev/Test/UAT? If you have a big system, will it be scale-out, and will there be a standby node? Is there HA/DR? Where will you store backups, and where will the application servers sit?

So whilst this blog is intended to help and inform, the responsibility lies with you for getting it right. If in doubt, engage the services of an expert. Now we've got that out of the way!

What are the license models for BW on HANA?

It is possible to buy BW on HANA in one of two ways:

1) By the 64GB unit. As noted in this slide deck, this is EUR 60k per unit for up to 10 units, and then the price decreases with every additional 10 units you buy, and future licensing purchases are accretive and retroactive.

2) By Software Application Value. You pay 8% of your total SAP purchase price and SAP provide an unlimited runtime license for BW. This is also available at 20% including ERP on HANA.

As has been described before, BW on HANA is non-discountable, but you should always have a frank discussion about your overall license package with your Account Exec.

Note that this purchase covers you for all usage: Dev, Test, Training, HA and Disaster Recovery. The only time when you need anything else is if you want to build HANA Enterprise models, and in this case you may need a HANA Enterprise license.

Generally, the SAV licensing is much cheaper unless you are a large organization who has a lot of SAP software and a small BW. If you are a mid-size organization with a big BW, the SAV licensing can be 10% of the unit-based price.
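To make that trade-off concrete, here is a rough back-of-the-envelope comparison, sketched in Python. Only the first-tier unit price is quoted above, so this sketch conservatively applies it to every unit (the real tiered discounts would lower the unit-based figure), and the function names are mine, not SAP's:

```python
import math

UNIT_GB = 64
UNIT_PRICE_EUR = 60_000  # list price per 64GB unit for the first 10 units (per the blog)

def unit_license_cost(hana_ram_gb: float) -> int:
    """Unit-based cost, ignoring the volume discounts beyond 10 units
    (those tiers aren't public, so the first-tier price is used throughout)."""
    units = math.ceil(hana_ram_gb / UNIT_GB)
    return units * UNIT_PRICE_EUR

def sav_license_cost(total_sap_purchase_eur: float, rate: float = 0.08) -> float:
    """SAV-based cost: a flat percentage of your total SAP purchase price."""
    return total_sap_purchase_eur * rate

# Example: a 512GB BW on HANA system at a customer with EUR 2m of SAP software.
print(unit_license_cost(512))       # 8 units -> EUR 480,000
print(sav_license_cost(2_000_000))  # EUR 160,000 -> SAV wins comfortably here
```

Flip the inputs (a small BW at a customer with a huge SAP estate) and the comparison reverses, which is exactly the point made above.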

How do I size BW on HANA?

There is an attachment to SAP Note 1736976 – Sizing Report for BW on HANA. This note contains some manual corrections, and then needs to be installed via SAP Transaction SNOTE. Ensure you run the latest version, because it is constantly updated. You can then run ABAP Report /SDF/HANA_BW_SIZING.

When you run this report, run it with and without future growth, and keep both sets of numbers. It will produce a text file that looks like this:

[Screenshot: sample output of the BW on HANA sizing report]

Now it is necessary to be careful when interpreting this report. In this case, no growth was assumed and it is a 120GB MSSQL database, which it suggests will be a 127GB HANA DB. The sizing tool tends to be conservative and over-size slightly, especially for small systems.

In newer versions, this tool will tell you how many Medium (512GB) or Large (1TB) nodes you would need. This is a rule of thumb; use it with care.

Now ensure that you think about what you are sizing for. For instance, you may feel that you can archive or delete data. Now is a good time to do this, and if you look at the PSA and DSO Change Log sizes in this system below, a cleanup is definitely in order. Also, you can set some data to be “cold” in HANA and purge it from memory after the migration. You can remove this from the sizing if you like.

If you have a very large system (greater than 3-4TB of HANA required) then it may be cost-effective to use IQ Near-Line Storage (NLS). You can subtract any data that you can archive to NLS from your sizing, but be careful: the NLS software is only good for cold data that is not updated frequently.

How do I architect BW on HANA?

First, start by sizing your productive environment. Once you have this, you can decide the production architecture. In my case here I only need 160GB RAM, so I would buy a 256GB HANA system.
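That rounding step amounts to picking the smallest standard appliance that covers your sizing result. The size list below is an assumption based on typical single-node configurations of the time, not an official catalogue:

```python
import bisect

APPLIANCE_SIZES_GB = [128, 256, 512, 1024]  # assumed typical single-node sizes

def pick_appliance(required_ram_gb: float) -> int:
    """Return the smallest standard appliance size that covers the sizing result."""
    i = bisect.bisect_left(APPLIANCE_SIZES_GB, required_ram_gb)
    if i == len(APPLIANCE_SIZES_GB):
        raise ValueError("requirement exceeds single-node sizes; consider scale-out")
    return APPLIANCE_SIZES_GB[i]

print(pick_appliance(160))  # 256, matching the example above
```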

Once you require more than 1TB RAM then you will need to move to a scale-out HANA system. This is where customers often go wrong. Let’s assume we use Medium (512GB) nodes and the sizing tool says we need 100GB for the row store and 1.5TB for the column store. The row store requires one master node, and the column store fits on the remaining nodes. This means that we need 4 active nodes, plus one standby node if we want high availability. That’s 5x Medium (512GB) nodes for production.
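That node arithmetic can be written down as a quick sanity check. This is only a rule of thumb, assuming (as in the example above) that the row store lives on a dedicated master node and the column store spreads across the remaining active nodes:

```python
import math

def scale_out_nodes(row_store_gb: float, column_store_gb: float,
                    node_size_gb: int = 512, ha_standby: bool = True) -> int:
    """Back-of-the-envelope node count for a BW on HANA scale-out cluster."""
    if row_store_gb > node_size_gb:
        raise ValueError("row store must fit on a single master node")
    master = 1                                          # row store lives here
    slaves = math.ceil(column_store_gb / node_size_gb)  # column store nodes
    standby = 1 if ha_standby else 0                    # host auto-failover node
    return master + slaves + standby

# The worked example: 100GB row store, 1.5TB column store, Medium (512GB) nodes.
print(scale_out_nodes(100, 1536))  # 1 master + 3 slaves + 1 standby = 5
```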

Now we need to architect for disaster recovery, and here we can take the same configuration as production.

Now we can architect our test system. If our disaster recovery can be warm (i.e. take some time to start up in case of a failure) then we can share this with our test system. This may make sense if you want a production sized copy in test. Note that if you do not have a DR system you will need a dedicated test system. If you have a scale-out production environment, always ensure you have a scale-out test system for scale-out testing.

And now you need a development system. Normally I recommend copying the existing system, and one 512GB node should be sufficient unless development is a copy of production. Use common sense.

From here you can work with a hardware vendor for the best approach, but be careful – the hardware vendors often cut some items out to cut cost (or indeed add extra hardware to get a larger sale), and I’ve dealt with a number of customers who have been bitten by this and have had to buy substantial amounts of extra hardware. Ensure that your hardware partner has an upgrade policy for the amount of hardware you expect to need in the future, based on growth.

Final Words

My final word would be to make sure that you get good advice throughout this process, and sanity check it yourselves. With a regular database, if you size it wrong then you can add more RAM or disk at relatively low cost, and you will just sacrifice performance. With HANA, you will have overspent, or will have to spend a significant amount to change your HANA architecture. Depending on your design, this can be very inconvenient.

Your first HANA deployment is critical, because it will set the tone of sentiment in the business for HANA as a technology stack. Take the time to get this part right, and you will help your BW on HANA deployment on its way. Your project will appreciate you for it!

Thanks to HANA Distinguished Engineer Lloyd Palfrey for his input on this blog!

  • Thanks for this John.

    I'm now at my second HANA site, which coincidentally makes two now that have had sizing challenges.

    Will be interesting to see how our current partner's 4th crack stacks up to this approach.


  • Hi John,

    Thanks again for another helpful post.

    I think you may have missed this post from last year on SCN by Marc Bernard on how NOT to size BW on HANA!

    How NOT to size a SAP NetWeaver BW system for SAP HANA

    We have been using that and other info here in order to first conduct a trial migration. We currently have trial hardware installed, will perform a trial (or two) over the next 3 months, and will take the learnings from these trials and apply them to our full landscape migration later in the year.

    As mentioned it is a tricky topic and a lot is driven by the sizing, but also by the non-functional requirements such as required BW RPO and RTO. Additionally this is constrained by the way SAP have architected how HANA does Scale out, HA and DR, data replication, etc... You could end up having to buy a lot of separate HANA hardware appliances across multiple Data Centres if business criticality from the BW application is required...

    Cheers, Phil G.

    • I did read that post but this is my take on the same topic, updated with a few extra things.

      Agreed that any mission critical environment can require a lot of hardware, especially since many mission critical systems need 4 or even 5 tiers. Of course, HANA isn't any different to other appliances in this setting.


      • Hi John,

        As usual there is some good content in the comments attached to that post by Marc Bernard. They are worth reading if you already haven't.

        Cheers, Phil G.

  • > Generally, the SAV licensing is much cheaper unless you are a large organization who has a lot of SAP software and a small BW.

    I don't agree: you pay 8% of your total SAP purchase price.

    • I found SAP moved quickly to adjust their pricing when I told them there was no way my company would pay a percentage of our entire SAV for BW on HANA, when less than 20% of our users have access to BW.

      I find the '8% of total SAP SAV' pricing model for BW on HANA to be ridiculous, and it certainly stopped dead any interest my company had in migrating BW to HANA.

      This is unfortunate as HANA is a fantastic product that I would dearly like to implement.

      • Have you talked to them about either HANA Base, or the Limited Runtime where you pay a portion of SMBV? I'm really surprised if there's not a reasonable way.

        • For us, I'm afraid the HANA boat has sailed for the time being.

          I wanted to deploy a fresh BPC on HANA platform, but the initial pricing was far too prohibitive, leaving us with little choice but to bolt-on BPC to our existing BW, running DB2.

          Project starts in May. You'll be pleased to hear that Bluefin are our partners. Every cloud, and all that....


          • Hi John,

            Could you share more detail regarding HANA Base or the Limited Runtime based on portion of SMBV? We are also looking to implement BPC Embedded under HANA.

            We have paid full price for SAP BPC even though we use only the planning part, and we also have licenses for SSM and SAP BusinessObjects, so it's a really significant burden to spend more money.

            To be honest, from my point of view SAP isn't being fair. The HANA database already has its own significant benefits, and SAP should introduce it to customers as an alternative database system. But what SAP did instead was develop many new functionalities only for HANA, leaving the other databases behind. This is a bad move: indirectly, SAP has pushed customers to move to HANA and asked for more money for it.



          • Honestly with the details, you gotta talk to your AE 🙂

            But you can certainly buy HANA Base by the 64GB unit, which I believe would cover you for BPC, perhaps with just a few units. It's a lot less expensive, at roughly 25% of the cost of HANA Enterprise.

            As for a LRU - those can sometimes be cut to include a portion of the user base.

            I've never worked with a customer where we couldn't make this work!

    • Hi Mikhail,

      I agree. The trick is convincing and justifying the cost of this for Test with multiple HANA Appliances / Nodes required. I am running up against this exact argument at the current site I am at.

      For some reason the spotlight always seems to be on the cost of the hardware, rather than the resulting cost of not being able to do proper testing, and of what happens when changes hit Production operation.

      I am trusting that the trial migration(s) we plan to conduct, and the non-functional business requirements around RPO, RTO, proper testing prior to Prod will provide a convincing case.

      Regards, Phil G.

      • If your customer wants both HA (Scale-Out) and DR, there is an alternative that could potentially be more cost-effective than purchasing a whole Scale-Out architecture just for QAS. Both HANA DR options (System or Storage replication) allow you to share the same HANA appliance/scale-out deployment for both the QAS system and the secondary Production DR (with separate storage mounts for the log/data files). In this sense, you could avoid having to pay for a tripled Scale-out architecture for PRD, PRD DR and QAS.

        • Yes, though of course you can't have QAS running and an active standby, which makes it useful for systems where you need a low RPO but not a low RTO.

          Definitely a good conversation to have with your business and hardware vendor.

          • Yes, again I agree, John.

            You need BOTH an active QAS/Pre-Prod system as well as a "data replicated" or "storage replicated" failover HANA architecture running in a different data centre to cater for these testing and failover requirements. It may not turn out to be "tripled" as Henrique has pointed out, but the cost will still be significant, as will the complexity.

            And then there are the other environments in a typical landscape: Dev, Training, Sandbox, etc. It all adds up.

            Cheers, Phil G.

          • For sure. One thing I want to stress to the reader is that this is a sequence of choices. I have many customers who for a BW system find it acceptable to have a 15 minute RTO, when the power to the primary system fails.

            In this case it's clearly the right answer to use an extended storage option and use the DR system for the UAT environment, for this will cut substantial cost.

            In one customer where we were building the business case for a lot of HANA, we created a farm concept where various non-production instances were in various different places, shared with DR systems. Just need to apply business common sense. Which can be hard to come by!

          • Another consideration is how HANA fits with the existing HA and DR design pattern for the SAP applications being migrated to it. When migrating within an existing HA/DR scenario, replacing say the BW database xxx with HANA, this can also place constraints on what is possible and acceptable, both cost- and operations-wise.

          • Of course, it's a matter of how much you're willing to pay.

            I mentioned as an alternative to the clearly most expensive option.

            And the business case for having separate QAS and DR environments depends on a series of factors, especially: what is the cost of not having a QAS system for a given time (i.e. the cost of impacted projects that would need to wait until the primary PRD system was re-established, the opportunity cost of new features you'd have delivered earlier to the business areas, etc.)? If those costs are lower than the costs of setting up and maintaining a dedicated DR, then you have your answer.

            This table in Ralf Czekalla's great "HANA in DCs" document has a nice insight into RPO, RTO and Performance Ramp up time for the different HA/DR strategies available for HANA. Of course, the option which has the lowest RTO, RPO and Perf. Ramp up time is also the most expensive in terms of infrastructure (no license difference, though, which is worth mentioning - some competitors charge by active cores, no matter if it's QAS or PRD).


          • Hi Henrique,

            Yes, it all gets back to how much insurance the business is willing to pay for. Agree that Ralf's document is very helpful with all of this. Being able to describe the impacts of the various designs in terms that non-technical people can understand can sometimes be the challenge.

            Especially when, for some reason, the hardware $$$ are usually very carefully scrutinised, while the hardware is actually a relatively small fraction of the overall project costs. But too often this is the way it works. Everybody wants it to run fast and reliably, to be quickly available again if there is a problem, and for the data to be protected. However, getting the $$$ for this can prove difficult.

            FYI there is also now an updated SPS07+ version of Ralf's document, which was updated last October.

            Cheers & Thanks, Phil G.

        • Can you please advise what a scale-out architecture is, and explain the phrase "If your customer wants both HA (Scale-Out) and DR"? I am just trying to read and understand your conversation.

  • Hi John,

    My customer is keen to move some of their SAP HR ABAP reports to a HANA system because the report performance is really bad. They know SAP HANA can solve the problem. In this case, should I do HANA sizing based on the size of all tables in the HR module, or just the tables used by the ABAP reports in question? I hope to hear your comments. Thank you.

    Best Regards


    • You can just use SLT to transfer only the tables you need.

      As a rule of thumb, you can estimate the data size to be the total database table size in your ERP system without indexes, divided by 5. You will then need to double this number in the HANA system to include working memory.
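      That rule of thumb in code, for anyone who wants to plug in their own numbers (the function name is mine, and the divide-by-5 and double factors are the estimates from the comment above, not a formal sizing):

```python
def suite_on_hana_estimate_gb(erp_table_size_gb: float) -> float:
    """Rule of thumb: total table size without indexes, divided by 5
    for HANA compression, then doubled for working memory."""
    compressed = erp_table_size_gb / 5
    return compressed * 2

print(suite_on_hana_estimate_gb(2048))  # 2TB of ERP tables -> ~819GB of HANA RAM
```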

  • Hi experts - do you know if Public Budget Formulation (PBF) can run on HANA? I know PBF is an application with BI as its backend. All the implementations I know of are based on traditional databases. I have no time for a POC for this implementation. Appreciate your help.

    • From a technical perspective, yes it should work. This is a NW 7.3 EhP1 Add-In. In addition you will need AS-Java and NWDI.

      Whether it will run better on HANA, I can't tell you. It's an integrated app that uses Java and BICS.

      From a support perspective you should check with SAP. They have to support it and this is a niche application. Open an OSS Incident under the PBF support area.

  • Thanks John, you make some important points.

    Another point worth noting is that the balance of data across the nodes that contain the column store is not always even. So as the data grows, you may find your capacity constraint comes down to the node that holds the most data. Looking at cluster-wide metrics alone might be misleading; you need to keep track of the per-node column store values.

    I'm told balancing the data between nodes in a BW system can have performance implications. Is there an approved method to redistribute data across the nodes in the cluster?

      • Yep, elsewhere. In the documentation.

        From the Admin Guide ("4.6.9 Table Placement"):

        "For a complete description of the table placement for SAP BW, see SAP Note 1908073."

    • So yes, whilst what you say is correct, the partition strategy for large cubes is a ROUNDROBIN strategy, which will automatically distribute the data for a large cube evenly across all nodes.

      Theoretically, a large master data object might cause a hotspot on a node, but it would have to be REALLY large.

      If you find you are getting hotspots, HANA Studio has a repartition and redistribute tool which will come up with recommendations and automatically repartition and redistribute.

      If you have a LARGE (8+ node) system then you might take professional advice from a partitioning expert. I've squeezed 5-100x performance out of a complex HANA model by taking a smart partitioning and modeling strategy. Most customers will never find themselves in this position...

      • John, do I understand you right?

        HANA's distribution monitoring and repartitioning/redistribution capabilities sufficiently support a scenario where sudden, mostly unexpected growth in data size challenges the DB admins.

        We feel well with our ERP systems, but the BW database is a strange country for us ...

        What I wonder is: what skills in the BW area should be acquired to enable our team to cope with this challenge?

        • Hi Rudolf,

          I have found that BW duties certainly extend beyond the usual installation, maintenance, modelling, and development tasks in today's HANA times.

          I pointed out earlier that the Data Distribution Optimizer does a good job with repartitioning and distribution.

          There is also the Data Life Cycle Manager (DLM), the other tool in the Data Warehouse Foundation, which can help with the management of warm data.

          This leads me to a concept only applicable in HANA, "warm data", and how it can be managed. If there is any one challenge, then this is it.

          I draw on various topics to manage this challenge:

          Data extraction technologies: SLT, ODP.

          LSA++: model within this principle, and understand what data goes where, and whether it is even needed.

          Partitioning: physical and logical (SPOs). Understand where, when, and how to implement.

          Versions: stay on top of which versions incorporate which functionality, and upgrade/patch frequently.

          Archiving: know the benefits and limitations, and implement. ADK, NLS.

          ABAP: stay with the technology. Utilize AMDP and push down to HANA whenever possible.

          Having said all this, it is not easy with the rate of change. SAP are introducing functionality, corrections, updates, fixes etc. at an incredible rate, and I see organizations struggle to keep on top of it all.

  • Hi John,

    Thanks for the quite helpful information. I know you wrote to contact the SAP account team regarding on-premise HANA licensing, but if you don't mind, could you give a rough estimate of the current cost, as a percentage of total SAP license value, for SAP ERP on HANA? Is it still 20% or less?

    Also, during a migration project from a current database (such as Oracle) to the HANA database, is it common to negotiate a discount on the SAP HANA price based on the current SAP Oracle license contract?

    Thanks a lot,

    Best Regards,


    • Yes you should always get official figures from your account team, because they know your particular situation.

      For 2015, I believe that 15% for SoH and BWoH and S/4 HANA is the number, rising to 22% in 2016. This is to encourage customers to buy/implement in the short term.

      With HANA, the price is the price. You may be able to negotiate a bundle if you are looking for additional items.

  • Hi John,

    Just want to ask about the current S/4HANA promotion, which runs until Sep 2015.

    For the on-premise version, SAP Business Suite customers need to purchase the SAP S/4HANA foundation-promotion license to run the new SAP S/4HANA code line.

    SAP is offering the following promotion until September 30th, 2015:

    - Existing SAP Business Suite customers have to procure the SAP HANA runtime license for SAP Business Suite (@15% HSAV = SAP HANA Software Application Value) and will get the SAP S/4HANA foundation-promotion license at no additional cost.

    - Existing SAP Business Suite powered by SAP HANA customers with a valid SAP HANA limited runtime license for SAP Business Suite (LREA) are eligible for the SAP S/4HANA foundation promotion license without additional cost.

    Do you have any thoughts about this statement? Especially regarding the 15% HSAV. Do you think this S/4HANA foundation-promotion license will cover BW and BPC as well as Fiori on HANA?

    Thanks a lot,



  • Very helpful article, John, even when reading it today in August 2016 ...

    I'd like to see an answer to Robin's Question (Nov 19, 2014) regarding the distribution of data among the nodes ...

    Kind regards, Rudi

    • Hi Rudolf, and all,

      You can now utilize SAP's Data Distribution Optimizer (DDO), part of the Data Warehouse Foundation (DWF). It takes the HANA Studio reorg and distribution functionality to the next level.

      It does a good job of simulating and graphing a distribution plan prior to committing. Parameters can be changed, such as the number of possible table partitions, which can fine-tune a distribution.

      You can also use the DDO for specific schemas and customized tables which may not be present in the placement rules table.

  • I have also found that the implemented BW data model has a large role to play in sizing.

    Where the LSA++ corporate data layer has been used, it had not been configured appropriately, thus consuming unnecessary HANA memory.

    Also, an appropriate partitioning specification will solve much of the memory consumption that comes with DSO activations and delta merges.

  • Hi John,

    I do have a customer who is interested in NLS. We discussed this and a few questions came up:

    • My customer does not have a separate BW system but is using embedded BW on SoH (NetWeaver 7.5), which is working fine. Is NLS also supported for the embedded BW scenario?

    • Database: NLS seems to require IQ as the NLS database. As my customer uses MaxDB internally on various systems, the question came up whether the database of the NLS system could also be MaxDB instead of IQ. The admins would prefer MaxDB, as they are not familiar with IQ at all.

    • In case IQ is the only NLS database supported, what about the cost of that database? Is it free when we use it for embedded BW NLS, and if not, how do we calculate the cost?

    Really would appreciate your help on this. Thank you.

    Kind regards,