David Dobrin posted a treatise on Solution Manager (I’m quoted, and identified as “the” expert on using Solution Manager for performance tuning, just so you know that David knows me, and overestimates my skill set…). I’m going to dive into one point here, that of decreasing storage costs.
A comment on his blog asked the question (and I’m paraphrasing) “how do I get rid of garbage in my system(s)?” I thought I knew the generally-accepted-practice answer, which goes by the initials ILM, but the thread went off in a completely different direction, in which a little-known, little-used, and probably little-impact feature of Solution Manager is touted as the answer to all your storage problems. Here’s where I’d swear, if I did that, so use your imagination.
But first, some background. David, and a few dozen others, were given briefings by SAP on the wonders of their software road map. I skipped that conference, for reasons I won’t go into here. The problem I see is that too few real-life, hands-on customers attended. I know that Tony de Thomasis was there, but I’m guessing Tony didn’t weigh in on this subject. There’s a SUGEN (www.sugen.net) press release back in 2009 that talks about parts of the KPI index, the fourth being “total cost of ownership”, and part of that being storage costs. I’m not going to go into any more detail, since the texts I have located online are marked “confidential” for reasons that are beyond me.
So, back to storage. Everyone knows disks are cheap. Except that enterprise-scale databases take up a lot of disk, and they grow like weeds. We had a 100GB system a decade ago and thought that was huge. Now there are systems that grow that much in less than a month. Clearly, reining in that growth goes straight to your bottom line. Not so clear is how one achieves it. Fortunately, SAP has published a how-to guide on information life cycle management for many years, and it’s chock full of advice on how not to grow in the first place, how to get rid of deadwood, and how to run a business-class archiving program. When I ask about this publication at my conference sessions, few hands are typically raised. So it is no wonder that when someone offers them the Solution Manager blue pill, they’ll take it without hesitation.
I searched for one of the tables mentioned in the answer to the space question (“CNVCDMCCA_OBJS”) and got 0 hits on Google, 0 hits on SDN, and only 6 SAP Notes, all dated within the past year. If you asked 100 Basis or database administrators what this table is, and what it does, you might get 2 that know anything about it (until now, that is). Well, what is it? It’s a place to store information about your custom tables. Do your custom tables take up a lot of room in your database? Can they simply be eliminated? I have my doubts. But let’s look at code first, data second.
One area where I thought Solution Manager would help concerns custom code that is no longer in use. You know, the report you wrote for that customer a decade ago, who has since figured out that BW gets the information faster and easier. But you’re afraid to delete that code, since you only hear about the reports that don’t work; the ones that just keep running are no source of complaints. And when you get to an upgrade, or an SAP patch, or a Unicode conversion, knowing which dead code can be purged can save a lot of test cycles. So how much room does that code take up? I looked at an R/3 system, specifically the source (and object) tablespaces PSAPEL620D, PSAPEL620I, PSAPES620D, and PSAPES620I. They take up less than 2% of the entire database space, and they don’t grow much. Assuming half of that code was custom (in reality, the SAP-delivered share is probably more like 99.99%), and half of the custom code was unused (a stretch, maybe), that means you could get rid of one-half of one percent of your SAP storage space. Hardly a killer elevator (or twitter) speech for a project commitment, right?
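To make the back-of-envelope math above explicit, here is a tiny sketch. The percentages are the deliberately generous assumptions stated in the text, not measured values from any system:

```python
# Back-of-envelope check of the "how much could code cleanup save" claim.
# All fractions below are the assumptions from the text, not measurements.
source_share = 0.02      # code tablespaces: less than 2% of total DB space
custom_fraction = 0.5    # generous: assume half of that code is custom
unused_fraction = 0.5    # generous: assume half the custom code is dead

reclaimable = source_share * custom_fraction * unused_fraction
print(f"Reclaimable share of total storage: {reclaimable:.1%}")
# → Reclaimable share of total storage: 0.5%
```

Even with assumptions tilted heavily in favor of the cleanup, the payoff tops out at half a percent of the database.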
But, no, as it turns out, this isn’t about custom *code*, it’s about custom *objects*. Well, that should help a lot, right? Well, maybe, but how could it have made any impact on any large customer installation if the KPIs that talk about it are confidential, one of the tables that drives it has ZERO visibility in the places where we worker bees talk about this stuff, and the relevant SAP Notes are freshly minted? It’s not like we just jump on every note that’s published, throw it into our systems, and start executing new code without thinking.
I’m going to quote part of the answer David was given:
|This makes it possible to identify the tables that have no, or just a few, entries and are likely to be not in use.|
Stop. Think. Okay?
We’re talking about storage costs, with systems growing by gigabytes *per day*. What possible impact will it have on that growth curve, to find custom tables that are not in use? I’m baffled.
All right; two pictures below – the first shows a cumulative “used” space picture (not allocated, which is what you pay for, but used, which is what you can clean up, maybe), with a red line pointing to the largest of the source code tablespaces. In this case, it’s the 25th largest space in the system. How much of that is garbage? Well, probably very little.
The second picture shows the growth of that space, in kilobytes, over a one year period. The DB02 transaction shows this (if all is working in your system), and no Solution Manager needed. Flat enough not to worry about buying more storage anytime soon.
If it were me, I’d be looking for the tables that are growing the fastest. Find those that the business doesn’t need after legally sufficient periods of time. Get them out of your system.
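The triage described above can be sketched in a few lines. This is an illustration only: the table names and sizes are invented, and in practice the snapshots would come from DB02 history or the database catalog rather than hard-coded dictionaries:

```python
# A minimal sketch of growth-based triage: given two size snapshots
# (e.g. exported from DB02's history), rank tables by absolute growth
# so the fastest growers surface first. All names/sizes are made up.
sizes_jan = {"EDI40": 120_000, "BALDAT": 80_000, "ZCUSTOM1": 50, "CDCLS": 95_000}
sizes_jun = {"EDI40": 310_000, "BALDAT": 140_000, "ZCUSTOM1": 50, "CDCLS": 180_000}

# Growth in KB over the interval; tables absent from the first snapshot
# count as new (growth from zero).
growth = {t: sizes_jun[t] - sizes_jan.get(t, 0) for t in sizes_jun}

for table, delta in sorted(growth.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{table:10s} grew {delta:>9,} KB")
```

Note that the near-empty custom table (`ZCUSTOM1` here) lands at the bottom of the list, which is exactly why hunting for unused custom tables does so little for a growth problem driven by the big movers at the top.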
I’d also be looking at the different database vendors’ tools for compression, which, in fact, we presented on at the just-concluded ASUG / Sapphire conference. We’re shrinking our systems, and while we use Solution Manager on occasion to find outliers or surprises, it’s not the console of choice when looking for deadwood.