Deep and dynamic data objects are another outstanding feature of ABAP.
What would we do without internal tables? With internal tables in combination with strings and reference variables, you benefit from the fact that you can handle large amounts of complex data via a rather simple API. The API, of course, simply consists of the well-known internal table statements.

When working with internal tables and strings, you normally do not care about their internal representation. You address them like normal data objects and let ABAP do all the dirty work for you. But since nothing is for free, there must be something you pay for all that luxury. In the case of internal tables, the price can be their memory consumption. Normally, you do not even have to care about this.

But if memory is an issue, it is worth knowing how such dynamic data objects are organized internally, so that you can make the right design decisions.

In order to know what we are talking about, let us first define some terms:


    1. Dynamic Data Object

      A data object for which the data type statically determines all properties apart from the memory consumption. Strings and internal tables are dynamic data objects. Both are data objects of deep data types. A string (byte string or text string) is a dynamic data object of variable length. An internal table is a dynamic data object that consists of a variable sequence of lines of the same data type. Dynamic data objects are addressed directly in a program.

    2. Deep data types and objects

      An ABAP data type is deep if the contents of its data objects are addressed via references that point to the actual content. References are either the contents of explicit reference variables or they are handled internally. Internal tables and strings are handled via internal references. In contrast, reference variables point to anonymous data objects or instances of classes that are created via CREATE DATA or CREATE OBJECT respectively and cannot be addressed directly.

    3. Deep structure

      A structure is deep if it contains at least one deep component.

    4. Deep Table

      An internal table with a deep line type (see the sketch directly below).
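As a small illustration of the last two terms, here is a sketch in ABAP; the names deep_struct and deep_itab are made up for this example:

  TYPES: BEGIN OF deep_struct,
           id   TYPE i,
           text TYPE string,   " deep component: makes the structure deep
         END OF deep_struct.

  DATA deep_itab TYPE STANDARD TABLE OF deep_struct
                 WITH NON-UNIQUE KEY id.   " deep table: a table with a deep line type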



To summarize:


    1. In contrast to static data objects, for which all technical attributes are defined on declaration, the size of dynamic data objects, i.e. their memory requirement, is not defined until runtime. After declaration, the length of a string is 0 and changes at runtime depending on the content assigned to it. After declaration, internal tables do not contain any rows. The rows are created dynamically at runtime when the internal table is filled (see the small sketch after this list).

    2. In contrast to flat data objects, for which the content corresponds to the actual working data, dynamic data objects are deep. Like reference variables, they contain references to the actual content. Unlike reference variables, the treatment of the references is implicit for dynamic data objects.
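A minimal sketch of this dynamic behavior; the variable names are chosen freely for this illustration:

  DATA text TYPE string.       " length 0 directly after declaration
  DATA itab TYPE TABLE OF i.   " no rows directly after declaration

  text = `Hello world`.        " dynamic memory for the string is requested now
  APPEND 1 TO itab.            " the first block of rows is allocated now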



 

Note


Although reference variables are deep, they belong to the static data objects, since their own size is fixed, whereas the size of the objects they explicitly reference is dynamic. The type of the object referenced by a reference variable defines the dynamic type of the reference variable. The dynamic type can be a data type or a class.
Memory Allocation for Deep Data Objects

 

When you
are working with deep data objects, memory is requested at program runtime. In
the case of reference variables, this is the memory for administration
information and the created objects (instances of classes and anonymous data
objects). In the case of dynamic data objects (strings and internal tables)
this is the memory area for administration information and for the data objects
themselves. For internal tables, the memory allocation is done in blocks. The size of the first block can be influenced with the INITIAL SIZE addition to TYPES or DATA.
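For example, a minimal sketch of such a hint; the variable name is made up, and the actual block sizes remain an internal matter of the runtime:

  DATA itab TYPE TABLE OF i INITIAL SIZE 100.   " first block is sized for roughly 100 rows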

 

Note


Objects referenced
by reference variables can themselves, in turn, be dynamic - that is, they are
or contain strings or internal tables.

 

Memory Requirement of Deep Data Objects


 

The memory requirement for a deep data object consists of statically required memory for the internal or external reference and dynamically required memory for a so-called header and the actual objects.


    1. The memory requirement for the reference is 8 bytes. With data references and object references, this is the memory requirement of the explicitly declared reference variable. With strings and internal tables, this is the memory requirement of the implicit reference that is created internally. As long as no dynamic memory is requested, the memory requirement of a string or an internal table is exactly 8 bytes.

    2. The dynamic memory consists of a header (string header, table header, or object header) and the actual data (string, table body, anonymous data object, or instance of a class). The reference points to the header, which in turn contains the address of the actual data and additional administration information. A table header can point to extended table headers with further administration information if necessary.



The following figure illustrates the memory requirement of deep data objects. A
multilevel header in which the header references further extended headers
occurs only with internal tables.


Dynamic memory (header and data) is requested:

 


    1. for data references and object references, when objects are created (see the sketch after this list);

    2. for strings and internal tables, when content is inserted.
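A small sketch of the first case; the local class lcl_demo and the variable names are made up for this illustration:

  CLASS lcl_demo DEFINITION.
  ENDCLASS.

  DATA oref TYPE REF TO lcl_demo.
  DATA dref TYPE REF TO data.

  CREATE OBJECT oref.             " dynamic memory for the instance is requested now
  CREATE DATA dref TYPE string.   " dynamic memory for the anonymous data object is requested now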



When a deep data object is initialized with CLEAR, REFRESH, or FREE, the actual data is deleted. However, the reference variables and the headers (for dynamic data objects) are kept. The latter are reused when memory is requested again. Therefore, the memory requirement of a once used and then cleared dynamic data object consists of the reference plus the memory requirement of the header (see example below).

Only when the statement FREE is used for internal tables might headers be deleted as well, if they require too much memory. The memory requirement of a header depends on whether your program runs on a platform with 32-bit or 64-bit architecture and amounts to approximately 100 bytes. As a rule, a table header requires more space than the other headers.
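A small sketch of the difference between CLEAR and FREE for an internal table; the table name is chosen freely:

  DATA itab TYPE TABLE OF string.

  APPEND `some line` TO itab.      " dynamic memory (header and table body) is requested
  CLEAR itab.                      " the rows are deleted, the header is kept for reuse
  APPEND `another line` TO itab.   " the kept header is reused
  FREE itab.                       " the rows are deleted; a large header may be released as well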

The exact memory requirement of a deep data object can be determined in the ABAP Debugger via the function Memory Consumption or by creating a memory snapshot for the Memory Inspector.

Maximum Size of Dynamic Data Objects


The requested memory of dynamic data objects must be available in the current internal session; otherwise, a runtime error occurs. In addition to the maximum memory size that the current internal session can request for dynamic data objects, their maximum size is limited by the following factors:


    1. An upper limit of 2,147,483,647 for the number of characters in strings and the number of rows in internal tables of any kind results from the fact that these are addressed (internally as well as in ABAP statements) via 4-byte integers.

    2. The size of strings and hashed tables is limited by the biggest memory block that can be requested in one chunk. The absolute maximum for this block is 2 GB, and it is usually further limited via the profile parameter ztta/max_memreq_MB. For strings, the value of the profile parameter directly represents the maximum size that can be occupied, with the maximum number of characters depending on the code page. The maximum number of rows of hashed tables depends on the size of the hash administration that has to be stored there. Currently, this size can be calculated from the largest power of two that fits into the value of the profile parameter, divided by 8. For example, if the profile parameter specifies 250 MB, a hashed table can contain a maximum of roughly 16 million entries (128 × 1024² / 8 = 16,777,216).



The actual maximum size is generally smaller than given by the above limits, because the overall available memory is normally not used by only one string or one internal table.

Sharing Between Dynamic Data Objects


It is a well-known fact that when reference variables are assigned, only the references are copied. After an assignment, the source and target variables point to exactly the same data object or the same instance of a class, respectively (they point to the same header, to be more exact).

In order to minimize the memory allocation of dynamic data objects, a similar thing happens during assignments between strings and between internal tables: the so-called sharing (available for internal tables as of Release 6.10). That means the actual data values are not copied for the time being. Only the necessary administration entries are copied, so that the target object refers to the same data as the source object.


    1. With strings, a new internal reference is created which points to the already existing string header.

    2. With internal tables, a new internal reference and a new table header are created, the latter referring to the existing table body.



Sharing is terminated the moment there is a changing access to either the source or the target object (copy-on-write semantics). Only then does the actual copying of the data values take place, and the references or headers are changed accordingly (value semantics of data objects).
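A small sketch of this copy-on-write behavior for strings; the variable names are chosen freely:

  DATA str1 TYPE string.
  DATA str2 TYPE string.

  str1 = `a reasonably long text`.
  str2 = str1.                      " sharing: only an internal reference is copied
  CONCATENATE str2 `!` INTO str2.   " changing access: sharing ends, the data is copied now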

Note


    1. Table sharing only occurs for tables whose row type does not itself contain table types.

    2. Sharing also occurs when values are passed to procedures.

    3. For reference variables that refer to the same data object or the same instance of a class, the sharing is not terminated when the objects are changed (reference semantics of reference variables, see the sketch below).
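In contrast, a small sketch of the reference semantics mentioned in the last point; the names are chosen freely:

  DATA dref1 TYPE REF TO string.
  DATA dref2 TYPE REF TO string.
  FIELD-SYMBOLS <str> TYPE string.

  CREATE DATA dref1.
  dref2 = dref1.              " both references point to the same anonymous string
  ASSIGN dref1->* TO <str>.
  <str> = `changed`.          " no copy takes place; dref2 still points to the now changed string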



Complex Data Objects with Deep Components


If data objects of complex data types have many deep components, which applies, for example, to internal tables with a deep row type, you have to take care that the administration costs, in the form of memory allocated for references and headers, do not grow disproportionately large compared to the actual data content. For complex data objects with relatively little data content, we can distinguish three basic cases:

  • Complex data objects with a sparse fill level

    A complex deep data object is sparsely populated if it contains many deep components of which a majority is initial. The memory requirement of such an initial deep component is 8 bytes, provided the component does not yet refer to a header.

  • Complex data objects with a duplicative fill level

    A complex deep data object has a duplicative fill level if it contains many deep components of which a majority refers to the same data. Such components share the dynamic memory and contribute only their references to the memory requirement. For dynamic data objects, this holds as long as sharing is in effect (see above).

  • Complex data objects with a low fill level

    A complex deep data object has a low fill level if it contains many deep components that refer to different objects, strings, or internal tables, where the referenced objects themselves only require very little memory or are empty.

Deep data objects with a fill level that is sparse, duplicative, and not too low are generally harmless. For deep data objects with a low fill level and few duplicative parts, you have to take the memory requirements for references and headers into account. You should not use such data objects massively in ABAP. At a low fill level, you might consider a class wrapping as an alternative to internal tables, since the administrative costs for objects are lower.
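For illustration, a small sketch of a sparsely populated deep table; the type and component names are made up for this example:

  TYPES: BEGIN OF comment_line,
           id    TYPE i,
           note1 TYPE string,
           note2 TYPE string,
           note3 TYPE string,
         END OF comment_line.

  DATA comments TYPE STANDARD TABLE OF comment_line
                WITH NON-UNIQUE KEY id.

  " As long as note1, note2, and note3 remain initial in most rows, each of these
  " components contributes only its 8-byte internal reference per row.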

 
