Copyright 2024 - BV TallVision IT

It's been a while, but some 10 years back I tried to fill an internal table with data, just to see how much it could take. A table with the line structure of EKPO, just under 200 fields, was appended to 10.000 times. The system didn't even flinch. Memory is, for most practical purposes, without limit. The problem with large data sets is performance, not actual size. This probably has to do with memory swapping or paging, where pages of memory are written out to disk somewhere - which is again performance killing. Thus: performance problems before memory problems.
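For reference, that experiment boils down to something like this (a quick sketch, re-done from memory):

DATA: lt_ekpo TYPE STANDARD TABLE OF ekpo,
      ls_ekpo TYPE ekpo.

* Append 10.000 rows of the (nearly 200-field) EKPO structure -
* the system won't flinch
DO 10000 TIMES.
  APPEND ls_ekpo TO lt_ekpo.
ENDDO.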

Having said that: there is a way to keep a tab on the memory consumption of your report. Is it a big report? An interface with a lot of data? You can get the percentage of available memory your report is using, as well as the actual memory size, through function module /SDF/MEMORY_INFO - or through the function modules that this module calls in turn. The /SDF/MEMORY_INFO module returns a dandy readable list of numbers, all labeled and all serving a purpose. I found the "Extended Memory" results handy.
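To act on these numbers programmatically, something like the sketch below could guard a data-gathering loop. The metric label is copied from the report output shown further below, and the 80% threshold is purely illustrative:

DATA: lt_info     TYPE /sdf/is_output_t,
      ls_info     TYPE /sdf/is_output,
      lv_used_pct TYPE i.

CALL FUNCTION '/SDF/MEMORY_INFO'
  IMPORTING
    result = lt_info.

* Pick out the extended memory usage percentage - the label text
* is assumed to be stable across releases
LOOP AT lt_info INTO ls_info.
  IF ls_info-metric CS 'Extended Memory:Current used(%)'.
    lv_used_pct = ls_info-value.
  ENDIF.
ENDLOOP.

IF lv_used_pct > 80.
* Illustrative threshold: stop gathering, flush to the database,
* or raise a warning here
ENDIF.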

Example report - show memory consumption

Below I've set up a simple report that adds data to an internal table. As the table grows and grows, memory information is displayed. The report:

REPORT zabapcadabra_memory_buster.

DATA: gt_result TYPE /sdf/is_output_t,
      gw_result TYPE /sdf/is_output,
* A test-table with a considerable line size
      lt_test TYPE STANDARD TABLE OF char1024.

DEFINE memory_info_dump.
* The action on this macro: increase the LT_TEST consumption,
* get memory consumption results and report them.
  CLEAR: lt_test[].
  DO &1 TIMES.
    APPEND 'Something' TO lt_test.
  ENDDO.
* Get memory consumption data
  CLEAR: gt_result[].
  CALL FUNCTION '/SDF/MEMORY_INFO'
    IMPORTING
      result = gt_result.
* Show gathered data, focus on the Extended memory
  DESCRIBE TABLE lt_test LINES sy-dbcnt.
  WRITE: / 'Internal table LT_TEST has', sy-dbcnt, 'rows of 1Kb'.
  LOOP AT gt_result INTO gw_result.
    IF gw_result-metric(1) = 'E'.
      WRITE: /(50) gw_result-metric, gw_result-value.
    ENDIF.
  ENDLOOP.
  SKIP.
END-OF-DEFINITION.

START-OF-SELECTION.

  memory_info_dump 50.
  memory_info_dump 500.
  memory_info_dump 5000.
  memory_info_dump 50000.
  memory_info_dump 500000.
  memory_info_dump 1000000.

The results from the /SDF/MEMORY_INFO module reveal a lot more than just the Extended Memory figures (drop the filter on the metric name in the macro to see them all), but this demo focuses on Extended Memory only. Report output:

Internal table LT_TEST has         50  rows of 1Kb
Extended Memory:Current used(%)                    3
Extended Memory:Current used(KB)                   163840
Extended Memory:In Memory(KB)                      4190208

Internal table LT_TEST has        500  rows of 1Kb
Extended Memory:Current used(%)                    3
Extended Memory:Current used(KB)                   163840
Extended Memory:In Memory(KB)                      4190208

Internal table LT_TEST has      5.000  rows of 1Kb
Extended Memory:Current used(%)                    4
Extended Memory:Current used(KB)                   172032
Extended Memory:In Memory(KB)                      4190208

Internal table LT_TEST has     50.000  rows of 1Kb
Extended Memory:Current used(%)                    6
Extended Memory:Current used(KB)                   266240
Extended Memory:In Memory(KB)                      4190208

Internal table LT_TEST has    500.000  rows of 1Kb
Extended Memory:Current used(%)                    27
Extended Memory:Current used(KB)                   1171456
Extended Memory:In Memory(KB)                      4190208

Internal table LT_TEST has  1.000.000  rows of 1Kb
Extended Memory:Current used(%)                    52
Extended Memory:Current used(KB)                   2179072
Extended Memory:In Memory(KB)                      4190208

So what's showing? The last information block shows a total available Extended Memory of 4.190.208Kb, of which 2.179.072Kb is in use with the full 1.000.000 rows loaded. That's 52%.
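As an aside: if you're only after your own session's footprint rather than the server-wide Extended Memory figures, newer releases also ship the kernel class CL_ABAP_MEMORY_UTILITIES. A minimal sketch, assuming the class is available on your system:

* Total memory currently in use by this internal session, in bytes
* (assumes CL_ABAP_MEMORY_UTILITIES is available; the inline
* DATA( ) declaration needs release 7.40 or later)
CL_ABAP_MEMORY_UTILITIES=>GET_TOTAL_USED_SIZE(
  IMPORTING size = DATA(lv_size) ).
WRITE: / 'Memory used by this session (bytes):', lv_size.

The same class also offers a DO_GARBAGE_COLLECTION method, which can be handy to call just before measuring.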