hynek_petrak

This blog and the code below were inspired by this post: Extract data from SAP table / query by takahiro.kubo.


My other motivation was that there are apparently still a lot of people fascinated by RFC_READ_TABLE, despite the fact that it is not the right way of getting information out of SAP R/3. I believe the approach below will inspire those people to change their minds.

To identify a query within the SQ01 system you'll need a UserGroup and QueryName. I believe you are familiar with those terms.

For an SQVI query, getting the UserGroup is not that obvious, but we can extract it from the report name: from the menu choose QuickView -> Additional functions -> Display report name. The user group is in the middle of the name; it starts with SYSTQV and has 12 characters.

Alternatively, you can get the user's user group as described in How to find and execute or work with other users quick views (or quick views created with old user-i...
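As a small illustration of the rule above, extracting the user group from a QuickViewer report name can be sketched as follows (the helper name and the sample report name in the usage line are made up, not taken from a real system):

```csharp
using System;

static class QuickViewHelper
{
    // Per the rule above: the user group is the 12-character token
    // inside the report name that starts with "SYSTQV".
    public static string UserGroupFromReportName(string reportName)
    {
        int idx = reportName.IndexOf("SYSTQV", StringComparison.Ordinal);
        if (idx < 0 || idx + 12 > reportName.Length)
            throw new ArgumentException("No SYSTQV user group found in: " + reportName);
        return reportName.Substring(idx, 12);
    }
}
```

For a made-up report name like AQZZSYSTQV000025ZZMYQUERY this returns SYSTQV000025.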

Specifying input for the selection screen is slightly more complicated (not that much), therefore I recommend reading Takahiro's code or browsing through SCN. In the example below we will call a prepared variant by name.

When creating the variant, select SAP List Viewer as output. Choosing MS Excel may lead to an exception when calling through RFC.

1. First we invoke RSAQ_REMOTE_QUERY_CALL with the given parameters. If we call via RFC we want DATA_TO_MEMORY = 'X', otherwise the output table (LDATA) is not filled. For EXTERNAL_PRESENTATION you may choose from the values below; I have tested only 'Z'.

'S' convert except for dates

'X' convert

'E' convert curr

'Z' convert curr except for dates


static DataTable FetchQuery(string dest, string user_group, string query, string variant) {
    RfcDestination _dest = RfcDestinationManager.GetDestination(dest);
    IRfcFunction fn = _dest.Repository.CreateFunction("RSAQ_REMOTE_QUERY_CALL");
    fn.SetValue("USERGROUP", user_group);
    fn.SetValue("QUERY", query);
    fn.SetValue("VARIANT", variant);
    fn.SetValue("DATA_TO_MEMORY", "X");
    fn.SetValue("EXTERNAL_PRESENTATION", "Z");
    fn.Invoke(_dest);





2. We store the result in a DataTable whose column names are the query's field names (LISTDESC-FNAME). Takahiro's way 🙂


    DataTable result = new DataTable();
    IRfcTable listdesc = fn.GetTable("LISTDESC");
    for (int i = 0; i < listdesc.Count; i++) {
        listdesc.CurrentIndex = i;
        DataColumn column = new DataColumn(listdesc.GetString("FNAME"));
        if (result.Columns.Contains(column.ColumnName)) {
            // disambiguate duplicate field names with the field position
            column.ColumnName = column.ColumnName + "_" + listdesc.GetString("FPOS");
        }
        column.Caption = listdesc.GetString("FDESC");
        result.Columns.Add(column);
    }





3. We need to parse the output data and store it in our table's rows. The format of LDATA is

LEN:FIELD_DATA,LEN:FIELD_DATA,LEN:FIELD_DATA;LEN:FIELD_DATA,LEN:FIELD_DATA,LEN:FIELD_DATA;

where each field's content is prefixed with a three-digit number giving its length in characters. ',' is the field separator, ';' is the row separator and ':' separates the field length from the data itself. In the first step we concatenate all the LDATA lines into one single string (ldatastr). Those who object that this is not memory-efficient may implement a read-ahead buffer, but since RAM got cheap, I do not care anymore.


    // read data
    IRfcTable ldata = fn.GetTable("LDATA");
    string ldatastr = "";
    for (int i = 0; i < ldata.Count; i++) {
        ldata.CurrentIndex = i;
        ldatastr += ldata.GetString("LINE");
    }





4. We read the LDATA content the old telegraph (Electrical telegraph - Wikipedia) way: read the field length, check there is a ':', read the field data, check there is a ',' or ';', and so on.


    int pos = 0;
    int col = 0;
    DataRow row = result.NewRow();
    for (; ; ) {
        // read field length
        if (ldatastr.Length - pos < 3) {
            break;
        }
        int fl = int.Parse(ldatastr.Substring(pos, 3));
        pos += 3;
        // assert about ':' separator
        if (ldatastr[pos] != ':') {
            throw new Exception(string.Format("No ':' at {0}", pos));
        }
        pos++;
        // read the field data
        if (ldatastr.Length - pos < fl) {
            throw new Exception(string.Format("Not enough data to read {0} long field at {1}", fl, pos));
        }
        string data = ldatastr.Substring(pos, fl);
        row[col] = data;
        pos += fl;
        if (pos >= ldatastr.Length) {
            // truncated record: data ended without a trailing separator
            throw new Exception(string.Format("Unexpected EOF at {0}", pos));
        }
        switch (ldatastr[pos]) {
            case ',': // continue with the next field
                col++;
                break;
            case ';': // continue with the next row
                col = 0;
                result.Rows.Add(row);
                row = result.NewRow();
                break;
            default: // nobody would expect that
                throw new Exception(string.Format("Unexpected character '{0}' at {1}", ldatastr[pos], pos));
        }
        pos++;
    }
    return result;
}
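To sanity-check the parsing logic without an SAP connection, the same telegraph-style loop can be run against a hand-made LDATA string (the field values below are made up):

```csharp
using System;
using System.Collections.Generic;

static class LdataDemo
{
    // Decode an LDATA-style string (3-digit length prefix, ':' before the
    // data, ',' as field separator, ';' as row separator) into rows of
    // string fields; this mirrors the loop in FetchQuery above.
    public static List<List<string>> Parse(string s)
    {
        var rows = new List<List<string>>();
        var row = new List<string>();
        int pos = 0;
        while (s.Length - pos >= 3)
        {
            int fl = int.Parse(s.Substring(pos, 3)); // field length
            pos += 3;
            if (s[pos++] != ':') throw new Exception("expected ':' at " + (pos - 1));
            row.Add(s.Substring(pos, fl));           // field data
            pos += fl;
            char sep = s[pos++];
            if (sep == ';') { rows.Add(row); row = new List<string>(); }
            else if (sep != ',') throw new Exception("expected ',' or ';' at " + (pos - 1));
        }
        return rows;
    }
}
```

For example, `LdataDemo.Parse("005:10000,003:EUR;004:2500,003:USD;")` yields two rows of two fields each: ("10000", "EUR") and ("2500", "USD").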





5. If you need to dump the result into a TAB-delimited file ...


    if (result != null) {
        using (StreamWriter writer = new StreamWriter("output.txt", false, Encoding.UTF8)) {
            string sline = "";
            for (int i = 0; i < result.Columns.Count; i++) {
                if (i > 0) {
                    sline += "\t";
                }
                sline += result.Columns[i].ToString();
            }
            writer.WriteLine(sline); // write the header
            foreach (DataRow r in result.Rows) {
                sline = "";
                for (int i = 0; i < r.ItemArray.Length; i++) {
                    if (i > 0) {
                        sline += "\t";
                    }
                    sline += r[i].ToString();
                }
                writer.WriteLine(sline); // write the row
            }
        }
    }
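The dump above assumes field values contain no tab or newline characters; if they might, a minimal escape helper (the name is hypothetical) could be applied to each value before it is appended:

```csharp
static class TsvUtil
{
    // Replace characters that would corrupt a tab-delimited line:
    // tabs and line feeds become spaces, carriage returns are dropped.
    public static string EscapeTsv(string s) =>
        s.Replace("\t", " ").Replace("\r", "").Replace("\n", " ");
}
```

This is a lossy sketch; if the values must round-trip exactly, a quoted format such as CSV with proper quoting would be the safer choice.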




