B2C With Strict Firewall Rules: Mission Possible
Last weekend I took part in a so-called “Solution Jam” (and was a co-winner, tied with Roel van den Berge and his impressive ABAP Social Media Listener) at the SAP Inside Track Hamburg.
Kudos to Renald Wittwer and Peter Langner for the well-organized event!
In this SolJam I presented a possible solution for managing a B2C scenario when your company/customer has strict firewall rules. Because time was short (we only had 8 minutes for our demos), I couldn’t show the technical details. With this blog post I want to catch up.
Disclaimer
This is just a thought model on how to solve this problem. Don’t use it in a productive environment without further investigation, especially regarding security and sizing!
The scenario
Regular B2B setup
I don’t think this scenario needs any further explanation.
B2B setup with strict firewall rules
But what if your company/customer has strict firewall rules and you cannot access your intranet systems from inside your DMZ? This is not unusual, and for good reasons, but the discussion of this topic is not part of this blog post.
One possible solution is to store the IDocs in the DMZ filesystem; the ERP system pulls and processes these files later. Some years ago I implemented such a scenario using the SAP Business Connector.
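Just to illustrate the pull direction, here is a minimal ABAP sketch of how the ERP side could read a staged IDoc file from a share mounted on the application server. The path and file name are invented for illustration, and the handover to standard IDoc inbound processing (e.g. report RSEINB00 or a file port) is not shown; in my implementation the SAP Business Connector took care of this part.
DATA: path  TYPE string VALUE '/dmz_share/idocs/inbound.dat',  "hypothetical mount point
      line  TYPE string,
      lines TYPE string_table.
OPEN DATASET path FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET path INTO line.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  APPEND line TO lines.
ENDDO.
CLOSE DATASET path.
* hand the collected lines over to standard IDoc inbound processing here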
B2C setup with strict firewall rules
So far, so good. Buuut you cannot use files for a B2C connection, where you may have hundreds or thousands of parallel accesses. You need something like a cache database.
Technical details for the demo
In my demo I used MongoDB as the cache DB, with a Python HTTP interface on top of it, and XAMPP as the HTTP server.
Test setup:
In the DMZ you install MongoDB, Sleepy Mongoose (the HTTP interface) and your preferred HTTP server (a quick smoke test from the ERP side is sketched below).
On ERP side you need
- the ABAP JSON document class (https://cw.sdn.sap.com/cw/groups/zjson)
- MoDBap, the ABAP Mongo DB interface (https://cw.sdn.sap.com/cw/groups/modbap)
For installation and usage instructions please take a look at the corresponding SAP Code Exchange wikis.
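To verify that the ERP system can reach the HTTP interface in the DMZ (the firewall only has to allow this outbound direction), a minimal smoke test could look like the following. Host, port and collection are the ones used later in the demo; error handling is omitted.
DATA: client   TYPE REF TO if_http_client,
      response TYPE string.
* call the Sleepy Mongoose _find endpoint and print the raw JSON answer
cl_http_client=>create_by_url(
  EXPORTING url    = 'http://192.168.38.52:27080/sithh/products/_find'
  IMPORTING client = client ).
client->send( ).
client->receive( ).
response = client->response->get_cdata( ).
WRITE: / response.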
Publishing the catalog data
First of all we want to publish some catalog data from the ERP system into MongoDB. As an example we use data from the SDEMO package (also available in the NW 7.02 test system).
TYPES: BEGIN OF ts_product
, product_id TYPE sdemo_product_id
, category_id TYPE sdemo_category_id
, name TYPE sdemo_prd_name
, price TYPE sdemo_price
, web_address TYPE sdemo_web_address
, text TYPE char255
, END OF ts_product
.
DATA: products TYPE STANDARD TABLE OF ts_product.
FIELD-SYMBOLS: <product> TYPE ts_product.
DATA: connection TYPE REF TO zcl_mongo_connection,
db TYPE REF TO zcl_mongo_db,
collection TYPE REF TO zcl_mongo_collection.
*--------------------------------------------------------------------*
SELECT sdemo_pd~product_id
sdemo_pd~category_id
sdemo_pd_na~name
sdemo_pd_pr~price
sdemo_pd_wr~web_address
sdemo_text~text
INTO TABLE products
FROM sdemo_pd
INNER JOIN sdemo_pd_na
ON sdemo_pd_na~name_guid = sdemo_pd~name_guid
AND sdemo_pd_na~langu = 'E'
INNER JOIN sdemo_pd_pr
ON sdemo_pd_pr~product_guid = sdemo_pd~product_guid
INNER JOIN sdemo_pd_wr
ON sdemo_pd_wr~product_guid = sdemo_pd~product_guid
INNER JOIN sdemo_text
ON sdemo_text~text_guid = sdemo_pd~description_guid
AND sdemo_text~langu = 'E'.
*--------------------------------------------------------------------*
connection = zcl_mongo_connection=>connect( '192.168.38.52' ).
db = connection->db( 'sithh' ).
collection = db->collection( 'products' ).
LOOP AT products
ASSIGNING <product>.
<product>-web_address = |http://192.168.38.53:8000{ <product>-web_address }|.
collection->insert( <product> ).
ENDLOOP.
Simple, isn’t it?
OK, one short explanation of <product>-web_address: this is the field where the web address of the product photo is stored. To simplify the example I’m just storing the MIME repository address of the SAP system; this will not work behind the firewall. In a real scenario you have to export the pictures to your HTTP server.
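A minimal sketch of what such an export could look like: read the picture from the MIME repository via the standard API and write it to a directory served by the HTTP server in the DMZ. Both the MIME path and the target path are invented for illustration.
DATA: mr_api  TYPE REF TO if_mr_api,
      content TYPE xstring,
      target  TYPE string VALUE '/dmz_share/htdocs/images/product.jpg'.  "hypothetical target on the web server share
mr_api = cl_mime_repository_api=>get_api( ).
* read the picture from the MIME repository (hypothetical path)
mr_api->get( EXPORTING i_url     = '/SAP/PUBLIC/SDEMO/product.jpg'
             IMPORTING e_content = content ).
* write it as a binary file to the DMZ share
OPEN DATASET target FOR OUTPUT IN BINARY MODE.
TRANSFER content TO target.
CLOSE DATASET target.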
After executing the report, you have the catalog data as JSON records in your DB.
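A single product record in the products collection would then look roughly like this (all values are just an illustration; MongoDB additionally adds its own _id field):
{
  "product_id": "SDEMO_PR_01",
  "category_id": "SDEMO_CAT_01",
  "name": "Notebook Basic 15",
  "price": 956.00,
  "web_address": "http://192.168.38.53:8000/sap/public/...",
  "text": "A lightweight notebook for everyday use"
}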
Reading catalog data
In the application (SAPUI5) we now read the catalog data via the HTTP interface (JSONP, because the HTTP server and the HTTP interface use different ports!) and bind the data to the model and the table rows:
<!-- load table content via JSONP -->
<script type='text/javascript' src="http://192.168.38.52:27080/sithh/products/_find?callback=bindcatalog"></script>
// Callback function from Mongo DB (JSONP)
function bindcatalog (testdata) {
oModel.setData(testdata);
oTable.bindRows("results");
};
Result:
After adding some products to the shopping cart, we are ready to order:
As already mentioned above, the HTTP server and the interface use different ports, so we have to use Ajax for a cross-domain POST:
$.ajax({
type: 'POST',
url: "http://192.168.38.52:27080/sithh/orders/_insert",
data: "docs=" + oOrderModel.getJSON(),
crossDomain: true,
…
});
In the ERP system we are now able to read the incoming orders (via a job) and create customer master data, sales orders or whatever.
DATA: connection TYPE REF TO zcl_mongo_connection,
db TYPE REF TO zcl_mongo_db,
collection TYPE REF TO zcl_mongo_collection,
result TYPE string,
json_doc TYPE REF TO zcl_json_document,
dump TYPE string_table.
FIELD-SYMBOLS: <line> TYPE string.
*--------------------------------------------------------------------*
connection = zcl_mongo_connection=>connect( '192.168.38.52' ).
db = connection->db( 'sithh' ).
collection = db->collection( 'orders' ).
collection->find( IMPORTING result = result ).
json_doc = zcl_json_document=>create_with_json( result ).
json_doc->dumps( IMPORTING result = dump ).
LOOP AT dump
ASSIGNING <line>.
WRITE:/ <line>.
ENDLOOP.
The last lines (after json_doc->dumps) are only used to format the output.
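How the incoming order is finally posted is up to your process. As a rough illustration only (the mapping of the JSON document into the BAPI structures via zcl_json_document is not shown, and all values are hypothetical), the sales order creation could be done with BAPI_SALESORDER_CREATEFROMDAT2:
DATA: header      TYPE bapisdhd1,
      items       TYPE STANDARD TABLE OF bapisditm,
      item        TYPE bapisditm,
      partners    TYPE STANDARD TABLE OF bapiparnr,
      partner     TYPE bapiparnr,
      salesdoc    TYPE bapivbeln-vbeln,
      bapi_return TYPE STANDARD TABLE OF bapiret2.
* values would come from the parsed JSON order document
header-doc_type    = 'TA'.            "order type (example)
partner-partn_role = 'AG'.            "sold-to party
partner-partn_numb = '0000004711'.    "customer created/determined before
APPEND partner TO partners.
item-itm_number = '000010'.
item-material   = 'SDEMO_PRODUCT'.    "hypothetical material number
item-target_qty = 1.
APPEND item TO items.
CALL FUNCTION 'BAPI_SALESORDER_CREATEFROMDAT2'
  EXPORTING
    order_header_in = header
  IMPORTING
    salesdocument   = salesdoc
  TABLES
    return          = bapi_return
    order_items_in  = items
    order_partners  = partners.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.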
Have fun! (easy, if you are using #sapui5 and a NoSQL database 😆 )
Cartoon used with friendly permission by Oliver Widder
http://geekandpoke.typepad.com/geekandpoke/2011/12/the-hard-life-of-a-nosql-coder.html
Hi Uwe,
thank you for sharing your ideas here and with us in Hamburg.
You have done a very good job.
Peter
Thanks for sharing this, I have taken the (little) time to rate! 🙂
Hi Uwe,
Interesting read. A bit late, but I was wondering about a B2C scenario where the customer needs to view and change his information on a website (and this needs to be processed in the backend in real time or almost real time).
How is this scenario handled in a B2C setup with strict firewall rules?
- Do we go to a B2B setup with ERP -> WebDispatcher/Gateway -> Website (but in heavy-load situations I can imagine they might want a cache DB, or the investment to let ERP handle this load might seem too costly)?
- Do we use your cache DB setup but allow one main call to get all real-time ERP info into the cache and work further on the cache? But then how would we handle changes from that point on?
Kind regards,
Wouter
Hi Wouter,
in a strict firewall-rule scenario all doors (ports) are closed, so you'll never be able to get real-time data out of the ERP system; all data has to be cached within the DMZ.
User record maintenance (self-service) and document history should work, but I think one of the main problems in such scenarios will be the availability check (ATP). This may be a reason why your order confirmation email takes some time at many online resellers: the order must first be processed in the backend via a job.
Happy holidays
Uwe