Implement “Near Me” in SAP CRM Sales Mobile
To use the Near Me feature of the SAP CRM Sales mobile application, you must extract geocode data for the BP master data when Accounts or Contacts are pushed from CRM to NetWeaver Mobile.
A customer user exit is provided as a BAdI that customers can implement to determine the geocode data for each Account or Contact address. Implement the BAdI /MSA/BADI_BUSINESS_PARTNER, method GETDETAIL, and fill the LONGITUDE and LATITUDE fields available in the CT_BUPA_HEADER_DETAILS node.
You can obtain the latitude and longitude from the free Google geocoding service.
This Google service enforces a daily query limit, so you should store the Google response for a given address in a table and reuse the stored latitude and longitude for that address instead of making a new query every time.
If you run the data transmission several times on different days, you will see the number of rows in the Z-table grow.
This implementation lets you work around the Google limits.
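For reference, the code in STEP 2 calls the geocoding endpoint with a URL of the following form and reads the <status>, <result>, <geometry> and <location> elements of the XML response. The address and coordinate values here are purely illustrative and the response is abbreviated:

http://maps.google.com/maps/api/geocode/xml?address=1,Via+Roma,Milano,20121,MI,IT&sensor=false

<GeocodeResponse>
  <status>OK</status>
  <result>
    ...
    <geometry>
      <location>
        <lat>45.4642035</lat>
        <lng>9.1899810</lng>
      </location>
      ...
    </geometry>
  </result>
</GeocodeResponse>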
STEP 1
Create the table ZMOBILEGEOLOC:
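The table definition itself is not listed here; based on the fields the code in STEP 2 reads and writes, a minimal definition could look like the sketch below. The data elements are only an assumption — any key field long enough for the concatenated address plus two fields for the coordinates will do:

Transparent table ZMOBILEGEOLOC
  MANDT        (key)   MANDT    CLNT 3     Client
  INDIRIZZO    (key)   CHAR255  CHAR 255   Concatenated address string used as the lookup key
  LATITUDINE           CHAR50   CHAR 50    Latitude returned by the geocoding service
  LONGITUDINE          CHAR50   CHAR 50    Longitude returned by the geocoding service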
STEP 2
Implement /MSA/BADI_BUSINESS_PARTNER and write the GETDETAIL method with the following code:
FIELD-SYMBOLS: <fs_hd>        TYPE /msa/s_bupa_header,
               <fs_bupa_note> TYPE LINE OF crms_bupa_mob_note_t.

DATA: l_http_client    TYPE REF TO if_http_client,
      l_http_url       TYPE string,
      content          TYPE string,
      temp             TYPE geolon,
      via_esc          TYPE string,
      num_esc          TYPE string,
      cap              TYPE string,
      comune_esc       TYPE string,
      country          TYPE string,
      prov             TYPE string,
      status           TYPE string,
      v1               TYPE char50,
      v2               TYPE char50,
      v3               TYPE char50,
      v4               TYPE char50,
      lv_address       TYPE char255,
      wa_zmobilegeoloc TYPE zmobilegeoloc.

SORT ct_bupa_note BY spras seq_no ASCENDING.
LOOP AT ct_bupa_header_details ASSIGNING <fs_hd>.

  l_http_url = 'http://maps.google.com/maps/api/geocode/xml?address='.

* Collect the address parts of the current Account/Contact
  via_esc = <fs_hd>-street.
  num_esc = <fs_hd>-house_no.
  IF num_esc IS INITIAL.
    num_esc = '1'.
  ENDIF.
  comune_esc = <fs_hd>-location.
  IF comune_esc IS INITIAL.
    comune_esc = <fs_hd>-city.
  ENDIF.
  cap     = <fs_hd>-postl_cod1.
  country = <fs_hd>-country.
  prov    = <fs_hd>-region.

* URL-encode the free-text parts of the address
  via_esc    = cl_http_client=>escape_url( via_esc ).
  num_esc    = cl_http_client=>escape_url( num_esc ).
  comune_esc = cl_http_client=>escape_url( comune_esc ).
  REPLACE '%20' WITH '+' INTO via_esc IN CHARACTER MODE.
  REPLACE '%20' WITH '+' INTO num_esc IN CHARACTER MODE.
  REPLACE '%20' WITH '+' INTO comune_esc IN CHARACTER MODE.

* Skip addresses that are too incomplete to geocode
  CHECK ( comune_esc IS NOT INITIAL OR cap IS NOT INITIAL OR prov IS NOT INITIAL ).
  CONCATENATE num_esc ',' via_esc ',' comune_esc ',' cap ',' prov ',' country INTO lv_address.

* Reuse a geocode already stored in the Z-table for the same address, if any
  SELECT SINGLE * INTO CORRESPONDING FIELDS OF wa_zmobilegeoloc
    FROM zmobilegeoloc
    WHERE indirizzo EQ lv_address.
  IF sy-subrc EQ 0 AND wa_zmobilegeoloc-longitudine IS NOT INITIAL AND wa_zmobilegeoloc-latitudine IS NOT INITIAL.
    <fs_hd>-longitude = wa_zmobilegeoloc-longitudine.
    <fs_hd>-latitude  = wa_zmobilegeoloc-latitudine.
  ELSE.
*   No cached geocode found: query the Google geocoding service
    CONCATENATE l_http_url num_esc ',' via_esc ',' comune_esc ',' cap ',' prov ',' country '&sensor=false' INTO l_http_url.

*   Creation of new IF_HTTP_CLIENT object
    CALL METHOD cl_http_client=>create_by_url
      EXPORTING
        url                = l_http_url
      IMPORTING
        client             = l_http_client
      EXCEPTIONS
        argument_not_found = 1
        plugin_not_active  = 2
        internal_error     = 3
        OTHERS             = 4.
*   Check this was successful
    CHECK sy-subrc IS INITIAL.

*   Set the request method
    l_http_client->request->set_header_field( name  = '~request_method'
                                               value = 'GET' ).
*   Set the header
    l_http_client->request->set_header_field( name  = 'Content-Type'
                                               value = 'text/xml; charset=utf-8' ).

*   Send the request (retry up to three times until Google returns status OK)
    CLEAR status.
    DATA num_request TYPE int4.
    num_request = 0.
    WHILE status NE 'OK' AND num_request < 3.
      " wait a little before each request to respect the Google rate limit
      WAIT UP TO 1 SECONDS.
      l_http_client->send( ).
*     Retrieve the result
      CALL METHOD l_http_client->receive
        EXCEPTIONS
          http_communication_failure = 1
          http_invalid_state         = 2
          http_processing_failed     = 3
          OTHERS                     = 4.
      content = l_http_client->response->get_cdata( ).
* ************************************* PARSE XML **************************
* initialize iXML
      TYPE-POOLS: ixml.
      CLASS cl_ixml DEFINITION LOAD.
* create main factory
      DATA: ixmlfactory TYPE REF TO if_ixml.
      ixmlfactory = cl_ixml=>create( ).
* create stream factory
      DATA: streamfactory TYPE REF TO if_ixml_stream_factory.
      streamfactory = ixmlfactory->create_stream_factory( ).
* create input stream
      DATA: istream TYPE REF TO if_ixml_istream.
      istream = streamfactory->create_istream_string( content ).
* parse input document ==============================================
* initialize input document
      DATA: idocument TYPE REF TO if_ixml_document.
      idocument = ixmlfactory->create_document( ).
* parse input document
      DATA: iparser TYPE REF TO if_ixml_parser.
      iparser = ixmlfactory->create_parser(
                  stream_factory = streamfactory
                  istream        = istream
                  document       = idocument ).
      iparser->parse( ).

      " read xml
      DATA: loc_node               TYPE REF TO if_ixml_node,
            geo_node               TYPE REF TO if_ixml_node,
            lng_node               TYPE REF TO if_ixml_node,
            lat_node               TYPE REF TO if_ixml_node,
            res_node               TYPE REF TO if_ixml_node,
            status_node            TYPE REF TO if_ixml_node,
            loc_node_collection    TYPE REF TO if_ixml_node_collection,
            geo_node_collection    TYPE REF TO if_ixml_node_collection,
            status_node_collection TYPE REF TO if_ixml_node_collection,
            res_node_collection    TYPE REF TO if_ixml_node_collection,
            res_children_node_list TYPE REF TO if_ixml_node_list.

      res_node_collection    = idocument->get_elements_by_tag_name( 'result' ).
      status_node_collection = idocument->get_elements_by_tag_name( 'status' ).
      IF status_node_collection IS NOT INITIAL.
        status_node = status_node_collection->get_item( 0 ).
        MOVE status_node->get_value( ) TO status.
        MOVE status TO v1.
      ENDIF.
      num_request = num_request + 1.
    ENDWHILE.
    CHECK res_node_collection IS NOT INITIAL.
    res_node = res_node_collection->get_item( 0 ).
    CHECK res_node IS NOT INITIAL.
    res_children_node_list = res_node->get_children( ).

*   Look for the <geometry> node of the first <result>
    CLEAR geo_node.
    DATA: tmp_node TYPE REF TO if_ixml_node,
          i        TYPE int4.
    i = 0.
    DO res_children_node_list->get_length( ) TIMES.
      tmp_node = res_children_node_list->get_item( i ).
      IF tmp_node->get_name( ) = 'geometry'.
        geo_node = tmp_node.
      ENDIF.
      i = i + 1.
    ENDDO.
    CHECK geo_node IS NOT INITIAL.

*   <location> is expected as first child of <geometry>; its first/last children are <lat>/<lng>
    loc_node = geo_node->get_first_child( ).
    CHECK loc_node IS NOT INITIAL.
    lat_node = loc_node->get_first_child( ).
    lng_node = loc_node->get_last_child( ).
    CHECK lng_node IS NOT INITIAL.
    MOVE lng_node->get_value( ) TO <fs_hd>-longitude.
    CHECK lat_node IS NOT INITIAL.
    MOVE lat_node->get_value( ) TO <fs_hd>-latitude.

*   Store the coordinates in the Z-table so the same address is not requested again
    CLEAR wa_zmobilegeoloc.
    CHECK lv_address IS NOT INITIAL AND <fs_hd>-latitude IS NOT INITIAL AND <fs_hd>-longitude IS NOT INITIAL.
    wa_zmobilegeoloc-indirizzo   = lv_address.
    wa_zmobilegeoloc-latitudine  = <fs_hd>-latitude.
    wa_zmobilegeoloc-longitudine = <fs_hd>-longitude.
    MODIFY zmobilegeoloc FROM wa_zmobilegeoloc.
  ENDIF.
ENDLOOP.
STEP 3
Run the data transmission.
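To verify that the cache is growing between runs, you can check the row count of ZMOBILEGEOLOC in SE16, or with a quick report like this minimal sketch (the report name is just an example):

REPORT zcheck_mobilegeoloc.

DATA lv_rows TYPE i.

* Count how many addresses have already been geocoded and cached
SELECT COUNT(*) FROM zmobilegeoloc INTO lv_rows.
WRITE: / 'Geocoded addresses cached in ZMOBILEGEOLOC:', lv_rows.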
Hi,
Why store the information in a Z-table when there is a standard table?
Why do it only for the accounts sent to the mobile server, when you could implement the same function for all your accounts whenever an address is created or updated? That way you don't need a table to store the Google responses, and you only request a new geolocation when it is actually needed.
The free Google geocoding API allows about 2,500 requests per day. If you put this code in GETDETAIL and try to run an initial download, you will probably have far more than 2,500 accounts. In my system I have around 1.5M. What happens if you have 20k?
Shouldn't you close the connection?
I don't really like the solution proposed.
Regards
Hi Jorge,
Which is the standard table?
We already request a new geolocation only when needed, i.e. when there is no row stored in the table for that address.
I know about the Google request limits; you do have to run the initial download several times, but after a few days all your information is stored in the Z-table. With 20k accounts I estimate about 7-8 days... early in a project this is generally not a problem, and you get the advantage of a free solution that requires no registration.
After the initial download, when an address is created or updated, we have no problem with the query limit.
Regards
Hi,
The standard table is called GEOLOC.
So with 20k accounts you have to start the initial download 8 times? Do you really think that is good performance? Basically, just to update the geolocation you run the initial download as many times as you need, so with 100k accounts you just do it again and again. I hope you don't have a high volume of customers.
The idea of doing a full initial download every single time just to update the geolocation seems ridiculous to me.
So you say that after the initial download you have no problems with created or updated addresses? If that's true, you need to trigger a geocoding request after each address modification, and that is not implemented in your code. Besides, I guess that in your system you don't create more than 2,500 accounts per day... You say you only ask for a geocode when it's needed, but you are only checking whether a geolocation was already requested in the past, not whether anything has changed...
You should really review your process, don't you think?
Regards
On each project it is necessary to calibrate the solution to the required scenario, the budget (time and costs) and the expectations of the customer.
In my project we have 5,000 partners, and around 200-300 are inserted or updated per day.
If you operate at another scale you can use the "Google Maps API for Business" (100,000 requests per day).
I usually go to buy bread by bike, not by truck.
If you prefer, I can change the title and specify that this solution is for small businesses and small budgets...
Hi,
For that reason, if you blog a how-to solution you can't think only of one system with 5,000 accounts, which in my experience is a really small number. You need to think about how to do it for other systems.
Even if that is the case, it makes no sense at all to run two initial downloads just to fill the geolocation, or to use custom tables instead of the standard one. This is really OBVIOUS.
You can make this work for big projects with a small budget.
You say you go to buy bread by bike, not by truck, but you also need to take the shortest path, not ride to the neighbouring city and back... and that is what you are doing right now.
I was only trying to give you an idea of how to do things better. If you don't want to accept it, don't worry; neither do I.
Regards
Unfortunately I have not seen any concrete idea, except for using the standard table, on which I can agree. Maybe I did not understand it, but how have you implemented the geolocation? If you explain your solution it would benefit the whole community.
Hi,
Did you check this blog?
http://scn.sap.com/community/abap/blog/2010/04/10/community-project-zgeocode
https://cw.sdn.sap.com/cw/groups/zgeocode
Cheers!
Luis