HADOOP HDFS Explorer built with HANA XS and SAPUI5
For those of you who have started to explore HADOOP, you might be familiar with HUE (Hadoop User Experience) or the Linux command line for moving files to and from the HADOOP Distributed File System (HDFS).
As a bit of a novelty I thought it’d be interesting to build a prototype of an HDFS Explorer, using HANA XS & SAPUI5.
HUE File Browser HANA: HADOOP HDFS Explorer
HUE is primarily built using PYTHON together with a number of PYTHON libraries, such as DJANGO and MAKO.
NOTE: by default the webHDFS REST service runs on ports 50070 (NameNode) and 50075 (DataNode) on your HADOOP cluster. I needed to redirect these to 50001 and 50013 respectively to enable xshttpdest to work. For SAP HANA Cloud Platform the valid ports to use are documented here: SAP HANA Cloud Platform
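For reference, an HTTP destination for the NameNode might look roughly like the sketch below (an .xshttpdest file; the host name is a placeholder and the exact set of properties may vary by HANA revision):

```
description = "HADOOP webHDFS NameNode";
host = "hadoop.example.com";
port = 50001;
pathPrefix = "/webhdfs/v1";
proxyType = none;
authType = none;
useSSL = false;
timeout = 30000;
```

A second destination pointing at the DataNode port (50013 in my setup) is needed for the redirected download step described below.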
Here are a few screenshots comparing the original and the new, improved HANA version 😛
HANA XS Version (with Download option):
Select the file and then click the Download button:
The data flow of the HANA XS version is:
The HADOOP webHDFS REST service uses a URL parameter, the operation (op), to select what to do on HDFS.
I only used the following operations in this example:
LISTSTATUS – acts like the ‘dir’ or ‘ls’ command on a specified HDFS directory.
OPEN – acts like an FTP ‘get’ command.
NOTE: OPEN is called twice: first on the HADOOP NameNode to find the DataNode where the file is stored, then on that DataNode to download the file.
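The URL scheme behind these calls is simple enough to sketch. The snippet below is illustrative Python (not the XSJS from the project); the host name and paths are placeholders:

```python
from urllib.parse import urlencode

def webhdfs_url(host, port, path, op, **params):
    """Build a webHDFS REST URL of the form
    http://<host>:<port>/webhdfs/v1<path>?op=<OP>&..."""
    query = urlencode({"op": op, **params})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"

# Listing a directory (the 'ls'-style call):
list_url = webhdfs_url("namenode.example.com", 50001, "/user/hdfs", "LISTSTATUS")

# Step 1 of a download: OPEN against the NameNode, which answers with a
# 307 redirect whose Location header points at the DataNode holding the file.
open_url = webhdfs_url("namenode.example.com", 50001,
                       "/user/hdfs/data.csv", "OPEN")
```

In the XS service, step 2 is simply a GET against the Location URL returned by the NameNode, which streams the file contents back to the browser.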
The full code is available to download here: https://github.com/AronMacDonald/HanaHdfsExplorer
For those interested in saving a file to HADOOP, I’ve created an example here, using the CREATE operation:
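CREATE follows the same two-step pattern as OPEN, just with PUT instead of GET. A minimal Python sketch of step 1 (host and file path are placeholders; the project itself does this in XSJS):

```python
from urllib.parse import urlencode
import urllib.request

def create_request(namenode, port, path, overwrite=True):
    # Step 1: an empty PUT to the NameNode; webHDFS answers with a 307
    # redirect whose Location header names the DataNode to write to.
    query = urlencode({"op": "CREATE", "overwrite": str(overwrite).lower()})
    url = f"http://{namenode}:{port}/webhdfs/v1{path}?{query}"
    return urllib.request.Request(url, method="PUT")

req = create_request("namenode.example.com", 50001, "/user/hdfs/upload.txt")
# Step 2 (not shown): PUT the actual file contents to the redirect
# Location URL returned by the NameNode.
```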