Welcome back Community!

Happy to have you here on the hands-on blog of the Node.js I18n – CSV Converter.

Introduction


In this blog we will build the Node.js application itself, using the two features explained in the previous blog, Christmas ? Node.js I18n – CSV Converter – Part ?. For those who didn’t read the previous blog yet: the first feature creates a single CSV file with all your project translations (based on all your i18n files), and the reverse feature writes all the translations from that CSV file back into i18n.properties files, in the original directory of every app/module.

Time to start building!

The full code of this application can be found on my GitHub repository: Node.js-I18n---CSV-Converter.


Building the translation App


Building this application consists of three main parts. The first one is the setup of the Node.js application. More information about building Node.js applications on SAP Cloud Foundry can be found here: Developing Node.js in the Cloud Foundry Environment.

The second one converts all the project's i18n files into one single CSV file.

Last but not least, all i18n files are rebuilt from the CSV and stored back into their original i18n directories.

 

1. Create the Node.js App


 

Create the directory called i18nCsvConverter.

Open the project in Visual Studio Code or any other code editor you prefer.

Create the manifest.yml in the i18nCsvConverter directory and add the following content to it:
---
applications:
- name: i18nCsvConverterApp
  host: i18nCsvConverter
  path: i18nCsvConverterApp
  memory: 128M

Inside your i18nCsvConverter create a new directory called i18nCsvConverterApp.

Next open the terminal in Visual Studio Code and navigate to the i18nCsvConverterApp directory by executing the following command:
cd i18nCsvConverterApp

The next command you have to execute is:
npm init

Provide the configuration values or leave them blank. (I left them blank in this case)
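By the way, if you want to skip the prompts entirely, npm can fill in all the defaults for you:
npm init -y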

Open the generated package.json file; it will look like this:
{
  "name": "i18ncsvconverterapp",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}

Make some adjustments to it so your package.json looks like this:
{
  "name": "i18ncsvconverterapp",
  "version": "1.0.0",
  "description": "",
  "main": "createCsv.js",
  "scripts": {
    "createCsv": "node createCsv.js",
    "createI18n": "node createI18n.js"
  },
  "author": "",
  "license": "ISC"
}

We changed the main script to createCsv.js; this is the first file we will create in a moment. Under scripts we now have the createCsv and createI18n scripts, which call the corresponding .js files with node. So when we execute npm run createCsv or npm run createI18n later on, the matching node *.js command is executed.
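For example, once the scripts exist, the command below (with a placeholder path) effectively runs node createCsv.js with that path, which the script later reads from process.argv[2]:
npm run createCsv "path-to-your-project"
# which npm expands to
node createCsv.js "path-to-your-project"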

All right all set, time to move on to the CSV part!

 

2. Add the create CSV file feature


 

Create the script createCsv.js inside the i18nCsvConverterApp directory.

Add the following content to the script:
// The purpose of "use strict" is to indicate that the code should be executed in "strict mode". With strict mode, you cannot, for example, use undeclared variables.
'use strict';

// The fs module provides an API for interacting with the file system in a manner closely modeled around standard POSIX functions.
const fs = require('fs');

// NPM Module to recursive read directory async (non-blocking).
const rra = require('recursive-readdir-async');

// Convert .properties files to JSON (using JavaScript).
// The function propertiesToJSON takes a string and returns a JavaScript object.
const propertiesToJSON = require("properties-to-json");

// This module makes it easy to convert JSON to CSV and it is very customizable.
const jsonexport = require('jsonexport');

// Convert all the backslashes in the path to your project to forward slashes
const filePath = process.argv[2].replace(/\\/g, "/");
console.log("Filepath to search for i18n files: " + filePath);

All the packages/modules we require here need to be installed, except the file system module, which is built into Node.js. To install these packages, execute the commands below inside your i18nCsvConverterApp directory.
npm i recursive-readdir-async

npm i properties-to-json

npm i jsonexport
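If you prefer, you can also install the three packages in a single command; the result is the same:
npm i recursive-readdir-async properties-to-json jsonexport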

Add the following options for the recursive reader in the script:
// Recursive reader configuration options
const options = {
    mode: rra.LIST,
    recursive: true,
    stats: false,
    ignoreFolders: true,
    extensions: false,
    deep: false,
    realPath: true,
    normalizePath: true,
    include: [".properties"],
    exclude: [],
    readContent: false,
    encoding: 'base64'
}

More information about this recursive reader can be found here:

https://www.npmjs.com/package/recursive-readdir-async

Next call the list functionality of the recursive reader.

List all files with their fullname, isDirectory, name and path. Only .properties files are selected because of the config option include: [".properties"].
rra.list(filePath, options).then(function (list) {
});

This promise resolves with an array of objects holding the properties fullname, isDirectory, name and path of all the .properties files. Why only the .properties files? Because we defined include: [".properties"] in our recursive reader configuration, only .properties files are listed.
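To give you an idea, a single entry of that list looks roughly like this (the paths are just an example):
{
    "name": "i18n.properties",
    "path": "C:/Projects/myMtaProject/myHtml5Module/webapp/i18n",
    "fullname": "C:/Projects/myMtaProject/myHtml5Module/webapp/i18n/i18n.properties",
    "isDirectory": false
}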

Next inside our rra.list promise we only select the .properties files that are inside “i18n” directories.

It will select only the i18n files inside the "/i18n" directories.
const i18nFiles = list.filter(file => file.path.substring(file.path.length - 5) === "/i18n");

For each found file we create a new promise inside our aPromises array.

Here we use a regex to define the appPath and language. If no language can be defined it means it is the default translation.

In our promise we read the file using file.fullname, since this holds the full path and file name of the file we want to read. We pass the utf-8 encoding and, once the reading of the file resolves, we return the appPath, language and data. The data is converted from properties to JSON using our properties-to-json module.
let aPromises = i18nFiles.map(file => {

    let appPath = file.fullname.split(filePath)[1].replace(/(_[a-zA-Z]{2}){0,2}.properties/, "");

    let language = file.fullname.match(/(_[a-zA-Z]{2}){0,2}.properties/)[0].replace(".properties", "").substring(1) || "default";

    let promise = new Promise(function (resolve, reject) {
        fs.readFile(file.fullname, 'utf-8', function (err, data) {
            resolve({ appPath: appPath, language: language, data: propertiesToJSON(data) });
        });
    });
    return promise;
});
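A small worked example (with a hypothetical file) shows what the two regexes produce:
// file.fullname: "C:/Projects/myMtaProject/myHtml5Module/webapp/i18n/i18n_nl_BE.properties"
// filePath:      "C:/Projects/myMtaProject"
// appPath:       "/myHtml5Module/webapp/i18n/i18n"   (language suffix and extension stripped)
// language:      "nl_BE"                             (or "default" for a plain i18n.properties)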

Next we execute all the previously created promises by calling Promise.all, passing our array of promises (aPromises) as parameter. Then, by calling forEach on the result (which holds the data of all files), we iterate over all the keys of each JSON object (created by the properties-to-json module) and push the translation to our aTranslationEntries array.

The variable foundTranslation checks whether the key already exists in aTranslationEntries, based on the key and appName. If it already exists for this app, it is simply another translation in another language, so we set this translation on the same object with the language as key. If it does not exist, we push a new object to the aTranslationEntries array.
Promise.all(aPromises).then(function (result) {
    let aTranslationEntries = [];
    result.forEach(function (file) {
        Object.keys(file.data).forEach(function (key) {
            let foundTranslation = aTranslationEntries.filter(entry => entry.key === key && entry.appName === file.appPath)[0];
            if (foundTranslation) {
                foundTranslation[[file.language]] = file.data[key];
            }
            else {
                aTranslationEntries.push({
                    appName: file.appPath,
                    key: key,
                    [[file.language]]: file.data[key]
                });
            }
        });
    });
});
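With a default and an nl_BE i18n file for the same app, one entry of aTranslationEntries could look like this (the values are just illustrative):
{
    appName: "/myHtml5Module/webapp/i18n/i18n",
    key: "appDescription",
    default: "My application description",
    nl_BE: "Mijn applicatie omschrijving"
}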

Now that we have all our translation entries in the aTranslationEntries array as JSON objects, we can pass them to the jsonexport module. This module expects an array of JSON objects with translations and returns CSV content.

In sPath we use our filePath constant to write the CSV file into the directory where your project is located, with the file name translationFileI18n.csv. It is important to prepend "\ufeff" (the UTF-8 byte order mark) to the content passed to the writeFile function of the file system module. This ensures your file is recognized as UTF-8; otherwise a spreadsheet editor may open your CSV with the wrong or its default encoding.
jsonexport(aTranslationEntries, function (err, csv) {
    const sPath = filePath.match(/^(.*[\\\/])[^\\\/]*$/)[1] + "translationFileI18n.csv";
    // console.log(csv);

    // IMPORTANT to pass \ufeff to tell csv that it is UTF-8
    fs.writeFile(sPath, "\ufeff" + csv, (err) => {
        if (err) {
            console.log(err); // Do something to handle the error or just throw it
            throw new Error(err);
        }
        console.log('Success!');
    });
});
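jsonexport turns that array into a header row plus one line per translation key, so the generated translationFileI18n.csv looks more or less like this (the columns depend on the languages found in your project):
appName,key,default,nl_BE
/myHtml5Module/webapp/i18n/i18n,appTitle,App Title,
/myHtml5Module/webapp/i18n/i18n,appDescription,My application description,Mijn applicatie omschrijving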

 

In the end, the code inside your createCsv.js script should look like this:
'use strict';

const fs = require('fs');
const rra = require('recursive-readdir-async');
const propertiesToJSON = require("properties-to-json");
const jsonexport = require('jsonexport');

const filePath = process.argv[2].replace(/\\/g, "/");

console.log("Filepath to search for i18n files: " + filePath);

const options = {
    mode: rra.LIST,
    recursive: true,
    stats: false,
    ignoreFolders: true,
    extensions: false,
    deep: false,
    realPath: true,
    normalizePath: true,
    include: [".properties"],
    exclude: [],
    readContent: false,
    encoding: 'base64'
}

rra.list(filePath, options).then(function (list) {
    const i18nFiles = list.filter(file => file.path.substring(file.path.length - 5) === "/i18n");

    let aPromises = i18nFiles.map(file => {

        let appPath = file.fullname.split(filePath)[1].replace(/(_[a-zA-Z]{2}){0,2}.properties/, "");

        let language = file.fullname.match(/(_[a-zA-Z]{2}){0,2}.properties/)[0].replace(".properties", "").substring(1) || "default";

        let promise = new Promise(function (resolve, reject) {
            fs.readFile(file.fullname, 'utf-8', function (err, data) {
                resolve({ appPath: appPath, language: language, data: propertiesToJSON(data) });
            });
        });
        return promise;
    });

    Promise.all(aPromises).then(function (result) {

        let aTranslationEntries = [];
        result.forEach(function (file) {
            Object.keys(file.data).forEach(function (key) {
                let foundTranslation = aTranslationEntries.filter(entry => entry.key === key && entry.appName === file.appPath)[0];
                if (foundTranslation) {
                    foundTranslation[[file.language]] = file.data[key];
                }
                else {
                    aTranslationEntries.push({
                        appName: file.appPath,
                        key: key,
                        [[file.language]]: file.data[key]
                    });
                }
            });
        });

        jsonexport(aTranslationEntries, function (err, csv) {
            const sPath = filePath.match(/^(.*[\\\/])[^\\\/]*$/)[1] + "translationFileI18n.csv";
            // console.log(csv);

            // IMPORTANT to pass \ufeff to tell csv that it is UTF-8
            fs.writeFile(sPath, "\ufeff" + csv, (err) => {
                if (err) {
                    console.log(err); // Do something to handle the error or just throw it
                    throw new Error(err);
                }
                console.log('Success!');
            });
        });
    });
});

 

Time to test our createCsv.js script!

You can create your own folder structure, or just create a Multi-Target Application (MTA) project in the SAP Web IDE and add the desired number of modules to it. I added a Fiori Launchpad module and an HTML5 module to my MTA project. Next, export and unzip it.
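For reference, the unzipped project I ran the script against has a structure along these lines (your module names will of course differ):
myMtaProject/
    myFlpModule/
        webapp/
            i18n/
                i18n.properties
    myHtml5Module/
        webapp/
            i18n/
                i18n.properties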

Go back to your terminal and be sure you are located in the i18nCsvConverterApp directory.

In here execute the following command:
npm run createCsv "the-full-path-to-your-project-including-the-project-itself"

(Keep the double quotes)

If you see the following message:

Filepath to search for i18n files: your-path/your-project-name

Success!

You are a lucky one, because you just successfully converted all your project's i18n files into one single CSV file!

Go to the directory where your project is located and find the translationFileI18n.csv file. Open it and find all your translations:



Congratulations, you just created a CSV file with all your project's i18n translations!

 

3. Add the create I18n file feature


 

Create the script createI18n.js inside the i18nCsvConverterApp directory.

Add the following content to the script:
'use strict';

// csvtojson module is a comprehensive nodejs csv parser to convert csv to json or column arrays.
const csvToJson = require('csvtojson');

// Parse the json objects to properties for a .properties file
const jsonToProperties = require("properties-file");

// The fs module provides an API for interacting with the file system in a manner closely modeled around standard POSIX functions.
const fs = require('fs');

// Replace all the backslashes to forward slashes
const filePath = process.argv[2].replace(/\\/g, "/");
const filePathOutput = process.argv[3].replace(/\\/g, "/");

console.log("Filepath to CSV File: " + filePath);

All the packages/modules we require here need to be installed, except the file system module, which is built into Node.js. To install these packages, execute the commands below inside your i18nCsvConverterApp directory.
npm i csvtojson

npm i properties-file

Next, the csvtojson module is used to parse the CSV file into JSON. The fromFile function lets us pass the file path as a parameter, and at the end of the promise a JSON object is returned.

Add this functionality as follows:
csvToJson().fromFile(filePath).then((jsonObj) => {
});
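Each CSV row becomes one JSON object with the CSV headers as property names, so for a CSV like the one generated in the previous part, jsonObj would look roughly like this:
[
    { appName: "/myHtml5Module/webapp/i18n/i18n", key: "appTitle", default: "App Title", nl_BE: "" },
    { appName: "/myHtml5Module/webapp/i18n/i18n", key: "appDescription", default: "My application description", nl_BE: "Mijn applicatie omschrijving" }
]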

Inside the .then function of this promise we will reduce our jsonObj.

So add the following code:
const result = jsonObj.reduce(function (accum, element) {
}, []);

The next step is defining all the unique applications by looking at the app name in the CSV.

This is done in the reduce statement.
const result = jsonObj.reduce(function (accum, element) {
    const o = null;
    let appName = accum.find(obj => {
        return obj.appName === element.appName;
    });
    return accum;
}, []);

If the app does not exist yet in the accumulator, a new object is created and added to the accumulator. This object holds the app name and, per language, an object with the translation for every key. If the app already exists, the translation is added to the right language object of that app.
if (!appName) {
    // create object and add to the right language
    let app = {
        appName: element.appName,
    }

    let aLanguages = Object.keys(element);
    aLanguages = aLanguages.filter(function (l) { return l !== 'appName' && l !== 'key' });

    aLanguages.forEach(function (lang) {
        let key = element.key;
        app[lang] = {};
        app[lang][key] = element[lang];
    });

    accum.push(app);
}

Your reduce statement inside your csvToJson Module should look like this:
const result = jsonObj.reduce(function (accum, element) {
    const o = null;

    let appName = accum.find(obj => {
        return obj.appName === element.appName;
    });

    if (!appName) {
        // create object and add to the right language
        let app = {
            appName: element.appName,
        }

        let aLanguages = Object.keys(element);
        aLanguages = aLanguages.filter(function (l) { return l !== 'appName' && l !== 'key' });

        aLanguages.forEach(function (lang) {
            let key = element.key;
            app[lang] = {};
            app[lang][key] = element[lang];
        });

        accum.push(app);
    }
    else {
        // add to the right language object
        let aLanguages = Object.keys(element);
        aLanguages = aLanguages.filter(function (l) { return l !== 'appName' && l !== 'key' });

        aLanguages.forEach(function (lang) {
            let key = element.key;
            appName[lang][key] = element[lang];
        });
    }

    return accum;
}, []);
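After the reduce, every application has exactly one object in result, with one nested object per language. For the sample rows above that would be roughly:
[
    {
        appName: "/myHtml5Module/webapp/i18n/i18n",
        default: { appTitle: "App Title", appDescription: "My application description" },
        nl_BE: { appTitle: "", appDescription: "Mijn applicatie omschrijving" }
    }
]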

Now that we have our files and their translations as JSON objects, we want to create .properties files from them. So for every file and every language of that file a .properties file is created, following the default, language or language_COUNTRY naming convention. While writing the .properties file, the jsonToProperties module is used to convert the JSON object to .properties content. The fullFinalOutputPath holds the path to the original location of the original or new .properties file.
result.forEach(function (appFile) {
    let aLanguages = Object.keys(appFile);
    aLanguages = aLanguages.filter(function (l) { return l !== 'appName' });
    let appName = appFile.appName;
    let fullFinalOutputPath = null;
    aLanguages.forEach(function (lang) {
        let fullFinalOutputPath = filePathOutput + appName;
        if (lang === "default") {
            fullFinalOutputPath = fullFinalOutputPath + ".properties";
        }
        else {
            fullFinalOutputPath = fullFinalOutputPath + "_" + lang + ".properties";
        }

        fs.writeFile(fullFinalOutputPath, jsonToProperties.stringify(appFile[lang]), (err) => {
            if (err) {
                console.log(err);
                throw new Error(err);
            }
            console.log('Success!');
        });
    });
});
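To make the naming convention concrete, with filePathOutput pointing at the unzipped project folder and the appName from the earlier example, the files end up at (hypothetical paths):
// default -> <your-project>/myHtml5Module/webapp/i18n/i18n.properties
// nl_BE   -> <your-project>/myHtml5Module/webapp/i18n/i18n_nl_BE.properties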

 

In the end, the code inside your createI18n.js script should look like this:
'use strict';

const csvToJson = require('csvtojson');
const jsonToProperties = require("properties-file");
const fs = require('fs');

const filePath = process.argv[2].replace(/\\/g, "/");
const filePathOutput = process.argv[3].replace(/\\/g, "/");

console.log("Filepath to CSV File: " + filePath);

csvToJson().fromFile(filePath).then((jsonObj) => {

    const result = jsonObj.reduce(function (accum, element) {
        const o = null;

        let appName = accum.find(obj => {
            return obj.appName === element.appName;
        });

        if (!appName) {
            // create object and add to the right language
            let app = {
                appName: element.appName,
            }

            let aLanguages = Object.keys(element);
            aLanguages = aLanguages.filter(function (l) { return l !== 'appName' && l !== 'key' });

            aLanguages.forEach(function (lang) {
                let key = element.key;
                app[lang] = {};
                app[lang][key] = element[lang];
            });

            accum.push(app);
        }
        else {
            // add to the right language object
            let aLanguages = Object.keys(element);
            aLanguages = aLanguages.filter(function (l) { return l !== 'appName' && l !== 'key' });

            aLanguages.forEach(function (lang) {
                let key = element.key;
                appName[lang][key] = element[lang];
            });
        }

        return accum;
    }, []);

    result.forEach(function (appFile) {
        let aLanguages = Object.keys(appFile);
        aLanguages = aLanguages.filter(function (l) { return l !== 'appName' });
        let appName = appFile.appName;
        let fullFinalOutputPath = null;
        aLanguages.forEach(function (lang) {
            let fullFinalOutputPath = filePathOutput + appName;
            if (lang === "default") {
                fullFinalOutputPath = fullFinalOutputPath + ".properties";
            }
            else {
                fullFinalOutputPath = fullFinalOutputPath + "_" + lang + ".properties";
            }

            fs.writeFile(fullFinalOutputPath, jsonToProperties.stringify(appFile[lang]), (err) => {
                if (err) {
                    console.log(err);
                    throw new Error(err);
                }
                console.log('Success!');
            });
        });
    });
});

 

Time to test our createI18n.js script!

Add some translations to your translationFileI18n.csv file or add some new languages, as described in the first blog. I added Dutch (Belgium) for the app description.

Go back to your terminal and be sure you are located in the i18nCsvConverterApp directory.

In here execute the following command:
npm run createI18n "the-full-path-to-your-csv-translation-file-including-the-translation-csv-file" "the-full-path-to-your-project-including-the-project-itself"

(Keep the double quotes)

If you see the following message:

Filepath to CSV File: path-to-your-csv-file/translationFileI18n.csv

Success!

Success!

Success!

Success!

Success!

Success!

The number of Success! messages depends on how many .properties files are written.

 

Have a look at your original project. In my case I will find my i18n_nl_BE.properties file inside my HTML5 Module > webapp > i18n directory.



When I open this file, I will see my Dutch description for the appDescription key.



Obviously, the other keys are blank because we did not provide any Dutch translations for these keys.

 

Same result? Awesome, good job! You just converted your CSV file with all your translations back into the corresponding applications, inside the right directories, in the right language and .properties file, with the right naming convention! SUPER AWESOME!

 

What about encoding and escaping?


When I did some Googling I found out that ".properties" files are ISO 8859-1 encoded.

This means when you have strings like for example:

  • Nom d'utilisateur

  • c'était sympa

  • C'est très agréable


Your characters will not be displayed correctly in your app.

They will appear as something like the following string:
'b�b�'

Something we do not want in our application.

This can be avoided by using the following functions I found here:
function padWithLeadingZeros(string) {
    return new Array(5 - string.length).join("0") + string;
}

function unicodeCharEscape(charCode) {
    return "\\u" + padWithLeadingZeros(charCode.toString(16));
}

function unicodeEscape(string) {
    return string.split("")
        .map(function (char) {
            var charCode = char.charCodeAt(0);
            return charCode > 127 ? unicodeCharEscape(charCode) : char;
        })
        .join("");
}

The encoding can be called like this:
var specialStr = 'ipsum áá éé lore';
var encodedStr = unicodeEscape(specialStr);

It will only escape/encode the characters that need it, so the result looks like this:
ipsum \u00e1\u00e1 \u00e9\u00e9 lore

An alternative is using the following npm-package:
npm install unicode-escape

This will escape/encode all your strings and values and not only the "special" ones. But it does save you some functions and lines of code.

It needs to be imported:
const unicodeToJsEscape = require('unicode-escape');

And can be used like this:
unicodeToJsEscape('pasta');

Like I said no special characters here, so we have the following result:
\u0070\u0061\u0073\u0074\u0061

Everything is escaped. If you would add special characters they would be escaped as well.

Obviously you should also import it on top of your "createI18n.js" file after the installation of the package like this:
const unicodeToJsEscape = require('unicode-escape');

 

The only thing you have to do is choose an implementation and call the encoding/escaping function right in your else-statement, where you set the value for your ".properties" files. As you can see, I chose the unicodeEscape function so I do not escape all characters.
else {
    // add to the right language object
    let aLanguages = Object.keys(element);
    aLanguages = aLanguages.filter(function (l) {
        return l !== 'appName' && l !== 'key'
    });

    aLanguages.forEach(function (lang) {
        let key = element.key;
        appName[lang][key] = unicodeEscape(element[lang]);
    });
}

You could also add an extra check in your "createI18n.js" file to skip rows with an empty key coming from your CSV file. If you don't do this, you will get the following output in your i18n files:
key1=value
=
key2=value

This is undesired behavior and can sometimes mess up your translation file.

To avoid this you can just add the following "if(key)" statement inside your "forEach" like this:
else {
    // add to the right language object
    let aLanguages = Object.keys(element);
    aLanguages = aLanguages.filter(function (l) {
        return l !== 'appName' && l !== 'key'
    });

    aLanguages.forEach(function (lang) {
        let key = element.key;
        if (key) {
            appName[lang][key] = unicodeEscape(element[lang]);
        }
    });
}

To tweak the unicodeEscape function a little, I also escape the char if it is equal to a single quote, and I kept the check on "charCode > 127".
function unicodeEscape(string) {
    return string.split("")
        .map(function (char) {
            var charCode = char.charCodeAt(0);
            return charCode > 127 || char === "'" ? unicodeCharEscape(charCode) : char;
        })
        .join("");
}

Regarding the encoding of your CSV file: you want your customer to be able to read these unicode escapes as normal text.

This means you need to convert them back to UTF-8. This decoding part takes place in the "createCsv.js" file. Inside the "forEach" and "Object.keys" functions you will see that the escaped unicode characters in the value of "file.data[key]" are unescaped and decoded again (they are found by the regex and replaced by their UTF-8 value using JSON.parse).
file.data[key].replace(/(\\[a-zA-Z0-9]{5}):*/g, function (string) {
    return JSON.parse('"' + string + '"')
})

So the code looks like this:
let aTranslationEntries = [];
result.forEach(function (file) {
    Object.keys(file.data).forEach(function (key) {
        let foundTranslation = aTranslationEntries.filter(entry => entry.key === key && entry.appName === file.appPath)[0];
        if (foundTranslation) {
            foundTranslation[[file.language]] = file.data[key].replace(/(\\[a-zA-Z0-9]{5}):*/g, function (string) {
                return JSON.parse('"' + string + '"')
            });
        } else {
            aTranslationEntries.push({
                appName: file.appPath,
                key: key,
                [[file.language]]: file.data[key].replace(/(\\[a-zA-Z0-9]{5}):*/g, function (string) {
                    return JSON.parse('"' + string + '"')
                })
            });
        }
    });
});
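As a quick illustration of what that replace does, assuming an escaped value read from a .properties file:
// value in the .properties file: c\u0027\u00e9tait sympa
// value written to the CSV file: c'était sympa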

With this you have a nice and quite safe developer and customer friendly environment.

 


What did we learn?


Time for the wrap up guys. In this blog we went over 3 main development parts:

  1. Setting up the Node.js application

  2. Building an i18n to CSV converter for all i18n files inside your project

  3. Building a CSV to i18n converter


 

Question of today, what did we learn?

We learned how to set up and develop Node.js applications, and how to convert i18n files into a single CSV file and back. During the setup we followed the SAP approach for Node.js applications and saw how to make them ready for deployment to Cloud Foundry. We skipped some of those steps because we did not need them (if you want to deploy, have a look at the SAP Help documentation link provided in this blog).

In my opinion it was a nice way to play with all these packages/Modules and it will help me to save some time in the future when translating applications.

 

What could be next or a new feature?

  • Add some unzip functionality to the Node.js app.

  • Add a GUI or web app interface where you can upload the zip, so no command line tools are needed anymore.

  • Connect it to a continuous integration.



 

What do you think could add some more value? I'm curious about it! So leave a comment if you have cool ideas!

Thanks for passing by!

Merry Christmas! Ho Ho Ho!

Kind regards,

Dries