There are a couple of topics that are an ongoing source of confusion even for experienced IT professionals and developers.
UNICODE handling is one example (see here for more info), floating-point calculations are another (this is a good primer), and then there is everything to do with time.

Dealing with dates, time, time zones and the like is confusing and counter-intuitive in itself; adding computers to the mix doesn't make this any better.

This post shines some light on what goes into date/time handling in SAP HANA and tries to answer some questions that I came across recently.

The big picture


The SAP HANA DBMS obviously doesn't come with its own timekeeper (though I think it would make a nice gift for HANA DBAs) but uses date/time information provided by the Linux operating system whenever it needs to know the current time.

Linux gets its time from different possible sources, starting with the hardware clock (aka BIOS clock) and, if configured, the ntpd(aemon).
The man-pages on this (hwclock, date, tz, adjtimex, /proc/timer_list) are an interesting read.

A super-short version of the story: when the computer is started, the current time is provided by the hardware clock, which is kept running by a battery even while the machine is powered off. This time does not have a time zone assigned to it - it is open to interpretation.
At some point, Linux starts up, reads the hardware clock and sets the current system time and date considering the time zone that was declared for this computer when Linux was installed (yes, one can change this later on).
From here on the hardware clock is out of the picture. All system requests concerning current date/time are served from the Linux managed clock(s) - the System time - and likely continuously updated via the ntp-daemon.
It is possible to check the current hardware clock time (output in ISO 8601 format):
> sudo hwclock --utc
2018-03-26 17:57:36.608111+1100

The +1100 is the time zone offset that the system time uses/assumes. The hardware clock does not 'know' what timezone it is in, so specifying either --utc or --localtime and checking which results in the correct local time is crucial here. Once again, the man-page for hwclock has the whole story.

Comparing this to the output of date:
> date
Mon Mar 26 11:05:37 AEDT 2018

we learn the system time is set to use the Australian Eastern time zone.

Another way to check which timezone has been set on system level is to check where the file-system link /etc/localtime points to:
> ls -l /etc/localtime
lrwxrwxrwx 1 root root 39 Mar 12 18:48 /etc/localtime -> /usr/share/zoneinfo/Australia/Melbourne



The /usr/share/zoneinfo folder is the system timezone database directory.

All time zones, including their Daylight Saving Time (DST) rules, that the Linux system knows about can be found here. Since both time zone definitions and DST rules change regularly (see Time Zone News), these files are commonly updated during Linux updates.
We will come back to the topic of Daylight Saving Time later on - it is yet another can of worms.

The time of the day


Here we are, with Linux booted up and ready to go. We can log in now and ask for the current time and date (as seen above):
> date
Mon Mar 26 18:17:07 AEDT 2018

But what if my user should use a different time zone?

To do that, the Linux shell provides a setting in the form of an environment variable named TZ.
Setting the variable to one of the time zones in the timezone database directory allows programs to return the date in the local time of choice:
> export TZ=UTC+0
> echo $TZ
UTC+0
> date
Mon Mar 26 07:18:00 UTC 2018

Any program started from a shell with this setting will now work with the time zone set to UTC.
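This also holds for Java programs: the JVM derives its default time zone from the environment it was started in (unless it is overridden, more on that later). A minimal sketch to verify this - the class name is just an example:

import java.util.TimeZone;

public class ShowDefaultTimezone {
    public static void main(String[] args) {
        // Prints the JVM's default time zone ID. On Linux this is usually derived
        // from the TZ environment variable (or /etc/localtime), unless the
        // system property user.timezone overrides it.
        System.out.println(TimeZone.getDefault().getID());
    }
}

Started from a shell where TZ has been exported as above, the program reports the UTC-based timezone; started without it, it reports the system default (Australia/Melbourne in the example above).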

Becoming a profiler


If this setting should be made permanent (just for the current user) the environment variable has to be set in the shell profile of the user.
The shell profile is an auto-start script that contains the commands the shell should execute every time it gets started.
For the common Bourne shell this script file is called .profile.
As the name starts with a dot, it is a so-called hidden file, but using ls -a in the user's home directory lists it nevertheless:
> cd ~ # change into the user's home directory
> ls -la # list all files, including the 'hidden' files

drwx------ 1 root root 730 Mar 26 05:53 .
drwxr-xr-x 1 root root 200 Jan 31 00:51 ..
[...]
drwx------ 1 root root 16 Jun 13 2017 .hdb
-rw------- 1 root root 146 Mar 26 07:15 .lesshst
drwx------ 1 root root 10 Jun 6 2017 .local
drwx------ 1 root root 34 Sep 11 2017 .mozilla
-rw-r--r-- 1 root root 80 Jan 3 2017 .profile << this one
[...]

If the line export TZ=UTC+0 is added to the .profile-file then the new time zone will be set for the user every time the shell is used.

Now, for Linux-based SAP systems things are a tad more ... flexible. And by "flexible", of course, I mean complicated.
Putting all settings into just a single file would make managing system- or organisation-wide settings rather tedious.
Some settings should be user-specific, while others should apply to all users of a system, and so on.
For that, there is the option to call other .profile scripts.

Cutting to the chase, for the SAP HANA operating system user <sid>adm we're looking at the following profile scripts:
 .profile  - standard autostart-file for Bourne shell 
.login - C-shell version

.bashrc - init script for the BOURNE and KORN Shell
.cshrc - standard profile for the C-shell (same content, but slightly different syntax)

.sapsrc.sh - looks for .sapenv scripts and executes them
.sapsrc.csh - C-shell version

.sapenv_<hostname>.sh - SAP system environment settings - specific to the local host
.sapenv_<hostname>.csh - C-shell version

.sapenv.sh - SAP system environment setting
.sapenv.csh - C-shell version

.customer.sh - User specific settings
.customer.csh - C-shell version

Most of these files are created by the installation program for SAP HANA.
When we log in as <sid>adm the files get executed in the following order:
(1) .profile -> (2) .bashrc -> (3) .sapsrc.sh -> (4) .sapenv_<hostname>.sh (or .sapenv.sh) -> (5) HDBSettings.sh (SAP HANA systems only) -> (6) .customer.sh

Looking at this impressive entanglement, you might not be sure where to put the time zone setting.

The answer, fortunately, is easy, as nearly all of these script files come with a warning in a header comment telling you not to edit them manually.

This leaves .customer.sh as the target file for all customizations, including time zone settings and additional paths or environment settings for Smart Data Access (SDA) connections.
Fortunately, somebody put a comment into .profile to make clear that all custom settings will be found in .customer.sh. That way, even the SAP-unaware Linux admin has a fair chance to figure out in which file the settings are made.


SAP HANA zones in time


At this point, we know how to configure the system's hardware clock, the Linux system clock and the timezone that SAP HANA will use wherever the "local" timezone is needed.

This setting defines the home timezone for the SAP HANA installation. It is important to pay attention to the term "installation" here.
Since tenant databases have become the standard for SAP HANA, it is also possible to use "strong isolation" mode, which separates the SAP HANA processes of different tenant databases on the Linux level. Each tenant database can run in its own Linux user security context.
It would be easy to assume that, since the databases run under their own Linux user accounts, they could each have their own environment with separate timezone settings.
This is, however, not the case, since all tenant databases of an SAP HANA installation share the same nameserver (hdbnameserver) process - and it is the environment of this nameserver process that all tenant databases inherit.

Therefore, all databases of an SAP HANA installation share the same timezone (as well as the same OS setup for external libraries, like 3rd party DBMS clients, etc.).
If one needs SAP HANA databases with different time zones on the same system, a separate installation is necessary (status as of SAP HANA 2 SPS 02).

What's it all used for - SAP system timezone handling


All of the mechanisms described so far are in place to provide Linux programs access to a current time that makes sense to the user of the system.

For shared programs like SAP HANA, it is debatable what the current local time should be, since users can access the system from around the globe, from completely different time zones.
Whatever time zone you decide upon (often it is simply the local timezone of the data center, the timezone where most users are, or plain UTC), you should probably pick a standard timezone without DST, since the switches between summer and winter time produce timestamp gaps and duplicates, which can be especially confusing when trying to read log files.
Also, in an SAP system landscape, say with SAP NetWeaver systems based on SAP HANA, the NetWeaver application servers and SAP HANA have to have the same time, including the timezone.

Up to here, all we've looked at was telling the current local time. In SAP HANA one can use functions like now(), current_time(), current_date(), etc. to get the current SAP HANA system time. In many cases, there is also a UTC counterpart for these functions, e.g. current_utctime(), if your application should rather work with UTC timestamps.

SAP HANA also provides conversion functions, utctolocal()/localtoutc(), to convert between local time and UTC date/timestamp information.

Note that the documentation strongly discourages storing data in the local time zone. SAP HANA date/timestamp data types don't store the timezone together with the data, so the application reading the data would need to keep track of that.
Instead, data should be stored as UTC and, if local date information is required, the conversion functions should be used.
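As an illustration, here is a minimal JDBC sketch of that pattern (host, port and credentials are placeholders, ngdbc.jar is assumed to be on the classpath, and 'AUSNSW' is just one of the timezone names from the TIMEZONES view shown further down):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class UtcStorageSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:sap://skullbox:31215", "MYUSER", "MYPASSWORD");
             Statement stmt = con.createStatement();
             // keep the stored/compared value in UTC, convert only for display
             ResultSet rs = stmt.executeQuery(
                     "SELECT current_utctimestamp AS utc_now, "
                   + "       utctolocal(current_utctimestamp, 'AUSNSW') AS local_now "
                   + "FROM dummy")) {
            while (rs.next()) {
                // getString() avoids any client-side timestamp interpretation -
                // more on getTimestamp() in the JDBC section below
                System.out.println("UTC   : " + rs.getString("UTC_NOW"));
                System.out.println("local : " + rs.getString("LOCAL_NOW"));
            }
        }
    }
}

The same two expressions can of course be run directly in hdbsql or any SQL console.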

Let's stop at this point and take a look at what goes into this conversion.
I mentioned above that time zones and the rules for Daylight Saving Time (DST) are in constant change.
How can SAP HANA know what the currently correct mapping is?

In "You got the time?" I explained that SAP HANA can use built-in defaults or a special set of tables that contain the current mapping information.

These tables and more information can be found in SAP note 1791342 – Time Zone Support in HANA.

To make sure the timezone data has been loaded without problems, the view M_TIMEZONE_ALERTS can be used.
If SAP HANA is started without the timezone tables loaded, this view returns:
HOST PORT STATUS TIMEZONE_NAME MISMATCH_BEGIN MISMATCH_END
skullbox 31203 TABLES NOT FOUND ? ? ?

DETAILS
The following timezone tables could not be found or are not readable: SYSTEM.TTZD, SYSTEM.TTZDF, SYSTEM.TTZDV, SYSTEM.TTZZ

HOST PORT STATUS TIMEZONE_NAME MISMATCH_BEGIN MISMATCH_END
skullbox 31204 TABLES NOT FOUND ? ? ?

DETAILS
The following timezone tables could not be found or are not readable: SYSTEM.TTZD, SYSTEM.TTZDF, SYSTEM.TTZDV, SYSTEM.TTZZ

 

By default, SAP HANA looks for these tables in the SYSTEM schema and expects the entries to be relevant for client 001.
If you want to use a different set of tables, you can adapt the following configuration parameters (file indexserver.ini, section global):

Parameter                          | Default
timezone_dataset                   | sap
timezone_default_data_client_name  | 001
timezone_default_data_schema_name  | SYSTEM

Important to note here is that there is only one set of conversions possible per SAP HANA database, even if you have multiple SAP clients in the database.
Likewise important: these tables need to be loaded into every tenant database and the SystemDB.
The tenant DBs cannot use the tables from the SystemDB or from another tenant DB.

Yet another caveat with the SAP HANA timezone conversion functions is that they don't take a reference date, but always work based on the currently available conversion rules. If your application requires historical timezone conversions, then this needs to be implemented separately.
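One way to handle such historical conversions separately is to do them in the application layer, for example with java.time, which applies the timezone/DST rules that were valid at the given date. A small sketch (timezone and timestamp are just example values):

import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class HistoricalConversionSketch {
    public static void main(String[] args) {
        // a UTC timestamp from the past, e.g. read from a column that stores UTC
        Instant utc = Instant.parse("1995-07-01T12:00:00Z");

        // java.time uses the IANA tzdata shipped with the JVM and applies the
        // rules that were in effect on that date, not just the current ones
        ZonedDateTime local = utc.atZone(ZoneId.of("Australia/Melbourne"));
        System.out.println(local); // 1995-07-01T22:00+10:00[Australia/Melbourne]
    }
}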

In order to check what time zones are currently known to SAP HANA the system view TIMEZONES is available:
select * from timezones;

TIMEZONE_NAME TIMEZONE_DATASET
[...]
AUSNSW sap
ALAW sap
UTC sap
AUSSA sap
PST_NA sap
BRZLWE sap
AUSNT sap
UTC-7 sap
Africa/Abidjan platform
Africa/Accra platform
Africa/Addis_Ababa platform
Africa/Asmera platform
[...]

 

Time zone conversion and Daylight Saving Time (DST)


At first glance, this whole topic of timezone conversions can look very simple. After all, it is simply adding or subtracting some hours between two time zones, isn't it?

To make the life of time zone converters more interesting, Daylight Saving Time has been introduced in some countries, in some cases a long time ago. When referring to "local time" this usually includes any DST rules, while the term "standard time" indicates that no DST rules are applied.
The major complication with DST is not so much that the number of hours to be added or subtracted changes based on the (local) day of the year, but that the mapping between timestamps with and without DST is not contiguous.

Often we think of clock-timestamp values as continuously increasing values with no gaps. It's always one second, and another and another second.

This leads to implicit assumptions like:

  • every possible timestamp between 00:00:00 and 23:59:59 will occur.

  • every timestamp will occur only once.


Both of those assumptions are false on the days when the time changes between summer and winter time.
When changing from summer to winter time, the clocks are set back, e.g. from 03:00:00 to 02:00:00.
Every second between 02:00:00 and 02:59:59 will happen twice that night.

On the change from winter to summer time, clocks are set forward, jumping from 01:59:59 directly to 03:00:00 and skipping all timestamps between 02:00:00 and 02:59:59.

Here we see why it is so important to use standard time for storing timestamps (also in filesystems, trace files, etc.): DST makes timestamps ambiguous.
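To see both effects in code, here is a small sketch using java.time, with Europe/Berlin as an example zone (the 2017 dates match the ones used in the JDBC example further down):

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class DstGapAndOverlapSketch {
    public static void main(String[] args) {
        ZoneId berlin = ZoneId.of("Europe/Berlin");

        // 2017-03-26 02:30 does not exist in this zone: clocks jump from 02:00 to 03:00
        LocalDateTime gap = LocalDateTime.of(2017, 3, 26, 2, 30);
        System.out.println(ZonedDateTime.of(gap, berlin));
        // -> 2017-03-26T03:30+02:00[Europe/Berlin] (pushed forward by the length of the gap)

        // 2017-10-29 02:30 exists twice: once with +02:00 (CEST), once with +01:00 (CET)
        LocalDateTime overlap = LocalDateTime.of(2017, 10, 29, 2, 30);
        ZonedDateTime earlier = ZonedDateTime.of(overlap, berlin); // earlier offset by default
        System.out.println(earlier);                               // ...T02:30+02:00[Europe/Berlin]
        System.out.println(earlier.withLaterOffsetAtOverlap());    // ...T02:30+01:00[Europe/Berlin]
    }
}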

There are more things wrong with the simplified model of timestamps, e.g. the need for leap seconds to adjust for drifts of the defined day length (one full turn of our planet).
"Counting" based timekeeping systems like UNIX epoch time consider leap seconds, leap years, centuries etc. Covering these is beyond this post but reading up on it is highly recommended.

More problems on a stack


Knowing all this about timezones and conversions between time zones with and without DST, we should be able to use timestamp data correctly in our programs with SAP HANA.
And yet, questions like "Problem with "summer" time (DST)" come up and even I wasn't able to immediately pinpoint the cause of the problem.

At face value, the OP found what looked like a bug in SAP HANA's JDBC driver.
When querying the database via JDBC certain timestamps did not seem to get returned correctly, even though the SAP HANA system timezone was set to UTC:
java -jar ngdbc.jar -u ... 
-c "select to_seconddate('2017-03-26 01:00:00') as \"2017-03-26 01:00:00\";
, to_seconddate('2017-03-26 02:00:00') as \"2017-03-26 02:00:00\"
, to_seconddate('2017-10-29 01:00:00') as \"2017-10-29 01:00:00\"
from dummy"

Connected.

| 2017-03-26 01:00:00 | 2017-03-26 02:00:00 | 2017-10-29 01:00:00 |
-------------------------------------------------------------------
| 2017-03-26 00:00:00 | 2017-03-26 01:00:00 | 2017-10-29 02:00:00 |
(note how all three values come back shifted by one hour)

 

When running the same query with hdbsql and thus avoiding JDBC the result is very different:
hdbsql -U ...  -A 
"select to_seconddate('2017-03-26 01:00:00') as \"2017-03-26 01:00:00"
, to_seconddate('2017-03-26 02:00:00') as \"2017-03-26 02:00:00\"
, to_seconddate('2017-10-29 01:00:00') as \"2017-10-29 01:00:00\"
from dummy"

| 2017-03-26 01:00:00 | 2017-03-26 02:00:00 | 2017-10-29 01:00:00 |
| ------------------- | ------------------- | ------------------- |
| 2017-03-26 01:00:00 | 2017-03-26 02:00:00 | 2017-10-29 01:00:00 |

 

What is going on here?


The answer lies with Java, more precisely with the way JDBC handles timestamp data (see also here).
When fetching timestamp data from a database that does not store the timezone with the timestamp data, the JDBC method getTimestamp(int columnIndex, Calendar cal) gets used.

The method returns a java.sql.Timestamp object which in turn is based on java.util.Date. Java handles dates/timestamps as moments (instants) in time that can be expressed in calendar fields (year, month, day of month, etc.). This requires defining the calendar that should be used here.
In our example, without specifying any specific calendar, the JVM uses its default timezone.
If the JVM property user.timezone is not set, then the JVM will try to find the current operating system user's timezone. If the JVM runs on Linux, that would be the value of TZ again; on MS Windows it would be the current timezone setting of the Windows user that runs the JVM.

What JDBC does basically means: if the Java program does not explicitly set the Calendar to be used, then the timezone setting of the client OS is used. A prominent example of this is SAP HANA Studio.

To avoid this implicit, nearly hidden conversion, one can set the JVM parameter -Duser.timezone=UTC in the hdbstudio.ini file.
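Alternatively, the Calendar can be set explicitly in code. A minimal sketch of both read variants (placeholder connection data, ngdbc.jar assumed to be on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.TimeZone;

public class JdbcTimestampSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:sap://skullbox:31215", "MYUSER", "MYPASSWORD");
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT to_seconddate('2017-03-26 02:00:00') FROM dummy")) {
            rs.next();

            // without a Calendar: the driver interprets the timezone-less value in the
            // JVM default time zone - around that zone's DST switch the value gets shifted
            System.out.println("default zone : " + rs.getTimestamp(1));

            // with an explicit UTC Calendar: the value is interpreted as UTC ...
            Calendar utcCal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
            Timestamp ts = rs.getTimestamp(1, utcCal);

            // ... and, formatted back in UTC, it matches what the database returned
            SimpleDateFormat utcFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            utcFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
            System.out.println("read as UTC  : " + utcFormat.format(ts));
        }
    }
}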

For the interested: Java comes with its own complete implementation of a timezone database, and the conversion data can be managed - just as in SAP HANA - independently from the operating system (see here, here, here, here, here, here).


Time to end


And here we are now. Two full stacks of timekeeping, time zone management and conversions with and without DST.

What do I take home from this?

To me, the JDBC behaviour was the most surprising effect, so I will probably try to make sure that the timezone is set to UTC whenever I want to avoid automatic conversion (e.g. for server-side data access like SDI adapters or development/DBA tools like SAP HANA Studio).
For the rest, I think a good rule of thumb is to keep all timezone and time settings the same, especially across SAP systems.

Finally, for applications and analytics scenarios that rely heavily on timezone conversion, it is really important to fully understand all the conversion mechanisms involved.

There you go, now you know!

Cheers,
Lars



This blog post was first published on LBREDDEMANN.ORG
