.NET Core is an open-source framework that can run on both Windows and Linux. In the previous blogs of this series, you saw how to connect to MongoDB from a Node.js application and the BYOL concepts. In this tutorial, you will see the steps to connect to the Postgres service from a .NET Core application. The buildpack used is referenced from the set of available community buildpacks. The important components of this solution are briefly described here:

  • Manifest yml file: The manifest file is used to push an application to Cloud Foundry, specifying the application name, the buildpack to use, and so on. The manifest file for a .NET Core application consuming the Postgres service looks like the one shown below:


---
applications:
- name: netPostgres
  random-route: true
  memory: 512M
  buildpack: https://github.com/cloudfoundry/dotnet-core-buildpack.git
  env:
    ASPNETCORE_ENVIRONMENT: development
  services:
  - devPostgres

The devPostgres service mentioned in the manifest file is the service instance of the Postgres service that is created on SAP Cloud Platform.

  • Target framework: In this blog, the target framework referred to is .NET Core 1.1. The .NET Framework 4.5 could also be used with the mono community buildpack; however, for the buildpack mentioned above, the target framework that can be used is .NET Core 1.1.

  • Visual Studio: The IDE used to develop the solution is Microsoft Visual Studio 2017. Download and install the relevant version/SKU based on your requirements.

  • Dotnet CLI: The .NET Core command-line interface (dotnet CLI) is used for building .NET Core apps and libraries. We will push the published set of files to Cloud Foundry using the cf push command of the CF CLI tool.

  • Entity Framework: Entity Framework is the ORM (Object/Relational Mapping) framework that lets you map relational data to domain-specific objects. The version used in this solution is Entity Framework Core 1.1.2, the most recent stable version at the time of writing this blog. It can be added to the Visual Studio solution using the NuGet package manager – Link.

  • Data Provider: The data provider allows programs written in C# to access the Postgres service on SAP Cloud Platform. The data provider used in this solution is Npgsql, an open-source ADO.NET data provider for Postgres. It can be downloaded through the NuGet package manager – Link. The version used was 1.1.2, the latest stable version at the time of writing this blog.

  • Connection String: The connection string is required by the application to connect to the Postgres service on SAP Cloud Platform. You can get its values from the VCAP_SERVICES service binding information of Cloud Foundry in SAP Cloud Platform. The connection string used by Npgsql will look like the one below:



var ConnectionString = "User ID=<User ID>;Password=<Password>;Host=<Host IP>; Port=<Post number>;Database=<Database ID>;Pooling=true;";


In the above connection string, when Pooling is set to true, the application reuses an already established database connection for its transactions; without a connection pool, the application has to reach out to the database and establish a new connection each time. Once you create the Postgres service instance on SAP Cloud Platform and bind it to the application, you can view these credential details in the dashboard of the service instance; they can also be read at runtime, as sketched below.
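If you prefer not to hard-code these values, the same credentials can be read at runtime from the VCAP_SERVICES environment variable that Cloud Foundry injects into the bound application. The snippet below is only a sketch: the service label "postgresql" and the credential field names (hostname, port, username, password, dbname) are assumptions that depend on the service broker, so verify them against your own service binding. It uses Newtonsoft.Json (Json.NET), which comes in transitively with the MVC package.

using System;
using Newtonsoft.Json.Linq;

public static class VcapCredentials
{
    // Sketch only: builds an Npgsql connection string from VCAP_SERVICES.
    // The "postgresql" label and the credential field names are assumptions;
    // verify them against the credentials shown for your bound service instance.
    public static string GetPostgresConnectionString()
    {
        var vcap = Environment.GetEnvironmentVariable("VCAP_SERVICES");
        if (string.IsNullOrEmpty(vcap))
            return null; // not running on Cloud Foundry

        var credentials = JObject.Parse(vcap)["postgresql"][0]["credentials"];

        return "User ID=" + credentials["username"] +
               ";Password=" + credentials["password"] +
               ";Host=" + credentials["hostname"] +
               ";Port=" + credentials["port"] +
               ";Database=" + credentials["dbname"] +
               ";Pooling=true;";
    }
}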


Solution Overview


The solution involves developing a Database context and creating an entity that contains the fields equivalent to the columns of the Postgres relation.

Step 1: Let’s start by creating a user detail (Id, first name and last name) in Postgres and retrieving it using APIs. The entity class will look like the one below:
public class Users
{
    [Key]
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

Step 2: The next step is to create an interface for the Data Access Provider. This interface holds the list of methods that are to be implemented in the Data Access Provider class; the same interface will be used for registration in the Startup.cs class.
public interface IDataAccessProvider
{
    void AddUserRecord(Users user);
    void UpdateUserRecord(int Id, Users user);
    Users GetUser(int Id);
    List<Users> GetUsers();
}

Step 3: Create a DbContext that contains the information about the entity-to-table mapping along with the properties of the columns in the table. The DbContext class can also be used to implement Entity Framework shadow properties, which are beyond the scope of this blog.
public class PostgreSqlContext : DbContext
{
    public PostgreSqlContext(DbContextOptions<PostgreSqlContext> options) : base(options)
    {
    }

    public DbSet<Users> Users { get; set; }

    protected override void OnModelCreating(ModelBuilder builder)
    {
        builder.Entity("NetCore1Postgre.Models.Users", b =>
        {
            b.Property<int>("Id");
            b.Property<string>("FirstName");
            b.Property<string>("LastName");
            b.HasKey("Id");
            b.ToTable("Users");
        });

        builder.Entity<Users>().ToTable("Users");
        base.OnModelCreating(builder);
    }

    public override int SaveChanges()
    {
        ChangeTracker.DetectChanges();
        return base.SaveChanges();
    }
}

Step 4: Now we are ready to implement the Data Access Provider interface. As an example, I have implemented just a couple of the methods.
public class PostgreSqlProvider : IDataAccessProvider
{
    private readonly PostgreSqlContext _context;

    public PostgreSqlProvider(PostgreSqlContext context)
    {
        _context = context;
    }

    public void AddUserRecord(Users user)
    {
        if (user != null)
        {
            _context.Users.Add(user);
            _context.SaveChanges();
        }
    }

    public Users GetUser(int Id)
    {
        throw new NotImplementedException();
    }

    public List<Users> GetUsers()
    {
        return _context.Users.ToList();
    }

    public void UpdateUserRecord(int Id, Users user)
    {
        throw new NotImplementedException();
    }
}

Note that I have implemented only the methods that add a user and retrieve all records; a possible completion of the remaining two methods is sketched below.
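For completeness, here is one way the two remaining methods could be implemented. This is a sketch rather than part of the original solution; it assumes a using System.Linq directive in the same provider class.

public Users GetUser(int Id)
{
    // Returns null when no user with the given key exists
    return _context.Users.FirstOrDefault(u => u.Id == Id);
}

public void UpdateUserRecord(int Id, Users user)
{
    // Look up the existing row and copy over the editable fields
    var existing = _context.Users.FirstOrDefault(u => u.Id == Id);
    if (existing == null || user == null)
        return;

    existing.FirstName = user.FirstName;
    existing.LastName = user.LastName;
    _context.SaveChanges();
}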

Step 5: The penultimate step is to create a controller that offers the API service to retrieve and store records.
    [Route("api/[controller]")]
public class UserAPIController : Controller
{
private readonly IDataAccessProvider _dataAccessProvider;

public UserAPIController(IDataAccessProvider dataAccessProvider)
{
_dataAccessProvider = dataAccessProvider;
}
[HttpGet]
public IEnumerable<Users> Get()
{
return _dataAccessProvider.GetUsers();
}

[HttpPost]
public int Post(Users user)
{
_dataAccessProvider.AddUserRecord(user);
return 0;
}

It is assumed that the relation ‘Users’ already exists. In case you want to create the table instead of migrating it from your older application/system, you can initialize the table using an API as shown below, by adding it to the same controller as above.
[HttpGet("create")]
public int Createtable()
{
var ConnectionString = "User ID=myuser;Password=myPassword;Host=PostgreHost; Port=Port;Database=PostgreDB;Pooling=true;";
var m_conn = new NpgsqlConnection(ConnectionString);
var Dropcommand = "DROP TABLE IF EXISTS \"Users\"";
var Createcommand = "CREATE TABLE IF NOT EXISTS \"Users\"(\"Id\" INT PRIMARY KEY, \"FirstName\" TEXT, \"LastName\" TEXT)";
var m_createtbl_cmd = new NpgsqlCommand(Createcommand, m_conn);
var m_droptbl_cmd = new NpgsqlCommand(Dropcommand, m_conn);
m_conn.Open();
m_droptbl_cmd.ExecuteNonQuery();
m_conn.Close();
m_conn.Open();
m_createtbl_cmd.ExecuteNonQuery();
m_conn.Close();
return 1;
}

Remember to fill in the respective service credentials in the connection string.
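Optionally, once GetUser is implemented in the data access provider (see the sketch under Step 4), a single-record endpoint could be added to the same controller. This is a sketch and not part of the original solution:

[HttpGet("{id}")]
public IActionResult Get(int id)
{
    // Returns 404 when the user does not exist, otherwise the user as JSON
    var user = _dataAccessProvider.GetUser(id);
    if (user == null)
        return NotFound();
    return Ok(user);
}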

Step 6: The last step is to register the data access provider in the ConfigureServices method of the Startup.cs class.
public void ConfigureServices(IServiceCollection services)
{
    var ConnectionString = "User ID=myuser;Password=myPassword;Host=PostgreHost;Port=Port;Database=PostgreDB;Pooling=true;";
    services.AddDbContext<PostgreSqlContext>(options => options.UseNpgsql(connectionString: ConnectionString));
    services.AddScoped<IDataAccessProvider, Models.PostgreSqlProvider>();
    services.AddMvc();
}
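For the attribute-routed controller above to be reachable, the MVC middleware also has to be wired up in the Configure method of Startup.cs. The Visual Studio template normally generates this for you; a minimal sketch of what it needs to contain, under that assumption:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddDebug();

    // Serves static content from wwwroot and enables attribute routing for the API controller
    app.UseStaticFiles();
    app.UseMvc();
}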

Since we are deploying the application to Cloud Foundry, ensure that the respective stack is mentioned in the .csproj file.
<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
    <RuntimeIdentifier>ubuntu.14.04-x64</RuntimeIdentifier>
    <PackageTargetFallback>$(PackageTargetFallback);portable-net45+win8+wp8+wpa81;</PackageTargetFallback>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.0.0" />
    <PackageReference Include="Microsoft.AspNetCore" Version="1.1.2" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.3" />
    <PackageReference Include="Microsoft.AspNetCore.StaticFiles" Version="1.1.2" />
    <PackageReference Include="Microsoft.EntityFrameworkCore" Version="1.1.2" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.1.2" />
    <PackageReference Include="Microsoft.VisualStudio.Web.BrowserLink" Version="1.1.2" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="1.1.0" />
  </ItemGroup>

  <ItemGroup>
    <DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="1.0.1" />
  </ItemGroup>

</Project>

In my example, I have specified the runtime identifier as ubuntu.14.04-x64 since the Cloud Foundry environment in SAP Cloud Platform uses the cflinuxfs2 stack, which is derived from Ubuntu 14.04 (Trusty Tahr).

Solution Build and Deployment


Now the solution is ready to be built using the dotnet CLI and deployed to SAP Cloud Platform. Ensure that you have the latest dotnet CLI tool by running “dotnet --version” in the command prompt, Git Bash, or any other terminal of your choice. This command displays the version of the dotnet CLI in use.

Step 1: The first step is to ensure that all project/solution dependencies can be restored and are available through NuGet. To do so, navigate to the root folder of the solution and run the command below:
dotnet restore --no-cache

This will ensure that the packages are not cached. A successful restoration will look like the one in the snapshot below:



Step 2: The next step, before we can push the application, is to publish it. This is done using the dotnet publish command.
dotnet publish --output <<Folder to output>> --configuration Release --framework netcoreapp1.1 --runtime ubuntu.14.04-x64

Specify the runtime and framework required for the SAP Cloud Platform environment.



Note that this command will also surface any warnings or errors in the solution, as depicted in the execution above.

Step 3: Use the Cloud Foundry CF CLI tool to push the application to SAP Cloud Platform. The command is “cf push -f manifest.yml -p /c/publish”. Be sure to specify the location of the manifest file and the folder to which the solution was published. For details on how cf push is executed, refer to step 3 in the blog.

Upon a successful deployment and application start, you will find the application URL in the CF CLI output or on the application overview page of the SAP Cloud Platform cockpit.



Check the buildpack and stack mentioned on the overview page to confirm that the right buildpack was used.

Testing the Solution


I have used the Postman tool as a REST client to test my API services.

For those who want to create the relation in Postgres, execute the GET method API call to create the Users relation.



The response is also displayed in the snapshot above.

Next, we will add records to the created relation as shown in the Postman execution below; as per the code, the response will be ‘0’.



Note that the body of the POST request is sent as form URL encoded; a code-based alternative is sketched below.
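If you prefer to test from code rather than Postman, the same form-URL-encoded POST can be sent with HttpClient. This is only a sketch: the application route is a placeholder, and it assumes an async method with using directives for System, System.Collections.Generic, and System.Net.Http.

// Sketch: replace <application-route> with the URL assigned by cf push
var form = new FormUrlEncodedContent(new Dictionary<string, string>
{
    { "Id", "1" },
    { "FirstName", "John" },
    { "LastName", "Doe" }
});

using (var client = new HttpClient())
{
    var response = await client.PostAsync("https://<application-route>/api/UserAPI", form);
    Console.WriteLine((int)response.StatusCode); // 200, with body "0" as per the controller code
}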

Now to see the record created, execute the GET request as shown below.



For the sake of simplicity, this blog showed how to create a .NET Core application that performs CRUD operations on the Postgres service on SAP Cloud Platform. The same can be extended to a web form as well.

Hope you enjoyed the blog!