ASP.NET error message: Unable to connect to web server ‘IIS Express’

When you get the error “Unable to connect to web server ‘IIS Express’”, you are probably trying to start the website on a port that falls within an excluded port range.

Unable to connect to web server ‘IIS Express’

List the excluded port ranges with the command below:

netsh interface ipv4 show excludedportrange protocol=tcp

Sample output may look something like this:
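(The ranges below are purely illustrative; the actual output on your machine will differ.)

Protocol tcp Port Exclusion Ranges

Start Port    End Port
----------    --------
      5357        5357
     49670       49769
     50000       50059     *

* - Administered port exclusions.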

Now, to change the port for your website, click the dropdown next to the IIS Express label on the toolbar.

Then select “[YOUR PROJECT] Debug Properties” and change the port on the properties page (highlighted below).

Change debug settings for project
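Changing the port on that page typically edits Properties/launchSettings.json under the hood; a rough sketch of the relevant section (the port numbers here are only examples, pick one outside the excluded ranges):

{
  "iisSettings": {
    "windowsAuthentication": false,
    "anonymousAuthentication": true,
    "iisExpress": {
      "applicationUrl": "http://localhost:5055",
      "sslPort": 44300
    }
  }
}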

Azure and Let’s Encrypt certificate

In this post I will show you how to

  1. Create a sample .NET Core 3.0 website.
  2. Deploy the website to an Azure App Service.
  3. Assign a custom domain.
  4. Enable a Let’s Encrypt certificate.

Create a sample .NET Core 3.0 website

Start Visual Studio 2019 and create a new project. Choose “ASP.NET Core Web Application” as your project template.

Create new ASP.NET Core web application

Press Create. Then choose “Web Application”.

Create an empty web application

Press Create and Visual Studio opens with an overview screen of your new ASP.NET Core application. Press F5 to see it in action.

Now it is time to publish (or deploy) your (skeleton) application to Azure.

Deploy website to Azure

Deploying your website to Azure is simple. Right-click the solution and select Publish. Make sure “App Service” and “Create New” are selected and press “Create Profile”.

Publish a website to Azure

Now we have to fill in the details for the App Service. Please fill out the screen as you see fit.

Create a new App Service

Press Create and wait for the App Service to be created (this can take a while). When the window closes, the App Service is ready. Now it is time to publish the website to this App Service: press Publish.

After a few moments the website is published and Visual Studio will open a new browser window showing the website running in the Azure environment.

As you can see the site has a domain name of “frubelen.azurewebsites.net”. Now it is time to assign our custom domain name to this site. After doing this we can send our browser to azure.frubelen.nl (for example) instead of the azure domain.

Assign a custom domain

Now we are going to assign our custom domain name. First of all you have to decide whether you want to assign the entire domain or only a subdomain to the Azure website.

If you want to assign the entire domain (not a subdomain) to this Azure site, you have to create an A record at your DNS provider. If you only want to use a subdomain, create an alias record, i.e. a CNAME record.

I have chosen to point only the subdomain azure.frubelen.nl to this site, so I create a CNAME record at my DNS provider:

Create a CNAME record if you want to redirect a subdomain to azure

After changing your DNS registration it can take some time before all DNS servers have picked up the change. To check whether the change has propagated you can use https://digwebinterface.com/
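If you prefer the command line and have dig available, a quick check of the CNAME record looks like this (the domain names are the ones from this example):

dig +short azure.frubelen.nl CNAME

# Once the change has propagated this should print something like:
# frubelen.azurewebsites.net.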

Now go back to the Azure portal, navigate to your App Service and select Custom Domains.

Press “Add custom domain” and fill in the subdomain you want to point to Azure; in my example that is “azure.frubelen.nl”. Press Validate to let Azure retrieve the DNS records for the domain.

If the DNS servers are updated Azure will let you add the domain; press “Add custom domain”.

As you can see the domain is added. The next step is to secure our domain with a Let’s Encrypt certificate.

Enable a Let’s Encrypt certificate

Now it is time to add the Let’s Encrypt certificate to our domain. First of all you need a storage account; the Let’s Encrypt extension stores its certificate information in this storage account. Go to your resource group (mine is FRUBEL_RG) and press Add. Type “Storage account” and select the item from the dropdown. Press Create. Type in a storage account name (I will use letsencryptfrubelen). Press “Review and create”, then press “Create”.

Next navigate to the storage account you just created and select the “Access keys” page.

Copy the connection string for the key; you will need it later on. Now, in the Azure portal, go back to your App Service and select Configuration in the left navigation.

Now we are going to add two app settings to the App Service. Both have the connection string of the storage account as their value. Add the keys AzureWebJobsStorage and AzureWebJobsDashboard.
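If you prefer scripting this over clicking through the portal, the Azure CLI can set both values in one go; a sketch (the app name and connection string are placeholders, FRUBEL_RG is the resource group from this post):

az webapp config appsettings set --resource-group FRUBEL_RG --name <your-app-service-name> --settings AzureWebJobsStorage="<storage connection string>" AzureWebJobsDashboard="<storage connection string>"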

Next, create a new App Registration: select “Azure Active Directory”.

Then select “App registrations” in the navigation on the left and press “New registration”.

Press Register. On the overview screen press “Add an application ID URI”. Then press “Add a scope”. Remove the default and fill in (in my case) “http://frubelen” and press “Save and Continue”.

Fill out the next screen as shown below and press Add Scope.

First, you need to install the Azure PowerShell module, which can be done through WebPI or the PowerShell Gallery.

Execute the PowerShell commands below. They will create an App Registration in your Azure Active Directory.

$uri = 'http://frubelen'

$password = ConvertTo-SecureString "yourpassword" -asplaintext -force

$app = New-AzADApplication -DisplayName {some display name} -HomePage $uri -IdentifierUris $uri -Password $password

New-AzADServicePrincipal -ApplicationId $app.ApplicationId

New-AzRoleAssignment -RoleDefinitionName Contributor -ServicePrincipalName $app.ApplicationId

The commands above create an App Registration in the Azure Active Directory.

Next go to the App Service and select Extensions in the left navigation. To add the Let’s Encrypt extension, press Add and search for “Let’s Encrypt”.

Select the Legal Terms item, press OK, and press OK again to add the extension. Then select the extension and press Browse. A new browser window will open in which you can configure Let’s Encrypt.

Go to your App Service and select Configuration on the left navigation.

Add the following app settings:

letsencrypt:Tenant            The name of the directory (tenant) you are working in
letsencrypt:SubscriptionId    The id of your subscription
letsencrypt:ResourceGroupName The name of the resource group
letsencrypt:ClientId          The ApplicationId of the App Registration (in PowerShell: $app.ApplicationId)
letsencrypt:ClientSecret      The password you chose for the App Registration (clear text)
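If you are unsure where to find these values, the Az PowerShell module used earlier can show most of them; a small sketch (note that letsencrypt:Tenant expects the directory name, e.g. yourtenant.onmicrosoft.com):

# Subscription and tenant of the current Az session
Get-AzContext | Format-List Name, Tenant, Subscription

# Lists your tenant(s) including the domain name(s)
Get-AzTenant

# letsencrypt:ClientId is the ApplicationId of the App Registration created above
$app.ApplicationId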

Now go to the extensions page of the App Service and select Let’s Encrypt

In the screen that opens press Browse.

Fill in connection strings for the storage account and press Next.

Press Next.

Select the domain you want a certificate for, fill in your email and press “Request and install certificate”.

Your certificate is requested and installed. Now open a browser and navigate to https://azure.frubelen.nl. There you go: your site is secured with an SSL certificate.
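Assuming curl is available, you can also verify the certificate issuer from the command line; the exact output depends on the TLS library curl was built with:

curl -vI https://azure.frubelen.nl 2>&1 | grep -i issuer

# Expected to contain something like: issuer: C=US; O=Let's Encrypt; ...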


ASP.NET Core 3.1 Cookie Authentication

In this short post I will show you how to implement cookie authentication with Visual Studio 2019 and ASP.NET Core 3.1.

Create a new project

In Visual Studio create a new ASP.NET Core Web Application project. Press Next.

Configure the project

Give it a name, “BasicCookies” for example. Press Create.

Select Empty web application

Create an empty web application and press Create. This will create the solution structure shown below.

In the generated Startup.cs replace the entire Configure method with the code below.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }
            app.UseRouting();
            app.UseAuthentication();
            app.UseAuthorization();
            app.UseEndpoints(endpoints =>
            {
                endpoints.MapDefaultControllerRoute();
            });
        }

Also replace the entire ConfigureServices function with the code below.

// CookieAuthenticationDefaults requires: using Microsoft.AspNetCore.Authentication.Cookies;
public void ConfigureServices(IServiceCollection services)
        {
            services.AddAuthentication(CookieAuthenticationDefaults.AuthenticationScheme).AddCookie(config =>
            {
                config.Cookie.Name = "Cookie.Basics";
                config.LoginPath = "/Home/Authenticate";
            });
            services.AddControllersWithViews();
        }

Now create a new folder “Controllers” and add a new class “HomeController” to it. Replace the entire generated contents with the code shown below, which demonstrates that a ClaimsPrincipal (the user) can hold multiple identities coming from different sources.

namespace Basics.Controllers
{
    using Microsoft.AspNetCore.Authentication;
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Mvc;
    using System;
    using System.Collections.Generic;
    using System.Security.Claims;
    using System.Threading.Tasks;

    public class HomeController : Controller
    {
        public IActionResult Index()
        {
            return View();
        }

        [Authorize]
        public IActionResult ProtectedPage()
        {
            return View();
        }

        public async Task<IActionResult> Authenticate()
        {
            var passPortClaims = new List<Claim>()
            {
                new Claim(ClaimTypes.Name, "Berend de Jong"),
                new Claim(ClaimTypes.Email, "email@passport.com"),
                new Claim(ClaimTypes.DateOfBirth, new DateTime(1971, 03, 09).ToString()),
            };

            var licenseClaims = new List<Claim>()
            {
                new Claim(ClaimTypes.Name, "Berend de Jong"),
                new Claim(ClaimTypes.Email, "email@driverlicense.com"),
                new Claim("DrivingLicense", "Truck")
            };

            // One principal can carry identities from different "sources"
            var passportIdentity = new ClaimsIdentity(passPortClaims, "MyPassport");
            var licenseIdentity = new ClaimsIdentity(licenseClaims, "MyLicense");

            var userPrincipal = new ClaimsPrincipal(new[] { passportIdentity, licenseIdentity });

            // Issues the authentication cookie for the configured cookie scheme
            await HttpContext.SignInAsync(userPrincipal);

            return RedirectToAction("Index");
        }
    }
}

As you can see we have three actions. Two actions return the view for the given action and the third, Authenticate, signs the user in and redirects to the Index view.

Next, create the Index view and the ProtectedPage view. As you can see the ProtectedPage action has an [Authorize] attribute, which means we have to be authenticated before we can access the view.

Create a new folder Views and within the Views folder create a folder Home. Right click on the new Home folder and select Add -> View.

Create a new razor view

Replace the entire file contents with the contents below. Do the same for the ProtectedPage view; replace its contents with something you will recognize as the protected page (a minimal example follows after the markup below).

<h1>Home Page</h1>
<table>
    <tr><td>IsAuthenticated</td><td>@User.Identity.IsAuthenticated</td></tr>
    <tr><td>Username</td><td>@User.Identity.Name</td></tr>
    <tr><td>AuthenticationType</td><td>@User.Identity.AuthenticationType</td></tr>
    <tr><td colspan="2"><strong>Claims</strong></td></tr>
    @foreach (var c in User.Claims)
    {
        <tr>
            <td>@c.Subject.AuthenticationType</td><td>@c.Type</td><td>@c.Value</td>
        </tr>
    }
</table>
<hr/>
<a href="/home/protectedpage">Login</a>
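The ProtectedPage view can be anything you will recognize; a minimal example:

<h1>Protected Page</h1>
<p>You can only see this page after the authentication cookie has been issued.</p>
<a href="/home/index">Back to the home page</a>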

Now you can start debugging the site. When you start the web application the home page will show up. Once you navigate to https://localhost:[port]/home/protectedpage you will be redirected to the Authenticate action (the configured LoginPath), which signs you in, stores a cookie in the browser and sends you back to the home page.

Your Homepage after pressing the Login link

Using Moq for testing

Install the Moq NuGet package. Below is a simple example that creates a mock for a database implementation (which is itself a fake :-)).

namespace ConsoleApp1
{
    using Moq;
    using System;
    using System.Collections.Generic;

    public interface IDAL
    {
        List<string> RetrieveBooks();
    }

    public class DAL : IDAL
    {
        public List<string> RetrieveBooks()
        {
            return new List<string> { "a", "b", "c" };
        }
    }

    public class BookManager
    {
        private IDAL _dal;

        public BookManager(IDAL dal)
        {
            _dal = dal;
        }

        public List<string> GetBooks()
        {
            return this._dal.RetrieveBooks();
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            /*
             * Bookmanager with database implementation
             */
            Console.WriteLine("Database implementation");
            BookManager bmDB = new BookManager(new DAL());
            bmDB.GetBooks().ForEach(i => Console.WriteLine(i));

            /*
             * Bookmanager with a mock database
             */
            Console.WriteLine("Database mock implementation");
            var dal_mock = new Mock<IDAL>();
            dal_mock.Setup(i => i.RetrieveBooks()).Returns(new List<string> { "d" });
            BookManager bmMOQ = new BookManager(dal_mock.Object);
            bmMOQ.GetBooks().ForEach(i => Console.WriteLine(i));
        }
    }
}
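Besides stubbing return values, Moq can also verify that a call actually happened. A minimal sketch, using the mock from the example above (place it after the GetBooks call in Main):

// Fails with a MockException if RetrieveBooks was not called exactly once
dal_mock.Verify(i => i.RetrieveBooks(), Times.Once());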

ASP.NET Core Tips and Quick Setup Identity System

In this blog post we are going to set up a basic invoice system that uses the ASP.NET Core Identity system. Every step for creating the app is described and at the end you should have a working invoicing system (albeit a simple one).

For convenience, install sqlitebrowser:

sudo apt install sqlitebrowser

If you have not already installed the dotnet-aspnet-codegenerator tool:

dotnet tool install --global dotnet-aspnet-codegenerator

Also install LibMan, the client-side library manager:

dotnet tool install -g Microsoft.Web.LibraryManager.Cli

Create the initial WebApp project

dotnet new webapp -o WebApp

A folder, WebApp, with the new web application is created

cd WebApp

Add the required package for the aspnet-codegenerator tool

dotnet add package Microsoft.VisualStudio.Web.CodeGeneration.Design

Optional: add the package below manually so you use the newest version (not the one installed by default by the aspnet-codegenerator):

dotnet add package Microsoft.EntityFrameworkCore.Sqlite

In Visual Studio you can install the package as a nuget package through the Package Manager console with

PM> Install-Package Microsoft.EntityFrameworkCore.Sqlite -Version 2.2.6

Scaffold the Identity pages you want to change later on. For now we are going to use a SQLite database and override the Register, Login and Logout pages.

dotnet aspnet-codegenerator identity --useSqLite -dc WebApp.Data.Identity.IdentityDbContext --files "Account.Register;Account.Login;Account.Logout"

Before we add the migrations we change the name (and location) of the database to dbs/identity.db (we will have separate databases for users and data).

Start Visual Studio Code in the root of the WebApp directory.

code .

Wait a few seconds for the prompt shown below to appear and answer Yes. If it does not appear, press F1, type “.NET” and then select “.NET: Generate Assets for Build and Debug”.

Open the file appsettings.json in the root of the project and change WebApp.db to dbs/identity.db. Also create the folder dbs in the root of WebApp.
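The connection strings section of appsettings.json then looks roughly like this (the exact key name depends on what the Identity scaffolder generated for your context):

"ConnectionStrings": {
  "IdentityDbContextConnection": "DataSource=dbs/identity.db"
}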

Now we are going to create the Migrations for the initial Identity database and update the database with this migration.

Create the initial migration for the identity system

dotnet ef migrations add InitialCreateIdentity 

Create the database

dotnet ef database update

Check the database structure with sqlitebrowser (DB Browser for SQLite).

Because we did not use the --auth parameter when creating the project, our Startup.cs is not prepared for authentication. Add the line below right after app.UseCookiePolicy:

app.UseAuthentication();

We also have to add the login partial to the layout. Add the partial _LoginPartial to /Pages/Shared/_Layout.cshtml right before the ul that contains the Home link, using the line below:

<partial name="_LoginPartial" />

To test authorization, place the [Authorize] attribute on the PrivacyModel class and add a using for Microsoft.AspNetCore.Authorization:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;
using Microsoft.AspNetCore.Authorization;

namespace WebApp.Pages
{
    [Authorize]
    public class PrivacyModel : PageModel
    {
        public void OnGet()
        {
        }
    }
}

It is possible to configure password options in /Areas/Identity/IdentityHostingStartUp.cs. For example: do not require an uppercase character in the password:

services.Configure<IdentityOptions>(options =>
{
   options.Password.RequireUppercase = false;
});

Now we are going to add our first CRUD pages. We are going to store invoices in our application. First create a directory Models and place a file Invoice.cs in it with the following code:

namespace WebApp.Models
{
    using System;
    using System.ComponentModel.DataAnnotations;
    using System.ComponentModel.DataAnnotations.Schema;

    public class Invoice
    {
        public int ID { get; set; }
        public string Product { get; set; }

        [DataType(DataType.Date)]
        public DateTime InvoiceDate { get; set; }
        [Column(TypeName = "decimal(18,2)")]
        public decimal DiscountPercentage { get; set; }
        [Column(TypeName = "decimal(18,2)")]
        public decimal Amount { get; set; }
    }
}




Scaffold model CRUD pages

Execute the command below:

dotnet aspnet-codegenerator razorpage -m Invoice -dc InvoiceDbContext -udl -outDir Pages/Invoices --referenceScriptLibraries

The following files will be generated or adjusted:

In /Startup.cs the InvoiceDbContext is added to the services configuration
A directory /Pages/Invoice is created and all files in there are also generated
A directory /Data is created in which a file InvoiceDbContext.cs is stored with the data context for the invoices
The file appsettings.json is modified. A connection string is added for the InvoiceDbContext (we will change this below)

In case you are on Linux: by default the DbContext uses SQL Server (LocalDB), which is not supported on the Linux platform. Go to the file Startup.cs and replace UseSqlServer with UseSqlite for the InvoiceDbContext.

services.AddDbContext<InvoiceDbContext>(options =>
   options.UseSqlite(Configuration.GetConnectionString("InvoiceDbContext")));

Next edit the file appsettings.json and replace the connection string value with "DataSource=dbs/invoices.db"

Next create the initial migrations and update the invoice database for the InvoiceDbContext.

dotnet ef migrations add InitialCreateInvoices --context InvoiceDbContext
dotnet ef database update --context InvoiceDbContext

Start sqlitebrowser to check the structure of your database (/WebApp/dbs/invoices.db).

Now, to protect our Invoices folder from unauthorized access, add the code below to your services configuration (ConfigureServices) in Startup.cs:

services.AddMvc().AddRazorPagesOptions(options =>  
            {  
                options.Conventions.AuthorizePage("/Invoices");  
            });

References and handy URLs

This URL was very helpful

Notes based on the ContosoUniversity tutorial

Some common errors

dotnet restore gives a 401 error

Update your credentials for NuGet. First download the NuGet CLI, then execute the command:

nuget.exe sources update -name RDW -source [url] -username [user] -password [pwd]

Retrieve your source name and URL with the command:
nuget sources

When you get the error “Scheme already exists: Identity.Application” you probably generated the Identity pages with a different context than ApplicationDbContext.

Build the solution (Ctrl-Shift-B) and start debugging with F5. Navigate to the Privacy page and verify that you have to log in before you can continue to this page.

In case you get the error “PlatformNotSupportedException: LocalDB is not supported on this platform.” you probably generated the Identity pages with a different context than ApplicationDbContext.


Read data from smartmeter with nodejs

Install the necessary node modules (see below) and read data from the P1 cable connected to your smart meter. The script includes CRC checking of the received telegrams.
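The script relies on the serialport package and its inter-byte-timeout parser, so install those first:

npm install serialport @serialport/parser-inter-byte-timeout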

const SerialPort = require('serialport')
const InterByteTimeout = require('@serialport/parser-inter-byte-timeout')
const port = new SerialPort('/dev/ttyUSB0')

const parser = port.pipe(new InterByteTimeout({interval: 100}))

const table = [
0x0000, 0xc0c1, 0xc181, 0x0140, 0xc301, 0x03c0, 0x0280, 0xc241,
0xc601, 0x06c0, 0x0780, 0xc741, 0x0500, 0xc5c1, 0xc481, 0x0440,
0xcc01, 0x0cc0, 0x0d80, 0xcd41, 0x0f00, 0xcfc1, 0xce81, 0x0e40,
0x0a00, 0xcac1, 0xcb81, 0x0b40, 0xc901, 0x09c0, 0x0880, 0xc841,
0xd801, 0x18c0, 0x1980, 0xd941, 0x1b00, 0xdbc1, 0xda81, 0x1a40,
0x1e00, 0xdec1, 0xdf81, 0x1f40, 0xdd01, 0x1dc0, 0x1c80, 0xdc41,
0x1400, 0xd4c1, 0xd581, 0x1540, 0xd701, 0x17c0, 0x1680, 0xd641,
0xd201, 0x12c0, 0x1380, 0xd341, 0x1100, 0xd1c1, 0xd081, 0x1040,
0xf001, 0x30c0, 0x3180, 0xf141, 0x3300, 0xf3c1, 0xf281, 0x3240,
0x3600, 0xf6c1, 0xf781, 0x3740, 0xf501, 0x35c0, 0x3480, 0xf441,
0x3c00, 0xfcc1, 0xfd81, 0x3d40, 0xff01, 0x3fc0, 0x3e80, 0xfe41,
0xfa01, 0x3ac0, 0x3b80, 0xfb41, 0x3900, 0xf9c1, 0xf881, 0x3840,
0x2800, 0xe8c1, 0xe981, 0x2940, 0xeb01, 0x2bc0, 0x2a80, 0xea41,
0xee01, 0x2ec0, 0x2f80, 0xef41, 0x2d00, 0xedc1, 0xec81, 0x2c40,
0xe401, 0x24c0, 0x2580, 0xe541, 0x2700, 0xe7c1, 0xe681, 0x2640,
0x2200, 0xe2c1, 0xe381, 0x2340, 0xe101, 0x21c0, 0x2080, 0xe041,
0xa001, 0x60c0, 0x6180, 0xa141, 0x6300, 0xa3c1, 0xa281, 0x6240,
0x6600, 0xa6c1, 0xa781, 0x6740, 0xa501, 0x65c0, 0x6480, 0xa441,
0x6c00, 0xacc1, 0xad81, 0x6d40, 0xaf01, 0x6fc0, 0x6e80, 0xae41,
0xaa01, 0x6ac0, 0x6b80, 0xab41, 0x6900, 0xa9c1, 0xa881, 0x6840,
0x7800, 0xb8c1, 0xb981, 0x7940, 0xbb01, 0x7bc0, 0x7a80, 0xba41,
0xbe01, 0x7ec0, 0x7f80, 0xbf41, 0x7d00, 0xbdc1, 0xbc81, 0x7c40,
0xb401, 0x74c0, 0x7580, 0xb541, 0x7700, 0xb7c1, 0xb681, 0x7640,
0x7200, 0xb2c1, 0xb381, 0x7340, 0xb101, 0x71c0, 0x7080, 0xb041,
0x5000, 0x90c1, 0x9181, 0x5140, 0x9301, 0x53c0, 0x5280, 0x9241,
0x9601, 0x56c0, 0x5780, 0x9741, 0x5500, 0x95c1, 0x9481, 0x5440,
0x9c01, 0x5cc0, 0x5d80, 0x9d41, 0x5f00, 0x9fc1, 0x9e81, 0x5e40,
0x5a00, 0x9ac1, 0x9b81, 0x5b40, 0x9901, 0x59c0, 0x5880, 0x9841,
0x8801, 0x48c0, 0x4980, 0x8941, 0x4b00, 0x8bc1, 0x8a81, 0x4a40,
0x4e00, 0x8ec1, 0x8f81, 0x4f40, 0x8d01, 0x4dc0, 0x4c80, 0x8c41,
0x4400, 0x84c1, 0x8581, 0x4540, 0x8701, 0x47c0, 0x4680, 0x8641,
0x8201, 0x42c0, 0x4380, 0x8341, 0x4100, 0x81c1, 0x8081, 0x4040
]

parser.on('data', (data) => {

        let packet = data.toString('utf8');
        let lines = packet.split(/\r\n|\n|\r/);

        if (lines[0].indexOf('/Ene5\\T210-D ESMR5.0') === 0) {

                let crc = 0;

                for (var i = 0; i < data.length - 6; i++) {
                        crc = (crc >> 8) ^ table[(crc ^ data[i]) & 0xff]
                }

                let packetCrc = parseInt('0x' + lines[lines.length - 2].substring(1));
                if (crc == packetCrc) {
                        for (var i = 0; i < lines.length; i++) {
                                console.log(lines[i]);
                        }
                }
        }
});

nodejs – sequelize ORM many to many setup


With the help of the Sequelize ORM you can manage your database models and queries. Below is an example of how to set up a many-to-many relationship with Node.js and Sequelize. At the end of this article you can find a link to the source code.

First of all our scenario:

We have Invoices and Products in our database. As you can imagine, you can receive an invoice for multiple products, and a product can occur on multiple invoices. The relation between Invoices and Products is many-to-many.

One invoice can have one or more products, and one product can appear on one or more invoices.

The way to model this in a database is with a so-called join table. Each row in this table points to an Invoice and to a Product. Let’s say we want to send two invoices.

Invoice number 2019001 with an SSD, a laptop and a hard disk on it, and invoice number 2019002 with a hard disk and a monitor on it. We have not yet sold any desktops.

Our Invoice and Product tables look like this:

INVOICE                  PRODUCTS
ID   NR                  ID  DESC
1    2019001             1   Laptop
2    2019002             2   SSD
                         3   Harddisk
                         4   Monitor
                         5   Desktop

Then we have our join table, which I have called ProductInvoices. After sending the invoices to our customers it will have the contents shown below.

PRODUCTINVOICES
INVOICE_ID PRODUCT_ID
1          1
1          2
1          3
2          3
2          4

OK, now let’s switch over to Node.js and Sequelize. How are we going to model this? Below are all the steps to create a working program.

Create a new directory and set up your Node environment, then initialize a new Sequelize project with the sequelize init command (you need the sequelize-cli package for this to work).

mkdir m_to_m
cd m_to_m
npm init -y
npm install sequelize-cli sequelize sqlite3
./node_modules/.bin/sequelize init

After this you will have the following directory structure (I use Visual Studio Code for my development work, as you can see).

Now we are going to create the Invoice model. Start Visual Studio Code in the m_to_m directory (execute code . in this directory). Right-click the models folder and choose New File. Name it invoice.js and place the code shown below in it.

module.exports = (sequelize, DataTypes) => {

	var Invoice = sequelize.define('Invoices', {
		nr : {
			type : DataTypes.INTEGER,
			allowNull : false,
			unique : {msg : 'Invoice nr should be unique'}
		}
	});

	Invoice.associate = (models) => {
    		Invoice.belongsToMany(models.Products, {
			through: 'ProductInvoices',
			as: 'Products',
			foreignKey: 'invoiceId'
		});
	};

	return Invoice;
}

Next create a file product.js also in the models folder with the following content:

module.exports = (sequelize, DataTypes) => {
		
	var Product = sequelize.define('Products', {

		name : {
			type : DataTypes.STRING,
            allowNull : false,
			unique : {msg : 'Product name should be unique'}
		}
		
	});

	Product.associate = (models) => {
			Product.belongsToMany(models.Invoices, {
			through: 'ProductInvoices',
			as: 'Invoices',
			foreignKey: 'productId'
		});
	};

	return Product;
}

The next model is optional. When you do not specify it, Sequelize will generate one for you. Specifying it yourself has the advantage that you can also query this model if needed and add extra attributes to the relation. For this example I have added a ‘remark’ attribute to the join table, so we can add a remark about a product on a specific invoice.

So for now let’s code it, but remember: it is optional if you do not need to query the join table and do not have any relation attributes.

module.exports = (sequelize, DataTypes) => {
		
	var ProductInvoices = sequelize.define('ProductInvoices', {

		id : {
			type : DataTypes.INTEGER,
			primaryKey : true,
			autoIncrement: true,
		},

		productId : {
			type : DataTypes.INTEGER,
			unique : false,
            allowNull : false,
		},
		
		invoiceId : {
			type : DataTypes.INTEGER,
			unique : false,
            allowNull : false,
		},

		remark : {
			type : DataTypes.STRING,
			unique : false,
            allowNull : true,
		},
	});

	ProductInvoices.associate = (models) => {
		
		ProductInvoices.belongsTo(models.Products, {
			as: 'Product',
			foreignKey: 'productId'
		});
		ProductInvoices.belongsTo(models.Invoices, {
			as: 'Invoice',
			foreignKey: 'invoiceId'
		});
	};

	return ProductInvoices;
}

Now our model is complete and we can start to program against it. I love the Sequelize ORM; if you don’t yet, you will in a minute…

We have to do some configuration for the database connection before we are able to create the database. In this example I will be using a SQLite database. Remove the file generated by Sequelize at ~/m_to_m/config/config.json. Create a new file at the same location but with the extension .js and place the contents shown below in this config.js file:

module.exports = {
  "development": {
    "dialect": "sqlite",
    "storage" : "./database_dev.sqlite3",
    "operatorsAliases": false,
  },
  "test": {
    "dialect": "sqlite",
    "storage" : "./database_tst.sqlite3",
    "operatorsAliases": false
  },
  "production": {
    "dialect": "sqlite",
    "storage" : "./database_prd.sqlite3",
    "operatorsAliases": false
  }
}

There is one last thing to do before our database setup is complete. Edit the generated index.js file at ~/m_to_m/models and change config.json to config.js (remove the “on” at the end of the filename extension).

const config = require(__dirname + '/../config/config.js')[env];

Now that the database setup is complete, let’s write some code. Create a new file in the root of your project, name it app.js and place the code shown below in it.

var models = require('./models');
models.sequelize.sync().then((db) => {
   console.log(`DB ${db.options.storage} created.`);
});

When you execute this node program the database database_dev.sqlite3 will be created (see the config.js file in the config directory). The SQL that is executed against the SQLite driver is also shown:

~/m_to_m$: node app.js
Executing (default): CREATE TABLE IF NOT EXISTS 
`Invoices` (`id` INTEGER PRIMARY KEY AUTOINCREMENT, `nr` INTEGER NOT NULL, `createdAt` DATETIME NOT NULL, `updatedAt` DATETIME NOT NULL);

Executing (default): PRAGMA INDEX_LIST(`Invoices`)

Executing (default): PRAGMA INDEX_INFO(`sqlite_autoindex_Invoices_1`)

Executing (default): CREATE TABLE IF NOT EXISTS 
`Products` (`id` INTEGER PRIMARY KEY AUTOINCREMENT, `name` VARCHAR(255) NOT NULL, `createdAt` DATETIME NOT NULL, `updatedAt` DATETIME NOT NULL);

Executing (default): PRAGMA INDEX_LIST(`Products`)

Executing (default): PRAGMA INDEX_INFO(`sqlite_autoindex_Products_1`)

Executing (default): CREATE TABLE IF NOT EXISTS `ProductInvoices` (`id` INTEGER PRIMARY KEY AUTOINCREMENT, `productId` INTEGER NOT NULL REFERENCES `Products` (`id`) ON DELETE CASCADE ON UPDATE CASCADE, `invoiceId` INTEGER NOT NULL REFERENCES `Invoices` (`id`) ON DELETE CASCADE ON UPDATE CASCADE, `remark` VARCHAR(255), `createdAt` DATETIME NOT NULL, `updatedAt` DATETIME NOT NULL, UNIQUE (`productId`, `invoiceId`));

Executing (default): PRAGMA INDEX_LIST(`ProductInvoices`)

Executing (default): PRAGMA INDEX_INFO(`sqlite_autoindex_ProductInvoices_1`)
~/m_to_m$:DB Created

To inspect the contents of this database you can for example use “DB Browser for SQLite” or DBeaver. Install “DB Browser for SQLite” with:

sudo apt install sqlitebrowser
DB Browser for SQLite in action

Now we are going to add some records to our database and query the database. We will be adding five products and two invoices as described above. After that we will generate an overview of the data in the database.

Here is the code to store and query our Invoices and products.

var models = require('./models');

(async () => {

    let db = await models.sequelize.sync();

    console.log(`DB ${db.options.storage} created.`);

    await models.Invoices.destroy({ where : {}, truncate : true});
    await models.Products.destroy({ where : {}, truncate : true});

    let laptop = await models.Products.create( { name : 'Laptop' })
    let ssd = await models.Products.create( { name : 'SSD' });
    let harddisk = await models.Products.create( { name : 'Harddisk' });
    let monitor = await models.Products.create( { name : 'Monitor' });
    let desktop = await models.Products.create( { name : 'Desktop' });

    invoice = await models.Invoices.create({ nr : 2019001 });
    await invoice.addProducts([ssd],  {through: { remark: '10% discount on SSD' }});
    await invoice.addProducts([harddisk, laptop]);

    invoice = await models.Invoices.create({ nr : 2019002 });
    await invoice.addProducts([harddisk, monitor]);

    invoices = await models.Invoices.findAll(
        { 
            include : [ {model: models.Products, through: 'ProductInvoices', as: 'Products'}]
        }
    );

    invoices.forEach(invoice => {

        console.log(`INVOICE: ${invoice.id} ${invoice.nr}`);
        invoice.Products.forEach(product => {
            console.log(`\tPRODUCT: ${product.id} ${product.name}`);
        })
        console.log();

    });

    products = await models.Products.findAll(
        { 
            include : [ {model: models.Invoices, through: 'ProductInvoices', as: 'Invoices'}]
        }
    );

    products.forEach(product => {

        console.log(`PRODUCT ${product.id} ${product.name}`);
        product.Invoices.forEach(invoice => {
            console.log(`\tINVOICE: ${invoice.id} ${invoice.nr}`);
        })
        console.log();

    });

    /*
     * Weird but possible adding records via jointable.....
     */
    pi = await models.ProductInvoices.create(
    { 
        invoiceId : (await models.Invoices.create({ nr : 2019009})).id,
        productId : (await models.Products.findOne({ where : { name : 'Desktop' } })).id
    });

    productInvoices = await models.ProductInvoices.findAll(
        { 
            include : [ 
                {model: models.Products, as : 'Product'},
                {model: models.Invoices, as : 'Invoice'}
            ]
        }
    );

    productInvoices.forEach(pi => {
        console.log(`${pi.id} ${pi.Invoice.id} ${pi.Invoice.nr} ${pi.Product.id} ${pi.Product.name}`);
    });
})();

After running this code you will have two invoices and five products in your database. The image below shows the console output of running this Node.js program. I have switched off Sequelize logging (add an attribute “logging”: false to the appropriate entry in ~/config/config.js, as sketched below) so the output is a bit more readable.
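For reference, the development entry in config.js with logging switched off could look like this:

  "development": {
    "dialect": "sqlite",
    "storage" : "./database_dev.sqlite3",
    "operatorsAliases": false,
    "logging" : false
  },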

You can find the complete working example here and a TypeScript variant can be found here.

I still have to elaborate on this; see https://www.npmjs.com/package/sequelize-typescript



nodejs – using typescript with nodemon

Create a new directory and run the command below in it

npm init -y

Next install nodemon with

sudo npm install -D nodemon

Add a script entry to your package.json to execute the nodemon command. This works because npm looks under node_modules/.bin for the executable. The package.json then looks like this:

{
  "name": "learnnode",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "nodemon": "nodemon"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "nodemon": "^1.18.9"
  }
}

Create a new file index.js, this is the default file nodemon looks for when started without any parameters. index.js contains only:

console.log('Hello');

Now start the nodemon executable

npm run nodemon

Now if you change the index.js file, nodemon will restart and run Node.js with the new index.js.

Now we want to make use of TypeScript. Install the typescript compiler locally together with the ts-node binary with

npm install -D typescript
npm install -D ts-node

Also add the build scripts shown below to your package.json (build runs the TypeScript compiler, build:live runs nodemon with ts-node). The complete contents of the package.json file are shown below:

{
  "name": "ts",
  "version": "1.0.0",
  "description": "Example live build",
  "main": "index.js",
  "scripts": {
    "start": "npm run build:live",
    "build": "tsc -p .",
    "build:live": "nodemon --watch '*.ts' --exec 'ts-node' app.ts"
  },
  "keywords": [],
  "author": "B J de Jong",
  "license": "ISC",
  "devDependencies": {
    "nodemon": "^1.18.9",
    "ts-node": "^7.0.1",
    "typescript": "^3.2.2"
  },
  "dependencies": {}
}

Now you will need a tsconfig.json for the typescript compiler. Create a tsconfig.json file with default settings with:

npx tsc --init

Now, to start monitoring your .ts files, run the ‘start’ npm script with:

npm run start

nodejs – using express-session

In this article I will show you how to use Express sessions. By default express-session stores sessions in memory. With the help of the package ‘session-file-store’ you can persist sessions to your filesystem.

Use memory store for sessions (default)

First set up your Node.js app to use express and express-session:

# Create new package.json file
npm init -y
# Add express and express-session to the node modules
npm install express express-session --save

Now add an app.js file to the current folder with the following contents:

const session = require('express-session');
// const FileStore = require('session-file-store')(session);
const express = require('express')
const app = express()
const port = 3000

app.use(session({
        secret: 'mysecret', 
        resave : false, 
        saveUninitialized : false,
        // store: new FileStore()
    })
)

app.get('/', (req, res) => {

    if (req.session.views) {
        req.session.views++
        res.send(`Returning client (${req.session.views} times)`)
    }
    else {
        req.session.views = 1
        res.send('New client')
    }
})

app.get('/destroy', (req, res) => {
    req.session.destroy()
    res.send('Session destroyed')
})

app.listen(port, () => console.log(`Listening on port ${port}`))

Start your nodejs application with

nodemon app.js

and navigate to ‘http://localhost:3000/’. A web page shows up with the text ‘New client’. Now hit F5 and the text ‘Returning client (2 times)’ appears. The session is created on the first request with a ‘views’ variable in it; on every next visit this ‘views’ variable is incremented by 1.

Use a FileStore for session data

Now if you want to use persistent sessions you have to install session-file-store with:

npm install session-file-store --save

Uncomment the two lines of code in app.js and you are ready to go. Sessions are stored on the filesystem in a sub folder called ‘sessions’ below the location of your app.js.

If you are using nodemon to monitor changes in your nodejs code be sure to exclude monitoring of the ‘sessions’ folder as it will change on every request of the browser. Start nodemon with:

nodemon --ignore sessions/ app.js

Custom session IDs

In case you want to generate custom session IDs you have to provide a genid callback to the session initializer. First add the uuid package with:

npm install uuid

Add the require statement to the top of your app.js file:

const uuidv1 = require('uuid/v1')

And add the genid callback to the session initialization:

app.use(session({
        genid: (req) => {
            return 'app_' + uuidv1() // use UUIDs for session IDs
        },
        secret: 'keyboard cat', 
        resave : false, 
        saveUninitialized : false,
        store: new FileStore()
    })
)

nodejs – use mongodb for CRUD application

In this article I’m going to create a minimalistic CRUD application with Node.js, Express and MongoDB. First I will show you the Pug files and finally the Node.js code that ties the application together.

To use MongoDB you have to install it on your (Ubuntu) box with:

sudo apt install mongodb

Then we have to add the node module to our project (and package.json) with:

npm install mongodb --save

Now on to the Pug files. First of all the ‘index.pug’ file (remember, Pug files are stored in the views folder by default).

html
  head
  body
    h2 Add email address
    a(href='/all') All
    form(method='POST' action='/add')
      div
        p Your email address
        input#name.form-control(type='text', placeholder='Email address' required name='email')
      p 
          button.btn.btn-primary(type='submit') Sign up

Then we have the ‘all.pug’ file which gives us an overview of entries in the database together with a link to delete or edit the entry

html
  head
  body
    h2 List of email addresses
    a(href='/') Add
    table
        each email in email_addresses
            tr#email_list_item    
                td #{email.email}
                td #{email.i}
                td
                    a(href='delete?id=' + email._id) x
                td
                    a(href='edit?id=' + email._id) e

We have an ‘edit.pug’ file to edit our document entries

html
  head
  body
    h2 Update email address
    a(href='/all') All
    form(method='POST' action='/edit')
        input#email.form-control(type='hidden', name='id' value=id)
        div
            p Edit email address
            input#email.form-control(type='text', placeholder='Email address' required name='email' value=email_address)
        p 
            button.btn.btn-primary(type='submit') Update
        p

And finally we have our app.js nodejs application.

const express = require('express')
const app = express()
const port = 3000

app.use(express.json())
app.use(express.urlencoded({ extended: true }))
app.set('view engine', 'pug')

var ObjectId = require('mongodb').ObjectID;
var MongoClient = require('mongodb').MongoClient;

// Connect to the db
MongoClient.connect("mongodb://localhost:27017/", { useNewUrlParser: true }, function(err, client) {

    if (err) throw (err)
    mongoDB = client.db('emailaddresses')
    app.listen(port, () => console.log(`Listening on port ${port}`))

});

app.get('/', (req, res) =>  {

        res.render('index')

    }
)

app.post('/add', (req, res) =>  {
    
    mongoDB.collection('email_addresses').insertOne(req.body, (err, result) => {

        if (err) throw err
        res.redirect('all')

    })
})

app.get('/all', (req, res) =>  {

    mongoDB.collection('email_addresses').find().toArray((err, results) => {

        if (err) throw err
        res.render('all', { 'email_addresses': results })

    })

})

app.get('/delete', (req, res) =>  {

    mongoDB.collection('email_addresses').deleteOne({'_id': ObjectId(req.query.id)}, (err, results) => {

        if (err) throw err
        res.redirect('/all')
        
    });
})

app.get('/edit', (req, res) =>  {

    mongoDB.collection('email_addresses').findOne({'_id': ObjectId(req.query.id)}, (err, result) => {

        if (err) throw err
        res.render('edit', { 'email_address' : result.email, 'id' : result._id })

    });
})

app.post('/edit', (req, res) => {

    mongoDB.collection('email_addresses').updateOne({'_id': ObjectId(req.body.id)}, { $set:req.body }, (err, result) => {

        if(err) throw err
        res.redirect('/all')

    });
})