Copying a nopCommerce Environment

Copying a nopCommerce environment to another destination, such as your local machine or another hosted environment, requires two major steps:

  • Copying the database
  • Cloning the source code and configuring to use the copied database

Copy the Database

  1. Using SSMS, log in to the source database server (usually the remote environment) using administrative credentials.
  2. On the source database server, right-click the database and click Tasks -> Export Data-tier Application.
  3. Save the .bacpac file to the destination machine.
  4. Connect to the destination database server (usually local).
  5. Right-click ‘Databases’ on the destination server and click ‘Import Data-tier Application’.
  6. Use the .bacpac file created from the source database.

Copy and Configure the Source Code

  1. Clone the source code to the destination machine.
  2. Create the /Presentation/Nop.Web/App_Data/dataSettings.json file with the following content:

{
  "DataProvider": "sqlserver",
  "DataConnectionString": "Data Source=<YOUR_DB_SERVER_NAME>;Initial Catalog=<YOUR_DB_NAME>;Integrated Security=True;",
  "RawDataSettings": {}
}

For example, pointing at a local SQL Express instance:

{
  "DataProvider": "sqlserver",
  "DataConnectionString": "Data Source=localhost\\SQLEXPRESS;Initial Catalog=uesaka;Integrated Security=True;",
  "RawDataSettings": {}
}
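A malformed dataSettings.json tends to fail in unhelpful ways at startup, so it's worth confirming the file parses as JSON before launching the site. A minimal sketch, assuming a local path and using the example connection string above (server and database names are placeholders):

```shell
# Write a sample dataSettings.json (placeholder server/database names)
mkdir -p App_Data
cat > App_Data/dataSettings.json <<'EOF'
{
  "DataProvider": "sqlserver",
  "DataConnectionString": "Data Source=localhost\\SQLEXPRESS;Initial Catalog=uesaka;Integrated Security=True;",
  "RawDataSettings": {}
}
EOF

# Validate the JSON up front rather than debugging it at runtime
python3 -m json.tool App_Data/dataSettings.json
```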

Next, verify the site can be accessed.

Finally, make sure to change the store URL to the new URL being used.

Troubleshooting with Nop-templates

If you’re using nop-templates, you’ll run into some issues when trying to copy.

I had to delete all rows from the following SQL table:

delete from SS_C_Condition

Of course, this currently disables all HTML Widgets – so there must be a better way to do this.


Converting nopCommerce to Store Images In Filesystem Instead of Database

By default, nopCommerce stores all of its images in the database, which makes access simple. With a large number of images, however, this puts significant load on the database and makes for a larger database to maintain.

Some research shows that performance doesn’t differ greatly between the two options, but here are a few reasons you might choose either:

Storing in Database

  • Allows for easier backups & environment clones (only need to clone codebase and DB)
  • Single place for image storage (useful with multiple systems)

Storing on Fileserver

  • Separates a large portion of data from the DB (allows for a smaller DB)
  • Lowers load on the DB server

First, back up your database, since this process can cause major issues if something goes wrong (and you’ll want the backup if you later decide to revert to storing images in the database).

To convert from storing images in the database to the file server, go to Admin -> Settings -> Media Settings. Click on ‘Change’:

After the conversion completes, nopCommerce will automatically move all images into the /Content/Images folder. You’ll also notice that the binaries are no longer stored in the Picture table.

Shrink Database

Once this is done, you can free up a good amount of space in the database, since the image binaries are now stored on the fileserver.

To reclaim this space, you’ll need to shrink the database. Run the following command in SQL (replace <YOUR_DB_NAME> with your database name):

DBCC SHRINKDATABASE (N'<YOUR_DB_NAME>');


Set up Pi-Hole to Block Ads at the Network Level

You can use a Raspberry Pi to block ads network-wide at the DNS level. You’ll need the following:


  • CanaKit (provides Raspberry Pi, Power Supply)
  • Short ethernet cable
  • Monitor and keyboard for initial setup

Raspberry Pi Initial Installation and Configuration

Assemble and then plug in the Raspberry Pi, which should take you to the NOOBS setup window.

Install Raspbian, and work through until you get to the desktop screen for the Raspberry Pi.

Router Configuration

With the Raspberry Pi configured, connect to your router admin page and find the IP address of the Raspberry Pi. Assign a static IP to the Raspberry Pi.

Pi-Hole Installation

On the Raspberry Pi, run the following commands to start installation:

wget -O basic-install.sh https://install.pi-hole.net
sudo bash basic-install.sh

When asked for Upstream DNS Provider, select Cloudflare.

For everything else, just select the default options.

After installation finishes, you’ll be able to log into the web admin with the password provided at the end of installation.

If you want to change the password used to log in, run the following command on the Raspberry Pi:

pihole -a -p

Pi-Hole Configuration

Access the web interface using http://<IP_ADDRESS>/admin, and log in using the password above.

Change the DNS server on your router to the IP address above.


An easy way to verify is to visit a page that normally displays ads, confirm that none appear, and then check the query log in the Pi-Hole admin:



Changing Default Token Expiration for Azure AD

To change the default token expiration timeframe when using Azure AD for authentication, you can do the following.

First, if you haven’t yet, install the AzureADPreview PowerShell Module:

Install-Module AzureADPreview

Now, connect to Azure AD using an account that has access to manage App Registrations:

Connect-AzureAD

After that, check for any existing token lifetime policies, and delete any that are already in place:

Get-AzureADPolicy
Remove-AzureADPolicy -Id <POLICY_OBJECT_ID>

After that, create a new policy (this one is set for 30 minutes as an example):

$policy = New-AzureADPolicy -Definition @('{"TokenLifetimePolicy":{"Version":1,"AccessTokenLifetime":"00:30:00","MaxAgeSessionSingleFactor":"00:30:00"}}') -DisplayName "CustomPolicy" -IsOrganizationDefault $false -Type "TokenLifetimePolicy"
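The -Definition argument is a JSON string, and PowerShell quoting mistakes are easy to make here. As a quick sanity check (this only validates the string's shape, not whether Azure AD accepts it), you can confirm it parses as JSON on its own:

```shell
# The TokenLifetimePolicy definition, exactly as passed to -Definition
POLICY='{"TokenLifetimePolicy":{"Version":1,"AccessTokenLifetime":"00:30:00","MaxAgeSessionSingleFactor":"00:30:00"}}'

# Fails with a parse error if the quoting is broken
echo "$POLICY" | python3 -m json.tool
```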

And apply that policy to the service principal tied to the Azure AD integration:

$sp = Get-AzureADServicePrincipal -Filter "DisplayName eq '<service principal display name>'"

Add-AzureADServicePrincipalPolicy -Id $sp.ObjectId -RefObjectId $policy.Id

Now verify that the policy is in place:

Get-AzureADServicePrincipalPolicy -Id $sp.ObjectId


Converting P7B Certificates into PFX Certificates

To convert a P7B certificate into a PFX certificate, you’ll need the following:

  • The .p7b certificate created after the CSR is generated.
  • The private key (likely .pem or .key) generated when generating the CSR.

First, double-click the .p7b file and export all of the certs that appear in Certificate Manager as Base64-encoded .CER files:

Once this is done, you’ll be able to create the .PFX file with the following openssl command:

openssl pkcs12 -export -in certificatename.cer -inkey privateKey.key -out certificatename.pfx -certfile cacert1.cer -certfile cacert2.cer 
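If you want to rehearse the conversion before working with a real certificate, a throwaway self-signed cert works end to end. This sketch assumes openssl is on your PATH; the file names and the "demo" password are examples, and the CA chain (-certfile) is omitted since a self-signed cert has none:

```shell
# Create a throwaway key and self-signed cert to stand in for the issued certificate
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout privateKey.key -out certificatename.cer -subj "/CN=demo.example"

# Build the .pfx from the cert and its private key
openssl pkcs12 -export -in certificatename.cer -inkey privateKey.key \
  -out certificatename.pfx -passout pass:demo

# Confirm the resulting .pfx is readable
openssl pkcs12 -in certificatename.pfx -info -noout -passin pass:demo
```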

Generating CSRs the Easy Way In Windows

First, download the DigiCert Certificate Utility.

Afterwards, click on “Create CSR”, and fill out the appropriate information to generate the CSR.

Accessing the Private Key for the CSR

When a CSR is generated, a private key is created that is associated with the CSR (and the eventual certificate). You can access this in certmgr.exe:

To get the private key, go to ‘All Tasks’ -> ‘Export’ and export the private key as needed. Windows will export it as a .pfx file, which you can convert using openssl:

openssl pkcs12 -in exported_cert.pfx  -nocerts -out key.pem

When prompted for the password and PEM passphrase, use the password provided in the export step above.
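The same round trip can be rehearsed with a throwaway certificate. This is a sketch assuming openssl is available; the file names and the "demo" password are examples standing in for the certmgr export:

```shell
# Generate a key + cert and pack them into a .pfx, as the certmgr export would produce
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout orig.key -out orig.cer -subj "/CN=demo.example"
openssl pkcs12 -export -in orig.cer -inkey orig.key \
  -out exported_cert.pfx -passout pass:demo

# Pull the private key back out; -passin is the export password,
# -passout sets the passphrase protecting the new key.pem
openssl pkcs12 -in exported_cert.pfx -nocerts -out key.pem \
  -passin pass:demo -passout pass:demo

# key.pem now holds the passphrase-protected private key
openssl rsa -in key.pem -check -noout -passin pass:demo
```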


Adding Icons to an Angular Web Site

Changing out the icons for an Angular website is just a few steps. This guide assumes you have an icon already in place, preferably in PNG format.

First, use a tool like Real Favicon Generator to create the source files, which will include a favicon.ico file alongside a series of apple-touch-icon* files. Add these files to the /src directory.

After that, make the following change to your index.html file:

    <link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png">
    <link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
    <link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
    <link rel="manifest" href="/site.webmanifest">
    <link rel="mask-icon" href="/safari-pinned-tab.svg" color="#5bbad5">
    <meta name="msapplication-TileColor" content="#da532c">
    <meta name="theme-color" content="#ffffff">

And finally make the change to angular.json to ensure the files are delivered correctly on build:

"assets": [
  "src/assets",
  "src/favicon.ico",
  "src/apple-touch-icon.png",
  "src/favicon-32x32.png",
  "src/favicon-16x16.png",
  "src/site.webmanifest",
  "src/safari-pinned-tab.svg"
],

To test that everything worked successfully, make sure to run a build:

ng build

After this finishes, check the /dist folder to see the icon files added in the step above.


Keep A Consumption-Based Function App Warm With A Scheduled Task

Azure Function Apps on a Consumption plan are deallocated after about 20 minutes without use, which means the next request pays a cold start penalty. If you’re serving an API from a Function app, you’ll want to keep it warm to keep performance consistent for users.

Something to note with this solution: it works well for low-traffic APIs where the goal is to serve an API from the Consumption plan at low cost. With larger traffic, you may be better off switching to a dedicated App Service plan, which avoids the cold start issue entirely; on the Consumption plan, cold starts will still occur when scaling out.

To follow this guide, I’ll assume you already have a Function app in place. Create a new timer-triggered function with the following:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

public static class KeepWarm
{
    [FunctionName("KeepWarm")]
    public static void Run([TimerTrigger("0 */15 * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
    }
}
That’s it! Once this is deployed, the timer trigger fires every 15 minutes, keeping an instance warm so your API no longer hits the cold start.


Adding a Scheduled Task to nopCommerce through the Database

When working in nopCommerce, you may want to create a scheduled task without going through a plugin install and uninstall cycle.

First, you’ll need the task class itself set up, ideally in a plugin.

You can do so by adding the following to the ScheduleTask table in the NOPCommerce database:

INSERT INTO [dbo].[ScheduleTask]
           ([Name]
           ,[Seconds]
           ,[Type]
           ,[Enabled]
           ,[StopOnError])
     VALUES
           ('TASK_NAME'
           ,SECONDS_BETWEEN_RUNS -- e.g. 3600 for hourly
           ,'NAMESPACE.ClassName, NAMESPACE'
           ,IS_ENABLED -- 1-yes, 0-no
           ,SHOULD_STOP_ON_ERROR) -- 1-yes, 0-no
After that’s done, you should be able to immediately run the task.


Adding Swagger UI Documentation to Azure Function APIs

You can set up Swagger UI in your Azure Function app pretty easily, giving you documentation for your serverless API.

Initial Configuration

First, add the AzureFunctions.Extensions.Swashbuckle library to your project via the <project>.csproj file:

  <ItemGroup>
    <PackageReference ... />
    <PackageReference Include="AzureFunctions.Extensions.Swashbuckle" Version="1.4.1" />
  </ItemGroup>

Next, set up the SwashBuckle startup code in SwashBuckleStartup.cs:

using System.Reflection;
using AzureFunctions.Extensions.Swashbuckle;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Hosting;

[assembly: WebJobsStartup(typeof(SwashBuckleStartup))]

internal class SwashBuckleStartup : IWebJobsStartup
{
    public void Configure(IWebJobsBuilder builder)
    {
        builder.AddSwashBuckle(Assembly.GetExecutingAssembly());
    }
}
Now create two HTTP triggers. First, the one serving the Swagger document:

[FunctionName("SwaggerJson")] // function name is an example
public static Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "json")]
        HttpRequestMessage req,
    ILogger log,
    [SwashBuckleClient] ISwashBuckleClient swashBuckleClient)
{
    return Task.FromResult(swashBuckleClient.CreateSwaggerDocumentResponse(req));
}

And the Swagger UI document:

[FunctionName("SwaggerUi")] // function name is an example
public static Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "ui")]
        HttpRequestMessage req,
    ILogger log,
    [SwashBuckleClient] ISwashBuckleClient swashBuckleClient)
{
    return Task.FromResult(swashBuckleClient.CreateSwaggerUIResponse(req, "json"));
}

Running this locally will provide two endpoints: /api/json (the raw Swagger document) and /api/ui (the Swagger UI page).

The final step for initial configuration is changing the documentation for the API page. Add the following to host.json:

{
  "version": "2.0",
  "extensions": {
    "Swashbuckle": {
      "Documents": [
        {
          "Title": "YOUR_TITLE",
          "Version": "v1",
          "Description": "YOUR_DESCRIPTION"
        }
      ]
    }
  }
}
Which will give you a Swagger page with your configured title, version, and description.
