Setting up a Configuration Page for Your NopCommerce 3.90 (or Below) Plugin

After writing your NopCommerce plugin, a common next step is to create a ‘Configure’ page that allows users to configure the plugin’s settings.

First, create a model that will represent the values to configure in the plugin – Models/YourPluginModel.cs

Next, create a controller Controllers/YourPluginController.cs, which will look something like this:

...
[AdminAuthorize]
public class YourPluginController : BasePluginController
{
    private readonly ILocalizationService _localizationService;

    public YourPluginController(ILocalizationService localizationService)
    {
        _localizationService = localizationService;
    }

    [ChildActionOnly]
    public ActionResult Configure()
    {
        // Populate the model (e.g. from your plugin's settings) before rendering the view
        var model = new YourPluginModel();

        return View("~/Plugins/Misc.YourPlugin/Views/YourPlugin/Configure.cshtml", model);
    }

    [HttpPost]
    [ChildActionOnly]
    public ActionResult Configure(YourPluginModel model)
    {
        if (!ModelState.IsValid)
        {
            return Configure();
        }

        // Persist the posted values (e.g. via a settings service) here

        SuccessNotification(_localizationService.GetResource("Admin.Plugins.Saved"));

        return Configure();
    }
}
...

Finally, create a view Views/YourPlugin/Configure.cshtml:

@* Adjust the @model namespace as needed for your plugin's Models folder *@
@model YourPluginModel
@using Nop.Web.Framework
@{
    Layout = "";
}

@using (Html.BeginForm())
{
    @Html.AntiForgeryToken()
    <div class="panel-group">
        <div class="panel panel-default">
            <div class="panel-body">
                <div class="form-group">
                    <div class="col-md-3">
                         
                    </div>
                    <div class="col-md-9">
                        <input type="submit" name="save" class="btn bg-blue" value="@T("Admin.Common.Save")" />
                    </div>
                </div>
            </div>
        </div>
    </div>
}

Make sure the newly created view’s ‘Copy to Output Directory’ property is set to ‘Copy if newer’.

Now when running your store, you should be able to enter the Configure page for your plugin, assuming the plugin is installed.

Setting up Jenkins to Run Angular Unit Tests

To run Angular unit tests on a Linux-based Jenkins instance, Karma needs a browser it can launch, so SSH into the Jenkins instance and install Google Chrome with the following commands:

wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
sudo dpkg -i google-chrome-stable_current_amd64.deb

When installing, you may run into a dependency issue; if so, run:

sudo apt-get install -f
sudo dpkg -i google-chrome-stable_current_amd64.deb

After Google Chrome is installed, you should be able to run npm test, which means Jenkins can run your unit tests as part of the CI process.
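
If your karma.conf.js is still configured to launch a regular Chrome window, the build agent has no display to open it on, so you may need to run the tests headlessly. A minimal sketch, assuming an Angular CLI project with the default Karma setup:

# Run the test suite once, headlessly (assumes the Angular CLI / Karma defaults)
ng test --watch=false --browsers=ChromeHeadless
# Or, through the npm script:
npm test -- --watch=false --browsers=ChromeHeadless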

Suggested Jenkins Plugins

Here’s a list of Jenkins plugins I tend to use frequently.

Global Slack Notification Plugin

Allows for sending messages via pipeline to Slack channels.

Azure AD Plugin

Allows for authentication to Jenkins using Azure Active Directory.

Checkmarx

If you are using Checkmarx to scan for vulnerabilities in your codebases, this plugin will allow for connecting to a Checkmarx server automatically to generate a report.

After installing, make sure to set up a server to allow for connection in Configure System -> Checkmarx.

If you need to add the Checkmarx step to the pipeline, this can be generated using the Pipeline Syntax feature.

NodeJS

This allows you to use npm steps in your pipelines, which is great for building JavaScript-based projects.

After installing, make sure to set up a NodeJS installation in Global Tool Configuration.

Build with Parameters Plugin

If you have a job that takes a series of parameters to run (such as a job that performs a deployment), this plugin will allow for programmatically filling in the parameters via a GET call.

Once installed, you can call any parameterized job like so:

https://server/job/$JOB/buildwithparam?PARAMS
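
As a hypothetical example following the URL pattern above, a job named deploy-app with ENV and VERSION parameters could be triggered from a script (the job name and parameters are placeholders; add whatever authentication your instance requires):

# Hypothetical job name and parameters; URL pattern taken from above
curl "https://server/job/deploy-app/buildwithparam?ENV=uat&VERSION=1.2.3"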

Windows Azure Storage Plugin

This plugin allows for programmatic interaction with different Azure storage accounts, allowing for both uploading and downloading.

After installing, you’ll need to set up credentials for use.

To use it in a pipeline, the syntax will look like this:

azureUpload blobProperties: [
        cacheControl: '',
        contentEncoding: '',
        contentLanguage: '',
        contentType: '',
        detectContentType: true
    ],
    containerName: 'ui-dist',
    fileShareName: '',
    filesPath: 'dist/**',
    storageCredentialId: 'YOUR-SP',
    storageType: 'blobstorage'

Active Choices Plugin

This allows for more dynamic behavior with parameters (selecting a value for one parameter determines the set of options available for another).

Setting this up in a pipeline takes some work; you’ll need to define the parameters using the following format:

properties([
    parameters([
        [$class: 'ChoiceParameter', 
            choiceType: 'PT_SINGLE_SELECT', 
            description: 'Select the context from the Dropdown List', 
            filterLength: 1, 
            filterable: true, 
            name: 'context', 
            randomName: 'choice-parameter-5631314439613978', 
            script: [
                $class: 'GroovyScript', 
                fallbackScript: [
                    classpath: [], 
                    sandbox: false, 
                    script: 
                        'return[\'Could not get Env\']'
                ], 
                script: [
                    classpath: [], 
                    sandbox: false, 
                    script: 
                        'return["uat","dev"]'
                ]
            ]
        ], 
        [$class: 'CascadeChoiceParameter', 
            choiceType: 'PT_SINGLE_SELECT', 
            description: 'Select the Server from the Dropdown List', 
            filterLength: 1, 
            filterable: true, 
            name: 'service', 
            randomName: 'choice-parameter-5631314456178619', 
            referencedParameters: 'context', 
            script: [
                $class: 'GroovyScript', 
                fallbackScript: [
                    classpath: [], 
                    sandbox: false, 
                    script: 
                        'return[\'Could not get Environment from context Param\']'
                ], 
                script: [
                    classpath: [], 
                    sandbox: false, 
                    script: 
                        ''' if (context.equals("uat")){
                                return["uat-1","uat-2"]
                            }
                            else if(context.equals("dev")){
                                return["dev-1","dev-2"]
                            }
                        '''
                ]
            ]
        ]
    ])
])

Copying Jenkins Jobs from Server to Server

In the case where you want to copy a collection of jobs from one Jenkins server to another, here’s a process you can use to make the migration. This guide assumes you have two remote Jenkins instances: a source to copy jobs from and a destination to copy them to.

SSH into the source server to determine where the Jenkins jobs directory is. On a typical installation this is /var/lib/jenkins/jobs.
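
If you’re not sure where the Jenkins home directory lives, a quick way to check (assuming a Debian/Ubuntu package install) is:

# JENKINS_HOME is defined in the service defaults on Debian/Ubuntu installs
grep JENKINS_HOME /etc/default/jenkins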

Use scp (on your local machine) to copy the jobs from the source Jenkins instance:

scp -r USER@SOURCE_DNS:/var/lib/jenkins/jobs C:\Users\d\Downloads\jobs

After everything is downloaded, upload the files to the new server (you will need to upload them to the home directory of the user, and then move them via SSH):

scp -r C:\Users\d\Downloads\jobs DEST_USER@DEST_DNS:~

After that completes, SSH into the server again and move the files:

sudo mv jobs /var/lib/jenkins

Also, change the ownership of the jobs directory to the Jenkins service user:

sudo chown -R jenkins:jenkins /var/lib/jenkins/jobs

After this is complete, go into Manage Jenkins and select ‘Reload Configuration from Disk’. Ensure that all of your previous jobs are in place.
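
If you prefer to script this last step, the same reload can be triggered with the Jenkins CLI. A sketch, assuming you download jenkins-cli.jar from your own instance and have a user API token:

# Download the CLI jar from your Jenkins instance
wget https://DEST_DNS/jnlpJars/jenkins-cli.jar
# Reload the job configuration from disk (equivalent to the UI option)
java -jar jenkins-cli.jar -s https://DEST_DNS/ -auth USER:API_TOKEN reload-configuration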

Setting up Jenkins with Azure AD Authentication

Configuring Azure AD

Run the following command in CLI to generate a service principal:

az ad sp create-for-rbac --name="{NAME}" --role="Contributor" --scopes="/subscriptions/{SUBSCRIPTION_ID}" --years=100

Save the output generated, as you’ll use it for configuration in Jenkins.

Set the Reply URL to https://YOURHOST/securityRealm/finishLogin

Set Required Permissions in Azure Active Directory to:

  • Application Permissions (Read Directory Data)
  • Delegated Permissions (Read Directory Data)

Click on ‘Grant permissions’.

If planning to use an Azure AD group for authorization, create one now.
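
The group can be created in the portal, or with the Azure CLI; a minimal sketch (the group name here is just an example):

# Create an Azure AD group to use for Jenkins authorization
az ad group create --display-name "Jenkins Users" --mail-nickname "JenkinsUsers"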

Configuring Jenkins

Download the ‘Azure AD’ plugin, and restart after installation.

Go to Manage Jenkins → Configure Global Security.

Select ‘Enable Security’ if it isn’t already selected.

Under ‘Security Realm’, select ‘Azure Active Directory’, and fill the information:

  • Client ID – ‘appId’
  • Client Secret – ‘password’
  • Tenant – ‘tenant’

Use the button to verify the application.

Set Authorization to ‘Azure Active Directory Matrix-based security’.

Set the group to the newly created Azure AD group, and assign the appropriate permissions.

Verify by logging out and logging back in as an Azure AD user.

Troubleshooting

If you accidentally lock yourself out after enabling Azure AD, do the following:

SSH into the server.

Modify the config.xml file:

sudo nano /var/lib/jenkins/config.xml

  • For the useSecurity element, change the value to ‘false’
  • Remove the authorizationStrategy and securityRealm sections.

Restart Jenkins:

sudo systemctl restart jenkins

Jenkins is now completely unprotected – so continue working on whatever security strategy you were working on.

Upgrading Ubuntu

To upgrade Ubuntu, use the following procedure:

Upgrade all of your current dependencies:

sudo apt update && sudo apt upgrade

Handle changed dependencies (installing or removing packages as needed) and clean up packages that are no longer required:

sudo apt dist-upgrade && sudo apt-get autoremove

Perform the OS upgrade:

sudo apt install update-manager-core && sudo do-release-upgrade

Setting up WordPress to serve next-gen WebP images

Why do this?

If you are looking to improve performance on your website, this will automatically convert the images on your site to the more efficient WebP format. This will help with Google Lighthouse scores (especially in solving the “Serve images in next-gen formats” issue).

Procedure

Install the WebP Express Plugin (and buy the developer a coffee!)

Go into the WebP Express settings, accept the defaults, and click ‘Save settings’.

Verify this is working by running Google’s Lighthouse analysis. You should no longer see the ‘Serve images in next-gen formats’ item in your listing.
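
If you’d rather check from the command line than from Chrome DevTools, the Lighthouse CLI can run the same audit; a sketch assuming Node.js is installed (replace the URL with your site):

# Run a performance audit and save an HTML report
npx lighthouse https://YOURSITE --only-categories=performance --output=html --output-path=./report.html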

Installing Dependencies

Depending on the server configuration you have, you may need to do more work after installing the WebP Express plugin. If the plugin reports missing server requirements, you’ll need to configure your server as follows.

SSH into the server and install the PHP GD extension using the following:

sudo apt-get install php7.2-gd

In addition, you may need to enable mod_headers:

sudo a2enmod headers
sudo systemctl restart apache2

Setting up Jenkins in Azure

Before getting started, you’ll need to have:

  • An Azure tenant and subscription.
  • OpenSSH (installation for Windows 10)

Installing Jenkins via Azure Marketplace

The easiest way to install Jenkins is to use the Azure Marketplace link. You’ll likely want to change the size of the VM to something smaller when testing things out – you can always increase the size later.

Next, SSH into the server and check to see if you can update the OS (as of this writing, the image ships with Ubuntu 16.04 LTS, and can be upgraded to 18.04 LTS).

Setting up SSL using Let’s Encrypt

The next step is setting up SSL using Let’s Encrypt to allow for an HTTPS connection. First, open the 443 port on the VM:

az network nsg rule update -g RG_NAME --nsg-name NSG_NAME -n http-rule --destination-port-ranges 80 443

Now SSH into the server and modify the Nginx configuration to handle SSL:

sudo nano /etc/nginx/sites-available/default

Use the following configuration:

server {
    listen 80 default_server;
    server_name _;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name CUSTOMDOMAIN;
    ssl_certificate /etc/letsencrypt/live/CUSTOMDOMAIN/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/CUSTOMDOMAIN/privkey.pem;
    location / {
        proxy_set_header        Host $host:$server_port;
        proxy_set_header        X-Real-IP $remote_addr;
        proxy_set_header        X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header        X-Forwarded-Proto $scheme;


        # Fix the "It appears that your reverse proxy set up is broken" error.
        proxy_pass          http://localhost:8080;
        proxy_read_timeout  90;
    }
}

Then run the following commands:

sudo service nginx stop
git clone https://github.com/letsencrypt/letsencrypt
./letsencrypt/letsencrypt-auto certonly
sudo service nginx restart

Accessing and Logging Into Jenkins

Once that completes, access the Jenkins instance at https://<DNS-NAME>.<LOCATION>.cloudapp.azure.com. Verify that the SSL connection is valid and that you are on the ‘Unlock Jenkins’ page.

Run the following command on the server (over SSH) to get the initial admin password for this screen:

sudo cat /var/lib/jenkins/secrets/initialAdminPassword

Next, you’ll be asked to either install the suggested plugins or select plugins yourself. I recommend going through and selecting only the plugins you need to keep the installation minimal. Remove anything from the list that you may not need (such as Subversion). You can always add plugins later if you find you need them.

After that, create an admin user for yourself, and you’ll be ready to get started!

Next Steps

After you’ve finished setting up Jenkins, a few next steps would be configuring Azure AD authentication and installing some of the plugins suggested earlier in this guide.

Creating a Function App With a Full CI/CD Pipeline with VSCode and Jenkins

Before starting this, you’ll need to have a few things on your machine:

  • Azure Functions VSCode Extension
  • Azure Functions Core Tools (choco install azure-functions-core-tools)
  • .NET Core Build Tools (choco install visualstudio2017-workload-netcorebuildtools)
  • An Azure subscription
  • A Function app created inside of the Azure subscription
  • A Jenkins server with the following set up:
    • Azure Function plugin installed
    • A service principal configured (use az ad sp create-for-rbac -n "jenkins" --role contributor --scopes /subscriptions/{SubID} and then add to Jenkins Credentials)

Creating and Locally Running the Function App

Inside VSCode, create a function app project with the following:

  • Create a new folder for use
  • Language: C#
  • Template: HttpTrigger
  • Security: Anonymous

After creating, you’ll need to resolve a few dependencies, which VSCode should prompt for.

I ran into issues with getting the function app debugging locally, but I was able to run it without issue by running the following in PowerShell:

dotnet clean
dotnet build
func host start

Once you get it running, you can use the URL provided in the console output to test.
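
For example, with the local host running, the HTTP trigger can be hit with curl (the function name below is just a placeholder; use the route shown in the func host start output):

# Call the locally running HTTP trigger (hypothetical function name)
curl "http://localhost:7071/api/HttpTriggerCSharp?name=Jenkins"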

Locally Running a TimerTrigger Function

To use a TimerTrigger function, there are just a few changes:

  • You will need to have a Storage Account available to allow for running locally.
  • When running, you can invoke the function by making a POST to http://localhost:7071/admin/functions/{FUNCTION_NAME} (see the example below).
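
For example (the function name is a placeholder; no key is required when running locally):

# Manually invoke a locally running TimerTrigger function (hypothetical name)
curl -X POST "http://localhost:7071/admin/functions/MyTimerFunction" -H "Content-Type: application/json" -d "{}"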

Adding to Git and Deploying via Jenkins

The next step is checking in the example code to Git, so you have a place to get the codebase from for deployment.

After checking in the codebase, create a Multibranch Pipeline project in Jenkins.

Use the following Jenkinsfile as a reference:

pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        sh 'dotnet clean'
        sh 'dotnet build'
      }
    }
    stage('Deploy to Function App') {
      when { branch 'master' }
      steps {
        azureFunctionAppPublish appName: "fa-poc-123",
          azureCredentialsId: 'jenkins-sp',
          resourceGroup: "fa-poc-ue-rg",
          sourceDirectory: '',
          targetDirectory: '',
          filePath: ''
      }
    }
  }
}

After that, push your changes and you should have a deployed Function App running the code provided. Verify it using the Function App URL.
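
Since the sample function uses anonymous authorization, a quick curl against the app name from the Jenkinsfile above should return a response (the function name is a placeholder):

# Hit the deployed HTTP trigger (app name taken from the Jenkinsfile; function name hypothetical)
curl "https://fa-poc-123.azurewebsites.net/api/HttpTriggerCSharp?name=Jenkins"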

Copying a Database in Azure with Always Encrypted Data

When trying to copy a database with Always Encrypted data (say, to a different environment), you’ll generally want to rotate the Column Master Key so that it references the Key Vault in the destination environment’s resource group. This takes a little bit of work to do.

Create a new key in the Azure Key Vault:

az keyvault key create --name Always-Encrypted-Auto1 --vault-name VAULT_NAME

Next, create a new Column Master Key, using the newly created key above.

Next, create a Column Encryption Key using the newly created Column Master Key above.

Generate the CREATE script for the new Column Encryption Key and copy its ENCRYPTED_VALUE.

Run the following query with the copied ENCRYPTED_VALUE to alter the current Column Encryption Key:

ALTER COLUMN ENCRYPTION KEY CEK_Auto1
ADD VALUE 
( 
    COLUMN_MASTER_KEY = [CMK_Auto2],
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = VALUE_FROM_ABOVE
); 
GO
 
ALTER COLUMN ENCRYPTION KEY CEK_Auto1
DROP VALUE 
( 
    COLUMN_MASTER_KEY = CMK_Auto1 
); 
GO

Clean Up

To clean up, delete the newly created (temporary) Column Encryption Key and the old Column Master Key.