Automatically Clean Artifactory Artifacts

If you’re running into issues with artifacts taking up too much space on your Artifactory server, this will set you up with a programmatic way to clean old artifacts once they’re no longer required.

Install ArtifactCleanup Plugin

SSH into the Artifactory server, go into the $JFROG_HOME/etc/plugins directory and run the following to download the Artifact Cleanup plugin:
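The download command itself isn’t shown here; as a hedged sketch, assuming the plugin files live in JFrog’s artifactory-user-plugins repository on GitHub (check the repo for the current paths):

```shell
# Fetch the cleanup plugin script and its config file from GitHub
curl -O "https://raw.githubusercontent.com/jfrog/artifactory-user-plugins/master/cleanup/artifactCleanup/artifactCleanup.groovy"
curl -O "https://raw.githubusercontent.com/jfrog/artifactory-user-plugins/master/cleanup/artifactCleanup/artifactCleanup.json"
```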


Make changes to the artifactCleanup.json file based on desired behavior.

To enable logging, add the following to $ARTIFACTORY_HOME/etc/logback.xml:

<logger name="artifactCleanup">
    <level value="info"/>
</logger>

Restart the server to install the plugin and get it set up for processing.

Invoking Manually

Read more information on executing cleanup manually on the GitHub page. It’s worth running a dry run first and making sure the functionality works as intended before letting it loose.
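As a hedged sketch of a dry run (host, repository name, and retention parameters are hypothetical; see the plugin’s README for the full parameter list), cleanup can be invoked through Artifactory’s plugin execution REST API:

```shell
# dryRun=true previews what would be deleted without removing anything
curl -X POST -u admin \
  "http://localhost:8081/artifactory/api/plugins/execute/cleanup?params=timeUnit=month;timeInterval=6;repos=my-local-repo;dryRun=true"
```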

After running, you can check the logs $ARTIFACTORY_HOME/logs/artifactory.log to find the results of cleanup.

Remember that once deleted using the plugin, the artifacts will remain in the Trash Can if it is enabled on your instance.

Access Artifactory Securely with Kubernetes

To allow Kubernetes to download images from a secured Artifactory instance (for example, an instance that doesn’t allow anonymous access):

Artifactory User Setup

  1. Create a user in Artifactory to serve as the service principal.
  2. Log in as this user and create an API key.

Docker Credentials

With the API token, log into the Docker instance (username is service account username, password is API key):

docker login REPOSITORY_PATH

Once logged in successfully, open the generated ~/.docker/config.json file to get the auth token to use.
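The auth value Docker writes to config.json is just the base64 encoding of username:password; a quick sketch with hypothetical credentials:

```shell
# Hypothetical service-account credentials -- substitute your own
user="artifactory-svc"
apikey="APIKEY"
# Docker stores base64("username:password") as the "auth" field in config.json
auth=$(printf '%s:%s' "$user" "$apikey" | base64)
echo "$auth"
```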

Kubernetes Token Creation

Create a Kubernetes secret from the Docker config file generated above:

kubectl create secret generic artifactory-authtoken \
    --from-file=.dockerconfigjson=<path/to/.docker/config.json> \
    --type=kubernetes.io/dockerconfigjson

Kubernetes Deployment Reference

Finally, reference the auth token in your Kubernetes deployments:

spec:
  containers:
    # Container name and image are placeholders
    - name: my-app
      image: REPOSITORY_PATH/my-app:latest
  imagePullSecrets:
    - name: artifactory-authtoken


If you’re having issues, make sure the server name referenced in the deployment’s image matches the server name you logged in with (including port numbers).


Restricting Access to App Services and Function Apps

Restrict By Function

Restriction by function works well since it gives you granular control over individual functions. A typical use case is handing a function key to other applications, which then use that key when calling the function.

There are three levels to restrict a function:

  • Anonymous – no restrictions
  • Function – restricted by an individual function key
  • Admin – uses the admin key defined in the function app

When calling the function, you can pass the key in one of two ways: as a code query string parameter, or in the x-functions-key header.
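A hedged sketch of both call styles (the function app URL, function name, and key are placeholders):

```shell
# Key passed as the "code" query string parameter
curl "https://FUNCTION_APP.azurewebsites.net/api/MyFunction?code=FUNCTION_KEY"

# Equivalent call with the key in the x-functions-key header
curl -H "x-functions-key: FUNCTION_KEY" \
  "https://FUNCTION_APP.azurewebsites.net/api/MyFunction"
```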

Restrict by Application

Restrict By IP

For Azure App Services and Function Apps, you can restrict access to them based on IP address – useful for building internal applications with limited access.

First, click Networking on the left sidebar, then open the Access Restrictions page.

From here, you can add and remove IP addresses for both the App service/Function App and the SCM page. If no IP addresses are listed, then access is open for all.

Requests from IP addresses outside the allowed list will receive a 403:

If you need a list of the IPs trying to access the application, you can access:

  • App Service: Diagnose and solve problems -> Availability and Performance -> HTTP 4xx Errors
  • Function App: Diagnose and solve problems -> HTTP 4xx Errors -> HTTP 4xx Errors

Once here, scroll to the bottom of the page and expand Which client IPs got rejected due to IP restriction?:

App Service Authentication

App Service Authentication works well for an application that is accessed through a browser (so not an API). Note that with authentication enabled, you’ll need to log in before accessing the app at all.

First, turn on App Service Authentication:

When configuring the identity provider, Azure Active Directory allows access for users inside the tenant, which is useful for an internal application.

Connecting An Azure Function App to Loggly

Doing the following will get a Function App hooked up to Loggly.

  1. Create an Event Hub namespace, and an Event Hub.
  2. Create a Function App, and an Event Hub function.
  3. Create a Loggly HTTP/S Event Endpoint, and get the URL endpoint.
  4. Update the Event Hub trigger with this code (use test to verify)
  5. Create a HTTP Trigger function.
  6. In the Function App’s diagnostic settings, turn on FunctionAppLogs and stream to the Event Hub.
  7. Call the HTTP trigger endpoint and verify results in Loggly.
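The first two steps can be scripted with the Azure CLI; a hedged sketch (resource group, names, and location are hypothetical):

```shell
# Hypothetical resource names -- adjust to your environment
RG=my-resource-group
NS=my-eventhub-ns
HUB=loggly-hub

az eventhubs namespace create --resource-group $RG --name $NS --location eastus
az eventhubs eventhub create --resource-group $RG --namespace-name $NS --name $HUB
```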


Creating an Angular and Azure Function API App with Azure Static Web App

Recently, Azure released Azure Static Web Apps, which looks like a way to host static web sites easily. Some of the perks I see immediately are:

  • Works well with SPA technologies (Angular, React, Vue)
  • Serve an API using Azure Functions
  • Automatic integration with GitHub and GitHub Actions to deploy immediately
  • Currently costs nothing (while this is in preview)


To get started, you’ll need:

  • Angular CLI
  • Azure Functions Core Tools
  • An Azure account

Create a GitHub Repo with Angular and Azure Function Apps

First, create a repo in your GitHub account, and clone that repo to your local PC.

Now create an Angular app with the CLI:

ng new NAME --directory app

Next, create an Azure Functions API (currently, there is a limitation that only allows for JavaScript as the runtime):

func init NAME --javascript
mv NAME api
cd api
func new --name TestFunction --language javascript --template HttpTrigger

Commit the changes made above, and then let’s move onto creating the Azure Static Web App.

Creating Azure Static Web App

Next, create an Azure Static Web App in your Azure account. When doing this, do the following:

  • Sign in to your GitHub account and select the correct repository and branch.
  • For build details, use the following information (where APPNAME is the name given to the Angular app above):
    • App location: app
    • Api location: api
    • App artifact location: dist/APPNAME
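These values end up in the workflow file that Azure generates for the repo; a sketch of the relevant step (the action version and exact input names may differ from what Azure generates for you):

```yaml
- uses: Azure/static-web-apps-deploy@v0.0.1-preview
  with:
    azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN }}
    repo_token: ${{ secrets.GITHUB_TOKEN }}
    action: "upload"
    app_location: "app"
    api_location: "api"
    app_artifact_location: "dist/APPNAME"
```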

Automatic Deployment

After creating the Static Web App, a GitHub Actions workflow file will be created and committed to your repo. In turn, your skeleton application should be built and deployed automatically.

With the deployment completed, you can view the deployed UI and API by checking the URL of the Static Web App in the Azure portal:

  • UI – check the URL provided.
  • API – check the URL, plus /api/TestFunction

Setting a Custom Domain

To set up a custom domain, access the ‘Custom Domains’ option on the sidebar. You’ll create a CNAME record as requested for the domain being used.

In terms of SSL certificates, you don’t have to worry: Azure Static Web Apps handles this completely for you. However, I was only able to create a www subdomain (as opposed to a root domain).

Further Reading

Microsoft’s Guide on Static Web Apps:

Adding Settings to a Plugin in NopCommerce (pre-4.00)

Once you’ve created a plugin for NopCommerce, you’ll likely want to add the ability to configure settings inside the plugin for reference later.

When adding this capability, we’re going to make the settings object as immutable as possible, following functional programming as best we can, just because it makes things a little cleaner and keeps all of the conversion between the configuration model and the settings object in one place.

Locale Keys

First, we’ll set up a class called MyPluginLocaleKeys.cs that holds the different locale keys used to represent the settings descriptions in the Configure page:

public static class MyPluginLocaleKeys
{
    public const string MySetting1 = "Plugins.Misc.MyPlugin.Fields.Settings1";
    public const string MySetting2 = "Plugins.Misc.MyPlugin.Fields.Settings2";
}

Configuration Model

Next, we’ll create a configuration model named Models/MyPluginConfigModel.cs that represents the configure page for the plugin. This will get populated by the plugin’s settings, and then be responsible for receiving user data and saving the setting values provided.

In addition to usually having a 1:1 relationship to the plugin settings, you can also provide DataAnnotations that handle model validity and display names. When providing a NopResourceDisplayName, we’ll use the locale key provided above.

public class MyPluginConfigModel
{
    [NopResourceDisplayName(MyPluginLocaleKeys.MySetting1)]
    public string Setting1 { get; set; }

    [NopResourceDisplayName(MyPluginLocaleKeys.MySetting2)]
    public string Setting2 { get; set; }
}

Settings Object

Next, create an ISettings implementation in the root of the plugin named MyPluginSettings.cs.

This is going to work a little differently – we’re going to make this immutable, so we’ll initialize it from a config model, create a “default” settings object, and allow for easily converting back to a config model.

public class MyPluginSettings : ISettings
{
    // Settings properties
    public string Setting1 { get; private set; }
    public string Setting2 { get; private set; }

    // Initializes from a config model (used for saving values)
    public static MyPluginSettings FromModel(MyPluginConfigModel model)
    {
        return new MyPluginSettings()
        {
            Setting1 = model.Setting1,
            Setting2 = model.Setting2
        };
    }

    // Creates a "default" settings object that is used when installing the plugin
    public static MyPluginSettings Default()
    {
        return new MyPluginSettings()
        {
            Setting1 = "Default setting",
            Setting2 = "Another default setting"
        };
    }

    // Creates a config model to use for the configuration page
    public MyPluginConfigModel ToModel()
    {
        return new MyPluginConfigModel()
        {
            Setting1 = Setting1,
            Setting2 = Setting2
        };
    }
}

We use private set here to make sure we initialize the settings object from a configuration model (when saving) and disallow the ability to change the settings object – making it immutable.

Base Plugin Class

Next, create the base plugin class (named MyPluginPlugin.cs, so for example EnhancedLoggingPlugin.cs), used for defining the type of plugin and the install/uninstall behavior. We’ll do a few special things here:

public class MyPluginPlugin : BasePlugin, YOUR_PLUGIN_INTERFACE
{
    private readonly ISettingService _settingService;

    public MyPluginPlugin(ISettingService settingService)
    {
        _settingService = settingService;
    }

    public override void Install()
    {
        // Saves the desired default values for settings
        _settingService.SaveSetting(MyPluginSettings.Default());

        // Adds all locale resources using reflection
        foreach (var field in typeof(MyPluginLocaleKeys).GetFields())
        {
            string key = (string)field.GetValue(null);
            // Converts the field name above to a value (capital letters separated by spaces)
            string value = Regex.Replace(field.Name, "([a-z])([A-Z])", "$1 $2");
            value = Regex.Replace(value, "([A-Z])([A-Z][a-z])", "$1 $2");

            this.AddOrUpdatePluginLocaleResource(key, value);
        }

        base.Install();
    }

    public override void Uninstall()
    {
        // Delete the plugin settings from the DB
        _settingService.DeleteSetting<MyPluginSettings>();

        // Delete all plugin locales using reflection
        foreach (var field in typeof(MyPluginLocaleKeys).GetFields())
        {
            this.DeletePluginLocaleResource((string)field.GetValue(null));
        }

        base.Uninstall();
    }
}

Configure Controller and View

The final step is creating both the Configuration controller and view to finish the ability to view and save plugin settings data. Create the Controllers/MyPluginConfigController.cs file:

public class MyPluginConfigController : BasePluginController
{
    private readonly MyPluginSettings _settings;
    private readonly ISettingService _settingService;
    private readonly ILocalizationService _localizationService;

    public MyPluginConfigController(
        MyPluginSettings settings,
        ISettingService settingService,
        ILocalizationService localizationService)
    {
        _settings = settings;
        _settingService = settingService;
        _localizationService = localizationService;
    }

    public ActionResult Configure()
    {
        return View("~/Plugins/PLUGIN_EXPORT/Views/Configure.cshtml", _settings.ToModel());
    }

    [HttpPost]
    public ActionResult Configure(MyPluginConfigModel model)
    {
        if (!ModelState.IsValid)
            return Configure();

        // Convert the model to an immutable settings object and save it
        _settingService.SaveSetting(MyPluginSettings.FromModel(model));

        return Configure();
    }
}

Then create the Views/Configure.cshtml file:

@model MyPluginConfigModel
@using Nop.Web.Framework;
@{
    Layout = "";
}
@using (Html.BeginForm())
{
    <div class="panel-group">
        <div class="panel panel-default">
            <div class="panel-body">
                <div class="form-group">
                    <div class="col-md-3">
                        @Html.NopLabelFor(model => model.Setting1)
                    </div>
                    <div class="col-md-9">
                        @Html.NopEditorFor(model => model.Setting1)
                        @Html.ValidationMessageFor(model => model.Setting1)
                    </div>
                </div>
                <div class="form-group">
                    <div class="col-md-3">
                        @Html.NopLabelFor(model => model.Setting2)
                    </div>
                    <div class="col-md-9">
                        @Html.NopEditorFor(model => model.Setting2)
                        @Html.ValidationMessageFor(model => model.Setting2)
                    </div>
                </div>
                <div class="form-group">
                    <div class="col-md-3">
                    </div>
                    <div class="col-md-9">
                        <input type="submit" name="save" class="btn bg-blue" value="@T("Admin.Common.Save")" />
                    </div>
                </div>
            </div>
        </div>
    </div>
}

Recording HTTP Request Body with Java, Spring Boot and Application Insights

Building off of my previous post about integrating App Insights into Spring Boot, I also wanted to record the request body in each trace sent to Azure. This is especially useful when looking up failures, since you’ll be able to see the request body used that caused the failure.

Important Note Regarding Privacy

Before getting started, something to consider is the issue of privacy – by activating this, you’ll be storing request body information into Azure, which can be an issue if you’re dealing with sensitive information.

If that’s the case, be sure to process the body extracted by this implementation and remove the sensitive information in the payload before adding it to the request telemetry.

Bypassing the HttpServletRequest issue

Java servlets do not allow reading the request body multiple times – if you call getReader() more than once, you’ll get an IllegalStateException. To fix this, we’ll create a custom implementation of HttpServletRequest that caches the request body, allowing us to read it ourselves and then pass the request further down the filter chain.

Create the CachedBodyHttpServletRequest class:

package com.example.demo;

import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.stream.Collectors;

import javax.servlet.ServletInputStream;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;

import org.springframework.util.StreamUtils;

public class CachedBodyHttpServletRequest extends HttpServletRequestWrapper {
    private byte[] cachedBody;

    public CachedBodyHttpServletRequest(HttpServletRequest request) throws IOException {
        super(request);
        InputStream requestInputStream = request.getInputStream();
        this.cachedBody = StreamUtils.copyToByteArray(requestInputStream);
    }

    @Override
    public ServletInputStream getInputStream() throws IOException {
        return new CachedBodyServletInputStream(this.cachedBody);
    }

    @Override
    public BufferedReader getReader() throws IOException {
        // Create a reader from the cached content and return it
        ByteArrayInputStream byteArrayInputStream = new ByteArrayInputStream(this.cachedBody);
        return new BufferedReader(new InputStreamReader(byteArrayInputStream));
    }

    public String getBody() throws IOException {
        return getReader().lines().collect(Collectors.joining(System.lineSeparator()));
    }
}

Next, create the CachedBodyServletInputStream class:

package com.example.demo;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

import javax.servlet.ReadListener;
import javax.servlet.ServletInputStream;

public class CachedBodyServletInputStream extends ServletInputStream {
    private InputStream cachedBodyInputStream;

    public CachedBodyServletInputStream(byte[] cachedBody) {
        this.cachedBodyInputStream = new ByteArrayInputStream(cachedBody);
    }

    @Override
    public boolean isFinished() {
        try {
            return cachedBodyInputStream.available() == 0;
        } catch (IOException e) {
            // Treat a failed availability check as "not finished"
            return false;
        }
    }

    @Override
    public boolean isReady() {
        return true;
    }

    @Override
    public void setReadListener(ReadListener readListener) {
        throw new UnsupportedOperationException();
    }

    @Override
    public int read() throws IOException {
        return cachedBodyInputStream.read();
    }
}
Adding CachedBodyHttpServletRequest to Spring Boot Filter

To use this, you’ll create a filter that runs before a request is processed and adds the request body to the request telemetry for POST and PUT methods.

Create the CachedHttpServletRequestFilter class:

package com.example.demo;

import java.io.IOException;

import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.http.HttpMethod;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.microsoft.applicationinsights.telemetry.RequestTelemetry;
import com.microsoft.applicationinsights.web.internal.RequestTelemetryContext;
import com.microsoft.applicationinsights.web.internal.ThreadContext;

@Component
@Order(value = Ordered.HIGHEST_PRECEDENCE)
@WebFilter(filterName = "ContentCachingFilter", urlPatterns = "/*")
public class CachedHttpServletRequestFilter extends OncePerRequestFilter {
    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain) throws ServletException, IOException {
        // Skip all processing if not a POST or PUT request to improve efficiency
        if (!isPost(request) && !isPut(request)) {
            filterChain.doFilter(request, response);
            return;
        }

        RequestTelemetryContext context = ThreadContext.getRequestTelemetryContext();
        RequestTelemetry requestTelemetry = context.getHttpRequestTelemetry();

        CachedBodyHttpServletRequest cachedRequest = new CachedBodyHttpServletRequest(request);
        String body = cachedRequest.getBody();
        requestTelemetry.getProperties().put("Request Body", body);

        filterChain.doFilter(cachedRequest, response);
    }

    private boolean isPost(HttpServletRequest request) {
        return request.getMethod().equalsIgnoreCase(HttpMethod.POST.toString());
    }

    private boolean isPut(HttpServletRequest request) {
        return request.getMethod().equalsIgnoreCase(HttpMethod.PUT.toString());
    }
}

To verify, start running the application and make a few calls. When observing the requests in App Insights, you should be able to see the recorded Request Body on all POST and PUT calls:
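For example, assuming the app is running locally on port 8080 with a POST endpoint at the root path:

```shell
curl -X POST -H "Content-Type: application/json" \
  -d '{"name": "test"}' http://localhost:8080/
```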

Integrate Application Insights into a Spring Boot Application

To get started, we’ll set up a basic Spring Boot application, and then add Application Insights in the next step.

Creating a Skeleton Spring Boot App

To get started, go to Spring Initializr and create an app with the following selected:

  • Create a Gradle project
  • Add the Spring Web dependency

Once that’s done, extract the archive file given and open in your Java IDE of choice (IntelliJ, for example).

Add the following controller in ApiController.java to add an API endpoint:

package com.example.demo;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ApiController {

    @GetMapping("/")
    public String doGet() {
        return "Hello World!";
    }
}
Start the application using the clean, build, and bootRun Gradle tasks. You can verify the endpoint is working with a tool such as Insomnia by calling http://localhost:8080 and seeing “Hello World!” displayed.
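The same steps from the command line, assuming the Gradle wrapper generated by Spring Initializr:

```shell
./gradlew clean build
./gradlew bootRun
```

Once the app is up, a curl against http://localhost:8080 should return the Hello World response.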

Once you have this working, you’re ready to start integrating Application Insights to provide analytics.

Integrating Application Insights

First, add the Application Insights dependencies to your build.gradle file:

dependencies {
    ...  // other dependencies
    compile group: 'com.microsoft.azure', name: 'applicationinsights-web-auto', version: '2.5.0'
}

Then add the src/main/resources/ApplicationInsights.xml file:

<?xml version="1.0" encoding="utf-8"?>
<ApplicationInsights xmlns="http://schemas.microsoft.com/ApplicationInsights/2013/Settings" schemaVersion="2014-05-30">
   <!-- HTTP request component (not required for bare API) -->
   <TelemetryModules>
      <Add type="com.microsoft.applicationinsights.web.extensibility.modules.WebRequestTrackingTelemetryModule"/>
      <Add type="com.microsoft.applicationinsights.web.extensibility.modules.WebSessionTrackingTelemetryModule"/>
      <Add type="com.microsoft.applicationinsights.web.extensibility.modules.WebUserTrackingTelemetryModule"/>
   </TelemetryModules>

   <!-- Events correlation (not required for bare API) -->
   <!-- These initializers add context data to each event -->
   <TelemetryInitializers>
      <Add type="com.microsoft.applicationinsights.web.extensibility.initializers.WebOperationIdTelemetryInitializer"/>
      <Add type="com.microsoft.applicationinsights.web.extensibility.initializers.WebOperationNameTelemetryInitializer"/>
      <Add type="com.microsoft.applicationinsights.web.extensibility.initializers.WebSessionTelemetryInitializer"/>
      <Add type="com.microsoft.applicationinsights.web.extensibility.initializers.WebUserTelemetryInitializer"/>
      <Add type="com.microsoft.applicationinsights.web.extensibility.initializers.WebUserAgentTelemetryInitializer"/>
   </TelemetryInitializers>
</ApplicationInsights>

Finally, add a section at startup that reads the App Insights Instrumentation Key and applies it, linking the application to the resource it sends data to. A quick note on this – you can add the key to the .xml file, but I prefer to provide it as an environment variable, so it can be changed across different environments when deployed.

Add the following method to the file:

private static final Logger log = LoggerFactory.getLogger(DemoApplication.class);

// Runs once after the application context starts
@PostConstruct
private void init() {
	String appInsightsKey = System.getenv("AppInsightsKey");
	if (appInsightsKey == null) {
		log.warn("App Insights Key not provided, no analytics will report.");
		return;
	}

	TelemetryConfiguration.getActive().setInstrumentationKey(appInsightsKey);
}


Verifying in App Insights

With the changes made in place, the last step is verifying everything is in place. To start the application with App Insights enabled:

  • Add the App Insights Instrumentation Key to an environment variable called AppInsightsKey.
  • Start the application.
  • Call the endpoint at http://localhost:8080
  • View the results at the App Insights screen.

Further Reading

Developing with nopCommerce using VSCode and Linux


To get started, you’ll need to set up:

  • VSCode with the C# and vscode-solution-explorer extensions installed
  • SQL Server

Download and Build nopCommerce Source Code

To get started, get a copy of the nopCommerce source code from their GitHub page.

After downloading the source code, open the /src folder using VSCode.

For cleaning and building the project, you have two choices:

  1. Use dotnet to run clean and build while in the src/ directory.
  2. Using the solution viewer plugin, you can clean and build the project:

After cleaning and building, you can run using either:

  1. dotnet run in the src/Presentation/Nop.Web directory.
  2. Using the VSCode solution explorer.
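Using the dotnet CLI, the clean/build/run steps above look like:

```shell
# From the repository's src/ directory: clean and build the solution
dotnet clean
dotnet build

# Then run the web project
cd Presentation/Nop.Web
dotnet run
```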

Once it’s running, you’ll be able to access nopCommerce in your browser:

With this, you can run nopCommerce locally and perform all tasks related to administering it, including running the installation.