Doing A Search & Replace Using SQL Server

I had a long weekend of work this week, so this post is going to be short (I didn’t have enough time to flesh something out – I’ll be better next week!).

I was working with a client and found I needed a way to do a search and replace on a SQL database.  In particular, I was changing a URL over from an old domain to a new domain.

Check the link below to see an easy way to apply it to your own database.
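For reference, the core of the approach is SQL Server’s REPLACE() function.  Here’s a minimal sketch – the table and column names below are placeholders, so adjust them to match your own schema:

```sql
-- Swap an old domain for a new one in a text column.
-- Table/column names are placeholders – adjust to your schema.
UPDATE SiteContent
SET Body = REPLACE(Body, 'http://old-domain.com', 'https://new-domain.com')
WHERE Body LIKE '%http://old-domain.com%';
```

The WHERE clause isn’t strictly required, but it keeps the update from touching rows that don’t contain the old URL.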

Setting up Continuous Integration and Deployment with WordPress using DeployHQ

To get this blog started, I wanted to set it up correctly and use a good Continuous Integration and Delivery workflow.  To do this, I’ve found that DeployHQ does a good job of simplifying the process.  Simply put, DeployHQ monitors a source control repository and deploys changes to servers that you signify.

By automating this process, you’ll no longer need to worry about issues that can come from deployment, and simplify the steps for getting your code into a production environment.  In addition, you can implement build steps, allowing you to do things such as compile code, run unit tests, add configuration files, and more.

This process is going to specifically focus on migrating a series of files, so it works well with WordPress.  In a later post in this series, we will learn how to use build steps for applications that require a build beforehand, such as a React or an Angular application.

If you’re developing using a local WordPress environment (and you should!), this will allow for the changes being made in your repository to automatically be transferred to the server.  No more needing to connect to the server and make sure your files are in sync!

Before you get started, you’ll need the following:

  • You’ll need to have your WordPress source code checked into a repository – the easiest to get started with is Github, but you can also use Bitbucket, Gitlab, or use a private Git/Subversion/Mercurial server.
  • You’ll need somewhere to host your website.  I personally use NearlyFreeSpeech.NET.  You can use any type of hosting that allows for an FTP connection.
  • You’ll need a DeployHQ account (you can get a 10-day free trial, and then you can use the free account indefinitely for a single project.  For about $8 a month, you get 10 projects you can deploy).

Step 1: Creating a New Project

After creating a DeployHQ account, the first thing to do is log in to the DeployHQ website and get to the dashboard:

From here, you can add your first project, connect to your repository, and specify the servers to deploy the code to.  First, you’ll set up a name for the project and specify where the source code is located (I’m using Github here, but you can choose from a few others such as Gitlab, Bitbucket, or your own Git/Subversion/Mercurial server):

If you are using a service such as Github, you will need to log in and approve DeployHQ to access the available repositories.  Once that is done, you should see a list of repositories you can select from:

Step 2: Creating a Server

The next step is to create a server for the project to deploy your code base to.  You will need at least one server (for your Production environment), but may want to create multiples based on the different environments that your project uses.

You’ll also be able to select the default branch that the project will deploy from, alongside setting up notification settings.  Personally, I suggest changing to allow for receiving a notification after all deployments, whether they are successful or not.

If you are using Github to host your repository, you can add the webhook created to the repository afterwards to be able to kick off automatic deployments, fulfilling the continuous delivery capability.

Step 3: Creating Configuration Files

Next, you’ll want to create a couple of configuration files alongside the deployment so that the application is configured correctly:

  1. Set permalinks to work with an .htaccess file.
  2. Create a wp-config.php file for WordPress configuration (one for each environment).

If you are planning to use post names for permalinks, you can use the following for .htaccess:

RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]

and you can use the generate wp-config tool to create your wp-config.php file.

Step 4: Deployment

Once all of the steps above are complete, you are ready to deploy your web site!  Just click on the ‘New Deployment’ button on the top right, and you’ll be brought to this screen:

You’ll be able to change a few things on this screen if desired, such as selecting which server to deploy to or scheduling the deployment for a later time (useful if you don’t want these changes to take effect until off-peak traffic hours).

You can click on ‘Preview’ to check which files are going to be transferred over (good to make sure you’ve set which revisions to deploy correctly), and you can click ‘Deploy’ to actually start the deployment.  Once you start a deployment, you’ll be brought to this screen:

This shows the actual process of the deployment – once this screen shows that it’s finished, you’ve completed your first deployment!

Ripping a DVD To MP4 Using Handbrake

Disclaimer – my quick research on this tells me the legality of making copies of your digital media is a gray area.  The rule of thumb is that you are allowed to make a backup copy of physical media that you own, but you assume any risks that may come from copying your physical media to video files.

My wife watches a series of DVDs she uses for her fitness, and I wanted to get them off of the DVDs and onto her computer, so she doesn’t have to keep lugging around an external DVD player and the actual files.

There are a few options I’ve found in ripping DVDs to your computer for later use, such as:

  • Handbrake – an open-source video transcoder that can rip and convert all kinds of media.  It’s a little complex to use and requires some setup to get in place.
  • WinX DVD Ripper Platinum is another application I’ve found that you can use for a quick download.  To rip a full DVD, you’ll have to purchase the full version (around $39.95).

I decided to go through and work on configuring Handbrake just to learn a bit more about the process and have something I can come back to anytime I need to rip a movie or other data from a DVD.

First, download and install HandBrake.  After installing HandBrake, you’ll want to install libdvdcss (32bit, 64bit) in order to bypass any copy protection the DVD may have.  Without this file, you may end up with a video that comes back scrambled and unviewable.

Once you download the libdvdcss file, you’ll move it to the HandBrake installation location (usually C:\Program Files\HandBrake).  After this is done, start HandBrake and you’ll see the following:

Assuming you already have your DVD in your laptop (or connected via an external drive), you can click on the DVD on the left side and allow it to scan the source.  This may take 15-20 minutes to load everything into the application.

Once everything is loaded, you’ll just need to do the following:

  • Select the title that matches the length of your video (usually one of the first few titles).
  • Choose a filename and a destination for the DVD rip output (the movie).

Once this is done, click on ‘Start Encode’, and allow the DVD to start ripping.  At the bottom left, you should see the encode process begin.  Once that’s all set, check the video and see if the encode worked!

Remember, if you end up with something that is scrambled, make sure you set up the libdvdcss file in your HandBrake directory correctly.



Fixing the scrambled video issue on YouTube

2018.03.31 Training

Power Snatch & OHS (1+2)

  • 50 x 3
  • 60 x 3
  • 65 x 3
  • 70 x 3
  • 75 x 3
  • 80 x 3 x 3

Power Clean & Push Press (1+3)

  • 70 x 4
  • 75 x 4
  • 80 x 4
  • 85 x 4
  • 90 x 4
  • 95 x 4


  • 90 x 5
  • 110 x 5
  • 135 x 5
  • 144 x 5 x 2

Glute Ham Raise

  • 3 x 5

Some assistance required on these.

2018.03.26 Training


  • 70 x 3
  • 90 x 3
  • 110 x 3
  • 130 x 3
  • 150 x 3
  • 155 x 3

Snatch RDL (4:3:0)

  • 70 x 5
  • 80 x 5
  • 90 x 5
  • 95 x 5
  • 100 x 5

Strict Press

  • 50 x 5
  • 55 x 5
  • 60 x 5
  • 65 x 5
  • 70 x 5

Bent-Over Row (3 sec. pause)

  • 40 x 8
  • 40 x 8
  • 40 x 8

Kneeling Ab Rollers

  • 3 sets of 5

Setting Up My New Windows 10 PC

Whenever I set up a new machine, I go off of this list to make sure I’ve got all the right software on the machine and everything configured.  That way, everything is taken care of, and nothing will surprise me once I get started working.


The first thing to do is run Windows Update and make sure I have the latest version of the operating system.

Once that is done, I go in and set up Windows 10 Mail to use my email accounts and calendar.  Some of the things I do include:

  • Adding all email accounts.
  • Linking inboxes together.
  • Creating signatures for each account.
  • Setting up the calendar with the following:
    • Coloring calendars appropriately.
    • Setting the first day of the week to Monday.
    • Setting working hours.


The first thing I install is Microsoft Office 2016 Pro Plus.  I make sure to log into OneNote as well so I can start syncing my notes into the computer.

For everything else, I like to use Chocolatey to expedite the process and get everything set up on my computer quickly.  Ninite is also a good choice for speeding things up, but doesn’t have quite the same selection of applications.

Install the following in an elevated CLI (Chocolatey package names in parentheses), adding or removing any applications as desired:

  • Remote Desktop Software (teamviewer)
    • Log in to account, set up machine for easy access.
  • Internet Browser (google-chrome-x64)
    • Log into account to sync bookmarks, extensions.
  • Cloud Storage (dropbox)
    • Log in and start data sync.
  • Anti-Theft (prey)
    • Log in to configure machine, clean up any unused machines.
  • Version Control (git)
  • Visual Studio Code (visualstudiocode)
    • Install the following extensions:
      • TSLint
      • Path Intellisense
      • vscode-icons
      • One Dark Pro
  • Visual Studio Community 2017 (visualstudio2017community)
    • Open the Visual Studio Installer and install any desired workloads.
  • Resharper (resharper)
    • Run Visual Studio and register product.
  • Code Comparison Tool (beyondcompare)
    • Run and register product.
  • Divvy (divvy)
    • Register product
    • Start Divvy at login, Set Ctrl-Shift-Z as shortcut
  • Torrent Client (qbittorrent)
  • WAMP Server (wamp-server)
  • Directory Admin Tool (windirstat)
  • Chat Client (franz –pre)
    • Log into Franz account and all active accounts.
  • Node.JS (long term support) (nodejs-lts)
  • File Archiver (7zip)
  • FTP Client (filezilla)
  • Cleanup Tool (ccleaner)
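The list above can be installed in one shot using the package names in parentheses.  Here’s a sketch of the command – trim the package list to taste, and note that franz needs the –pre flag as mentioned above:

```shell
choco install -y teamviewer google-chrome-x64 dropbox prey git visualstudiocode visualstudio2017community resharper beyondcompare divvy qbittorrent wamp-server windirstat nodejs-lts 7zip filezilla ccleaner
choco install -y franz --pre
```

The -y flag auto-confirms each install prompt, which is what makes this hands-off.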

After the above list is installed, I just have to finish by globally installing any tools I use for JavaScript development using npm.  These tools are:

  • @angular/cli
  • npm-check-updates
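Both of these can go in a single command:

```shell
npm install -g @angular/cli npm-check-updates
```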

Aggregating Data Into a Single SQL Server Column

2018.04.13 EDIT:  I made a slight change to this to fix the filtering for the Start Date and End Date for the Promotions in the query below.

This week, I was working with a client using nopCommerce to try and fix a performance issue relating to syncing data between their legacy product management system and their nopCommerce system used for their front-facing website.  A plugin was built in nopCommerce that would utilize a SQL Server staging database to stage data from the legacy system and import into nopCommerce on a daily basis.

Specifically, this issue relates to trying to add a promotion to a product, in which promotions and products have a many-to-many relationship with each other, and each product has a one-to-many relationship with a series of product tabs that are used to display information on the product.  The entity relationship looks like this:

When the client attempted to add a promotion to 3000 products, it slowed the sync process from about 30 minutes to 8 hours – causing those products to not reflect the new promotion until mid-day and making for an unhappy client.

Digging into the code, the major slowness related to creating the product tab page – the system was searching to see if a tab page existed for the product, and then either created a new one or added to the existing one with the appropriate data, forcing iteration through a large dataset and giving a poorly scaling solution.

Creating a stored procedure for this made the most sense – it’s just copying data from a few tables into another.  But with set-based database programming, I can’t truly iterate through each mapping to create the Promotion product tab, so I needed to use aggregation to add the content to the product tab for each product.

I found out about the STUFF() function and came up with this:

 SELECT
     'Promotions' AS title,
     STUFF(
         (SELECT '<br />' + p2.Name
          FROM Promotions p2
          JOIN ProductPromoMappings ppm2 ON ppm2.Promo_Id = p2.Id
              AND p2.StartDate <= GETDATE() AND GETDATE() < p2.EndDate
          WHERE ppm2.Product_Id = ppm.Product_Id
          FOR XML PATH(''), TYPE)
             .value('.', 'varchar(max)'), 1, 6, '') AS content
 FROM ProductPromoMappings ppm
 JOIN Promotions p ON p.Id = ppm.Promo_Id
     AND p.StartDate <= GETDATE() AND GETDATE() < p.EndDate

Here’s what is happening – we are going through ProductPromoMappings and returning the content for the product tab pages.  For the content in each product tab, an inner query goes through each of that product’s promotions and gathers the content, which is then concatenated into a single value to be used by the outer query.

Here are a few more details on some of the commands that may be new to you:

  • WITH TIES – Addition to the ‘TOP’ keyword that allows for returning ‘ties’, or multiple rows tied for last place within the limited data set.
  • STUFF() – Inserts a string into another string.  Specifically, this lets us strip the leading ‘<br />’ so that the line break appears between elements but not before the first one.
  • FOR XML PATH(”) – Outputs the results of a query as XML – for us, this means taking the contents and concatenating them into a single string.
  • .value() – Converts the XML node returned by FOR XML PATH into a varchar, un-escaping any characters that the XML conversion would otherwise encode (such as ‘<‘ and ‘>’).
  • ROW_NUMBER() – Numbers the result rows – this is used with the OVER and PARTITION BY keywords to filter each product tab page by the Product ID.

Putting this command into a stored procedure reduced the runtime from 8 hours to about 15 seconds, as the data is now just copying directly into the database.
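To see the STUFF() + FOR XML PATH(”) pattern in isolation, here’s a self-contained sketch against a table variable (the data and names are made up for illustration):

```sql
-- Minimal illustration of STUFF() + FOR XML PATH('') aggregation.
DECLARE @Promos TABLE (Name varchar(50));
INSERT INTO @Promos VALUES ('Spring Sale'), ('Free Shipping');

SELECT STUFF(
    (SELECT '<br />' + Name
     FROM @Promos
     FOR XML PATH(''), TYPE).value('.', 'varchar(max)'),
    1, 6, '') AS content;
-- Returns: Spring Sale<br />Free Shipping
```

The inner query produces ‘<br />Spring Sale<br />Free Shipping’, and STUFF() deletes the first six characters (the leading ‘<br />’).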




Adding Google Analytics To Your Angular CLI App

As you’re working through your Angular CLI application, you may want to set up Google Analytics to track the traffic coming in to your website.  This guide will help with setting up the basics for viewing the active users on your web application, along with seeing more analytical data on how your web application is being used.


Set up a Google Analytics account and property for your web site.  You should get a code that looks like the following for your web application: UA-XXXXXXXXX-XX.

Adding the Tracking Code to Your Application

Once you have the tracking code from your Google Analytics account, you’ll want to add it to the application somewhere that it can be referenced in the application.  The best place to do this is in the environment.ts file meant for environment-specific variables.  Add the following to your environment.ts file:

googleAnalyticsTrackingId: 'UA-XXXXXXXXX-XX' // add the tracking ID from your GA account

Once this is done, you’ll need to add the actual tracking code to your application.  One way to do this is to write it into the code that bootstraps your Angular App in the main.ts file.  Add the following to your main.ts file:

import { environment } from './environments/environment';

if (environment.googleAnalyticsTrackingId !== '') {
 document.write('<script>' +
 '(function(i,s,o,g,r,a,m){i[\'GoogleAnalyticsObject\']=r;i[r]=i[r]||' +
 'function(){(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();' +
 'a=s.createElement(o), m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;' +
 'm.parentNode.insertBefore(a,m)})(window,document,\'script\',' +
 '\'https://www.google-analytics.com/analytics.js\',\'ga\');' +
 'ga(\'create\', \'' + environment.googleAnalyticsTrackingId + '\', \'auto\');</script>');
}

Modifying The App Component

Google Analytics works by receiving page events from the web application, such as page loads.  With a single page application, you’ll need to send an event for each time the user routes to a different part of the application.  The easiest way to do this is to edit the App Component, which will affect all routes.

Make the following additions to your app.component.ts file:


import { Router, NavigationEnd } from '@angular/router';
import { environment } from '../environments/environment';

declare let ga: Function;

constructor(private router: Router) {
    if (!this.hasGoogleAnalyticsTrackingId()) { return; }

    this.router.events.subscribe(event => {
        if (event instanceof NavigationEnd) {
            ga('set', 'page', event.urlAfterRedirects);
            ga('send', 'pageview');
        }
    });
}

private hasGoogleAnalyticsTrackingId() {
    return environment.googleAnalyticsTrackingId !== null &&
           environment.googleAnalyticsTrackingId !== '';
}
Verify Working Functionality

Once this is all set, you’ll want to make sure your changes are working – you can actually determine this using your local environment.  Open the Google Analytics dashboard and go to the view for the property you’ve created for your application, and access the ‘Real-time’ Reports.  You should see the following:

Start your Angular application locally and access the home page.  Once you do this, you should see the following change in the Google Analytics dashboard:

and just to see how the system acts on routing changes:

Congratulations!  You’ve set up Google Analytics for your Angular app.

Setting Up For Multiple Environments

Depending on how you’re deploying your Angular CLI app, you likely are using multiple environments (such as a Staging and a Production environment).  A common practice is to set up a separate property inside Google Analytics for each environment, so you can view statistics for each website separately.  This is especially useful for verifying that changes are actually flowing through the environments.

The optimal way to set this up is using the environment specific files in Angular.  You’ll include the property-specific code for each environment, and build the application for each environment.

For this example, let’s imagine we have 3 different environments:

  • My local environment I use for development – we do not want the Google Analytics code running on this.
  • A staging environment used for testing changes before promoting changes to production.
  • A production environment that will be used by the end user.

Angular CLI comes with two environment files by default, so let’s create a third one called environment.staging.ts.  Once that’s done, add the following content to the new file:

export const environment = {
    production: true, // allows for simulating production configuration
    googleAnalyticsTrackingId: 'UA-XXXXXXXXX-X' // your tracking ID for the specific environment goes here
    // ...and any other environment variables you may have.
};
Notice that we’re leaving environment.ts as is – since we want the codebase to not use Google Analytics when working locally.

We will make one more change to .angular-cli.json to allow for Angular CLI to recognize the new environment and allow for the Staging environment build:

"environments": {
    // dev and prod statements are here, add this line below
    "staging": "environments/environment.staging.ts",

Once this is done, you’ll be able to use different builds that associate with different tracking IDs.
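With the staging entry registered, the environment-specific builds look like this (Angular CLI 1.x syntax, matching the .angular-cli.json file above – newer CLI versions use build configurations instead):

```shell
ng build --env=staging   # staging build, uses the staging tracking ID
ng build --env=prod      # production build
ng serve                 # local dev - empty tracking ID, so no GA
```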



Tutorial: Angular 2 & Google Analytics with Event tracking

2018.03.08 Training

For my last day before my procedure (which will take me off for around a week), Zach granted me a day of heavy lifts.  I am pretty happy with how it all went down – I think these are both PRs without a belt.


  • 70 x 3
  • 90 x 3
  • 110 x 3
  • 130 x 3
  • 150 x 2
  • 170
  • 180

Deadlift (straps)

  • 70 x 3
  • 90 x 3
  • 110 x 3
  • 130
  • 150
  • 170
  • 190
  • 210