Deploying Drupal on an Azure App Service Linux Docker Container

From a performance perspective, PHP applications running on Azure App Service tend to perform better on Linux than on Windows.  While Azure provides a Drupal template in its Marketplace, it deploys to a regular Windows-based App Service and installs version 8.3.3 (whereas, as of the time of writing this article, 9/10/2018, the latest Drupal version is 8.6.1).

In this case, Microsoft has published a set of templates that provide the flexibility to choose the Drupal version, deploy nginx, install PHP, and install any modules you need.  The templates are currently maintained on GitHub and can be found here: https://github.com/Azure/app-service-quickstart-docker-images/tree/master/drupal-nginx-fpm

So let’s get started!

Download and install prerequisites:

  1. Download and install Git
  2. Download and install Docker
    1. Note: The above link is for Windows; here are the download links for Docker Community Edition on other platforms: https://store.docker.com/search?offering=community&type=edition
    2. Note: On Windows, I recommend running the command git config core.autocrlf true to prevent issues with line endings.  Check out this awesome blog article by Tim Clem on why this is recommended: https://adaptivepatchwork.com/2012/03/01/mind-the-end-of-your-line/
  3. Download and install Visual Studio Code (free lightweight code editor for Windows, Linux, and Mac)

Note: I’m using a Windows 10 machine while writing this tutorial.  A few steps will differ slightly, like running Git Bash on Windows vs. running Git natively on Linux.  On the Linux/Mac side, you can likely just run Git from a regular terminal session and you’ll be fine.

Clone the GitHub template

  1. Open up Git Bash
  2. Create a new directory for our project
    1. mkdir Drupal-Azure
    2. cd Drupal-Azure
  3. Clone the GitHub template
    1. git clone https://github.com/Azure/app-service-quickstart-docker-images.git --config core.autocrlf=input
      1. Note: Unfortunately, we cannot easily clone just a specific directory; we have to download all the files.  This particular GitHub project contains several projects, so as a heads up it’ll be about a 50MB download
      2. Note: The --config core.autocrlf=input is used to prevent Windows from checking files out with CRLF instead of LF line endings.  If you don’t specify this, you might receive the following error when you try to run your Docker container after it is built:
        1. standard_init_linux.go:190: exec user process caused "no such file or directory"
  4. Navigate into the Drupal directory
    1. cd app-service-quickstart-docker-images/drupal-nginx-fpm/0.43

Modify the scripts to your desire

I personally prefer not to have phpMyAdmin or MariaDB installed, as I will leverage the Azure MySQL PaaS service for the database.  In this case, I went ahead and modified the Dockerfile accordingly.
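If you take the same approach, it can help to first locate which lines in the Dockerfile pull in those components before commenting them out or deleting them.  A quick way to do that, assuming the Dockerfile references the packages by those names:

grep -in 'mariadb\|phpmyadmin' Dockerfile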

Build the Docker container

Execute the following command to build your container:

docker build -t jackdrupalregistry.azurecr.io/azuredrupal:test .

Note: The . at the end is needed; it tells Docker to use the current directory as the build context

Note: When building Docker images, the repository name must be lowercase
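Before pushing anything, it doesn’t hurt to sanity-check the image locally.  The commands below assume the container serves HTTP on port 80 internally (typical for the App Service quickstart images); adjust the port mapping if your Dockerfile exposes something different:

docker images
docker run -d -p 8080:80 jackdrupalregistry.azurecr.io/azuredrupal:test

You should then be able to browse to http://localhost:8080 and see the Drupal installer come up.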

Create Azure Container Registries

Select All Services -> Azure Container Registries.  Select Add and create a new container registry

Push the Docker container to your Azure Container Registry

  1. Navigate to  All Services -> Azure Container Registries -> Your Registry -> Access Keys
  2. Check Enable for Admin user
  3. Go back to Git Bash and execute the following commands
  4. Log in to Docker
    1. Execute the command:
      1. docker login jackdrupalregistry.azurecr.io -u yourusername -p yourpassword
  5. Push the image up to Azure Container Registry
    1. Execute the command:
      1. docker push jackdrupalregistry.azurecr.io/azuredrupal:test
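Note: If you have the Azure CLI installed locally, az acr login is an alternative to pasting the admin key into docker login; it signs your Docker client into the registry using your Azure credentials, after which the same docker push works unchanged:

az acr login --name jackdrupalregistry
docker push jackdrupalregistry.azurecr.io/azuredrupal:test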

Deploy the web app

Navigate to Create a resource -> Web App.  Select Docker as the OS type, select Configure container, and leverage the following settings:

  • Image Source: Azure Container Registry
  • Registry: jackdrupalregistry
  • Image: azuredrupal
  • Tag: test

Navigate to All Services -> App Services -> Your App Service -> Application settings, set WEBSITES_ENABLE_APP_SERVICE_STORAGE to true, and click Save to help ensure data persists.  Essentially, anything you write to /home will persist; anything else will be reset when the container gets rebuilt.

Create a MySQL Database

Navigate to All Services -> App Services -> Your App Service -> Properties and write down the Outbound IP Addresses; we will use these later.

Select Create a resource -> Azure Database for MySQL -> Create and create a blank database

Select Connection security and enter the Outbound IP Addresses from your App Service and click Save

Note: I haven’t found a way to get Drupal to allow SSL Connections, which would certainly be a best practice.  In this case, on the same Connection security blade, go ahead and set Enforce SSL Connection to Disabled.  If someone knows how to do this, please put a comment below, so I can update this guide.

Go back to the Overview section and write down the Server admin login name and Server name; we will use these during the Drupal setup

Configure Drupal

At this point, go ahead and browse out to your App Service.  You should have all the necessary details to complete the installation setup.  Once completed, you should see the Welcome to Drupal site splash page.

Notes:

Email:

Upon installation of Drupal you’ll receive an error that Drupal cannot send email.  Azure Web Apps don’t allow open relay, so you will need to use a 3rd party mail service like SendGrid or Mailchimp to relay emails.

Helpful docker commands:

docker images                      # list local images
docker run -it azuredrupal:test    # run the container interactively
docker ps -a                       # list all containers, running and stopped
docker rm containerid              # remove a stopped container
docker rmi image                   # remove an image

Other deployment strategies:

In addition to deploying through the portal, you could easily deploy via PowerShell, the Azure CLI, or an ARM template.  Here’s an Azure CLI 2.0 example of how to deploy (note: the script below uses PowerShell variables and line continuations for demonstration; substitute these as needed):

$resourceGroupName = "Drupal-Test"
$planName = $resourceGroupName
$appName = $planName
$containerName = "appsvcorg/drupal-nginx-fpm:0.43"
$location = "West US"

az group create -l $location -n $resourceGroupName

az appservice plan create `
    -n $planName `
    -g $resourceGroupName `
    --sku S3 --is-linux 

az webapp create `
    --resource-group $resourceGroupName `
    --plan $planName `
    --name $appName `
    --deployment-container-image-name $containerName

az webapp config appsettings set `
    --resource-group $resourceGroupName `
    --name $appName `
    --settings WEBSITES_ENABLE_APP_SERVICE_STORAGE="true"

az webapp config appsettings set `
    --resource-group $resourceGroupName `
    --name $appName `
    --settings WEBSITES_CONTAINER_START_TIME_LIMIT="600"

# modify the database settings below to match your environment
az webapp config appsettings set `
        --resource-group $resourceGroupName `
        --name $appName `
        --settings DATABASE_HOST="drupaldb.mysql.database.azure.com" `
            DATABASE_NAME="drupaldb" `
            DATABASE_USERNAME="[email protected]" `
            DATABASE_PASSWORD="abcdefghijklmnopqrstuvwxyz"
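Once those settings are applied, the same CLI session can be used to open the site and watch the container start up.  Note that streaming the logs generally requires container logging to be enabled first:

az webapp log config --name $appName --resource-group $resourceGroupName --docker-container-logging filesystem
az webapp log tail --name $appName --resource-group $resourceGroupName
az webapp browse --name $appName --resource-group $resourceGroupName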

How to hide users from the GAL in Office 365 synchronized from on-premises

Hiding users from the Global Address List (GAL) is fairly straightforward when the user is a cloud account. Simply select “Hide from address list” in the Exchange Online console or run some quick PowerShell:

$LiveCred = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $LiveCred -Authentication Basic -AllowRedirection
Import-PSSession $Session 
Set-Mailbox -Identity [email protected] -HiddenFromAddressListsEnabled $true

Hiding users from the GAL is also fairly straightforward when the user is synchronized from on-premises.  Simply edit the attributes of the user object, set msExchHideFromAddressLists to True, and run a sync.  The problem, though, is what happens if you don’t have the msExchHideFromAddressLists attribute in Active Directory?

Well, you can either extend your Active Directory schema for Exchange, which is not something you can easily roll back if something goes wrong and arguably adds a ton of attributes that will likely never be used, or you can simply create a custom sync rule within Azure AD Connect that flows the value from a different attribute.

This article will go over how to sync a custom attribute from on-premises to Azure AD to hide a user from the GAL, without the need to extend your Active Directory schema.  In this case, we are going to use an attribute called msDS-cloudExtensionAttributeX (where X is the number of an attribute that is free/not being used within your directory).  The msDS-cloudExtensionAttributes were introduced in Windows Server 2012, and there are 20 of them to allow flexibility for these types of scenarios.  Some customers may gravitate towards using a different attribute like showInAddressBook.  The problem with showInAddressBook is that it is referenced by very old versions of Exchange (which I’m sure people would never be running 😉 ) and expects the common name of an object (not what we want).  In this case, the easiest way to move forward is to simply use the msDS-cloudExtensionAttributes.

Step 1: Scope in the msDS-cloudExtensionAttribute for Azure AD Connect

Open the Azure AD Connect Synchronization Service

Navigate to the Connectors tab, select your Active Directory (not the domain.onmicrosoft.com entry), and select Properties

In the top right, click on Show All, scroll down and find msDS-CloudExtensionAttribute1 (you can use any of the numbers 1-20, just make sure to check the box you are using), and select OK

Step 2: Create a custom sync rule

Open up the Azure AD Connect Synchronization Rules Editor

Click on the Add new rule button (make sure direction in the top left shows Inbound)

Enter the following for the description:

Name: Hide user from GAL
Description: If msDS-CloudExtensionAttribute1 attribute is set to HideFromGAL, hide from Exchange Online GAL
Connected System: Your Active Directory Domain Name
Connected System Object Type: user
Metaverse Object Type: person
Link Type: Join
Precedence: 50 (this can be any number less than 100.  Just make sure you don’t duplicate numbers if you have other custom rules or you’ll receive a dead-lock error from SQL Server)

Click Next > on the Scoping filter and Join rules pages; those can remain blank

On the Transformations page, click the Add transformation button, fill out the form with the values below, and then click Add
FlowType: Expression
Target Attribute: msExchHideFromAddressLists
Source: IIF(IsPresent([msDS-cloudExtensionAttribute1]),IIF([msDS-cloudExtensionAttribute1]="HideFromGAL",True,False),NULL)

 

Step 3: Perform an initial sync

Open up Windows PowerShell on the Azure AD Connect Server

Execute the following command: Start-ADSyncSyncCycle -PolicyType Initial

Step 4: Hide a user from Active Directory

Open Active Directory Users and Computers, find the user you want to hide from the GAL, right click select Properties

Select the Attribute Editor tab, find msDS-cloudExtensionAttribute1, and enter the value HideFromGAL (note: this is case sensitive).  Click OK and OK again to close out of the editor.

Note: if you don’t see the Attribute Editor tab in the previous step, within Active Directory Users and Computers, click on View in the top menu and select Advanced Features

Step 5: Validation

Open the Azure AD Connect Synchronization Service

On the Operations tab, if you haven’t seen a Delta Synchronization come through yet, manually trigger a delta sync (Start-ADSyncSyncCycle -PolicyType Delta) to pick up the change you made in Active Directory

 

Select the Export for the domain.onmicrosoft.com connector and you should see 1 Updates

Select the user account that is listed and click Properties.  On the Connector Space Object Properties, you should see that Azure AD Connect has staged an export to Azure AD that sets msExchHideFromAddressLists to true

There ya have it!  An easy way to hide users from the GAL with minimal risk to ongoing operations.  Due to the way Azure AD Connect handles upgrades, our custom sync rule will persist just fine through regular updates/patches.

Installing Python Wheel files on an Azure App Service

Per Microsoft: Some packages may not install using pip when run on Azure.  It may simply be that the package is not available on the Python Package Index, or it could be that a compiler is required (a compiler is not available on the machine running the web app in Azure App Service).

For example, you may receive an error like this when trying to install a specific package (in this case, trying to install pandas):

Command: "D:\home\site\deployments\tools\deploy.cmd"
Handling python deployment.
KuduSync.NET from: 'D:\home\site\repository' to: 'D:\home\site\wwwroot'
Copying file: 'requirements.txt'
Detected requirements.txt.  You can skip Python specific steps with a .skipPythonDeployment file.
Detecting Python runtime from runtime.txt
Detected python-2.7

Found compatible virtual environment.
Pip install requirements.
Downloading/unpacking Flask==0.12.1 (from -r requirements.txt (line 1))
Downloading/unpacking numpy==1.15.0rc2 (from -r requirements.txt (line 2))
Downloading/unpacking pandas==0.22.0 (from -r requirements.txt (line 3))
  Running setup.py (path:D:\home\site\wwwroot\env\build\pandas\setup.py) egg_info for package pandas

    Could not locate executable g77
    Could not locate executable f77
    Could not locate executable ifort
    Could not locate executable ifl
    Could not locate executable f90
    Could not locate executable efl
    Could not locate executable gfortran
    Could not locate executable f95
    Could not locate executable g95
    Could not locate executable effort
    Could not locate executable efc
    don't know how to compile Fortran code on platform 'nt'
    non-existing path in 'numpy\\distutils': 'site.cfg'
    Running from numpy source directory.

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\setup.py:385: UserWarning: Unrecognized setuptools command, proceeding with generating Cython sources and expanding templates
      run_build = parse_setuppy_commands()
    D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
      warnings.warn(msg)
    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Blas (http://www.netlib.org/blas/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [blas]) or by setting
        the BLAS environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Blas (http://www.netlib.org/blas/) sources not found.
        Directories to search for the sources can be specified in the
        numpy/distutils/site.cfg file (section [blas_src]) or by setting
        the BLAS_SRC environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Lapack (http://www.netlib.org/lapack/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [lapack]) or by setting
        the LAPACK environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Lapack (http://www.netlib.org/lapack/) sources not found.
        Directories to search for the sources can be specified in the
        numpy/distutils/site.cfg file (section [lapack_src]) or by setting
        the LAPACK_SRC environment variable.
      self.calc_info()

    D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'define_macros'
      warnings.warn(msg)
    Traceback (most recent call last):
      File "<string>", line 17, in <module>
      File "D:\home\site\wwwroot\env\build\pandas\setup.py", line 743, in <module>
        **setuptools_kwargs)
      File "D:\python27\Lib\distutils\core.py", line 111, in setup
        _setup_distribution = dist = klass(attrs)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 262, in __init__
        self.fetch_build_eggs(attrs['setup_requires'])
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 287, in fetch_build_eggs
        replace_conflicting=True,
      File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 614, in resolve
        dist = best[req.key] = env.best_match(req, ws, installer)
      File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 857, in best_match
        return self.obtain(req, installer)
      File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 869, in obtain
        return installer(requirement)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 338, in fetch_build_egg
        return cmd.easy_install(req)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 613, in easy_install
        return self.install_item(spec, dist.location, tmpdir, deps)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 643, in install_item
        dists = self.install_eggs(spec, download, tmpdir)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 833, in install_eggs
        return self.build_and_install(setup_script, setup_base)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1055, in build_and_install
        self.run_setup(setup_script, setup_base, args)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1043, in run_setup
        raise DistutilsError("Setup script exited with %s" % (v.args[0],))
    distutils.errors.DistutilsError: Setup script exited with error: Microsoft Visual C++ 9.0 is required (Unable to find vcvarsall.bat). Get it from http://aka.ms/vcpython27
    Complete output from command python setup.py egg_info:

Could not locate executable g77
Could not locate executable f77
Could not locate executable ifort
Could not locate executable ifl
Could not locate executable f90
Could not locate executable efl
Could not locate executable gfortran
Could not locate executable f95
Could not locate executable g95
Could not locate executable effort
Could not locate executable efc

don't know how to compile Fortran code on platform 'nt'
non-existing path in 'numpy\\distutils': 'site.cfg'
Running from numpy source directory.
d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\setup.py:385: UserWarning: Unrecognized setuptools command, proceeding with generating Cython sources and expanding templates
  run_build = parse_setuppy_commands()
D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
  warnings.warn(msg)

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Atlas (http://math-atlas.sourceforge.net/) libraries not found.
    Directories to search for the libraries can be specified in the
    numpy/distutils/site.cfg file (section [atlas]) or by setting
    the ATLAS environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Blas (http://www.netlib.org/blas/) libraries not found.
    Directories to search for the libraries can be specified in the
    numpy/distutils/site.cfg file (section [blas]) or by setting
    the BLAS environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Blas (http://www.netlib.org/blas/) sources not found.
    Directories to search for the sources can be specified in the
    numpy/distutils/site.cfg file (section [blas_src]) or by setting
    the BLAS_SRC environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Lapack (http://www.netlib.org/lapack/) libraries not found.
    Directories to search for the libraries can be specified in the
    numpy/distutils/site.cfg file (section [lapack]) or by setting
    the LAPACK environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Lapack (http://www.netlib.org/lapack/) sources not found.
    Directories to search for the sources can be specified in the
    numpy/distutils/site.cfg file (section [lapack_src]) or by setting
    the LAPACK_SRC environment variable.
  self.calc_info()

D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'define_macros'
  warnings.warn(msg)

Traceback (most recent call last):
  File "<string>", line 17, in <module>
  File "D:\home\site\wwwroot\env\build\pandas\setup.py", line 743, in <module>
    **setuptools_kwargs)
  File "D:\python27\Lib\distutils\core.py", line 111, in setup
    _setup_distribution = dist = klass(attrs)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 262, in __init__
    self.fetch_build_eggs(attrs['setup_requires'])
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 287, in fetch_build_eggs
    replace_conflicting=True,
  File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 614, in resolve
    dist = best[req.key] = env.best_match(req, ws, installer)
  File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 857, in best_match
    return self.obtain(req, installer)
  File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 869, in obtain
    return installer(requirement)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 338, in fetch_build_egg
    return cmd.easy_install(req)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 613, in easy_install
    return self.install_item(spec, dist.location, tmpdir, deps)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 643, in install_item
    dists = self.install_eggs(spec, download, tmpdir)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 833, in install_eggs
   return self.build_and_install(setup_script, setup_base)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1055, in build_and_install
    self.run_setup(setup_script, setup_base, args)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1043, in run_setup
    raise DistutilsError("Setup script exited with %s" % (v.args[0],))

distutils.errors.DistutilsError: Setup script exited with error: Microsoft Visual C++ 9.0 is required (Unable to find vcvarsall.bat). Get it from http://aka.ms/vcpython27

----------------------------------------

Cleaning up...

Command python setup.py egg_info failed with error code 1 in D:\home\site\wwwroot\env\build\pandas
Storing debug log for failure in D:\home\pip\pip.log

An error has occurred during web site deployment.
D:\Program Files (x86)\SiteExtensions\Kudu\75.10629.3460\bin\Scripts\starter.cmd "D:\home\site\deployments\tools\deploy.cmd"

This guide shows how to use Wheel files to install modules that cannot be installed natively via pip due to a compiler missing in the Azure App Service:

Microsoft’s official documentation can be found here: https://docs.microsoft.com/en-us/azure/app-service/web-sites-python-configure#troubleshooting—package-installation

Tutorial

  1. Modify requirements.txt file
    1. Add the following item as the first line to the document:
      1. --find-links wheelhouse
        1. Note: If you do not have a requirements.txt file, you can simply create a new text document and add this line to it.  The requirements.txt file is what allows the Azure App Service to automatically go out and try to download the packages your application needs (see the example requirements.txt after these steps).  Official documentation on this file is found here: https://docs.microsoft.com/en-us/azure/app-service/web-sites-python-configure#package-management
    2. Navigate to the Kudu Debug Console by going to https://yourappservice.scm.azurewebsites.net/DebugConsole
    3. Within the debug console, navigate to your version of Python.
      1. Note: The default Python versions in an Azure App Service are 2.7 and 3.4; however, since Wheel will need to write files into the Python installation, you cannot leverage the default (read-only) directories of D:\Python27 for v2.7 and D:\Python34 for v3.4
      2. In this case, I’d recommend leveraging Extensions to install whatever version of Python.  Documentation on this can be found here: https://blogs.msdn.microsoft.com/pythonengineering/2016/08/04/upgrading-python-on-azure-app-service/
    4. Install the Python Wheel module:
      1. python.exe -m pip install wheel
    5. Obtain Wheel files
      1. Option 1: Build your own wheel files
        1. Execute the following command:
          1. python.exe -m pip wheel -r D:\home\site\wwwroot\requirements.txt -w wheelhouse

      2. Option 2: Obtain Wheel files
        1. Create a wheelhouse folder within your python directory
          1. mkdir wheelhouse

        2. Copy whl files to this directory
          1. You can obtain wheel files from PyPi or from Laboratory for Fluorescence Dynamics, University of California, Irvine.
            1. PyPi: Search for the module and then click on the Download Files button
              1. https://pypi.org/
            2. Laboratory for Fluorescence Dynamics, University of California, Irvine: Simply download the appropriate whl file listed on the page below
              1. https://www.lfd.uci.edu/~gohlke/pythonlibs/
    6. Install Modules
      1. Manual Install
        1. Execute the following command:
          python.exe -m pip install --upgrade -r D:\home\site\wwwroot\requirements.txt

      2. Deployment Install (from CI/CD pipeline)
        1. Configure .deployment and deploy.cmd file
          1. Official documentation on this can be found here: https://github.com/projectkudu/kudu/wiki/Custom-Deployment-Script
          2. .deployment file
            1. [config]
              command = deploy.cmd
          3. deploy.cmd file (modify the python directory to reflect your version)
            1. :: 1. Install Wheel
              echo Configure Wheel
              D:\home\python364x64\python.exe -m pip install wheel

              :: 2. Install packages
              echo Pip install requirements.
              D:\home\python364x64\python.exe -m pip install --upgrade -r D:\home\site\wwwroot\requirements.txt

At this point, the modules in question should be installed and ready for use! 🙂

How to install NodeJS on a Raspberry Pi

Installing NodeJS on a Raspberry Pi can be a bit tricky.  Over the years, the ARM-based processor has gone through several versions (ARMv6, ARMv7, and ARMv8), and there are different flavors of NodeJS for each of these architectures.

Depending on the version you have, you may need to manually install NodeJS instead of grabbing the packages via a traditional apt-get install nodejs.

Step 1: Validate what version of the ARM chipset you have

First let’s find out what ARM version you have for your Raspberry Pi.  To do that, execute the following command:

uname -m

You should receive something like: armv6l

Step 2: Find the latest package to download from nodeJS’s website

Navigate to https://nodejs.org/en/download/ and scroll down to the latest Linux Binaries for ARM.  Right click and copy the address of the download that matches your processor’s architecture.  For example, if you saw armv6l, you’d copy the download for ARMv6

Step 3: Download and install nodeJS

Within your SSH/console session on the Raspberry Pi, change to your home directory and execute the following commands, substituting in the URL you copied in the previous step.  For example:

cd ~
wget https://nodejs.org/dist/v8.11.3/node-v8.11.3-linux-armv6l.tar.xz

Next, extract the tarball (substituting in the name of the tarball you downloaded in the previous step) and change into the directory containing the extracted files

tar -xvf node-v8.11.3-linux-armv6l.tar.xz
cd node-v8.11.3-linux-armv6l

Next, remove a few files that aren’t used and copy the files to /usr/local

rm CHANGELOG.md LICENSE README.md
cp -R * /usr/local/

Step 4: Validate the installation

You can validate that you have successfully installed NodeJS by running the following commands to return the version numbers for NodeJS and npm

node -v
npm -v

That’s it!  Have fun!

 

How to build a LEMP stack

Growing up it was always common to spin up a “LAMP” box to host a website.  The typical setup was:
Linux
Apache
MySQL
PHP

Over the past few years, this model has slightly changed due to new open source technologies bringing new ideas to solve performance and licensing issues at massive scale.  In this tutorial, we are going to look at setting up a LEMP box on Debian Stretch (9.1).
Linux
nginx [engine x]
MariaDB
PHP

Please note, MariaDB could easily be swapped out with MySQL in this tutorial; however, many have opted to jump over to MariaDB as an open source alternative (actually designed by the original developers of MySQL) over fear that Oracle may close-source MySQL.

Installing Linux

This tutorial assumes you already have a copy of Ubuntu 14+ or Debian 7+.  It probably works on earlier versions as well, but I haven’t tested them.  On a side note, I typically don’t install Linux builds with an interactive desktop environment, so grab yourself a copy of PuTTY and SSH in, or open up Terminal if you have interactive access to the desktop environment.  Before continuing, go ahead and update your apt-get repos and upgrade any packages currently installed:

apt-get update && apt-get upgrade

Installing nginx

Grab a copy of nginx

apt-get install nginx
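If you want to confirm nginx came up before moving on, a quick check from the same box (assuming nothing else is bound to port 80; install curl first via apt-get install curl if it isn’t present on your minimal image):

service nginx status
curl -I http://localhost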

Installing MariaDB

Grab a copy of MariaDB

apt-get install mariadb-server
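MariaDB ships with the mysql_secure_installation helper.  Running it is optional on a throwaway lab box, but anywhere else it’s worth a minute to set a root password, drop the test database, and remove anonymous users:

mysql_secure_installation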

Installing PHP

In this case, I want to roll with PHP 7.  You can specify php5 or php7 depending on your application, but PHP 7 has some great performance enhancements, so for new apps I’d leverage it.  The biggest thing here is to make sure you use the FastCGI Process Manager (FPM) package.  If you specify just php or php7, the package manager will pull down apache2 as a dependency, and that is not what we want in our LEMP stack.

apt-get install php7.0-fpm

Once installed, fire up your favorite text editor (it’s ok if it’s vi :)) and edit the default site for nginx

vi /etc/nginx/sites-enabled/default

Search for the comment # Add index.php to the list if you are using PHP and add index.php to the line below it.  For example:

index index.html index.htm index.php index.nginx-debian.html;

Next, find the comment # pass PHP scripts to FastCGI server and change the block of code to the following to tell nginx to process .PHP files with FastCGI-PHP:

# pass PHP scripts to FastCGI server
#
location ~ \.php$ {
        include snippets/fastcgi-php.conf;
#
#       # With php-fpm (or other unix sockets):
        fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
#       # With php-cgi (or other tcp sockets):
#       fastcgi_pass 127.0.0.1:9000;
}

Save the file.  If using vi, you can do that by executing :wq

Next, reload the nginx service to pickup the new changes to our configuration:

service nginx reload
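If the reload complains, you can ask nginx to validate the configuration and point out the offending line before trying again:

nginx -t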

Test

At this point, we can create a PHP file to validate that things are working well. Go ahead and create a new file /var/www/html/info.php and add the following lines:

<?php
phpinfo();
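If you’re working from the same shell session used for the installs above (and can write to /var/www/html), a one-liner equivalent is:

echo '<?php phpinfo();' > /var/www/html/info.php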

If you see a page listing the PHP version and the corresponding environment configuration, congratulations, you have finished setting up your new LEMP stack! 🙂

Setting up WeeWX with a Raspberry PI

This is a quick setup guide on how to configure the open source software WeeWX for a Personal Weather Station (PWS).  I highly recommend you check out the WeeWX User’s Guide, as this information is very well documented there.  Here is a reflection of how I was able to get WeeWX installed on a Raspberry PI with a brand new weather station.

  1. Setup your Raspberry PI
    1. How to setup your Raspberry PI: http://jackstromberg.com/2018/03/setting-up-a-new-raspberry-pi-via-ssh/
      1. Note: Raspbian is a distribution based upon Debian.  In this case, we will follow the Debian instructions for setting up WeeWX.
        1. http://weewx.com/docs/debian.htm
  2. (Optional) Configure the Raspberry PI to be localized to your environment
    1. sudo raspi-config
      1. Here you can arrow down to Localization Options and configure the timezone to match that of your console/weather sensor.  Keeping time is critical, so if possible, try to keep the date/time between your weather station and the Raspberry PI as close as possible.
  3. Configure Apt-Get to look for the WeeWX packages
    wget -qO - http://weewx.com/keys.html | sudo apt-key add -
    sudo wget -qO - http://weewx.com/apt/weewx.list | sudo tee /etc/apt/sources.list.d/weewx.list
  4. Update your Raspberry-PI to use the latest packages
    sudo apt-get update
    sudo apt-get upgrade
  5. Before installation, ensure you have your console or device setup and connected to your Raspberry PI for WeeWX to pull the data
  6. Determine the interface the console is connected to (if using a directly attached data logger; skip this step if using an IP-based source)
    1. Execute the command dmesg and look for what interface the data logger is connected to
      1. In my example, you can see the data logger is connected to ttyUSB0
  7. Launch the installation wizard for weewx
    1. sudo apt-get install weewx
      1. Note: You will likely be prompted to install a few dependencies, type Y for yes to install them
  8. Installation
    1. Enter the location of your weather station: Santa’s Workshop, North Pole
    2. Enter in the latitude, longitude of your weather station
      1. Note: If you don’t have GPS, you can easily find this by using Bing Maps or Google Maps, navigating to your location, and right clicking.
        1. For Bing, it will just show you the lat/long values when you right click
        2. For Google, click on “What’s Here” and it will list these values
      2. Note: You can be more specific than 3 digits behind the decimal, so if you want to use a more specific set of coordinates like 40.689167, -74.044444, that is acceptable.
    3. Enter in your Altitude of where the weather station is
      1. You can use Google Earth to find the altitude or this tool here: https://www.freemaptools.com/elevation-finder.htm
    4. Set your preferred unit of measurement
      1. US (Imperial) or Metric
    5. Select your weather station type
      1. I.e. AcuRite, Vantage (if using Davis), etc.
    6. Select the interface the device is listening on
    7. For those using a serial port, select the interface that the data logger is connected to.  You should have found this in step 6 above; if using ethernet, go ahead and type in the IP, Port, etc. of the data logger.
  9. At this point WeeWX is technically installed, however many individuals will want to present the WeeWX reports via webpage.  In this case, we’ll install nginx, which is a lightweight webserver
    1. sudo apt-get install nginx
      1. More details on this can be found here: http://www.weewx.com/docs/usersguide.htm#integrating_with_webserver
  10. Configure WeeWX to minimize disk IO
    1. Why do we need to do this?  Since Raspberry PIs leverage SD cards, there is typically a finite number of reads/writes the SD card can handle.  In this case, it is recommended to either leverage an external database/fileserver for WeeWX to write its reports to, or configure WeeWX to leverage RAM to host the reports, which will prevent IO to the SD card (theoretically increasing the life of the drive)
      1. Three approaches are outlined on the WeeWX GitHub wiki; in this guide I’ll follow its recommendation of saving reports to a temporary file system using tmpfs
        1. Add an entry to fstab
          1. echo "weewx_reports /var/weewx/reports tmpfs size=20M,noexec,nosuid,nodev 0 0" | sudo tee -a /etc/fstab
        2. Mount the new file system
          1. sudo mkdir -p /var/weewx/reports
          2. sudo mount -a
        3. Update the weewx.conf file to point to the new directory
          1. sudo sed -i -e 's%HTML_ROOT =.*%HTML_ROOT = /var/weewx/reports%' /etc/weewx/weewx.conf
        4. Restart WeeWX service
          1. sudo service weewx restart
        5. Create symbolic link to point webserver to the reports
          1. sudo ln -s /var/weewx/reports /var/www/html/weewx
        6. Give the web server the ability to read from the directory
          1. sudo chmod -R 755 /var/www/html/weewx
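Before browsing to the site, it’s worth a quick check that the tmpfs mount from the steps above actually took effect (these assume the /var/weewx/reports mount point used earlier):

df -h /var/weewx/reports
mount | grep weewx_reports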

At this point, go ahead and browse out to http://youripaddress/weewx/ to see your weather.

Notes:

WeeWX updates the webpage every 30 minutes (1800 seconds) out of the box.  You can force a report update by executing wee_reports weewx.conf or you can modify the /etc/weewx/weewx.conf file by changing the archive_interval variable (in seconds) under the [StdArchive] section.

You can modify the Weewx configuration by editing: /etc/weewx/weewx.conf

You can validate if WeeWX is running by executing: service weewx status

You can look at diagnostics logs by following the guide here: http://www.weewx.com/docs/usersguide.htm#monitoring

Best practices guide on using WeeWX + Raspberry PI: https://github.com/weewx/weewx/wiki/Raspberry%20Pi

How to upgrade your Windows Server Evaluation/Trial

Scenario: You downloaded the evaluation copy of Windows Server and you have 180 days to test out whatever you are working on.  Fast forward a few months: you only have a few days left, and you are so happy with how it works that you go out and buy a full license.  When you go to apply the license key under System, you get a big ol’ error that says: “This edition cannot be upgraded.”

Solution:

You can use the DISM tool to figure out which editions of Windows Server you can upgrade to, and also use it to change the product key of the installed edition.

Easy enough, so let’s go ahead and open up Command Prompt as an administrator (right click on the Windows flag/start icon and select Command Prompt (Admin)):

Execute the following command to find out what versions you can upgrade to:

Dism /Online /Get-TargetEditions

In this case, you can see I can upgrade to ServerStandard or ServerDatacenter

Next, let’s go ahead and actually upgrade the edition and inject my license key:

Dism /Online /Set-Edition:TheEditionListedYouWantToGoTo /AcceptEula /ProductKey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX

Voila!  At this point, I simply need to reboot and my instance will be upgraded accordingly.

[Tutorial] Integrate Visual Studio Code with Visual Studio Team Services

Here’s a quick way to integrate Visual Studio Code with Visual Studio Team Services.

  1. Create a new Team Project
    1. Instructions on how to create a new Team Project are outlined here: https://docs.microsoft.com/en-us/vsts/accounts/create-account-msa-or-work-student
  2. Create a Personal Access Token
    1. Instructions on how to generate a personal access token are outlined here: https://docs.microsoft.com/en-us/vsts/accounts/use-personal-access-tokens-to-authenticate
  3. Download and install Git: https://git-scm.com/download/
  4. Download Visual Studio Code: https://code.visualstudio.com/Download
  5. Inside of Visual Studio Code, click on the Extensions button
  6. Search the marketplace for Visual Studio Team Services and select Install button
  7. Once the extension has been Installed, click on the Reload button.
  8. Inside of Visual Studio Code, press F1 on your keyboard and type Git: Clone

  9. Once prompted, type in the URL to your Team Project and click on the Open Repository button
  10. Once in the repository, type Team: Sign In.  Select Provide an access token manually, enter the Personal Access Token from Visual Studio Team Services, and press Enter on your keyboard
  11. From there, go ahead and make a change to any of the files in your Repository
  12. Click on the Source Control icon in Visual Studio Code
  13. Select Commit All
    1. Note: You will be prompted to type in a commit message, go ahead and type in what you changed

  14. Either select Push from the button in the top right, or click the Push button in the bottom left corner
  15. Validate you see the committed changes in Visual Studio Team Services

Setting up a new Raspberry Pi via SSH

This is my super subpar tutorial on how to quickly setup a new Raspberry Pi via SSH (no mouse/keyboard/monitor directly attached to the device).

  1. Download the latest copy of the operating system (I personally prefer Raspbian Stretch Lite for the most minimal setup): https://www.raspberrypi.org/downloads/raspbian/
  2. Extract the download so you have a copy of the ****-**-**-raspbian-stretch-lite.img file
  3. Download Etcher to burn the image to an SD Card: https://etcher.io/
  4. Download a copy of Putty if you don’t have a way to ssh: https://www.chiark.greenend.org.uk/~sgtatham/putty/latest.html
  5. Open the SD card you just flashed and you should see the “boot” partition.  Create a file called ssh (no file extension or data needs to be written to the file)
    1. Note: SSH is disabled by default on all Raspbian images from the November 2016 release onward; see here: https://www.raspberrypi.org/documentation/remote-access/ssh/
  6. Default credentials:
    1. Username: pi
    2. Password: raspberry
  7. Quick commands
    1. Configure Raspberry PI specific settings: sudo raspi-config
    2. Proper Shutdown (-h) / Restart (-r): sudo shutdown -h now

Unlike most laptops/desktops, the Raspberry Pi doesn’t have a shutdown button, so always use the commands above to prevent SD Card corruption!

Windows 10 – Missing Windows Disc Image Burner for ISO files

In Windows, you are typically able to download a .ISO file, right click on it, and burn it via your CD/DVD drive using the Windows Disc Image Burner application. Unfortunately, for whatever reason, my machine is missing this menu item.

A quick workaround that doesn’t involve any registry hacks is to simply right click on the file, select Open With, and select Choose another app.

Select More apps and scroll to the bottom and select Look for another app on this PC.

Navigate to C:\Windows\System32, select isoburn.exe, and click Open

At this point, you can go ahead and burn your iso 🙂