Category Archives: Uncategorized

Configuring DKIM for Postfix

Fighting spam can be tricky. In addition to SPF records, DKIM is nearly mandatory to help prevent sent emails from being classified as spam. Beginning February of 2024, both Google and Yahoo will require DMARC, which requires either SPF or DKIM; and for senders with a high volume of email (5,000+ messages), both.

In this tutorial, we will look at signing outbound messages with DKIM using the open source project OpenDKIM. If you followed my previous tutorial on Postfix + Dovecot + MySQL/MariaDB, you may have multiple domain names, so this guide will assume you want to configure a separate DKIM key for each domain name you are hosting.

Step 1: Install OpenDKIM

First, update packages for your distribution.

sudo apt-get update && sudo apt-get upgrade

Install OpenDKIM and the OpenDKIM tools; the opendkim-tools package includes the utility we will use to generate the keys.

sudo apt-get install opendkim opendkim-tools

Step 2: Create trusted hosts configuration file for OpenDKIM

First, create a file that OpenDKIM will use that defines the trusted hosts that can send messages.

sudo mkdir /etc/opendkim
sudo vi /etc/opendkim/TrustedHosts

Add the IP addresses and FQDN of the server sending messages by typing i to change into insert mode in vi.

127.0.0.1
localhost
192.168.1.2
mail.mydomain.com

Type :wq to commit the changes in vi.

Step 3: Modify OpenDKIM configuration file

Modify the opendkim.conf configuration file

sudo vi /etc/opendkim.conf

Search for #Canonicalization simple and uncomment the line by removing the # symbol.

Search for #Mode and remove the # symbol to uncomment the line. Ensure the line is configured with s to sign outbound emails only, or sv to both sign outbound emails and verify DKIM signatures on received emails.

If you have subdomains, search for #SubDomains and remove the # and change to yes. For example:

SubDomains              yes

Search for Socket local:/var/run/opendkim/opendkim.sock and comment the line by adding a # in front of the line.

Search for #Socket inet:8891@localhost and uncomment the line. If the line does not exist in your file, add the following at the end of the file.

Socket inet:8891@localhost

Next, add the following lines to reference our DKIM configurations for each domain:

KeyTable /etc/opendkim/KeyTable
SigningTable /etc/opendkim/SigningTable
ExternalIgnoreList /etc/opendkim/TrustedHosts
InternalHosts /etc/opendkim/TrustedHosts

Type :wq to save the changes and close the file.

Step 4: Configure Postfix

sudo vi /etc/postfix/main.cf

Add the following lines to the end of the file:

milter_default_action = accept
milter_protocol = 2
smtpd_milters = inet:localhost:8891
non_smtpd_milters = inet:localhost:8891

Type :wq to save the changes and close the file.
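
To confirm the values were written correctly, you can query main.cf with postconf; each of the four parameters should echo back what you entered above.

postconf milter_default_action milter_protocol smtpd_milters non_smtpd_milters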

Step 5: Restart services to apply the changes (optional)

Execute the following to apply the changes now if you plan to add more domain names at a later time; otherwise, you can skip this step, since we restart the services again in Step 9.

sudo /etc/init.d/opendkim restart
sudo /etc/init.d/postfix reload
sudo /etc/init.d/postfix restart

Step 6: Create a DKIM key for a domain

Run the following commands to create a new folder and change into it; this is where we will generate the key used to sign outgoing emails.

sudo mkdir -p /etc/opendkim/keys/mydomain.com
cd /etc/opendkim/keys/mydomain.com

Execute the following command to generate the key:

sudo opendkim-genkey -r -d mydomain.com

Delegate access to the key to the opendkim user and group (note: if you modified the user in your opendkim.conf file, use that user instead).

sudo chown opendkim:opendkim default.private

Step 7: Reference the key via OpenDKIM KeyTable

Modify the Keytable with vi

sudo vi /etc/opendkim/KeyTable

Add the following line to the file to define your selector. In this example, we will call the selector default, but if your domain requires multiple DKIM keys, ensure you make this unique. You can modify the file by pressing i to enter insert mode in vi:

default._domainkey.mydomain.com mydomain.com:default:/etc/opendkim/keys/mydomain.com/default.private

Type :wq to write and quit vi

Step 8: Specify the domain in your OpenDKIM SigningTable

Open the SigningTable file via vi

sudo vi /etc/opendkim/SigningTable

Add the following line to the file by pressing i to enter insert mode (changing default if you specified a different selector earlier):

mydomain.com default._domainkey.mydomain.com

Type :wq to write and quit vi
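
If you host additional domains, repeat Steps 6 through 8 for each one. As a sketch, a hypothetical second domain named otherdomain.com (with its own key generated under /etc/opendkim/keys/otherdomain.com) would get one extra line in each file:

# /etc/opendkim/KeyTable
default._domainkey.otherdomain.com otherdomain.com:default:/etc/opendkim/keys/otherdomain.com/default.private
# /etc/opendkim/SigningTable
otherdomain.com default._domainkey.otherdomain.com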

Step 9: Update your services to apply the changes

Restart your services to begin signing your messages:

sudo /etc/init.d/opendkim restart
sudo /etc/init.d/postfix reload
sudo /etc/init.d/postfix restart
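
Optionally, you can confirm signing is working by sending yourself a test message and watching the mail log (this assumes a Debian/Ubuntu-style /var/log/mail.log); OpenDKIM should log a "DKIM-Signature field added" entry for each signed message.

sudo tail -f /var/log/mail.log | grep -i dkim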

Step 10: Update DNS

Get the DNS record values we need to publish by executing the following command:

sudo cat /etc/opendkim/keys/mydomain.com/default.txt

Create a new TXT record within your nameservers, named after your selector (e.g. default._domainkey), and specify the value between the quotes (don't include the quotes), e.g.:

v=DKIM1; h=sha256; k=rsa; s=email; p=ABCDEFG.....

Note: I chose to update DNS last, as once you update DNS, any servers that receive your mail before you apply the previous configuration may discard your emails. Then again, you didn't have DKIM before, so you were probably going to junk mail anyway ;^)
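
Once the record has propagated, you can verify both the DNS entry and the key from the mail server itself. This is a quick sketch assuming the default selector and key path used earlier (dig comes from the dnsutils package if it isn't already installed):

# Confirm the TXT record is visible in DNS
dig +short TXT default._domainkey.mydomain.com
# Validate the published record against the private key
sudo opendkim-testkey -d mydomain.com -s default -k /etc/opendkim/keys/mydomain.com/default.private -vvv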

Credits

Shoutout to Diego on stackoverflow, edoceo, and suenotek for consolidating a lot of these steps:
postfix - Using DKIM in my server for multiple domains (websites) - Ask Ubuntu

How To: Installing and Configuring OpenDKIM for multiple domains with Postfix on Linux | Edoceo

Roundcube mail app and SPF, DKIM & DMARC on Ubuntu 20.04 (suenotek.com)

How to generate a root certificate and create a self-signed server certificate issued from the root

This is going to be a quick tutorial on generating a root certificate and a server certificate and bundling them together in a PFX file. This can be useful to validate scenarios where a certificate chain is required. For this tutorial, we'll be using the openssl utility, which can be freely downloaded here: Win32/Win64 OpenSSL Installer for Windows - Shining Light Productions (slproweb.com)

Generate the Root Certificate

Execute the following command to generate a key for the root certificate:

openssl ecparam -out root.key -name prime256v1 -genkey

Execute the following command to generate a certificate signing request. Note: During this step, you will be prompted to specify several certificate attributes; for the common name, you can specify the name you'd like to appear as the issuer (e.g. MyCorp).

openssl req -new -sha256 -key root.key -out root.csr

Execute the following to generate the public certificate. During this step, you'll specify the validity of the root certificate (you will likely want this to be longer than 365 days for a root; the example below uses 3650 days).

openssl x509 -req -sha256 -days 3650 -in root.csr -signkey root.key -out root.crt
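
If you want to sanity-check the root certificate before moving on, you can print its subject and validity dates:

openssl x509 -in root.crt -noout -subject -dates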

Generate the Server Certificate

Execute the following command to generate a private key for the server certificate:

openssl ecparam -out server-cert.key -name prime256v1 -genkey

Execute the following command to generate a certificate signing request. Note: During this step, you will be prompted to specify several certificate attributes; for the common name, specify the FQDN of your server. You do not need to start the value of the common name with CN=

openssl req -new -sha256 -key server-cert.key -out server-cert.csr

Execute the following command to generate the public certificate for the server certificate. During this step, you'll specify the validity of the server certificate. Generally speaking, the validity of this certificate would be much shorter than your root.

openssl x509 -req -in server-cert.csr -CA root.crt -CAkey root.key -CAcreateserial -out server-cert.crt -days 365 -sha256

Verify certificate chain

Optionally, you can verify that the issuer and expiry dates of the server certificate are correct via the following command:

openssl x509 -in server-cert.crt -text -noout
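
You can also verify the full chain, i.e. that the server certificate was actually issued by your root; this should print "server-cert.crt: OK":

openssl verify -CAfile root.crt server-cert.crt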

Generate PFX from Root and Server certificate

Execute the following command to generate a PFX file containing the public and private keys of the server certificate as well as the public key of the root certificate. Note: you will be prompted for a password for the PFX file, which adds protection when you need to move these sensitive files around.

openssl pkcs12 -export -out mycert.pfx -inkey server-cert.key -in server-cert.crt -certfile root.crt
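
As a final sanity check, you can confirm the PFX opens with the password you set and inspect its structure (the -noout flag suppresses dumping the keys and certificates themselves):

openssl pkcs12 -info -in mycert.pfx -noout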

Using SDRplay RSPduo with RTLSDR-Airband and a Raspberry Pi

One of the side projects I have is rebroadcasting local ATC (Air Traffic Control) audio from my local airport to LiveATC.net. I previously had an RTL-SDR dongle connected to a Raspberry Pi 1 Model B, which then rebroadcast to LiveATC via Icecast. While I've had success broadcasting the past few years, overhead planes were really the only thing that came through clearly; being distant from the airport, the odds of receiving broadcasts from the tower were slim to none at best.

In doing a bit of research, I settled on purchasing an SDRplay RSPduo and a Raspberry Pi 4, which seems to help with noise. Pairing the SDRplay with the newest version of RTLSDR-Airband, I was able to achieve much clearer audio and hear things I couldn't before. While I'm using the SDRplay RSPduo, this guide can be used for their other devices such as the RSP1A and RSPdx as well (and likely others as this guide ages). Here's a reflection on how I got things set up.

Update Raspbian packages

First, update your Linux packages to the latest versions. I'm running the latest version of Raspbian / Debian.

sudo apt-get update && sudo apt-get upgrade

Disable WiFi/Bluetooth

This is optional, but I figured I'd disable the radios on the Raspberry Pi to mitigate as much noise as possible. You can disable both radios by editing /boot/config.txt via the vi text editor (this can also be configured by placing the file on the SD card you attach to your Raspberry Pi during first-time boot). Official details on the boot overlays can be found here.

sudo vi /boot/config.txt

Once in vi, press i to insert the following lines:

dtoverlay=disable-bt
dtoverlay=disable-wifi

Press the escape key and then type :wq to write the changes to the file and exit vi.

Lastly, execute the following command to disable the UART bluetooth service.

sudo systemctl disable hciuart

Download & Install RSP Control Library + Driver

First, you will want to grab the latest SDRplay drivers and libraries. You can do this by navigating to SDRplay's website, selecting RSPduo and ARM Raspberry Pi OS for the download, and then clicking the API button. This is a bit difficult if you are SSHed into the Pi, so I'd find the latest version on their website and then use the commands below to download the software remotely (substituting in the version number of the latest download), install it, and reboot afterwards (rebooting after installation is strongly recommended).

Execute the following commands:

# Navigate to home directory
cd ~
# Download latest API Library + Driver
wget https://www.sdrplay.com/software/SDRplay_RSP_API-ARM32-3.07.2.run
# Provide execution rights to install the software
chmod 755 ./SDRplay_RSP_API-ARM32-3.07.2.run
# Run the installer
./SDRplay_RSP_API-ARM32-3.07.2.run
# Reboot the machine
sudo reboot now

Build and install SoapySDR from source

In this section, we need to install SoapySDR which is a vendor and platform neutral SDR support library. Essentially this means that instead of needing a bunch of developers to write integrations into all the different SDRs, other software can leverage these interfaces to skip worrying about device compatibility and focus on what the application needs to do. As we'll see later, RTLSDR-Airband does exactly this to provide support for tons of different SDRs. Kudos to the PothosWare team for enabling developers all over the world to build all sorts of SDR projects!

So, to get this installed, we need to clone the source code from their GitHub repo and compile the project. Official documentation on this process can be found on their wiki, but I'm going to try and simplify everything here.

Since Raspberry Pi OS doesn't come with Git, I am going to use wget and tar to do this, but if you don't mind installing Git, that'd be the easier way to "clone" down the latest source code from GitHub (make sure you replace versions where appropriate; at the time of writing, 0.8.0 is the latest version).

# Install dependencies needed to build this project
sudo apt-get install cmake g++ libpython-dev python-numpy swig
# Make sure we are back in our home directory
cd ~
# Grab latest tarball from GitHub
wget -O soapy-sdr-0.8.0.tar.gz https://github.com/pothosware/SoapySDR/archive/soapy-sdr-0.8.0.tar.gz
# Extract the tarball (this is like unzipping a .zip on Windows)
tar xvfz soapy-sdr-0.8.0.tar.gz

# Change directories into the extracted SoapySDR folder
cd SoapySDR-soapy-sdr-0.8.0
# Make a new folder called build
mkdir build

# Change directories into the build folder
cd build

# Execute cmake build automation
cmake ..

# Build the project (the -j4 parameter increases build threads to make compilation quicker)
make -j4
# Install the compiled files to the right locations
sudo make install
sudo ldconfig # needed on Debian systems
# Navigate back to home directory
cd ~
# Delete the extracted SoapySDR folder since we are done with it
rm -R SoapySDR-soapy-sdr-0.8.0

At this point, you should be able to execute the SoapySDRUtil command and see the version you installed.

SoapySDRUtil --info

The output should report the library, API, and ABI versions you just installed.

Build and install SoapySDR Play module from source

Now that we have SoapySDR installed, we need to install the module that allows it to control the SDRplay device. Similar to the SoapySDR install, we'll pull down the latest files from the SoapySDRPlay module GitHub repo, build the project, install it, and verify that all went well. Official instructions can be found on their wiki as well.

# Make sure we are back in our home directory
cd ~
# Grab latest tarball from GitHub
wget -O SoapySDRPlay.zip https://github.com/pothosware/SoapySDRPlay3/archive/refs/heads/master.zip
# Unzip the archive
unzip SoapySDRPlay.zip
# Change directories into the new SoapySDRPlay3 folder
cd SoapySDRPlay3-master
# Make a new folder called build
mkdir build
# Change directories into the build folder
cd build
# Execute cmake build automation
cmake ..
# Build the project
make
# Install the compiled files to the right locations
sudo make install
sudo ldconfig # needed on Debian systems
# Navigate back to home directory
cd ~
# Delete the SoapySDRPlay3 folder since we are done with it
rm -R SoapySDRPlay3-master

Plug in SDRplay RSPduo device and verify we see it

If you haven't already, go ahead and plug in your SDRplay RSPduo. Next, let's verify we see it using the SoapySDRUtil command.

SoapySDRUtil --probe="driver=sdrplay"

You should see output like the example below, including your device and hardware version (note the hardware= value, as you may need it later). In addition, one thing that is neat about the RSPduo is that it has multiple tuners/antennas; you will see these values in the probe output. Once RTLSDR-Airband is running, you'll notice active antennas are removed from the list of available devices.


pi@raspberrypi:~/RTLSDR-Airband-3.2.1 $ SoapySDRUtil --probe="driver=sdrplay"
######################################################
##     Soapy SDR -- the SDR abstraction library     ##
######################################################

Probe device driver=sdrplay
[INFO] devIdx: 0
[INFO] hwVer: 3
[INFO] rspDuoMode: 1
[INFO] tuner: 1
[INFO] rspDuoSampleFreq: 0.000000

----------------------------------------------------
-- Device identification
----------------------------------------------------
  driver=SDRplay
  hardware=RSPduo
  sdrplay_api_api_version=3.070000
  sdrplay_api_hw_version=3

----------------------------------------------------
-- Peripheral summary
----------------------------------------------------
  Channels: 1 Rx, 0 Tx
  Timestamps: NO
  Other Settings:
     * RF Gain Select - RF Gain Select
       [key=rfgain_sel, default=4, type=string, options=(0, 1, 2, 3, 4, 5, 6, 7, 8, 9)]
     * IQ Correction - IQ Correction Control
       [key=iqcorr_ctrl, default=true, type=bool]
     * AGC Setpoint - AGC Setpoint (dBfs)
       [key=agc_setpoint, default=-30, type=int, range=[-60, 0]]
     * ExtRef Enable - External Reference Control
       [key=extref_ctrl, default=true, type=bool]
     * BiasT Enable - BiasT Control
       [key=biasT_ctrl, default=true, type=bool]
     * RfNotch Enable - RF Notch Filter Control
       [key=rfnotch_ctrl, default=true, type=bool]
     * DabNotch Enable - DAB Notch Filter Control
       [key=dabnotch_ctrl, default=true, type=bool]

----------------------------------------------------
-- RX Channel 0
----------------------------------------------------
  Full-duplex: NO
  Supports AGC: YES
  Stream formats: CS16, CF32
  Native format: CS16 [full-scale=32767]
  Antennas: Tuner 1 50 ohm, Tuner 1 Hi-Z, Tuner 2 50 ohm
  Corrections: DC removal
  Full gain range: [0, 48] dB
    IFGR gain range: [20, 59] dB
    RFGR gain range: [0, 9] dB
  Full freq range: [0.001, 2000] MHz
    RF freq range: [0.001, 2000] MHz
    CORR freq range:  MHz
  Sample rates: 0.0625, 0.096, 0.125, 0.192, 0.25, ..., 6, 7, 8, 9, 10 MSps
  Filter bandwidths: 0.2, 0.3, 0.6, 1.536, 5, 6, 7, 8 MHz

Build and install RTLSDR-Airband from source

RTLSDR-Airband is an open source project that allows you to receive analog radio voice channels and produce audio streams which can be routed to various outputs, such as an Icecast server, a PulseAudio server, an audio file, or a raw I/Q file. In our case, we are going to stream to an Icecast server.

Similar to the SoapySDR section, we need to download the latest source code, build and install RTLSDR-Airband, and then modify the configuration file. Official documentation can be found on the RTLSDR-Airband GitHub wiki.

# Install RTLSDR-Airband dependencies
sudo apt-get install build-essential libmp3lame-dev libshout3-dev libconfig++-dev libfftw3-dev
# Navigate back to our home directory
cd ~
# Download the latest source from GitHub
wget -O RTLSDR-Airband-3.2.1.tar.gz https://github.com/szpajder/RTLSDR-Airband/archive/v3.2.1.tar.gz
# Extract the tarball
tar xvfz RTLSDR-Airband-3.2.1.tar.gz
# Change directory into the RTLSDR-Airband folder
cd RTLSDR-Airband-3.2.1
# Build the program; this is specific to armv7 (32-bit Raspberry Pi) with SoapySDR support.
# This removes RTLSDR support to avoid another dependency install (WITH_RTLSDR=0)
make PLATFORM=armv7-generic WITH_RTLSDR=0 WITH_SOAPYSDR=1
# Install the program
sudo make install

Configure RTLSDR-Airband

For my particular setup, I want to stream to an external Icecast server. Before changing anything, I recommend making a backup of the default configuration file.

# Rename the original config file as a backup
sudo mv /usr/local/etc/rtl_airband.conf /usr/local/etc/rtl_airband.conf.bak

Next, we can create a new configuration file with the proper configuration. Execute the following command to open vi.

sudo vi /usr/local/etc/rtl_airband.conf

Press i to go into insert mode and paste the following (replacing the values applicable to your environment; you may want to change the name of the stream, the authentication parameters, and the gain). Also, note that we are using the first antenna and specifying the hardware version of RSPduo from the previous step where we probed the SDRplay device (if you have a different SDRplay device, substitute that value accordingly).

# Configure IceCast Stream
devices:
({
  type = "soapysdr";
  index = 0;
  device_string = "driver=sdrplay,hardware=RSPduo";
  channel = 0;
  gain = 35;
  correction = 1;
  antenna = "Tuner 1 50 ohm";
  mode = "scan";
  channels: ( 
    {
      freqs = ( 133.8 );
      outputs: ( 
          {
             type = "icecast";
             server = "audio-in.myicecastserver.net";
             port = 8010;
             mountpoint = "station";
             username = "username";
             password = "mypassword";
             name = "Tower";
             description = "Tower - 133.8Mhz";
             genre = "ATC";
          }
       );
    }
 );
});

Type :wq to write and save the changes to the file.

Validate RTLSDR-Airband Configuration

Once you have your configuration, you can validate everything is ready to go by running RTLSDR-Airband in foreground mode.

From their wiki: you will see simple text waterfalls, one per configured channel. This is an example for three devices running in multichannel mode. The meaning of the fields is as follows:

User interface screenshot
  • The number at the top of each waterfall is the channel frequency. When running in scan mode, this will be the first one from the list of frequencies to scan.
  • The number before the forward slash is the current signal level
  • The number after the forward slash is the current noise level estimate.
  • If there is an asterisk * after the second number, it means the squelch is currently open.
  • If there is a > or < character after the second number, it means AFC has been configured and is currently correcting the frequency in the respective direction.

Execute the following command to start running in foreground mode:

# Test in foreground mode
/usr/local/bin/rtl_airband -f

Press Ctrl+C to break out of the stream once you are satisfied with your testing.

Enable RTLSDR-Airband to autostart

To enable RTLSDR-Airband to automatically start up each time your Raspberry Pi is rebooted, you can execute the following commands from within the RTLSDR-Airband directory.

sudo cp init.d/rtl_airband.service /etc/systemd/system
sudo chown root:root /etc/systemd/system/rtl_airband.service
sudo systemctl daemon-reload
sudo systemctl enable rtl_airband
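
If you want the stream up right away rather than waiting for the next reboot, you can also start the service now and confirm it stays running:

sudo systemctl start rtl_airband
systemctl status rtl_airband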

Hurray! We are done!

If you made it this far you have completed all the steps! Enjoy your new streaming SDR solution!

Controlling a Haiku fan with a wall switch

TLDR: I wanted to control the light on Big Ass Fans' Haiku fan via a physical wall switch, so this tutorial is going to go over how to pair a smart switch with the Home Assistant software to provide a traditional light switch experience. Skip down to Setting up the wall switch if you want to skip my ramblings.

Here's a YouTube video if you don't like to read:

Longer story

In the background of many commercial buildings, silently lurking and judging us from above, lies what looks like a recycled helicopter blade. Don't be fooled: these blades are no helicopter blades, they are years of engineering excellence in the making. The company prides itself on solid engineering and building a solid product for its customers. They are called Big Ass Fans.

For quite some time I've been eyeing their Haiku fan, which is their residential ceiling fan. Their fans look incredibly modern, operate almost completely silently, have a "SenseME" feature that figures out when people are in the room to automatically do stuff, and they have an API you can integrate with locally on the fan via WiFi (lose internet, no problem, you can still control your fan!). One of my biggest "beefs" with today's companies is that they try to make things really proprietary and crappy, so seeing a company that takes pride in its product and lets others integrate with it without internet is super "cool" 😉

You spin me right round

The fan itself is smart... too smart

One thing that's really interesting is that, when you hook up the fan, it's designed more like their commercial units, where it needs to be constantly powered on; from there, you control the fan remotely either by remote or by their smartphone application. Both the remote and the mobile app work incredibly well and are extremely responsive. The only tricky thing is that in a residential setting, many folks have light/fan combos in their bedrooms, offices, and living rooms, and if a guest walks into the room and flips the light switch, they are cutting power to the whole fan/light.

So....?

In many commercial settings, you typically set fans once and let 'em rip, but with the residential play you have grandparents, guests, friends, etc. that may come over. Since the remote is there, they ask: how do I turn the lights on in this room? Unfortunately, there isn't a good answer here other than to put a plate on the wall and force your guests to figure out the remote.

I personally find the remote a hassle since I have a small little corridor into one room; when it's darker in the evenings, you grab the remote off the wall, walk through this dark area, and then aim the remote somewhere at the ceiling to turn the light on (that is, if you didn't forget the remote in the room from before).

So...?

I am a "big fan" of having a smart home, but I want it to be super intuitive to the end user. I design everything to be usable as if my grandparents were over and had no idea what the heck is going on. In this case, I leveraged an open source project called Home Assistant and a Leviton Z-Wave switch to do the magic of controlling the fan like any other fan you'd buy at a big box store. More specifically, I really just needed to control the light on the fan, so this tutorial is going to go over how to control the fan's light via the switch.

Setting up the wall switch

The first thing you'll need is a smart switch. It can be WiFi, Z-Wave, ZigBee, etc.; it doesn't specifically matter what brand (odds are it'll be compatible; here's the official list), but you'll need a switch that allows you to control it via a computer or your phone. I used a dimmer switch specifically, as the light on the Haiku allows several different levels of brightness.

Once you have the switch, you'll want to wire up the fan so it constantly has power and also give power to the switch. This does two things: 1) it allows the fan to stay powered regardless of whether your guest turns the light switch on/off, and 2) it keeps the light switch powered so you can use it to talk to your fan. Here's an example of how I wired my Leviton Z-Wave switch.

Here you can see I don't have anything connected to the red pin, or load. Typically, you'd have this connect back to the fan's lights to turn them on/off, but the Haiku fan isn't wired like that.

On this side, you can see we only have the negative wire connected. It's hard to see, but in the box, I have all my neutral and negative wires capped together, which offers power to the fan 100% of the time, regardless of what this switch is doing.

Once you have the switch wired up and ready to go, it should literally do nothing when you turn it on/off, but your fan should stay on all the time.

Setting up Home Assistant Automation

This guide won't go into installing / setting up Home Assistant, rather more so around the automation scripts needed to get this all working. If you are interested in learning more about Home Assistant, you can check out their website here and I have a blog post on how to deploy Home Assistant on a Raspberry Pi here.

To get this working, you will need a couple of things:

  • Add your smart switch to Home Assistant
  • Install HACS
  • Install Haiku SenseMe Integration
  • Add two automation scripts
    • One to control light on/off events
    • One to control light brightness events
Add your smart switch

I won't go into details here too much since every switch will have a separate way to install (Z-Wave vs WiFi vs Zigbee for example), but here is a nice YouTube video on how to get things going (https://youtu.be/FtWFSuMdiSQ?t=353).

HACS

If you have used Home Assistant, it comes with many different native integrations out of the box. Unfortunately, many integrations are developed so quickly the HA (Home Assistant) team doesn't have time to vet them all, so they end up being maintained by the community. HACS helps install these integrations, so I'd recommend installing this.

Step-by-Step documentation on installation can be found here: Prerequisites | HACS

Haiku Integration

A few much smarter folks wrote up an integration for the Haiku fan called SenseME, which we need to install. Once HACS is installed, you can search for the SenseME integration via HACS and install it. Copied from their documentation, here is how to set up the integration:

  1. Go to Configuration -> Integrations.
  2. Click on the + ADD INTEGRATION button in the bottom right corner.
  3. Search for and select the SenseME integration.
  4. If any devices are discovered you will see the dialog below. Select a discovered device and click Submit and you are done. If you would prefer to add a device by IP address select that option, click Submit, and you will be presented with the dialog in step 5.
  5. If no devices were discovered or you selected the IP Address option the dialog below is presented. Here you can type in an IP address of undiscoverable devices.
  6. Repeat these steps for each device you wish to add.

Information on the SenseME integration can be found on their GitHub site here: mikelawrence/senseme-hacs: Haiku with SenseME fan integration for Home Assistant (github.com)

Once configuration is completed, you should see an entity for your fan listed that looks something like this.

On/Off Automation

This first automation will control on/off behavior between your light switch and the fan's light.

  1. Go to Configuration -> Automations.
  2. Click on the + ADD Automation button in the bottom right corner.
  3. Click the START WITH AN EMPTY AUTOMATION button
  4. Click on the three dots in the top right corner and click Edit in YAML

5. Paste the following code; make sure you edit the names of each of your light switch entities (one for your fan light and one for the light switch on the wall):
light.your_light (the light for your wall) and light.fan_light (the light on the Haiku fan).

alias: Turn On/Off Haiku Fan/Wall Switch
description: ''
trigger:
  - platform: state
    entity_id: light.your_light, light.fan_light
    from: 'off'
    to: 'on'
  - platform: state
    from: 'on'
    to: 'off'
    entity_id: light.your_light, light.fan_light
condition: []
action:
  - service: light.turn_{{ trigger.to_state.state }}
    data:
      entity_id: |-
        {% if trigger.entity_id == 'light.your_light' %}
          light.fan_light
        {% elif trigger.entity_id == 'light.fan_light' %}
          light.your_light
        {% endif %}
mode: single

6. Click the SAVE button

Brightness Automation

This automation will keep the brightness in sync between your wall switch and the fan's light.

  1. Go to Configuration -> Automations.
  2. Click on the + ADD Automation button in the bottom right corner.
  3. Click the START WITH AN EMPTY AUTOMATION button
  4. Click on the three dots in the top right corner and click Edit in YAML
  5. Paste the following code; make sure you edit the names of each of your light switch entities (one for your fan light and one for the light switch on the wall):
    light.your_light (the light for your wall) and light.fan_light (the light on the Haiku fan).

alias: Sync Haiku Fan/Wall Switch Brightness
description: ''
trigger:
  - platform: state
    entity_id: light.your_light, light.fan_light
    attribute: brightness
    for: '00:00:02'
condition:
  - condition: template
    value_template: '{{ trigger.to_state.attributes.brightness > 0}}'
action:
  - service: light.turn_on
    data:
      brightness: '{{ trigger.to_state.attributes.brightness }}'
      entity_id: |-
        {% if trigger.entity_id == 'light.your_light' %}
          light.fan_light
        {% elif trigger.entity_id == 'light.fan_light' %}
          light.your_light
        {% endif %}
mode: restart

6. Click the SAVE button

Testing!

At this point, whether you use your remote or the light switch, your lights should be in sync! Use the remote or the wall switch to turn the lights on/off. Try using the switch to dim, and it should adjust the brightness of the light (note: there may be a tiny delay after you change the dimmer value, as there's a 2-second delay in the automation, which prevents the lights from going wonky).

Conclusion

Through the use of Home Assistant plus any smart switch, we can easily control the Haiku fan with physical knobs and dials. While this tutorial only covers controlling the fan's light via a switch, the same principles can be used to add a second switch to control the fan speed.

For those that like physical knobs and dials to control your devices, hope this was helpful!

If you are thinking of buying a Big Ass Fan, please consider using my referral code so I can use them towards future reviews! https://bigassfans.referralrock.com/l/1JACKSTROMB57/

[Tutorial] How to create a bootable USB Drive to flash a Lenovo device's BIOS

This tutorial will review how to create a bootable USB drive to flash the firmware/BIOS on your Lenovo device.

Before we begin, Lenovo offers three different downloads for Firmware today:

  1. Windows installer/flash utility (.exe)
  2. CD ISO version (.iso) to burn to a disk
  3. USB Flash Package (.zip)

While the USB Flash Package (.zip) is exactly what we are looking for, by default if you just drag the files onto your USB drive, it won't boot to the flash utility. The instructions below will show you how to make the drive bootable and then launch the USB Flash Package.

Make a bootable drive

First, you will want to download a copy of the Rufus utility. Rufus is an open source, Windows-only utility that allows you to make a bootable USB drive. You can obtain a copy from the official Rufus website: https://rufus.ie/

Once installed, open the application. Select the USB device you wish to flash (note: this will erase all data on the device) and set the Boot selection to FreeDOS. Once your Device and Boot selection have been set, go ahead and click Start to flash the device.

You will be prompted to confirm you are OK with erasing the device. Go ahead and click OK if you are sure you have selected the correct device in the prior step.

Once completed, you should see a green bar that says READY. This is kind of misleading (I wish it said "completed"), but your drive should be flashed at this point.

Download the right firmware from Lenovo

As mentioned earlier, Lenovo offers 3 different types of downloads on their website. You will want a copy of the zipped installer as shown in the screenshot below.

Once downloaded, navigate to where you downloaded the zipped file, right-click, and select Extract All... If you don't see Extract All..., try downloading a copy of 7-Zip, which is a fantastic free archiver that can open all types of compressed files (zip, 7z, tar.gz, etc.).

In this picture, we show right clicking the zipped folder and clicking Extract All... on the file.
In this picture, we are selecting the folder to where the extracted files should go.

Copy the extracted files to your bootable USB drive

Once you have extracted the files from the zipped folder from Lenovo, you will want to copy and paste the files from the extracted directory to the bootable USB drive. To show visually, I opened two file explorer windows, one in the directory of the extracted firmware and the other on the bootable USB drive. I simply dragged and dropped the files from the firmware directory to the bootable USB drive.

When you try to copy the files from the firmware directory to the bootable USB drive, you will be prompted to replace AUTOEXEC.BAT. Make sure to Replace the file in the destination as this will execute the command to launch the flash2 utility, which actually writes the firmware to the device.

Plug in the drive and set the device to boot to it

At this point, you should have a bootable USB device that you can now plug in to your Lenovo device. Unplug it from your client machine and plug it into your Lenovo device. Make sure you set your Lenovo device to boot from the USB drive (this can usually be set by pressing the F1 or F2 keys during the POST screen).

What to expect

Upon boot, you should be greeted by the Lenovo flash utility, which will ask if you want to update your device. Please note that in my experience, once I selected yes, the device needed to reboot several times and may boot into the BIOS. The utility will tell you when everything is completed, so don't power down your device or unplug your USB drive after the first or second reboot; make sure you wait things out. As with updating any firmware, don't do this during a storm or on a device with a low battery, as ensuring as little chance of disruption as possible is absolutely critical.

Summary

At this point, you should have a bootable USB drive created by Rufus and FreeDOS that can be paired with Lenovo's firmware to go around and flash your devices. Hope this helps!

DPM 2016 - Reporting Services Server cannot connect to DPM Database

After installing System Center Data Protection Manager from scratch, or after performing an upgrade from DPM 2012 R2, you may receive the following popup error when you attempt to schedule a report to be mailed, or notice that when you generate a report you just get a white page in your browser that continues to load indefinitely:


Reporting Services Server cannot connect to the DPM database.
To repair the configuration, follow steps for repairing DPM from DPM Setup Help.
ID: 3001
More Information
Just an endless loading page presented in your browser.

Unfortunately, following the repair steps suggested in the More Information link does not resolve the problem.

Resolution for SQL Server 2016 and later

  1. On the DPM server, open Computer Management, expand Local Users and Groups, click Groups, and create a new local group with the following information (replace DPMSERVERNAME with the actual server name/hostname of your machine; a PowerShell sketch covering steps 1 through 3 follows this list):
    1. Group name: DPMDBReaders$DPMSERVERNAME
    2. Description: This group is internally used by Microsoft System Center 2016 Data Protection Manager.
  2. On the DPM server, open Computer Management, expand Local Users and Groups, click Users, and create a new local user with the following information (again, replace DPMSERVERNAME with the actual server name/hostname of your machine):
    1. User name: DPMR$DPMSERVERNAME
    2. Full name: DPMR$DPMSERVERNAME
    3. Description: This account is used for SQL reporting to generate reports for DPM 2016.
    4. Enter a strong password
    5. Check Password never expires
  3. Select your recently created DPMR$DPMSERVERNAME account and click on the Member Of tab
    1. Add the DPMDBReaders$DPMSERVERNAME group we created in step 1
    2. Click OK
  4. Start Microsoft SQL Server Management Studio and connect to the SQL instance used by DPM.
    1. Expand Security, right-click on the Logins, select New login
      1. Click on the Search... button and add the local group DPMDBReaders$DPMSERVERNAME
      2. Set the Default database to YourDPMDatabase

      3. Click on the User Mapping section, check the checkbox for YourDPMDatabase, and check the checkbox for the db_datareader role.
      4. Click OK
  5. In Microsoft SQL Server Management Studio, expand Databases, expand YourDPMDatabase, expand Security, expand Users, and right click Properties on the DPMDBReaders$DPMSERVERNAME group you granted access to in step 4.
    1. Click on Securables
    2. Click the Search... button
    3. Select Specific objects... and click OK
    4. Click the Object Types... button
    5. Check Stored procedures and click OK
    6. Click the Browse... button
    7. Check [dbo].[prc_MOM_Heartbeat_Get] and [dbo].[prc_MOM_ProductionServerGet] and click OK
    8. Click OK on the Select objects dialog box
    9. Place a checkbox in the Grant column for the Execute row.
      1. Make sure you do this step for both [dbo].[prc_MOM_Heartbeat_Get] and [dbo].[prc_MOM_ProductionServerGet]; checking the box once will only update one of the stored procedures.
    10. Click OK
  6. Exit Microsoft SQL Server Management Studio.
  7. Open Reporting Services Configuration Manager and connect to the SQL Server and instance hosting the DPM reports (as you go through this, substitute your applicable values for the example names).
    1. Click on the Web Portal URL menu item and click on the listed URL for DPM.
    2. Click on the DPMReports_GUID folder to open the DPM reports page.
    3. Click on the DPMReporterDataSource Data Source to open its properties.
    4. Under Credentials, use the following configuration:
      1. Select the Using the following credentials radio button
      2. Type of credentials: Windows user name and password
      3. User name: DPMR$DPMSERVERNAME
      4. Password: EnterYourPasswordToTheAccount
      5. Ensure Log in using these credentials, but then try to impersonate the user viewing the report is unchecked
      6. Click the Test connection button
        1. Ensure it says Connected successfully

      7. Click Apply
    5. Close out of your web browser
  8. Back in the Reporting Services Configuration Manager window, complete the following steps
    1. Select the Service Account menu item
    2. Ensure Use built-in account is set to Network Service
      1. Check the Use built-in account radio button
      2. Set the account to Network Service
      3. Click Apply
      4. You will be prompted to save an encryption key. Save the key to location of your choosing, and type a password to be used to encrypt the file. Click OK
      5. You will then be prompted to specify an account with administrator privileges. Click OK to use your current account.
  9. Reboot the DPM Server
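
For those who prefer the command line, here is a rough PowerShell sketch of steps 1 through 3 using the built-in LocalAccounts cmdlets (Windows Server 2016 / PowerShell 5.1); the names mirror the GUI steps above, and you will be prompted for the account password:

# Sketch only: create the DPM reader group and reporting user, then add the user to the group
$server = $env:COMPUTERNAME
$password = Read-Host -AsSecureString -Prompt "Password for DPMR`$$server"
New-LocalGroup -Name "DPMDBReaders`$$server" -Description "This group is internally used by Microsoft System Center 2016 Data Protection Manager."
New-LocalUser -Name "DPMR`$$server" -FullName "DPMR`$$server" -Description "This account is used for SQL reporting to generate reports for DPM 2016." -Password $password -PasswordNeverExpires
Add-LocalGroupMember -Group "DPMDBReaders`$$server" -Member "DPMR`$$server"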

At this point, you should now be able to schedule e-mail reports without experiencing the original error and your reports should load properly!

DPM 2016 - Anonymous / Open Relay for SMTP Notifications

DPM 2016 is primarily geared towards using mail servers that require authentication (rightfully so; that's a security best practice). However, many IT organizations have local mail relay servers with anonymous authentication that are used for several IT services in the organization. Unfortunately, DPM 2016 gets a bit wonky using unauthenticated mail servers and will likely give you a generic error that says:

Error ID: 2013
Details: The user name or password is incorrect

And if you ignore the error and head over to the notifications tab to configure a notification, you will be presented with another generic error:

An authentication error occurred when trying to connect to the SMTP server. (ID: 518)

You typed an incorrect user name, password, or SMTP server name. Type the correct user name or password to enable e-mail delivery of reports and alert notifications.

And if you are trying to configure scheduled emails you may receive an error about reporting services:

DPM Setup is unable to update the report server configuration to configure e-mail settings. (ID: 3040).

One thing to do before getting too far ahead, though, is to validate that you can send an email from the DPM server. This can easily be done via PowerShell by executing the following command:

Send-MailMessage -SmtpServer localhost -To [email protected] -From [email protected] -Subject "Test Email from DPM Server" -Body "Howdy! This is a test from the DPM Server. If you see this, mail relay is working!"

When executing the PowerShell command, it won't return anything, but you should hopefully see a message in your mailbox. If you do, you've at least ruled out network/mail issues.

Once you've ruled out connectivity/the mail server, we will complete the following steps below to configure DPM.

  1. Configure E-mail for SQL Server Reporting Services
  2. Create a Local User Account
  3. Remove any artifacts left in the registry
  4. Update the SMTP settings in DPM.

Configuration

  1. Configure SQL Server Reporting Services
    1. Open Reporting Services Configuration Manager
    2. Sign into your DPM instance
    3. Select E-mail Settings and leverage the following configuration
      1. Sender Address: [email protected]
      2. SMTP Server: emailserver.yourdomain.com
      3. Authentication: No authentication

    4. Click Apply
  2. Create a local user account
    1. Open Computer Management, expand Local Users and Groups, select Users, and Create a new local user on the machine
      1. Create the user (I used anonemail as the account name, but anything can be specified)
      2. Remove all group membership
        1. This account doesn't need to be a part of any group, including the Users group
        2. This account should not be a part of administrators (I've seen other blog posts mention you must use administrator, that is 100% not necessary and can be considered a security risk)
      3. Ensure the account is enabled
        1. A disabled account will not work
  3. Cleanup the registry
    1. Open Registry Editor (regedit.exe)
    2. Navigate to HKEY_LOCAL_MACHINE\Software\Microsoft\Microsoft Data Protection Manager\Notification
    3. Delete the following values (if they exist); see the PowerShell sketch after this list:
      1. SmtpUserName
      2. SmtpPassword
  4. Reboot the DPM Server
    1. Technically, you could restart two services (the SQL Server Reporting Services instance for DPM and the DPM service), but a reboot never hurts 😉
  5. Configure DPM to use SMTP relay
    1. Close out of the DPM console and reopen it
    2. Select Reporting, wait for the screen to finish loading, and then select Action, Options
    3. Select the SMTP Server tab and enter the following
      1. SMTP server name: relayserver.mydomain.com
      2. SMTP server port: 25
      3. "From" Address: [email protected]
      4. Username: .\localuserwecreatedearlier
        1. Ensure you have .\ to designate the user is local
      5. Password: LocalUserAccountPassword

    4. Click the Send Test E-Mail button and specify an email address to send a test email to validate all is well
    5. Success!
    6. Click OK on the Options window to save your settings
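
For reference, here is a rough PowerShell equivalent of the registry cleanup in step 3, assuming SmtpUserName and SmtpPassword are values under the Notification key as the regedit steps suggest; run it from an elevated prompt:

# Sketch only: remove leftover SMTP credential values (silently skips any that don't exist)
$key = 'HKLM:\SOFTWARE\Microsoft\Microsoft Data Protection Manager\Notification'
Remove-ItemProperty -Path $key -Name 'SmtpUserName', 'SmtpPassword' -ErrorAction SilentlyContinue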

At this point, you should be able to relay emails through your open relay as well as schedule emails for reports without error.

DPM 2016 Installation - Error ID: 4387

When installing DPM 2016, you may get a really generic error during the "Prerequisites check" portion of installation. Looking online, there are a ton of individuals that have this issue, but no one correlates the log files to what specifically is needed to solve each problem (yep, "each" problem; Error 4387 is a generic catch-all for several issues during the prerequisites check).

Before I get into the article, the too long; didn't read (TLDR) version is: make sure you are using both SQL Server 2016 (no service pack) and SSMS 16.5 or earlier to successfully install DPM 2016.

To get a bit more technical and find out what's going on, open up the DPM Installation logs after you receive the error. The installation log files can be found by browsing to %ProgramFiles%\Microsoft System Center 2016\DPM\DPMLogs.  Documentation on where log files are stored by DPM can be found here: https://docs.microsoft.com/en-us/system-center/dpm/set-up-dpm-logging?view=sc-dpm-2016

Here's a copy of my DpmSetup.log file; looking through it, there isn't a clear-cut answer, just this generic line at the bottom ([3/1/2019 6:22:54 AM] *** Error : CurrentDomain_UnhandledException).

[3/1/2019 6:21:16 AM] Information : Microsoft System Center 2016 Data Protection Manager setup started.
[3/1/2019 6:21:16 AM] Data : Mode of setup = User interface
[3/1/2019 6:21:16 AM] Data : OSVersion = Microsoft Windows NT 10.0.14393.0
[3/1/2019 6:21:16 AM] Information : Check if the media is removable
[3/1/2019 6:21:16 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM
[3/1/2019 6:21:16 AM] Data : Drive Name = C:\
[3/1/2019 6:21:16 AM] Data : Drive Type = 3
[3/1/2019 6:21:16 AM] Information : Check attributes of the directory
[3/1/2019 6:21:16 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM
[3/1/2019 6:21:16 AM] Data : File Attributes = Directory
[3/1/2019 6:21:16 AM] Information : Check if the media is removable
[3/1/2019 6:21:16 AM] Data : Folder Path = C:\Program Files\Microsoft Data Protection Manager
[3/1/2019 6:21:16 AM] Data : Drive Name = C:\
[3/1/2019 6:21:16 AM] Data : Drive Type = 3
[3/1/2019 6:21:16 AM] Information : Check attributes of the directory
[3/1/2019 6:21:16 AM] Data : Folder Path = C:\Program Files\Microsoft Data Protection Manager
[3/1/2019 6:21:16 AM] * Exception : Ignoring the following exception intentionally => System.IO.FileNotFoundException: Could not find file 'C:\Program Files\Microsoft Data Protection Manager'.
File name: 'C:\Program Files\Microsoft Data Protection Manager'
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.GetAttributes(String path)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.InstallLocationValidation.CheckForDirectoryAttributes(String path)
[3/1/2019 6:21:16 AM] Information : Check if the media is removable
[3/1/2019 6:21:16 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM\DPM\DPMDB
[3/1/2019 6:21:16 AM] Data : Drive Name = C:\
[3/1/2019 6:21:16 AM] Data : Drive Type = 3
[3/1/2019 6:21:16 AM] Information : Check attributes of the directory
[3/1/2019 6:21:16 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM\DPM\DPMDB
[3/1/2019 6:21:16 AM] * Exception : Ignoring the following exception intentionally => System.IO.DirectoryNotFoundException: Could not find a part of the path 'C:\Program Files\Microsoft System Center 2016\DPM\DPM\DPMDB'.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.GetAttributes(String path)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.InstallLocationValidation.CheckForDirectoryAttributes(String path)
[3/1/2019 6:21:17 AM] Information : The setup wizard is initialized.
[3/1/2019 6:21:17 AM] Information : Starting the setup wizard.
[3/1/2019 6:21:17 AM] Information : <<< Dialog >>> Welcome Page : Entering
[3/1/2019 6:22:33 AM] Information : <<< Dialog >>> Welcome Page : Leaving
[3/1/2019 6:22:33 AM] Information : <<< Dialog >>> Inspect Page : Entering
[3/1/2019 6:22:41 AM] Information : Query WMI provider for path of configuration file for SQL Server 2008 Reporting Services.
[3/1/2019 6:22:41 AM] Information : Querying WMI Namespace: \\DPM-SERVER\root\Microsoft\SqlServer\ReportServer\RS_DPM\V13\admin for query: SELECT * FROM MSReportServer_ConfigurationSetting WHERE InstanceName='DPM'
[3/1/2019 6:22:42 AM] Data : Path of configuration file for SQL Server 2008 Reporting Services = C:\Program Files\Microsoft SQL Server\MSRS13.DPM\Reporting Services\ReportServer\RSReportServer.config
[3/1/2019 6:22:42 AM] * Exception :  => System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
File name: 'Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Helpers.MiscHelper.IsSqlClustered(String sqlMachineName, String sqlInstanceName)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Helpers.MiscHelper.IsMachineClustered(String sqlMachineName, String sqlInstanceName)

WRN: Assembly binding logging is turned OFF.
To enable assembly bind failure logging, set the registry value [HKLM\Software\Microsoft\Fusion!EnableLog] (DWORD) to 1.
Note: There is some performance penalty associated with assembly bind failure logging.
To turn this feature off, remove the registry value [HKLM\Software\Microsoft\Fusion!EnableLog].

[3/1/2019 6:22:42 AM] * Exception :  => System.Management.ManagementException: Invalid namespace 
   at System.Management.ManagementException.ThrowWithExtendedInfo(ManagementStatus errorCode)
   at System.Management.ManagementScope.InitializeGuts(Object o)
   at System.Management.ManagementScope.Initialize()
   at System.Management.ManagementObjectSearcher.Initialize()
   at System.Management.ManagementObjectSearcher.Get()
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Helpers.WmiHelper.IsMachineClustered(String machineName, String instanceName)
[3/1/2019 6:22:42 AM] Information : OS >= win 8 , enable Dedupe role
[3/1/2019 6:22:53 AM] Information : output : True
.. 
 error : 
[3/1/2019 6:22:53 AM] Data : Path of inspection output xml = C:\Program Files\Microsoft System Center 2016\DPM\DPMLogs\InspectReport.xml
[3/1/2019 6:22:53 AM] Information : Instantiating inspect component.
[3/1/2019 6:22:53 AM] Data : Path of output xml = C:\Program Files\Microsoft System Center 2016\DPM\DPMLogs\InspectReport.xml
[3/1/2019 6:22:53 AM] Information : Deserializing the check XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM8AFA.tmp\DPM2012\Setup\checks.xml
[3/1/2019 6:22:53 AM] Information : Loading the check XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM8AFA.tmp\DPM2012\Setup\checks.xml
[3/1/2019 6:22:54 AM] Information : Deserialising the scenario XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM8AFA.tmp\DPM2012\Setup\scenarios.xml
[3/1/2019 6:22:54 AM] Information : Loading the check XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM8AFA.tmp\DPM2012\Setup\scenarios.xml
[3/1/2019 6:22:54 AM] Information : Getting scenarios for the product: DPM
[3/1/2019 6:22:54 AM] Information : Getting scenarios for DPM
[3/1/2019 6:22:54 AM] Information : Getting scenario for Mode:Install, DbLocation:Remote, SKU:Retail and CCMode:NotApplicable
[3/1/2019 6:22:54 AM] *** Error : Initialize the SQLSetUpHelper Object
[3/1/2019 6:22:54 AM] Information : [SQLSetupHelper.GetWMIReportingNamespace]. Reporting Namespace found. Reporting Namespace : V13
[3/1/2019 6:22:54 AM] Information : [SQLSetupHelper.GetWMISqlServerNamespace]. SQL Namespace found. SQL Namespace : \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13
[3/1/2019 6:22:54 AM] Information : Query WMI provider for SQL Server 2008.
[3/1/2019 6:22:54 AM] Information : Querying WMI Namespace: \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13 for query: Select * from SqlServiceAdvancedProperty where ServiceName='MSSQL$DPM' and PropertyName='Version'
[3/1/2019 6:22:54 AM] Information : SQL Server 2008 R2 SP2 instance DPM is present on this system.
[3/1/2019 6:22:54 AM] Information : Query WMI provider for SQL Server 2008.
[3/1/2019 6:22:54 AM] Information : Querying WMI Namespace: \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13 for query: Select * from SqlServiceAdvancedProperty where ServiceName='MSSQL$DPM' and PropertyName='Version'
[3/1/2019 6:22:54 AM] Information : [SQLSetupHelper.GetSQLDepedency]. Reporting Namespace and SQL namespace for installed SQL server which will be used as DPM DB. Reporting Namespace : \\DPM-SERVER\root\Microsoft\SqlServer\ReportServer\RS_DPM\V13\admin SQL Namespace : \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13
[3/1/2019 6:22:54 AM] Information : Check if SQL Server 2012 Service Pack 1 Tools is installed.
[3/1/2019 6:22:54 AM] Information : [SQLSetupHelper.GetSqlSetupRegKeyPath]. Registry Key path that contains SQL tools location: Software\Microsoft\Microsoft SQL Server\140\Tools\Setup\
[3/1/2019 6:22:54 AM] Information : Inspect.CheckSqlServerTools : MsiQueryProductState returned : INSTALLSTATE_DEFAULT
[3/1/2019 6:22:54 AM] *** Error : CurrentDomain_UnhandledException

Digging some more, I found that DPM seems to also place logs within the %temp% folder. Within this folder, I found that a tmpXXX.xml file was being created each time I ran through the installer and triggered an error. Upon opening the file, I saw the following:

<?xml version="1.0" encoding="utf-16"?>
<WatsonInfo xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <ExceptionList>
    <ExceptionEntry>
      <Exception_x0020_Type_x000D__x000A_>System.ArgumentNullException</Exception_x0020_Type_x000D__x000A_>
      <StackTrace_x000D__x000A_>   at System.Version.Parse(String input)
   at System.Version..ctor(String version)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Inspect.InspectPrerequisites.CheckSqlServerTools(InspectContext context)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Inspect.Inspect.InitializeContext(String sqlMachineName, String sqlInstanceName, String reportingMachineName, String reportingInstanceName, ConnectionOptions wmiSqlConnectionOptions, ConnectionOptions wmiReportingConnectionOptions, Boolean isRemoteDb, Boolean isSqlClustered, List`1 sqlClusterNodes, Boolean isRemoteReporting, String oldSqlMachineName, String oldSqlInstanceName, ProductNameEnum productName, InspectModeEnum inspectMode, Boolean remoteTriggerJob)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Inspect.Inspect..ctor(String reportFilePath, String sqlMachineName, String sqlInstanceName, String reportingMachineName, String reportingInstanceName, ConnectionOptions wmiSqlConnectionOptions, ConnectionOptions wmiReportingConnectionOptions, Boolean isRemoteDb, Boolean isSqlClustered, List`1 sqlClusterNodes, Boolean isRemoteReporting, String oldSqlMachineName, String oldSqlInstanceName, InspectModeEnum inspectMode, InspectSkuEnum inspectSku, ProductNameEnum productName, InspectCCModeEnum ccMode, Boolean remoteTriggerJob)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.BackEnd.InstantiateInspect(String inspectFile, String sqlMachineName, String sqlInstanceName, String reportingMachineName, String reportingInstanceName, ConnectionOptions wmiSqlConnectionOptions, ConnectionOptions wmiReportingConnectionOptions)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.InspectPage.RunInspect()
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.InspectPage.InspectThreadEntry()
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.ThreadHelper.ThreadStart()</StackTrace_x000D__x000A_>
      <message>Value cannot be null.
Parameter name: input</message>
      <targetSite>System.Version Parse(System.String)</targetSite>
    </ExceptionEntry>
  </ExceptionList>
  <UICulture>en-US</UICulture>
  <Culture>en-US</Culture>
  <CLRVersion>4.0.30319.42000</CLRVersion>
  <OSVersion>Microsoft Windows NT 10.0.14393.0</OSVersion>
  <BucketingParametersValue>
    <Other>BC81BBF7</Other>
    <ExceptionPoint>System.Version.Parse</ExceptionPoint>
    <ExceptionName>System.ArgumentNullException</ExceptionName>
    <ModuleVersion>5.0.158.0</ModuleVersion>
    <ModuleName>SetupDpm.exe</ModuleName>
    <ApplicationVersion>5.0.158.0</ApplicationVersion>
    <ApplicationName>SetupDpm</ApplicationName>
  </BucketingParametersValue>
  <Info>Microsoft Data Protection Manager Exception Record</Info>
</WatsonInfo>

Looking through the above stack trace, there are hints that this is related to SQL Server, and in this case a null value is being handed to what looks like a version parser. After reading other posts online, the common advice was to downgrade to SQL Server 2016 RTM.

SQL Server 2016 version numbers: https://support.microsoft.com/en-us/help/3177312/sql-server-2016-build-versions
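
To confirm exactly which build the instance is on before and after the downgrade, you can query it directly. This is just a quick sketch that assumes the local named instance DPM seen in the logs above and that the sqlcmd client tools are installed; compare the ProductVersion value against the build list in the link above:

sqlcmd -S .\DPM -Q "SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('ProductLevel')"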

After downgrading to SQL Server 2016 RTM, I still received Error ID: 4387. This time there were no files within the %temp% directory, but I did find the following in the DPMSetup.log file (within the DPMLogs directory):

[3/8/2019 5:13:09 AM] Information : Microsoft System Center 2016 Data Protection Manager setup started.
[3/8/2019 5:13:09 AM] Data : Mode of setup = User interface
[3/8/2019 5:13:09 AM] Data : OSVersion = Microsoft Windows NT 10.0.14393.0
[3/8/2019 5:13:09 AM] Information : Check if the media is removable
[3/8/2019 5:13:09 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM
[3/8/2019 5:13:09 AM] Data : Drive Name = C:\
[3/8/2019 5:13:09 AM] Data : Drive Type = 3
[3/8/2019 5:13:09 AM] Information : Check attributes of the directory
[3/8/2019 5:13:09 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM
[3/8/2019 5:13:09 AM] Data : File Attributes = Directory
[3/8/2019 5:13:09 AM] Information : Check if the media is removable
[3/8/2019 5:13:09 AM] Data : Folder Path = C:\Program Files\Microsoft Data Protection Manager
[3/8/2019 5:13:09 AM] Data : Drive Name = C:\
[3/8/2019 5:13:09 AM] Data : Drive Type = 3
[3/8/2019 5:13:09 AM] Information : Check attributes of the directory
[3/8/2019 5:13:09 AM] Data : Folder Path = C:\Program Files\Microsoft Data Protection Manager
[3/8/2019 5:13:09 AM] * Exception : Ignoring the following exception intentionally => System.IO.FileNotFoundException: Could not find file 'C:\Program Files\Microsoft Data Protection Manager'.
File name: 'C:\Program Files\Microsoft Data Protection Manager'
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.GetAttributes(String path)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.InstallLocationValidation.CheckForDirectoryAttributes(String path)
[3/8/2019 5:13:09 AM] Information : Check if the media is removable
[3/8/2019 5:13:09 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM\DPM\DPMDB
[3/8/2019 5:13:09 AM] Data : Drive Name = C:\
[3/8/2019 5:13:09 AM] Data : Drive Type = 3
[3/8/2019 5:13:09 AM] Information : Check attributes of the directory
[3/8/2019 5:13:09 AM] Data : Folder Path = C:\Program Files\Microsoft System Center 2016\DPM\DPM\DPMDB
[3/8/2019 5:13:09 AM] * Exception : Ignoring the following exception intentionally => System.IO.DirectoryNotFoundException: Could not find a part of the path 'C:\Program Files\Microsoft System Center 2016\DPM\DPM\DPMDB'.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.GetAttributes(String path)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Wizard.InstallLocationValidation.CheckForDirectoryAttributes(String path)
[3/8/2019 5:13:10 AM] Information : The setup wizard is initialized.
[3/8/2019 5:13:10 AM] Information : Starting the setup wizard.
[3/8/2019 5:13:10 AM] Information : <<< Dialog >>> Welcome Page : Entering
[3/8/2019 5:13:55 AM] Information : <<< Dialog >>> Welcome Page : Leaving
[3/8/2019 5:13:55 AM] Information : <<< Dialog >>> Inspect Page : Entering
[3/8/2019 5:14:06 AM] Information : Query WMI provider for path of configuration file for SQL Server 2008 Reporting Services.
[3/8/2019 5:14:06 AM] Information : Querying WMI Namespace: \\DPM-SERVER\root\Microsoft\SqlServer\ReportServer\RS_DPM\V13\admin for query: SELECT * FROM MSReportServer_ConfigurationSetting WHERE InstanceName='DPM'
[3/8/2019 5:14:06 AM] Data : Path of configuration file for SQL Server 2008 Reporting Services = C:\Program Files\Microsoft SQL Server\MSRS13.DPM\Reporting Services\ReportServer\RSReportServer.config
[3/8/2019 5:14:06 AM] * Exception :  => System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
File name: 'Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Helpers.MiscHelper.IsSqlClustered(String sqlMachineName, String sqlInstanceName)
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Helpers.MiscHelper.IsMachineClustered(String sqlMachineName, String sqlInstanceName)

WRN: Assembly binding logging is turned OFF.
To enable assembly bind failure logging, set the registry value [HKLM\Software\Microsoft\Fusion!EnableLog] (DWORD) to 1.
Note: There is some performance penalty associated with assembly bind failure logging.
To turn this feature off, remove the registry value [HKLM\Software\Microsoft\Fusion!EnableLog].

[3/8/2019 5:14:06 AM] * Exception :  => System.Management.ManagementException: Invalid namespace 
   at System.Management.ManagementException.ThrowWithExtendedInfo(ManagementStatus errorCode)
   at System.Management.ManagementScope.InitializeGuts(Object o)
   at System.Management.ManagementScope.Initialize()
   at System.Management.ManagementObjectSearcher.Initialize()
   at System.Management.ManagementObjectSearcher.Get()
   at Microsoft.Internal.EnterpriseStorage.Dls.Setup.Helpers.WmiHelper.IsMachineClustered(String machineName, String instanceName)
[3/8/2019 5:14:06 AM] Information : OS >= win 8 , enable Dedupe role
[3/8/2019 5:14:07 AM] Information : output : True
.. 
 error : 
[3/8/2019 5:14:08 AM] Data : Path of inspection output xml = C:\Program Files\Microsoft System Center 2016\DPM\DPMLogs\InspectReport.xml
[3/8/2019 5:14:08 AM] Information : Instantiating inspect component.
[3/8/2019 5:14:08 AM] Data : Path of output xml = C:\Program Files\Microsoft System Center 2016\DPM\DPMLogs\InspectReport.xml
[3/8/2019 5:14:08 AM] Information : Deserializing the check XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM6EC.tmp\DPM2012\Setup\checks.xml
[3/8/2019 5:14:08 AM] Information : Loading the check XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM6EC.tmp\DPM2012\Setup\checks.xml
[3/8/2019 5:14:08 AM] Information : Deserialising the scenario XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM6EC.tmp\DPM2012\Setup\scenarios.xml
[3/8/2019 5:14:08 AM] Information : Loading the check XML from path : C:\Users\labuser.CONTOSO\AppData\Local\Temp\DPM6EC.tmp\DPM2012\Setup\scenarios.xml
[3/8/2019 5:14:08 AM] Information : Getting scenarios for the product: DPM
[3/8/2019 5:14:08 AM] Information : Getting scenarios for DPM
[3/8/2019 5:14:08 AM] Information : Getting scenario for Mode:Install, DbLocation:Remote, SKU:Retail and CCMode:NotApplicable
[3/8/2019 5:14:08 AM] *** Error : Initialize the SQLSetUpHelper Object
[3/8/2019 5:14:08 AM] Information : [SQLSetupHelper.GetWMIReportingNamespace]. Reporting Namespace found. Reporting Namespace : V13
[3/8/2019 5:14:08 AM] Information : [SQLSetupHelper.GetWMISqlServerNamespace]. SQL Namespace found. SQL Namespace : \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13
[3/8/2019 5:14:08 AM] Information : Query WMI provider for SQL Server 2008.
[3/8/2019 5:14:08 AM] Information : Querying WMI Namespace: \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13 for query: Select * from SqlServiceAdvancedProperty where ServiceName='MSSQL$DPM' and PropertyName='Version'
[3/8/2019 5:14:08 AM] Information : SQL Server 2008 R2 SP2 instance DPM is present on this system.
[3/8/2019 5:14:08 AM] Information : Query WMI provider for SQL Server 2008.
[3/8/2019 5:14:08 AM] Information : Querying WMI Namespace: \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13 for query: Select * from SqlServiceAdvancedProperty where ServiceName='MSSQL$DPM' and PropertyName='Version'
[3/8/2019 5:14:08 AM] Information : [SQLSetupHelper.GetSQLDepedency]. Reporting Namespace and SQL namespace for installed SQL server which will be used as DPM DB. Reporting Namespace : \\DPM-SERVER\root\Microsoft\SqlServer\ReportServer\RS_DPM\V13\admin SQL Namespace : \\DPM-SERVER\root\Microsoft\SqlServer\ComputerManagement13
[3/8/2019 5:14:08 AM] Information : Check if SQL Server 2012 Service Pack 1 Tools is installed.
[3/8/2019 5:14:08 AM] Information : [SQLSetupHelper.GetSqlSetupRegKeyPath]. Registry Key path that contains SQL tools location: Software\Microsoft\Microsoft SQL Server\140\Tools\Setup\
[3/8/2019 5:14:08 AM] Information : Inspect.CheckSqlServerTools : MsiQueryProductState returned : INSTALLSTATE_DEFAULT
[3/8/2019 5:14:08 AM] *** Error : CurrentDomain_UnhandledException

Looking at the above log, the last lines hint that setup is looking for SQL Server Tools (in this case, what looks like some crazy old references to dependencies on SQL Server 2012). Unfortunately, the SQL Server 2016 installer will recommend grabbing SQL Server Management Studio 17.X; however, DPM 2016 will only install with SQL Server Management Studio 16.5.X. You will need to uninstall the 17.X version of SSMS and install the 16.5.X build from the link below:
https://docs.microsoft.com/en-us/sql/ssms/sql-server-management-studio-changelog-ssms?view=sql-server-2017#download-ssms-1653
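
As an additional, hedged sanity check (not something the DPM documentation calls out), you can list what is registered under the registry path the log points at. The 140 hive corresponds to the 17.X-era tooling and 130 to the 16.X-era tooling, which appears to be why the version lookup in the stack trace comes back empty when only SSMS 17.X is present:

reg query "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\140\Tools\Setup"
reg query "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\130\Tools\Setup"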

Alas! After installing SSMS 16.5.X and running through the DPM installation again, no more Error ID 4387! Once DPM is installed, you can safely upgrade your SQL Server instance to 2017 if needed.

Hope this helps someone else! DPM can be picky and unforgiving by nature, but if you follow exactly what the documentation calls out, to a T, and don't venture outside of those parameters, you should be golden 🙂

https://docs.microsoft.com/en-us/system-center/dpm/install-dpm?view=sc-dpm-2016#setup-prerequisites

Closing notes: if the above items didn't solve your problem, please post your logs and let's troubleshoot so we can document the solutions for the remaining error cases. Thank you!

Installing Python Wheel files on an Azure App Service

Per Microsoft: Some packages may not install using pip when run on Azure. It may simply be that the package is not available on the Python Package Index. It could be that a compiler is required (a compiler is not available on the machine running the web app in Azure App Service).

For example, you may receive an error like this when trying to install a specific package (in this case, Pandas):

Command: "D:\home\site\deployments\tools\deploy.cmd"
Handling python deployment.
KuduSync.NET from: 'D:\home\site\repository' to: 'D:\home\site\wwwroot'
Copying file: 'requirements.txt'
Detected requirements.txt.  You can skip Python specific steps with a .skipPythonDeployment file.
Detecting Python runtime from runtime.txt
Detected python-2.7

Found compatible virtual environment.
Pip install requirements.
Downloading/unpacking Flask==0.12.1 (from -r requirements.txt (line 1))
Downloading/unpacking numpy==1.15.0rc2 (from -r requirements.txt (line 2))
Downloading/unpacking pandas==0.22.0 (from -r requirements.txt (line 3))
  Running setup.py (path:D:\home\site\wwwroot\env\build\pandas\setup.py) egg_info for package pandas

    Could not locate executable g77
    Could not locate executable f77
    Could not locate executable ifort
    Could not locate executable ifl
    Could not locate executable f90
    Could not locate executable efl
    Could not locate executable gfortran
    Could not locate executable f95
    Could not locate executable g95
    Could not locate executable effort
    Could not locate executable efc
    don't know how to compile Fortran code on platform 'nt'
    non-existing path in 'numpy\\distutils': 'site.cfg'
    Running from numpy source directory.

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\setup.py:385: UserWarning: Unrecognized setuptools command, proceeding with generating Cython sources and expanding templates
      run_build = parse_setuppy_commands()
    D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
      warnings.warn(msg)
    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Blas (http://www.netlib.org/blas/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [blas]) or by setting
        the BLAS environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Blas (http://www.netlib.org/blas/) sources not found.
        Directories to search for the sources can be specified in the
        numpy/distutils/site.cfg file (section [blas_src]) or by setting
        the BLAS_SRC environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Lapack (http://www.netlib.org/lapack/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [lapack]) or by setting
        the LAPACK environment variable.
      self.calc_info()

    d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
        Lapack (http://www.netlib.org/lapack/) sources not found.
        Directories to search for the sources can be specified in the
        numpy/distutils/site.cfg file (section [lapack_src]) or by setting
        the LAPACK_SRC environment variable.
      self.calc_info()

    D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'define_macros'
      warnings.warn(msg)
    Traceback (most recent call last):
      File "<string>", line 17, in <module>
      File "D:\home\site\wwwroot\env\build\pandas\setup.py", line 743, in <module>
        **setuptools_kwargs)
      File "D:\python27\Lib\distutils\core.py", line 111, in setup
        _setup_distribution = dist = klass(attrs)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 262, in __init__
        self.fetch_build_eggs(attrs['setup_requires'])
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 287, in fetch_build_eggs
        replace_conflicting=True,
      File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 614, in resolve
        dist = best[req.key] = env.best_match(req, ws, installer)
      File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 857, in best_match
        return self.obtain(req, installer)
      File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 869, in obtain
        return installer(requirement)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 338, in fetch_build_egg
        return cmd.easy_install(req)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 613, in easy_install
        return self.install_item(spec, dist.location, tmpdir, deps)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 643, in install_item
        dists = self.install_eggs(spec, download, tmpdir)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 833, in install_eggs
        return self.build_and_install(setup_script, setup_base)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1055, in build_and_install
        self.run_setup(setup_script, setup_base, args)
      File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1043, in run_setup
        raise DistutilsError("Setup script exited with %s" % (v.args[0],))
    distutils.errors.DistutilsError: Setup script exited with error: Microsoft Visual C++ 9.0 is required (Unable to find vcvarsall.bat). Get it from http://aka.ms/vcpython27
    Complete output from command python setup.py egg_info:

Could not locate executable g77
Could not locate executable f77
Could not locate executable ifort
Could not locate executable ifl
Could not locate executable f90
Could not locate executable efl
Could not locate executable gfortran
Could not locate executable f95
Could not locate executable g95
Could not locate executable effort
Could not locate executable efc

don't know how to compile Fortran code on platform 'nt'
non-existing path in 'numpy\\distutils': 'site.cfg'
Running from numpy source directory.
d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\setup.py:385: UserWarning: Unrecognized setuptools command, proceeding with generating Cython sources and expanding templates
  run_build = parse_setuppy_commands()
D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'python_requires'
  warnings.warn(msg)

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Atlas (http://math-atlas.sourceforge.net/) libraries not found.
    Directories to search for the libraries can be specified in the
    numpy/distutils/site.cfg file (section [atlas]) or by setting
    the ATLAS environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Blas (http://www.netlib.org/blas/) libraries not found.
    Directories to search for the libraries can be specified in the
    numpy/distutils/site.cfg file (section [blas]) or by setting
    the BLAS environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Blas (http://www.netlib.org/blas/) sources not found.
    Directories to search for the sources can be specified in the
    numpy/distutils/site.cfg file (section [blas_src]) or by setting
    the BLAS_SRC environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Lapack (http://www.netlib.org/lapack/) libraries not found.
    Directories to search for the libraries can be specified in the
    numpy/distutils/site.cfg file (section [lapack]) or by setting
    the LAPACK environment variable.
  self.calc_info()

d:\local\temp\easy_install-dsrz9g\numpy-1.15.0rc2\numpy\distutils\system_info.py:625: UserWarning:
    Lapack (http://www.netlib.org/lapack/) sources not found.
    Directories to search for the sources can be specified in the
    numpy/distutils/site.cfg file (section [lapack_src]) or by setting
    the LAPACK_SRC environment variable.
  self.calc_info()

D:\python27\Lib\distutils\dist.py:267: UserWarning: Unknown distribution option: 'define_macros'
  warnings.warn(msg)

Traceback (most recent call last):
  File "<string>", line 17, in <module>
  File "D:\home\site\wwwroot\env\build\pandas\setup.py", line 743, in <module>
    **setuptools_kwargs)
  File "D:\python27\Lib\distutils\core.py", line 111, in setup
    _setup_distribution = dist = klass(attrs)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 262, in __init__
    self.fetch_build_eggs(attrs['setup_requires'])
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 287, in fetch_build_eggs
    replace_conflicting=True,
  File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 614, in resolve
    dist = best[req.key] = env.best_match(req, ws, installer)
  File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 857, in best_match
    return self.obtain(req, installer)
  File "D:\home\site\wwwroot\env\lib\site-packages\pkg_resources.py", line 869, in obtain
    return installer(requirement)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\dist.py", line 338, in fetch_build_egg
    return cmd.easy_install(req)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 613, in easy_install
    return self.install_item(spec, dist.location, tmpdir, deps)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 643, in install_item
    dists = self.install_eggs(spec, download, tmpdir)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 833, in install_eggs
   return self.build_and_install(setup_script, setup_base)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1055, in build_and_install
    self.run_setup(setup_script, setup_base, args)
  File "D:\home\site\wwwroot\env\lib\site-packages\setuptools\command\easy_install.py", line 1043, in run_setup
    raise DistutilsError("Setup script exited with %s" % (v.args[0],))

distutils.errors.DistutilsError: Setup script exited with error: Microsoft Visual C++ 9.0 is required (Unable to find vcvarsall.bat). Get it from http://aka.ms/vcpython27

----------------------------------------

Cleaning up...

Command python setup.py egg_info failed with error code 1 in D:\home\site\wwwroot\env\build\pandas
Storing debug log for failure in D:\home\pip\pip.log

An error has occurred during web site deployment.
\r\nD:\Program Files (x86)\SiteExtensions\Kudu\75.10629.3460\bin\Scripts\starter.cmd "D:\home\site\deployments\tools\deploy.cmd"

This guide is a reflection on how to use Wheel files to install modules that cannot natively be installed via pip due to a missing compiler in the Azure App Service:

Microsoft Official documentation can be found here: https://docs.microsoft.com/en-us/azure/app-service/web-sites-python-configure#troubleshooting---package-installation

Tutorial

  1. Modify the requirements.txt file
    1. Add the following item as the first line of the document (see the sample requirements.txt after this list):
      1. --find-links wheelhouse
        1. Note: If you do not have a requirements.txt file, you can simply create a new text document and add this line to it.  The requirements.txt file is what allows the Azure App Service to automatically download the packages your application needs.  Official documentation on this file can be found here: https://docs.microsoft.com/en-us/azure/app-service/web-sites-python-configure#package-management
  2. Navigate to the Kudu Debug Console by going to https://yourappservice.scm.azurewebsites.net/DebugConsole
  3. Within the debug console, navigate to your version of Python.
    1. Note: The default Python versions in an Azure App Service are 2.7 and 3.4; however, since Wheel will need to install some files, you cannot leverage the default directories of D:\Python27 for v2.7 and D:\Python34 for v3.4.
    2. In this case, I'd recommend leveraging Extensions to install whatever version of Python you need.  Documentation on this can be found here: https://blogs.msdn.microsoft.com/pythonengineering/2016/08/04/upgrading-python-on-azure-app-service/
  4. Install the Python Wheel module:
    1. python.exe -m pip install wheel
  5. Obtain Wheel files
    1. Option 1: Build your own wheel files
      1. Execute the following command:
        1. python.exe -m pip wheel -r D:\home\site\wwwroot\requirements.txt -w wheelhouse

    2. Option 2: Download prebuilt wheel files
      1. Create a wheelhouse folder within your Python directory
        1. mkdir wheelhouse

      2. Copy the .whl files to this directory
        1. You can obtain wheel files from PyPI or from the Laboratory for Fluorescence Dynamics, University of California, Irvine.
          1. PyPI: Search for the module, then click on the Download Files button
            1. https://pypi.org/
          2. Laboratory for Fluorescence Dynamics, University of California, Irvine: Simply download the appropriate .whl file listed on the page below
            1. https://www.lfd.uci.edu/~gohlke/pythonlibs/
  6. Install Modules
    1. Manual Install
      1. Execute the following command:
        python.exe -m pip install --upgrade -r D:\home\site\wwwroot\requirements.txt

    2. Deployment Install (from a CI/CD pipeline)
      1. Configure the .deployment and deploy.cmd files
        1. Official documentation on this can be found here: https://github.com/projectkudu/kudu/wiki/Custom-Deployment-Script
        2. .deployment file
          1. [config]
            command = deploy.cmd
        3. deploy.cmd file (modify the Python directory to reflect your version)
          1. :: 1. Install Wheel
            echo Configure Wheel
            D:\home\python364x64\python.exe -m pip install wheel
            :: 2. Install packages
            echo Pip install requirements.
            D:\home\python364x64\python.exe -m pip install --upgrade -r D:\home\site\wwwroot\requirements.txt
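
For reference, a minimal requirements.txt with the wheelhouse directive as its first line might look like the following. The package pins here are simply the ones from the error output earlier in this post; substitute your own:

--find-links wheelhouse
Flask==0.12.1
numpy==1.15.0rc2
pandas==0.22.0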

At this point, the modules in question should be installed and ready for use! 🙂
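
To sanity-check the result from the Kudu console, you can ask pip and the interpreter about one of the packages directly. The python364x64 path below is an assumption based on the deploy.cmd example above; adjust it to wherever your Python extension is installed:

D:\home\python364x64\python.exe -m pip show pandas
D:\home\python364x64\python.exe -c "import pandas; print(pandas.__version__)"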

Setting up WeeWX with a Raspberry PI

This is a quick setup guide on how to configure the open source software WeeWX for a Personal Weather Station (PWS).  I highly recommend you check out the WeeWX User Guide as this information is very well documented.  Here is a reflection of how I was able to get WeeWX installed on a Raspberry PI with a brand new weather station.

  1. Setup your Raspberry PI
    1. How to setup your Raspberry PI: http://jackstromberg.com/2018/03/setting-up-a-new-raspberry-pi-via-ssh/
      1. Note: Raspbian is a distribution based upon Debian.  In this case, we will follow the Debian instructions for setting up WeeWX.
        1. http://weewx.com/docs/debian.htm
  2. (Optional) Configure the Raspberry PI to be localized to your environment
    1. sudo raspi-config
      1. Here you can arrow down to Localization Options and configure the timezone to match that of your console/weather sensor.  Keeping time is critical, so if possible, try to keep the date/time between your weather station and the Raspberry PI as close as possible.
  3. Configure Apt-Get to look for the WeeWX packages
    wget -qO - http://weewx.com/keys.html | sudo apt-key add -
    sudo wget -qO - https://weewx.com/apt/weewx-python3.list | sudo tee /etc/apt/sources.list.d/weewx.list

    Note: Use https://weewx.com/apt/weewx-python3.list for Debian 10.X (latest version of raspbian as of 2021-07-23 will use this); otherwise use https://weewx.com/apt/weewx-python2.list for Debian 9.X.

  4. Update your Raspberry-PI to use the latest packages
    sudo apt-get update
    sudo apt-get upgrade
  5. Before installation, ensure you have your console or device set up and connected to your Raspberry PI so WeeWX can pull the data
  6. Determine the interface the console is connected to (if using a directly attached data logger; skip this step if using an IP-based source)
    1. Execute the command dmesg and look for what interface the data logger is connected to
      1. In my example, you can see the data logger is connected to ttyUSB0
  7. Launch the installation wizard for weewx
    1. sudo apt-get install weewx
      1. Note: You will likely be prompted to install a few dependencies, type Y for yes to install them
  8. Installation
    1. Enter the location of your weather station: Santa's Workshop, North Pole
    2. Enter in the latitude, longitude of your weather station
      1. Note: If you don't have GPS, you can easily find this by using Bing Maps or Google Maps, navigating to your location, and right clicking.
        1. For Bing, it will just show you the lat/long values when you right click
        2. For Google, click on "What's Here" and it will list these values
      2. Note: You can use more than 3 decimal places, so a more precise set of coordinates like 40.689167, -74.044444 is acceptable.
    3. Enter in your Altitude of where the weather station is
      1. You can use Google Earth to find the altitude or this tool here: https://www.freemaptools.com/elevation-finder.htm
    4. Set your preferred unit of measurement
      1. US (Imperial) or Metric
    5. Select your weather station type
      1. I.e. AcuRite, Vantage (if using Davis), etc.
    6. Select the interface the device is listening on
    7. For those using a serial port, select the interface that the data logger is connected to.  You should have found this in step 6 above; if using Ethernet, go ahead and type in the IP, port, etc. of the data logger.
  9. At this point WeeWX is technically installed, however many individuals will want to present the WeeWX reports via webpage.  In this case, we'll install nginx, which is a lightweight webserver
    1. sudo apt-get install nginx
      1. More details on this can be found here: http://www.weewx.com/docs/usersguide.htm#integrating_with_webserver
  10. Configure WeeWX to minimize disk IO
    1. Why do we need to do this?  Since Raspberry PIs rely on SD cards, there is typically a finite number of reads/writes available to the card.  It is recommended to either leverage an external database/fileserver for WeeWX to write its reports to, or to configure WeeWX to keep the reports in RAM, which prevents IO to the SD card (and theoretically increases the life of the card).
      1. Three approaches are outlined in the WeeWX Raspberry PI best practices guide linked in the notes below; in this guide I'll follow its approach of saving reports to a temporary file system using tmpfs (a quick verification sketch follows this list)
        1. Add an entry to fstab
          1. echo "weewx_reports /var/weewx/reports tmpfs size=20M,noexec,nosuid,nodev 0 0" | sudo tee -a /etc/fstab
        2. Mount the new file system
          1. sudo mkdir -p /var/weewx/reports
          2. sudo mount -a
        3. Update the weewx.conf file to point to the new directory
          1. sudo sed -i -e 's%HTML_ROOT =.*%HTML_ROOT = /var/weewx/reports%' /etc/weewx/weewx.conf
        4. Restart WeeWX service
          1. sudo service weewx restart
        5. Create symbolic link to point webserver to the reports
          1. sudo ln -s /var/weewx/reports /var/www/html/weewx
        6. Give the web server the ability to read from the directory
          1. sudo chmod -R 755 /var/www/html/weewx
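
Before browsing to the page, here is a quick way to verify that the pieces above fit together (the paths assume the tmpfs mount and symlink created in steps 9 and 10; report files only appear after the first archive interval, so give it a few minutes):

df -h /var/weewx/reports
ls -l /var/www/html/weewx
curl -I http://localhost/weewx/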

At this point, go ahead and browse out to http://youripaddress/weewx/ to see your weather.

Notes:

WeeWX updates the webpage every 30 minutes (1800 seconds) out of the box.  You can force a report update by executing wee_reports weewx.conf or you can modify the /etc/weewx/weewx.conf file by changing the archive_interval variable (in seconds) under the [StdArchive] section.
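
For reference, the relevant section of /etc/weewx/weewx.conf looks roughly like this; the 300-second value is only an illustration, so pick whatever interval suits your station:

[StdArchive]
    # How often to generate an archive record (and, by default, regenerate reports), in seconds
    archive_interval = 300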

You can modify the WeeWX configuration by editing: /etc/weewx/weewx.conf

You can validate if WeeWX is running by executing: service weewx status

You can look at diagnostics logs by following the guide here: http://www.weewx.com/docs/usersguide.htm#monitoring

Best practices guide on using WeeWX + Raspberry PI: https://github.com/weewx/weewx/wiki/Raspberry%20Pi