This site was powered by Drupal for a full decade. It went through Drupal 5, 6, 7, and 8, and I think it may even have begun as a 4.7 site, but it has outlived its usefulness in that form.
My reasons for switching to a static site generator are quite plain but highlight my shifting needs:
Over three quarters of the posts here contain a code snippet; that is vastly easier to handle with Markdown or similar tools than with CKEditor.
The content is by all accounts static. The end user is not rewarded with interactive features but rather burdened by the wait time of the CMS overhead. Even with the dramatically improved page cache in D8, it doesn’t come close to plain HTML.
I don’t have nontechnical users to support here.
I have enough other Drupal sites in my life to use for trying out new modules and experimenting.
I can stop maintaining a server just to keep things as flexible and speedy as I like for a minor site.
And finally: All posts are revisioned in Git, without any intermediate systems.
My choice fell on Spress even though it’s not that prevalent (yet), since it uses the stack I’ve come to appreciate everywhere else: Symfony, Twig, Composer.
Converting the data is straightforward: just write a quick script that fetches all nodes and outputs their fields as desired.
League’s HTML-to-Markdown library easily converted CKEditor HTML with only minor corrections necessary afterwards.
Then just write it to a file with fopen/fwrite/fclose. Yes, of course that could be done more elegantly, but for a one-off migration it works fine.
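A minimal sketch of the file-writing half of such an export (the function name, field names, and front-matter layout are my illustrative assumptions, not the actual migration script; in practice the body would already have gone through League's HTML-to-Markdown converter):

```php
<?php
// Hypothetical sketch: write one exported post as a Markdown file with
// front matter, using plain fopen/fwrite/fclose as described above.
// Field names and the front-matter layout are assumptions for illustration.
function write_post(string $dir, array $post): string {
    // Date plus slug gives a Jekyll/Spress-style filename.
    $filename = sprintf('%s/%s-%s.md', $dir, $post['date'], $post['slug']);
    $frontMatter = "---\n"
        . 'title: "' . $post['title'] . "\"\n"
        . 'date: ' . $post['date'] . "\n"
        . "---\n\n";
    $handle = fopen($filename, 'w');
    fwrite($handle, $frontMatter . $post['body']);
    fclose($handle);
    return $filename;
}

$file = write_post(sys_get_temp_dir(), [
    'title' => 'Hello',
    'slug'  => 'hello',
    'date'  => '2016-01-01',
    'body'  => "Converted **Markdown** body.\n",
]);
echo $file, "\n";
```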
Along the way I pruned the content back to what could still be considered tangentially relevant. I look forward to seeing whether this makes posts here more frequent (as I expect), less frequent, or neither.
Today I want to highlight a solution that over the summer has become one of my favourite tools in web development: GitLab with GitLab CI.
I used to be a bit skeptical whether a continuous integration solution wouldn't be overkill for small projects, but with configuration management in Drupal 8 and Composer with composer-patches I'm a dedicated convert. You might think it's just a git pull here or a cache clear there, but I'm certain that once you've tried automating it you won't want to go back to doing it by hand.
Fire up a VPS and install the omnibus edition, then follow the instructions. Just make sure that you have at least 1 GB of RAM and several GB of swap. I tried 512 MB; you will run into issues that are not worth your time and effort.
Next you’ll need to actually define a runner; again, see the documentation for that. In my case it works perfectly well for the runner to be a second user on the same server. Since I’m using SSH later, I’ll need to provide that user with a key pair, as its public key has to be added to the authorized_keys file on the production server.
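Generating that key pair is a one-time step. A sketch (the temp directory stands in for ~/.ssh, and the deploy user and host are placeholders):

```shell
# Run as the runner user on the GitLab server. A temp dir stands in for
# ~/.ssh here so the example is side-effect free; adjust paths for real use.
keydir=$(mktemp -d)

# No passphrase (-N ''), so CI jobs can use the key non-interactively.
ssh-keygen -t ed25519 -N '' -f "$keydir/id_ed25519"

# Print the public key; append this line to ~/.ssh/authorized_keys for the
# deploy user on the production server (e.g. via ssh-copy-id).
cat "$keydir/id_ed25519.pub"
```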
First, you’ll need a .gitlab-ci.yml file in the repository root, and it can be quite minimal. You can execute commands directly in it, but since most of what I’m doing for automated deployments happens in an SSH session, I just wrap that in a deploy script:
set -e # Abort immediately if any command fails
drush sql-dump --gzip > ../../somewhere-safe/db/$(date +%Y-%m-%d-%H%M).sql.gz
~/bin/drush updb -y
~/bin/drush cim -y
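The CI file itself can then stay small. A minimal sketch (the branch name, deploy user, host, and script path are placeholders for my actual setup):

```yaml
# .gitlab-ci.yml - minimal sketch; user, host, and paths are placeholders.
deploy:
  stage: deploy
  only:
    - master
  script:
    # The deploy script shown above lives on the production server.
    - ssh deploy@production.example.com 'bash ~/bin/deploy.sh'
```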
When you commit that CI file, you will likely get a notification that the build is stuck. Just follow the prompts until you reach the list of runners, where you can enable your runner for this specific project.
Of course this could be done more elegantly in a variety of ways (maintenance mode, aliases instead of directories, etc.), but it works extremely well as it is. I rarely have to touch the server anymore, and adding further steps such as unit tests or coding-standards checks is easy with such a lightweight CI solution.
The tutorial below is based primarily on work done by Aditya Tannu (see here and here) and tries to make things easier for beginners by reducing the setup to the bare essentials and providing clean and simple examples.
So, you might have gotten the iOS 10 beta and seen the Home app, but you don't want to spend an inordinate amount on commercial solutions. ESP8266 boards are by now ubiquitous and cheap, and they can be made to work with HomeKit.
You'll need three things running to make this work: a bridge for your sensors, an MQTT broker, and the sensor itself. While you can certainly use a Raspberry Pi for the first two, I chose a Mac mini since I have one running anyway. Sadly, the bridge is Node-based, but I haven't seen any other working solution out there.
Install with npm install (just Google it if you don't have Node yet)
Set a custom pincode in BridgedCore.js
Run with DEBUG=* node BridgedCore.js
Install with sudo npm install -g mosca bunyan (bunyan is only needed for output)
Run with mosca -v | bunyan
You should now be able to add the Node bridge in your HomeKit management application and see the fake sensors that ship in the accessories directory. Once everything is running smoothly, you can run these as system services under an unprivileged user; see the project root for plist examples and just add your user name where necessary. I had some issues calling node directly for HAP, so I made a one-line bash script that runs 'node BridgedCore.js' in the relevant directory.
The pairing fails after a longish timeout: your server's firewall settings are the likely cause. Turn the firewall off; if that helps, adjust the rules as necessary.
We can program the sensor with the NodeMCU stack, but in my experience the Arduino one is far less volatile (for now).
You can now flash something onto the Wemos that periodically sends your data to the server. You can use my ino templates as a starting point; you still need to set your Wi-Fi SSID, password, and the IP of the server you are sending data to.
Now that you are sending your readings to the broker, you still need the bridge to tell your HomeKit application about the accessory you wish to use and to update its data when a new value arrives. You can start with my accessory templates for humidity and temperature. If you did not change anything else in the Arduino templates, you will only need to update the server IP in those files.
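The core of such an accessory file is just mapping an incoming MQTT message to a characteristic update. A self-contained sketch of that mapping logic (the topic names are my assumptions; in the real templates the parsed value is then handed to the HomeKit characteristic via HAP-NodeJS):

```javascript
// Hypothetical sketch of the MQTT message handling inside an accessory file.
// Topic names are assumptions; the actual templates push the parsed value to
// the matching HomeKit characteristic (e.g. CurrentTemperature).
function handleMessage(topic, payload) {
  const routes = {
    '/sensor/temperature': 'CurrentTemperature',
    '/sensor/humidity': 'CurrentRelativeHumidity',
  };
  const characteristic = routes[topic];
  if (!characteristic) {
    return null; // Not a topic this accessory cares about.
  }
  const value = parseFloat(payload.toString());
  if (Number.isNaN(value)) {
    return null; // Guard against malformed payloads from the sensor.
  }
  return { characteristic, value };
}

console.log(handleMessage('/sensor/temperature', '21.5'));
```

If the channel published by the Arduino sketch and the topic checked here diverge, the value silently never arrives, which is exactly the troubleshooting case below.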
Uploading fails: check the baud rate; the board definition might not have set it to the recommended 115200.
I see the data in my serial monitor and the broker gets a connection, but the values are still zero: make sure that the channel you are publishing to is the same as in the accessory file.
My accessory doesn't show when I start the bridge. Make sure all files end in _accessory.js.
Hardcoding the channels for each accessory in the js template and the Arduino sketch is inefficient at best. If someone has a good tutorial for HAP-Nodejs that does that right, let me know.
I've noticed that the default of 10 minutes until the next update is a bit long, and the accessory can show as timed out or zeroed out. Let me know if you have a better default here, or recommendations on power optimisation.
There are some cases where you want to interact with Facets on a Search API page, apart from just rendering them.
The Facets API in D8 provides services that make this easily possible; however, there aren't many examples out there yet for cases such as these, where you work outside the primary rendering pipeline. The trick is to process the facet, not build it, to get at the actual data: