Merge remote-tracking branch 'sculpin-new/main'

Oliver Davies 2024-08-01 22:29:19 +01:00
commit 21bca0b74e
1095 changed files with 50632 additions and 0 deletions

.envrc (Normal file, 1 line)

@@ -0,0 +1 @@
use flake

.githooks/prepare-commit-msg (Executable file, 27 lines)

@@ -0,0 +1,27 @@
#!/usr/bin/env bash
# Do not edit this file. It is automatically generated by https://www.oliverdavies.uk/build-configs.
# Load the issue ID from an `.issue-id` file within the project and replace the
# `ISSUE_ID` placeholder within a Git commit message.
#
# For example, running `echo "OD-123" > .issue-id` will add `Refs: OD-123` to
# the commit message.
#
# This also works with multiple issue IDs in the same string, e.g.
# "OD-123 OD-456", or IDs on multiple lines.
set -o errexit
set -o nounset
set -o pipefail
PROJECT_DIR=$(git rev-parse --show-toplevel)
ISSUE_FILE="$PROJECT_DIR/.issue-id"
if [ -f "${ISSUE_FILE}" ]; then
ISSUE_IDS=$(tr '\n' ',' < "${ISSUE_FILE}" | tr ' ' ',' | sed 's/,$//' | sed 's/,/, /g')
if [ -n "${ISSUE_IDS}" ]; then
sed -i.bak "s/# Refs:/Refs: $ISSUE_IDS/" "$1"
fi
fi
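The ID-normalisation pipeline in the hook above can be sketched in isolation. Assuming a hypothetical `.issue-id` file containing both space- and newline-separated IDs, the same `tr`/`sed` chain collapses them into a single comma-separated list:

```shell
# Sample input mirroring a multi-ID .issue-id file (IDs are made up);
# space- or newline-separated IDs collapse into one "a, b, c" list.
printf 'OD-123 OD-456\nOD-789\n' \
  | tr '\n' ',' \
  | tr ' ' ',' \
  | sed 's/,$//' \
  | sed 's/,/, /g'
# → OD-123, OD-456, OD-789
```

That resulting string is what the hook substitutes for the `# Refs:` placeholder in the commit message template.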

.github/workflows/test.yaml (vendored, Normal file, 22 lines)

@@ -0,0 +1,22 @@
name: Run tests
on:
pull_request:
push:
branches:
- main
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: cachix/install-nix-action@v26
with:
nix_path: nixpkgs=channel:nixos-unstable
- uses: actions/checkout@v4
- run: |
nix develop -c composer install
nix develop -c ./run test --testdox --colors=always

.gitignore (vendored, Normal file, 13 lines)

@@ -0,0 +1,13 @@
# Do not edit this file. It is automatically generated by https://www.oliverdavies.uk/build-configs.
/.phpunit.cache
/.phpunit.result.cache
/output_*/
/vendor/
# Front-end assets.
node_modules
source/build
/.direnv/

.markdownlint.yaml (Normal file, 1 line)

@@ -0,0 +1 @@
MD013: false

.tmux (Executable file, 19 lines)

@@ -0,0 +1,19 @@
#!/usr/bin/env bash
set -o errexit
# 1. Vim.
tmux send-keys -t "$1:1" "nvim" Enter
# 2. Server.
tmux new-window -t "$1" -c "$PWD"
tmux send-keys -t "$1:2" "./run start" Enter
tmux split-window -t "$1" -v
tmux send-keys -t "$1:2.bottom" "(cd assets && ../run npm:build:css)" Enter
# 3. General shell use.
tmux new-window -t "$1" -c "$PWD"
tmux send-keys -t "$1:3" "git status" Enter
tmux select-window -t "$1:1"

.yamlfmt.yaml (Normal file, 5 lines)

@@ -0,0 +1,5 @@
---
formatter:
include_document_start: true
retain_line_breaks_single: true
type: basic

app/SculpinKernel.php (Normal file, 14 lines)

@@ -0,0 +1,14 @@
<?php
use Opdavies\Sculpin\Bundle\TwigMarkdownBundle\SculpinTwigMarkdownBundle;
use Sculpin\Bundle\SculpinBundle\HttpKernel\AbstractKernel;
class SculpinKernel extends AbstractKernel
{
protected function getAdditionalSculpinBundles(): array
{
return [
SculpinTwigMarkdownBundle::class,
];
}
}


@@ -0,0 +1,17 @@
---
sculpin_content_types:
daily_emails:
permalink: /daily/:year/:month/:day/:basename/
pages:
permalink: /:basename/
podcast_episodes:
permalink: /podcast/:basename/
posts:
permalink: /blog/:basename/
talks:
permalink: /talks/:basename/
services:
App\Opdavies\TwigExtension\OpdaviesTwigExtension:
tags:
- {name: twig.extension}

app/config/sculpin_site.yml (Normal file, 316 lines)

@@ -0,0 +1,316 @@
---
name: Oliver Davies
slogan: Certified Drupal expert, Developer and Consultant
email: oliver+website@oliverdavies.uk
url: http://localhost:8000
assets:
url: '%url%'
version: 4
banner_text: ~
ctas:
call: |
Are you still using Drupal 7 and don't know what's involved to upgrade to Drupal 10? <a href="%site.url%/call">Book a Drupal 7 upgrade consultation call</a> or <a href="%site.url%/drupal-upgrade">an upgrade roadmap</a>.
d7eol: |
There's less than a year until Drupal 7's end-of-life date. <a href="%site.url%/drupal-upgrade">Plan your upgrade to Drupal 10 now!</a>
module: |
If you're creating a new Drupal module, try my <a href="https://github.com/opdavies/drupal-module-template">free Drupal module template</a>.
subscription: |
Do you need immediate access to an expert Drupal Developer? With my <a href="%site.url%/subscription">Drupal development subscription</a>, make unlimited requests for a fixed monthly price in less time than posting to a job board!
testing_course: |
Do you want to learn about automated testing in Drupal? Take my <a href="%site.url%/atdc">free 10-day email course</a> and get daily lessons straight to your inbox.
pair: |
Need help or want another pair of eyes on your code? Book a <a href="%site.url%/call">1-on-1 consulting call</a> or an <a href="%site.url%/pair">online pair programming session</a> with a 100% money-back guarantee.
menu_links:
- title: About
url: /
- title: Press Info
url: /press
- title: Services
url: /pricing
- title: Talks and Workshops
url: /talks
- title: Podcast
url: /podcast
- title: Daily list
url: /daily
meta:
description: |-
Oliver is an Acquia triple-certified Drupal expert, core contributor, Developer, Consultant and multiple-time DrupalCon speaker.
testimonials:
- text: |
The course was very informative. One of the biggest pain points with Drupal testing was that there was no clear and definitive guide on setting up the PHPUnit XML file to get functional and kernel tests working right away. Your guide was fantastic and I will definitely be using it going forward in my module development for work.
name: Frank Landry
title: ~
image:
url: '%site.assets.url%/assets/images/recommendations/frank-landry.jpg'
tags: [testing, atdc]
- text: |
Well done. You've created a really excellent resource here that has the potential to bring Drupal development forward a huge leap. You've managed to simplify and share some often complex-seeming issues.
name: Adam Nuttall
title: Drupal Engineer
image:
url: '%site.assets.url%/assets/images/recommendations/adam-nuttall.jpg'
tags: [testing, atdc]
- text: |
I'm liking your short emails. They're just the right length that isn't too distracting but I'm able to consume it in a single glance.
name: Kevin Coyle
title: Design System Engineering Consultant
url: https://www.coyledesign.co.uk
image:
url: '%site.assets.url%/assets/images/recommendations/kevin-coyle.jpg'
tags: [daily]
- text: |
I really love your daily posts. They are opinionated, and this gives room for thoughts, I appreciate this.
name: Boris Böhne
title: Drupal Developer
url: https://www.drupal.org/u/drubb
image:
url: '%site.assets.url%/assets/images/recommendations/boris-bohne.jpg'
tags: [daily]
- text: |
Following your "Automated tests" emails and they are great! Such pleasant reading. I love how you start from the very beginning and keep things simple, step by step.
Looking forward to more content!
name: Matthieu Scarset
title: Drupal Expert
url: https://matthieuscarset.com
image:
url: '%site.assets.url%/assets/images/recommendations/matthieu-scarset.jpg'
tags: [testing, atdc]
- text: |
Hi Oliver, we met briefly at the Tech Connect event in London last month. Been reading through a few of your latest posts and have found the messages valuable, especially as we spent the week learning about unit, integration and e2e testing. I have signed up to your mailing list to keep the good advice flowing!
name: Alexander Carr
title: Full Stack Software Engineer at School of Code
image:
url: '%site.assets.url%/assets/images/recommendations/alexander-carr.jpg'
tags: [daily]
- text: |
These emails are superb and make for very interesting reading. Thank you!
name: Adam Nuttall
title: Drupal Engineer
image:
url: '%site.assets.url%/assets/images/recommendations/adam-nuttall.jpg'
tags: [daily]
- text: Oliver's approach to testing is a continual reminder of his commitment to delivering high-quality, bug-free, software.
name: Mike Karthauser
title: Senior Software Engineer
image:
url: '%site.assets.url%/assets/images/recommendations/mike-karthauser.jpg'
tags: [daily, testing, coaching, atdc]
- text: |
I had the opportunity and good fortune to work with Oliver solving two problems that I was having on a Drupal Commerce site. I have done several Drupal sites using UberCart, but since it is deprecated, I chose to use Commerce. I had searched, posted to forums, and other normal means to find answers to my problems, to no response and to no avail.
I got a referral to Oliver and scheduled an appointment to discuss the problems on a Zoom call. After showing him via screen share where I was stumped, he offered different approaches to what I was doing, which I was fine with as long as it worked.
Once we solved the first problem, I was really elated and then focused on the second one, which was an easier fix. So in a short period of time, both problems were fixed and tested.
I found Oliver affable and easy to work with. He has a strong work ethic and a desire to solve problems for his customers, and I can recommend working with him. I think one of his strengths is finding alternative solutions to problems.
name: Tom Evans
title: ~
image: ~
tags: [call]
- text: |
I am a big fan of your git approaches. I especially remember pairing with you and watching how many commands you run to solve many problems and how fast you were. It's a skill I believe not many have, particularly those who are used to working with a GUI like me, and personally I think it is quite valuable.
name: Marcos Duran
title: Senior Software Engineer
image:
url: '%site.assets.url%/assets/images/recommendations/marcos-duran.jpg'
tags: [git, daily, coaching]
- text: |
I like the "$ git log -S" and "$ git log --grep" commands, will definitely be using these, thanks!
name: Stephen Mulvihill
title: Solutions Architect
image:
url: '%site.assets.url%/assets/images/recommendations/stephen-mulvihill.jpg'
tags: [git, daily, coaching]
- text: |
Just wanted to say that your blog is amazing <3 I absolutely love it and usually share it with colleagues and some of the kids at my Code Club.
Thanks for contributing to the community with your amazing content!
name: Patty O'Callaghan
title: Tech Lead
url: https://pattyocallaghan.com
image:
url: '%site.assets.url%/assets/images/recommendations/patty-ocallaghan.jpg'
tags: [daily]
- text: |
I've wanted to explore testing for a while, but as a PHP developer with 10 years of Drupal experience who'd written next to no tests, I really needed guidance. Oliver's expertise in testing and TDD motivated me to seek his help.
Before our call, I'd started writing tests for my modules but needed direction, understanding the code to a degree but needing help with approach. Oliver clarified both unit and integration testing, providing solutions for my challenges, and shared his code for inspiration and help. He also gave me ideas on how to utilise contrib code to help me further.
Consulting with an expert, I gained the clarity and confidence I needed in tackling testing with structured, maintainable practices. Oliver's adaptability and tailored services make him highly recommendable.
Thanks, Oliver - I feel empowered and know exactly what approach to take now!
name: Tawny Bartlett
title: Senior Drupal Developer
url: https://www.playingwithpixels.co.uk
image:
url: '%site.assets.url%/assets/images/recommendations/tawny.jpg'
tags: [testing, coaching, call, atdc]
- text: |
I've worked with Oliver for a number of years on B2C and B2B web projects and he has always demonstrated himself to be an expert in his field.
As an insurance provider, some of our products and services don't naturally fit within a traditional ecommerce journey - but Oliver has always been able to come up with innovative ways to leverage core Drupal functionality and develop custom modules to meet our needs.
Friendly, flexible and diligent - I wouldn't hesitate to recommend Oliver to anyone looking for a Drupal developer to progress their next project.
name: Joe Howell
title: Director, Bastion Insurance
url: https://www.bastioninsurance.co.uk
image:
url: '%site.assets.url%/assets/images/recommendations/joe-howell.jpg'
tags: [front, subscription, coaching]
- text: |
We use Oliver for maintaining a couple of Drupal sites for which we no longer have the skills ourselves. We became aware of Oliver through his work in the Drupal community, and about a year ago we approached him to help us with the deep dive aspects of maintaining and developing Drupal sites. He's been really helpful and very responsive. Much appreciated!
name: Jon Hallett
title: Senior Systems Administrator at the University of Bristol
url: https://bristol.ac.uk
image:
url: '%site.assets.url%/assets/images/recommendations/jon-hallett.jpeg'
tags: [front, subscription]
- text: |
For over a decade we have worked with Oliver on a number of different projects. Initially our collaboration consisted of web maintenance and troubleshooting but we soon tapped Oliver to design, build and maintain a custom awards site which includes both submission and judging functionality. Oliver has deep and wide-ranging skills and I would certainly recommend his services!
name: Michael Itkoff
title: Cofounder
url: https://www.daylightbooks.org
image:
url: '%site.assets.url%/assets/images/recommendations/michael-itkoff.jpg'
tags: [front, subscription]
- text: |
Working with Oliver on the Seren website has been easy and beneficial.
As well as providing general maintenance support, he built a new Drupal module which integrated with our partner Glassboxx so that we could sell ebooks directly from our website. Oliver worked closely with the team at Glassboxx to create the integration, which needed to communicate with the Glassboxx app so that users could download their purchases. He was able to resolve issues which came up along the way in order to create a functioning module which we now use on our site.
Oliver has extensive knowledge of Drupal and his familiarity with the Seren site meant he was able to fix problems quickly and efficiently as they arose.
He is reliable and has always been willing to discuss new ideas for how the site could function.
We would recommend working with him for his invaluable knowledge and ability to find solutions to problems at short notice. It has been a pleasure to work with him over the years.
name: Mick Felton
title: Publisher at Poetry Wales Press Ltd (Seren Books)
url: https://www.serenbooks.com
image:
url: '%site.assets.url%/assets/images/recommendations/mick-felton.jpg'
tags: [front, subscription, coaching]
- text: |
Oliver is a pleasure to work with, and I would engage him again without hesitation. He communicates regularly, ensures that he meets requirements, and suggests improvements to the potential solutions to the brief.
name: Duncan Davidson
title: Director at Rohallion
url: https://rohallion.agency
image:
url: '%site.assets.url%/assets/images/recommendations/duncan.jpeg'
tags: [front, subscription, coaching]
- text: |
We have only worked together for a short while but I can see Oliver is a Drupal expert.
His technical knowledge means we have been able to make improvements to the sites we manage quickly and efficiently.
If we have complex issues to contend with in the future I feel confident he will be able to deal with them.
name: Anonymous
title: Marketing Strategist
tags: [front, subscription]
- text: |
A fantastic and highly knowledgeable Drupal Developer. Oliver saved a struggling Drupal project with his wealth of Drupal experience.
name: Adam Cuddihy
title: Web Development Manager
url: ~
image:
url: '%site.assets.url%/assets/images/recommendations/adam.jpeg'
tags: [front, subscription]
- text: |
I had the pleasure of working with Oliver whilst building the first version of our Drupal-based intranet. His knowledge of Drupal and the wider infrastructure required to run a site was really invaluable.
At the time, we were very new to Drupal, so it gave us a great platform to learn from and expand our own knowledge.
He's the only external contractor that we've kept in touch with over the years, which goes to show how much we valued his input.
name: Huw Davies
title: Web Dev Manager / DevOps / Team Manager at Admiral Group Plc
url: https://admiral.com
image:
url: '%site.assets.url%/assets/images/recommendations/huw.jpeg'
tags: [front, subscription]
- text: |
Oliver really knows his stuff. Whether you are just starting out or looking to take your knowledge to the next level, his patient and clear way of explaining will help get you there.
name: Scott Euser
title: Head of Web Development
url: ~
image:
url: '%site.assets.url%/assets/images/recommendations/scott-euser.jpg'
tags: [testing, coaching]
- text: |
I have had the pleasure of working with Oliver on several projects at Microserve. He is a natural innovator and a great mentor who inspires others to explore new technologies and approaches. He is a highly knowledgeable professional with a passion for all things Drupal and the tenacity required to get the job done well.
name: Alan Hatch
title: Senior Drupal Developer at Microserve
url: ~
image:
url: '%site.assets.url%/assets/images/recommendations/alan.jpeg'
tags: [coaching]
- text: |
Oliver has been an outstanding contributor to the Drupal Association team. He is a talented developer who writes great code and applies his curiosity and love of learning to every project. He is also a fantastic team member, who gives to the team as much as he gets.
Oliver is the embodiment of everything good about the Drupal community.
name: Holly Ross
title: Executive Director at the Drupal Association
url: https://www.drupal.org/association
image:
url: '%site.assets.url%/assets/images/recommendations/holly-ross.png'
tags: [front, subscription]
- text: |
Oliver is a skilled Drupal developer with a passion for the Drupal community. As his direct supervisor, I was able to watch Oliver grow with the Drupal Association and contribute an amazing amount of effort and integrity to all of his work.
Everything we have thrown at Oliver, he has approached with an open and flexible mind that has allowed him to work on a wide range of projects and features for Drupal products.
name: Josh Mitchell
title: CTO at Drupal Association
url: https://joshuami.com
image:
url: '%site.assets.url%/assets/images/recommendations/josh-mitchell.png'
tags: [front, subscription]
- text: |
Oliver was great to work with. He has expert knowledge with Drupal and delivered exactly what we were looking for on time. He's understanding, friendly and easy to get along with. I would enjoy working with him again in the future.
name: Brian Hartwell
title: Interactive Creative Director
url: ~
image: ~
tags: [front, subscription]
- text: |
Oliver was fantastic to work with - pro-active and highly responsive, he worked well remotely and as part of a project team. His understanding of the project requirement(s) and ability to translate it into working code was essential and he delivered.
name: Brian Healy
title: Director of Business Development at Tincan
url: ~
image:
url: '%site.assets.url%/assets/images/recommendations/brian-healy.png'
tags: [front, subscription]
- text: |
Oliver is an amazing colleague, he's professional, full of knowledge and I could not recommend him more.
name: Chris Jarvis
title: Developer at Microserve
url: ~
image:
url: '%site.assets.url%/assets/images/recommendations/chris-jarvis.jpg'
- text: |
Oliver is a seasoned Drupal developer and an all-round highly skilled and experienced web developer. I have worked with Oliver on an important project where he was reliable, prompt and ensured strict client deadline delivery and confidentiality at all times.
name: Daniel Easterbrook
title: Digital Strategy Consultant
tags: [front, subscription]
plausible:
domain: ~
prose_classes: |
prose prose-p:text-black prose-a:font-light prose-a:text-blue-primary prose-p:text-lg prose-blockquote:border-blue-primary dark:marker:text-white prose-li:my-1 prose-li:text-lg prose-figcaption:text-white prose-li:text-black marker:text-black dark:prose-p:text-white dark:prose-invert dark:prose-a:text-blue-400 dark:prose-blockquote:border-blue-400 dark:prose-li:text-white hover:prose-a:no-underline prose-h2:text-xl prose-code:font-normal prose-h2:mb-4 prose-ul:my-3 dark:prose-hr:border-grey-400 prose-code:before:content-[''] prose-code:after:content-['']
transistor:
feed:
url: https://feeds.transistor.fm/beyond-blocks
share:
url: https://share.transistor.fm/e
youtube:
channel:
slug: opdavies
url: https://www.youtube.com/@%youtube.channel.slug%


@@ -0,0 +1,8 @@
---
imports:
- sculpin_site.yml
plausible:
domain: oliverdavies.uk
url: https://www.oliverdavies.uk

assets/package.json (Normal file, 7 lines)

@@ -0,0 +1,7 @@
{
"dependencies": {
"@tailwindcss/forms": "^0.5.7",
"@tailwindcss/typography": "^0.5.10",
"tailwindcss": "^3.4.0"
}
}

assets/pnpm-lock.yaml (Normal file, 779 lines)

@@ -0,0 +1,779 @@
lockfileVersion: '6.0'
settings:
autoInstallPeers: true
excludeLinksFromLockfile: false
dependencies:
'@tailwindcss/forms':
specifier: ^0.5.7
version: 0.5.7(tailwindcss@3.4.0)
'@tailwindcss/typography':
specifier: ^0.5.10
version: 0.5.10(tailwindcss@3.4.0)
tailwindcss:
specifier: ^3.4.0
version: 3.4.0
packages:
/@alloc/quick-lru@5.2.0:
resolution: {integrity: sha512-UrcABB+4bUrFABwbluTIBErXwvbsU/V7TZWfmbgJfbkwiBuziS9gxdODUyuiecfdGQ85jglMW6juS3+z5TsKLw==}
engines: {node: '>=10'}
dev: false
/@isaacs/cliui@8.0.2:
resolution: {integrity: sha512-O8jcjabXaleOG9DQ0+ARXWZBTfnP4WNAqzuiJK7ll44AmxGKv/J2M4TPjxjY3znBCfvBXFzucm1twdyFybFqEA==}
engines: {node: '>=12'}
dependencies:
string-width: 5.1.2
string-width-cjs: /string-width@4.2.3
strip-ansi: 7.1.0
strip-ansi-cjs: /strip-ansi@6.0.1
wrap-ansi: 8.1.0
wrap-ansi-cjs: /wrap-ansi@7.0.0
dev: false
/@jridgewell/gen-mapping@0.3.3:
resolution: {integrity: sha512-HLhSWOLRi875zjjMG/r+Nv0oCW8umGb0BgEhyX3dDX3egwZtB8PqLnjz3yedt8R5StBrzcg4aBpnh8UA9D1BoQ==}
engines: {node: '>=6.0.0'}
dependencies:
'@jridgewell/set-array': 1.1.2
'@jridgewell/sourcemap-codec': 1.4.15
'@jridgewell/trace-mapping': 0.3.20
dev: false
/@jridgewell/resolve-uri@3.1.1:
resolution: {integrity: sha512-dSYZh7HhCDtCKm4QakX0xFpsRDqjjtZf/kjI/v3T3Nwt5r8/qz/M19F9ySyOqU94SXBmeG9ttTul+YnR4LOxFA==}
engines: {node: '>=6.0.0'}
dev: false
/@jridgewell/set-array@1.1.2:
resolution: {integrity: sha512-xnkseuNADM0gt2bs+BvhO0p78Mk762YnZdsuzFV018NoG1Sj1SCQvpSqa7XUaTam5vAGasABV9qXASMKnFMwMw==}
engines: {node: '>=6.0.0'}
dev: false
/@jridgewell/sourcemap-codec@1.4.15:
resolution: {integrity: sha512-eF2rxCRulEKXHTRiDrDy6erMYWqNw4LPdQ8UQA4huuxaQsVeRPFl2oM8oDGxMFhJUWZf9McpLtJasDDZb/Bpeg==}
dev: false
/@jridgewell/trace-mapping@0.3.20:
resolution: {integrity: sha512-R8LcPeWZol2zR8mmH3JeKQ6QRCFb7XgUhV9ZlGhHLGyg4wpPiPZNQOOWhFZhxKw8u//yTbNGI42Bx/3paXEQ+Q==}
dependencies:
'@jridgewell/resolve-uri': 3.1.1
'@jridgewell/sourcemap-codec': 1.4.15
dev: false
/@nodelib/fs.scandir@2.1.5:
resolution: {integrity: sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==}
engines: {node: '>= 8'}
dependencies:
'@nodelib/fs.stat': 2.0.5
run-parallel: 1.2.0
dev: false
/@nodelib/fs.stat@2.0.5:
resolution: {integrity: sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==}
engines: {node: '>= 8'}
dev: false
/@nodelib/fs.walk@1.2.8:
resolution: {integrity: sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==}
engines: {node: '>= 8'}
dependencies:
'@nodelib/fs.scandir': 2.1.5
fastq: 1.16.0
dev: false
/@pkgjs/parseargs@0.11.0:
resolution: {integrity: sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg==}
engines: {node: '>=14'}
requiresBuild: true
dev: false
optional: true
/@tailwindcss/forms@0.5.7(tailwindcss@3.4.0):
resolution: {integrity: sha512-QE7X69iQI+ZXwldE+rzasvbJiyV/ju1FGHH0Qn2W3FKbuYtqp8LKcy6iSw79fVUT5/Vvf+0XgLCeYVG+UV6hOw==}
peerDependencies:
tailwindcss: '>=3.0.0 || >= 3.0.0-alpha.1'
dependencies:
mini-svg-data-uri: 1.4.4
tailwindcss: 3.4.0
dev: false
/@tailwindcss/typography@0.5.10(tailwindcss@3.4.0):
resolution: {integrity: sha512-Pe8BuPJQJd3FfRnm6H0ulKIGoMEQS+Vq01R6M5aCrFB/ccR/shT+0kXLjouGC1gFLm9hopTFN+DMP0pfwRWzPw==}
peerDependencies:
tailwindcss: '>=3.0.0 || insiders'
dependencies:
lodash.castarray: 4.4.0
lodash.isplainobject: 4.0.6
lodash.merge: 4.6.2
postcss-selector-parser: 6.0.10
tailwindcss: 3.4.0
dev: false
/ansi-regex@5.0.1:
resolution: {integrity: sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==}
engines: {node: '>=8'}
dev: false
/ansi-regex@6.0.1:
resolution: {integrity: sha512-n5M855fKb2SsfMIiFFoVrABHJC8QtHwVx+mHWP3QcEqBHYienj5dHSgjbxtC0WEZXYt4wcD6zrQElDPhFuZgfA==}
engines: {node: '>=12'}
dev: false
/ansi-styles@4.3.0:
resolution: {integrity: sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==}
engines: {node: '>=8'}
dependencies:
color-convert: 2.0.1
dev: false
/ansi-styles@6.2.1:
resolution: {integrity: sha512-bN798gFfQX+viw3R7yrGWRqnrN2oRkEkUjjl4JNn4E8GxxbjtG3FbrEIIY3l8/hrwUwIeCZvi4QuOTP4MErVug==}
engines: {node: '>=12'}
dev: false
/any-promise@1.3.0:
resolution: {integrity: sha512-7UvmKalWRt1wgjL1RrGxoSJW/0QZFIegpeGvZG9kjp8vrRu55XTHbwnqq2GpXm9uLbcuhxm3IqX9OB4MZR1b2A==}
dev: false
/anymatch@3.1.3:
resolution: {integrity: sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==}
engines: {node: '>= 8'}
dependencies:
normalize-path: 3.0.0
picomatch: 2.3.1
dev: false
/arg@5.0.2:
resolution: {integrity: sha512-PYjyFOLKQ9y57JvQ6QLo8dAgNqswh8M1RMJYdQduT6xbWSgK36P/Z/v+p888pM69jMMfS8Xd8F6I1kQ/I9HUGg==}
dev: false
/balanced-match@1.0.2:
resolution: {integrity: sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==}
dev: false
/binary-extensions@2.2.0:
resolution: {integrity: sha512-jDctJ/IVQbZoJykoeHbhXpOlNBqGNcwXJKJog42E5HDPUwQTSdjCHdihjj0DlnheQ7blbT6dHOafNAiS8ooQKA==}
engines: {node: '>=8'}
dev: false
/brace-expansion@2.0.1:
resolution: {integrity: sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==}
dependencies:
balanced-match: 1.0.2
dev: false
/braces@3.0.2:
resolution: {integrity: sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==}
engines: {node: '>=8'}
dependencies:
fill-range: 7.0.1
dev: false
/camelcase-css@2.0.1:
resolution: {integrity: sha512-QOSvevhslijgYwRx6Rv7zKdMF8lbRmx+uQGx2+vDc+KI/eBnsy9kit5aj23AgGu3pa4t9AgwbnXWqS+iOY+2aA==}
engines: {node: '>= 6'}
dev: false
/chokidar@3.5.3:
resolution: {integrity: sha512-Dr3sfKRP6oTcjf2JmUmFJfeVMvXBdegxB0iVQ5eb2V10uFJUCAS8OByZdVAyVb8xXNz3GjjTgj9kLWsZTqE6kw==}
engines: {node: '>= 8.10.0'}
dependencies:
anymatch: 3.1.3
braces: 3.0.2
glob-parent: 5.1.2
is-binary-path: 2.1.0
is-glob: 4.0.3
normalize-path: 3.0.0
readdirp: 3.6.0
optionalDependencies:
fsevents: 2.3.3
dev: false
/color-convert@2.0.1:
resolution: {integrity: sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==}
engines: {node: '>=7.0.0'}
dependencies:
color-name: 1.1.4
dev: false
/color-name@1.1.4:
resolution: {integrity: sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==}
dev: false
/commander@4.1.1:
resolution: {integrity: sha512-NOKm8xhkzAjzFx8B2v5OAHT+u5pRQc2UCa2Vq9jYL/31o2wi9mxBA7LIFs3sV5VSC49z6pEhfbMULvShKj26WA==}
engines: {node: '>= 6'}
dev: false
/cross-spawn@7.0.3:
resolution: {integrity: sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==}
engines: {node: '>= 8'}
dependencies:
path-key: 3.1.1
shebang-command: 2.0.0
which: 2.0.2
dev: false
/cssesc@3.0.0:
resolution: {integrity: sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==}
engines: {node: '>=4'}
hasBin: true
dev: false
/didyoumean@1.2.2:
resolution: {integrity: sha512-gxtyfqMg7GKyhQmb056K7M3xszy/myH8w+B4RT+QXBQsvAOdc3XymqDDPHx1BgPgsdAA5SIifona89YtRATDzw==}
dev: false
/dlv@1.1.3:
resolution: {integrity: sha512-+HlytyjlPKnIG8XuRG8WvmBP8xs8P71y+SKKS6ZXWoEgLuePxtDoUEiH7WkdePWrQ5JBpE6aoVqfZfJUQkjXwA==}
dev: false
/eastasianwidth@0.2.0:
resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==}
dev: false
/emoji-regex@8.0.0:
resolution: {integrity: sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==}
dev: false
/emoji-regex@9.2.2:
resolution: {integrity: sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==}
dev: false
/fast-glob@3.3.2:
resolution: {integrity: sha512-oX2ruAFQwf/Orj8m737Y5adxDQO0LAB7/S5MnxCdTNDd4p6BsyIVsv9JQsATbTSq8KHRpLwIHbVlUNatxd+1Ow==}
engines: {node: '>=8.6.0'}
dependencies:
'@nodelib/fs.stat': 2.0.5
'@nodelib/fs.walk': 1.2.8
glob-parent: 5.1.2
merge2: 1.4.1
micromatch: 4.0.5
dev: false
/fastq@1.16.0:
resolution: {integrity: sha512-ifCoaXsDrsdkWTtiNJX5uzHDsrck5TzfKKDcuFFTIrrc/BS076qgEIfoIy1VeZqViznfKiysPYTh/QeHtnIsYA==}
dependencies:
reusify: 1.0.4
dev: false
/fill-range@7.0.1:
resolution: {integrity: sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==}
engines: {node: '>=8'}
dependencies:
to-regex-range: 5.0.1
dev: false
/foreground-child@3.1.1:
resolution: {integrity: sha512-TMKDUnIte6bfb5nWv7V/caI169OHgvwjb7V4WkeUvbQQdjr5rWKqHFiKWb/fcOwB+CzBT+qbWjvj+DVwRskpIg==}
engines: {node: '>=14'}
dependencies:
cross-spawn: 7.0.3
signal-exit: 4.1.0
dev: false
/fsevents@2.3.3:
resolution: {integrity: sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==}
engines: {node: ^8.16.0 || ^10.6.0 || >=11.0.0}
os: [darwin]
requiresBuild: true
dev: false
optional: true
/function-bind@1.1.2:
resolution: {integrity: sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==}
dev: false
/glob-parent@5.1.2:
resolution: {integrity: sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==}
engines: {node: '>= 6'}
dependencies:
is-glob: 4.0.3
dev: false
/glob-parent@6.0.2:
resolution: {integrity: sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==}
engines: {node: '>=10.13.0'}
dependencies:
is-glob: 4.0.3
dev: false
/glob@10.3.10:
resolution: {integrity: sha512-fa46+tv1Ak0UPK1TOy/pZrIybNNt4HCv7SDzwyfiOZkvZLEbjsZkJBPtDHVshZjbecAoAGSC20MjLDG/qr679g==}
engines: {node: '>=16 || 14 >=14.17'}
hasBin: true
dependencies:
foreground-child: 3.1.1
jackspeak: 2.3.6
minimatch: 9.0.3
minipass: 7.0.4
path-scurry: 1.10.1
dev: false
/hasown@2.0.0:
resolution: {integrity: sha512-vUptKVTpIJhcczKBbgnS+RtcuYMB8+oNzPK2/Hp3hanz8JmpATdmmgLgSaadVREkDm+e2giHwY3ZRkyjSIDDFA==}
engines: {node: '>= 0.4'}
dependencies:
function-bind: 1.1.2
dev: false
/is-binary-path@2.1.0:
resolution: {integrity: sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==}
engines: {node: '>=8'}
dependencies:
binary-extensions: 2.2.0
dev: false
/is-core-module@2.13.1:
resolution: {integrity: sha512-hHrIjvZsftOsvKSn2TRYl63zvxsgE0K+0mYMoH6gD4omR5IWB2KynivBQczo3+wF1cCkjzvptnI9Q0sPU66ilw==}
dependencies:
hasown: 2.0.0
dev: false
/is-extglob@2.1.1:
resolution: {integrity: sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==}
engines: {node: '>=0.10.0'}
dev: false
/is-fullwidth-code-point@3.0.0:
resolution: {integrity: sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==}
engines: {node: '>=8'}
dev: false
/is-glob@4.0.3:
resolution: {integrity: sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==}
engines: {node: '>=0.10.0'}
dependencies:
is-extglob: 2.1.1
dev: false
/is-number@7.0.0:
resolution: {integrity: sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==}
engines: {node: '>=0.12.0'}
dev: false
/isexe@2.0.0:
resolution: {integrity: sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==}
dev: false
/jackspeak@2.3.6:
resolution: {integrity: sha512-N3yCS/NegsOBokc8GAdM8UcmfsKiSS8cipheD/nivzr700H+nsMOxJjQnvwOcRYVuFkdH0wGUvW2WbXGmrZGbQ==}
engines: {node: '>=14'}
dependencies:
'@isaacs/cliui': 8.0.2
optionalDependencies:
'@pkgjs/parseargs': 0.11.0
dev: false
/jiti@1.21.0:
resolution: {integrity: sha512-gFqAIbuKyyso/3G2qhiO2OM6shY6EPP/R0+mkDbyspxKazh8BXDC5FiFsUjlczgdNz/vfra0da2y+aHrusLG/Q==}
hasBin: true
dev: false
/lilconfig@2.1.0:
resolution: {integrity: sha512-utWOt/GHzuUxnLKxB6dk81RoOeoNeHgbrXiuGk4yyF5qlRz+iIVWu56E2fqGHFrXz0QNUhLB/8nKqvRH66JKGQ==}
engines: {node: '>=10'}
dev: false
/lilconfig@3.0.0:
resolution: {integrity: sha512-K2U4W2Ff5ibV7j7ydLr+zLAkIg5JJ4lPn1Ltsdt+Tz/IjQ8buJ55pZAxoP34lqIiwtF9iAvtLv3JGv7CAyAg+g==}
engines: {node: '>=14'}
dev: false
/lines-and-columns@1.2.4:
resolution: {integrity: sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==}
dev: false
/lodash.castarray@4.4.0:
resolution: {integrity: sha512-aVx8ztPv7/2ULbArGJ2Y42bG1mEQ5mGjpdvrbJcJFU3TbYybe+QlLS4pst9zV52ymy2in1KpFPiZnAOATxD4+Q==}
dev: false
/lodash.isplainobject@4.0.6:
resolution: {integrity: sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA==}
dev: false
/lodash.merge@4.6.2:
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
dev: false
/lru-cache@10.1.0:
resolution: {integrity: sha512-/1clY/ui8CzjKFyjdvwPWJUYKiFVXG2I2cY0ssG7h4+hwk+XOIX7ZSG9Q7TW8TW3Kp3BUSqgFWBLgL4PJ+Blag==}
engines: {node: 14 || >=16.14}
dev: false
/merge2@1.4.1:
resolution: {integrity: sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==}
engines: {node: '>= 8'}
dev: false
/micromatch@4.0.5:
resolution: {integrity: sha512-DMy+ERcEW2q8Z2Po+WNXuw3c5YaUSFjAO5GsJqfEl7UjvtIuFKO6ZrKvcItdy98dwFI2N1tg3zNIdKaQT+aNdA==}
engines: {node: '>=8.6'}
dependencies:
braces: 3.0.2
picomatch: 2.3.1
dev: false
/mini-svg-data-uri@1.4.4:
resolution: {integrity: sha512-r9deDe9p5FJUPZAk3A59wGH7Ii9YrjjWw0jmw/liSbHl2CHiyXj6FcDXDu2K3TjVAXqiJdaw3xxwlZZr9E6nHg==}
hasBin: true
dev: false
/minimatch@9.0.3:
resolution: {integrity: sha512-RHiac9mvaRw0x3AYRgDC1CxAP7HTcNrrECeA8YYJeWnpo+2Q5CegtZjaotWTWxDG3UeGA1coE05iH1mPjT/2mg==}
engines: {node: '>=16 || 14 >=14.17'}
dependencies:
brace-expansion: 2.0.1
dev: false
/minipass@7.0.4:
resolution: {integrity: sha512-jYofLM5Dam9279rdkWzqHozUo4ybjdZmCsDHePy5V/PbBcVMiSZR97gmAy45aqi8CK1lG2ECd356FU86avfwUQ==}
engines: {node: '>=16 || 14 >=14.17'}
dev: false
/mz@2.7.0:
resolution: {integrity: sha512-z81GNO7nnYMEhrGh9LeymoE4+Yr0Wn5McHIZMK5cfQCl+NDX08sCZgUc9/6MHni9IWuFLm1Z3HTCXu2z9fN62Q==}
dependencies:
any-promise: 1.3.0
object-assign: 4.1.1
thenify-all: 1.6.0
dev: false
/nanoid@3.3.7:
resolution: {integrity: sha512-eSRppjcPIatRIMC1U6UngP8XFcz8MQWGQdt1MTBQ7NaAmvXDfvNxbvWV3x2y6CdEUciCSsDHDQZbhYaB8QEo2g==}
engines: {node: ^10 || ^12 || ^13.7 || ^14 || >=15.0.1}
hasBin: true
dev: false
/normalize-path@3.0.0:
resolution: {integrity: sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==}
engines: {node: '>=0.10.0'}
dev: false
/object-assign@4.1.1:
resolution: {integrity: sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==}
engines: {node: '>=0.10.0'}
dev: false
/object-hash@3.0.0:
resolution: {integrity: sha512-RSn9F68PjH9HqtltsSnqYC1XXoWe9Bju5+213R98cNGttag9q9yAOTzdbsqvIa7aNm5WffBZFpWYr2aWrklWAw==}
engines: {node: '>= 6'}
dev: false
/path-key@3.1.1:
resolution: {integrity: sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==}
engines: {node: '>=8'}
dev: false
/path-parse@1.0.7:
resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==}
dev: false
/path-scurry@1.10.1:
resolution: {integrity: sha512-MkhCqzzBEpPvxxQ71Md0b1Kk51W01lrYvlMzSUaIzNsODdd7mqhiimSZlr+VegAz5Z6Vzt9Xg2ttE//XBhH3EQ==}
engines: {node: '>=16 || 14 >=14.17'}
dependencies:
lru-cache: 10.1.0
minipass: 7.0.4
dev: false
/picocolors@1.0.0:
resolution: {integrity: sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ==}
dev: false
/picomatch@2.3.1:
resolution: {integrity: sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==}
engines: {node: '>=8.6'}
dev: false
/pify@2.3.0:
resolution: {integrity: sha512-udgsAY+fTnvv7kI7aaxbqwWNb0AHiB0qBO89PZKPkoTmGOgdbrHDKD+0B2X4uTfJ/FT1R09r9gTsjUjNJotuog==}
engines: {node: '>=0.10.0'}
dev: false
/pirates@4.0.6:
resolution: {integrity: sha512-saLsH7WeYYPiD25LDuLRRY/i+6HaPYr6G1OUlN39otzkSTxKnubR9RTxS3/Kk50s1g2JTgFwWQDQyplC5/SHZg==}
engines: {node: '>= 6'}
dev: false
/postcss-import@15.1.0(postcss@8.4.32):
resolution: {integrity: sha512-hpr+J05B2FVYUAXHeK1YyI267J/dDDhMU6B6civm8hSY1jYJnBXxzKDKDswzJmtLHryrjhnDjqqp/49t8FALew==}
engines: {node: '>=14.0.0'}
peerDependencies:
postcss: ^8.0.0
dependencies:
postcss: 8.4.32
postcss-value-parser: 4.2.0
read-cache: 1.0.0
resolve: 1.22.8
dev: false
/postcss-js@4.0.1(postcss@8.4.32):
resolution: {integrity: sha512-dDLF8pEO191hJMtlHFPRa8xsizHaM82MLfNkUHdUtVEV3tgTp5oj+8qbEqYM57SLfc74KSbw//4SeJma2LRVIw==}
engines: {node: ^12 || ^14 || >= 16}
peerDependencies:
postcss: ^8.4.21
dependencies:
camelcase-css: 2.0.1
postcss: 8.4.32
dev: false
/postcss-load-config@4.0.2(postcss@8.4.32):
resolution: {integrity: sha512-bSVhyJGL00wMVoPUzAVAnbEoWyqRxkjv64tUl427SKnPrENtq6hJwUojroMz2VB+Q1edmi4IfrAPpami5VVgMQ==}
engines: {node: '>= 14'}
peerDependencies:
postcss: '>=8.0.9'
ts-node: '>=9.0.0'
peerDependenciesMeta:
postcss:
optional: true
ts-node:
optional: true
dependencies:
lilconfig: 3.0.0
postcss: 8.4.32
yaml: 2.3.4
dev: false
/postcss-nested@6.0.1(postcss@8.4.32):
resolution: {integrity: sha512-mEp4xPMi5bSWiMbsgoPfcP74lsWLHkQbZc3sY+jWYd65CUwXrUaTp0fmNpa01ZcETKlIgUdFN/MpS2xZtqL9dQ==}
engines: {node: '>=12.0'}
peerDependencies:
postcss: ^8.2.14
dependencies:
postcss: 8.4.32
postcss-selector-parser: 6.0.14
dev: false
/postcss-selector-parser@6.0.10:
resolution: {integrity: sha512-IQ7TZdoaqbT+LCpShg46jnZVlhWD2w6iQYAcYXfHARZ7X1t/UGhhceQDs5X0cGqKvYlHNOuv7Oa1xmb0oQuA3w==}
engines: {node: '>=4'}
dependencies:
cssesc: 3.0.0
util-deprecate: 1.0.2
dev: false
/postcss-selector-parser@6.0.14:
resolution: {integrity: sha512-65xXYsT40i9GyWzlHQ5ShZoK7JZdySeOozi/tz2EezDo6c04q6+ckYMeoY7idaie1qp2dT5KoYQ2yky6JuoHnA==}
engines: {node: '>=4'}
dependencies:
cssesc: 3.0.0
util-deprecate: 1.0.2
dev: false
/postcss-value-parser@4.2.0:
resolution: {integrity: sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ==}
dev: false
/postcss@8.4.32:
resolution: {integrity: sha512-D/kj5JNu6oo2EIy+XL/26JEDTlIbB8hw85G8StOE6L74RQAVVP5rej6wxCNqyMbR4RkPfqvezVbPw81Ngd6Kcw==}
engines: {node: ^10 || ^12 || >=14}
dependencies:
nanoid: 3.3.7
picocolors: 1.0.0
source-map-js: 1.0.2
dev: false
/queue-microtask@1.2.3:
resolution: {integrity: sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A==}
dev: false
/read-cache@1.0.0:
resolution: {integrity: sha512-Owdv/Ft7IjOgm/i0xvNDZ1LrRANRfew4b2prF3OWMQLxLfu3bS8FVhCsrSCMK4lR56Y9ya+AThoTpDCTxCmpRA==}
dependencies:
pify: 2.3.0
dev: false
/readdirp@3.6.0:
resolution: {integrity: sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==}
engines: {node: '>=8.10.0'}
dependencies:
picomatch: 2.3.1
dev: false
/resolve@1.22.8:
resolution: {integrity: sha512-oKWePCxqpd6FlLvGV1VU0x7bkPmmCNolxzjMf4NczoDnQcIWrAF+cPtZn5i6n+RfD2d9i0tzpKnG6Yk168yIyw==}
hasBin: true
dependencies:
is-core-module: 2.13.1
path-parse: 1.0.7
supports-preserve-symlinks-flag: 1.0.0
dev: false
/reusify@1.0.4:
resolution: {integrity: sha512-U9nH88a3fc/ekCF1l0/UP1IosiuIjyTh7hBvXVMHYgVcfGvt897Xguj2UOLDeI5BG2m7/uwyaLVT6fbtCwTyzw==}
engines: {iojs: '>=1.0.0', node: '>=0.10.0'}
dev: false
/run-parallel@1.2.0:
resolution: {integrity: sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA==}
dependencies:
queue-microtask: 1.2.3
dev: false
/shebang-command@2.0.0:
resolution: {integrity: sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==}
engines: {node: '>=8'}
dependencies:
shebang-regex: 3.0.0
dev: false
/shebang-regex@3.0.0:
resolution: {integrity: sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==}
engines: {node: '>=8'}
dev: false
/signal-exit@4.1.0:
resolution: {integrity: sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==}
engines: {node: '>=14'}
dev: false
/source-map-js@1.0.2:
resolution: {integrity: sha512-R0XvVJ9WusLiqTCEiGCmICCMplcCkIwwR11mOSD9CR5u+IXYdiseeEuXCVAjS54zqwkLcPNnmU4OeJ6tUrWhDw==}
engines: {node: '>=0.10.0'}
dev: false
/string-width@4.2.3:
resolution: {integrity: sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==}
engines: {node: '>=8'}
dependencies:
emoji-regex: 8.0.0
is-fullwidth-code-point: 3.0.0
strip-ansi: 6.0.1
dev: false
/string-width@5.1.2:
resolution: {integrity: sha512-HnLOCR3vjcY8beoNLtcjZ5/nxn2afmME6lhrDrebokqMap+XbeW8n9TXpPDOqdGK5qcI3oT0GKTW6wC7EMiVqA==}
engines: {node: '>=12'}
dependencies:
eastasianwidth: 0.2.0
emoji-regex: 9.2.2
strip-ansi: 7.1.0
dev: false
/strip-ansi@6.0.1:
resolution: {integrity: sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==}
engines: {node: '>=8'}
dependencies:
ansi-regex: 5.0.1
dev: false
/strip-ansi@7.1.0:
resolution: {integrity: sha512-iq6eVVI64nQQTRYq2KtEg2d2uU7LElhTJwsH4YzIHZshxlgZms/wIc4VoDQTlG/IvVIrBKG06CrZnp0qv7hkcQ==}
engines: {node: '>=12'}
dependencies:
ansi-regex: 6.0.1
dev: false
/sucrase@3.35.0:
resolution: {integrity: sha512-8EbVDiu9iN/nESwxeSxDKe0dunta1GOlHufmSSXxMD2z2/tMZpDMpvXQGsc+ajGo8y2uYUmixaSRUc/QPoQ0GA==}
engines: {node: '>=16 || 14 >=14.17'}
hasBin: true
dependencies:
'@jridgewell/gen-mapping': 0.3.3
commander: 4.1.1
glob: 10.3.10
lines-and-columns: 1.2.4
mz: 2.7.0
pirates: 4.0.6
ts-interface-checker: 0.1.13
dev: false
/supports-preserve-symlinks-flag@1.0.0:
resolution: {integrity: sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==}
engines: {node: '>= 0.4'}
dev: false
/tailwindcss@3.4.0:
resolution: {integrity: sha512-VigzymniH77knD1dryXbyxR+ePHihHociZbXnLZHUyzf2MMs2ZVqlUrZ3FvpXP8pno9JzmILt1sZPD19M3IxtA==}
engines: {node: '>=14.0.0'}
hasBin: true
dependencies:
'@alloc/quick-lru': 5.2.0
arg: 5.0.2
chokidar: 3.5.3
didyoumean: 1.2.2
dlv: 1.1.3
fast-glob: 3.3.2
glob-parent: 6.0.2
is-glob: 4.0.3
jiti: 1.21.0
lilconfig: 2.1.0
micromatch: 4.0.5
normalize-path: 3.0.0
object-hash: 3.0.0
picocolors: 1.0.0
postcss: 8.4.32
postcss-import: 15.1.0(postcss@8.4.32)
postcss-js: 4.0.1(postcss@8.4.32)
postcss-load-config: 4.0.2(postcss@8.4.32)
postcss-nested: 6.0.1(postcss@8.4.32)
postcss-selector-parser: 6.0.14
resolve: 1.22.8
sucrase: 3.35.0
transitivePeerDependencies:
- ts-node
dev: false
/thenify-all@1.6.0:
resolution: {integrity: sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA==}
engines: {node: '>=0.8'}
dependencies:
thenify: 3.3.1
dev: false
/thenify@3.3.1:
resolution: {integrity: sha512-RVZSIV5IG10Hk3enotrhvz0T9em6cyHBLkH/YAZuKqd8hRkKhSfCGIcP2KUY0EPxndzANBmNllzWPwak+bheSw==}
dependencies:
any-promise: 1.3.0
dev: false
/to-regex-range@5.0.1:
resolution: {integrity: sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==}
engines: {node: '>=8.0'}
dependencies:
is-number: 7.0.0
dev: false
/ts-interface-checker@0.1.13:
resolution: {integrity: sha512-Y/arvbn+rrz3JCKl9C4kVNfTfSm2/mEp5FSz5EsZSANGPSlQrpRI5M4PKF+mJnE52jOO90PnPSc3Ur3bTQw0gA==}
dev: false
/util-deprecate@1.0.2:
resolution: {integrity: sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==}
dev: false
/which@2.0.2:
resolution: {integrity: sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==}
engines: {node: '>= 8'}
hasBin: true
dependencies:
isexe: 2.0.0
dev: false
/wrap-ansi@7.0.0:
resolution: {integrity: sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==}
engines: {node: '>=10'}
dependencies:
ansi-styles: 4.3.0
string-width: 4.2.3
strip-ansi: 6.0.1
dev: false
/wrap-ansi@8.1.0:
resolution: {integrity: sha512-si7QWI6zUMq56bESFvagtmzMdGOtoxfR+Sez11Mobfc7tm+VkUckk9bW2UeffTGVUbOksxmSw0AA2gs8g71NCQ==}
engines: {node: '>=12'}
dependencies:
ansi-styles: 6.2.1
string-width: 5.1.2
strip-ansi: 7.1.0
dev: false
/yaml@2.3.4:
resolution: {integrity: sha512-8aAvwVUSHpfEqTQ4w/KMlf3HcRdt50E5ODIQJBw1fQ5RL34xabzxtUlzTXVqc4rkZsPbvrXKWnABCD7kWSmocA==}
engines: {node: '>= 14'}
dev: false

38
assets/tailwind.config.ts Normal file

@ -0,0 +1,38 @@
import colours from "tailwindcss/colors";
import type { Config } from "tailwindcss";
import formsPlugin from "@tailwindcss/forms";
import typographyPlugin from "@tailwindcss/typography";

export default {
  content: [
    "../app/config/**/*.yml",
    "../source/**/*.{md,twig}",
  ],
  theme: {
    colors: {
      black: "#000",
      blue: {
        primary: "#24608A",
        400: "#60a5fa",
      },
      current: "currentColor",
      grey: colours.stone,
      inherit: "inherit",
      transparent: "transparent",
      white: "#fff",
    },
    extend: {
      fontFamily: {
        sans: [
          "Roboto Condensed",
          "Arial",
          "Helvetica Neue",
          "Helvetica",
          "sans-serif",
        ],
      },
    },
  },
  plugins: [formsPlugin, typographyPlugin],
} satisfies Config;

13
build.yaml Normal file

@ -0,0 +1,13 @@
---
name: oliverdavies-uk
type: sculpin
language: php

flake:
  devshell:
    packages:
      - bashInteractive
      - nodePackages.pnpm
      - nodejs
      - php82
      - php82Packages.composer

23
composer.json Normal file

@ -0,0 +1,23 @@
{
    "require": {
        "illuminate/collections": "^11.6",
        "opdavies/sculpin-twig-markdown-bundle": "^0.2.0",
        "sculpin/sculpin": "^3.2"
    },
    "config": {
        "allow-plugins": {
            "sculpin/sculpin-theme-composer-plugin": true
        }
    },
    "autoload": {
        "psr-4": {
            "App\\": "src"
        }
    },
    "autoload-dev": {
        "psr-4": {
            "App\\Tests\\": "tests"
        }
    },
    "require-dev": {
        "phpunit/phpunit": "^11.1"
    }
}

5453
composer.lock generated Normal file

File diff suppressed because it is too large.

27
flake.lock Normal file

@ -0,0 +1,27 @@
{
  "nodes": {
    "nixpkgs": {
      "locked": {
        "lastModified": 1711703276,
        "narHash": "sha256-iMUFArF0WCatKK6RzfUJknjem0H9m4KgorO/p3Dopkk=",
        "owner": "NixOS",
        "repo": "nixpkgs",
        "rev": "d8fe5e6c92d0d190646fb9f1056741a229980089",
        "type": "github"
      },
      "original": {
        "owner": "NixOS",
        "ref": "nixos-unstable",
        "repo": "nixpkgs",
        "type": "github"
      }
    },
    "root": {
      "inputs": {
        "nixpkgs": "nixpkgs"
      }
    }
  },
  "root": "root",
  "version": 7
}

19
flake.nix Normal file

@ -0,0 +1,19 @@
# Do not edit this file. It is automatically generated by https://www.oliverdavies.uk/build-configs.
{
  description = "A Nix Flake for oliverdavies-uk";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { nixpkgs, ... }:
    let
      system = "x86_64-linux";
      pkgs = nixpkgs.legacyPackages.${system};
      inherit (pkgs) mkShell;
    in {
      devShells.${system}.default = mkShell {
        buildInputs = with pkgs; [
          bashInteractive
          nodePackages.pnpm
          nodejs
          php82
          php82Packages.composer
        ];
      };

      formatter.${system} = pkgs.nixfmt;
    };
}

8
phpunit.xml.dist Normal file

@ -0,0 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:noNamespaceSchemaLocation="https://schema.phpunit.de/11.1/phpunit.xsd"
         bootstrap="./vendor/autoload.php"
         cacheDirectory=".phpunit.cache"
         colors="true"
         stopOnFailure="true">
  <testsuites>
    <testsuite name="unit tests">
      <directory suffix="Test.php">tests</directory>
    </testsuite>
  </testsuites>
</phpunit>


@ -0,0 +1,13 @@
---
title: {{ title }}
date: {{ date }}
permalink: {{ permalink }}
tags:
- software-development
# - drupal
# - php
# - podcast
cta: ~
snippet: |
TODO
---

44
run Executable file

@ -0,0 +1,44 @@
#!/usr/bin/env bash

# Do not edit this file. It is automatically generated by https://www.oliverdavies.uk/build-configs.

set -o errexit
set -o nounset
set -o pipefail

PATH="${PATH}:./vendor/bin"

# Generate the site.
function generate {
  local args=()

  if [[ "${APP_ENV:-}" == "production" ]]; then
    args=(--env="prod")
  else
    args=(--server --watch)
  fi

  sculpin generate "${args[@]}" "${@}"
}

function help {
  printf "%s <task> [args]\n\nTasks:\n" "${0}"
  compgen -A function | grep -v "^_" | cat -n
  printf "\nExtended help:\n  Each task has comments for general usage\n"
}

# Start the project.
function start {
  sculpin generate --server --watch "${@}"
}

# Include any local tasks.
# https://stackoverflow.com/a/6659698
[[ -e "${BASH_SOURCE%/*}/run.local" ]] && source "${BASH_SOURCE%/*}/run.local"

TIMEFORMAT="Task completed in %3lR"

time "${@:-help}"

# vim: ft=bash

90
run.local Executable file

@ -0,0 +1,90 @@
#!/usr/bin/env bash

function clean {
  rm -fr output_*/ source/build/
}

# Create a new daily email.
function create-daily {
  local date="${1}"
  shift 1

  if [ "${date}" == "next" ]; then
    # Strip the `.md` extension from the latest email and add one day.
    next_date=$(ls -1 source/_daily_emails | tail -n 1 | sed 's/\.md$//' | xargs -I {} date +%Y-%m-%d -d '{} +1 day')
  else
    next_date="${date}"
  fi

  filepath="source/_daily_emails/${next_date}.md"

  # Generate the title and slug, e.g. "Some email title" -> "some-email-title".
  title="${*}"
  slug=$(echo "${title}" | \
    tr '[:upper:]' '[:lower:]' | \
    sed 's/[^a-z0-9]/-/g' | \
    sed 's/--*/-/g' | \
    sed 's/^-//;s/-$//')

  # Create the file, without overwriting an existing one.
  cp --no-clobber resources/daily-email-stub.md "${filepath}"

  date=$(date -d "${next_date}" +%Y-%m-%d)
  day=$(date -d "${next_date}" +%d)
  month=$(date -d "${next_date}" +%m)
  year=$(date -d "${next_date}" +%Y)

  # Replace the placeholders.
  sed -i "s/{{ date }}/${date}/" "${filepath}"
  sed -i "s/{{ title }}/${title}/" "${filepath}"
  sed -i "s#{{ permalink }}#daily/${year}/${month}/${day}/${slug}#" "${filepath}"

  # Create a commit with the appropriate date in the message.
  git add "${filepath}"
  git commit --quiet -m "Add daily email for ${date}

${title}"

  echo "${filepath}"
}

# Build CSS assets; this is meant to be run within the `assets` directory.
function npm:build:css {
  local args=()

  if [[ "${NODE_ENV:-}" == "production" ]]; then
    args=(--minify)
  else
    args=(--watch)
  fi

  tailwindcss \
    --config tailwind.config.ts \
    --output ../source/build/tailwind.css "${args[@]}"
}

function publish {
  export NODE_ENV=production
  export APP_ENV=production

  tag-release
  git push

  git stash
  clean

  (cd assets && npm:build:css)
  generate

  rsync --archive --verbose --compress --update --delete \
    output_prod/ ssh.oliverdavies.uk:/var/www/vhosts/www.oliverdavies.uk

  git stash pop
}

function test {
  phpunit "${@}"
}

# vim: ft=bash

436
source/.htaccess Normal file

@ -0,0 +1,436 @@
Options +FollowSymLinks -MultiViews
RewriteEngine on
# Redirect all users to access the site WITH the 'www.' prefix.
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ http%{ENV:protossl}://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# Remove trailing slashes from directories.
DirectorySlash Off
RewriteCond %{REQUEST_FILENAME} -d
RewriteCond %{REQUEST_URI} !/$
RewriteCond %{REQUEST_FILENAME}/index.html -f
RewriteRule (.*) $1/index.html [L]
RewriteRule ^(.*)/$ /$1 [L,R=301]
# Remove index.html from URLs.
RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.html[\s?] [NC]
RewriteRule ^(.*)index\.html$ /$1 [L,R=301]
ErrorDocument 404 /404/index.html
RewriteCond %{REQUEST_URI} !^/archive/?$
RewriteRule ^archive/(.*)$ /daily/$1 [L,R=301]
RewriteRule ^articles/(.*) /blog/$1 [L,R=301]
Redirect 301 /10-useful-drupal-6-modules-i-use-every-project /blog/10-useful-drupal-6-modules
Redirect 301 /2010/04/05/styling-drupal-6s-taxonomy-lists-with-php-css-and-jquery /blog/style-drupal-6s-taxonomy-lists-php-css-jquery
Redirect 301 /2010/04/28/using-imagecache-and-imagecrop-for-my-portfolio /blog/using-imagecache-imagecrop-my-portfolio
Redirect 301 /2010/05/29/importing-images-using-the-imagefieldimport-module /blog/quickly-import-multiples-images-using-imagefieldimport-module
Redirect 301 /2010/06/23/creating-a-block-of-social-media-icons-using-cck-views-and-nodequeue /blog/create-block-social-media-icons-using-cck-views-nodequeue
Redirect 301 /2010/07/05/thanks /blog/thanks
Redirect 301 /2010/08/17/create-a-better-photo-gallery-in-drupal-part-2 /blog/create-better-photo-gallery-drupal-part-2
Redirect 301 /2014/05/21/git-format-patch /blog/git-format-patch-your-friend
Redirect 301 /2PxmyqP /articles/examples-of-laravel-collections-in-drupal
Redirect 301 /39CoG /articles/drupalcamp-london-testing-workshop
Redirect 301 /3eGQr https://github.com/howToCodeWell/howToCodeWellFM/blob/c927e0b3589f1d7375002f7fd70f0bfc9fc90449/composer.json#L17
Redirect 301 /6UhLN https://github.com/opdavies/sculpin-twig-markdown-bundle/pull/1
Redirect 301 /6i3YZ https://www.youtube.com/watch?v=vUK5sEbd-dk
Redirect 301 /9rv0Z https://www.drupal.org/project/override_node_options/issues/3109852
Redirect 301 /BhMZi https://git.drupalcode.org/search?utf8=%E2%9C%93&snippets=&scope=&repository_ref=8.x-1.x&search=baz&project_id=23203
Redirect 301 /NBi5h https://git.drupalcode.org/search?utf8=%E2%9C%93&search=bar&group_id=&project_id=23203&search_code=true&repository_ref=8.x-1.x&nav_source=navbar
Redirect 301 /P5KQ5 https://www.npmjs.com/package/tailwindcss-skip-link
Redirect 301 /S8ZDA /articles/rebuilding-bartik-with-vuejs-tailwind-css-part-2
Redirect 301 /Wh48P https://github.com/opdavies/oliverdavies.uk/blob/master/source/_partials/talk/video.html.twig
Redirect 301 /XbzS2 https://github.com/opdavies/gmail-filter-builder
Redirect 301 /YK1VH /articles/psr4-autoloading-test-cases-drupal-7
Redirect 301 /Yil https://drupalcamp.london/tickets/training
Redirect 301 /about /
Redirect 301 /about/cv /cv
Redirect 301 /about/speaker /speaker
Redirect 301 /about/speaker-information /speaker-information
Redirect 301 /acquia-certifications https://certification.acquia.com/registry?fname=Oliver&lname=Davies&city=&state=&country=United+Kingdom&org=&exam=All
Redirect 301 /acquia-certified https://certification.acquia.com/?fname=Oliver&lname=Davies
Redirect 301 /ansible https://galaxy.ansible.com/opdavies
Redirect 301 /ansible-molecule /articles/test-driven-ansible-role-development-molecule
Redirect 301 /ansistrano-code https://github.com/opdavies/dransible
Redirect 301 /ansistrano-demo https://www.youtube.com/watch?v=PLS4ET7FAcU
Redirect 301 /ansistrano-slides /talks/deploying-php-ansible-ansistrano
Redirect 301 /archive/2022/10/20/run-vs-task-runner /archive/2022/10/19/run-vs-task-runners
Redirect 301 /atNOQ https://youtu.be/r41dkD2EOo8
Redirect 301 /automatically-updating-talk-created-date https://gist.github.com/opdavies/4e75e1753d8603113f07f8264bb783d6
Redirect 301 /blog.xml /rss.xml
Redirect 301 /blog/10-useful-drupal-6-modules /blog/useful-drupal-6-modules
Redirect 301 /blog/10-years-working-full-time-drupal /blog/10-years-working-full-time-drupal-php
Redirect 301 /blog/2010/04/05/style-drupal-6s-taxonomy-lists-php-css-and-jquery /blog/style-drupal-6s-taxonomy-lists-php-css-and-jquery
Redirect 301 /blog/2010/04/05/styling-drupal-6s-taxonomy-lists-with-php-css-and-jquery /blog/style-drupal-6s-taxonomy-lists-php-css-jquery
Redirect 301 /blog/2010/04/28/using-imagecache-and-imagecrop-my-portfolio /blog/using-imagecache-and-imagecrop-my-portfolio
Redirect 301 /blog/2010/05/06/conditional-email-addresses-webform /blog/conditional-email-addresses-webform
Redirect 301 /blog/2010/05/10/quickly-create-zen-subthemes-using-zenophile /blog/quickly-create-zen-subthemes-using-zenophile
Redirect 301 /blog/2010/05/25/create-slideshow-multiple-images-using-fancy-slide /blog/create-slideshow-multiple-images-using-fancy-slide
Redirect 301 /blog/2010/05/29/quickly-import-multiples-images-using-imagefieldimport-module /blog/quickly-import-multiples-images-using-imagefieldimport-module
Redirect 301 /blog/2010/06/02/improve-jpg-quality-imagecache-and-imageapi /blog/improve-jpg-quality-imagecache-and-imageapi
Redirect 301 /blog/2010/06/23/create-block-social-media-icons-using-cck-views-and-nodequeue /blog/create-block-social-media-icons-using-cck-views-and-nodequeue
Redirect 301 /blog/2010/06/25/10-useful-drupal-6-modules /blog/10-useful-drupal-6-modules
Redirect 301 /blog/2010/06/28/create-flickr-photo-gallery-using-feeds-cck-and-views /blog/create-flickr-photo-gallery-using-feeds-cck-and-views
Redirect 301 /blog/2010/07/01/change-content-type-multiple-nodes-using-sql /blog/change-content-type-multiple-nodes-using-sql
Redirect 301 /blog/2010/07/02/create-virtual-hosts-mac-os-x-using-virtualhostx /blog/create-virtual-hosts-mac-os-x-using-virtualhostx
Redirect 301 /blog/2010/07/07/add-taxonomy-term-multiple-nodes-using-sql /blog/add-taxonomy-term-multiple-nodes-using-sql
Redirect 301 /blog/2010/07/07/quickly-adding-taxonomy-term-multiple-nodes-using-sql /blog/add-taxonomy-term-multiple-nodes-using-sql
Redirect 301 /blog/2010/07/12/overview-teleport-module /blog/review-teleport-module
Redirect 301 /blog/2010/07/12/review-teleport-module /blog/review-teleport-module
Redirect 301 /blog/2010/08/10/review-adminhover-module /blog/review-adminhover-module
Redirect 301 /blog/2010/08/11/create-better-photo-gallery-drupal-part-1 /blog/create-better-photo-gallery-drupal-part-1
Redirect 301 /blog/2010/08/11/how-create-better-photo-gallery-drupal-part-1 /blog/create-better-photo-gallery-drupal-part-1
Redirect 301 /blog/2010/08/17/create-better-photo-gallery-drupal-part-2 /blog/create-better-photo-gallery-drupal-part-2
Redirect 301 /blog/2010/08/20/review-image-caption-module /blog/review-image-caption-module
Redirect 301 /blog/2010/09/26/south-wales-drupal-user-group /blog/south-wales-drupal-user-group
Redirect 301 /blog/2010/10/10/create-and-apply-patches /blog/create-and-apply-patches
Redirect 301 /blog/2010/10/13/create-better-photo-gallery-drupal-part-3 /blog/create-better-photo-gallery-drupal-part-3
Redirect 301 /blog/2010/10/22/create-better-photo-gallery-drupal-part-21 /blog/create-better-photo-gallery-drupal-part-21
Redirect 301 /blog/2010/11/04/use-regular-expressions-search-and-replace-coda-or-textmate /blog/use-regular-expressions-search-and-replace-coda-or-textmate
Redirect 301 /blog/2011/02/14/easily-embed-typekit-fonts-your-drupal-website /blog/easily-embed-typekit-fonts-your-drupal-website
Redirect 301 /blog/2011/03/15/display-number-facebook-fans-php /blog/display-number-facebook-fans-php
Redirect 301 /blog/2011/03/31/proctor-stevenson /blog/proctor-stevenson
Redirect 301 /blog/2011/05/20/proctors-hosting-next-drupal-meetup /blog/proctors-hosting-next-drupal-meetup
Redirect 301 /blog/2011/05/23/imagefield-import-archive /blog/imagefield-import-archive
Redirect 301 /blog/2011/08/28/create-multigroups-drupal-7-using-field-collections /blog/create-multigroups-drupal-7-using-field-collections
Redirect 301 /blog/2011/10/19/install-and-configure-subversion-svn-server-ubuntu /blog/install-and-configure-subversion-svn-server-ubuntu
Redirect 301 /blog/2011/10/install-and-configure-subversion-svn-server-ubuntu /blog/how-install-configure-subversion-svn-server-ubuntu
Redirect 301 /blog/2012/01/04/site-upgraded-drupal-7 /blog/site-upgraded-drupal-7
Redirect 301 /blog/2012/02/01/use-authorize-keys-create-passwordless-ssh-connection /blog/use-authorized-keys-create-passwordless-ssh-connection
Redirect 301 /blog/2012/04/16/create-omega-subtheme-less-css-preprocessor-using-omega-tools-and-drush /blog/create-omega-subtheme-less-css-preprocessor-using-omega-tools-and-drush
Redirect 301 /blog/2012/04/17/installing-nagios-centos /blog/installing-nagios-centos
Redirect 301 /blog/2012/04/19/adding-custom-theme-templates-drupal-7 /blog/adding-custom-theme-templates-drupal-7
Redirect 301 /blog/2012/04/adding-custom-theme-templates-drupal-7 /blog/adding-custom-theme-templates-drupal-7
Redirect 301 /blog/2012/05/23/add-date-popup-calendar-custom-form /blog/add-date-popup-calendar-custom-form
Redirect 301 /blog/2012/05/23/checkout-specific-revision-svn-command-line /blog/checkout-specific-revision-svn-command-line
Redirect 301 /blog/2012/05/23/forward-one-domain-another-using-mod-rewrite-and-htaccess /blog/forward-one-domain-another-using-mod-rewrite-and-htaccess
Redirect 301 /blog/2012/05/23/forward-one-domain-another-using-modrewrite-and-htaccess /blog/forward-one-domain-another-using-modrewrite-htaccess
Redirect 301 /blog/2012/05/23/prevent-apache-displaying-text-files-within-web-browser /blog/prevent-apache-displaying-text-files-within-web-browser
Redirect 301 /blog/2012/05/23/writing-info-file-drupal-7-theme /blog/writing-info-file-drupal-7-theme
Redirect 301 /blog/2012/05/24/dividing-drupals-process-and-preprocess-functions-separate-files /blog/dividing-drupals-process-and-preprocess-functions-separate-files
Redirect 301 /blog/2012/05/forward-one-domain-another-using-modrewrite-and-htaccess /blog/forward-one-domain-another-using-modrewrite-htaccess
Redirect 301 /blog/2012/07/12/my-new-drupal-modules /blog/my-new-drupal-modules
Redirect 301 /blog/2012/07/14/install-nomensa-media-player-drupal /blog/install-nomensa-media-player-drupal
Redirect 301 /blog/2012/07/27/writing-article-linux-journal /blog/writing-article-linux-journal
Redirect 301 /blog/2012/07/install-and-configure-nomensa-accessible-media-player-drupal /blog/install-configure-nomensa-accessible-media-player-drupal
Redirect 301 /blog/2012/07/nomensa-accessible-media-player-drupal /blog/install-configure-nomensa-accessible-media-player-drupal
Redirect 301 /blog/2012/08/18/display-custom-menu-drupal-7-theme-template-file /blog/display-custom-menu-drupal-7-theme-template-file
Redirect 301 /blog/2012/09/06/reflections-speaking-unifieddiff /blog/reflections-speaking-unifieddiff
Redirect 301 /blog/2012/10/25/my-sublime-text-2-settings /blog/my-sublime-text-2-settings
Redirect 301 /blog/2012/11/15/accessible-bristol-site-launched /blog/accessible-bristol-site-launched
Redirect 301 /blog/2012/11/17/open-sublime-text-2-mac-os-x-command-line /blog/open-sublime-text-2-mac-os-x-command-line
Redirect 301 /blog/2012/12/06/use-sass-and-compass-drupal-7-using-sassy /blog/use-sass-and-compass-drupal-7-using-sassy
Redirect 301 /blog/2012/12/use-sass-and-compass-drupal-7-using-sassy /blog/use-sass-and-compass-drupal-7-using-sassy
Redirect 301 /blog/2013/01/09/checking-if-user-logged-drupal-right-way /blog/checking-if-user-logged-drupal-right-way
Redirect 301 /blog/2013/02/16/creating-and-using-custom-tokens-drupal-7 /blog/creating-and-using-custom-tokens-drupal-7
Redirect 301 /blog/2013/02/creating-and-using-custom-tokens-drupal-7 /blog/creating-using-custom-tokens-drupal-7
Redirect 301 /blog/2013/03/02/quickest-way-install-sublime-text-2-ubuntu /blog/quickest-way-install-sublime-text-2-ubuntu
Redirect 301 /blog/2013/04/20/leaving-nomensa-joining-precedent /blog/leaving-nomensa-joining-precedent
Redirect 301 /blog/2013/04/27/display-git-branch-or-tag-names-your-bash-prompt /blog/display-git-branch-or-tag-names-your-bash-prompt
Redirect 301 /blog/2013/04/display-git-branch-or-tag-names-your-bash-prompt /blog/display-git-branch-or-tag-names-your-bash-prompt
Redirect 301 /blog/2013/06/13/some-useful-links-using-simpletest-drupal /blog/some-useful-links-using-simpletest-drupal
Redirect 301 /blog/2013/07/17/creating-local-and-staging-sites-drupals-domain-module-enabled /blog/creating-local-and-staging-sites-drupals-domain-module-enabled
Redirect 301 /blog/2013/07/26/going-drupalcon /blog/going-drupalcon
Redirect 301 /blog/2013/09/06/create-a-zen-sub-theme-using-drush /blog/create-a-zen-sub-theme-using-drush
Redirect 301 /blog/2013/09/create-zen-sub-theme-using-drush /blog/create-zen-sub-theme-using-drush
Redirect 301 /blog/2013/11/19/dont-bootstrap-drupal-use-drush /blog/dont-bootstrap-drupal-use-drush
Redirect 301 /blog/2013/11/27/useful-vagrant-commands /blog/useful-vagrant-commands
Redirect 301 /blog/2013/11/dont-bootstrap-drupal-use-drush /blog/dont-bootstrap-drupal-use-drush
Redirect 301 /blog/2013/12/24/quickly-apply-patches-using-git-and-curl-or-wget /blog/quickly-apply-patches-using-git-and-curl-or-wget
Redirect 301 /blog/2013/12/31/download-different-versions-drupal-drush /blog/download-different-versions-drupal-drush
Redirect 301 /blog/2013/12/quickly-apply-patches-using-git-and-curl-or-w /blog/quickly-apply-patches-using-git-curl-or-wget
Redirect 301 /blog/2014/01/15/some-useful-git-aliases /blog/some-useful-git-aliases
Redirect 301 /blog/2014/02/09/drupalcamp-london-2014 /blog/drupalcamp-london-2014
Redirect 301 /blog/2014/03/03/what-git-flow /blog/what-git-flow
Redirect 301 /blog/2014/05/03/drupal-association /blog/drupal-association
Redirect 301 /blog/2014/05/06/thanks /blog/thanks
Redirect 301 /blog/2014/05/21/git-format-patch /blog/git-format-patch
Redirect 301 /blog/2014/07/02/drush-make-drupalbristol /blog/drush-make-drupalbristol
Redirect 301 /blog/2014/10/06/fix-vagrant-loading-wrong-virtual-machine /blog/fix-vagrant-loading-wrong-virtual-machine
Redirect 301 /blog/2014/10/21/updating-features-and-adding-components-using-drush /blog/updating-features-and-adding-components-using-drush
Redirect 301 /blog/2014/11/18/include-css-fonts-using-sass-each-loop /blog/include-css-fonts-using-sass-each-loop
Redirect 301 /blog/2014/11/20/using-remote-files-when-developing-locally-with-stage-file-proxy-module /blog/using-remote-files-when-developing-locally-with-stage-file-proxy-module
Redirect 301 /blog/2014/11/27/pantheon-settings-files /blog/pantheon-settings-files
Redirect 301 /blog/2014/12/20/include-local-drupal-settings-file-environment-configuration-and-overrides /blog/include-local-drupal-settings-file-environment-configuration-and-overrides
Redirect 301 /blog/2015/04/03/how-to-define-a-minimum-drupal-core-version /blog/how-to-define-a-minimum-drupal-core-version
Redirect 301 /blog/2015/06/18/updating-forked-repositories-on-github /blog/updating-forked-repositories-on-github
Redirect 301 /blog/2015/07/19/sculpin-twig-resources /blog/sculpin-twig-resources
Redirect 301 /blog/2015/07/21/automating-sculpin-jenkins /blog/automating-sculpin-jenkins
Redirect 301 /blog/2015/12/22/programmatically-load-an-entityform-in-drupal-7 /blog/programmatically-load-an-entityform-in-drupal-7
Redirect 301 /blog/2016/02/15/announcing-the-drupal-vm-generator /blog/announcing-the-drupal-vm-generator
Redirect 301 /blog/2016/05/03/simplifying-drupal-migrations-with-xautoload /blog/simplifying-drupal-migrations-with-xautoload
Redirect 301 /blog/2016/07/15/building-gmail-filters-with-php /blog/building-gmail-filters-with-php
Redirect 301 /blog/2016/12/30/drupal-vm-generator-291-released /blog/drupal-vm-generator-291-released
Redirect 301 /blog/2017/01/07/easier-sculpin-commands-with-composer-and-npm-scripts /blog/easier-sculpin-commands-with-composer-and-npm-scripts
Redirect 301 /blog/2017/01/31/nginx-redirects-with-query-string-arguments /blog/nginx-redirects-with-query-string-arguments
Redirect 301 /blog/2017/05/05/fixing-drupal-simpletest-docker /blog/2017/05/05/fixing-drupal-simpletest-issues-inside-docker-containers
Redirect 301 /blog/2017/05/05/fixing-drupal-simpletest-issues-inside-docker-containers /blog/fixing-drupal-simpletest-issues-inside-docker-containers
Redirect 301 /blog/2017/05/20/turning-drupal-module-into-feature /blog/turning-your-custom-drupal-module-feature
Redirect 301 /blog/2017/06/09/introducing-the-drupal-meetups-twitterbot /blog/introducing-the-drupal-meetups-twitterbot
Redirect 301 /blog/2017/07/13/publishing-sculpin-sites-with-github-pages /blog/publishing-sculpin-sites-github-pages
Redirect 301 /blog/2017/11/07/tdd-test-driven-drupal /blog/tdd-test-driven-drupal
Redirect 301 /blog/2017/11/07/writing-drupal-module-test-driven-development-tdd /blog/2017/11/07/tdd-test-driven-drupal
Redirect 301 /blog/2018/01/30/drupalcamp-bristol-2018 /blog/drupalcamp-bristol-2018
Redirect 301 /blog/2018/02/05/using-tailwind-css-in-your-drupal-theme /blog/using-tailwind-css-in-your-drupal-theme
Redirect 301 /blog/2018/02/27/looking-forward-to-drupalcamp-london /blog/looking-forward-to-drupalcamp-london
Redirect 301 /blog/2018/02/27/queuing-private-messages-in-drupal-8 /blog/queuing-private-messages-in-drupal-8
Redirect 301 /blog/2018/02/28/building-the-new-phpsw-website /blog/building-the-new-phpsw-website
Redirect 301 /blog/2018/03/02/yay-the-mediacurrent-contrib-half-hour-is-back /blog/yay-the-mediacurrent-contrib-half-hour-is-back
Redirect 301 /blog/2018/03/04/tweets-from-drupalcamp-london /blog/tweets-from-drupalcamp-london
Redirect 301 /blog/2018/04/23/back-to-the-future-git-diff-apply /blog/back-future-gits-diff-apply-commands
Redirect 301 /blog/2018/05/06/creating-a-custom-phpunit-command-for-docksal /blog/creating-a-custom-phpunit-command-for-docksal
Redirect 301 /blog/add-date-popup-calendar-custom-form /blog/how-add-date-popup-calendar-custom-form
Redirect 301 /blog/adding-methods-decorating-entity-metadata-wrapper /blog/decorating-entity-metadata-wrapper-add-refactor-methods
Redirect 301 /blog/announcing-drupal-vm-generator /blog/announcing-the-drupal-vm-generator
Redirect 301 /blog/announcing-the-drupal-vm-config-generator /blog/announcing-the-drupal-vm-generator
Redirect 301 /blog/back-to-the-future-git-diff-apply /blog/back-future-gits-diff-apply-commands
Redirect 301 /blog/building-gmail-filters-in-php /blog/building-gmail-filters-php
Redirect 301 /blog/building-new-phpsw-website /blog/building-the-new-phpsw-website
Redirect 301 /blog/building-presentation-slides-reveal-js-tailwind-css /blog/building-presentation-slides-rst2pdf
Redirect 301 /blog/building-speaker-leaderboard-php-south-wales-using-drupal-symfony /blog/building-speaker-leaderboard-php-south-wales-drupal-symfony
Redirect 301 /blog/create-and-apply-patches /blog/how-create-apply-patches
Redirect 301 /blog/create-flickr-photo-gallery-using-feeds-cck-and-views /blog/create-flickr-photo-gallery-using-feeds-cck-views
Redirect 301 /blog/creating-and-using-custom-tokens-drupal-7 /blog/creating-using-custom-tokens-drupal-7
Redirect 301 /blog/creating-custom-docksal-commands /blog/creating-custom-phpunit-command-docksal
Redirect 301 /blog/debugging-drupal-commerce-promotions-illiminate-collections /blog/debugging-drupal-commerce-illuminate-collections
Redirect 301 /blog/decorating-entity-metadata-wrapper-add-add-refactor-methods /blog/decorating-entity-metadata-wrapper-add-refactor-methods
Redirect 301 /blog/dev-book-club-refactoring-chapter-1 /blog/dev-book-club-notes-refactoring-chapter-1
Redirect 301 /blog/dividing-drupals-process-preprocess-functions-separate-files /blog/dividing-drupals-process-and-preprocess-functions-separate-files
Redirect 301 /blog/drupal-8-commerce-fixing-no-such-customer-error-checkout /blog/drupal-8-commerce-fixing-no-such-customer-error-checkou
Redirect 301 /blog/drupal-8-commerce-fixing-no-such-customer-error-on-checkout /blog/drupal-8-commerce-fixing-no-such-customer-error-checkou
Redirect 301 /blog/drupal-vm-generator-291-released /blog/drupal-vm-generator-updates
Redirect 301 /blog/drupalcamp-london-2019-tickets /blog/drupalcamp-london-2019-tickets-available-call-sessions
Redirect 301 /blog/drush-make-drupalbristol /talks/drush-make-drupalbristol
Redirect 301 /blog/easier-git-repository-cloning-with-insteadof /blog/easier-git-repository-cloning-insteadof
Redirect 301 /blog/easier-sculpin-commands-with-composer-and-npm-scripts /blog/easier-sculpin-commands-composer-npm-scripts
Redirect 301 /blog/editing-meetup-videos-kdenlive /blog/editing-meetup-videos-linux-kdenlive
Redirect 301 /blog/examples-of-laravel-collections-in-drupal /blog/using-laravel-collections-drupal
Redirect 301 /blog/experimenting-with-events-in-drupal-8 /blog/experimenting-events-drupal-8
Redirect 301 /blog/fix-vagrant-loading-wrong-virtual-machine /blog/how-fix-vagrant-loading-wrong-virtual-machine
Redirect 301 /blog/fixing-drupal-simpletest-docker /blog/fixing-drupal-simpletest-issues-inside-docker-containers
Redirect 301 /blog/forward-one-domain-another-using-modrewrite-and-htaccess /blog/forward-one-domain-another-using-modrewrite-htaccess
Redirect 301 /blog/git-format-patch /blog/git-format-patch-your-friend
Redirect 301 /blog/how-easily-embed-typekit-fonts-your-drupal-website /blog/easily-embed-typekit-fonts-your-drupal-website
Redirect 301 /blog/how-split-new-drupal-contrib-project-within-another-repository /blog/splitting-new-drupal-project-from-repo
Redirect 301 /blog/how-style-drupal-6s-taxonomy-lists-php-css-and-jquery /blog/style-drupal-6s-taxonomy-lists-php-css-jquery
Redirect 301 /blog/include-local-drupal-settings-file-environment-configuration-overrides /blog/include-local-drupal-settings-file-environment-configuration-and-overrides
Redirect 301 /blog/install-and-configure-subversion-svn-server-ubuntu /blog/how-install-configure-subversion-svn-server-ubuntu
Redirect 301 /blog/live-blogging-symfonylive-london /blog/live-blogging-symfonylive-london-2019
Redirect 301 /blog/minimum-core-version /blog/how-define-minimum-drupal-core-version
Redirect 301 /blog/nginx-redirects-with-query-string-arguments /blog/nginx-redirects-query-string-arguments
Redirect 301 /blog/null-users-and-system-users-in-drupal /blog/null-users-system-users-drupal
Redirect 301 /blog/overridding-phpcs-configuration-drupal-ci /blog/overriding-phpcs-configuration-drupal-ci
Redirect 301 /blog/pantheon-settings-files /blog/include-environment-specific-settings-files-pantheon
Redirect 301 /blog/pdfpc-pdf-presenter-console-notes /blog/presenting-pdf-slides-using-pdfpc-pdf-presenter-console
Redirect 301 /blog/php-apps-subdirectory-nginx /blog/how-put-your-php-application-subdirectory-another-site-nginx
Redirect 301 /blog/presenting-tailwind-css-ansible-cms-philly /blog/presenting-on-tailwind-css-and-ansible-at-cms-philly
Redirect 301 /blog/programmatically-load-(an-)?entityform-(in-)?drupal-7 /blog/entityform
Redirect 301 /blog/published-my-first-docker-images-docker-hub /blog/published-my-first-docker-images-docker-hub-adr-tools-sculpin-rst2pdf
Redirect 301 /blog/publishing-sculpin-sites-(with-)?github-pages /blog/publishing-sculpin-sites-github-pages
Redirect 301 /blog/queuing-private-messages-in-drupal-8 /blog/queuing-private-messages-drupal-8
Redirect 301 /blog/quick-project-switching-in-phpstorm /blog/quick-project-switching-phpstorm
Redirect 301 /blog/quickly-apply-patches-using-git-and-curl-or-wget /blog/quickly-apply-patches-using-git-curl-or-wget
Redirect 301 /blog/rebuilding-bartik-with-vuejs-tailwind-css /blog/rebuilding-bartik-drupals-default-theme-vuejs-tailwind-css
Redirect 301 /blog/rebuilding-bartik-with-vuejs-tailwind-css-part-2 /blog/rebuilding-bartik-drupals-default-theme-vuejs-tailwind-css-part-2
Redirect 301 /blog/rebuilding-uis-tailwind-css /blog/uis-ive-rebuilt-tailwind-css
Redirect 301 /blog/restructuring-my-tailwindjs-config-files /blog/restructuring-my-tailwindjs-configuration-files
Redirect 301 /blog/retrieving-profile-data-user-using-entity-metadata-wrapper /blog/cleanly-retrieving-user-profile-data-using-entity-metadata-wrapper
Redirect 301 /blog/running-drupal-with-symfony-local-server /blog/running-drupal-88-symfony-local-server
Redirect 301 /blog/running-phpunit-tests-docksal-phpstorm /blog/how-run-drupal-8-phpunit-tests-within-docksal-phpstorm
Redirect 301 /blog/simplifying-drupal-migrations-with-xautoload /blog/simplifying-drupal-migrations-xautoload
Redirect 301 /blog/speaking-drupalcon-barcelona-2020 /blog/speaking-drupalcon-europe-2020
Redirect 301 /blog/speaking-during-lockdown /blog/speaking-remotely-during-covid-19
Redirect 301 /blog/speaking-remotely-during-lockdown /blog/speaking-remotely-during-covid-19
Redirect 301 /blog/style-drupal-6s-taxonomy-lists-php-css-jquery /blog/style-drupal-6s-taxonomy-lists-php-css-and-jquery
Redirect 301 /blog/survey-results-my-drupalcon-europe-session /blog/survey-results-my-drupalcon-europe-session-test-driven-drupal
Redirect 301 /blog/system-users-null-users /blog/null-users-and-system-users-in-drupal
Redirect 301 /blog/test-driven-drupal-gitstore-leanpub /blog/test-driven-drupal-on-gitstore-leanpub
Redirect 301 /blog/test-driven-drupal-presentation-drupalcon-europe-0 /blog/test-driven-drupal-presentation-drupalcon-europe
Redirect 301 /blog/test-driven-drupal-session-drupalcon-europe /blog/test-driven-drupal-presentation-drupalcon-europe
Redirect 301 /blog/test-driven-drupal-session-video-drupalcon-europe /blog/test-driven-drupal-presentation-drupalcon-europe
Redirect 301 /blog/testing-tailwind-css-plugins-with-jest /blog/testing-tailwind-css-plugins-jest
Redirect 301 /blog/testing-tailwindcss-plugins-with-jest /blog/testing-tailwind-css-plugins-jest
Redirect 301 /blog/tweets-from-drupalcamp-london /blog/tweets-drupalcamp-london
Redirect 301 /blog/updating-features-and-adding-components-using-drush /blog/updating-features-adding-components-using-drush
Redirect 301 /blog/updating-forked-repositories-github /blog/updating-forked-github-repos
Redirect 301 /blog/use-regular-expressions-search-and-replace-coda-or-textmate /blog/use-regular-expressions-search-replace-coda-or-textmate
Redirect 301 /blog/using-environment-variables-settings-docksal /blog/how-use-environment-variables-your-drupal-settings-docksal
Redirect 301 /blog/using-psr-4-autoloading-your-drupal-7-test-cases /blog/psr4-autoloading-test-cases-drupal-7
Redirect 301 /blog/using-tailwind-css-in-your-drupal-theme /blog/using-tailwind-css-your-drupal-theme
Redirect 301 /blog/using-the-pcss-extension-postcss-webpack-encore /blog/using-pcss-extension-postcss-webpack-encore
Redirect 301 /blog/weeknotes-june-5th /blog/weeknotes-2021-06-05
Redirect 301 /blog/writing-drupal-module-test-driven-development-tdd /blog/writing-new-drupal-8-module-using-test-driven-development-tdd
Redirect 301 /book /test-driven-drupal
Redirect 301 /calendars? https://savvycal.com/opdavies
Redirect 301 /cms-philly /articles/presenting-on-tailwind-css-and-ansible-at-cms-philly
Redirect 301 /code-enigma-interview https://blog.codeenigma.com/interview-with-a-drupal-expert-9fcd8e0fad28
Redirect 301 /consulting /
Redirect 301 /contrib-half-hour https://www.youtube.com/playlist?list=PLu-MxhbnjI9rHroPvZO5LEUhr58Yl0j_F
Redirect 301 /cv /cv.txt
Redirect 301 /d0P5z /talks/drupal-8-php-libraries-drupalorg-api
Redirect 301 /d7 /drupal7
Redirect 301 /dcbristol-cfp https://www.papercall.io/drupalcamp-bristol-2019
Redirect 301 /dcbristol17-videos https://www.youtube.com/playlist?list=PLOwPvExSyLLngtd6R4PUD9MCXa6QL_obA
Redirect 301 /dcbristol19-announced /articles/drupalcamp-bristol-2019-speakers-sessions-announced
Redirect 301 /dclondon-sat https://drupalcamp.london/schedule/saturday
Redirect 301 /dclondon-sun https://drupalcamp.london/schedule/sunday
Redirect 301 /dclondon20 /articles/drupalcamp-london-testing-workshop
Redirect 301 /ddev-phpunit-command /blog/creating-custom-phpunit-command-ddev
Redirect 301 /deploying-php-ansible /talks/deploying-php-ansible-ansistrano
Redirect 301 /dks7E https://www.youtube.com/watch?v=PLS4ET7FAcU
Redirect 301 /do-library https://github.com/opdavies/drupalorg-api-php
Redirect 301 /do-projects https://github.com/opdavies/drupal-module-drupalorg-projects
Redirect 301 /docksal-phpunit-phpstorm /articles/running-phpunit-tests-docksal-phpstorm
Redirect 301 /docksal-posts /articles/tags/docksal
Redirect 301 /dransible https://github.com/opdavies/dransible
Redirect 301 /dransible-drupal-9 /blog/upgrading-dransible-project-drupal-9
Redirect 301 /drupal-bristol-march-19 https://docs.google.com/presentation/d/1pk9LIN-hHX73kvDdo-lzgmKlAeH33_K_uvI0t7A-rvY/edit?usp=sharing
Redirect 301 /drupal-consultant /drupal-consulting
Redirect 301 /drupal-consulting /
Redirect 301 /drupal-core-live-stream https://www.youtube.com/watch?v=OK4FWwh1gQU
Redirect 301 /drupal-core-testing-gate https://www.drupal.org/core/gates#testing
Redirect 301 /drupal-first-time-issues https://www.drupal.org/project/issues/search?text=&projects=&assigned=&submitted=&project_issue_followers=&status%5B%5D=Open&issue_tags_op=%3D&issue_tags=Novice
Redirect 301 /drupal-forum-post http://www.webmaster-forums.net/webmasters-corner/developing-my-website-using-php-and-mysql#comment-1231537
Redirect 301 /drupal-marketplace-uk https://www.drupal.org/drupal-services?offices%5B%5D=24460
Redirect 301 /drupal-meetups-twitterbot /articles/introducing-the-drupal-meetups-twitterbot
Redirect 301 /drupal-novice-issues https://www.drupal.org/project/issues/search?text=&projects=&assigned=&submitted=&project_issue_followers=&status%5B%5D=Open&issue_tags_op=%3D&issue_tags=Novice
Redirect 301 /drupal-php-developer /drupal-consultant
Redirect 301 /drupal-php-developer-consultant-uk /drupal-php-developer
Redirect 301 /drupal-tailwind-demo https://www.youtube.com/watch?v=1eM-Gw6GI4g
Redirect 301 /drupal-tailwindcss https://www.drupal.org/project/tailwindcss
Redirect 301 /drupal-vuejs /talks/decoupling-drupal-vuejs/
Redirect 301 /drupal7 /drupal-upgrade
Redirect 301 /drupalcamp-london-2019-tickets /articles/drupalcamp-london-2019-tickets
Redirect 301 /drupalcamp-nyc-training https://www.youtube.com/watch?v=3M9c4UUzKm0
Redirect 301 /drupalorg https://www.drupal.org/u/opdavies
Redirect 301 /drupalorg-project-issues https://www.drupal.org/project/issues/search?projects=Override+Node+Options%2C+Tailwind+CSS+Starter+Kit%2C+Block+ARIA+Landmark+Roles%2C+Copyright+Block+module%2C+System+User%2C+Null+User%2C+Collection+class%2C+Pathauto+Menu+Link%2C+Webform+ARIA&project_issue_followers=&status%5B%5D=1&status%5B%5D=13&status%5B%5D=8&status%5B%5D=14&status%5B%5D=15&issue_tags_op=%3D
Redirect 301 /drupalversary https://github.com/opdavies/drupal-module-drupalversary
Redirect 301 /elewant https://elewant.com/shepherd/admire/opdavies
Redirect 301 /feed /rss.xml
Redirect 301 /first-drupal-core-issue https://www.drupal.org/project/drupal/issues/753898
Redirect 301 /first-npm-package https://www.npmjs.com/package/tailwindcss-vuejs
Redirect 301 /freeagent https://opdavies.freeagent.com
Redirect 301 /git-flow /talks/git-flow
Redirect 301 /gitlab https://gitlab.com/opdavies
Redirect 301 /gitstore https://enjoy.gitstore.app/maintainers/opdavies
Redirect 301 /gmail-filters https://gitlab.com/opdavies/gmail-filters
Redirect 301 /images/me-precedent.jpg /sites/default/files/images/social-avatar.jpg
Redirect 301 /inviqa-tailwind-demo https://play.tailwindcss.com/Yfmw8O5UNN
Redirect 301 /inviqa-tailwind-notes https://gist.github.com/opdavies/e6f0f4938506a6859acf1aca8b4e1a74
Redirect 301 /join-php-south-wales-slack https://join.slack.com/t/phpsouthwales/shared_invite/zt-4vuetc43-AvtEK1WqNzp5k1w4yWKOJA
Redirect 301 /jy6rW https://www.meetup.com/PHP-South-Wales/events/264731393
Redirect 301 /kB6Jd /articles/running-drupal-with-symfony-local-server/
Redirect 301 /kmDRA https://www.bbc.co.uk/news/uk-46561779
Redirect 301 /leeds-php-drupal-9 https://www.meetup.com/leedsphp/events/272504993
Redirect 301 /live https://www.youtube.com/channel/UCkeK0qF9HHUPQH_fvn4ghqQ
Redirect 301 /npm https://www.npmjs.com/~opdavies
Redirect 301 /oFlkS /articles/test-driven-drupal-on-gitstore-leanpub
Redirect 301 /oliver-davies-uk-based-drupal-symfony-developer /oliver-davies-uk-based-drupal-php-developer
Redirect 301 /pair-programming /pair
Redirect 301 /pair-with-me /pair
Redirect 301 /pairing /pair
Redirect 301 /php-ansible /talks/deploying-php-ansible-ansistrano
Redirect 301 /qSHAl /articles/published-my-first-npm-package/
Redirect 301 /qT1Rb https://github.com/opdavies/drupal-meetups-twitterbot
Redirect 301 /rebuilding-acquia https://rebuilding-acquia.oliverdavies.uk
Redirect 301 /rebuilding-bartik /articles/rebuilding-bartik-with-vuejs-tailwind-css
Redirect 301 /rebuilding-bristol-js https://github.com/opdavies/rebuilding-bristol-js
Redirect 301 /rebuilding-pantheon https://play.tailwindcss.com/LND98XihGI?layout=horizontal
Redirect 301 /rebuilding-platformsh https://rebuilding-platformsh.oliverdavies.uk
Redirect 301 /rebuilding-symfony https://github.com/opdavies/rebuilding-symfony
Redirect 301 /rk29B https://www.meetup.com/PHP-South-Wales/events/268422525
Redirect 301 /roadmap /drupal-upgrade
Redirect 301 /rss /rss.xml
Redirect 301 /rst2pdf /talks/building-presenting-slide-decks-rst2pdf
Redirect 301 /s9MjJ https://symfonycasts.com/screencast/symfony
Redirect 301 /sculpin /talks/building-static-websites-sculpin
Redirect 301 /sculpin-encore-versioning https://github.com/opdavies/oliverdavies.uk/commit/d192b04aefa6e7a21bfc1f2e0fe0a16111e0e8a2
Redirect 301 /sites/default/files/images/social-avatar. /images/social-avatar.jpg
Redirect 301 /skills https://opdavies-skills-tailwindcss.netlify.com/
Redirect 301 /slides-drupal-9 https://slides-upgrading-to-drupal-9.oliverdavies.uk
Redirect 301 /slides-upgrading-to-drupal-9 https://slides-upgrading-to-drupal-9.oliverdavies.uk
Redirect 301 /slides-upgrading-to-drupal-9/index.html https://slides-upgrading-to-drupal-9.oliverdavies.uk
Redirect 301 /slides-working-with-workspace https://slides-working-with-workspace.oliverdavies.uk
Redirect 301 /speaker /press
Redirect 301 /speaker-info /speaker
Redirect 301 /speaker-information /speaker
Redirect 301 /speaking-videos https://www.youtube.com/playlist?list=PLHn41Ay7w7kfAzczswrANch5oHAPZBlvu
Redirect 301 /stream https://www.youtube.com/channel/UCkeK0qF9HHUPQH_fvn4ghqQ/live
Redirect 301 /subscription /
Redirect 301 /swap-markdown-parser https://github.com/opdavies/sculpin-twig-markdown-bundle-example/tree/swap-markdown-parser
Redirect 301 /symfony https://connect.symfony.com/profile/opdavies
Redirect 301 /symfony-server /articles/running-drupal-with-symfony-local-server
Redirect 301 /symfonylive /articles/live-blogging-symfonylive-london
Redirect 301 /symposium https://symposiumapp.com/u/opdavies
Redirect 301 /tailwind-css-talk /talks/taking-flight-tailwind-css
Redirect 301 /tailwind-repos https://github.com/opdavies?utf8=%E2%9C%93&tab=repositories&q=tailwindcss
Redirect 301 /tailwind-talk /talks/taking-flight-with-tailwind-css
Redirect 301 /tailwindcss-demo http://tailwindcss-demo.oliverdavies.uk/
Redirect 301 /talks-offer-tweet https://twitter.com/opdavies/status/1250870367712935938
Redirect 301 /talks/2012/09/05/what-is-this-drupal-thing-unified-diff /talks/what-is-this-drupal-thing
Redirect 301 /talks/2013/07/10/drupal-ldap-swdug /talks/drupal-ldap
Redirect 301 /talks/2014/03/01/git-flow-drupalcamp-london-2014 /talks/git-flow
Redirect 301 /talks/2014/07/02/drush-make-drupalbristol-drupal-bristol /talks/drush-make-drupalbristol
Redirect 301 /talks/2014/08/19/drupal-association-swdug /talks/drupal-association
Redirect 301 /talks/2015/01/18/drupalorg-2015-drupalcamp-brighton-2015 /talks/drupalorg-in-2015-whats-coming-next
Redirect 301 /talks/2015/02/28/drupalorg-2015-drupalcamp-london-2015 /talks/drupalorg-in-2015-whats-coming-next
Redirect 301 /talks/2015/04/08/drupal-8-phpsw /talks/drupal-8
Redirect 301 /talks/2015/07/25/test-drive-twig-with-sculpin-drupalcamp-north-2015 /talks/test-drive-twig-with-sculpin
Redirect 301 /talks/2015/08/25/dancing-for-drupal-umbristol /talks/dancing-for-drupal
Redirect 301 /talks/2015/10/14/sculpin-phpsw /talks/sculpin
Redirect 301 /talks/2016/03/05/drupal-8-module-development-drupalcamp-london-2016 /talks/getting-started-with-drupal-8-module-development
Redirect 301 /talks/2016/03/09/drupal-vm-generator-nwdug /talks/drupal-vm-generator
Redirect 301 /talks/2016/04/02/drupal-vm-generator-drupal-bristol /talks/drupal-vm-generator
Redirect 301 /talks/2016/06/11/drupal-8-rejoining-the-herd-php-south-coast-2016 /talks/drupal-8-rejoining-the-herd
Redirect 301 /talks/2016/07/23/drupal-vm-meet-symfony-console-drupalcamp-bristol-2016 /talks/drupal-vm-meet-symfony-console
Redirect 301 /talks/2016/11/09/drupal-development-with-composer-phpsw /talks/drupal-development-with-composer
Redirect 301 /talks/2016/11/17/goodbye-drush-make-hello-composer-drupal-bristol /talks/goodbye-drush-make-hello-composer
Redirect 301 /talks/2017/01/18/getting-your-data-into-drupal-8-drupal-bristol /talks/getting-your-data-into-drupal-8
Redirect 301 /talks/2017/03/04/getting-your-data-into-drupal-8-drupalcamp-london-2017 /talks/getting-your-data-into-drupal-8
Redirect 301 /talks/ansible-ansistrano https://www.oliverdavies.uk/talks/deploying-php-ansible-ansistrano
Redirect 301 /talks/archive /talks
Redirect 301 /talks/deploying-php-applications-fabric /talks/deploying-php-fabric
Redirect 301 /talks/deploying-php-applications-with-fabric /talks/deploying-php-fabric
Redirect 301 /talks/drupal-vm-generator-2 /talks/drupal-vm-generator
Redirect 301 /talks/drupalorg-2015-2 /talks/drupalorg-2015
Redirect 301 /talks/drupalorg-in-2015-whats-coming-next /talks/drupalorg-2015
Redirect 301 /talks/getting-started-with-drupal-8-module-development /drupal-8-module-development
Redirect 301 /talks/having-fun-drupal-8-php-libraries-drupalorg-api /talks/drupal-8-php-libraries-drupalorg-api
Redirect 301 /talks/never-commit-master-introduction-git-flow /talks/git-flow
Redirect 301 /talks/sculpin /talks/building-static-websites-sculpin
Redirect 301 /talks/tailwind /talks/taking-flight-with-tailwind-css/
Redirect 301 /talks/taking-flight-tailwind-css /talks/taking-flight-with-tailwind-css
Redirect 301 /talks/using-laravel-collections-outside-laravel /talks/using-illuminate-collections-outside-laravel
Redirect 301 /talks/working-workspace /talks/working-with-workspace
Redirect 301 /tdd-blog https://github.com/opdavies/drupal-module-tdd-blog
Redirect 301 /tdd-test-driven-drupal /talks/tdd-test-driven-drupal/
Redirect 301 /team-coaching /
Redirect 301 /test-driven-drupal-book /test-driven-drupal
Redirect 301 /testing-drupal https://www.oliverdavies.uk/talks/tdd-test-driven-drupal
Redirect 301 /testing-drupal-intro https://inviqa.com/blog/drupal-automated-testing-introduction
Redirect 301 /testing-tailwind-plugins /articles/testing-tailwindcss-plugins-with-jest
Redirect 301 /testing-workshop https://github.com/opdavies/workshop-drupal-automated-testing
Redirect 301 /testing-workshop-code https://github.com/opdavies/workshop-drupal-automated-testing-code
Redirect 301 /todoist-filters https://gist.github.com/opdavies/6709fbdac5c3babbd94137bcc8b8e3c2
Redirect 301 /twitter-tweaks https://github.com/opdavies/chrome-extension-twitter-tweaks
Redirect 301 /upgrading-to-drupal-9 /talks/upgrading-your-site-drupal-9
Redirect 301 /uxbjV https://www.drupal.org/project/copyright_block
Redirect 301 /vyTEF https://www.npmjs.com/package/tailwindcss-vuejs
Redirect 301 /webpack-encore-pcss-regex https://regexr.com/51iaf
Redirect 301 /wordcamp-bristol-tailwindcss https://2019.bristol.wordcamp.org/session/taking-flight-with-tailwind-css
Redirect 301 /wordpress-tailwind https://github.com/opdavies/wordcamp-bristol-2019
Redirect 301 /work /drupal-php-developer
Redirect 301 /working-with-workspace /talks/working-with-workspace
Redirect 301 /workshop-drupal-testing https://github.com/opdavies/workshop-drupal-automated-testing
Redirect 301 /workspace-demo https://github.com/opdavies/working-with-workspace-demo
Redirect 301 /wp-tailwind https://wp-tailwind.oliverdavies.uk
Redirect 301 /wp-tailwind-repo https://github.com/opdavies/wordcamp-bristol-2019
Redirect 301 /wp-tailwind-starter https://github.com/opdavies/wordpress-tailwindcss-startker-kit
Redirect 301 /wp-tailwind-static https://wp-tailwind.oliverdavies.uk
Redirect 301 /yXhoS /talks/things-you-should-know-about-php

---
permalink: daily/2022/08/12/git-worktrees-docker-compose
title: Git Worktrees and Docker Compose
pubDate: 2022-08-12
---
I've recently started trialing Git worktrees again as part of my development workflow.
If you are unfamiliar with Git worktrees, they allow you to have multiple branches of a repository checked out at the same time in different directories.
For example, this is what I see within my local checkout of my website repository:
```
.
├── config
├── HEAD
├── main
│   ├── ansible
│   ├── nginx
│   ├── README.md
│   └── website
├── new-post
│   ├── ansible
│   ├── nginx
│   ├── README.md
│   └── website
├── objects
│   ├── info
│   └── pack
├── packed-refs
├── refs
│   ├── heads
│   └── tags
└── worktrees
├── main
└── new-post
```
The first thing that you'll notice is that, because it's a bare clone, it looks a little different to what you usually see in a Git repository.
Each worktree has its own directory, so my "main" branch is inside the `main` directory.
If I need to work on a different branch, such as `new-post`, then I can create a new worktree, move into that directory and start working. I don't need to commit or stash any in-progress work or switch branches.
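A layout like the one above can be created with a bare clone and `git worktree add` - a rough sketch, using a hypothetical repository URL:

```shell
# Clone the repository as a bare repo, so no branch is checked out at the top level.
git clone --bare git@github.com:example/website.git website
cd website

# Check the "main" branch out into a ./main directory.
git worktree add main main

# Create a "new-post" branch and check it out into ./new-post.
git worktree add new-post -b new-post

# List the worktrees.
git worktree list
```

Once a branch has been merged, `git worktree remove new-post` cleans its worktree up again.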
## Complications with Docker Compose
I use Docker and Docker Compose for my projects, and this caused some issues for me the last time that I tried using worktrees.
By default, Docker Compose uses the name of the directory containing the Compose file to name its containers. If the directory name is "oliverdavies-uk", then the containers will be named `oliverdavies-uk_web_1`, `oliverdavies-uk_db_1` etc.
This doesn't work so well if the directory is a worktree called "main" or "master", as you'll have containers called `main_web_1` or `master_db_1`.
The way to solve this is to use the `COMPOSE_PROJECT_NAME` environment variable.
If you prefix Docker Compose commands with `COMPOSE_PROJECT_NAME=your-project`, or add it to an `.env` file (which Docker Compose loads automatically), then this will override the prefix in the container names, making them `your-project_{service}_1` instead.
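As a sketch, with a hypothetical project name, either option looks like this:

```shell
# Either prefix each Docker Compose command with the variable:
# COMPOSE_PROJECT_NAME=my-project docker compose up -d

# ...or write it once to an .env file, which Docker Compose reads
# automatically from the same directory as the Compose file:
echo 'COMPOSE_PROJECT_NAME=my-project' > .env
```

The explicit `-p`/`--project-name` flag on `docker compose` commands achieves the same thing for a one-off command.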
## Container names per worktree
Whilst you could use the same Compose project name within all of your worktrees, I prefer to include the worktree name as a suffix - something like `my-project-main` or `my-project-staging` - and keep these stored in an `.env` file in each worktree's directory.
As each worktree now has unique container names, I can have multiple instances of a project running at the same time, and each worktree will have its own separate data - meaning that I can make changes and test something in one worktree without affecting any others.
You can also use the `COMPOSE_PROJECT_NAME` variable inside Docker Compose files.
For example, if you use Traefik and need to override the host URL for a service, the string will be interpolated and the project name injected as you'd expect.
```language-yaml
labels:
- "traefik.http.routers.${COMPOSE_PROJECT_NAME}.rule=Host(
`${COMPOSE_PROJECT_NAME}.docker.localhost`,
`admin.${COMPOSE_PROJECT_NAME}.docker.localhost`
)"
```
This means that Traefik will use a different URL for each worktree without you needing to make any changes to your Docker Compose file.

---
permalink: daily/2022/08/13/i-wrote-a-neovim-plugin
pubDate: 2022-08-13
title: I wrote a Neovim plugin
tags:
- neovim
- open-source
---
I enjoy writing and working with open-source software, going back to when I started working with PHP and Drupal in 2007.
Since then, I've written and maintained a number of Drupal modules and themes, PHP libraries, npm packages, Ansible roles and Docker images - all of which are available on my GitHub and Drupal.org pages.
Just over a year ago, [I switched to using Neovim full-time](/blog/going-full-vim) for my development and DevOps work, and last week, I wrote my first Neovim plugin, written in Lua.
I've used Lua to configure Neovim but this is the first time that I've written and open-sourced a standalone Neovim plugin.
It's called [toggle-checkbox.nvim](https://github.com/opdavies/toggle-checkbox.nvim) and is used to toggle checkboxes in Markdown files - something that I use frequently for to-do lists.
For example, this is a simple list containing both checked and unchecked checkboxes:
```markdown
- [x] A completed task
- [ ] An incomplete task
```
To toggle a checkbox, the `x` character needs to be either added or removed, depending on whether we're checking or unchecking it.
This is done by calling the `toggle()` function within the plugin.
In my Neovim configuration, I've added a keymap to do this:
```lua
vim.keymap.set("n", "<leader>tt", function()
  require("toggle-checkbox").toggle()
end)
```
This means that I can use the same keymap, `<leader>tt`, to either check or uncheck a checkbox. I could use Vim's replace mode to do this, but I really wanted to have one keymap that I could use for both.
As it's my first Neovim plugin, I decided to keep it simple.
The main `toggle-checkbox.lua` file is currently only 41 lines of code, and whilst there is an existing Vim plugin that I could have used, I was excited to write my own plugin for Neovim, to start contributing to the Neovim ecosystem, and add a Neovim plugin to my portfolio of open-source projects.
You can view the plugin at <https://github.com/opdavies/toggle-checkbox.nvim>, as well as my Neovim configuration (which is also written in Lua) as part of [my Dotfiles repository](https://github.com/opdavies/dotfiles/tree/main/roles/neovim/files).

---
permalink: daily/2022/08/14/why-i-write-tests
pubDate: 2022-08-14
title: "Why I write automated tests"
tags: [testing]
---
In February 2012, I saw a tweet from Tim Millwood asking if anyone wanted to maintain or co-maintain a Drupal module called [Override Node Options](https://www.drupal.org/project/override_node_options).
It had more than 9,200 active installations at that time, with versions for Drupal 5, 6 and 7.
I said yes and became the module's maintainer.
The module now has versions for Drupal 7, 8 and 9, with (at the latest count, according to Drupal.org) 32,292 active installations - which makes it currently the 197th most installed module.
Two main things come to mind with this module, both related to automated testing.
Before I became the maintainer, a feature request had been created, along with a large patch file, to add some new permissions to the module. There were some large merge conflicts that stopped me from simply committing the changes, but I was able to fix them manually and, because the tests still passed, ensure that the original functionality still worked. There weren't tests for the new permissions, but I committed the patch and added the tests later.
Without the tests to ensure that the original functionality still worked, I probably wouldn't have committed the patch and would have just closed the issue.
More recently, a friend and ex-colleague and I decided to refactor some of the module's code.
We wanted to split the `override_node_options.module` file so that each override was in its own file and its own class. This would make them easier to edit and maintain, and if anyone wanted to add a new one, they'd just need to create a new file for it and add it to the list of overrides.
Without the tests ensuring that the module still worked after the refactor, we probably wouldn't have done it, as the module was used on over 30,000 sites that I didn't want to break.
When I was learning about testing, I was working on projects where I was writing the code during the day and the tests in the evening on my own time.
I remember one occasion when my manual testing had been fine but, when writing the test, I found that I'd used an incorrect permission name in the code, which was causing the test to fail. This was a bug that I was able to fix locally before I'd even committed the code, rather than waiting for a QA Engineer or the client to discover and report it.
I also worked on an event booking and management website, where we had code responsible for calculating the number of available spaces for an event based on orders, determining the correct price based on the customer's status and the time until the event, creating voucher codes for new members and event leaders, and bulk messaging event attendees. All of the custom functionality was covered by automated tests.
The great thing about testing is that it gives you confidence that everything still works how you expect - not only when you wrote the code, but also in the future.
I've talked about this, and how to get started with automated testing in Drupal, in a presentation called [TDD - Test-Driven Drupal]({{site.url}}/talks/tdd-test-driven-drupal). If you want to find out more, the slides and a video recording are embedded there.

---
permalink: daily/2022/08/15/using-run-file-simplify-project-tasks
pubDate: 2022-08-15
title: Using a "run" file to simplify project tasks
tags: ["php"]
---
Every project has its own set of commands that need to be run regularly.
From starting a local server or the project's containers with Docker or Docker Compose, to running tests, clearing a cache, or generating the CSS and JavaScript assets, these commands can get quite complicated, time-consuming and error-prone to type over and over again.
One common way to simplify these commands is using a `Makefile`.
A Makefile contains a number of named targets that you can reference, and each has one or more commands that it executes.
For example:
```makefile
# Start the project.
start:
	docker-compose up -d

# Stop the project.
stop:
	docker-compose down

# Run a Drush command.
drush:
	docker-compose exec php-fpm drush $(ARGS)
```
With this Makefile, I can run `make start` to start the project, and `make stop` to stop it.
Makefiles work well, but I don't use the full functionality that they offer, such as dependencies between targets, and passing arguments to a command - like arguments for a Drush, Symfony Console, or Artisan command - doesn't work as I originally expected.
In the example, to pass arguments to the `drush` command, I'd have to type `ARGS="cache:rebuild" make drush` for them to get added and the command to work as expected.
An agency that I worked for created and open-sourced their own Makefile-like tool, written in PHP and built on Symfony Console. I gave a talk on it called [Working with Workspace]({{site.url}}/talks/working-with-workspace) and used it on some of my own personal and client projects.
## What I'm using now
The solution that I'm using now is a `run` file, which is something that I learned from Nick Janetakis' blog and YouTube channel.
It's a simple Bash file where you define your commands (or tasks) as functions, and then execute them by typing `./run test` or `./run composer require something`.
Here's the Makefile example, but as a `run` script:
```bash
#!/usr/bin/env bash

function help {
  # Display some default help text.
  # See examples on GitHub of how to list the available tasks.
  echo "Usage: ./run <task> [arguments]"
}

function start {
  # Start the project.
  docker-compose up -d
}

function stop {
  # Stop the project.
  docker-compose down
}

function drush {
  # Run a Drush command with any additional arguments.
  # e.g. "./run drush cache:rebuild"
  docker-compose exec php-fpm drush "${@}"
}

# Execute the given task, or run "help".
eval "${@:-help}"
```
As it's Bash, I can just use `$1`, `$2` etc to get specific arguments, or `$@` to get them all, so `./run drush cache:rebuild` works as expected and any additional arguments are included.
You can group tasks by having functions like `test:unit` and `test:commit`, and tasks can run other tasks. I use this for running groups of commands within a CI pipeline, and to extract helper functions for tasks like running `docker-compose exec` within the PHP container that other commands like `drush`, `console` or `composer` could re-use.
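As a sketch of what that grouping could look like (the function, service and command names here are assumptions for illustration, not from a real project):

```bash
# A hypothetical helper that other tasks re-use.
function _exec-php {
  docker-compose exec php-fpm "${@}"
}

function composer {
  _exec-php composer "${@}"
}

function drush {
  _exec-php drush "${@}"
}

function test:commit {
  # A group of quicker checks, also usable from a Git pre-commit hook.
  composer validate
  drush status
}
```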
As well as running ad-hoc commands during development, I also use the run file to create functions that run Git pre-commit or pre-push hooks, deploy code with Ansible, or build, push or pull the project's latest Docker images.
I also use one within my Talks repository to generate PDF files using rst2pdf, present them using pdfpc, and generate thumbnail images.
For examples of `run` files that I use in my open-source code, [you can look in my public GitHub repositories](https://github.com/search?l=Shell&q=user%3Aopdavies+filename%3Arun&type=Code), and for more information, here is [Nick's blog post where I first found the idea](https://nickjanetakis.com/blog/replacing-make-with-a-shell-script-for-running-your-projects-tasks).

---
permalink: daily/2022/08/16/what-are-git-hooks-why-are-they-useful
pubDate: 2022-08-16
title: "What are Git hooks and why are they useful?"
tags: ["git"]
---
In yesterday's email, I mentioned Git hooks but didn't go into any detail. So, what are they?
Git hooks are Bash scripts that you add to your repository that are executed when certain events happen, such as before a commit is made or before a push to a remote.
By default, the script files need to be within the `.git/hooks` directory, have executable permissions, and be named to exactly match the name of the hook - e.g. `pre-push` - with no file extension.
If a hook script returns a non-zero exit code, then the process is stopped and the action doesn't complete.
This is useful if, for example, you or your team use a specified format for commit messages and you want to prevent the commit if the message doesn't match the requirements.
But the main benefit that I get from Git hooks is from the `pre-push` hook.
I use it to run a subset of the checks that are run within the project's CI pipeline, to limit failures in the CI tool and fix simple errors before I push the code.
Typically, these are the quicker tasks such as ensuring the Docker image builds, running linting and static analysis, validating lock files, and some of the automated tests if they don't take too long to run.
If a build is going to fail because of something simple like a linting error, then I'd rather find out and fix it locally than wait for a CI tool to fail.
Also, if you're utilising trunk-based development and continuous integration where team members are pushing changes regularly, then you want to keep the pipeline in a passing, deployable state as much as possible and prevent disruption.
But what have Git hooks got to do with the "run" file?
Firstly, I like to keep the scripts as minimal as possible and move the majority of the code into functions within the `run` file. This means that the scripts are only responsible for running functions like `./run test:commit` and returning the appropriate exit code, but also means that it's easy to iterate and test them locally without making fake commits or trying to push them to your actual remote repository (and hoping that they don't get pushed).
Secondly, I like to simplify the setup of Git hooks with their own functions.
For security reasons, the `.git/hooks` directory isn't committed and pushed to your remote, so hooks need to be enabled per user within their own clone of the repository.
A common workaround is to put the scripts in a directory like `.githooks` and either symlink them to where Git expects them to be, or to use the `core.hooksPath` configuration option and change where Git is going to look.
I like to lower the barrier for any team members by creating `git-hooks:on` and `git-hooks:off` functions which either set or unset the `core.hooksPath`. If someone wants to enable the Git hooks then they only need to run one of those commands rather than having to remember the name of the configuration option or manually creating or removing symlinks.
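For illustration, those two functions could be as small as this (a sketch; the `.githooks` directory name is an assumption):

```bash
# Point Git at a committed .githooks directory (hypothetical task names).
function git-hooks:on {
  git config core.hooksPath .githooks
}

# Revert to Git's default .git/hooks directory.
function git-hooks:off {
  git config --unset core.hooksPath
}
```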
There are other Git hooks that can be used but just using `pre-commit` and `pre-push` has saved me and teams that I've worked on both Developer time and build minutes, provides quicker feedback and fewer disruptions in our build pipelines, and I like how simple it can be by creating custom functions in a `run` file.
Lastly, I've created <https://github.com/opdavies/git-hooks-scratch> as an example with a minimal `run` file and some example hooks.

---
permalink: daily/2022/08/17/one-more-run-command-git-worktrees
pubDate: 2022-08-17
title: One more "run" command, for Git worktrees
tags: ["drupal", "git"]
---
Here's another `run` file example, this time relating to Git worktrees...
One project that I work on is a multilingual Drupal application that needs to work in both English and Welsh. As I'm cloning a fresh version today, I'm doing it as a bare repository so I can use worktrees.
To work on it locally, just like in production, I need to use a different URL for each language so that Drupal can identify it and load the correct content and configuration.
For fixed environments like production or staging, the URLs are set in configuration files, but for ad-hoc environments such as local worktrees, I thought that the best approach was to override them as needed per worktree using Drush (a Drupal CLI tool).
I could do this manually each time or I could automate it in a `run` command. :)
Here's the function that I came up with:
```bash
function drupal:set-urls-for-worktree {
  # Set the site URLs based on the current Git worktree name.
  local worktree_name="$(basename "$PWD")"

  local cy_url="cy-projectname-${worktree_name}.docker.localhost"
  local en_url="projectname-${worktree_name}.docker.localhost"

  # Update the URLs.
  drush config:set -y language.negotiation url.domains.cy "$cy_url"
  drush config:set -y language.negotiation url.domains.en "$en_url"

  # Display the domains configuration to ensure that they were set correctly.
  drush config:get language.negotiation url.domains
}
```
It builds the worktree URL for each language based on the directory name, executes the configuration change, and finally displays the updated configuration so I can confirm that it's been set correctly.
This is a good example of why I like using `run` files and how I use them to automate and simplify parts of my workflow.

---
permalink: daily/2022/08/18/talking-drupal-tailwind-css
pubDate: 2022-08-18
title: "'Talking Drupal' and Tailwind CSS"
tags:
- css
- tailwind-css
- twig
---
In March, I was a guest again on the Talking Drupal podcast. This time I was talking about utility CSS and, in particular, the Tailwind CSS framework.
I've become a big fan of this approach to styling websites and was an early adopter of Tailwind, and have released [a starter-kit theme](https://www.drupal.org/project/tailwindcss) for building custom Drupal themes with Tailwind CSS based on what I was using for my own client projects.
## Rebuilding Talking Drupal with Tailwind
Usually when I give a Tailwind CSS talk at a conference or user group, I rebuild something familiar - maybe a page of their website - as an example and to explain some of the concepts and anything that was particularly interesting during the build. (I have [a blog post]({{site.url}}/blog/uis-ive-rebuilt-tailwind-css) that lists the ones that I've done before).
After this podcast episode, I built a [Tailwind version of the Talking Drupal homepage](https://talking-drupal-tailwindcss.oliverdavies.uk).
But, given that Drupal uses Twig, and that we'd talked about best practices around templating engines - using loops and extracting components to organise code and reduce duplication - I definitely wanted to build this example using Twig templates.
Drupal seemed like too much for a single page example, and Symfony or Sculpin could distract from the main focus of the demo, so I decided to start from scratch with an empty PHP file and add Twig and any other dependencies myself.
[The code repository](https://github.com/opdavies/talking-drupal-tailwindcss) is publicly viewable on my GitHub profile so people can look at the code and see some of the things that I talked about during the episode in practice, and not just the resulting HTML in a browser.
You can [listen to the episode](https://talkingdrupal.com/338), and if you want any more information, the slides and video from my [Taking Flight with Tailwind CSS talk]({{site.url}}/talks/taking-flight-with-tailwind-css) are on my website.

---
permalink: daily/2022/08/19/pair-programming-or-code-reviews
pubDate: 2022-08-19
title: Pair programming or code reviews?
---
It's been almost a year and a half since I last pushed a feature branch, created a pull request, and waited for it to be reviewed and (hopefully) merged and deployed.
On the majority of teams and projects that I've worked on, this was how things were done.
Tasks would be worked on in separate branches which would need to be reviewed by one or more other Developers before being merged.
I'm an advocate for continuous integration and trunk-based development (both of which I plan to write about in more depth), in which there is no formal code review step; instead, I encourage people to pair program as much as possible.
Pair or mob (group) programming, for me, is like a real-time code review where you can discuss and make changes instantly, rather than waiting until the work is complete and someone reviewing it after the fact. If a bug is spotted as you're typing it or something could be named better, you can update it there and then.
But there are other benefits too.
Instead of one person writing some code, and others reviewing it after the fact, multiple people have written it together and the knowledge is shared amongst those people.
As you've worked together, you don't need to ask or wait for someone to set time aside to review your changes, so it's quicker for them to be merged and deployed. It's already been reviewed, so as long as any automated checks pass, the code can be merged.
I've worked in pairs where I've taught someone how to write automated tests and do test-driven development, which I suspect wouldn't have been quite the same if they'd just read the finished code afterwards.
Of course, some Developers and teams will prefer the typical code review process - it's worked well for me and projects that I've worked on in the past - but personally, I like the speed, agility, mentoring and learning, and social benefits that I can get more easily from pair programming.

---
pubDate: 2022-08-20
title: "A return to offline meetups and conferences"
permalink: "archive/2022/08/20/return-to-offline-meetups-conferences"
tags: ["community"]
---
Yesterday, I dusted off our Meetup page and posted our next [PHP South Wales meetup](https://www.meetup.com/php-south-wales) event.
We've had online meetups and code practice sessions throughout the pandemic and during lockdowns, but this will be our first offline/in person/IRL meetup since February 2020.
As well as organising our online meetups during COVID, I attended a lot of other online events, [usually giving various talks or workshops]({{site.url}}/blog/speaking-remotely-during-covid-19), and whilst they were good for a while, I eventually started to get burned out by them.
I've been an organiser of various meetups and conferences for a long time, and attending events has been a very large part of my career so far - providing opportunities to learn, to network and socialise with other attendees, and pass knowledge on through talks, workshops and mentoring.
It's been great to see some offline events returning, from local user groups to conferences such as DevOpsDays, DrupalCon and SymfonyLive.
I've given one talk this year - a lot less than this time last year - but it was in front of an audience instead of a screen, and whilst it seemed strange, I'm sure that it's something that will feel normal again in time.
I'm thinking of attending a conference next month, I've submitted some talk suggestions to some other conferences which I'm waiting to hear from, and am considering travelling to some of the other UK user groups as they restart - some of which I joined or spoke at online but it would be great to meet them in person.
For next week, I'll be glad to have PHP South Wales events running again and to see our community back together in person, and then do it again and start getting ready for next month's event.

---
permalink: daily/2022/08/21/2022-08-21
pubDate: 2022-08-21
title: "Why I use Docker and Docker Compose for my projects"
tags:
- docker
---
For the last few years, I've used Docker and Docker Compose exclusively on all of my projects. When I start a new project or onboard a new client, usually one of the first things that I need to do is get an application running in Docker so that I can work on it.
<!-- Since I started programming, I've used a number of different local environments. Starting with WAMP and XAMPP on Windows, MAMP on macOS, Laravel Valet, the Symfony local server, and various open-source Docker-based solutions. -->
I've inherited projects with no environment configuration or documentation at all, where I need to start from scratch to get them running. Ideally, each project would have its local environment configuration in the same Git repository as the application code.
For my own projects, these days I prefer to use Docker and Docker Compose - creating my own Dockerfiles for each project so that the correct dependencies are present and the required build steps are executed, as well as acting as documentation.
It's lean as the environment is built specifically for each project, and easy to configure using Docker and Docker Compose directly using native patterns such as override files, environment variables and interpolation, and multi-stage builds.
The configuration can be as simple or complicated as it needs to be for each project rather than using "a one size fits all" approach. If I'm working with Drupal, Fractal, Vue.js, a PHP library, a Go command line tool, or something else entirely, I can use the most appropriate starting point.
As well as local development, it's easy to use Docker and Docker Compose in CI environments with tools like GitHub Actions and Bitbucket Pipelines. They will either be present by default or will be easy to install, and it's simple to run a `docker-compose build` or `docker-compose run` command within a pipeline to check that the project builds correctly and to execute tasks such as automated tests or static analysis.
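A CI job using Docker Compose might look something like this GitHub Actions sketch (the service and command names are assumptions):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Build the project's images, then run checks inside the container.
      - run: docker-compose build
      - run: docker-compose run --rm php-fpm vendor/bin/phpunit
```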
As well as using it for projects, Docker has been useful for me in other situations where I need to run small tools such as rst2pdf for generating presentation slides, and ADR Tools for working with architectural decision records.
For some situations like an open-source contribution day, using an off-the-shelf solution would probably be a better option, and some teams will have their own preferences, but I prefer to use Docker and Docker Compose when I can.
Personally, I like to invest time into learning tools that provide reusable knowledge, such as Docker and Docker Compose. I'd prefer to spend time learning something, even if it may take longer compared to other tools, if it's going to give me a return on that investment in the medium- to long-term.
For some examples of how I work with Docker and Docker Compose, you can [see my public GitHub repositories](https://github.com/opdavies?tab=repositories&q=docker) and how things are put together there.

---
permalink: daily/2022/08/22/2022-08-22
pubDate: 2022-08-22
title: "Being a T-shaped Developer"
---
A blog post appeared on my feed this morning, titled [How to be T-Shaped](https://www.nomensa.com/blog/how-to-be-t-shaped).
"T-shaped Developers" is a term that I've also used before. Being T-shaped means that you have a deep knowledge in one particular area and a breadth of knowledge in other areas.
I would say that I'm T-shaped.
My main area of knowledge is PHP and Drupal software development - they're the programming language and content management system that I've used throughout most of my career so far, since I started in 2007.
As I worked on my own personal and client projects, I needed to learn more complementary skills.
I needed to learn how to style websites and build themes so I started to learn front-end development with CSS and frameworks like Bootstrap, Bulma and Tailwind CSS, and JavaScript frameworks like Angular, Vue.js and Alpine, as well as TypeScript.
I also needed to host these projects somewhere, which introduced me to Linux servers, virtual hosts, (S)FTP and SSL, web servers like Apache, Nginx and Caddy, MySQL and MariaDB databases, and as projects got more complicated, I started using tools like Vagrant and Puppet, Ansible, and Docker for configuring environments to work in.
I don't use Drupal for every project. I've used static site generators and frameworks like Symfony based on the project's requirements, and have projects that use several different technologies at the same time.
The main benefits are that I can either deliver entire projects or projects with more complicated architectures, or work across different teams - mentoring a team of Front-End Developers in Drupal theming, or working with System Administrators to start hosting PHP applications. Having these additional skills is definitely valuable to employers and clients.
I've said that one of the best and worst things about software development is that there's always something new to learn!

---
pubDate: 2022-08-23
title: "Git: GUI or command-line?"
permalink: "archive/2022/08/23/git-gui-command-line"
tags:
- "git"
---
I've been using Git for a long time. My first full-time Developer role in 2010 was working on an in-house team, and that project used Git as its version control system.
I remember typing commands into an Ubuntu terminal and trying to wrap my head around the process of adding and staging files, (sometimes) pulling, and then pushing to a remote. I think the remote was a simple bare repository on a server, so there was no UI like there is in GitHub and similar tools today.
In fact, GitHub only started two years earlier in 2008, and GitLab wasn't around until 2014.
Looking back, my introduction to Git as a Junior Developer wasn't easy and I remember starting to get frustrated until it eventually "clicked" and made sense.
I don't remember if there were GUIs at that time (I remember using gitk but I can't think when), but having a tool like GitHub where I could see the code, branches and commits, would probably have been helpful with my initial learning.
Whilst working locally, I've tried some of the desktop GUI tools like Sourcetree, GitKraken and Tower, but I always come back to using Git on the command line.
While a Git GUI tool may make it easier to learn Git initially as a Junior Developer, I'd recommend trying to learn the command line too.
In my opinion, understanding what's happening "under the hood" is important when working with a GUI - just in case you find yourself unexpectedly having to use the command line. I've seen an error in a Git GUI that suggests running commands in the terminal to debug or fix the issue. If you aren't familiar with the terminal commands or what they do, then I'd expect this to be intimidating and confusing.
If you're working as part of a team or contributing to an open-source project, then the consistency that the command line provides will make it easier when working with colleagues or getting help from project maintainers. You're also learning Git itself rather than a tool that may add its own terminology or change how Git works, also causing confusion.
There's a lot of Git functionality and concepts that I wouldn't have explored if I'd been relying on a GUI instead of the command line, such as adding and removing code in chunks using patch mode, using bisect to find when a bug was introduced, worktrees for local code organisation, and understanding merging vs rebasing, interactive and non-interactive rebases, and merge commits and fast-forward merges.
Of course, if you prefer to use a GUI and it works for you, then that's fine. Personally, I like to dig deep when learning tools, to know them inside-out and understand how to use them well, and I think that the time that I've spent learning Git and optimising my workflow paid for itself a long time ago.
How do you like to use Git? Do you prefer to use the command line or a GUI tool? Reply to this email and let me know.

---
permalink: daily/2022/08/24/2022-08-24
pubDate: 2022-08-24
title: "How I've configured Git"
tags:
- "git"
---
After yesterday's post on why I prefer using Git on the command line rather than using a GUI tool, today I thought that I'd post about how I've configured Git.
First, I rarely ever run the `git` command - I usually run a `g` function that I've created within my zsh configuration.
Rather than being a simple alias, it's a shell function that will run `git status -sb` to show the current status of the repository if there are no additional arguments. If there are, such as when running `g add`, then they are executed as a normal Git command. (This is something that I first saw from thoughtbot, if I remember correctly).
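A minimal sketch of such a function, based on the description above (the exact implementation in my dotfiles may differ):

```bash
# A sketch of the "g" function: status by default, otherwise pass through.
function g {
  if [ $# -gt 0 ]; then
    # Arguments were given, so run them as a normal Git command.
    git "${@}"
  else
    # No arguments, so show a short status of the current repository.
    git status -sb
  fi
}
```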
## Using .gitconfig
The main part of my configuration is within Git's `~/.gitconfig` file, where I can configure Git to work how I want.
For example, I like to avoid merge conflicts, so I always want to use fast-forward merges whilst pulling and also to rebase by default. I can do this by adding `ff = only` and `rebase = true` to the `[pull]` section of my `~/.gitconfig` file.
I can do this manually, or running `git config --global pull.rebase true` will set the option but also update the file automatically.
Some of the tweaks that I've made are to only allow fast-forward merges by adding `merge.ff = only`, automatically squash commits when rebasing by setting `rebase.autosquash = true`, and automatically pruning branches by adding `fetch.prune = true`.
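Put together, the relevant sections of `~/.gitconfig` might look like this (a sketch based only on the options mentioned above):

```ini
[pull]
    ff = only
    rebase = true

[merge]
    ff = only

[rebase]
    autosquash = true

[fetch]
    prune = true
```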
### Simple aliases
Another way that I configure Git is using aliases, which are also within the `~/.gitconfig` file.
For example, if I ran `git config --global alias.b "branch"`, then running `git b` would just run `git branch` which shortens the command and saves some time and keystrokes.
I have similar one- or two-letter "short" aliases for pushing and pulling code, and some that also set some additional arguments, such as `aa` for `add --all` and `worktrees` for `worktree list`.
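Based on those examples, the `[alias]` section might look like this (a sketch):

```ini
[alias]
    b = branch
    aa = add --all
    worktrees = worktree list
```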
### More complicated aliases
Aliases can be more complex if needed by prefixing them with a `!`, meaning that they're executed as shell commands.
This means that I can have `repush = !git pull --rebase && git push` to chain two separate Git commands and combine them into one, and `ureset = !git reset --hard $(git upstream)` which executes the full command, including another alias as part of it.
I also have `issues = !gh issue list --web` and `pulls = !gh pr list --web` to open the current repository's GitHub issues or pull requests respectively, which can be done as it's not limited to just running `git` commands.
### Custom functions
Finally, if an alias is getting too long or complex, then it can be extracted to its own file.
Any executable file within your `$PATH` that starts with `git-` will automatically become a Git command.
One example that I have is [git-cm](https://github.com/opdavies/dotfiles/blob/2b20cd1e59ae3b1fa81074077e855cbdfa02f146/bin/bin/git-cm) which, similar to the `g` function, is a bash script that checks for any arguments passed to it and runs a slightly different command. It achieves the same thing as if it were an alias, but it does make it easier to write and maintain as it's in a separate file.
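Its logic might look something like this hypothetical sketch (written as a function for illustration; the actual script is in the linked repository):

```bash
# Sketch of a "git-cm" style command: commit with the arguments as the
# message if any were given, otherwise open the editor for a full message.
function git-cm {
  if [ $# -eq 0 ]; then
    git commit
  else
    git commit -m "$*"
  fi
}
```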
These are just some examples. If you want to see my entire configuration, then check out [my dotfiles repository on GitHub](https://github.com/opdavies/dotfiles/tree/2b20cd1e59ae3b1fa81074077e855cbdfa02f146/roles/git/files).
How have you configured Git for your workflow? Reply to this email and let me know.

---
pubDate: 2022-08-25
title: "Why I work in Neovim"
tags: ["vim", "neovim"]
permalink: "archive/2022/08/25/why-i-work-in-neovim"
---
Over a year ago, I posted that I was [switching to using Neovim full-time]({{site.url}}/blog/going-full-vim) for my development work.
I'd used Vim one file at a time on remote servers, and added Vim plugins in other IDEs and editors, so I was already familiar with a lot of the key bindings and motions before I decided to use it full-time.
Still, it was tough to begin with, but once I'd learned how to configure Neovim, I also learned that being able to customise and extend it as much as you need to is one of its main advantages compared to other IDEs and code editors.
TJ DeVries - a Neovim core team member - has recently coined the term "PDE" (a personalised development environment) which, for me, describes Neovim perfectly.
Currently, I have a fuzzy-finder to quickly open files (as well as many other things), an LSP client to add code intelligence, auto-completion, refactoring tools, custom snippets and, very recently, a database client and an HTTP client.
Just as important to me, I've found a growing community of other Neovim users who stream on Twitch, post YouTube videos, write blog posts, or publish their dotfiles for others to see and reference.
I've learned Lua. Not just for my own Neovim configuration, but I recently wrote and open-sourced my own simple plugin.
Like Git, I enjoy and prefer using tools that I can configure and adapt to my workflow.
Given Neovim's flexibility and configurability, its expanding feature set both in core and community plugins, and the growing community, I think that Neovim is going to be something that I continue to use and adapt for a long time.


@ -0,0 +1,19 @@
---
pubDate: 2022-08-26
title: "Always be learning"
permalink: "archive/2022/08/26/always-be-learning"
---
I've been a Developer for 15 years and one thing that I've always focussed on is to always keep learning.
From starting as a self-taught Developer, initially learning HTML and CSS, to later learning PHP and Drupal as well as other languages, frameworks and tools.
For the last couple of days, I've been experimenting with Next.js - a React-based web framework. I hadn't used React before and have typically reached for Vue.js or sometimes Alpine.js based on what I needed to do. However, I'm always looking for opportunities to learn and implement new things, and see how I can use them in any of my projects.
This afternoon, I started a new Next.js and TypeScript project, and refactored a small codebase that used a static site generator to create a small number of landing pages from Markdown files.
In a short time, I had set up a Docker environment for it based on some of my Vue.js projects, ported the application across to recreate the pages, and finally updated the CI pipeline that generated the static pages and uploaded them to an S3 bucket.
The end result is the same - the same HTML pages are generated and uploaded - but, for me, trying and experimenting with new things keeps my work interesting and my knowledge fresh, which benefits me as well as my colleagues and clients.
As I said in a previous email, one of the great things about software development is that there's always something new to learn.


@ -0,0 +1,15 @@
---
pubDate: 2022-08-27
title: "Giving back"
permalink: "archive/2022/08/27/giving-back"
---
Today, I've been at an event run by a local animal rescue charity. It's one that we attend often as my children like to enter the dog show, but this year, I've also sponsored one of the categories.
As well as organising the PHP South Wales user group, I'm also now a sponsor - donating books and elePHPant plushies for raffle prizes and paying the group's Meetup.com subscription costs.
Giving back and supporting open-source maintainers and content creators is a big priority of mine. If I use some open-source software or find that someone's Twitch or YouTube channel is useful, and that person or organisation is on GitHub or Patreon, then I'll sponsor them, or I'll subscribe to their channel.
If I find a useful blog post or video, I'll add a comment or link to it on Twitter, thanking them and letting them know that it helped me.
Especially if it's something that I've used within my projects, it makes sense to support it and its maintainers, so that they keep working on and improving the software, continue streaming, and keep writing blog posts and recording videos for me to learn from.


@ -0,0 +1,27 @@
---
pubDate: 2022-08-28
title: "How I started programming"
permalink: "archive/2022-08-28/how-started-programming"
---
In 2007, I was working in the IT sector in a Desktop Support role but hadn't done any coding professionally.
In my spare time, I was a black belt in Tae Kwon-Do and enjoyed training at a few different schools. Because of my IT experience, I was asked if I could create a website for one of the schools - somewhere that we could post information and class times for new starters, as well as news articles and competition results.
This would be my introduction to programming.
I started learning what I needed to know, starting with HTML and CSS - experimenting with a template that I found online and was able to tweak to match the school's colours.
I was able to complete the first version of the website with static HTML pages and CSS but had to manually create a new HTML page for every new news article and edit existing pages manually.
I wanted to make it more dynamic, and started to learn about PHP and MySQL from video courses and online forums.
After posting a question about some PHP code that I'd written, someone suggested that I look at content management systems - namely Drupal, which was used for that forum (I have [a screenshot of the reply](https://twitter.com/opdavies/status/1185456825103241216)). This was a new concept to me as, until that point, I'd written everything myself whilst learning.
I remember evaluating Drupal alongside some others - rebuilding the same website a few different times, but stuck with Drupal and relaunched it on Drupal 6 and a custom theme that I'd created from the original templates.
I signed up for a Drupal.org account, started to do some freelance work for a local web design agency, and built a new website for a local cattery.
I started blogging, attending meetups, and when an opportunity to switch careers to software development came along, I applied for and got the job.
That job was also using Drupal and, in another email, I'll write more about why I still like and use Drupal years later.


@ -0,0 +1,22 @@
---
pubDate: 2022-08-29
title: "Why I like Drupal"
permalink: "archive/2022/08/29/why-like-drupal"
tags: ["drupal"]
---
As I said in yesterday's email, I developed my first website project on Drupal. It allowed me to take a static HTML and CSS website and convert it into something that was much easier and quicker for me to update, and allowed me to create more users with permissions to do those tasks too.
I worked on various Drupal projects, and my first full-time job was on an in-house team where we maintained and enhanced a Drupal 6 website.
I've since used Drupal for projects of all shapes and sizes with different levels of complexity. Everything from a simple brochure website to large and complex, multilingual, API-driven projects.
I've been able to build eCommerce websites with Drupal using Ubercart and Drupal Commerce. I've built traditional stores where customers purchase physical products, a photography competition website with custom judging functionality, a site for purchasing commercial and residential property and land searches, and a fully-fledged events booking and management platform.
Whatever the size and complexity of the project, Drupal is flexible enough to fit it.
I've loved some of the ecosystem improvements within the last few years. Moving to object-orientated code by default, integrating code from other projects like Symfony, shipping new features every six months as part of the new release cycle, and embracing tools like Composer, PHPStan and Rector.
I also love being part of the Drupal community. Collaborating on tasks, speaking on Slack, and attending events like DrupalCon where I've been lucky enough to attend, speak and mentor.
Although Drupal is my specialty and the tool that I've used the most, I don't use it exclusively. I'll talk more about this in tomorrow's email.


@ -0,0 +1,24 @@
---
pubDate: 2022-08-30
title: "Why I don't only use Drupal"
permalink: "archive/2022/08/30/why-dont-only-use-drupal"
tags: ["drupal"]
---
Yesterday, [I shared some of the reasons]({{site.url}}/archive/2022/08/29/why-like-drupal) why I like Drupal and why I use it for the majority of my projects. But, as I said, I don't use it exclusively, and for some projects I use different tools.
Essentially, I always try to recommend and use the best tool for the job.
I previously interviewed for a job and was asked to complete a coding test. The role was mostly Drupal-focussed, but as the test asked for a command-line application, I completed it using Symfony and Symfony Console, and was able to discuss why I'd made that decision. In my opinion, it was the best choice based on the requirements.
This is the same approach that I use when making recommendations for a new project.
I've delivered projects using other tools like the Symfony framework or a static site generator, as long as it fitted the requirements.
If there's a SaaS solution that can be used instead, or an off-the-shelf tool that can be integrated instead of writing a custom solution, then that should be evaluated.
There may be other constraints like budgets or deadlines to consider - maybe something can be delivered faster or cheaper using a particular technology, even if it's not the final solution.
There are situations though where a tool may be the best choice even though it's not the ideal fit based purely on the technical requirements. Maybe the client is already familiar with publishing content in Drupal, or an in-house development team is used to working with a certain tool or language. In that case, those things should be considered too.
Also, for me, having a chance to evaluate other technologies and explore what's happening outside of the Drupal ecosystem is a good opportunity. A lot of what I've learned about automated testing, for example, is from the wider PHP and JavaScript communities, as well as tools like [Tailwind CSS]({{site.url}}/talks/taking-flight-with-tailwind-css) and [Illuminate Collections]({{site.url}}/talks/using-illuminate-collections-outside-laravel) that I've been able to bring back into my other Drupal projects.


@ -0,0 +1,24 @@
---
title: "To monorepo, or not to monorepo?"
permalink: "archive/2022/08/31/monorepo-or-not"
pubDate: 2022-08-31
tags: ["git"]
---
I listened to a podcast episode recently which talked about monorepos - i.e. code repositories that contain multiple project codebases rather than a single repository for each codebase - and this got me thinking about whether I should be using these more.
It's something that I've been trialling recently in my [Docker examples](https://github.com/opdavies/docker-examples) and [Docker images](https://github.com/OliverDaviesLtd/docker-images) repositories, where one repository contains and builds multiple Docker images.
I'm not suggesting that I put all of my client projects into one repository, but at least combining the different parts of the same project into the same repository.
For example, I'm working for one client on their current Drupal 7 websites whilst developing the new Drupal 9 versions, which are currently in two separate repositories. I'm also developing an embeddable Vue.js application as part of the Drupal 9 website, and using Fractal as a component library. These are also in their own repositories.
Using a monorepo approach, all of these projects would be in the same repository.
I can see advantages to being able to see cross-project changes in the same place - such as an API change in Drupal that needs an update to be made in Vue.js, or vice versa - rather than needing to look at separate repositories. This could also make versioning easier as everything will be stored and tagged inside the same repository.
Each project has its own CI pipeline, so it would require some changes so that a specific pipeline runs only when its directory is changed.
I can see how deployments may be trickier if I need to push an update within a directory to another Git repository, which makes me wonder if I'll need to look into using subtree splits to create separate deployment repositories - similar to how the Symfony project has one main repository with each component split into its own repository.
I'll keep trialling it in my open-source projects and maybe test it with some client projects, but if you have experience with monorepos that you'd like to share, then please reply to this email - I'd love to hear about it.


@ -0,0 +1,40 @@
---
pubDate: 2022-09-01
title: "Conventional commits and CHANGELOGs"
tags: []
permalink: "archive/2022/09/01/conventional-commits-changelogs"
---
One of the things that I've done since joining my current team is to implement a standard approach for our commit messages.
We're using the [Conventional Commits specification](https://www.conventionalcommits.org), which gives some additional rules to follow when writing commit messages.
For example:
```
build(deps): update Drupal to 9.4.5
Updated Drupal's `drupal/core-*` packages to 9.4.5.
See https://www.drupal.org/project/drupal/releases/9.4.5.
Refs: #123
```
We can see that this is a `build` task that relates to our project dependencies - in this example, updating Drupal core - which we can also see in the subject line.
In the commit body, I add as much information as possible to do with the change and include any relevant links, just in case I need to refer to them again, and list the names of anyone else who worked with me. I also typically include any ticket numbers or links in the commit footer.
So far, I've mostly used the `build`, `chore`, `ci`, `docs` and `refactor` commit types, which are types that are recommended and used by [the Angular convention](https://github.com/angular/angular/blob/22b96b9/CONTRIBUTING.md#-commit-message-guidelines).
Following this standard means that it's very easy to look at the Git log and see what types of changes are going to be included within a release and, if you're using scopes, which parts of the application are affected.
Conventional commits also works nicely with something else that we've introduced, which is a CHANGELOG file.
There are tools that can generate and update CHANGELOGs automatically from conventional commits, but so far, we've been following the [Keep a Changelog](https://keepachangelog.com) format.
It's easy to match the commits to the `Added`, `Changed` or `Fixed` types, and although it needs to be updated manually, it's easy to add to the `Unreleased` section of the file and re-organise everything within the appropriate headings as needed as part of a release.
What I like about this format is that it's more human-friendly and gives a higher level overview of the changes rather than a reformatted Git log.
As we do trunk-based development and continuous integration on our projects, there can be numerous commits related to the same change, so I'd rather only see a single line in the CHANGELOG for each change. This also makes it easier to share the CHANGELOG file with others, and we can still view and grep the Git log to see the individual commits if we need to.
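As a sketch, the top of a file in this format might look like this (the entries are illustrative):

```
## [Unreleased]

### Added
- Add a new contact form.

### Changed
- Update Drupal core to 9.4.5.

### Fixed
- Fix the date format on news articles.
```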


@ -0,0 +1,22 @@
---
title: "Automating all the things with Ansible"
pubDate: 2022-09-02
permalink: "archive/2022/09/02/automating-all-the-things-with-ansible"
tags: ["ansible"]
---
Ansible is a tool for automating IT tasks. It's one of my preferred tools to use, and one that I've written about and [presented talks on]({{site.url}}/talks/deploying-php-ansible-ansistrano) previously.
It's typically thought of as a tool for managing configuration on servers. For example, you have a new VPS that you want to use as a web server, so it needs Nginx, MySQL, PHP, etc. to be installed - or whatever your application uses. You define the desired state and run Ansible, which will perform whatever tasks are needed to get to that state.
Ansible does, though, include modules for interacting with services like Amazon AWS and DigitalOcean to create the servers and resources, and not just configure them.
It also doesn't just work on servers. I use Ansible to configure my local development environment, to ensure that dependencies and tools are installed, and requirements like my SSH keys and configuration are present and correct.
Lastly, I use Ansible to deploy application code onto servers and automatically run any required steps, ensuring that deployments are simple, robust and repeatable.
In the next few emails, I'll explain how I've been able to utilise Ansible for each of these situations.
---
Want to learn more about how I use Ansible? [Register for my upcoming free email course]({{site.url}}/ansible-course).


@ -0,0 +1,57 @@
---
pubDate: 2022-09-03
title: Creating infrastructure with Ansible
permalink: dailys/2022/09/03/creating-infrastructure-with-ansible
tags: ["ansible"]
---
Let's start at the beginning.
If we want to automate our infrastructure, we first need to create it. This could be done manually, or it can be automated too.
Popular tools for this include Terraform and Pulumi, but Ansible also includes modules to interface with hosting providers such as Amazon Web Services, Microsoft Azure, DigitalOcean, and Linode.
By using one of these tools, you can programmatically provision a new, blank server that is ready for you to configure.
For example, to [create a DigitalOcean droplet](https://docs.ansible.com/ansible/latest/collections/community/digitalocean/digital_ocean_module.htm):
```yaml
---
- community.digitalocean.digital_ocean_droplet:
    image: ubuntu-20-04-x64
    name: mydroplet
    oauth_token: "..."
    region: sfo3
    size: s-1vcpu-1gb
    ssh_keys: [ .... ]
    state: present
    wait_timeout: 500
  register: my_droplet
```
Running this playbook will create a new Droplet with the specified name, size, and operating system, and within the specified region.
If you need to create a separate database server or another server for a new environment, then the file can be updated and the playbook re-run.
[Creating an Amazon EC2 instance](https://docs.ansible.com/ansible/latest/collections/amazon/aws/ec2_instance_module.html#ansible-collections-amazon-aws-ec2-instance-module) looks very similar:
```yaml
---
- amazon.aws.ec2_instance:
    image_id: ami-123456
    instance_type: c5.large
    key_name: "prod-ssh-key"
    name: "public-compute-instance"
    network:
      assign_public_ip: true
    security_group: default
    vpc_subnet_id: subnet-5ca1ab1e
```
This doesn't apply just to servers - you can also use Ansible to create security groups and S3 buckets, manage SSH keys, firewalls, and load balancers.
Once we have our infrastructure in place, we can start using Ansible to set and manage its configuration, which we'll do in tomorrow's email.
---
Want to learn more about how I use Ansible? [Register for my upcoming free email course]({{site.url}}/ansible-course).


@ -0,0 +1,23 @@
---
title: "Using Ansible for server configuration"
pubDate: 2022-09-04
permalink: "archive/2022/09/04/using-ansible-for-server-configuration"
---
[In yesterday's email]({{site.url}}/archives/2022/09/03/creating-infrastructure-with-ansible), I described how to set up a blank server with Ansible.
Now that we've done that, it needs to be configured.
Once the server's IP address or hostname has been added to a `hosts.ini` file, you can run ad-hoc commands against it - such as `ansible all -i hosts.ini -m ping` to run Ansible's `ping` module on all of the hosts in your inventory and check that you can connect to them.
Another useful one is the `shell` module, which runs ad-hoc shell commands on each host. If you need to check the uptime of each of your servers, run `ansible all -i hosts.ini -m shell -a uptime`. You can replace the last argument with any other shell command that you need to run, like `df` or `free`.
Running commands in this way is great for getting started, for routine maintenance, or for an emergency free disk space check, but for more complex tasks like configuration management, using playbooks is the better option. Playbooks are YAML files that contain lists of tasks, which Ansible runs through and executes in order.
If you have a group of related tasks, such as for installing a piece of software, then you can combine them into roles. In fact, Ansible Galaxy has thousands of pre-built collections and roles that you can download, include in your playbooks, configure, and run.
Very quickly, you can get a full stack installed and configured - ready to serve your application.
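As a sketch, a minimal playbook for this might look like the following (the host group, package, and service names are illustrative assumptions):

```yaml
---
- hosts: webservers
  become: true

  tasks:
    - name: Install Nginx
      ansible.builtin.apt:
        name: nginx
        state: present

    - name: Ensure Nginx is started and enabled
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Running `ansible-playbook -i hosts.ini site.yml` would bring each host to this state, and because the tasks are idempotent, re-running it is safe.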
---
Want to learn more about how I use Ansible? [Register for my upcoming free email course]({{site.url}}/ansible-course).


@ -0,0 +1,25 @@
---
title: "Using Ansible for local environment configuration"
pubDate: 2022-09-05
permalink: "archive/2022/09/05/using-ansible-for-local-configuration"
---
As well as [configuring servers]({{site.url}}/archive/2022/09/04/using-ansible-for-server-configuration), you can use Ansible to configure your own local machine and development environment.
The change that you need to make is within the `hosts.ini` file:
```
127.0.0.1 ansible_connection=local
```
Instead of the server's IP address or hostname, use the localhost IP address and set `ansible_connection` to `local` to tell Ansible to run locally instead of using an SSH connection.
Another way to do this is to set `hosts: 127.0.0.1` and `connection: local` in your playbook.
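As a sketch, a playbook using that approach could look like this (the task is just an illustrative example):

```yaml
---
- hosts: 127.0.0.1
  connection: local

  tasks:
    - name: Ensure a projects directory exists
      ansible.builtin.file:
        path: ~/Code
        state: directory
```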
Once this is done, you can run tasks, roles, and collections to automate tasks such as installing software, adding your SSH keys, configuring your project directories, and anything else that you need to do.
For an example of this, you can see [my dotfiles repository on GitHub](https://github.com/opdavies/dotfiles).
---
Want to learn more about how I use Ansible? [Register for my upcoming free email course]({{site.url}}/ansible-course).


@ -0,0 +1,26 @@
---
title: "Deploying applications with Ansible"
pubDate: 2022-09-06
permalink: "archive/2022/09/06/deploying-applications-with-ansible"
---
The last few days' emails have been about using Ansible to create and configure infrastructure, but it can also be used to deploy application code.
The simplest approach is to build an artifact locally - e.g. a directory of static HTML pages from a static site generator - and upload it onto the server; for this, you could use Ansible's `synchronize` module.
It's a wrapper around the `rsync` command and makes it as simple as specifying `src` and `dest` values for the local and remote paths.
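A minimal task using it might look like this (the paths are illustrative, and in recent Ansible versions the module lives in the `ansible.posix` collection):

```yaml
- name: Upload the generated site
  ansible.posix.synchronize:
    src: output/
    dest: /var/www/example.com/
    delete: true
```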
For more complicated deployments, I like to use a tool called Ansistrano - an Ansible port of a deployment tool called Capistrano.
It creates a new directory for each release and updates a `current` symlink to identify and serve the current release, and can share files and directories between releases.
As well as being able to configure settings such as the deployment strategy, how many old releases to keep, and even the directory and symlink names, there are a number of hooks that you can listen for and add your own steps to as playbooks, so you can install dependencies, generate assets, run migrations, or rebuild a cache as part of each deployment.
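As a sketch, an Ansistrano deployment playbook with a custom hook might look like this (the variable values and file names are illustrative; check the Ansistrano documentation for the full list of options):

```yaml
---
- hosts: webservers

  roles:
    - ansistrano.deploy

  vars:
    ansistrano_deploy_to: /var/www/example.com
    ansistrano_keep_releases: 5
    ansistrano_after_symlink_tasks_file: "{{ playbook_dir }}/hooks/after-symlink.yml"
```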
If you're running your applications in Docker, you could use Ansible to pull the latest images and restart your applications.
For more information and examples, I've given a talk on Ansible at various PHP events, which covers some Ansible basics before moving on to [deploying applications with Ansistrano]({{site.url}}/talks/deploying-php-ansible-ansistrano).
---
Want to learn more about how I use Ansible? [Register for my upcoming free email course]({{site.url}}/ansible-course).


@ -0,0 +1,30 @@
---
title: "My Tailwind CSS origin story"
pubDate: 2022-09-07
permalink: "archive/2022/09/07/my-tailwind-css-origin-story"
tags: ["tailwind-css"]
---
Tomorrow night, I'm attending one of Simon Vrachliotis (simonswiss)'s Pro Tailwind workshops, so I thought that it would be a good time, as Simon has done himself recently on the Navbar podcast, to describe how I started using Tailwind CSS.
I remember watching a lot of Adam Wathan's live streams on YouTube before Tailwind CSS, and I remember when he started a new project - a SaaS product called KiteTail.
It was a Laravel and Vue.js project, and although I'm not a Laravel Developer primarily, I got a lot of other information from Adam's streams about automated testing, test-driven development, and Vue.js as I was learning Vue at the time.
One of the episodes was about styling a card component using some styles that Adam was copying between projects - which would eventually be the starting point for Tailwind CSS.
In fact, I think I watched some of the episode and stopped, as I was happy with the Sass and BEM or SMACSS approach that I was using at the time, and didn't see the value of the utility CSS approach that I was seeing for the first time (everyone seems to have a similar reaction at first).
After a while, I did revisit it, but because Tailwind CSS hadn't been released as its own project yet, I (like Simon) started to experiment with Tachyons - another utility CSS library.
I rebuilt a particularly tricky component that I'd just finished working on and had caused me some issues, and managed to re-do it in only a few minutes.
I started to use Tachyons on some personal and client projects as a layer on other frameworks like Bootstrap and Bulma, and later moved on to Tailwind CSS once it had been released.
I was working in this way on a project when I realised that I could use Tailwind for all of the styling instead of just adding small sprinklings of utilities here and there. I refactored everything and removed the other framework that I'd been using - leaving just Tailwind CSS.
With the exception of some legacy projects, now I use Tailwind CSS exclusively and have used it for a number of projects. I've given lunch and learn sessions to teams that I've worked on, [presented a Tailwind CSS talk]({{site.url}}/talks/taking-flight-tailwind-css) at a number of PHP, Drupal, WordPress, and JavaScript events, and maintain [a starter-kit theme](https://www.drupal.org/project/tailwindcss) for using Tailwind in custom Drupal themes.
I've also rebuilt a [number of existing sites]({{site.url}}/blog/uis-ive-rebuilt-tailwind-css) as examples and written some [Tailwind CSS related blog posts]({{site.url}}/blog/tags/tailwind-css).
I'm looking forward to attending Simon's workshop tomorrow and quickly putting that knowledge to use in the next phase of a project that I'm currently working on.


@ -0,0 +1,34 @@
---
title: "Keeping secrets with Ansible Vault"
pubDate: 2022-09-08
permalink: "archive/2022/09/08/keeping-secrets-with-ansible-vault"
tags: ["ansible"]
---
In the last few posts, I've talked about using Ansible to configure servers and local environments, both of which are likely to involve some sensitive or secret values. These could be database credentials within your application and on your server, or your SSH private keys within your local environment.
Rather than committing these to a code repository in plain text, Ansible includes the `ansible-vault` command to encrypt values.
To see this working, run `ansible-vault encrypt_string my-secret-password`, enter a password, and then you should see something like this:
```
!vault |
$ANSIBLE_VAULT;1.1;AES256
33353031663366313132333831343930643830346531666564363562666136383838343235646661
6336326637333230396133393936646636346230623932650a333035303265383437633032326566
38616262653933353033376161633961323666366132633033633933653763373539613434333039
6132623630643261300a346438636332613963623231623161626133393464643634663735303664
66306433633363643561316362663464646139626533323363663337363361633333
```
This is the encrypted version of that password, and this could be committed and pushed to a code repository.
You can use it within a playbook, and you'll be prompted to re-enter the password so that Ansible can decrypt and use it.
Rather than a single string, you could have a file of variables that you want to encrypt. You can do this by running `ansible-vault encrypt vault.yml` and including the file as before. Again, you'll be prompted by Ansible so that it can decrypt and use the values.
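As a sketch, using an encrypted variables file from a playbook might look like this (the file and variable names are illustrative):

```yaml
---
- hosts: webservers

  vars_files:
    - vault.yml

  tasks:
    - name: Use a decrypted value
      ansible.builtin.debug:
        msg: "Connecting to the database as {{ database_user }}"
```

Running it with `ansible-playbook deploy.yml --ask-vault-pass` prompts for the password so that Ansible can decrypt the file.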
For an example of how I'm using Ansible Vault, see [the Dransible repository](https://github.com/opdavies/dransible/tree/986ba5097d62ff4cd0e637d40181bab2c4417f2e/tools/ansible) on GitHub or my [Deploying PHP applications with Ansible, Ansible Vault and Ansistrano]({{site.url}}/talks/deploying-php-ansible-ansistrano) talk.
---
Want to learn more about how I use Ansible? [Register for my upcoming free email course]({{site.url}}/ansible-course).


@ -0,0 +1,20 @@
---
title: "Refactoring a Tailwind CSS component"
pubDate: 2022-09-09
permalink: "archive/2022/09/09/refactoring-tailwind-component"
tags: ["tailwind-css"]
---
After last night's Pro Tailwind theming workshop, I decided to revisit and refactor some similar code that I'd worked on before.
It was a demo for a presentation on utility-first CSS and Tailwind whilst I was at Inviqa.
I'd taken one of the components from the website that we'd launched and rebuilt it - in particular to show how Tailwind could be used for responsive and themeable components.
[The original version](https://play.tailwindcss.com/Yfmw8O5UNN) was written in Tailwind 1 and used custom CSS with `@apply` rules to include text or background colours to elements based on the theme being used on that page or component.
As well as moving it into a Next.js application, [the new version](https://github.com/opdavies/inviqa-tailwindcss-example) uses techniques covered in Simon's workshop - using CSS custom properties (aka variables) to override the colours, and writing custom plugins to generate the required styles. It doesn't include everything from the workshop, but enough for this refactor.
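The custom property approach can be sketched like this (the theme names and colour values are illustrative, not the ones from the workshop):

```css
/* Each theme sets the same custom properties to different values. */
:root,
[data-theme='default'] {
  --color-primary: #0891b2;
}

[data-theme='dark'] {
  --color-primary: #155e75;
}

/* Components reference the properties, so they re-theme automatically. */
.button {
  background-color: var(--color-primary);
}
```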
I also moved the `flex-basis` classes into their own standalone plugin and might release that as its own open-source plugin.
I'm working on a client project at the moment which will need switchable themes so I'm looking forward to putting these techniques to use again in the near future.


@ -0,0 +1,38 @@
---
title: "Automating Ansible deployments in CI"
pubDate: 2022-09-10
permalink: "archive/2022/09/10/automating-ansible-deployments-ci"
tags: ["ansible"]
---
Once you have a deployment that's run using Ansible, rather than running it manually, it's easy to automate it as part of a continuous integration pipeline and have your changes pushed automatically by tools like GitHub Actions and GitLab CI.
You'll need to configure SSH by adding a known hosts file and a private key so the tool can connect to your server, but after that, it's just running the same Ansible commands.
If you're using Ansistrano or other roles, you can install dependencies using `ansible-galaxy`, and use `ansible-vault` to decrypt any encrypted variables - securely storing the Vault password and any other secrets as environment variables within your pipeline.
Here's an example using GitHub Actions:
```yaml
- name: Download Ansible roles
  run: ansible-galaxy install -r requirements.yml

- name: Export the Ansible Vault password
  run: echo $ANSIBLE_VAULT_PASS > .vault-pass.txt
  env:
    ANSIBLE_VAULT_PASS: ${{ secrets.ANSIBLE_VAULT_PASS }}

- name: Deploy the code
  run: >
    ansible-playbook deploy.yml
    -i inventories/$INVENTORY_FILE.ini
    -e "project_git_branch=$GITHUB_SHA"
    --vault-password-file=.vault-pass.txt

- name: Remove the Ansible Vault password file
  run: rm .vault-pass.txt
```
Before these steps, I've added the SSH key and determined which inventory file to use based on the branch that was updated. The Vault password is exported and then removed once it has been used.
Automated tests and other code quality checks can be run in a prior job, ensuring that the deployment only happens if those checks pass. Assuming that all is good, the playbook is run and the changes are deployed automatically.


@ -0,0 +1,62 @@
---
title: "Custom styles in Tailwind CSS: `@apply`, `theme` or custom plugins"
pubDate: 2022-09-11
permalink: "archive/2022/09/11/custom-styles-tailwind-css-apply-theme-custom-plugins"
tags: ["tailwind-css"]
---
There are three ways to add custom styles to a Tailwind CSS project. As there have been [some recent tweets](https://twitter.com/adamwathan/status/1559250403547652097) around one of them - the `@apply` directive - I'd like to look at and give examples for each.
## What is `@apply`?
`@apply` is a PostCSS directive, provided by Tailwind, that allows re-using its classes - either when extracting components or when overriding third-party styles.
The CSS file is the same as if you were writing traditional CSS, but rather than adding declarations to a ruleset, you use the `@apply` directive and specify the Tailwind CSS class names that you want to apply.
For example:
```css
fieldset {
@apply bg-primary-dark;
}
```
This is a simple example but it's easy to see how this could be used in ways that weren't intended and how edge-cases can be found.
Adam said in another tweet:
> I estimate that we spend at least $10,000/month trying to debug extremely edge-case issues people run into by using `@apply` in weird ways.
## Using the `theme` function
As well as `@apply`, Tailwind also provides a `theme` function that you can use in your CSS file. This removes the abstraction of using the class names and adds the ability to retrieve values from the `theme` section of your `tailwind.config.js` file.
```css
fieldset {
  background-color: theme('colors.primary.dark');
}
```
This seems to be the preferred approach over using `@apply`.
## Creating a custom plugin
The `theme` function is also available if you write a custom Tailwind CSS plugin:
```javascript
const plugin = require('tailwindcss/plugin')
module.exports = plugin(({ addBase, theme }) => {
addBase({
fieldset: {
backgroundColor: theme('colors.primary.dark'),
}
})
})
```
This is an approach that I've used for [generic, open-source plugins](https://github.com/opdavies?tab=repositories&q=%23tailwindcss-plugin) but for project-specific styling, I've mostly used `@apply` or the `theme` function.
That said, I like the modular architecture of having different custom plugins - especially if they're separated into their own files - and being able to easily toggle plugins by simply adding to or removing from the `plugins` array.
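For example, toggling a plugin on or off is a single line in the `plugins` array (a sketch - the file paths here are placeholders):

```javascript
// tailwind.config.js
module.exports = {
  plugins: [
    require('./tailwind/fieldset-plugin'),
    // require('./tailwind/typography-plugin'),
  ],
}
```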
I usually don't write many custom styles in a Tailwind project but I think that I'll focus on using the `theme` function going forward, either in a stylesheet or a custom plugin.

---
title: "A month of daily emails"
pubDate: 2022-09-12
permalink: "archive/2022/09/12/month-daily-emails"
---
It's already been a month since I started my email list and writing daily emails.
Since then, I've written emails on various development and workflow-based topics, including Drupal, Git, Docker, Neovim, Ansible and Tailwind CSS.
The first email was written on Thursday the 12th of August and after initially wondering whether I should start on the upcoming Monday, or how often to post, I decided to jump in with both feet and wrote the first daily post that day. The first few weren't actually emailed as I waited to see if I could sustain writing a daily post (I was just posting them to my website), but after a few days, I set up the email list and started sending the posts.
I can confirm what [Jonathan Stark](https://jonathanstark.com) and [Jonathan Hall](https://jhall.io) have said - that it's easier to write daily and that you start to see topic ideas everywhere. I started with a list of between 20 and 25 ideas and still have most of them as I've pivoted on a day's topic based on an article or tweet that I saw, some code that I'd written, or some approach that I took.
If you're considering starting a daily email list, I'd recommend it.

---
title: "The simplest Drupal test"
pubDate: 2022-09-14
permalink: "archive/2022/09/14/simpletest-drupal-test"
---
Most of my work uses the Drupal framework, and I've given talks and workshops on automated testing and building custom Drupal modules with test-driven development. Today, I wanted to see how quickly I could get a working test suite on a new Drupal project.
I cloned a fresh version of my [Docker Examples repository](https://github.com/opdavies/docker-examples) and started the Drupal example.
I ran `mkdir -p web/modules/custom/example/tests/src/Functional` to create the directory structure that I needed, and then `touch web/modules/custom/example/tests/src/Functional/ExampleTest.php` to create a new test file and populated it with some initial code:
```language-php
<?php
namespace Drupal\Tests\example\Functional;
use Drupal\Tests\BrowserTestBase;
use Symfony\Component\HttpFoundation\Response;
class ExampleTest extends BrowserTestBase {
protected $defaultTheme = 'stark';
}
```
For the simplest test, I decided to test some existing Drupal core functionality - that an anonymous user can view the front page:
```language-php
/** @test */
public function the_front_page_loads_for_anonymous_users() {
$this->drupalGet('<front>');
$this->assertSession()->statusCodeEquals(Response::HTTP_OK);
}
```
To execute the test, I ran `SIMPLETEST_DB=sqlite://localhost//dev/shm/test.sqlite SIMPLETEST_BASE_URL=http://web phpunit -c web/core web/modules/custom`. The environment variables could be added to a `phpunit.xml.dist` file but I decided to add them to the command and use Drupal core's PHPUnit configuration file.
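If I did want to move those variables into a configuration file, a minimal `phpunit.xml.dist` could look something like this (a sketch using the same values; the bootstrap path assumes Drupal core lives in `web/core`):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<phpunit bootstrap="web/core/tests/bootstrap.php" colors="true">
  <php>
    <env name="SIMPLETEST_DB" value="sqlite://localhost//dev/shm/test.sqlite"/>
    <env name="SIMPLETEST_BASE_URL" value="http://web"/>
  </php>
</phpunit>
```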
As this is existing functionality, the test passes. I can change either the path or the response code to ensure it also fails when expected.
With the first test working, it's easy to add more for other functionality, such as whether different users should be able to access administration pages:
```language-php
/** @test */
public function the_admin_page_is_not_accessible_to_anonymous_users() {
$this->drupalGet('admin');
$this->assertSession()->statusCodeEquals(Response::HTTP_FORBIDDEN);
}
/** @test */
public function the_admin_page_is_accessible_by_admin_users() {
$adminUser = $this->createUser([
'access administration pages',
]);
$this->drupalLogin($adminUser);
$this->drupalGet('admin');
$this->assertSession()->statusCodeEquals(Response::HTTP_OK);
}
```
Hopefully, this shows how quickly you can get tests running for a Drupal module. If you'd like to see more, the slides and video recording of my [Test-Driven Drupal talk]({{site.url}}/talks/tdd-test-driven-drupal) are online.

---
title: "Why I mostly write functional and integration tests"
pubDate: 2022-09-16
permalink: "archive/2022/09/16/why-mostly-write-functional-and-integration-tests"
tags: ["drupal"]
---
In [Wednesday's email]({{site.url}}/archive/2022/09/14/simpletest-drupal-test), I showed how quick it is to get started writing automated tests for a new Drupal module, starting with a functional test.
I prefer the outside-in style (or London approach) of test-driven development, where I start with the highest-level test that I can for a task. If the task needs me to make an HTTP request, then I'll use a functional test. If not, I'll use a kernel (or integration) test.
I find that these higher-level types of tests are easier and quicker to set up compared to starting with lower-level unit tests, cover more functionality, and make it easier to refactor.
## An example
For example, take this `Device` class, which is a data transfer object around Drupal's `NodeInterface`. It ensures that the correct type of node is provided, and includes a named constructor and a helper method to retrieve a device's asset ID from a field:
```language-php
final class Device {
private NodeInterface $node;
public function __construct(NodeInterface $node) {
if ($node->bundle() != 'device') {
throw new \InvalidArgumentException();
}
$this->node = $node;
}
public function getAssetId(): string {
return $this->node->get('field_asset_id')->getString();
}
public static function fromNode(NodeInterface $node): self {
return new self($node);
}
}
```
## Testing getting the asset ID using a unit test
As the `Node::create()` method (what I'd normally use to create a node) interacts with the database, I need to create a mock node to wrap with my DTO.
I need to specify what value is returned from the `bundle()` method as well as getting the asset ID field value.
I need to mock the `get()` method and specify the field name that I'm getting the value for, which also returns its own mock of `FieldItemListInterface` with a value set for the `getString()` method.
```language-php
/** @test */
public function should_return_an_asset_id(): void {
// Arrange.
$fieldItemList = $this->createMock(FieldItemListInterface::class);
$fieldItemList
->method('getString')
->willReturn('ABC');
$deviceNode = $this->createMock(NodeInterface::class);
$deviceNode
->method('bundle')
->willReturn('device');
$deviceNode
->method('get')
->with('field_asset_id')
->willReturn($fieldItemList);
// Act.
$device = Device::fromNode($deviceNode);
// Assert.
self::assertSame('ABC', $device->getAssetId());
}
```
This is quite a long 'arrange' section for this test, and it could be confusing for those new to automated testing.
If I was to refactor from using the `get()` and `getString()` methods to a different implementation, it's likely that the test would fail.
## Refactoring to a kernel test
This is how I could write the same test using a kernel (integration) test:
```language-php
/** @test */
public function should_return_an_asset_id(): void {
// Arrange.
$node = Node::create([
'field_asset_id' => 'ABC',
'type' => 'device'
]);
// Assert.
self::assertSame('ABC', Device::fromNode($node)->getAssetId());
}
```
I can create a real `Node` object, pass that to the `Device` DTO, and call the `getAssetId()` method.
As I can interact with the database, there's no need to create mocks or define return values.
The 'arrange' step is much smaller, and I think that this is easier to read and understand.
### Trade-offs
Even though the test is cleaner, there is other setup to do because there are no mocks - including having the required configuration available, enabling modules, installing schemas and configuration as part of the test, and having test-specific modules to store the needed configuration files.
Because of this, functional and kernel tests will take more time to run than unit tests, but an outside-in approach could be worth considering, depending on your project and team.

---
title: "Thoughts on automated code formatting"
pubDate: 2022-09-17
permalink: "archive/2022/09/17/thoughts-automated-code-formatting"
---
For a long time, I've been focused on writing code that complies with defined coding standards, either to pass an automated check from a tool like PHP Code Sniffer (PHPCS) or eslint, or a code review from a team member.
Complying with the standards though is something that I've done manually.
As well as automated tools for linting the code, there are tools like PHP Code Beautifier and Fixer, and Prettier for formatting the code based on the same standards, which I've started to use more recently.
These tools can be run on the command line, VS Code has a "Format on save" option, and I can do the same in Neovim using an auto-command that runs after writing a file if an LSP is attached. I typically use a key mapping for this though so I can run it when I need, rather than it running automatically every time a file is saved.
One of my concerns with automated code formatting is what to do when working with existing code that doesn't already follow the standards. If I need to make a change to a file, with automated formatting, the rest of the file can change due to formatting being applied when I save my change.
I recently introduced a PHPCS step to a CI pipeline for an existing project. I knew that it was going to fail initially, but I was able to see the list of errors. I ran the code formatter on each of the files to fix the errors, committed and pushed the changes, and watched the pipeline run successfully.
This meant that I had a commit reformatting all of the affected files, but it was good to combine these changes into one commit rather than having them separate, and not mixed with any other changes like a new feature or a bug fix.
Since doing this, it's been nice when working in this codebase to not have to worry about code style violations, and I can focus on writing the code that I need to, knowing that I can rely on the automated formatting to fix any issues before I commit them.

---
title: "Useful Git configuration"
pubDate: 2022-09-19
permalink: "archive/2022/09/19/useful-git-configuration"
tags: ["git"]
---
Here are some snippets from my Git configuration file.
These days, I use a much simpler workflow and configuration since doing more trunk-based development, but in general, I rebase instead of merging by default, and prefer fast-forward merges that don't create a merge commit.
`branch.autosetuprebase = always` and `pull.rebase = true` configure Git to always rebase instead of merge when pulling. This applies to all branches, though I might override it for `main` branches.
`pull.ff = only` and `merge.ff = only` prevent creating a merge commit, and will block a merge if it would create one. If I need to override this, I can use the `--no-ff` option on the command line.
I use `checkout.defaultRemote = origin` to ensure that the `origin` remote is used if I have multiple remotes configured, and `push.default = upstream` to set the default remote to push to.
`merge.autoStash` allows for running merges on a dirty worktree by automatically creating and re-applying a stash of the changes, and `fetch.prune` will automatically prune branches on fetch - keeping things tidy.
I also have and use a number of aliases.
Some, like `pl = pull` and `ps = push`, are shorter versions of existing commands, and some, like `aa = add --all` and `fixup = commit --fixup`, add arguments to existing commands.
I also have some, like `current-branch = rev-parse --abbrev-ref HEAD` and `worktrees = worktree list`, which add simple additional commands, and some, like `repush = !git pull --rebase && git push`, which execute shell commands to run more complex or combined commands.
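Put together, that part of my configuration looks something like this:

```ini
[branch]
	autosetuprebase = always
[pull]
	rebase = true
	ff = only
[merge]
	ff = only
	autoStash = true
[fetch]
	prune = true
[checkout]
	defaultRemote = origin
[push]
	default = upstream
[alias]
	pl = pull
	ps = push
	aa = add --all
	fixup = commit --fixup
	current-branch = rev-parse --abbrev-ref HEAD
	worktrees = worktree list
	repush = !git pull --rebase && git push
```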
This is a snapshot of my Git configuration. The [full version is on GitHub](https://github.com/opdavies/dotfiles/blob/7e935b12c09358adad480a566988b9cbfaf5999e/roles/git/files/.gitconfig).

---
title: "Why I like trunk-based development"
pubDate: 2022-09-20
permalink: "archive/2022/09/20/why-like-trunk-based-development"
tags: ["git"]
---
For the majority of my software development career, I've worked with version control in a very similar way.
There are one or two long-lived branches, usually a combination of `develop`, `master` or `main`, that contain the production version of the code. When starting work on a new feature or bug fix, a new branch is created where the changes are made in isolation, and the branch is submitted for review once complete. This is typically referred to as "Git Flow" or "GitHub Flow".
Whilst those changes are awaiting review, a new task is started and the process is repeated.
## Trunk-based development
Something that I've been practicing and advocating for lately is trunk-based development, where there's only one branch that everyone works on, and commits and pushes to instead of creating separate per-task branches.
Even on a client project where I was the only Developer, I was used to creating per-task branches, and I can recall trying to demo two features to a client when the application broke as I switched between branches.
The vast majority of the time, whether working individually or on a team, I've found that the per-task branches weren't needed and working on a single branch was easier and simpler.
There are still occasions when a temporary branch is needed, but in general, all changes are made to the single branch.
Trunk-based development ties in nicely with the continuous integration approach, where everyone commits and pushes their work at least once a day - ideally, multiple times a day. This eliminates long-running feature or bug fix branches that get out of sync with the main branch as well as conflicting with each other.
It seemed scary to begin with, having been used to per-task branches and asynchronous peer reviews via pull or merge requests, but trunk-based development has made things simpler and encourages other best practices, such as pair and mob programming, having a good CI pipeline to identify regressions, using feature flags to separate code deployments from feature releases, and integrating and deploying code frequently via continuous commits and pushes.

---
title: "Being a Drupal contribution mentor"
pubDate: 2022-09-21
permalink: "archive/2022/09/21/being-drupal-contribution-mentor"
tags: ["drupal"]
---
This week is DrupalCon Prague, and although I'm not at this event, I'd like to write about some of my experiences at DrupalCon - in particular, about being a contribution mentor.
## My first DrupalCon
The first DrupalCon that I attended was in 2013, also in Prague.
I was enjoying the session days when I stopped at the mentoring table to find out more about the contribution sprints that were happening on the Friday.
I didn't have any commits in Drupal core but had already worked on and released some of my own contributed modules, so I was familiar with the tools and the Drupal.org contribution workflow. In short, I was signed up to be a mentor during the sprints.
I remember being involved in the preparation too, sitting in a hotel lobby, identifying potential issues for new contributors to work on, alongside people who I'd previously interacted with in the issue queues on Drupal.org.
On the day, I helped new contributors get their local environments up and running, select issues to work on, and perform tasks like creating and re-rolling patch files and submitting them for review.
One of my highlights at the end of the day was the live commit, when a patch that a new contributor had worked on that day was committed to Drupal core live on stage!
Whenever I've attended DrupalCon events since, I've always volunteered to be a contribution mentor, as well as mentoring and organising sprints at other Drupal events.
## The Five Year Issue
One of the most memorable times mentoring was whilst working with a group of contributors at DrupalCon in May 2015.
Someone was working on a Drupal core issue that was very similar to [one that I'd looked at](https://www.drupal.org/project/drupal/issues/753898) a few years before.
We focused on the original issue that I'd commented on, reviewed, tested, and re-rolled the patch, fixed a failing test, and marked it as "reviewed and tested by the community".
A few days after the conference, and just over five years after my original comment, the patch was committed - giving my contributors their first commits to Drupal 8 core, and also [one of mine](https://git.drupalcode.org/project/drupal/-/commits/9.5.x?search=opdavies).

---
title: "Releasing a Drupal module template"
pubDate: 2022-09-22
permalink: "archive/2022/09/22/releasing-drupal-module-template"
tags: ["drupal"]
---
Today, I had the idea to create a reusable template for new Drupal modules, based on how I like to build modules and how I've shown others to do so in my Drupal testing workshop.
So I did, and released it for free [on my GitHub account](https://github.com/opdavies/drupal-module-template).
Like my Tailwind CSS starter theme on Drupal.org, it's not intended to be added as a module directly, but something that can be cloned and used as a base for people's own modules.
It includes an example route and Controller that load a basic page, and has a test to ensure that the page exists and loads correctly.
The Controller is defined as a service and uses autowiring to automatically inject its dependencies, the same as in my workshop example code.
It's the initial release so it's rough around the edges still. I'll use it tomorrow to create a new module and document the steps to add to the README as well as other pieces of documentation.
If you're creating a new Drupal module and try it out, start a discussion on the GitHub repository or [let me know on Twitter](https://twitter.com/opdavies). If you have questions, create a discussion or just reply to this email and I'll get back to you.

---
title: "ADRs and Technical Design Documents"
pubDate: 2022-09-23
permalink: "archive/2022/09/23/adrs-technical-design-documents"
tags: []
---
## Architectural Decision Records
Architectural Decision Records (ADRs) are documents to record software design choices. They could be saved in your code repository as plain-text or Markdown files, or stored in Confluence or a wiki - wherever your team stores its documentation.
They usually consist of these sections:
* Status - is it proposed, accepted, rejected, deprecated, superseded, etc.?
* Context - what is the issue that is causing the decision or change?
* Decision - what is the change that's being done or proposed?
* Consequences - what becomes easier or more difficult to do?
Any change that is architecturally significant should require an ADR to be written, after which it can be reviewed and potentially actioned.
These will remain in place to form a decision log, with specific ADRs being marked as superseded if a newer ADR replaces it.
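As a sketch, an ADR file based on those sections could look like this (the title and numbering are just placeholders):

```markdown
# ADR 1: (short title of the decision)

## Status

Accepted

## Context

What is the issue that is causing the decision or change?

## Decision

What is the change that's being done or proposed?

## Consequences

What becomes easier or more difficult to do?
```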
## Technical Design Documents
A similar type of document is the Technical Design Document (TDD), which I first saw on TheAltF4Stream. I like to think of these as lightweight ADRs.
The first heading is always "What problem are we trying to solve?", or sometimes just "The problem".
Similar to the Context heading in an ADR, this should include a short paragraph describing the issue.
Unlike ADRs, there are no other set headings but these are some suggested ones:
- What is the current process?
- What are any requirements?
- How do we solve this problem?
- Alternative approaches
I like that, after describing the problem, I can move straight into describing what's appropriate and relevant for the task and ignore sections that aren't needed.
When I started writing ADRs, they all had the 'Accepted' status, as I was either writing them for myself or within a pair or mob. As that status wasn't adding any value, I've removed it since switching to writing TDDs.
Whether you use ADRs, TDDs or another approach, it's very useful to have a log of all of your architectural design decisions - both to look back on in the future to remember why something was done in a certain way, and to review the problem before you start implementing a solution, evaluate the requirements and the potential solutions, and document which one was selected and why.
[Find out more about ADRs](https://adr.github.io) or [find out more about TDDs](https://altf4.wiki/t/how-do-i-write-a-tdd/21).

---
title: "Using a component library for front-end development"
pubDate: 2022-09-25
permalink: "archive/2022/09/25/using-component-library-for-front-end-development"
tags: []
---
On a current project, I've decided to use a component library as the first place to do front-end development.
I'm using [Fractal](https://fractal.build) as I can use Twig for templates. As Drupal also uses Twig templates, I have more reusability between the components in Fractal and Drupal compared to converting them from a different templating language like Handlebars or Nunjucks.
Rather than developing directly within the custom Drupal theme, I've been creating new components and pages initially within Fractal.
I have been able to create new components quickly and easily, with the views using Twig templates, and inject data into them using a context file - a YAML file for each component that contains data that is injected automatically into the view.
This meant that I've been able to develop new components from scratch without needing to set up content types or paragraphs within Drupal, validate and confirm my data model, and present the templates to the client for review in Fractal. If a change is needed, it's quick to do.
I've also moved my asset generation step into Fractal. No CSS or JavaScript is compiled within the Drupal theme; it is created within Fractal and copied over with the Twig templates.
In most cases, I've been able to copy the Twig templates into Drupal and replace the static context data with dynamic data from Drupal without needing to make any further changes.
In a couple of situations, I've needed to change my implementation slightly when moving a template into Drupal, so in this workflow, I've made the changes in Fractal and re-exported them to keep things in sync between the two systems.
In situations where there is existing markup and/or styles from the Drupal side, I've copied those into Fractal so that they match before adding the additional styling and any markup changes.
In general, I like the approach as it gives me more flexibility upfront to make changes before needing to configure Drupal. I can see how things could get out of sync between the two systems, but hopefully having the assets compiled in Fractal and needing to copy them into Drupal will keep things synced up.
I don't think that I'd use this approach for all projects, but for this one, where I'm working with multiple themes and will need to later add different variants of pages and components, it's worked well so far.

---
title: "Experimenting with the Nix package manager"
pubDate: 2022-09-26
permalink: "archive/2022/09/26/experimenting-with-the-nix-package-manager"
tags: ["nix"]
---
After seeing it on some recent live streams and YouTube videos, I've recently been trying out the Nix package manager and looking into how I might use it for my local environment setup - potentially replacing some of my current Ansible configuration.
Separate from the NixOS operating system, Nix is a cross-platform package manager, so instead of using `apt` on Ubuntu and `brew` on macOS, you could run Nix on both and install from the 80,000 packages listed on https://search.nixos.org/packages.
There is a community project called Home Manager which can be installed alongside Nix and which, similar to Stow or what I'm doing with Ansible, can manage your dotfiles - or even create them from your Home Manager configuration - and can manage plugins for other tools such as ZSH and tmux.
There's also a Nix feature called "Flakes" which allows you to separate configuration for different operating systems. I currently have a flake for Pop!\_OS which installs all of my packages and a minimal flake for my WSL2 environment, as some of the packages are installed in Windows instead of Linux.
I can see Ansible still being used for my post-setup tasks, such as cloning my initial projects, but I think the majority of my current Ansible setup, where I'm installing and configuring packages, could be moved to Nix.
I have a work-in-progress Nix-based version [in my dotfiles repository](https://github.com/opdavies/dotfiles/tree/7c3436c553f8b81f99031e6bcddf385d47b7e785) where you can also see [how I've configured Git with Home Manager](https://github.com/opdavies/dotfiles/blob/7c3436c553f8b81f99031e6bcddf385d47b7e785/home-manager/modules/git.nix).
I may install NixOS on an old laptop to test that out too.

---
title: "Mentoring with Drupal Career Online"
pubDate: 2022-09-27
permalink: "archive/2022/09/27/mentoring-with-drupal-career-online"
tags: ["drupal"]
---
Today, I met my new mentee from the Drupal Career Online program.
[As well as mentoring at events like DrupalCamps and DrupalCons]({{site.url}}/archive/2022/09/21/being-drupal-contribution-mentor), I enjoy mentoring and working with new Developers going through bootcamps and training programmes like Drupal Career Online, some who are experienced Developers who are learning a new skill, and some who are learning how to code and are taking their first steps into programming.
I've talked about [how I got started programming]({{site.url}}/archive/2022-08-28/how-started-programming), but as a self-taught Developer, it would have been great to have had a mentor to ask questions of, to help get me started, and to make sure that I was going down the right track and learning the correct things.
Maybe this is more applicable these days with more people learning and working from home since COVID-19?
Similar to helping mentees at a contribution sprint work towards their first commits to Drupal, it's great to be able to introduce new Developers to an open-source project and community such as Drupal, help develop their skills, and hopefully enable them to get the new job and career that they want.

---
title: "Mob programming at PHP South Wales"
pubDate: 2022-09-28
permalink: "archive/2022/09/28/mob-programming-php-south-wales"
tags: []
---
Tonight was our September meetup for the PHP South Wales user group, where I ran a hands-on session on mob programming.
I created [a small slide deck](https://speakerdeck.com/opdavies/an-introduction-to-mob-programming) before we started a mob session with the group.
We worked on the FizzBuzz kata in PHP, using Pest for our automated tests.
We followed the Driver and Navigator model, with one person responsible for typing and interpreting the instructions from the Navigators, and we switched roles every ten minutes.
You can [see the code that we wrote](https://github.com/opdavies/code-katas/blob/1da5dd5a79bc7ca083c0c4216fc3b4b0854f623d/php/tests/FizzBuzzTest.php) on my code katas GitHub repository.
It was a fun experience and nice to code with some people who I hadn't coded with before.
We did some code kata sessions during our online meetups which also seemed to go well, so coding nights on katas or personal or open-source projects might be something that we do more of in the future.

---
title: "Store Wars: different state management in Vue.js"
pubDate: 2022-09-30
permalink: "archive/2022/09/30/store-wars-vuejs"
tags: ["vue"]
---
I'm currently working on a Vue.js application that I started building in Vue 2 before starting to use the Composition API, and then moved it to Vue 3.
In the original version, I was using Vuex for state management within the application, and interacting with Vuex directly within my Vue components - calling `getters` and `dispatch` to retrieve and update data.
As part of moving to Vue 3, I wanted to evaluate any new options, like Pinia which is now the default state management library for Vue.
But because I was integrating with Vuex directly, switching to an alternative would mean changing code within my components.
## Defining a Store interface
This is a situation that often occurs in back-end development - where you may need to switch to a different type of database or a different payment provider in an eCommerce application.
In that situation, you need a generic interface that can be used by different implementations. Because they have consistent methods, one implementation can be replaced with another, or multiple can be added at the same time. This is called the Strategy design pattern, and it relates to the open-closed principle in SOLID.
This is what I did by adding a `Store` interface:
```typescript
export default interface Store {
actions: {
addRow(): void;
init(): void;
removeRow(index: Number): void;
};
state: {
isLoading: boolean;
selection: {
items: [];
};
};
}
```
Any store that I want to work with needs to have these defined actions and state values, so I can use them within my components knowing that they will always be available.
## Creating a native Vue store
This is one implementation of the `Store` interface, using just Vue's `reactive` function from the Composition API:
```typescript
let state = reactive({
isLoading: false,
selection: {
items: [],
},
});
let actions = {
addRow(): void {
state.selection.items.push({
// ...
});
},
init(): void {
state.isLoading = true;
// ...
},
removeRow(index: number): void {
state.selection.items.splice(index, 1);
},
};
const vueStore: Store = {
actions,
state: readonly(state),
};
export default vueStore;
```
If I needed to add a Pinia version or another library, I could create another implementation that complies with the same interface.
Each implementation is responsible for any specifics of that library - extracting that logic from the component code and making it more flexible and reusable.
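To show the swap in isolation, here's a self-contained sketch (the reduced interface and the in-memory factory function here are stand-ins, not the real application code):

```typescript
// The same Store idea as above, reduced for brevity.
interface Store {
  actions: { addRow(): void; removeRow(index: number): void };
  state: { selection: { items: object[] } };
}

// One possible implementation; a Pinia-backed one would satisfy the same interface.
function createInMemoryStore(): Store {
  const state = { selection: { items: [] as object[] } };
  return {
    actions: {
      addRow() { state.selection.items.push({}); },
      removeRow(index: number) { state.selection.items.splice(index, 1); },
    },
    state,
  };
}

// Component code only ever sees the Store type, so swapping is a one-line change.
const store: Store = createInMemoryStore();
store.actions.addRow();
store.actions.addRow();
store.actions.removeRow(0);
console.log(store.state.selection.items.length); // 1
```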

---
title: Why do code katas?
pubDate: 2022-10-01
permalink: daily/2022/10/01/code-katas
tags: []
---
## What are code katas?
Code katas are programming exercises which, like katas in martial arts, use practice and repetition to improve your skills.
Common katas are Fizzbuzz, the Bowling score calculator, and the Gilded Rose.
Each gives you the criteria of what the kata should do before it can be considered complete, along with any specific information, and some websites will also give you a suite of failing tests to make pass - though I prefer to write my own and follow a test-driven development approach.
Once you have completed the solution and the criteria are satisfied, the kata is complete.
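As a concrete example, here's a minimal TypeScript solution to the FizzBuzz kata (the function name and shape are my own choices, not part of any official kata definition):

```typescript
// FizzBuzz: multiples of 3 return "Fizz", multiples of 5 return "Buzz",
// multiples of both return "FizzBuzz", and anything else returns the
// number as a string.
function fizzBuzz(n: number): string {
  if (n % 15 === 0) return "FizzBuzz";
  if (n % 3 === 0) return "Fizz";
  if (n % 5 === 0) return "Buzz";

  return String(n);
}
```

In a test-driven approach, each of those branches would start life as a failing test before the code to satisfy it is written.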
## Why I do code katas
As I said, doing code katas improves your skills by solving problems and identifying patterns that you may see when working in your project code.
Different katas focus on different patterns. For example, the Fibonacci Number kata focuses on recursion, whereas the Gilded Rose kata is all about refactoring complex legacy code.
Doing code katas keeps your skills sharp and gives you different perspectives as you work through different katas. You can then use and apply these within your main projects.
If you want to learn a new programming language then working on a kata that you've already solved in a language that you're familiar with allows you to focus on the syntax and features of the new language. I've been working on some code katas in TypeScript as I've been working with that recently, and would like to do some in Go.
If you work as part of a team or a meetup group, code katas can be worked on as a group and can introduce new skills like automated testing and test-driven development, as well as providing some opportunities for team-building and socialising. If you're trying to introduce pair or mob programming, then working on code katas could be a good first step.
If you're just getting started with programming, working on code katas will help you learn the fundamentals and problem solving, but I'd also encourage you to put the code on GitHub and blog about each kata that you complete. Doing so will help and encourage others and also look good when applying for roles.
P.S. There are lists of code katas at https://github.com/gamontal/awesome-katas and https://codingdojo.org/kata, and online versions at https://www.codewars.com/join and https://exercism.org/tracks. There are many others - if you have a favourite, reply to this email and let me know.
I have [some GitHub repositories for my code kata solutions](https://github.com/opdavies?tab=repositories&q=katas) and will continue to build these as I do more.

---
title: Minimum viable CI pipelines
pubDate: 2022-10-02
permalink: daily/2022/10/02/minimum-viable-pipelines
tags: []
---
When I start a new project, and sometimes when I join an existing project, there are no CI (continuous integration) pipelines, no automated tests, and sometimes no local environment configuration.
In that case, where should you start when adding a CI pipeline?
I like to start with the simplest solution to get a passing build and to prove the concept - even if it's a "Hello, world" message. I know that the pipeline is configured correctly and runs when expected, and gives the output that I expect.
I like to use Docker for my development environments, partly because it's very easy to reuse the same setup within a CI pipeline just by running `docker image build` or `docker compose build`.
Having a task that ensures the project builds correctly is a great next step.
Within a Dockerfile, I run commands to validate my lock files, download and install dependencies from public and private repositories, and often apply patch files to third-party code. If a lock file is no longer in sync with its composer.json or package.json file, or a patch no longer applies, this would cause Docker and the CI pipeline to fail and the error can be caught and fixed within the pipeline.
Next, I'd look to run the automated tests. If there aren't any tests, I'd create an example test that will pass to prove the concept, and expect to see the number of tests grow as new features are added and as bugs are fixed.
The big reason to have automated tests running in a pipeline is that all the tests are run every time, ensuring that the test suite is always passing and preventing regressions across the codebase. If any test fails, the pipeline fails. This is known as continuous delivery - ensuring that code is always in a releasable state.
From there, I'd look to add additional tasks such as static analysis and code linting, as well as anything else to validate, build or deploy the code and grow confidence that a passing CI pipeline means that the code is releasable.
As more tasks are added to the pipeline, and the more of the codebase the tasks cover (e.g. test coverage), the more it can be relied upon.
If there is a failure that wasn't caught in the CI pipeline, then the pipeline itself should be iterated on and improved.
Having a CI pipeline allows you to identify issues sooner and fix them quicker, encourages best practices like automated testing and test-driven development, and enables continuous deployment where code is automatically deployed after a passing build.
If you have a project without a CI pipeline, I'd encourage you to add one, to start small, and continuously iterate on it over time - adding tasks that are useful and valuable, and that build confidence that you can safely release when you need to.

---
title: Refactoring to value objects
pubDate: 2022-10-03
permalink: daily/2022/10/03/refactoring-value-objects
tags: [php]
---
Here's a snippet of some Drupal code that I wrote last week. It's responsible for converting an array of nodes into a Collection of one of its field values.
```language-php
return Collection::make($stationNodes)
->map(fn (NodeInterface $station): string => $station->get('field_station_code')->getString())
->values();
```
There are two issues with this code.
First, whilst I'm implicitly saying that it accepts a certain type of node, the `NodeInterface` typehint means that it could accept any type of node. If that node doesn't have the required field, the code will error - but I'd like to know sooner if an incorrect type of node is passed, and to make it explicit that only a certain type of node can be used.
Second, the code for getting the field values is quite verbose and is potentially repeated in other places within the codebase. I'd like to have a simple way to access these field values that I can reuse anywhere else. If the logic for getting these particular field values changes, then I'd only need to change it in one place.
## Introducing a value object
This is the value object that I created.
It accepts the original node but checks to ensure that the node is the correct type. If not, an Exception is thrown.
I've added a helper method to get the field value, encapsulating that logic in a reusable function whilst making the code easier to read and its intent clearer.
```language-php
namespace Drupal\mymodule\ValueObject;
use Drupal\node\NodeInterface;
final class Station implements StationInterface {
private NodeInterface $node;
private function __construct(NodeInterface $node) {
if ($node->bundle() != 'station') {
throw new \InvalidArgumentException();
}
$this->node = $node;
}
public function getStationCode(): string {
return $this->node->get('field_station_code')->getString();
}
public static function fromNode(NodeInterface $node): self {
return new self($node);
}
}
```
## Refactoring to use the value object
This is what my code now looks like:
```language-php
return Collection::make($stationNodes)
->map(fn (NodeInterface $node): StationInterface => Station::fromNode($node))
->map(fn (StationInterface $station): string => $station->getStationCode())
->values();
```
I've added an additional `map` to convert the nodes to the value object, but the second map can now use the new typehint - ensuring better type safety and also giving us auto-completion in IDEs and text editors. If an incorrect node type is passed in, then the Exception will be thrown and a much clearer error message will be shown.
Finally, I can use the helper method to get the field value, encapsulating the logic within the value object and making its intent clearer and easier to read.

---
title: First impressions of Astro
pubDate: 2022-10-08
permalink: daily/2022/10/08/first-impressions-astro
tags: [astro]
---
This week I attended another of Simon Vrachliotis' Pro Tailwind workshops.
The workshop again was great, teaching us about multi-style Tailwind components, such as a button that has props for variants like size, shape and impact, and how to create them in a flexible and maintainable way as well as making use of Headless UI.
For this workshop though, the examples and challenges used a tool that I wasn't familiar with - the Astro web framework.
I've seen a lot of blog posts and streams mentioning it but I hadn't tried it out for myself until the workshop.
What I find interesting is that it comes with a number of available integrations - from Tailwind CSS, to Vue, React, and Alpine.js - and you can use them all within the same project, or even on the same page. Installing an integration is as simple as `yarn astro add tailwindcss`.
The templates feel familiar and make use of front matter within Astro components, and regular YAML front matter works within Markdown files - which are supported out of the box.
I've been thinking of redoing my personal website and evaluating options, but I think that Astro might be a new one to add to the list.

---
title: Coding defensively, and Implicit vs explicit coding
pubDate: 2022-10-09
permalink: daily/2022/10/09/coding-defensively-implicit-explicit
tags: [tailwindcss, php]
---
As well as [being introduced to Astro](https://www.oliverdavies.uk/archive/2022/10/08/first-impressions-astro) in Simon's most recent Pro Tailwind workshop, something else that we discussed was implicit vs explicit coding, and coding defensively.
For example, if you had this code:
```javascript
const sizeClasses = {
small: 'px-3 py-1 text-sm',
medium: 'px-5 py-2',
large: 'px-7 py-2.5 text-lg',
}
const shapeClasses = {
square: '',
rounded: 'rounded',
pill: 'rounded-full',
}
```
Both the `medium` size and `square` shape have an implicit value.
The `small` size has a text size class of `text-sm` and the `large` size has `text-lg`. As there isn't a text size added for `medium`, it is implicitly `text-base` - the default text size.
Likewise, the `rounded` shape has a class of `rounded` and the `pill` shape has `rounded-full`. As a square button doesn't have any rounding, it has an empty string but it is implicitly `rounded-none` - the default border radius value.
If we were to code this explicitly, `text-base` and `rounded-none` would be added to their respective size and shape classes.
It's mostly personal preference, but explicitly adding the additional classes could future-proof the components if there was a situation where the text size or border radius was being overridden.
It also makes it more obvious to anyone reading the code that these values are being set, rather than them needing to make that assumption - assuming that they're aware of the default values at all.
It's similar to having this example PHP code:
```language-php
function __invoke(string $type, int $limit): void {}
```
Whilst I'm using type hints for the parameters to ensure that the values are a string and an integer respectively, it's also safe to assume that the type shouldn't be an empty string, so do we check for that?
I'd also suggest that the limit shouldn't be a negative integer, so we'd want to check that the value is not less than zero - or, if zero isn't being used as an "all" value, that the limit is at least one.
In this case, the type hints add some explicitness to the parameters, but checking for these additional conditions adds another defensive layer to the code - forcing it to return earlier with an explicit error message rather than causing a vaguer error elsewhere in the application.
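Those defensive checks could be sketched like this - shown here in TypeScript for brevity, with an illustrative function name rather than the `__invoke` method above:

```typescript
// Fail fast with explicit errors, rather than letting bad values cause
// a vaguer failure somewhere deeper in the application.
function fetchItems(type: string, limit: number): void {
  if (type.trim() === "") {
    throw new Error("type must not be an empty string");
  }

  if (!Number.isInteger(limit) || limit < 1) {
    throw new Error("limit must be a positive integer");
  }

  // ... perform the query ...
}
```

Each guard clause turns an implicit assumption into an explicit, early error with a clear message.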
Personally, I like to be explicit and code defensively, making sure that I try and cover as many edge cases as possible and writing test cases for them.
Coming back to the Tailwind example, the majority of us decided to add the extra classes after the exercise, and it made for an interesting discussion and part of the workshop.

---
title: Contributing to open-source software, one small change at a time
pubDate: 2022-10-10
permalink: daily/2022/10/10/contributing-open-source-software-one-small-change-time
tags: [open-source]
---
Since looking more into Astro, I was looking through the GitHub repository - specifically within the examples - and spotted a typo within the title of one of the examples.
Rather than leaving it, I decided to follow the "boy-scout rule" and submit a fix and leave the code in a better state than I found it.
The Astro repository is hosted on GitHub so I was able to fork the repository, fix the typo and create a pull request with a few clicks in the GitHub UI.
Contributing to open-source software - particularly if you're new to it - doesn't mean that you need to always add large and complex changes. Small changes such as fixing a typo, updating documentation, fixing a small bug, or adding additional tests are all valid contributions that improve open-source projects.

---
title: Not long until Drupal 10
pubDate: 2022-10-11
permalink: daily/2022/10/11/not-long-until-drupal-10
tags: [drupal, php]
---
I was surprised to see this week that it's only two months until Drupal 10 is released (14th December 2022).
I'm starting a new Drupal development project in December so will be looking to get that on Drupal 10 as soon as possible.
From the client's perspective, getting their new project on Drupal 10 and them not needing to upgrade from D9 in the future is a big plus, even if the code differences between D9 and D10 are not that big - similar to Drupal 8 and 9.
As a module maintainer, it's been great to again see issues being created with automated Drupal 10 compatibility patches - thanks to Rector.
It's great to see these regular updates and new versions of Drupal, but also for the PHP language, with PHP 7 being end-of-life next month.
It's a big difference compared to the long-term releases that we had for Drupal 6 and 7, and PHP 5, but one that I definitely prefer.

---
title: Overcoming deployment anxiety
pubDate: 2022-10-12
permalink: 'archive/2022/10/12/overcoming-deployment-anxiety'
---
As a Developer with 15 years of experience, I still sometimes get "deployment anxiety" - when I've backed up the database and tagged a release, but even though the CI pipelines are passing and the staging site is working, I'm holding off on pushing the latest code to be released to production - trying to think of any potential issues that could arise from this deployment and avoid any downtime.
When I thought about this further, the releases that I've felt anxious or nervous about have usually been in one or both of the following categories:
* The release includes a lot of changes, and maybe a combination of different types of changes such as framework or CMS updates, bug fixes, or new features.
* It's been a long time, maybe weeks or months, since the last production release.
The best way to resolve both of these issues, I think, is to break down the large releases into smaller ones, and to deploy them more frequently.
In the opposite scenario, the releases where the changes are small and it's been a short time since the previous release - ideally minutes or hours - have been the ones where I've been the least nervous.
If a single commit is being released, then I can be confident that if there is a failure, I can either revert it and put things back the way they were or quickly identify the issue and push a fix. This isn't the case for large changes as the potential source of the failure is larger and it will take longer to find and fix.
If a bug fix or a feature needs to be reverted, I'm happy knowing that I can do that easily without also reverting the CMS update that was deployed separately - rather than them all being released together.
There are other advantages too - clients or product owners are generally happier if the new feature or fix that they requested is on production within hours or days rather than weeks or months, and having your latest code deployed to production rather than on a staging branch makes it a lot easier if you need to deploy an urgent fix or security update.
If you're familiar with the DevOps Research and Assessment (DORA) team, three of their key metrics are deployment frequency, lead time for changes, and time to restore service. All of these are improved by small and frequent releases.
In my [Deployments with Ansible and Ansistrano talk](https://www.oliverdavies.uk/talks/deploying-php-ansible-ansistrano), I mention that there is a separate rollback role, but I don't think that I've ever used it.
Because I'm deploying small changes often, it's usually much easier to fix forward than it is to rollback, and knowing this makes me a lot less anxious when deploying changes.

---
title: 14 years on Drupal.org and working with PHP and Drupal
pubDate: 2022-10-17
permalink: 'archive/2022/10/17/14-years-drupalorg'
---
Today I saw that my Drupal.org profile is showing that I've been on that website for 14 years.
![A screenshot of my Drupal.org profile showing "On Drupal.org for 14 years 1 hour".]({{site.assets.url}}/assets/images/14-drupalorg.jpg)
Drupal.org is the online home of the open-source Drupal CMS project, and where I registered to ask questions on the forums as I started to learn Drupal. More recently, it's been where I've uploaded and maintained my own contributed projects and contributed patches to others, including Drupal core.
I even spent time working for the Drupal Association on Drupal.org itself.
I've talked about T-shaped Developers in a previous email, and whilst I've added complementary skills to my toolkit over the years, Drupal has been my main specialism and what I focused on when I started freelancing and later switched careers into software development.
With Drupal 10 just around the corner, I'm looking forward to seeing how Drupal continues to evolve and develop.

---
title: Pair and mob programming
pubDate: 2022-10-18
permalink: 'archive/2022/10/18/pair-mob-programming'
---
As well as my recent session at PHP South Wales, I've also been involved with a lot more mob programming recently with members of my team.
We recently added a new feature to our codebase that we completed over a couple of mob sessions - starting by describing the problem and some potential solutions within a [technical design document](https://www.oliverdavies.uk/archive/2022/09/23/adrs-technical-design-documents) before moving on to the implementation.
I was already familiar with the existing code that we needed to extend, so had some ideas of how to approach parts of the solution which we discussed - but there were other parts that I hadn't thought of.
What was very interesting was that an approach was suggested that I probably wouldn't have thought of myself, which became part of the final solution. This is an advantage of pair programming and is multiplied when working in groups - you get to include everyone's thoughts, experience and perspective, and collectively decide on the best approach to take in real-time.
As a side effect, we had continuous code review from members of the group, and if we need to work on this code again in the future, everyone will already be familiar with it.
As it was already reviewed, we didn't need to wait before pushing the feature to production, so it was delivered quickly and provided value by fixing an issue that someone was experiencing.
We're already working on the next feature as a group, and if you haven't tried pair or mob programming before, I'd recommend that you give it a try.

---
title: run file vs task runners
pubDate: 2022-10-19
permalink: daily/2022/10/19/run-vs-task-runners
# tags:
# -
---
[I've written a few earlier emails](https://www.oliverdavies.uk/archive/2022/08/15/using-run-file-simplify-project-tasks) about `run` files - a simple bash file that I add to my projects to simplify or combine common commands that I need to run often.
Recently, I've looked at a couple of alternatives to see how they compare.
One is very YAML-based, where all commands are written within a YAML file, and one is very Makefile-like and fixes some of the confusion and issues that I've had with Makefiles in the past - such as passing arguments to commands and dealing with `.PHONY`.
Whilst I like both of these approaches and that they offer small additional features like auto-completion of task names, after using one of them in a project for a short while, I think that I'm going to stick with the `run` file.
The main reason for this is that I like the simplicity of the `run` file, and that it's just a Bash file that contains functions.
There were a couple of things that I couldn't quite get to work in one of the other tools, such as setting the TTY value in a Docker command - which is something that I was able to do with Bash within the `run` file. The fact that I can write regular Bash and reuse existing knowledge is a big plus, rather than having to learn another syntax or DSL for another tool.
Another big reason is that Bash is already installed everywhere. There's no additional tool for Developers to download and install, which keeps the barrier to entry low, and there are no additional dependencies to add to my CI pipeline for it to work.
I was able to use one of these other tools in GitHub Actions as someone had already written a workflow for it, and although I could possibly install it via a package manager, just being able to run a bash file in any CI tool was probably the deciding factor to stick with `run` files.

---
title: >
Cherry picking commits is an anti-pattern
pubDate: 2022-10-20
permalink: >-
archive/2022/10/20/cherry-picking-commits-is-an-anti-pattern
tags:
- git
---
`git cherry-pick` is a command that allows you to re-apply changes from existing commits - typically moving commits from one branch to another. Whilst it's good for some use-cases, I believe that it's generally an anti-pattern.
As I mostly do trunk-based development, and so only have a single branch, it's not a command that I'd run often - but I have seen it used in a Git Flow-type scenario where there are multiple long-lived branches and various other short-lived ones. Commits can be cherry-picked from a long-term branch like `develop` onto a feature branch rather than merging or rebasing, or to re-apply a hotfix from the `main` branch.
The main issue that I've seen with `cherry-pick` is where a number of changes have been merged into a branch which is being used for user acceptance testing by a client or product owner. They decide to approve some of the changes but not all, and the approved commits are cherry-picked onto a production branch and deployed.
In my opinion, this is very risky as there's no guarantee that the cherry-picked changes will work without the others, and as the artifact that's pushed to production is different to what was tested against, it arguably affects the value of doing the testing at all. Ideally, once the release has been tested and approved, the same artifact will be deployed - ensuring consistency and reducing the risk of any errors.
Potentially, the cherry-picked changes could be moved onto a release branch and tested again together without the other changes, but this would increase the testing overhead and the time for the changes to reach production.
A good automated test suite would help, ensuring that the tests still pass once the cherry picking is done.
In this situation, I'd rather use feature flags (also known as "feature toggles"). This would mean that the code between the two environments would be the same, and allow for functionality to be enabled or disabled as needed. If a feature wasn't selected to be released, then its feature flag would be disabled until it's approved.
A feature flag would also allow a feature to be switched off if it was causing errors, without the need for a code deployment. If a fix did need a code change, and you're following continuous integration and delivery, it would be easy to apply and push.
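The mechanism itself can be very simple. Here's a rough sketch - the flag store and flag names are hypothetical, and in a real application the flags would come from configuration or a database so they can be changed without a deployment:

```typescript
// A hypothetical in-memory feature flag store.
const featureFlags = new Map<string, boolean>([
  ["new-checkout", false],
  ["alert-banner", true],
]);

// Unknown flags default to off.
function isEnabled(flag: string): boolean {
  return featureFlags.get(flag) ?? false;
}

// Unreleased code ships to production but stays dormant until its flag
// is switched on - no cherry-picking required.
function renderCheckout(): string {
  return isEnabled("new-checkout") ? "new checkout" : "old checkout";
}
```

The same artifact is deployed everywhere; only the flag values differ between what's live and what's dormant.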
These are the use-cases that I can think of or have seen for `git cherry-pick`. If you know of any others or use `cherry-pick` in your workflow in another way, reply to the email and let me know.

---
title: >
Automated testing and test-driven development are not the same
pubDate: 2022-10-21
permalink: >-
archive/2022/10/21/automated-testing-and-test-driven-development-are-not-the-same
tags:
- testing
---
Automated testing is where you write tests to ensure that your code works as expected, which can be re-run as needed and executed automatically without user input.
Test-driven development (TDD) is when you write the tests before the production code. You see the tests fail and write code until they pass, and then repeat the process. However, TDD is not just about the tests - it's about the design of the code.
By writing the tests first, you guarantee that the code that you write will be testable, which isn't something that you can guarantee if the production code is written first.
You may need to refactor your initial working implementation before it can be tested - which means that you could also break it during that time and need to spend time debugging and fixing any regressions. Ideally, you want the tests in place first before any refactoring, so if something breaks, you'll know because of the failing test.
TDD keeps your code cleaner and simpler, as you only write enough code to make the tests pass. Once a test is passing, you stop writing, so you'll end up with less code and it's easy to know when to stop.
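For example, the first red-green cycle for a hypothetical string calculator kata might look like this in TypeScript (the kata, function name, and assertions are all illustrative):

```typescript
// Step 1 (red): the test is written first and fails, because add()
// doesn't exist yet:
//
//   if (add("") !== 0) throw new Error("empty string should return 0");

// Step 2 (green): write just enough code to make the current tests pass.
function add(numbers: string): number {
  if (numbers === "") {
    return 0;
  }

  // Only as much as the tests demand so far - handling comma-separated
  // values would be driven out by the next failing test.
  return Number(numbers);
}
```

When the test passes, you stop, write the next failing test, and repeat.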
If you don't write the tests first, you may be tempted to skip writing them completely, leaving untested code or adding `TODO: add test` comments that may never get reviewed.
Also, where's the fun in writing tests for code that you've already written, that you know are going to pass?

---
title: >
Looking at LocalGov Drupal
pubDate: 2022-10-24
permalink: >-
archive/2022/10/24/looking-at-localgov-drupal
tags:
- drupal
---
Today, I've been looking at [LocalGov](https://localgovdrupal.org) - a Drupal distribution for building council websites, with a focus on code reuse and collaboration.
After a few small changes, I was able to get it running based on my [Docker Examples](https://github.com/opdavies/docker-examples) repository.
As someone who has worked with one of the Councils who are now using the platform, and was involved in early similar discussions around code reuse and collaboration between Councils, this has been something that I've been keen to try for a while.
I was able to get a basic site running after a fresh installation, and was interested to explore how some of the functionality was built. I've recently been looking at implementing similar functionality to LocalGov's alert banners on a project, and will be able to gain some inspiration from it or look into whether the LocalGov version could be used.
I was happy to find some initial ways to contribute back. I had an error during the installation which I was able to fix and assist with in the [LocalGov issue queue on Drupal.org](https://www.drupal.org/project/localgov/issues/3307516#comment-14759989) by answering a support request, and after spotting a potential issue within the alert banner styling, [submitted a pull request with a fix](https://github.com/localgovdrupal/localgov_alert_banner/pull/225).
I like what the project is doing and agree with its goals, so hopefully I'll get an opportunity to use and contribute more in the future.

---
title: >
What are Drupal distributions?
pubDate: 2022-10-25
permalink: >-
archive/2022/10/25/what-are-drupal-distributions
tags:
- drupal
---
Yesterday's email was about the LocalGov Drupal distribution that I've been looking at, but I glossed over what a Drupal distribution is.
It's an interesting topic for me, having [written an article for Linux Journal](https://www.linuxjournal.com/content/speed-your-drupal-development-using-installations-and-distributions) about it in 2012.
Distributions are pre-configured versions of Drupal that include additional modules, themes, or configuration than you'd get if you installed a standard version of Drupal core.
By default, LocalGov includes content types for service pages, service landing pages and sub-pages, additional menus and taxonomies, a different administration theme, a base theme to use for custom themes, and multiple additional modules that add alert banners, events, content reviews, search, media types, and sub-sites. This is all in addition to what Drupal core itself provides, and it can be extended further with additional contrib or custom modules.
Commerce Kickstart was a distribution for Drupal 7 that added eCommerce functionality such as product and order types, shipping and payment methods, stock levels and discounts. Again, this could be extended further by adding more contrib or custom modules.
A few months ago, I started developing a distribution for managing meetup group websites, like PHP South Wales.
If you're starting a new Drupal website, there could be a distribution that exists that could provide some or all of the functionality that you need, and if new features or fixes are added, then they benefit everyone who uses it.
There are 1,430 distributions listed on https://www.drupal.org/project/project_distribution so take a look there and see if anything matches your needs.

---
title: >
Neovim as a Personalised Development Environment
pubDate: 2022-10-26
permalink: >-
archive/2022/10/26/neovim-as-a-personalised-development-environment
tags:
- neovim
---
A few months ago, TJ DeVries (a Neovim core team member) coined the phrase "Personalised Development Environment" or PDE.
[I've been using Neovim full-time](https://www.oliverdavies.uk/blog/going-full-vim) since July 2021 - starting with no configuration to configuring it with Vimscript and later with Lua - setting options like line numbers and relative numbers, tabs and spaces, and indent and fold levels.
I evaluated and installed some initial plugins to add functionality that I needed. Some of them I still use, and some I've replaced with alternative plugins or built-in solutions that have been included in newer versions of Neovim.
I added my own keymaps that, in my opinion, improved on the default keymaps, created new ones that made sense to me, or configured a plugin that I'd added.
Recently, I found and added plugins that add an [HTTP client](https://github.com/rest-nvim/rest.nvim) and a [database connection manager](https://github.com/kristijanhusak/vim-dadbod-ui) to Neovim - two pieces of functionality that I'd previously used in other IDEs or separate applications.
I also [wrote my own Neovim plugin](https://github.com/opdavies/toggle-checkbox.nvim) for toggling checkboxes within Markdown lists.
Like Drupal and other open-source solutions that I use, I love being able to add or edit functionality as needed.
In the last year or so, I've definitely been able to personalise my Neovim setup to meet my needs, and have it work as a fully-fledged solution for PHP and JavaScript development, DevOps work, and technical writing (including this email).

View file

@ -0,0 +1,17 @@
---
title: >
Getting back into live streaming
pubDate: 2022-10-27
permalink: >-
archive/2022/10/27/getting-back-into-live-streaming
---
Surprisingly, it's been two and a half years since I last did a live coding stream.
As well as talk recordings and demos, I did a few live streams back then, working on the "Test-Driven Drupal" project and submitting a merge request to Drupal core.
It's been something that I'd like to get back into and pick up again, and I plan on doing that within the next few weeks.
I have a new freelance project due to start in December but getting back into streaming seems like a good way to make sure that I put aside time for open-source and side projects, as well as for writing longer form blog posts and hopefully starting to prepare more meetup and conference talks again.
I'll be streaming again on my YouTube channel, so if you'd like to be notified when I do, [please subscribe](https://www.youtube.com/channel/UCkeK0qF9HHUPQH_fvn4ghqQ?sub_confirmation=1).

View file

@ -0,0 +1,25 @@
---
title: >
Why write framework agnostic packages?
pubDate: 2022-10-28
permalink: >-
archive/2022/10/28/why-write-framework-agnostic-packages
tags:
- php
---
A couple of years ago, I wrote an integration for a client's Drupal Commerce website with an online eBook service as they wanted to sell eBook variations of their products.
They provided me with some example code for different PHP frameworks. Each example was separate and tightly coupled to its framework, so no code was shared between them. Because of this, and because there was no Drupal Commerce example, I wrote my own version.
However, I decided to make my version as reusable and loosely-coupled as possible. This meant that I'd be able to potentially reuse it for other clients and the same code could be used for different implementations.
Reusable code, such as the configuration, the different types of Requests, and value objects for Customers, Orders and OrderItems, was written within a separate, reusable PHP library. It contains its own tests, its own CI pipeline, and its own static analysis - ensuring that things work as expected.
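As an illustrative sketch (the names here are invented, not the client's actual code), a value object in the framework-agnostic library could look like this:

```language-php
<?php

declare(strict_types=1);

// Hypothetical example: an immutable order item value object that lives in
// the reusable library, with no dependency on Drupal or any framework.
final class OrderItem
{
    public function __construct(
        private string $sku,
        private int $quantity,
        private int $unitPriceInPence,
    ) {
    }

    public function sku(): string
    {
        return $this->sku;
    }

    public function totalInPence(): int
    {
        return $this->quantity * $this->unitPriceInPence;
    }
}

$item = new OrderItem('EBOOK-123', 2, 499);
```

Because the class knows nothing about Drupal Commerce, the same object can be constructed from a Drupal order, a test, or any other framework's data.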
With this code separated, the Drupal module was much smaller and responsible only for bridging the library's code with Drupal Commerce and adding any other Drupal-specific code.
The client is currently considering an upgrade from Drupal 7 to Drupal 9, which would also mean upgrading this module. But, with this approach, there's a lot less to upgrade. The library code can still be used, and I can focus on any Drupal-specific changes within the Drupal module.
I recently had an enquiry from someone who needs an integration with the same service. Whilst their requirements may be different, I could still reuse the library code and write any client-specific code within a custom module.
Finally, if I wanted to reuse this code within a different PHP eCommerce framework, I could do so by installing the library with Composer. I'd get the same code without needing to copy it manually, keeping a single source that can be maintained separately upstream. As I'm already familiar with the code, I could focus only on how to integrate the library with that framework - again meaning less framework-specific code and a much lower development effort.

View file

@ -0,0 +1,25 @@
---
title: >
The open-source-first development workflow
pubDate: 2022-10-29
permalink: >-
archive/2022/10/29/the-open-source-first-development-workflow
tags:
- open-source
---
Yesterday's email talked about [writing reusable, framework-agnostic packages](https://www.oliverdavies.uk/archive/2022/10/28/why-write-framework-agnostic-packages) but didn't mention where those packages could be located.
They could be kept within a private repository and still have the same benefits, such as re-usability for internal projects, but I like to open-source code as often as I can and make it available publicly to see and use.
My preference is to follow an open-source-first workflow: identify which parts of a solution can be open-sourced and create them as open-source libraries or modules from the beginning, rather than planning to extract them later. They can then be included within the main project using a dependency manager tool like Composer, npm or Yarn.
The eBook integration project that I mentioned was an example of this. I identified which pieces could be open-sourced, set up a public repository and put together an MVP based on that project's requirements. Issues were created for nice-to-have additions that could be added later, and the working version was installed with Composer.
There was no need to extract the code from the main project, no need to "clean it up" or check that it contained no client information, and I had the full Git history for the project - not just a new history from the point when the code was extracted and open-sourced.
I've worked on projects that contained a number of potential open-source components that would be released after project completion, but this didn't always happen - I assume due to time pressures to move on to the next project, a focus on adding new features or avoiding the risk of introducing breakages into the code. If the code had been open-sourced from the beginning, these things wouldn't have been an issue.
I've also worked on projects where I've followed an open-source-first approach and released a number of PHP libraries and Drupal modules, including the [Private Message Queue](https://www.drupal.org/project/private_message_queue), [System User](https://www.drupal.org/project/system_user), and [Null User](https://www.drupal.org/project/null_user) modules. I've also been working on some legacy code recently and have started to replace it with a library that I've already open-sourced, even though I'm in the early stages of developing it.
As someone who enjoys creating and working on open-source software, I would encourage you to open-source your code if you can and to do so sooner rather than later and not wait until the end of your project.

View file

@ -0,0 +1,45 @@
---
title: >
Refactoring one large test into multiple smaller tests
pubDate: 2022-10-30
permalink: >-
archive/2022/10/30/refactoring-one-large-test-into-multiple-smaller-tests
tags:
- php
- phpunit
---
Today I spent some time refactoring a large test within a client project, splitting it into several smaller tests. The commit removed 169 lines but added 233 lines.
So, why did I do this?
This test is responsible for testing the creation of products and product variants within Drupal Commerce from a custom CSV file and originally had a very generic name - "Should create a product and product variations from an array of data".
But it did much more than that:
1. It asserted that there are no initial existing products or product variations.
1. It ran a product import using some stub data.
1. It asserted that there are two products, each with two variations.
1. It asserted that each product has the correct title.
1. It asserted that each product variation has the correct title.
1. It asserted that each variation has the correct SKU.
1. It asserted that each variation has the correct price.
1. It asserted that each variation has the correct value for 10 product attributes.
All of this was hidden within a single test.
Whilst the original name was fine as a starting point (I usually begin with a vague name whilst I'm spiking the first test, until it's clearer to me what it's testing and what the correct name is), what I actually want this test to do is check that the correct number of products and variations are created.
This refactoring task was to split the remaining assertions into their own named tests, after which I had six different tests.
This means that each piece of functionality and its related assertions are now contained within their own named test. I can read the test file and see the expected functionality from the test names, rather than everything being grouped and hidden within a single, vaguely-named test.
If an assertion fails, I can easily see in which test the failure occurred.
Each test is very simple and only a few lines long - it runs the product import, loads the created variation, and runs the appropriate assertions.
It'll be much easier to add new functionality to the importer by adding a new separate test rather than continuously adding to the large original one.
Even though there are more lines in the file after the refactoring, most of those are just because of adding the additional test functions. There are only 72 lines of actual test methods, and the reusable steps, such as running the product import as well as custom assertions, are defined as private methods to avoid duplication.
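As a simplified, framework-free sketch of the idea (the importer and data here are stand-ins, not the client's real code), each focused check reuses the shared import step:

```language-php
<?php

declare(strict_types=1);

// Stand-in for the real importer, which would parse the CSV file and
// create the products and variations.
function runProductImport(): array
{
    return [
        ['sku' => 'ABC-1', 'price' => 1999],
        ['sku' => 'ABC-2', 'price' => 2499],
    ];
}

// Each focused check verifies one piece of behaviour and nothing else.
function itCreatesTheExpectedNumberOfVariations(): bool
{
    return count(runProductImport()) === 2;
}

function eachVariationHasTheCorrectSku(): bool
{
    return array_column(runProductImport(), 'sku') === ['ABC-1', 'ABC-2'];
}
```

In the real test class, these would be separate PHPUnit test methods sharing private helpers, so a failure points directly at the behaviour that broke.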
In my opinion, this was a good refactor to do, and now was a good time to do it before we get started on the next phase of the project.

View file

@ -0,0 +1,26 @@
---
title: >
Are sprints incompatible with Continuous Deployment?
pubDate: 2022-11-08
permalink: >-
archive/2022/11/08/are-sprints-incompatible-with-continuous-deployment
# tags:
# - a
# - b
---
It's been common for me whilst working on software projects to have work organised into sprints or cycles - a period, usually between 1 and 3 weeks, where the team is working on stories and tasks for that project.
In my experience, those changes are usually released at the end of that cycle. But it seems that's not always the case; see [release sprints](https://scrumdictionary.com/term/release-sprint):
> A specialised sprint whose purpose is to release deliverable results; it contains stories specific to release activities and finishing undone work. A release sprint usually contains no additional development.
If we worked in two-week cycles and released at the end of each one, it would be at least two weeks before a change could be deployed to production. But what if we wanted to follow continuous deployment and release more frequently? Maybe daily or hourly?
Instead of waiting for a release sprint, if we released multiple times within a single sprint, how would this fit into or affect the process?
Does the release cycle need to be tightly coupled to the sprint cycle or can they be separate and independent of each other?
I've worked on projects - including a current one - where I've done multiple releases in a sprint, so of course, it can be done from a technical perspective, but how do we get the best from both processes - whether they work together or separately?
This is something that I'm going to continue to experiment with, iterate on, and learn more about going forward.

View file

@ -0,0 +1,15 @@
---
title: >
Your conference talk has been accepted
pubDate: 2022-11-09
permalink: >-
archive/2022/11/09/your-conference-talk-has-been-accepted
---
I'm happy to have had a conference talk proposal accepted for what will be my first in-person conference since DrupalCamp London in February 2020.
I'll be giving my "[Taking Flight with Tailwind CSS](https://www.oliverdavies.uk/talks/taking-flight-with-tailwind-css)" talk for the first time since February 2021, and in front of an in-person audience for the first time since June 2019.
The talk itself will need some updating. The last time I gave it, Tailwind CSS was on version 2.0.3. It's now on version 3.2.2 and includes features like the just-in-time engine, arbitrary values and variants, container queries, and a load of new utility classes.
I gave a lot of talks at online events in 2020, so after taking a bit of a break last year, it will be nice to speak in front of an in-person conference audience again.

View file

@ -0,0 +1,28 @@
---
title: >
Creating a small proof-of-concept application in an afternoon
pubDate: 2022-11-11
permalink: >-
archive/2022/11/12/creating-small-proof-of-concept-application-afternoon
# tags:
# - a
# - b
---
This morning, I was asked a “Could you build…” question.
It was an idea mentioned a short while ago and involves a simple, interactive form on the front end that sends requests to a public API, filters the results from the response and displays them to the user.
I'd probably want to hide the API request behind a service responsible for interacting with the API and filtering the results - ensuring that the API could be swapped for something else later if needed.
This afternoon, I built a small proof-of-concept application with Vue.js and TypeScript.
There's no API or service retrieving real-time results. All of the data is hard-coded within the App component, as well as the code that filters, sorts and returns the results.
The results are shown by adding a `<pre>{{ results }}</pre>` to the page, with a `<pre>{{ state.selection }}</pre>` to show the input data.
There isn't even any styling, with just some basic horizontal rules to split the page - similar to [these screenshots from Taylor Otwell](https://twitter.com/taylorotwell/status/1203356860818087944) of some work-in-progress versions of Vapor and Nova.
A working proof of concept, or a "spike", answers the initial "Can we build..." question. It can be shown to a client or other stakeholders, act as a starting point for discussions and requirements gathering and then be turned into user stories. It also allows the Developers to validate their initial thoughts and experiment with different approaches.
If the spike is successful, the idea can then be moved forward and implemented in a full way, otherwise, it can be stopped with a minimal amount of effort and time.

View file

@ -0,0 +1,26 @@
---
title: >
Building a minimum viable product and managing technical debt
pubDate: 2022-11-12
permalink: >-
archive/2022/11/12/building-a-minimum-viable-product-and-managing-technical-debt
# tags:
# - a
# - b
---
Yesterday's email was about a proof-of-concept application that I'd quickly built to validate an idea and explore some initial approaches.
Today, I've been working on a client project that I've improved and maintained for a few years.
When I started working with this client, they had one website, built with Drupal 7 and Drupal Commerce. Now, there are x websites using the same codebase due to Drupal's multi-site functionality.
My main task for the last few months has been to get one of their sites onto Drupal 9 (which I did, it went live in October).
This first site was the "minimum viable product" (MVP) - the least amount of functionality required to make it releasable to customers. This is different to a proof of concept which is to validate the idea and start a conversation about requirements and scope - where we define the MVP.
For example, there is the ability to create products and product variations from a CSV file. It loads the file from disk and creates the products, but it doesn't update a product variation if a row with an existing SKU is changed, or disable the variation if a row is removed from the file. There is no admin UI for the client to upload a file - the only file that it'll use is the one whose path is hard-coded within the module.
There are user stories for this, but we decided that we didn't need it for the initial launch and that we were happy to take on some technical debt, knowing that we can address it later when the original solutions are no longer sufficient.
Now that the minimum viable solution has been released, we can continue to iterate on and enhance it based on customers' feedback, add more functionality, and address the technical debt as the requirements demand.

View file

@ -0,0 +1,23 @@
---
title: >
How I manage multiple Drupal websites using the same codebase
pubDate: 2022-11-13
permalink: >-
archive/2022/11/13/how-i-manage-multiple-drupal-websites-using-the-same-codebase
tags:
- drupal
---
In my last email, I mentioned that I maintain several Drupal websites for a client using the same codebase, but how do I do that?
The sites use Drupal's multisite functionality to have a separate directory for each site, each with its own settings file and files directory, and potentially its own modules and themes. Whilst there are some downsides to this approach, and we did evaluate alternatives, it allows us to keep one hosting account and save the client money compared to hosting each site separately.
Each site has a separate database and configuration files, so out of the box, I can customise what functionality is needed on each site by turning modules on and off. Whilst this is fine for larger pieces of functionality, for smaller pieces I like to use feature flags.
I use feature flags on single-site projects to separate deploying code from releasing a change, but I can also use them here to toggle something per-site. This could be using a module like [Feature toggle](https://www.drupal.org/project/feature_toggle) or another way like a checkbox on a settings form. Anything that I can use to say "Do this if that is enabled".
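A minimal sketch of that idea (illustrative only - the real implementation could be the Feature toggle module or a settings-form checkbox, with the flags stored in each site's configuration):

```language-php
<?php

declare(strict_types=1);

// Each site's flags would come from its own configuration, so the same
// shared code can behave differently per site.
final class Features
{
    /** @param array<string, bool> $flags Per-site flags from configuration. */
    public function __construct(private array $flags)
    {
    }

    public function isEnabled(string $name): bool
    {
        return $this->flags[$name] ?? false;
    }
}

// Two sites sharing the same codebase but with different configuration.
$siteOne = new Features(['new_checkout' => true]);
$siteTwo = new Features([]);
```

Any code path can then ask "Do this if that is enabled" without knowing which site it's running on.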
Settings such as an endpoint URL or API credentials would be set in an admin form and stored as configuration per site.
I've tried various iterations of this - initially duplicating the custom code and having several near-identical versions of the same modules (this wasn't good for maintenance). We also used environment variables. However, this didn't scale as I added more sites and needed to create a new set of environment variables every time.
This approach has worked well for the last few years on their original websites and should continue to work well as I upgrade and migrate them to their next versions.

View file

@ -0,0 +1,43 @@
---
title: >
Camel-case or snake-case for test methods?
pubDate: 2022-11-14
permalink: >-
archive/2022/11/14/camel-case-or-snake-case-for-test-methods
tags:
- testing
---
When writing object-orientated code, particularly in PHP, you usually write method names using camel-case letters - such as:
```language-php
public function doSomething(): void {
// ...
}
```
This is also true when writing methods within a test class - only that the method name is prefixed with the word `test`:
```language-php
public function testSomething(): void {
}
```
This is probably expected and complies with the PSR code style standards like PSR-12.
Something that I've seen some PHP developers and some frameworks prefer is to write their test methods using snake-case letters, commonly removing the `test` prefix in favour of an annotation:
```language-php
/** @test */
public function the_api_should_return_a_200_response_code_if_everything_is_ok(): void {
// ...
}
```
This is something that I've done myself for a while, but now I'm starting to reconsider both options.
Whilst it's more readable, especially for longer test names (which I like to write), it's not consistent with method names in non-test files or non-test methods in test files. It looks odd if I need to add another annotation (do I keep a single annotation on one line, and only those with multiple annotations on separate lines?), and to do this, I need to disable some code sniffer rules for the code to pass the PHPCS checks.
If I used camel-cased names, I wouldn't need the PHPCS overrides, the annotations would be simpler, and the code would be more consistent - so I think I'll try that way again in the next tests that I write and see how it feels.
Which do you prefer, and which would you expect to see in your project?

View file

@ -0,0 +1,21 @@
---
title: >
Writing good automated test names
pubDate: 2022-11-15
permalink: >-
archive/2022/11/15/writing-good-automated-test-names
tags:
- testing
---
Something that I often see in code examples or tutorials is test methods like `testGet`, `testAdd` or `testSubtract` - short method names that don't describe the scenario that they're testing in much detail.
What if a `get` method returns different types of value based on the input, or a string is passed into a calculator method like `add` or `subtract`?
I'd assume that the result of the calculation returns the total, but the test method doesn't say this.
I'd rather be overly descriptive and write methods like `should_add_two_or_more_numbers_and_return_the_total()` rather than `testAdd`. It's a lot more readable, and it's easier to see the intention of the test. Longer, descriptive names also work much better with options like `--testdox` in PHPUnit, which converts each method name into a sentence - something I make use of when running tests in CI pipelines.
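To illustrate the difference (a contrived example, not from a real project), here's the same behaviour under both naming styles:

```language-php
<?php

declare(strict_types=1);

function add(int ...$numbers): int
{
    return array_sum($numbers);
}

// "testAdd" says nothing about the expected behaviour...
$testAdd = fn (): bool => add(1, 2) === 3;

// ...whereas this name reads as a sentence describing the scenario,
// which is what --testdox would print for a matching test method.
$should_add_two_or_more_numbers_and_return_the_total =
    fn (): bool => add(1, 2, 3) === 6;
```

Both check the same thing, but only one tells you what "correct" means without reading the test body.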
Something that I've picked up and recommend is to start each test case with the word "It" or "Should". This gives it a more behavioural feel and puts you in the mindset of what you're testing and not the methods that you're executing.
A method like `testAdd` might make sense within a unit test focusing on a single class and method, but as I usually do outside-in testing - which mostly uses functional and integration tests - this approach works well for me.

View file

@ -0,0 +1,17 @@
---
title: >
Why don't you write automated tests?
pubDate: 2022-11-16
permalink: >-
archive/2022/11/16/why-don't-you-write-automated-tests
tags:
- testing
---
Many projects Ive worked on in the past havent had an automated test suite.
If you don't or can't write tests for your project for some reason, I'd love it if you could reply to this email or [let me know on Twitter](https://twitter.com/opdavies) and let me know why.
I know some of the classic reasons, like "I don't have time" and "My clients won't pay for me to write them", but I'd like to get some more real-world examples.
Then I'll do some follow-up posts to look into and address them.

View file

@ -0,0 +1,17 @@
---
title: >
Agnostic CI pipelines with run files
pubDate: 2022-11-17
permalink: >-
archive/2022/11/17/agnostic-ci-pipelines-with-run-files
---
As I work on various projects, I use several different CI tools, such as GitHub Actions, Bitbucket Pipelines, and GitLab CI, as well as hosting providers that have build and deploy steps.
Some only run continuous integration checks, like automated tests and static analysis, some build and push Docker images, and some use Ansible and Ansistrano to deploy the changes to production.
Each tool has its own configuration file, with different settings and formats.
Rather than being too tightly coupled to a particular tool, I like to keep things as agnostic as possible and [use a run file](https://www.oliverdavies.uk/archive/2022/08/15/using-run-file-simplify-project-tasks) with separate `ci:build` and `ci:deploy` tasks.
This means that all the logic is within the run file rather than in the CI tool-specific configuration file, so that file is shorter and cleaner. I can change the CI tasks locally and quickly test and iterate on them and, as the logic lives in the run file, I can easily switch to a different CI tool if needed without changing the tasks themselves.

View file

@ -0,0 +1,19 @@
---
title: >
One test a day keeps bugs away
pubDate: 2022-11-18
permalink: >-
archive/2022/11/18/one-test-a-day-keeps-bugs-away
tags:
- testing
---
This is a quote from a presentation by Diego Aguiar at SymfonyCon that I saw from [a tweet from SymfonyCasts](https://twitter.com/SymfonyCasts/status/1593551105471938560?t=A8wnRUa0tLbb2q5qLhcQnA).
I haven't seen the rest of the presentation, but I liked this quote and the idea of continuously improving a codebase using automated tests.
The talk was titled "Advanced Test Driven Development", so I assume that it focused on ensuring that new functionality also has accompanying tests, but it could also apply to existing code.
A lot of existing code that I've worked on wasn't covered by tests, so going back and writing tests for that code would be beneficial too - even if it's only one test a day. It would help to prevent and uncover existing bugs, enable the code to be refactored and changed without introducing regressions, and make the codebase more maintainable.
Small changes over time add up.

View file

@ -0,0 +1,17 @@
---
title: >
Are missing tests a blocker to refactoring?
pubDate: 2022-11-19
permalink: >-
archive/2022/11/19/are-missing-tests-a-blocker-to-refactoring
---
Is having automated tests a prerequisite for refactoring a piece of code?
Without passing tests for that code, any changes made could introduce a regression, and bugs can be accidentally introduced.
When refactoring with tests, you run them before making any changes to ensure that they pass. The tests are rerun after every change to ensure that they still pass and that no regression has been introduced. If a test fails, the change is reverted and re-attempted.
If I need to refactor some code without tests, the first thing that I'll do is add some initial tests before the main work.
Whilst nothing is stopping me from refactoring without the tests, the risk isn't something that I'd want to take on, and I'd much prefer to have some tests in place - just in case!
