uuid: - value: d047a824-fbb1-40b0-8391-ceba0cc250f5 langcode: - value: en type: - target_id: daily_email target_type: node_type target_uuid: 8bde1f2f-eef9-4f2d-ae9c-96921f8193d7 revision_timestamp: - value: '2025-05-11T09:00:08+00:00' revision_uid: - target_type: user target_uuid: b8966985-d4b2-42a7-a319-2e94ccfbb849 revision_log: { } status: - value: true uid: - target_type: user target_uuid: b8966985-d4b2-42a7-a319-2e94ccfbb849 title: - value: 'Using a run file in your CI pipeline' created: - value: '2024-07-26T00:00:00+00:00' changed: - value: '2025-05-11T09:00:08+00:00' promote: - value: false sticky: - value: false default_langcode: - value: true revision_translation_affected: - value: true path: - alias: /daily/2024/07/26/using-a-run-file-in-your-ci-pipeline langcode: en body: - value: |
One of my earliest daily emails was about run files - files that contain Bash functions that combine or simplify project-specific tasks.
In Drupal projects, these could execute Composer or Drush commands, connect to the database, or run automated tests.
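As a rough sketch (the service names and the commands inside each function are assumptions about a particular project, not a prescription), a run file is just a Bash script full of small wrapper functions:

```bash
#!/usr/bin/env bash
set -euo pipefail

# A sketch of a project-specific run file. The "php" and "db" service
# names and the commands inside each function are assumptions.

function composer {
  docker compose exec php composer "$@"
}

function drush {
  docker compose exec php drush "$@"
}

function db {
  # Open an interactive database shell in the database container.
  docker compose exec db mysql -u drupal -p drupal
}

function test:phpunit {
  docker compose exec php phpunit "$@"
}

function help {
  echo "Usage: ./run <function> [arguments]"
}

# Run the function named by the first argument, e.g. "./run drush cr".
"${@:-help}"
```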
For my CI pipelines, I like to use a function called ci:test that contains all the commands to run in the pipeline.
This keeps the pipeline configuration as simple and agnostic as possible.
It also makes it easy for people to read and, because it's a Bash file, it will run anywhere without any additional tools.
For an example, see my Drupal Docker example repository.
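In that spirit, a ci:test function can simply chain the project's existing run-file functions together. This is a sketch, not the exact contents of that repository, and the individual steps are assumptions about what a given project checks:

```bash
# A sketch of a ci:test function added to the run file above; each
# step reuses a wrapper function the project already has.
function ci:test {
  composer validate --strict
  composer install --no-interaction --prefer-dist
  drush site:install --yes
  test:phpunit
}
```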
The main advantage, though, is being able to run the pipeline locally if you need to.
Maybe you need to debug a failure in the pipeline, or you want to test a change to the pipeline locally before pushing it.
By using a command in a run file, doing so is as simple as running that one command.
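For example, if the pipeline only ever calls ./run ci:test, reproducing a failure on your own machine (assuming the project's containers are already running) looks like this:

```bash
# The same single command the CI pipeline runs, now on your machine.
./run ci:test

# Or narrow in on one step while debugging, e.g. a single failing test
# (the --filter option is PHPUnit's; the test name here is made up).
./run test:phpunit --filter=ExampleFailingTest
```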