GithubRunPipeline
GithubRunPipeline(*args, pipeline_data_importer: PipelineDataImporter = None, **kwargs)
Bases: ExecutionCommand
Executes a GitHub Actions workflow pipeline and optionally imports artifacts.
This command triggers a GitHub workflow run, monitors its execution, and provides options for importing workflow artifacts and custom data processing through extensible importers.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"pipeline_owner": "Netcracker", # REQUIRED: Repository owner/organization
"pipeline_repo_name": "qubership-test-pipelines", # REQUIRED: Repository name
"pipeline_workflow_file_name": "test.yaml", # REQUIRED: Workflow filename (e.g., main.yaml, ci-cd.yml)
"pipeline_branch": "main", # OPTIONAL: Branch to run workflow from (default: repo's default branch)
"pipeline_params": { # OPTIONAL: Input parameters to pass to the workflow
"KEY1": "VALUE1",
"KEY2": "VALUE2"
},
"import_artifacts": false, # OPTIONAL: Whether to import workflow artifacts (default: false)
"use_existing_pipeline": 123456789, # OPTIONAL: Use existing workflow run ID instead of starting new one (debug feature)
"timeout_seconds": 1800, # OPTIONAL: Maximum wait time for workflow completion in seconds (default: 1800, 0 for async execution)
"wait_seconds": 1, # OPTIONAL: Wait interval between status checks in seconds (default: 1)
"retry_timeout_seconds": 180, # OPTIONAL: Timeout for GitHub client initialization and workflow start retries in seconds (default: 180)
"retry_wait_seconds": 1, # OPTIONAL: Wait interval between retries in seconds (default: 1)
"success_statuses": "SUCCESS,UNSTABLE" # OPTIONAL: Comma-separated list of acceptable completion statuses (default: SUCCESS)
}
Systems Configuration (expected in "systems.github" block):
{
"url": "https://github.com", # OPTIONAL: GitHub UI URL for self-hosted instances (default: https://github.com)
"api_url": "https://api.github.com", # OPTIONAL: GitHub API URL for self-hosted instances (default: https://api.github.com)
"password": "<github_token>" # REQUIRED: GitHub access token with workflow permissions
}
Output Parameters
- params.build.url: URL to view the workflow run in GitHub
- params.build.id: ID of the executed workflow run
- params.build.status: Final status of the workflow execution
- params.build.date: Workflow start time in ISO format
- params.build.duration: Total execution duration in human-readable format
- params.build.name: Name of the workflow run
Extension Points
- Custom pipeline data importers can be implemented by extending the PipelineDataImporter interface
- A PipelineDataImporter instance is passed to the command constructor via the "pipeline_data_importer" argument (see the sketch at the end of this entry)
Notes
- Setting timeout_seconds to 0 enables asynchronous execution (workflow starts but command doesn't wait for completion)
- For self-hosted GitHub Enterprise, configure both "systems.github.url" and "systems.github.api_url"
- Custom data importers receive the command context and can implement advanced processing logic
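A minimal usage sketch, assuming a hypothetical import path and an importer hook named import_pipeline_data; check the PipelineDataImporter definition in your installation for the actual method signature and the command's run entry point:

from my_pipelines_lib import GithubRunPipeline, PipelineDataImporter  # hypothetical import path

class ReportImporter(PipelineDataImporter):
    # Assumption: the interface exposes a hook that receives the command context;
    # adapt the method name/signature to the actual PipelineDataImporter interface.
    def import_pipeline_data(self, context):
        # custom processing of downloaded artifacts / run metadata goes here
        ...

command = GithubRunPipeline(
    input_params={
        "params": {
            "pipeline_owner": "Netcracker",
            "pipeline_repo_name": "qubership-test-pipelines",
            "pipeline_workflow_file_name": "test.yaml",
            "pipeline_params": {"KEY1": "VALUE1"},
            "import_artifacts": True,
        }
    },
    pipeline_data_importer=ReportImporter(),
)
# GitHub credentials are read from the "systems.github" block of the execution context (not shown here).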
GitlabRunPipeline
GitlabRunPipeline(*args, pipeline_data_importer: PipelineDataImporter = None, **kwargs)
Bases: ExecutionCommand
Runs a GitLab pipeline via the Trigger or Create API and optionally imports artifacts.
This command runs a GitLab pipeline, monitors its execution, and provides options for importing resulting artifacts and custom data processing through extensible importers.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"pipeline_path": "path/to/gitlab_project", # REQUIRED: Full pipeline path (e.g. "group/subgroup/repo")
"pipeline_branch": "main", # OPTIONAL: Branch to run pipeline from (default: repo's default branch)
"trigger_type": "CREATE_PIPELINE", # OPTIONAL: Which API will be used to trigger the pipeline (CREATE_PIPELINE or TRIGGER_PIPELINE)
"pipeline_params": { # OPTIONAL: Input parameters to pass to the pipeline
"KEY1": "VALUE1",
"KEY2": "VALUE2"
},
"import_artifacts": false, # OPTIONAL: Whether to import pipeline artifacts (default: false)
"use_existing_pipeline": 123456789, # OPTIONAL: Use existing pipeline ID (or use 'latest' here) instead of starting new one (debug feature)
"timeout_seconds": 1800, # OPTIONAL: Maximum wait time for pipeline completion in seconds (default: 1800, 0 for async execution)
"wait_seconds": 1, # OPTIONAL: Wait interval between status checks in seconds (default: 1)
"retry_timeout_seconds": 180, # OPTIONAL: Timeout for GitLab client initialization and pipeline start retries in seconds (default: 180)
"retry_wait_seconds": 1, # OPTIONAL: Wait interval between retries in seconds (default: 1)
"success_statuses": "SUCCESS,UNSTABLE" # OPTIONAL: Comma-separated list of acceptable completion statuses (default: SUCCESS)
}
Systems Configuration (expected in "systems.gitlab" block):
{
"url": "https://github.com", # OPTIONAL: GitLab URL for self-hosted instances (default: https://gitlab.com)
"password": "<gitlab_token>" # REQUIRED: GitLab access token with CI/CD permissions
"trigger_token": "<gitlab_trigger_token>" # OPTIONAL: Special token issued for triggering pipeline. If not provided - will try to use CI_JOB_TOKEN
}
Output Parameters
- params.build.url: URL to view the pipeline run in GitLab
- params.build.id: ID of the executed pipeline
- params.build.status: Final status of the pipeline execution
- params.build.date: Pipeline start time in ISO format
- params.build.duration: Total execution duration in human-readable format
- params.build.name: Name of the pipeline execution
Extension Points
- Custom pipeline data importers can be implemented by extending the PipelineDataImporter interface
- A PipelineDataImporter instance is passed to the command constructor via the "pipeline_data_importer" argument
Notes
- Setting timeout_seconds to 0 enables asynchronous execution (the pipeline starts but the command doesn't wait for completion)
- For self-hosted GitLab instances, configure "systems.gitlab.url"
- Custom data importers receive the command context and can implement advanced processing logic
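A minimal params sketch, assuming the Trigger API is used (the variable name below is hypothetical; with TRIGGER_PIPELINE the "trigger_token" from "systems.gitlab" is used, falling back to CI_JOB_TOKEN):

input_params = {
    "params": {
        "pipeline_path": "group/subgroup/repo",
        "pipeline_branch": "main",
        "trigger_type": "TRIGGER_PIPELINE",            # or "CREATE_PIPELINE" to start the pipeline via the Create API
        "pipeline_params": {"DEPLOY_ENV": "staging"},  # hypothetical pipeline variable
        "import_artifacts": True,
    }
}
command = GitlabRunPipeline(input_params=input_params)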
JenkinsRunPipeline
JenkinsRunPipeline(*args, pipeline_data_importer: PipelineDataImporter = None, **kwargs)
Bases: ExecutionCommand
Runs a Jenkins pipeline and optionally imports artifacts.
This command runs a Jenkins pipeline, monitors its execution, and provides options for importing resulting artifacts and custom data processing through extensible importers.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"pipeline_path": "TENANT-NAME/path/to/job", # REQUIRED: Full pipeline path (e.g. "TENANT/folder/job")
"pipeline_params": { # OPTIONAL: Input parameters to pass to the pipeline
"KEY1": "VALUE1", # Side-note: if you want to run your parametrized job with default parameters,
"KEY2": "VALUE2" # you still need to pass some fake params (they will be ignored by Jenkins), e.g. "__fake_key":"fake_value",
}, # Otherwise, if this dict is empty - endpoint for non-parametrized jobs will be triggered
"import_artifacts": true, # OPTIONAL: Whether to import pipeline artifacts (default: true)
"use_existing_pipeline": 123456789, # OPTIONAL: Use existing pipeline ID instead of starting new one (debug feature)
"timeout_seconds": 1800, # OPTIONAL: Maximum wait time for pipeline completion in seconds (default: 1800, 0 for async execution)
"wait_seconds": 1, # OPTIONAL: Wait interval between status checks in seconds (default: 1)
"retry_timeout_seconds": 180, # OPTIONAL: Timeout for GitLab client initialization and pipeline start retries in seconds (default: 180)
"retry_wait_seconds": 1, # OPTIONAL: Wait interval between retries in seconds (default: 1)
"success_statuses": "SUCCESS,UNSTABLE" # OPTIONAL: Comma-separated list of acceptable completion statuses (default: SUCCESS)
}
Systems Configuration (expected in "systems.jenkins" block):
{
"url": "https://github.com", # REQUIRED: Jenkins instance URL
"username": "<jenkins_user>" # REQUIRED: Jenkins user
"password": "<jenkins_token>" # REQUIRED: Jenkins password or token with job-triggering permissions
}
Output Parameters
- params.build.url: URL to view the build in Jenkins
- params.build.id: ID of the executed pipeline
- params.build.status: Final status of the pipeline execution
- params.build.date: Build start time in ISO format
- params.build.duration: Total execution duration in human-readable format
- params.build.name: Name of the pipeline execution
Extension Points
- Custom pipeline data importers can be implemented by extending the PipelineDataImporter interface
- A PipelineDataImporter instance is passed to the command constructor via the "pipeline_data_importer" argument
Notes
- Setting timeout_seconds to 0 enables asynchronous execution (the build starts but the command doesn't wait for completion and won't fetch the build ID)
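A sketch of the parameterized-job quirk described above: pass a placeholder parameter to run a parameterized job with its default values, or an empty dict to use the endpoint for non-parameterized jobs (paths below are placeholders):

run_with_defaults = {
    "params": {
        "pipeline_path": "TENANT-NAME/folder/parameterized-job",
        "pipeline_params": {"__fake_key": "fake_value"},  # ignored by Jenkins, but forces the parameterized-job endpoint
    }
}

run_plain_job = {
    "params": {
        "pipeline_path": "TENANT-NAME/folder/plain-job",
        "pipeline_params": {},  # empty dict -> the endpoint for non-parameterized jobs is triggered
    }
}

command = JenkinsRunPipeline(input_params=run_with_defaults)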
JiraCreateTicket
JiraCreateTicket(context_path: str = None, input_params: dict = None, input_params_secure: dict = None, folder_path: str = None, parent_context_to_reuse: ExecutionContext = None, pre_execute_actions: list[ExecutionCommandExtension] = None, post_execute_actions: list[ExecutionCommandExtension] = None)
Bases: ExecutionCommand
Creates new issue/ticket in JIRA project.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"ticket": {
"fields: { # REQUIRED: Dict structure that will be used as ticket-creation-body, without transformations
"project": {"key": "<YOUR_PROJECT_KEY>"}, # REQUIRED: Project Key
"issuetype": {"name": "Bug"}, # REQUIRED: Issue type name
"priority": {"name": "High"}, # OPTIONAL: Other ticket fields with different formats, depending on your Project configuration
"duedate": "2030-02-20", # OPTIONAL: Text-value fields need no dict wrappers
"summary": "[SOME_LABEL] Ticket Subject",
"description": "Ticket body",
"components": [{"name":"COMPONENT NAME"}],
"labels": ["Test_Label1"],
},
"comment": "your comment body", # OPTIONAL: Comment to add to created ticket
"field_names_filter": "summary,issuetype,creator,status", # OPTIONAL: Comma-separated names of fields to extract from created ticket to output params
},
"retry_timeout_seconds": 180, # OPTIONAL: Timeout for JIRA client operations in seconds (default: 180)
"retry_wait_seconds": 1, # OPTIONAL: Wait interval between retries in seconds (default: 1)
}
Systems Configuration (expected in "systems.jira" block):
{
"url": "https://your_cloud_jira.atlassian.net", # REQUIRED: JIRA server URL
"username": "your_username_or_email", # REQUIRED: JIRA user login or email
"password": "<your_token>", # REQUIRED: JIRA user token
"auth_type": "basic" # OPTIONAL: 'basic' or 'bearer'
}
Command name: "jira-create-ticket"
JiraAddTicketComment
JiraAddTicketComment(context_path: str = None, input_params: dict = None, input_params_secure: dict = None, folder_path: str = None, parent_context_to_reuse: ExecutionContext = None, pre_execute_actions: list[ExecutionCommandExtension] = None, post_execute_actions: list[ExecutionCommandExtension] = None)
Bases: ExecutionCommand
Adds a comment to a JIRA ticket and retrieves the latest comments.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"ticket": {
"id": "BUG-567", # REQUIRED: Ticket ID
"comment": "your comment body", # REQUIRED: Comment body
"latest_comments_count": 50, # OPTIONAL: Number of latest comments to fetch
},
"retry_timeout_seconds": 180, # OPTIONAL: Timeout for JIRA client operations in seconds (default: 180)
"retry_wait_seconds": 1, # OPTIONAL: Wait interval between retries in seconds (default: 1)
}
Systems Configuration (expected in "systems.jira" block):
{
"url": "https://your_cloud_jira.atlassian.net", # REQUIRED: JIRA server URL
"username": "your_username_or_email", # REQUIRED: JIRA user login or email
"password": "<your_token>", # REQUIRED: JIRA user token
"auth_type": "basic" # OPTIONAL: 'basic' or 'bearer'
}
Command name: "jira-add-ticket-comment"
JiraUpdateTicket
JiraUpdateTicket(context_path: str = None, input_params: dict = None, input_params_secure: dict = None, folder_path: str = None, parent_context_to_reuse: ExecutionContext = None, pre_execute_actions: list[ExecutionCommandExtension] = None, post_execute_actions: list[ExecutionCommandExtension] = None)
Bases: ExecutionCommand
Updates ticket fields and transitions status.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"ticket": {
"fields: { # REQUIRED: Dict structure that will be used as ticket-update-body, without transformations
"status": {"name": "Done"}, # OPTIONAL: Next status name
"transition": {"name": "From Review to Done"}, # OPTIONAL: Transition name
"priority": {"name": "High"}, # OPTIONAL: Other ticket fields with different formats, depending on your Project configuration
"duedate": "2030-02-20", # OPTIONAL: Text-value fields need no dict wrappers
"description": "Ticket body",
"labels": ["Test_Label1"],
},
"id": "BUG-567", # REQUIRED: Ticket ID
"comment": "your comment body", # OPTIONAL: Comment to add to created ticket
"field_names_filter": "summary,issuetype,creator,status", # OPTIONAL: Comma-separated names of fields to extract from created ticket to output params
},
"retry_timeout_seconds": 180, # OPTIONAL: Timeout for JIRA client operations in seconds (default: 180)
"retry_wait_seconds": 1, # OPTIONAL: Wait interval between retries in seconds (default: 1)
}
Systems Configuration (expected in "systems.jira" block):
{
"url": "https://your_cloud_jira.atlassian.net", # REQUIRED: JIRA server URL
"username": "your_username_or_email", # REQUIRED: JIRA user login or email
"password": "<your_token>", # REQUIRED: JIRA user token
"auth_type": "basic" # OPTIONAL: 'basic' or 'bearer'
}
Command name: "jira-update-ticket"
PodmanRunImage
PodmanRunImage(context_path: str = None, input_params: dict = None, input_params_secure: dict = None, folder_path: str = None, parent_context_to_reuse: ExecutionContext = None, pre_execute_actions: list[ExecutionCommandExtension] = None, post_execute_actions: list[ExecutionCommandExtension] = None)
Bases: ExecutionCommand
Executes a container using the "podman run" command.
This command supports running containers with configurable execution parameters, environment variable management, file mounting, and output extraction.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"image": "docker.io/library/hello-world:latest", # REQUIRED: Container image to run
"command": "python -m pipelines_declarative_executor run --pipeline_dir="/WORK/EXEC_DIR"", # OPTIONAL: Command to execute in container
"execution_config": { # ALL OF THESE ARE OPTIONAL
"working_dir": "/some/dir/inside/container", # Working directory inside container
"timeout": "600", # Maximum execution time in seconds
"operations_timeout": "15", # Timeout for operations like file copying
"remove_container": True, # Whether to remove container after execution
"save_stdout_to_logs": True, # Save container stdout to execution logs
"save_stdout_to_files": True, # Save container stdout to output files
"save_stdout_to_params": False, # Save container stdout to output parameters
"expected_return_codes": "0,125", # Comma-separated list of acceptable exit codes
"additional_run_flags": "--cgroups=disabled", # Optional string of flags that will be added to "podman run" command
},
"before_script": {
"mounts": { # Filesystem mounts, "host_path: container_path"
"output_files": "/WORK",
"prepared_data": "/CONFIGS"
},
"env_vars": {
"explicit": { # Direct environment variable assignment
"PIPELINES_DECLARATIVE_EXECUTOR_ENCRYPT_OUTPUT_SECURE_PARAMS": False
},
"env_files": [ # Environment files on host to load and pass into container
"../CONFIGS/sample.env"
],
"pass_via_file": { # Sensitive vars passed via temp file
"SOMETHING_VERY_SECURE": "PASSWORD"
},
"host_prefixes": [ # Host environment variable prefixes to pass through. Can use "*" to pass everything from host.
"SOME_PREFIX_*"
]
}
},
"after_script": {
"copy_files_to_host": { # Copy files from container to host after execution, "host_path: container_path"
"output_files/report.json": "/WORK/EXEC_DIR/pipeline_state/pipeline_ui_view.json",
"output_files/pipeline_state": "/WORK/EXEC_DIR/pipeline_state",
},
"extract_params_from_files": { # OPTIONAL: Extract parameters from container files. Supports JSON, YAML, and ENV files
"SOME_FILE_IN_CONTAINER": "SECTION_NAME_IN_PARAMS_WHERE_IT_WILL_BE_STORED",
}
}
}
Output Parameters
- params.execution_time: Total execution time in seconds
- params.return_code: Container exit code
- params.stdout: Container stdout (if save_stdout_to_params enabled)
- params.stderr: Container stderr (if save_stdout_to_params enabled)
- params.extracted_output.*: Extracted parameters from files (if extract_params_from_files configured)
Notes
- The command automatically handles container lifecycle including start, execution, and cleanup
- All host paths (including mount paths) are resolved relative to the context directory
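A reduced sketch tying the pieces together: mount a host directory, write an ENV file from inside the container, and extract its values into output params (image, paths, and the section name are placeholders; host paths are relative to the context directory):

command = PodmanRunImage(
    input_params={
        "params": {
            "image": "docker.io/library/alpine:latest",
            "command": "sh -c 'echo STATUS=ok > /WORK/result.env'",
            "execution_config": {"timeout": "120", "remove_container": True},
            "before_script": {
                "mounts": {"output_files": "/WORK"}  # host dir "output_files" (relative to the context directory) -> /WORK
            },
            "after_script": {
                # Assumption: the key is the container-side file path, the value is the section name in output params
                "extract_params_from_files": {"/WORK/result.env": "podman_result"}
            },
        }
    }
)
# Extracted values are then expected under params.extracted_output.podman_result.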
SendEmail
SendEmail(context_path: str = None, input_params: dict = None, input_params_secure: dict = None, folder_path: str = None, parent_context_to_reuse: ExecutionContext = None, pre_execute_actions: list[ExecutionCommandExtension] = None, post_execute_actions: list[ExecutionCommandExtension] = None)
Bases: ExecutionCommand
This command sends an email notification with optional attachments.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"email_subject": "Report for 01.01.2026", # REQUIRED: E-mail subject
"email_body": "Following jobs were completed: ...", # REQUIRED: E-mail message
"email_recipients": "user1@qubership.org,user2@qubership.org", # REQUIRED: Comma-separated list of recipients
"email_body_type": "plain", # OPTIONAL: Either "plain" or "html
"attachments": { # OPTIONAL: Dict with attachments
"unique_attachment_key": {
"name": "HTML_report.html", # REQUIRED: File name used for attachment
"content": "<html>...</html>", # REQUIRED: Text content put inside attachment
"mime_type": "text/html",
},
"another_attachment_key": {...}
}
}
Systems Configuration (expected in "systems.email" block):
{
"server": "your.mail.server.org", # REQUIRED: E-mail host server
"port": "3025" # REQUIRED: E-mail port
"user": "your@email.bot" # REQUIRED: E-mail user
"password": "<email_password>" # OPTIONAL: E-mail password
"use_ssl": "False" # OPTIONAL: SMTP connection will use SSL mode (default: False)
"use_tls": "False" # OPTIONAL: SMTP connection will use TLS mode (default: False)
"verify": "False" # OPTIONAL: SSL Certificate verification (default: False)
"timeout_seconds": "60" # OPTIONAL: SMTP connection timeout in seconds (default: 60)
}
Command name: "send-email"
SendWebexMessage
SendWebexMessage(context_path: str = None, input_params: dict = None, input_params_secure: dict = None, folder_path: str = None, parent_context_to_reuse: ExecutionContext = None, pre_execute_actions: list[ExecutionCommandExtension] = None, post_execute_actions: list[ExecutionCommandExtension] = None)
Bases: ExecutionCommand
This command sends a Webex message with optional attachments.
Input Parameters Structure (this structure is expected inside "input_params.params" block):
{
"webex_message": "Hello, world!", # REQUIRED: Text message (Markdown format is supported)
"parent_id": "1234321", # OPTIONAL: The parent message to reply to
"attachments": { # OPTIONAL: Dict with attachments
"unique_attachment_key": {
"name": "HTML_report.html", # REQUIRED: File name used for attachment
"content": "<html>...</html>", # REQUIRED: Text content put inside attachment
"mime_type": "text/html",
},
"another_attachment_key": {...}
}
}
Systems Configuration (expected in "systems.webex" block):
{
"room_id": "...Y2lzY29zc...", # REQUIRED: Webex unique room_id where message will be posted
"token": "your_bot_account_token" # REQUIRED: Bot/Service account token that will be used to send message
"proxy": "https://127.0.0.1" # OPTIONAL: Host to be used as a webex-proxy
}
Output Parameters
- params.message_id: message_id of the sent message
- params.attachment_message_ids: dict of attachment_name -> message_id
Command name: "send-webex-message"