AWS CodePipeline Concepts
External
- https://docs.aws.amazon.com/codepipeline/latest/userguide/concepts.html
- CreatePipeline API Request Reference
Internal
CodePipeline as AWS Service
CodePipeline is an AWS service whose service principal is "codepipeline.amazonaws.com".
Pipeline
A pipeline is a top-level AWS resource that provides CI/CD release pipeline functionality.
From a conceptual perspective, a pipeline is a workflow construct that describes how software changes go through a release process.
As implemented in AWS, a pipeline consists of a set of sequential stages, each stage containing one or more actions. A specific stage is always in a fixed position relative to the other stages. However, actions within a stage can be executed sequentially, according to their run order, or in parallel. Stages and actions process artifacts, which "advance" along the pipeline. A pipeline can be created with the following CloudFormation sequence:
Resources:
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      Name: !Ref AWS::StackName
      RoleArn: 'arn:aws:iam::777777777777:role/CodePipelineServiceRole-1'
      ArtifactStore:
        Type: 'S3'
        Location: 'experimental-s3-bucket-for-codepipeline'
      ...
      Stages:
        ...
An example of a simple, working GitHub-based pipeline is available here:
Required Configuration
The pipeline requires a number of configuration properties:
RoleArn
The pipeline needs to be associated with a service role, which allows the CodePipeline service to execute the various actions required by pipeline operations.
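The service role can be declared in the same CloudFormation template and referenced from the pipeline's "RoleArn". The following is a minimal sketch, assuming a resource named "CodePipelineServiceRole" and a deliberately coarse inline policy; a production role should be scoped down to the specific services and resources the pipeline actually uses:

CodePipelineServiceRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: '2012-10-17'
      Statement:
        # allow the CodePipeline service to assume this role
        - Effect: Allow
          Principal:
            Service: codepipeline.amazonaws.com
          Action: sts:AssumeRole
    Policies:
      - PolicyName: pipeline-permissions
        PolicyDocument:
          Version: '2012-10-17'
          Statement:
            # intentionally broad, for illustration only
            - Effect: Allow
              Action:
                - s3:*
                - codebuild:StartBuild
                - codebuild:BatchGetBuilds
                - cloudformation:*
                - iam:PassRole
              Resource: '*'

The pipeline would then declare RoleArn: !GetAtt CodePipelineServiceRole.Arn instead of a hardcoded ARN.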
ArtifactStore
The pipeline requires an artifact store, which provides the storage for transient and final artifacts that are processed by the various stages and actions. In most cases, the storage is provided by an Amazon S3 bucket. "Location" specifies the name of the bucket. When the pipeline is initialized, the CodePipeline service creates a directory associated with the pipeline. The directory has the same name as the pipeline. As the pipeline operates, sub-directories corresponding to the various input and output artifacts declared by actions are also created.
When the Amazon Console is used to create the first pipeline, an S3 bucket is created in the same region as the pipeline, to be used for all pipelines in that region associated with the account.
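The artifact store bucket can also be declared in the same template and referenced by name. A minimal sketch, assuming a bucket resource named "ArtifactStoreBucket":

ArtifactStoreBucket:
  Type: AWS::S3::Bucket

Pipeline:
  Type: AWS::CodePipeline::Pipeline
  Properties:
    ...
    ArtifactStore:
      Type: 'S3'
      Location: !Ref ArtifactStoreBucket    # resolves to the bucket name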
Optional Configuration
Optionally, a name can also be configured:
Name
Optional parameter that provides the physical ID for the pipeline. If not specified, a name is generated following the stack-name-Pipeline-24RCYXM52UE6A pattern. A recommended name is:
Name: !Ref AWS::StackName
Revision
A revision is a change made to a source that feeds a pipeline. The revision can be triggered by a git push command, or an S3 file update in a versioned S3 bucket. Each revision runs separately through the pipeline. Multiple revisions can be processed in the same pipeline, but each stage can process only one revision at a time. Revisions are run through the pipeline as soon as a change is made in the location specified in the source stage of the pipeline.
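For the S3 case mentioned above, the source stage watches a specific key in a versioned bucket. A minimal sketch of such a source action, with assumed bucket and key names, using the action syntax detailed later in this article:

- Name: s3-pull
  ActionTypeId:
    Category: Source
    Owner: AWS
    Version: '1'
    Provider: S3
  Configuration:
    S3Bucket: 'experimental-versioned-source-bucket'   # versioning must be enabled on this bucket
    S3ObjectKey: 'application/source.zip'
  OutputArtifacts:
    - Name: 'sources'
  RunOrder: 1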
Stage
A stage is a component of the workflow implemented by the pipeline. Each stage has a unique name within the pipeline. A pipeline must have at least two stages; a one-stage pipeline is considered invalid. A stage contains one or more actions, which can be executed sequentially or in parallel. All actions configured in a stage must complete successfully before the stage is considered complete.
Resources:
  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ...
      Stages:
        - Name: ...
          Actions:
            - ...
        - Name: ...
          Actions:
            - ...
Artifact
A revision propagates through the pipeline by having files associated with actions performed at different stages copied by the CodePipeline service into different folders of the S3 bucket associated with the pipeline. These objects are referred to as artifacts. Artifacts may be used as input to an action (Input Artifacts) or may be produced by an action (Output Artifacts).
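The sketch below, with illustrative stage and artifact names, shows how an artifact links two stages: the name declared under "OutputArtifacts" by an action in one stage is referenced under "InputArtifacts" by an action in a later stage, and both refer to the same folder in the artifact store:

Stages:
  - Name: Source
    Actions:
      - Name: github-pull
        ...
        OutputArtifacts:
          - Name: 'sources'      # written to <artifact store>/<pipeline name>/sources
  - Name: Build
    Actions:
      - Name: build
        ...
        InputArtifacts:
          - Name: 'sources'      # read from the same folder
        OutputArtifacts:
          - Name: 'build'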
Action
An action is a task performed on an artifact, executed at a specific stage of a pipeline. Actions may occur in a specified order, or in parallel, depending on their configuration. All actions share a common structure:
Action Name
An action name must match the regular expression pattern [A-Za-z0-9.@\-_]+; in particular, the action name must not contain spaces.
Action Type Declaration (ActionTypeId)
The action type declaration specifies an action provider. Currently, six types of actions are supported: Source, Build, Test, Deploy, Approval and Invoke.
Custom actions can also be developed.
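As a sketch, an action type declaration identifies the category, the owner of the provider ("AWS", "ThirdParty" or "Custom"), the provider itself and a version:

ActionTypeId:
  Category: Build          # Source | Build | Test | Deploy | Approval | Invoke
  Owner: AWS               # AWS | ThirdParty | Custom
  Provider: CodeBuild
  Version: '1'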
Input Artifacts
An action declares zero or more input artifacts. These are actually files to be processed by the action. As an implementation detail, the name of an input artifact corresponds to the name of a sub-directory of the pipeline directory maintained in the artifact store of the pipeline.
Output Artifacts
An action declares zero or more output artifacts, which are the names or the IDs of the results of the action. Very commonly, an output artifact is the name of an S3 folder, created by the pipeline inside the pipeline's S3 folder, which in turn is created inside the artifact store S3 bucket. The output artifact's S3 folder is populated by the action.
Run Order
The run order determines the sequence in which the actions within a stage are executed: actions with a lower run order execute first, and actions that share the same run order execute in parallel.
Configuration
Configuration elements are specific to the action provider and are passed to it.
Available Actions
Source
Resources:
  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ...
      Stages:
        - Name: Source
          Actions:
            - Name: !Sub 'github-pull-${Branch}'
              ActionTypeId:
                Category: Source
                Owner: ThirdParty
                Version: '1'
                Provider: GitHub
              Configuration:
                Owner: !Ref GitHubOrganizationID
                Repo: !Ref GitHubRepositoryName
                Branch: !Ref Branch
                OAuthToken: !Ref GitHubPersonalAccessCode
              InputArtifacts: []
              OutputArtifacts:
                - Name: 'sources'
              RunOrder: 1
        - Name: ...
The action provider, which can be GitHub or another source repository provider, performs a repository clone and packages the content as a ZIP file. The ZIP file is placed in the artifact store, under the directory corresponding to the pipeline and a sub-directory named after the "OutputArtifacts.Name" configuration element. Assuming that the pipeline is named "thalarion", the output ZIP file is placed in s3://thalarion-buildbucket-enqyf1xp13z2/thalarion/sources.
An example of a simple, working GitHub-based pipeline is available here:
GitHub Authentication
Build
Resources:
  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ...
      Stages:
        ...
        - Name: Build
          Actions:
            - Name: !Sub '${Buildspec}-driven-CodeBuild'
              ActionTypeId:
                Category: Build
                Owner: AWS
                Version: '1'
                Provider: CodeBuild
              InputArtifacts:
                - Name: 'sources'
              OutputArtifacts:
                - Name: 'build'
              Configuration:
                ProjectName: !Ref CodeBuildProject
              RunOrder: 1
        - Name: ...
The action provider, which in this case is the CodeBuild service, executes the build. Existing build projects can be used, or new ones can be created in the CodePipeline console. The build artifacts are placed in the artifact store, under the directory corresponding to the pipeline and a sub-directory named after the "OutputArtifacts.Name" configuration element. Assuming that the pipeline is named "thalarion", the build artifacts are placed in s3://thalarion-buildbucket-enqyf1xp13z2/thalarion/build. The following article explains in detail how CodePipeline and CodeBuild interact:
An example of a simple, working GitHub-based pipeline is available here:
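As a hedged sketch, the build project referenced above as !Ref CodeBuildProject could be declared in the same template along these lines; the resource name, image and service role reference are assumptions:

CodeBuildProject:
  Type: AWS::CodeBuild::Project
  Properties:
    ServiceRole: !GetAtt CodeBuildServiceRole.Arn   # assumed CodeBuild service role resource
    Source:
      Type: CODEPIPELINE            # sources are handed to the build by the pipeline
      BuildSpec: !Ref Buildspec     # assumed parameter naming the buildspec file
    Artifacts:
      Type: CODEPIPELINE            # build output is handed back to the pipeline
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: 'aws/codebuild/standard:2.0'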
Test
Deploy
Resources:
  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ...
      Stages:
        ...
        - Name: Deploy
          Actions:
            - Name: !Sub '${DeploymentStackTemplate}-driven-deployment'
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Version: '1'
                Provider: CloudFormation
              InputArtifacts:
                - Name: 'sources'
                - Name: 'build'
              OutputArtifacts: []
              Configuration:
                StackName: !Ref ProjectID
                TemplatePath: !Sub sources::${DeploymentStackTemplate}
                # The union of parameters specified in 'TemplateConfiguration' and in 'ParameterOverrides' must
                # match exactly the set of deployment template parameters that do not have defaults
                TemplateConfiguration: build::cloudformation-deployment-configuration.json
                # Parameter values specified in 'ParameterOverrides' take precedence over the values specified in
                # 'TemplateConfiguration'
                ParameterOverrides: !Sub '{ "MyConfigurationParameterA": "yellow", "MyConfigurationParameterB": "black" }'
                ActionMode: CREATE_UPDATE
                Capabilities: CAPABILITY_IAM
                RoleArn:
                  Fn::ImportValue: !Sub '${ProjectID}-cloudformation-service-role-ARN'
              RunOrder: 1
This step relies on the presence of a CloudFormation stack template somewhere in an artifact produced by a previous pipeline stage. The name of the template file is configured as "TemplatePath". In the example above, the template path is relative to the "sources" input artifact, so the template is expected to be found in the source tree.
A CloudFormation template can be configured externally with parameters, so the deployment template can be configured by providing parameter values in a configuration file, specified as "TemplateConfiguration". "TemplateConfiguration" has the ArtifactName::TemplateConfigurationFileName format, which means that the template configuration file must be produced by one of the previous pipeline steps and must be placed in one of the artifacts. If the template configuration file is not found, perhaps because none of the previous stages created it, the deployment stage fails with an S3 error. The configuration file can be written in JSON or YAML. A JSON configuration file is similar to:
{
  "Parameters": {
    "MyConfigurationParameter": "my value"
  }
}
The buildspec produces it as follows:
...
- echo "{\"Parameters\":{\"MyConfigurationParameter\":\"spurious\"}}" > ./cloudformation-deployment-configuration.json
...
Note that all parameters that do not have defaults in the deployment template must be provided, otherwise the deployment fails with: "Action execution failed: Parameters: [...] must have values (Service: AmazonCloudFormation; Status Code: 400; Error Code: ValidationError)".
It is possible to override values in the template configuration file in the pipeline definition, using the "ParameterOverrides" key. If the same parameter is specified both in "ParameterOverrides" and in the template configuration file, the value specified in "ParameterOverrides" takes precedence.
The union of the parameters specified in "ParameterOverrides" and of those coming from the configuration file must exactly match the set of deployment template parameters that do not have defaults.
An example of a simple, working GitHub-based pipeline is available here:
Approval
Invoke
Custom Action
Custom actions can be developed.
Transition
After a stage completes, the pipeline transitions the revision, and the artifacts created by the actions in that stage, to the next stage in the pipeline. A transition can be manually enabled or disabled.
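In a CloudFormation template, an inbound transition can be disabled declaratively with the pipeline's "DisableInboundStageTransitions" property; a minimal sketch, assuming a stage named "Deploy":

Pipeline:
  Type: AWS::CodePipeline::Pipeline
  Properties:
    ...
    DisableInboundStageTransitions:
      - StageName: Deploy
        Reason: 'Hold deployments until manually re-enabled'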