Commit 8fc4bfd: updating readme formatting
Charles Frenzel committed Sep 9, 2021 (1 parent c82eb60)
1 changed file: README.md (26 additions, 20 deletions)

- [Customer Churn Pipeline on AWS](#customer-churn-pipeline-on-aws)
- [Table of contents](#table-of-contents)
- [Getting Started](#getting-started)
- [Clean up](#clean-up)
- [Read The Docs](#read-the-docs)
- [Solution Architecture](#solution-architecture)
- [Contributing](#contributing)

## Getting Started

1. **Step 1** - Modify default Parameters

Update the `.env` file in the main directory.
To run Cox proportional hazards modeling instead of the binary log-loss model, set `COXPH` to `"positive"`.

```shell
S3_BUCKET_NAME="{YOUR_BUCKET_NAME}"
REGION="{YOUR_REGION}"
STACK_NAME="{YOUR_STACK_NAME}"
COXPH="{negative|positive}"
```
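
After editing, you can sanity-check the values by sourcing the file; this is only a quick check and assumes the deployment script reads the same variables:

```shell
# Quick sanity check of the configured values (assumes standup.sh reads this same file)
source .env
echo "${S3_BUCKET_NAME} ${REGION} ${STACK_NAME} ${COXPH}"
```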

2. **Step 2** - Deploy the infrastructure

`./standup.sh`
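
If you want to confirm the deployment from the CLI, a check along these lines should work; it assumes `standup.sh` provisions a CloudFormation stack using the `STACK_NAME` value from `.env`:

```shell
# Optional: confirm the stack finished deploying (assumes standup.sh creates a
# CloudFormation stack named after STACK_NAME in .env)
source .env
aws cloudformation describe-stacks \
  --region "${REGION}" \
  --stack-name "${STACK_NAME}" \
  --query "Stacks[0].StackStatus" \
  --output text
```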

3. **Step 3** - Update the pending GitHub Connections

To configure the [Github connection](https://docs.aws.amazon.com/codedeploy/latest/userguide/integrations-partners-github.html) manually in the CodeDeploy console, go to Developer Tools -> Settings -> Connections. This is a one-time approval. Install the connector as an app or choose an existing installation.
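
If you prefer to confirm the approval from the command line, listing the connections should show the status change from `PENDING` to `AVAILABLE` once it is approved:

```shell
# Optional: check the GitHub connection status from the CLI
aws codestar-connections list-connections \
  --provider-type-filter GitHub \
  --query "Connections[].{Name:ConnectionName,Status:ConnectionStatus}" \
  --output table
```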

<p align="center">
<img src="images/UpdateConn.png" width="899" class="centerImage">
</p>

4. **Step 4** - Release change in churn pipeline for the first time

<p align="center">
<img src="images/ReleaseChange.png" width="899" class="centerImage">
</p>

5. **Step 5** - Once the build succeeds, navigate to Step Functions to verify completion
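
If you prefer the CLI over the console, you can list the state machines created by the stack and check the latest execution; the exact state machine name depends on the deployment, so treat the second command as a template:

```shell
# Optional: inspect the training pipeline from the CLI
AWS_REGION=$(aws configure get region)
aws stepfunctions list-state-machines --region "${AWS_REGION}" \
  --query "stateMachines[].name" --output table
# Then, using the ARN of the churn training state machine:
# aws stepfunctions list-executions --state-machine-arn <arn> --max-results 1
```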

6. **Step 6** - Trigger the inference pipeline. Batch inference can be automated with cron jobs or S3 triggers to suit business needs (a scheduling sketch follows the invocation command below).

```shell
AWS_REGION=$(aws configure get region)
aws lambda --region ${AWS_REGION} invoke --function-name invokeInferStepFunction --payload '{ "": ""}' out
```
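
As one way to automate the batch inference mentioned above, a scheduled EventBridge rule can invoke the same Lambda. This is only a sketch: the rule name and schedule are illustrative, while `invokeInferStepFunction` is the function deployed by the stack:

```shell
# Illustrative only: run inference daily at 06:00 UTC via an EventBridge rule
AWS_REGION=$(aws configure get region)
ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)

aws events put-rule --region "${AWS_REGION}" \
  --name daily-churn-inference \
  --schedule-expression "cron(0 6 * * ? *)"

aws lambda add-permission --region "${AWS_REGION}" \
  --function-name invokeInferStepFunction \
  --statement-id daily-churn-inference \
  --action lambda:InvokeFunction \
  --principal events.amazonaws.com \
  --source-arn "arn:aws:events:${AWS_REGION}:${ACCOUNT_ID}:rule/daily-churn-inference"

aws events put-targets --region "${AWS_REGION}" \
  --rule daily-churn-inference \
  --targets "Id"="1","Arn"="arn:aws:lambda:${AWS_REGION}:${ACCOUNT_ID}:function:invokeInferStepFunction"
```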


## Clean up

`./delete_resources.sh`

This does not delete the S3 bucket. To delete the bucket and its contents, run the following:

```shell
source .env
accountnum=$(aws sts get-caller-identity --query Account --output text)
aws s3 rb s3://${S3_BUCKET_NAME}-${accountnum}-${REGION} --force
```
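
To confirm everything has been removed, you can wait for the stack deletion to finish and check that the bucket is gone; this assumes `delete_resources.sh` tears down a CloudFormation stack named after `STACK_NAME` in `.env`:

```shell
# Optional: verify the teardown completed
source .env
aws cloudformation wait stack-delete-complete \
  --region "${REGION}" --stack-name "${STACK_NAME}"
accountnum=$(aws sts get-caller-identity --query Account --output text)
aws s3 ls "s3://${S3_BUCKET_NAME}-${accountnum}-${REGION}" || echo "bucket removed"
```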

## [Read The Docs](https://awslabs.github.io/aws-customer-churn-pipeline/)
