Commit

Merge pull request #257 from CybercentreCanada/feature/CLDN-1932-update-cccs-main-to-2.0.1-add-cccs-2.0

Feature/cldn 1932 update cccs main to 2.0.1 add cccs 2.0
cccs-RyanK authored Feb 14, 2023
2 parents 7d2addf + 9f5e7eb commit 5dfb9b1
Showing 297 changed files with 15,032 additions and 17,668 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/superset-python-unittest.yml
Original file line number Diff line number Diff line change
@@ -37,7 +37,7 @@ jobs:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
# TODO: separated requiermentes.txt file just for unit tests
# TODO: separated requirements.txt file just for unit tests
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
551 changes: 551 additions & 0 deletions CHANGELOG.md

Large diffs are not rendered by default.

17 changes: 12 additions & 5 deletions UPDATING.md
@@ -22,7 +22,17 @@ under the License.
This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.

## Next
## 2.0.1

- [21895](https://github.com/apache/superset/pull/21895): Markdown components had their security increased by adhering to the same sanitization process enforced by GitHub. This means that some HTML elements found in markdowns are no longer allowed due to the security risks they pose. If you're deploying Superset in a trusted environment and wish to use some of the blocked elements, you can use the `HTML_SANITIZATION_SCHEMA_EXTENSIONS` configuration to extend the default sanitization schema. There's also the option to disable HTML sanitization entirely using the `HTML_SANITIZATION` configuration, but we do not recommend this approach because of the security risks. Given the provided configurations, we don't view the improved sanitization as a breaking change but as a security patch.
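
For example, a trusted deployment could re-allow specific tags and attributes through that key. The entries below are a sketch, not a recommendation — the exact keys shown (`attributes`, `tagNames`) follow the GitHub-style sanitization schema shape, and the allowed values are illustrative assumptions:

```python
# superset_config.py — illustrative extension of the default sanitization schema
HTML_SANITIZATION_SCHEMA_EXTENSIONS = {
    "attributes": {
        "*": ["style", "className"],  # allow style/className attributes on any tag
    },
    "tagNames": ["style"],  # allow <style> tags in markdown components
}
```

Only re-allow elements you have a concrete need for; every entry widens the XSS surface.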

### Breaking Changes

### Potential Downtime

### Other

## 2.0.0

- [19046](https://github.com/apache/superset/pull/19046): Enables the drag and drop interface in Explore control panel by default. Flips `ENABLE_EXPLORE_DRAG_AND_DROP` and `ENABLE_DND_WITH_CLICK_UX` feature flags to `True`.
- [18936](https://github.com/apache/superset/pull/18936): Removes legacy SIP-15 interim logic/flags—specifically the `SIP_15_ENABLED`, `SIP_15_GRACE_PERIOD_END`, `SIP_15_DEFAULT_TIME_RANGE_ENDPOINTS`, and `SIP_15_TOAST_MESSAGE` flags. Time range endpoints are no longer configurable and strictly adhere to the `[start, end)` paradigm, i.e., inclusive of the start and exclusive of the end. Additionally, this change removes the now obsolete `time_range_endpoints` from the form-data, resulting in the cache being busted.
@@ -46,10 +56,6 @@ assists people when migrating to a new version.
- [19017](https://github.com/apache/superset/pull/19017): Removes Python 3.7 support.
- [18970](https://github.com/apache/superset/pull/18970): The `DISABLE_LEGACY_DATASOURCE_EDITOR` feature flag is now `True` by default which disables the legacy datasource editor from being shown in the client.

### Potential Downtime

### Other

## 1.5.0

### Breaking Changes
@@ -82,6 +88,7 @@ assists people when migrating to a new version.
## 1.4.1

### Breaking Changes

- [17984](https://github.com/apache/superset/pull/17984): The default Flask SECRET_KEY has changed for security reasons. You should always override it with your own secret. Set `PREVIOUS_SECRET_KEY` (e.g., PREVIOUS_SECRET_KEY = "\2\1thisismyscretkey\1\2\\e\\y\\y\\h") to your previous key and use `superset re-encrypt-secrets` to rotate your current secrets
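
The rotation amounts to keeping both keys visible to Superset while it re-encrypts. A minimal sketch (both key values below are placeholders, not real secrets):

```python
# superset_config.py — key rotation sketch; both values are placeholders
PREVIOUS_SECRET_KEY = "CHANGE_ME_old_secret"  # the key the instance was running with
SECRET_KEY = "CHANGE_ME_new_strong_secret"    # the new key to rotate to
```

With this config in place, run `superset re-encrypt-secrets` once, after which `PREVIOUS_SECRET_KEY` can be removed.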

### Potential Downtime
7 changes: 3 additions & 4 deletions cccs-build/superset-base/azure-pipelines.yml
@@ -12,8 +12,7 @@ variables:
buildTimestamp: $[format('{0:yyyyMMddHHmmss}', pipeline.startTime)]
DOCKER_BUILDKIT: 1

pool:
vmImage: ubuntu-latest
pool: cauldron-vmss

steps:
- bash: |
@@ -32,7 +31,7 @@ steps:
- task: Docker@2
displayName: Build image
inputs:
command: 'build'
command: build
repository: $(imageRepository)
arguments: --progress=plain
dockerfile: $(dockerfile)
@@ -42,7 +41,7 @@
- task: Docker@2
displayName: Push image
inputs:
command: 'push'
command: push
repository: $(imageRepository)
tags: |
$(BRANCH_NAME)
3 changes: 1 addition & 2 deletions cccs-build/superset/Dockerfile
@@ -1,7 +1,7 @@
# Vault CA container import
ARG VAULT_CA_CONTAINER=uchimera.azurecr.io/cccs/hogwarts/vault-ca:master_2921_22315d60
FROM $VAULT_CA_CONTAINER AS vault_ca
FROM uchimera.azurecr.io/cccs/superset-base:cccs-1.5_20220720173232_b4325
FROM uchimera.azurecr.io/cccs/superset-base:feature_CLDN-1932-update-cccs-main-to-2.0.1-add-cccs-2.0_20230214125009_b6247

USER root

@@ -24,4 +24,3 @@ ENV BUILD_NUMBER_VAR=${BUILD_NUMBER}


USER superset

2 changes: 1 addition & 1 deletion cccs-build/superset/analytical-platform-requirements.txt
@@ -1 +1 @@
hogwarts-auth~=1.8.0.7820
hogwarts-auth~=1.8.0.7820
3 changes: 1 addition & 2 deletions cccs-build/superset/azure-pipelines.yml
@@ -11,8 +11,7 @@ variables:
buildTimestamp: $[format('{0:yyyyMMddHHmmss}', pipeline.startTime)]
DOCKER_BUILDKIT: 1

pool:
vmImage: ubuntu-latest
pool: cauldron-vmss

steps:
- bash: |
4 changes: 2 additions & 2 deletions cccs-build/superset/requirements.txt
@@ -5,8 +5,8 @@ authlib==0.15.4
python-json-logger==2.0.2

# Additional drivers for Superset (or SQLAlchemy) to use
trino>=0.313.0
trino==0.318.0
mysql-connector-python==8.0.26
elasticsearch-dbapi==0.2.4
cachetools~=5.0.0
typing-extensions<4,>=3.10
typing-extensions<4,>=3.10
10 changes: 10 additions & 0 deletions docker/pythonpath_dev/superset_config.py
@@ -69,6 +69,16 @@ def get_env_variable(var_name: str, default: Optional[str] = None) -> str:

RESULTS_BACKEND = FileSystemCache("/app/superset_home/sqllab")

CACHE_CONFIG = {
"CACHE_TYPE": "redis",
"CACHE_DEFAULT_TIMEOUT": 300,
"CACHE_KEY_PREFIX": "superset_",
"CACHE_REDIS_HOST": REDIS_HOST,
"CACHE_REDIS_PORT": REDIS_PORT,
"CACHE_REDIS_DB": REDIS_RESULTS_DB,
}
DATA_CACHE_CONFIG = CACHE_CONFIG


class CeleryConfig(object):
BROKER_URL = f"redis://{REDIS_HOST}:{REDIS_PORT}/{REDIS_CELERY_DB}"
2 changes: 2 additions & 0 deletions docker/run-server.sh
@@ -28,6 +28,8 @@ gunicorn \
--threads ${SERVER_THREADS_AMOUNT:-20} \
--timeout ${GUNICORN_TIMEOUT:-60} \
--keep-alive ${GUNICORN_KEEPALIVE:-2} \
--max-requests ${WORKER_MAX_REQUESTS:-0} \
--max-requests-jitter ${WORKER_MAX_REQUESTS_JITTER:-0} \
--limit-request-line ${SERVER_LIMIT_REQUEST_LINE:-0} \
--limit-request-field_size ${SERVER_LIMIT_REQUEST_FIELD_SIZE:-0} \
"${FLASK_APP}"
15 changes: 8 additions & 7 deletions docs/docs/contributing/local-backend.mdx
@@ -14,7 +14,7 @@ You also need to install MySQL or [MariaDB](https://mariadb.com/downloads).

Ensure that you are using Python version 3.8 or 3.9, then proceed with:

````bash
```bash
# Create a virtual environment and activate it (recommended)
python3 -m venv venv # setup a python3 virtualenv
source venv/bin/activate
@@ -47,18 +47,18 @@ Or you can install via our Makefile

```bash
# Create a virtual environment and activate it (recommended)
$ python3 -m venv venv # setup a python3 virtualenv
$ source venv/bin/activate
python3 -m venv venv # setup a python3 virtualenv
source venv/bin/activate

# install pip packages + pre-commit
$ make install
make install

# Install superset pip packages and setup env only
$ make superset
make superset

# Setup pre-commit only
$ make pre-commit
````
make pre-commit
```

**Note: the FLASK_APP env var should not need to be set, as it's currently controlled
via `.flaskenv`, however if needed, it should be set to `superset.app:create_app()`**
@@ -103,4 +103,5 @@ app.logger.info(form_data)
```

### Frontend Assets

See [Building Frontend Assets Locally](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#frontend)
1 change: 1 addition & 0 deletions docs/docs/installation/alerts-reports.mdx
@@ -126,6 +126,7 @@ SLACK_API_TOKEN = "xoxb-"
# Email configuration
SMTP_HOST = "smtp.sendgrid.net" #change to your host
SMTP_STARTTLS = True
SMTP_SSL_SERVER_AUTH = True # If you're using an SMTP server with a valid certificate
SMTP_SSL = False
SMTP_USER = "your_user"
SMTP_PORT = 2525 # your port eg. 587
@@ -129,7 +129,7 @@ Finish installing by running through the following commands:

```
# Create an admin user in your metadata database (use `admin` as username to be able to load the examples)
$ export FLASK_APP=superset
export FLASK_APP=superset
superset fab create-admin
# Load some data to play with
@@ -44,7 +44,7 @@ all of the required dependencies. Docker Desktop [recently added support for Win
following command:

```bash
$ git clone https://github.com/apache/superset.git
git clone https://github.com/apache/superset.git
```

Once that command completes successfully, you should see a new `superset` folder in your
@@ -55,24 +55,24 @@ current directory.
Navigate to the folder you created in step 1:

```bash
$ cd superset
cd superset
```

When working on master branch, run the following commands:

```bash
$ docker-compose -f docker-compose-non-dev.yml pull
$ docker-compose -f docker-compose-non-dev.yml up
docker-compose -f docker-compose-non-dev.yml pull
docker-compose -f docker-compose-non-dev.yml up
```

Alternatively, you can also run a specific version of Superset by first checking out
the branch/tag, and then starting `docker-compose` with the `TAG` variable.
For example, to run the 1.4.0 version, run the following commands:

```bash
% git checkout 1.4.0
$ TAG=1.4.0 docker-compose -f docker-compose-non-dev.yml pull
$ TAG=1.4.0 docker-compose -f docker-compose-non-dev.yml up
git checkout 1.4.0
TAG=1.4.0 docker-compose -f docker-compose-non-dev.yml pull
TAG=1.4.0 docker-compose -f docker-compose-non-dev.yml up
```

You should see a wall of logging output from the containers being launched on your machine. Once
2 changes: 1 addition & 1 deletion docs/docs/intro.mdx
@@ -13,7 +13,7 @@ geospatial charts.

Here are a **few different ways you can get started with Superset**:

- Download the [source from Apache Foundation's website](https://dist.apache.org/repos/dist/release/superset/1.4.1/)
- Download the [source from Apache Foundation's website](https://dist.apache.org/repos/dist/release/superset/)
- Download the latest Superset version from [Pypi here](https://pypi.org/project/apache-superset/)
- Setup Superset locally with one command
using [Docker Compose](installation/installing-superset-using-docker-compose)
22 changes: 22 additions & 0 deletions docs/docs/security.mdx
@@ -131,6 +131,28 @@ For example, the filters `client_id=4` and `client_id=5`, applied to a role,
will result in users of that role having `client_id=4` AND `client_id=5`
added to their query, which can never be true.
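
The contradiction can be sketched as follows (an illustrative helper only, not Superset code — the point is that role filters combine with AND):

```python
# Sketch: two contradictory role filters ANDed together hide every row.
def row_visible(row: dict) -> bool:
    # filters applied to the role: client_id=4 AND client_id=5
    return row["client_id"] == 4 and row["client_id"] == 5

print(row_visible({"client_id": 4}))  # False — the predicate can never be True
```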

### Content Security Policy (CSP)

[Content Security Policy (CSP)](https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP) is an added
layer of security that helps to detect and mitigate certain types of attacks, including
Cross-Site Scripting (XSS) and data injection attacks.

CSP makes it possible for server administrators to reduce or eliminate the vectors by which XSS can
occur by specifying the domains that the browser should consider to be valid sources of executable scripts.
A CSP compatible browser will then only execute scripts loaded in source files received from those allowed domains,
ignoring all other scripts (including inline scripts and event-handling HTML attributes).

A policy is described using a series of policy directives, each of which describes the policy for
a certain resource type or policy area. You can check possible directives
[here](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy).

It's extremely important to correctly configure a Content Security Policy when deploying Superset to
prevent many types of attacks. For that matter, Superset provides the `TALISMAN_CONFIG` key in `config.py`
where administrators can define the policy. When running in production mode, Superset will check for the presence
of a policy and, if it's not able to find one, it will issue a warning describing the security risks. For environments
where CSP policies are defined outside of Superset using other software, administrators can disable
the warning using the `CONTENT_SECURITY_POLICY_WARNING` key in `config.py`.
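
A starting-point policy might look like the sketch below. This is an illustration only — the directive values are assumptions that must be tuned to your deployment (chart plugins, mapping libraries, and embedded content each need their own sources), not a recommended production policy:

```python
# superset_config.py — illustrative CSP via flask-talisman (sketch only)
TALISMAN_CONFIG = {
    "content_security_policy": {
        "default-src": ["'self'"],
        "img-src": ["'self'", "data:"],  # data: URIs for inline images/thumbnails
        "style-src": ["'self'", "'unsafe-inline'"],
        "script-src": ["'self'"],
        "object-src": "'none'",
    },
    "force_https": False,  # enable if TLS is handled by Superset itself
}
```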

### Reporting Security Vulnerabilities

Apache Software Foundation takes a rigorous standpoint in annihilating the security issues in its
6 changes: 3 additions & 3 deletions requirements/base.txt
@@ -60,7 +60,7 @@ colorama==0.4.4
# via
# apache-superset
# flask-appbuilder
convertdate==2.3.2
convertdate==2.4.0
# via holidays
cron-descriptor==1.2.24
# via apache-superset
@@ -86,7 +86,7 @@ flask==2.0.3
# flask-migrate
# flask-sqlalchemy
# flask-wtf
flask-appbuilder==4.0.0
flask-appbuilder==4.1.3
# via apache-superset
flask-babel==1.0.0
# via flask-appbuilder
@@ -126,7 +126,7 @@ gunicorn==20.1.0
# via apache-superset
hashids==1.3.1
# via apache-superset
holidays==0.10.3
holidays==0.14.2
# via apache-superset
humanize==3.11.0
# via apache-superset
1 change: 0 additions & 1 deletion requirements/local.txt
@@ -13,4 +13,3 @@

# The following packages are considered to be unsafe in a requirements file:
# setuptools

2 changes: 1 addition & 1 deletion scripts/python_tests.sh
@@ -32,4 +32,4 @@ superset init

echo "Running tests"

pytest --durations-min=2 --maxfail=1 --cov-report= --cov=superset "$@"
pytest --durations-min=2 --maxfail=1 --cov-report= --cov=superset ./tests/integration_tests "$@"
6 changes: 3 additions & 3 deletions setup.py
@@ -77,7 +77,7 @@ def get_git_sha() -> str:
"cryptography>=3.3.2",
"deprecation>=2.1.0, <2.2.0",
"flask>=2.0.0, <3.0.0",
"flask-appbuilder>=4.0.0, <5.0.0",
"flask-appbuilder>=4.1.3, <5.0.0",
"flask-caching>=1.10.0",
"flask-compress",
"flask-talisman",
@@ -88,7 +88,7 @@ def get_git_sha() -> str:
"graphlib-backport",
"gunicorn>=20.1.0",
"hashids>=1.3.1, <2",
"holidays==0.10.3", # PINNED! https://github.com/dr-prodigy/python-holidays/issues/406
"holidays==0.14.2",
"humanize",
"isodate",
"markdown>=3.0",
@@ -150,7 +150,7 @@ def get_git_sha() -> str:
"impala": ["impyla>0.16.2, <0.17"],
"kusto": ["sqlalchemy-kusto>=1.0.1, <2"],
"kylin": ["kylinpy>=2.8.1, <2.9"],
"mmsql": ["pymssql>=2.1.4, <2.2"],
"mssql": ["pymssql>=2.1.4, <2.2"],
"mysql": ["mysqlclient>=2.1.0, <3"],
"oracle": ["cx-Oracle>8.0.0, <8.1"],
"pinot": ["pinotdb>=0.3.3, <0.4"],
20 changes: 15 additions & 5 deletions superset-frontend/README.md
@@ -1,32 +1,42 @@
To pull our own superset-ui packages from our Artifacts Feed when building locally, the steps below should be followed.

# How to build locally

## Setup credentials

### Step 1

Copy the code below to your [user .npmrc](https://docs.microsoft.com/en-us/azure/devops/artifacts/npm/npmrc?view=azure-devops).

```
; begin auth token
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/registry/:username=cccs-analytical-platform
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/registry/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
; begin auth token
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/registry/:username=cccs-analytical-platform
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/registry/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/registry/:email=npm requires email to be set but doesn't use the value
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/:username=cccs-analytical-platform
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/:username=cccs-analytical-platform
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/cccs-analytical-platform/99130e50-b4e3-4d7d-873e-2a947f564b87/_packaging/analytical-platform/npm/:email=npm requires email to be set but doesn't use the value
; end auth token
```

### Step 2

Generate a [Personal Access Token](https://dev.azure.com/cccs-analytical-platform/_usersSettings/tokens) with Packaging read & write scopes.

### Step 3

Base64 encode the personal access token from Step 2.
One safe and secure method of Base64 encoding a string is to:

1. From a command/shell prompt run:

```
node -e "require('readline') .createInterface({input:process.stdin,output:process.stdout,historySize:0}) .question('PAT> ',p => { b64=Buffer.from(p.trim()).toString('base64');console.log(b64);process.exit(); })"
```

2. Paste your personal access token value and press Enter/Return
3. Copy the Base64 encoded value
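
Alternatively, the same Base64 encoding can be done with Python's standard library (the PAT value below is a placeholder):

```python
import base64

pat = "PASTE_YOUR_PAT_HERE"  # placeholder personal access token
b64 = base64.b64encode(pat.encode("utf-8")).decode("ascii")
print(b64)
```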

### Step 4

Replace both [BASE64_ENCODED_PERSONAL_ACCESS_TOKEN] values in your user .npmrc file with your personal access token from Step 3.
@@ -46,7 +46,7 @@ describe('Dashboard edit markdown', () => {
cy.get('[data-test="dashboard-markdown-editor"]')
.should(
'have.text',
'✨Markdown✨Markdown✨MarkdownClick here to edit markdown',
'✨Markdown\n✨Markdown\n✨Markdown\n\nClick here to edit markdown',
)
.click();
