chore: remove deprecated apis on superset, testconn, validate_sql_json, schemas_access_for_file_upload #24354

Merged (9 commits) on Jun 13, 2023
4 changes: 0 additions & 4 deletions RESOURCES/STANDARD_ROLES.md
@@ -82,10 +82,8 @@
|can fetch datasource metadata on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can override role permissions on Superset|:heavy_check_mark:|O|O|O|
|can created dashboards on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can extra table metadata on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can csrf token on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can created slices on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can testconn on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can annotation json on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can add slices on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can fave dashboards on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
@@ -96,7 +94,6 @@
|can warm up cache on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can sqllab table viz on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|can profile on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can validate sql json on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can available domains on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can queries on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can stop query on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
@@ -202,7 +199,6 @@
|can edit on FilterSets|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can this form get on ColumnarToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can this form post on ColumnarToDatabaseView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can schemas access for file upload on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Upload a Columnar file|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can export on Chart|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can write on DashboardFilterStateRestApi|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
1 change: 1 addition & 0 deletions UPDATING.md
@@ -34,6 +34,7 @@ assists people when migrating to a new version.

### Breaking Changes

- [24354](https://github.com/apache/superset/pull/24354): Removed deprecated APIs `/superset/testconn`, `/superset/validate_sql_json/`, `/superset/schemas_access_for_file_upload`, `/superset/extra_table_metadata`
- [24381](https://github.com/apache/superset/pull/24381): Removed deprecated API `/superset/available_domains/`
- [24359](https://github.com/apache/superset/pull/24359): Removed deprecated APIs `/superset/estimate_query_cost/..`, `/superset/results/..`, `/superset/sql_json/..`, `/superset/csv/..`
- [24345](https://github.com/apache/superset/pull/24345) Converts `ENABLE_BROAD_ACTIVITY_ACCESS` and `MENU_HIDE_USER_INFO` into feature flags and changes the value of `ENABLE_BROAD_ACTIVITY_ACCESS` to `False` as it's more secure.
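
For quick reference, the endpoints removed here map onto the v1 routes named in the `@deprecated` decorators visible further down in this diff; `/superset/extra_table_metadata` does not appear in the visible hunks, so its target is omitted. A plain mapping, assuming the decorator targets are current:

```python
# Quick-reference mapping (illustrative, not part of the diff). The values are
# copied from the new_target arguments of the removed handlers' @deprecated
# decorators in superset/views/core.py below.
DEPRECATED_TO_V1 = {
    "/superset/testconn": "/api/v1/database/test_connection/",
    "/superset/validate_sql_json/": "/api/v1/database/<pk>/validate_sql/",
    "/superset/schemas_access_for_file_upload": (
        "api/v1/database/{pk}/schemas_access_for_file_upload/"
    ),
}
```
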
206 changes: 1 addition & 205 deletions superset/views/core.py
@@ -18,8 +18,6 @@
from __future__ import annotations

import logging
import re
from contextlib import closing
from datetime import datetime
from typing import Any, Callable, cast
from urllib import parse
@@ -36,7 +34,7 @@
from flask_appbuilder.security.sqla import models as ab_models
from flask_babel import gettext as __, lazy_gettext as _
from sqlalchemy import and_, or_
from sqlalchemy.exc import DBAPIError, NoSuchModuleError, SQLAlchemyError
from sqlalchemy.exc import SQLAlchemyError

from superset import (
app,
@@ -66,15 +64,12 @@
from superset.dashboards.dao import DashboardDAO
from superset.dashboards.permalink.commands.get import GetDashboardPermalinkCommand
from superset.dashboards.permalink.exceptions import DashboardPermalinkGetFailedError
from superset.databases.commands.exceptions import DatabaseInvalidError
from superset.databases.dao import DatabaseDAO
from superset.databases.utils import make_url_safe
from superset.datasets.commands.exceptions import DatasetNotFoundError
from superset.datasource.dao import DatasourceDAO
from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
from superset.exceptions import (
CacheLoadError,
CertificateException,
DatabaseNotFound,
SupersetCancelQueryException,
SupersetErrorException,
@@ -93,9 +88,7 @@
from superset.models.slice import Slice
from superset.models.sql_lab import Query, TabState
from superset.models.user_attributes import UserAttribute
from superset.security.analytics_db_safety import check_sqlalchemy_uri
from superset.sql_parse import ParsedQuery
from superset.sql_validators import get_validator_by_name
from superset.superset_typing import FlaskResponse
from superset.tasks.async_queries import load_explore_json_into_cache
from superset.utils import core as utils
@@ -985,90 +978,6 @@ def add_slices( # pylint: disable=no-self-use
session.close()
return "SLICES ADDED"

@api
@has_access_api
@event_logger.log_this
@expose(
"/testconn",
methods=(
"GET",
"POST",
),
) # pylint: disable=no-self-use
@deprecated(new_target="/api/v1/database/test_connection/")
def testconn(self) -> FlaskResponse:
"""Tests a sqla connection"""
db_name = request.json.get("name")
uri = request.json.get("uri")
try:
if app.config["PREVENT_UNSAFE_DB_CONNECTIONS"]:
check_sqlalchemy_uri(make_url_safe(uri))
# if the database already exists in the database, only its safe
# (password-masked) URI would be shown in the UI and would be passed in the
# form data so if the database already exists and the form was submitted
# with the safe URI, we assume we should retrieve the decrypted URI to test
# the connection.
if db_name:
existing_database = (
db.session.query(Database)
.filter_by(database_name=db_name)
.one_or_none()
)
if existing_database and uri == existing_database.safe_sqlalchemy_uri():
uri = existing_database.sqlalchemy_uri_decrypted

# This is the database instance that will be tested. Note the extra fields
# are represented as JSON encoded strings in the model.
database = Database(
server_cert=request.json.get("server_cert"),
extra=json.dumps(request.json.get("extra", {})),
impersonate_user=request.json.get("impersonate_user"),
encrypted_extra=json.dumps(request.json.get("encrypted_extra", {})),
)
database.set_sqlalchemy_uri(uri)
database.db_engine_spec.mutate_db_for_connection_test(database)

with database.get_sqla_engine_with_context() as engine:
with closing(engine.raw_connection()) as conn:
if engine.dialect.do_ping(conn):
return json_success('"OK"')

raise DBAPIError(None, None, None)
except CertificateException as ex:
logger.info("Certificate exception")
return json_error_response(ex.message)
except (NoSuchModuleError, ModuleNotFoundError):
logger.info("Invalid driver")
driver_name = make_url_safe(uri).drivername
return json_error_response(
_(
"Could not load database driver: %(driver_name)s",
driver_name=driver_name,
),
400,
)
except DatabaseInvalidError:
logger.info("Invalid URI")
return json_error_response(
_(
"Invalid connection string, a valid string usually follows:\n"
"'DRIVER://USER:PASSWORD@DB-HOST/DATABASE-NAME'"
)
)
except DBAPIError:
logger.warning("Connection failed")
return json_error_response(
_("Connection failed, please check your connection settings"), 400
)
except SupersetSecurityException as ex:
logger.warning("Stopped an unsafe database connection")
return json_error_response(_(str(ex)), 400)
except Exception as ex: # pylint: disable=broad-except
logger.warning("Unexpected error %s", type(ex).__name__)
return json_error_response(
_("Unexpected error occurred, please check your logs for details"), 400
)
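
For callers that still POST to `/superset/testconn`, a minimal migration sketch against the v1 route named in the `@deprecated` decorator above. The payload keys mirror the removed form fields and are assumptions about the v1 request schema; the base URL and authenticated session are placeholders:

```python
# Hedged migration sketch for the removed /superset/testconn endpoint.
# Field names below are assumptions mirroring the old form data.
import requests

BASE_URL = "http://localhost:8088"  # assumed local Superset instance

def test_connection(session: requests.Session, uri: str, name: str) -> bool:
    """POST connection details to the v1 endpoint and report whether the ping succeeded."""
    resp = session.post(
        f"{BASE_URL}/api/v1/database/test_connection/",
        json={"database_name": name, "sqlalchemy_uri": uri, "impersonate_user": False},
    )
    return resp.ok

# `session` is expected to already carry a valid Authorization header and CSRF token.
```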

@staticmethod
def get_user_activity_access_error(user_id: int) -> FlaskResponse | None:
try:
@@ -1695,84 +1604,6 @@ def stop_query(self) -> FlaskResponse:

return self.json_response("OK")

@has_access_api
@event_logger.log_this
@expose(
"/validate_sql_json/",
methods=(
"GET",
"POST",
),
)
@deprecated(new_target="/api/v1/database/<pk>/validate_sql/")
def validate_sql_json(
# pylint: disable=too-many-locals,no-self-use
self,
) -> FlaskResponse:
"""Validates that arbitrary sql is acceptable for the given database.
Returns a list of error/warning annotations as json.
"""
sql = request.form["sql"]
database_id = request.form["database_id"]
schema = request.form.get("schema") or None
template_params = json.loads(request.form.get("templateParams") or "{}")

if template_params is not None and len(template_params) > 0:
# TODO: factor the Database object out of template rendering
# or provide it as mydb so we can render template params
# without having to also persist a Query ORM object.
return json_error_response(
"SQL validation does not support template parameters", status=400
)

session = db.session()
mydb = session.query(Database).filter_by(id=database_id).one_or_none()
if not mydb:
return json_error_response(
f"Database with id {database_id} is missing.", status=400
)

spec = mydb.db_engine_spec
validators_by_engine = app.config["SQL_VALIDATORS_BY_ENGINE"]
if not validators_by_engine or spec.engine not in validators_by_engine:
return json_error_response(
f"no SQL validator is configured for {spec.engine}", status=400
)
validator_name = validators_by_engine[spec.engine]
validator = get_validator_by_name(validator_name)
if not validator:
return json_error_response(
"No validator named {} found (configured for the {} engine)".format(
validator_name, spec.engine
)
)

try:
timeout = config["SQLLAB_VALIDATION_TIMEOUT"]
timeout_msg = f"The query exceeded the {timeout} seconds timeout."
with utils.timeout(seconds=timeout, error_message=timeout_msg):
errors = validator.validate(sql, schema, mydb)
payload = json.dumps(
[err.to_dict() for err in errors],
default=utils.pessimistic_json_iso_dttm_ser,
ignore_nan=True,
encoding=None,
)
return json_success(payload)
except Exception as ex: # pylint: disable=broad-except
logger.exception(ex)
msg = _(
"%(validator)s was unable to check your query.\n"
"Please recheck your query.\n"
"Exception: %(ex)s",
validator=validator.name,
ex=ex,
)
# Return as a 400 if the database error message says we got a 4xx error
if re.search(r"([\W]|^)4\d{2}([\W]|$)", str(ex)):
return json_error_response(f"{msg}", status=400)
return json_error_response(f"{msg}")
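
Similarly, SQL validation moves to the per-database route from the `@deprecated` decorator above. A hedged sketch; the JSON body keys mirror the removed form fields (`sql`, `schema`) and the `result` response key is an assumption:

```python
# Hedged migration sketch for the removed /superset/validate_sql_json/ endpoint.
from typing import Any, Optional

import requests

def validate_sql(
    session: requests.Session,
    base_url: str,
    database_id: int,
    sql: str,
    schema: Optional[str] = None,
) -> list[dict[str, Any]]:
    """Return the validator annotations for `sql` against the given database."""
    resp = session.post(
        f"{base_url}/api/v1/database/{database_id}/validate_sql/",
        json={"sql": sql, "schema": schema},
    )
    resp.raise_for_status()
    # Each annotation is expected to look like the to_dict() payloads serialized above.
    return resp.json().get("result", [])
```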

@api
@handle_api_exception
@has_access
@@ -2042,38 +1873,3 @@ def sqllab(self) -> FlaskResponse:
@event_logger.log_this
def sqllab_history(self) -> FlaskResponse:
return super().render_app_template()

@api
@has_access_api
@event_logger.log_this
@expose("/schemas_access_for_file_upload")
@deprecated(new_target="api/v1/database/{pk}/schemas_access_for_file_upload/")
def schemas_access_for_file_upload(self) -> FlaskResponse:
"""
This method exposes an API endpoint to
get the schema access control settings for file upload in this database
"""
if not request.args.get("db_id"):
return json_error_response("No database is allowed for your file upload")

db_id = int(request.args["db_id"])
database = db.session.query(Database).filter_by(id=db_id).one()
try:
schemas_allowed = database.get_schema_access_for_file_upload()
if security_manager.can_access_database(database):
return self.json_response(schemas_allowed)
# the list schemas_allowed should not be empty here
# and the list schemas_allowed_processed returned from security_manager
# should not be empty either,
# otherwise the database should have been filtered out
# in CsvToDatabaseForm
schemas_allowed_processed = security_manager.get_schemas_accessible_by_user(
database, schemas_allowed, False
)
return self.json_response(schemas_allowed_processed)
except Exception as ex: # pylint: disable=broad-except
logger.exception(ex)
return json_error_response(
"Failed to fetch schemas allowed for csv upload in this database! "
"Please contact your Superset Admin!"
)
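
The schema lookup for file uploads follows the same pattern: the `db_id` query argument becomes a path parameter on the v1 route from the `@deprecated` decorator above. A hedged sketch; the `schemas` response key is an assumption about the v1 payload shape:

```python
# Hedged migration sketch for the removed /superset/schemas_access_for_file_upload endpoint.
import requests

def schemas_for_file_upload(
    session: requests.Session, base_url: str, database_id: int
) -> list[str]:
    """Fetch the schemas the current user may target for file uploads on this database."""
    resp = session.get(
        f"{base_url}/api/v1/database/{database_id}/schemas_access_for_file_upload/"
    )
    resp.raise_for_status()
    return resp.json().get("schemas", [])
```
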
27 changes: 0 additions & 27 deletions tests/integration_tests/base_tests.py
@@ -398,33 +398,6 @@ def delete_fake_db_for_macros():
db.session.delete(database)
db.session.commit()

def validate_sql(
self,
sql,
client_id=None,
username=None,
raise_on_error=False,
database_name="examples",
template_params=None,
):
if username:
self.logout()
self.login(username=username)
dbid = SupersetTestCase.get_database_by_name(database_name).id
resp = self.get_json_resp(
"/superset/validate_sql_json/",
raise_on_error=False,
data=dict(
database_id=dbid,
sql=sql,
client_id=client_id,
templateParams=template_params,
),
)
if raise_on_error and "error" in resp:
raise Exception("validate_sql failed")
return resp
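
Tests that relied on the removed helper can issue the same check through the v1 route with the Flask test client. A hedged sketch of an equivalent method on `SupersetTestCase`; the body keys and the shape of the parsed response are assumptions mirroring the helper above:

```python
# Hedged sketch of a v1-based replacement for the removed validate_sql helper.
def validate_sql_v1(self, sql, database_name="examples", schema=None):
    """POST to /api/v1/database/<pk>/validate_sql/ and return the parsed JSON body."""
    dbid = SupersetTestCase.get_database_by_name(database_name).id
    resp = self.client.post(
        f"/api/v1/database/{dbid}/validate_sql/",
        json={"sql": sql, "schema": schema},
    )
    return resp.json
```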

def get_dash_by_slug(self, dash_slug):
sesh = db.session()
return sesh.query(Dashboard).filter_by(slug=dash_slug).first()