
Commit

fix: failing sphinx tests
mufiAmazon committed Jan 10, 2024
1 parent a25643c commit cf80af5
Showing 3 changed files with 13 additions and 16 deletions.
2 changes: 1 addition & 1 deletion doc/conf.py
@@ -94,7 +94,7 @@
 }

 # Example configuration for intersphinx: refer to the Python standard library.
-intersphinx_mapping = {"http://docs.python.org/": None}
+intersphinx_mapping = {"python": ("http://docs.python.org/", None)}

 # -- Options for autodoc ----------------------------------------------------
 # https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#configuration
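For reference, the named form that current Sphinx releases expect: each key is a project label, and each value is a (base URL, inventory) pair, where None means the default objects.inv published under that URL. A minimal sketch; only the "python" entry itself comes from this commit:

# doc/conf.py -- named intersphinx mapping: label -> (base URL, inventory);
# None tells Sphinx to fetch the default objects.inv from the base URL.
intersphinx_mapping = {
    "python": ("http://docs.python.org/", None),
}

With that entry in place, cross-references such as :py:class:`dict` resolve against the inventory of the Python standard library docs.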
@@ -45,8 +45,8 @@ def feature_processor(
 If the decorated function is executed without arguments then the decorated function's arguments
 are automatically loaded from the input data sources. Outputs are ingested to the output Feature
-Group. If arguments are provided to this function, then arguments are not automatically loaded
-(for testing).
+Group. If arguments are provided to this function, then arguments are not automatically
+loaded (for testing).

 Decorated functions must conform to the expected signature. Parameters: one parameter of type
 pyspark.sql.DataFrame for each DataSource in 'inputs'; followed by the optional parameters with
@@ -96,7 +96,6 @@ def transform(input_feature_group, input_csv):
 development phase to ensure that data is not used until the function is ready. It also
 useful for users that want to manage their own data ingestion. Defaults to True.
 spark_config (Dict[str, str]): A dict contains the key-value paris for Spark configurations.

 Raises:
     IngestionError: If any rows are not ingested successfully then a sample of the records,
         with failure reasons, is logged.
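To make the documented calling convention concrete, a sketch of a decorated function, assuming the public sagemaker.feature_store.feature_processor API; the feature group names, S3 URI, output ARN, and join key below are placeholders, not part of this commit:

from sagemaker.feature_store.feature_processor import (
    CSVDataSource,
    FeatureGroupDataSource,
    feature_processor,
)

@feature_processor(
    inputs=[
        FeatureGroupDataSource("my-input-feature-group"),  # placeholder name
        CSVDataSource("s3://my-bucket/raw/data.csv"),      # placeholder URI
    ],
    output="arn:aws:sagemaker:us-east-1:111122223333:feature-group/my-output-fg",
)
def transform(input_feature_group, input_csv):
    # One pyspark.sql.DataFrame parameter per entry in `inputs`, in order.
    return input_feature_group.join(input_csv, "record_id")  # placeholder key

# transform() with no arguments auto-loads both inputs and ingests the result
# into the output Feature Group; transform(df_a, df_b) skips the auto-load,
# which is the "(for testing)" path described in the docstring.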
22 changes: 10 additions & 12 deletions src/sagemaker/session.py
@@ -4565,20 +4565,18 @@ def update_inference_component(
 Args:
     inference_component_name (str): Name of the Amazon SageMaker ``InferenceComponent``.
     specification ([dict[str,int]]): Resource configuration. Optional.
-            Example: {
-                    "MinMemoryRequiredInMb": 1024,
-                    "NumberOfCpuCoresRequired": 1,
-                    "NumberOfAcceleratorDevicesRequired": 1,
-                    "MaxMemoryRequiredInMb": 4096,
-                },
+        Example: {
+            "MinMemoryRequiredInMb": 1024,
+            "NumberOfCpuCoresRequired": 1,
+            "NumberOfAcceleratorDevicesRequired": 1,
+            "MaxMemoryRequiredInMb": 4096,
+        },
     runtime_config ([dict[str,int]]): Number of copies. Optional.
-            Default: {
-                "copyCount": 1
-            }
+        Default: {
+            "copyCount": 1
+        }
     wait: Wait for inference component to be created before return. Optional. Default is
-            True.
+        True.
 Return:
     str: inference component name
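A usage sketch that mirrors the docstring above; the component name is a placeholder, and the "copyCount" casing is taken verbatim from the docstring:

import sagemaker

session = sagemaker.Session()
component_name = session.update_inference_component(
    inference_component_name="my-inference-component",  # placeholder
    specification={
        "MinMemoryRequiredInMb": 1024,
        "NumberOfCpuCoresRequired": 1,
        "NumberOfAcceleratorDevicesRequired": 1,
        "MaxMemoryRequiredInMb": 4096,
    },
    runtime_config={"copyCount": 1},
    wait=True,  # block until the update completes before returning
)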

