[AutoPR] datafactory/resource-manager #5747

Status: Closed (wants to merge 21 commits)

Commits (21):
5eb3fc7
[AutoPR datafactory/resource-manager] [Datafactory] ADLS Gen 2 suppor…
AutorestCI Jun 7, 2019
205aa23
[AutoPR datafactory/resource-manager] Add Dataset and CopySource for …
AutorestCI Jun 13, 2019
9dfd055
[AutoPR datafactory/resource-manager] (Public swagger update) Add Ter…
AutorestCI Jun 20, 2019
07b543f
[AutoPR datafactory/resource-manager] fix public swagger issues (#5985)
AutorestCI Jun 26, 2019
74b227d
[AutoPR datafactory/resource-manager] [Datafactory] Add three new con…
AutorestCI Jul 11, 2019
30a95dc
Packaging update of azure-mgmt-datafactory
AutorestCI Jul 11, 2019
08c3824
[AutoPR datafactory/resource-manager] [Datafactory] Add three new con…
AutorestCI Jul 17, 2019
6cf80af
[AutoPR datafactory/resource-manager] SSIS File System Support (#6216)
AutorestCI Jul 17, 2019
9003893
[AutoPR datafactory/resource-manager] Introduce ADX Command (#6404)
AutorestCI Jul 23, 2019
b46566a
[AutoPR datafactory/resource-manager] fix: datafactory character enco…
AutorestCI Jul 23, 2019
dfa7430
Generated from 6daaa9ba96f917b57001720be038e62850d1ccbc (#6471)
AutorestCI Jul 25, 2019
aef9b6b
Generated from 04df2c4ad1350ec47a500e1a1d1a609d43398aee (#6505)
AutorestCI Jul 29, 2019
9379260
[AutoPR datafactory/resource-manager] [DataFactory]SapBwCube and Syba…
AutorestCI Jul 29, 2019
f5d3db0
[AutoPR datafactory/resource-manager] Enable Avro Dataset in public s…
AutorestCI Jul 31, 2019
64a07f2
Generated from ccc8c92e96ab27329cf637c7214ebb35da8dce23 (#6625)
AutorestCI Aug 2, 2019
0c65fd1
updated release notes
Aug 6, 2019
326a827
fixed duplicate row
Aug 6, 2019
9f78b50
breaking changes
Aug 6, 2019
6e95bc5
Generated from 65a2679abd2e6a4aa56f0d4e5ef459407f105ae6 (#6774)
AutorestCI Aug 14, 2019
e41dd17
Generated from d22072afd73683450b42a2d626e10013330ab31b (#6795)
AutorestCI Aug 14, 2019
214041b
Generated from 6ca38e062bb3184e7207e058d4aa05656e9a755f (#6800)
AutorestCI Aug 15, 2019

Files changed:

150 changes: 150 additions & 0 deletions sdk/datafactory/azure-mgmt-datafactory/HISTORY.rst
@@ -3,6 +3,156 @@
Release History
===============

0.8.0 (2019-08-07)
++++++++++++++++++

Review comments on this entry:

- Contributor: Given there are SDK breaking changes in this release, shouldn't the major version change?
- Reply: Not necessarily (https://github.com/Azure/azure-sdk-for-python/blob/master/doc/dev/mgmt/mgmt_release.md - please check the bottom of the page). 0.7.0 also had breaking changes. Unless you want to specifically change to 1.0.0.
- Contributor: It's fine, let's release with 0.8.0. Thanks!

**Features**

- Model HubspotSource has a new parameter max_concurrent_connections
- Model CouchbaseSource has a new parameter max_concurrent_connections
- Model HttpSource has a new parameter max_concurrent_connections
- Model AzureDataLakeStoreSource has a new parameter max_concurrent_connections
- Model ConcurSource has a new parameter max_concurrent_connections
- Model FileShareDataset has a new parameter modified_datetime_start
- Model FileShareDataset has a new parameter modified_datetime_end
- Model SalesforceSource has a new parameter max_concurrent_connections
- Model NetezzaSource has a new parameter partition_option
- Model NetezzaSource has a new parameter max_concurrent_connections
- Model NetezzaSource has a new parameter partition_settings
- Model AzureMySqlSource has a new parameter max_concurrent_connections
- Model OdbcSink has a new parameter max_concurrent_connections
- Model ImpalaObjectDataset has a new parameter impala_object_dataset_schema
- Model ImpalaObjectDataset has a new parameter table
- Model AzureSqlDWTableDataset has a new parameter azure_sql_dw_table_dataset_schema
- Model AzureSqlDWTableDataset has a new parameter table
- Model SapEccSource has a new parameter max_concurrent_connections
- Model CopySource has a new parameter max_concurrent_connections
- Model ServiceNowSource has a new parameter max_concurrent_connections
- Model Trigger has a new parameter annotations
- Model CassandraSource has a new parameter max_concurrent_connections
- Model AzureQueueSink has a new parameter max_concurrent_connections
- Model DrillSource has a new parameter max_concurrent_connections
- Model DocumentDbCollectionSink has a new parameter write_behavior
- Model DocumentDbCollectionSink has a new parameter max_concurrent_connections
- Model SapHanaLinkedService has a new parameter connection_string
- Model SalesforceSink has a new parameter max_concurrent_connections
- Model HiveObjectDataset has a new parameter hive_object_dataset_schema
- Model HiveObjectDataset has a new parameter table
- Model GoogleBigQueryObjectDataset has a new parameter dataset
- Model GoogleBigQueryObjectDataset has a new parameter table
- Model FileSystemSource has a new parameter max_concurrent_connections
- Model SqlSink has a new parameter stored_procedure_table_type_parameter_name
- Model SqlSink has a new parameter max_concurrent_connections
- Model CopySink has a new parameter max_concurrent_connections
- Model SapCloudForCustomerSource has a new parameter max_concurrent_connections
- Model CopyActivity has a new parameter preserve_rules
- Model CopyActivity has a new parameter preserve
- Model AmazonMWSSource has a new parameter max_concurrent_connections
- Model SqlDWSink has a new parameter max_concurrent_connections
- Model MagentoSource has a new parameter max_concurrent_connections
- Model BlobEventsTrigger has a new parameter annotations
- Model DynamicsSink has a new parameter max_concurrent_connections
- Model AzurePostgreSqlTableDataset has a new parameter table
- Model AzurePostgreSqlTableDataset has a new parameter azure_postgre_sql_table_dataset_schema
- Model SqlServerTableDataset has a new parameter sql_server_table_dataset_schema
- Model SqlServerTableDataset has a new parameter table
- Model DocumentDbCollectionSource has a new parameter max_concurrent_connections
- Model AzurePostgreSqlSource has a new parameter max_concurrent_connections
- Model BlobSource has a new parameter max_concurrent_connections
- Model VerticaTableDataset has a new parameter vertica_table_dataset_schema
- Model VerticaTableDataset has a new parameter table
- Model PhoenixObjectDataset has a new parameter phoenix_object_dataset_schema
- Model PhoenixObjectDataset has a new parameter table
- Model AzureSearchIndexSink has a new parameter max_concurrent_connections
- Model MarketoSource has a new parameter max_concurrent_connections
- Model DynamicsSource has a new parameter max_concurrent_connections
- Model SparkObjectDataset has a new parameter spark_object_dataset_schema
- Model SparkObjectDataset has a new parameter table
- Model XeroSource has a new parameter max_concurrent_connections
- Model AmazonRedshiftSource has a new parameter max_concurrent_connections
- Model CustomActivity has a new parameter retention_time_in_days
- Model WebSource has a new parameter max_concurrent_connections
- Model GreenplumTableDataset has a new parameter greenplum_table_dataset_schema
- Model GreenplumTableDataset has a new parameter table
- Model SalesforceMarketingCloudSource has a new parameter max_concurrent_connections
- Model GoogleBigQuerySource has a new parameter max_concurrent_connections
- Model JiraSource has a new parameter max_concurrent_connections
- Model MongoDbSource has a new parameter max_concurrent_connections
- Model DrillTableDataset has a new parameter drill_table_dataset_schema
- Model DrillTableDataset has a new parameter table
- Model ExecuteSSISPackageActivity has a new parameter log_location
- Model SparkSource has a new parameter max_concurrent_connections
- Model AzureTableSink has a new parameter max_concurrent_connections
- Model AzureDataLakeStoreSink has a new parameter enable_adls_single_file_parallel
- Model AzureDataLakeStoreSink has a new parameter max_concurrent_connections
- Model PrestoSource has a new parameter max_concurrent_connections
- Model RelationalSource has a new parameter max_concurrent_connections
- Model TumblingWindowTrigger has a new parameter annotations
- Model ImpalaSource has a new parameter max_concurrent_connections
- Model ScheduleTrigger has a new parameter annotations
- Model QuickBooksSource has a new parameter max_concurrent_connections
- Model PrestoObjectDataset has a new parameter presto_object_dataset_schema
- Model PrestoObjectDataset has a new parameter table
- Model OracleSink has a new parameter max_concurrent_connections
- Model HdfsSource has a new parameter max_concurrent_connections
- Model PhoenixSource has a new parameter max_concurrent_connections
- Model SapCloudForCustomerSink has a new parameter max_concurrent_connections
- Model SquareSource has a new parameter max_concurrent_connections
- Model OracleSource has a new parameter partition_option
- Model OracleSource has a new parameter max_concurrent_connections
- Model OracleSource has a new parameter partition_settings
- Model BlobTrigger has a new parameter annotations
- Model HDInsightOnDemandLinkedService has a new parameter virtual_network_id
- Model HDInsightOnDemandLinkedService has a new parameter subnet_name
- Model AmazonS3LinkedService has a new parameter service_url
- Model HDInsightLinkedService has a new parameter file_system
- Model MultiplePipelineTrigger has a new parameter annotations
- Model HBaseSource has a new parameter max_concurrent_connections
- Model OracleTableDataset has a new parameter oracle_table_dataset_schema
- Model OracleTableDataset has a new parameter table
- Model RerunTumblingWindowTrigger has a new parameter annotations
- Model EloquaSource has a new parameter max_concurrent_connections
- Model AzureSqlTableDataset has a new parameter azure_sql_table_dataset_schema
- Model AzureSqlTableDataset has a new parameter table
- Model BlobSink has a new parameter max_concurrent_connections
- Model HiveSource has a new parameter max_concurrent_connections
- Model SqlSource has a new parameter max_concurrent_connections
- Model PaypalSource has a new parameter max_concurrent_connections
- Model AzureBlobDataset has a new parameter modified_datetime_start
- Model AzureBlobDataset has a new parameter modified_datetime_end
- Model VerticaSource has a new parameter max_concurrent_connections
- Model AmazonS3Dataset has a new parameter modified_datetime_start
- Model AmazonS3Dataset has a new parameter modified_datetime_end
- Model PipelineRun has a new parameter run_group_id
- Model PipelineRun has a new parameter is_latest
- Model ShopifySource has a new parameter max_concurrent_connections
- Model MariaDBSource has a new parameter max_concurrent_connections
- Model TeradataLinkedService has a new parameter connection_string
- Model ODataLinkedService has a new parameter service_principal_embedded_cert
- Model ODataLinkedService has a new parameter aad_service_principal_credential_type
- Model ODataLinkedService has a new parameter service_principal_key
- Model ODataLinkedService has a new parameter service_principal_id
- Model ODataLinkedService has a new parameter aad_resource_id
- Model ODataLinkedService has a new parameter service_principal_embedded_cert_password
- Model ODataLinkedService has a new parameter tenant
- Model AzureTableSource has a new parameter max_concurrent_connections
- Model IntegrationRuntimeSsisProperties has a new parameter data_proxy_properties
- Model ZohoSource has a new parameter max_concurrent_connections
- Model ResponsysSource has a new parameter max_concurrent_connections
- Model FileSystemSink has a new parameter max_concurrent_connections
- Model SqlDWSource has a new parameter max_concurrent_connections
- Model GreenplumSource has a new parameter max_concurrent_connections
- Model AzureDatabricksLinkedService has a new parameter new_cluster_init_scripts
- Model AzureDatabricksLinkedService has a new parameter new_cluster_driver_node_type
- Model AzureDatabricksLinkedService has a new parameter new_cluster_enable_elastic_disk
- Added operation TriggerRunsOperations.rerun
- Added operation ExposureControlOperations.get_feature_value_by_factory

**Breaking changes**

- Operation PipelinesOperations.create_run has a new signature
(Review comment from a Contributor: "Duplicate rows")
- Model SSISPackageLocation has a new signature

0.7.0 (2019-01-31)
++++++++++++++++++

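To make the 0.8.0 entries above concrete, here is a minimal usage sketch (not part of this PR) of two of the new parameters: max_concurrent_connections on copy sources and modified_datetime_start / modified_datetime_end on AzureBlobDataset. It assumes the usual keyword-argument constructors of the generated msrest models; the values shown (folder_path, the linked service name, the timestamps) are illustrative assumptions.

from azure.mgmt.datafactory.models import (
    AzureBlobDataset,
    BlobSource,
    LinkedServiceReference,
)

# New in 0.8.0: copy sources such as BlobSource accept max_concurrent_connections.
source = BlobSource(max_concurrent_connections=4)

# New in 0.8.0: AzureBlobDataset accepts modified_datetime_start / modified_datetime_end.
dataset = AzureBlobDataset(
    linked_service_name=LinkedServiceReference(reference_name="MyBlobStorage"),
    folder_path="input",
    modified_datetime_start="2019-08-01T00:00:00Z",
    modified_datetime_end="2019-08-07T00:00:00Z",
)
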
1 change: 1 addition & 0 deletions sdk/datafactory/azure-mgmt-datafactory/MANIFEST.in
@@ -1,3 +1,4 @@
recursive-include tests *.py *.yaml
include *.rst
include azure/__init__.py
include azure/mgmt/__init__.py
2 changes: 1 addition & 1 deletion sdk/datafactory/azure-mgmt-datafactory/README.rst
@@ -6,7 +6,7 @@ This is the Microsoft Azure Data Factory Management Client Library.
Azure Resource Manager (ARM) is the next generation of management APIs that
replace the old Azure Service Management (ASM).

This package has been tested with Python 2.7, 3.4, 3.5, 3.6 and 3.7.
This package has been tested with Python 2.7, 3.5, 3.6 and 3.7.

For the older Azure Service Management (ASM) libraries, see
`azure-servicemanagement-legacy <https://pypi.python.org/pypi/azure-servicemanagement-legacy>`__ library.
@@ -9,10 +9,11 @@
# regenerated.
# --------------------------------------------------------------------------

from .data_factory_management_client import DataFactoryManagementClient
from .version import VERSION
from ._configuration import DataFactoryManagementClientConfiguration
from ._data_factory_management_client import DataFactoryManagementClient
__all__ = ['DataFactoryManagementClient', 'DataFactoryManagementClientConfiguration']

__all__ = ['DataFactoryManagementClient']
from .version import VERSION

__version__ = VERSION
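
For reference (not part of the diff), the public import path is unchanged by this restructuring; both names in __all__ above remain importable from the package root:

from azure.mgmt.datafactory import (
    DataFactoryManagementClient,
    DataFactoryManagementClientConfiguration,
)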

@@ -0,0 +1,48 @@
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
from msrestazure import AzureConfiguration

from .version import VERSION


class DataFactoryManagementClientConfiguration(AzureConfiguration):
"""Configuration for DataFactoryManagementClient
Note that all parameters used to create this instance are saved as instance
attributes.

:param credentials: Credentials needed for the client to connect to Azure.
:type credentials: :mod:`A msrestazure Credentials
object<msrestazure.azure_active_directory>`
:param subscription_id: The subscription identifier.
:type subscription_id: str
:param str base_url: Service URL
"""

def __init__(
self, credentials, subscription_id, base_url=None):

if credentials is None:
raise ValueError("Parameter 'credentials' must not be None.")
if subscription_id is None:
raise ValueError("Parameter 'subscription_id' must not be None.")
if not base_url:
base_url = 'https://management.azure.com'

super(DataFactoryManagementClientConfiguration, self).__init__(base_url)

# Starting Autorest.Python 4.0.64, make connection pool activated by default
self.keep_alive = True

self.add_user_agent('azure-mgmt-datafactory/{}'.format(VERSION))
self.add_user_agent('Azure-SDK-For-Python')

self.credentials = credentials
self.subscription_id = subscription_id
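
The new configuration module above carries the settings previously defined inside the client module. A minimal sketch of how it is consumed follows (illustrative, not from this PR; ServicePrincipalCredentials and the environment-variable names are assumptions):

import os
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datafactory import DataFactoryManagementClient

credentials = ServicePrincipalCredentials(
    client_id=os.environ["AZURE_CLIENT_ID"],
    secret=os.environ["AZURE_CLIENT_SECRET"],
    tenant=os.environ["AZURE_TENANT_ID"],
)

# The client builds a DataFactoryManagementClientConfiguration internally;
# base_url defaults to https://management.azure.com, and keep_alive is now True.
client = DataFactoryManagementClient(credentials, os.environ["AZURE_SUBSCRIPTION_ID"])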
@@ -11,55 +11,23 @@

from msrest.service_client import SDKClient
from msrest import Serializer, Deserializer
from msrestazure import AzureConfiguration
from .version import VERSION
from .operations.operations import Operations
from .operations.factories_operations import FactoriesOperations
from .operations.exposure_control_operations import ExposureControlOperations
from .operations.integration_runtimes_operations import IntegrationRuntimesOperations
from .operations.integration_runtime_object_metadata_operations import IntegrationRuntimeObjectMetadataOperations
from .operations.integration_runtime_nodes_operations import IntegrationRuntimeNodesOperations
from .operations.linked_services_operations import LinkedServicesOperations
from .operations.datasets_operations import DatasetsOperations
from .operations.pipelines_operations import PipelinesOperations
from .operations.pipeline_runs_operations import PipelineRunsOperations
from .operations.activity_runs_operations import ActivityRunsOperations
from .operations.triggers_operations import TriggersOperations
from .operations.rerun_triggers_operations import RerunTriggersOperations
from .operations.trigger_runs_operations import TriggerRunsOperations
from . import models


class DataFactoryManagementClientConfiguration(AzureConfiguration):
"""Configuration for DataFactoryManagementClient
Note that all parameters used to create this instance are saved as instance
attributes.

:param credentials: Credentials needed for the client to connect to Azure.
:type credentials: :mod:`A msrestazure Credentials
object<msrestazure.azure_active_directory>`
:param subscription_id: The subscription identifier.
:type subscription_id: str
:param str base_url: Service URL
"""

def __init__(
self, credentials, subscription_id, base_url=None):

if credentials is None:
raise ValueError("Parameter 'credentials' must not be None.")
if subscription_id is None:
raise ValueError("Parameter 'subscription_id' must not be None.")
if not base_url:
base_url = 'https://management.azure.com'

super(DataFactoryManagementClientConfiguration, self).__init__(base_url)

self.add_user_agent('azure-mgmt-datafactory/{}'.format(VERSION))
self.add_user_agent('Azure-SDK-For-Python')

self.credentials = credentials
self.subscription_id = subscription_id
from ._configuration import DataFactoryManagementClientConfiguration
from .operations import Operations
from .operations import FactoriesOperations
from .operations import ExposureControlOperations
from .operations import IntegrationRuntimesOperations
from .operations import IntegrationRuntimeObjectMetadataOperations
from .operations import IntegrationRuntimeNodesOperations
from .operations import LinkedServicesOperations
from .operations import DatasetsOperations
from .operations import PipelinesOperations
from .operations import PipelineRunsOperations
from .operations import ActivityRunsOperations
from .operations import TriggersOperations
from .operations import TriggerRunsOperations
from .operations import RerunTriggersOperations
from . import models


class DataFactoryManagementClient(SDKClient):
@@ -92,10 +60,10 @@ class DataFactoryManagementClient(SDKClient):
:vartype activity_runs: azure.mgmt.datafactory.operations.ActivityRunsOperations
:ivar triggers: Triggers operations
:vartype triggers: azure.mgmt.datafactory.operations.TriggersOperations
:ivar rerun_triggers: RerunTriggers operations
:vartype rerun_triggers: azure.mgmt.datafactory.operations.RerunTriggersOperations
:ivar trigger_runs: TriggerRuns operations
:vartype trigger_runs: azure.mgmt.datafactory.operations.TriggerRunsOperations
:ivar rerun_triggers: RerunTriggers operations
:vartype rerun_triggers: azure.mgmt.datafactory.operations.RerunTriggersOperations

:param credentials: Credentials needed for the client to connect to Azure.
:type credentials: :mod:`A msrestazure Credentials
@@ -140,7 +108,7 @@ def __init__(
self._client, self.config, self._serialize, self._deserialize)
self.triggers = TriggersOperations(
self._client, self.config, self._serialize, self._deserialize)
self.rerun_triggers = RerunTriggersOperations(
self._client, self.config, self._serialize, self._deserialize)
self.trigger_runs = TriggerRunsOperations(
self._client, self.config, self._serialize, self._deserialize)
self.rerun_triggers = RerunTriggersOperations(
self._client, self.config, self._serialize, self._deserialize)
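
Reordering where rerun_triggers and trigger_runs are attached does not change how they are used; both remain attributes of the client. As an illustration of the TriggerRunsOperations.rerun operation added in this release (parameter names below are assumptions based on the other trigger-run operations, not confirmed by this diff):

client.trigger_runs.rerun(
    resource_group_name="my-resource-group",
    factory_name="my-factory",
    trigger_name="my-trigger",
    run_id="example-run-id",
)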