Beam I/O connector for reading from and writing to Cognite Data Fusion (CDF) resources.
The connector implements a sub-connector per resource type (Asset, Event, etc.) and is configured by passing it a config object and optionally a query object.
Please refer to the documentation for more information: https://github.com/cognitedata/cdf-beam-connector-java/blob/main/docs/index.md.
```java
PCollection<Asset> mainInput = p.apply("Read cdf assets", CogniteIO.readAssets()
        .withConfig(ProjectConfig.create()
                .withProject(<project>)
                .withApiKey(<api key>))
        .withRequestParameters(RequestParameters.create()
                .withFilterParameter("name", "value"))
);
```
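Writes follow the same pattern. Below is a minimal sketch of an upsert-style write, assuming a populated `PCollection<Asset>` named `assets` and that the writer accepts the same `withConfig(...)` call as the reader above:

```java
// Writes behave as an upsert: new items are created, existing items are updated.
PCollection<Asset> written = assets.apply("Write cdf assets", CogniteIO.writeAssets()
        .withConfig(ProjectConfig.create()
                .withProject(<project>)
                .withApiKey(<api key>))
);
```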
```xml
<dependency>
    <groupId>com.cognite</groupId>
    <artifactId>beam-connector-java</artifactId>
    <version>0.9.46</version>
</dependency>
```
Requirements:
- Java 17.
- Apache Beam 2.48.0.
Resource | Read | Insert | Update | Delete | Read aggregates | Streaming read | Streaming write |
---|---|---|---|---|---|---|---|
Time series header | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Time series data points | Yes | Yes | Yes | Yes, via time series delete. | Yes | Yes | Yes |
Assets | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Events | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Files, metadata | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Files, binary | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Sequences header | Yes | Yes | Yes | Yes | Yes | No | Yes |
Sequences data points | Yes | Yes | Yes | Yes | No | No | Yes |
Relationships | Yes | Yes, via upsert | Yes, via upsert | Yes | No | No | Yes |
3D nodes | No | No | No | No | No | No | No |
Raw database | Yes | Yes, via row insert | N/A | No | No | No | Yes |
Raw table | Yes | Yes, via row insert | N/A | No | No | No | Yes |
Raw table row | Yes | Yes | Yes | Yes | No | Yes | Yes |
Data set | Yes | Yes | Yes | No | No | No | Yes |
Labels | Yes | Yes | Yes | Yes | No | No | Yes |
Resource | Feature support |
---|---|
Entity matcher | Predict |
Interactive Engineering diagrams / P&ID | Detect annotations, convert |
All connectors are accessed via static methods on `com.cognite.beam.io.CogniteIO`. The connectors are prefixed with `read`, `readAll`, `write` or `delete`:
- `read`: Issue a single query to CDF and return the results.
- `readAll`: Issue multiple queries to CDF and return the results (see the sketch after this list).
- `write`: Write items to CDF. The write operation behaves as an upsert.
- `delete`: Delete the specified items from CDF.
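As a hedged illustration of the `readAll` prefix, the sketch below assumes that `readAllEvents()` consumes a `PCollection<RequestParameters>`, issuing one CDF query per input element; check the linked documentation for the exact signature:

```java
// Build one query object per read; readAll issues one CDF query per input element.
PCollection<RequestParameters> requests = p.apply("Build requests", Create.of(
        RequestParameters.create().withFilterParameter("type", "MyEventType")));

PCollection<Event> events = requests.apply("Read all events", CogniteIO.readAllEvents()
        .withConfig(ProjectConfig.create()
                .withProject(<project>)
                .withApiKey(<api key>))
);
```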
For example, `CogniteIO.readAssets()` will return a reader for the asset resource type.

The readers issue one or more queries to Cognite Data Fusion and return the result as a `PCollection<T>` of typed result items. The result item type depends on the CDF resource type (`readAssets()` returns `Asset`, `readEvents()` returns `Event`, etc.).
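Downstream transforms consume these typed items like any other Beam `PCollection`. A minimal sketch using the `mainInput` collection from the read example above (the `toString()` rendering is just for illustration):

```java
// Map each typed Asset to a printable string using standard Beam transforms.
PCollection<String> assetStrings = mainInput.apply("Format assets",
        MapElements.into(TypeDescriptors.strings())
                .via((Asset asset) -> asset.toString()));
```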
Readers have a set of common configuration options:
- `ProjectConfig`, mandatory. You must configure the reader with a Cognite tenant/project and API key. Optionally, you can also specify the host (for special cases when using a non-standard host).
- `RequestParameters`, optional. `RequestParameters` hosts any query/request specification for the reader, for example filters (e.g. list events where `type = "MyEventType"`). If you do not supply any request parameters, the reader will return all objects of the given resource type.
- `Hints`, optional.
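Putting these options together, here is a sketch of a reader with an explicit host and a filter. The `withHost(...)` method name on `ProjectConfig` is an assumption based on the host option described above, so verify it against the linked documentation:

```java
PCollection<Event> filteredEvents = p.apply("Read filtered events", CogniteIO.readEvents()
        .withConfig(ProjectConfig.create()
                .withProject(<project>)
                .withApiKey(<api key>)
                .withHost("https://api.cognitedata.com")) // optional; only for non-standard hosts
        .withRequestParameters(RequestParameters.create()
                .withFilterParameter("type", "MyEventType")) // optional filter
);
```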
Connector v0.9.24 introduced a breaking change in the definition of the data transfer objects (`Asset`, `Event`, `File`, etc.). This is a carefully considered change, and the last big item before locking down the SDK for the v1 release. Please check the documentation for more information.