IoT Functions Wiki
An AS Pipeline retrieves entity data and makes it available when executing custom functions.
Consider an example where you receive x, y, and z as metric data. The message must also deliver a deviceid and a timestamp (evt_timestamp) to a logical interface for the data to show up in AS.
Suppose you want to perform some transformation of this data. You write CustomFunction1, which acts on the incoming data items x and y, using the constant 3.41 to deliver a new output item c.
Your transformation logic is performed inside the execute() method of your custom function. The pipeline delivers your custom function a dataframe containing the input items defined as inputs (to any function for the entity type, not just CustomFunction1), along with deviceid and a system column called _timestamp. This dataframe is indexed on a copy of deviceid called id and your original timestamp.
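The shape of the dataframe described above can be sketched like this. This is a hypothetical illustration only (the sample device ids, timestamps, and values are made up); it shows the metric columns plus deviceid and _timestamp, indexed on a copy of deviceid named id together with the original event timestamp.

```python
import pandas as pd

def make_sample_frame():
    # Illustrative raw payload: metrics x and y plus deviceid and evt_timestamp
    df = pd.DataFrame({
        "deviceid": ["pump_1", "pump_1", "pump_2"],
        "evt_timestamp": pd.to_datetime(
            ["2024-01-01 00:00", "2024-01-01 00:05", "2024-01-01 00:00"]),
        "x": [1.0, 2.0, 3.0],
        "y": [0.5, 1.5, 2.5],
    })
    df["_timestamp"] = df["evt_timestamp"]  # system copy of the timestamp
    df["id"] = df["deviceid"]               # index copy of deviceid
    # index on the deviceid copy and the original timestamp
    return df.set_index(["id", "evt_timestamp"])

frame = make_sample_frame()
# frame.index.names is ['id', 'evt_timestamp'];
# deviceid and _timestamp remain available as columns
```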
You can write whatever you want in the execute method and add additional methods to your class to organize your code as you see fit. You can also import other packages and use them in your function.
Your execute method must return a dataframe that contains all the columns that were delivered to the execute method, plus the output column(s). In this example the output is called 'out1'.
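A minimal sketch of such a function follows. The class name and column names come from this page, but the constructor signature is an assumption, and the base-class and registration scaffolding that iotfunctions requires is omitted; the point is only the execute() contract: keep every delivered column and add the output.

```python
import pandas as pd

class CustomFunction1:
    """Sketch: combine inputs x and y with a constant to produce out1.
    (A real AS function would subclass an iotfunctions base class.)"""

    CONSTANT = 3.41

    def __init__(self, input_1, input_2, output_1):
        self.input_1 = input_1    # e.g. 'x'
        self.input_2 = input_2    # e.g. 'y'
        self.output_1 = output_1  # e.g. 'out1'

    def execute(self, df):
        df = df.copy()
        # all delivered columns are kept; only the output column is added
        df[self.output_1] = (df[self.input_1] + df[self.input_2]) * self.CONSTANT
        return df

fn = CustomFunction1("x", "y", "out1")
result = fn.execute(pd.DataFrame({"x": [1.0, 2.0], "y": [3.0, 4.0]}))
```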
The UI allows you to chain multiple functions together. CustomFunction2 uses the out1 column delivered by CustomFunction1 to produce 'out2'. The columns available to CustomFunction2 are the input data plus everything added by previous stages.
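The chaining behaviour can be sketched as follows. Stage1 and Stage2 are simplified, hypothetical stand-ins for CustomFunction1 and CustomFunction2 (the formula in Stage2 is invented for illustration); each stage receives the dataframe produced by the previous one, so later stages see the original inputs plus all earlier outputs.

```python
import pandas as pd

class Stage1:
    """Hypothetical stand-in for CustomFunction1: x, y -> out1."""
    def execute(self, df):
        df = df.copy()
        df["out1"] = (df["x"] + df["y"]) * 3.41
        return df

class Stage2:
    """Hypothetical stand-in for CustomFunction2: out1 -> out2."""
    def execute(self, df):
        df = df.copy()
        # out1 is available here because Stage1 ran first
        df["out2"] = df["out1"] * 2.0
        return df

df = pd.DataFrame({"x": [1.0], "y": [2.0]})
for stage in (Stage1(), Stage2()):
    df = stage.execute(df)
# df now contains x, y, out1 and out2
```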
You will notice that there is no manual sequencing of functions in the UI. The UI sequences them automatically based on their dependencies. If you need to influence this ordering (e.g. to force a function to run towards the end of the pipeline), you can give it inputs that are only calculated late in the pipeline.
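The automatic sequencing can be understood as a topological sort over the input/output dependencies. The sketch below is an assumption about the principle, not AS's actual implementation; "LateFunction" and "out3" are invented names showing how declaring out2 as an extra input forces a function to run after CustomFunction2.

```python
from graphlib import TopologicalSorter

# Each function declares its inputs and outputs; a function depends on
# whichever function produces one of its inputs.
functions = {
    "CustomFunction1": {"inputs": {"x", "y"},    "outputs": {"out1"}},
    "CustomFunction2": {"inputs": {"out1"},      "outputs": {"out2"}},
    # hypothetical: takes out2 as an input purely to force late execution
    "LateFunction":    {"inputs": {"x", "out2"}, "outputs": {"out3"}},
}

# map each produced column to the function that produces it
producer = {col: name for name, f in functions.items() for col in f["outputs"]}

# predecessors of each function (raw inputs like x have no producer)
deps = {name: {producer[c] for c in f["inputs"] if c in producer}
        for name, f in functions.items()}

order = list(TopologicalSorter(deps).static_order())
# order places CustomFunction1 before CustomFunction2 before LateFunction
```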