Advanced Samples#

Advanced samples cover the common use cases for reading, writing, and reading-and-linking telemetry data in structured, organized code. Accordingly, each sample .cs file has Write(), Read() and ReadAndLink() methods, and all of the sample files share the same structure. You can copy them into your application code as working samples. The .cs files in the Samples folder carry documenting and descriptive comments, but let's take a look at a simple and a more complex sample in depth.

Writing Telemetry Data with a parameter and default feed name to a Kafka topic#

First of all you need to create or use an AtlasConfiguration, setting the details of the AppGroupId, ParameterGroupId and ParameterID you want to use.
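As a minimal sketch, a configuration for a single parameter might look like this. The type and property names follow the MAT.OCS.Streaming samples and may differ between library versions, and the ids ("ExampleApp", "ExampleGroup", "vCar:Chassis") are illustrative placeholders:

```csharp
// A minimal AtlasConfiguration with one app group, one parameter group and
// one parameter. All ids here are placeholders - use your own.
var atlasConfiguration = new AtlasConfiguration
{
    AppGroups =
    {
        ["ExampleApp"] = new ApplicationGroup
        {
            Groups =
            {
                ["ExampleGroup"] = new ParameterGroup { Label = "Example group" }
            },
            Parameters =
            {
                ["vCar:Chassis"] = new AtlasParameter { Name = "vCar" }
            }
        }
    }
};
```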

Once you have your AtlasConfiguration designed, you need to set the details of the DependencyService URI, the stream broker address, the group name and the output topic name you want to write to. The DependencyService handles requests for AtlasConfigurations and DataFormats; you must provide a URI for this service.
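These details are plain settings; a sketch with placeholder values (substitute the addresses and names from your own environment):

```csharp
// Placeholder connection settings - replace with your environment's values.
const string DependencyServiceUri = "http://localhost:8180/api/dependencies/"; // DependencyService
const string BrokerList = "localhost:9092";       // Kafka broker address(es)
const string GroupName = "dev";                   // group name
const string TopicName = "sample_output_topic";   // output topic to write to
```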

A KafkaStreamAdapter is used to manage Kafka streams.

Using the KafkaStreamAdapter you must open an output topic in Kafka.
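A sketch of both steps, assuming the constructor and method shapes used by the samples (the broker address and names are placeholders; check your library version for exact signatures):

```csharp
// Create the adapter against the broker, then open the topic to write to.
var streamAdapter = new KafkaStreamAdapter("localhost:9092", "consumerGroupName");
var outputTopic = streamAdapter.OpenOutputTopic("sample_output_topic");
```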

You need to persist your configuration, set up your DataFormat and set up a streaming session to be able to write to your broker. The Writer.cs code sample covers all of this; you only need to create a Writer object with the required parameters.
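For example, as a sketch: the DataFormat fluent builder follows the library's API, while the Writer constructor parameters are assumptions based on the sample's shape and may differ in your version ("vCar:Chassis" is a placeholder parameter id):

```csharp
// Describe which parameters travel on the feed.
var dataFormat = DataFormat.DefineFeed()
    .Parameter("vCar:Chassis")
    .BuildFormat();

// The Writer persists the AtlasConfiguration and DataFormat and manages the
// streaming session against the opened output topic.
var writer = new Writer(dependencyServiceUri, atlasConfiguration, dataFormat, topicName, outputTopic);
```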

In this example we are using the default feed name, which is the empty string. You must take care to open and close the session; otherwise the session is marked as truncated, just as it would be if an error occurred while the session was in use.
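A sketch of the session lifecycle around the writes (the method names are assumptions based on the samples):

```csharp
// Open the session, write, and always close it so it is not marked truncated.
writer.OpenSession("sample_session");
try
{
    // ... write telemetry data here ...
}
finally
{
    writer.CloseSession();
}
```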

In the samples we work with randomly generated data; in real scenarios the data would come from external systems.

To write to the output topic you only need to invoke the Write method on your writer object, passing in the feed name and the telemetry data. The Writer object already "knows" your AtlasConfiguration, the DataFormat and the output topic name.
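A sketch of a single write, assuming the TelemetryData shape used by the samples (the property names may differ by version, and the timestamp arithmetic is only illustrative):

```csharp
// Build a small payload of randomly generated values, as the samples do.
var random = new Random();
long nowNanos = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() * 1_000_000;

var data = new TelemetryData
{
    TimestampsNanos = new[] { nowNanos },
    Parameters = new[]
    {
        new TelemetryParameterData { AvgValues = new[] { random.NextDouble() * 100 } }
    }
};

// The empty string selects the default feed.
writer.Write("", data);
```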

Reading Telemetry Data for a parameter from a Kafka stream#

First of all you need to create or use an AtlasConfiguration, setting the details of the AppGroupId, ParameterGroupId and ParameterID you want to use.

Once you have your AtlasConfiguration designed, you need to set the details of the DependencyService URI, the stream broker address, the group name and the topic name you want to read from. The DependencyService handles requests for AtlasConfigurations and DataFormats; you must provide a URI for this service.

A KafkaStreamAdapter is used to manage Kafka streams.

Using the KafkaStreamAdapter you must open an input stream in Kafka.
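As a sketch, assuming the adapter exposes a method to open a topic for streaming (the method name and arguments are assumptions; check your library version):

```csharp
// Open the topic as an input stream to consume from; names are placeholders.
var streamAdapter = new KafkaStreamAdapter("localhost:9092", "consumerGroupName");
var pipelineBuilder = streamAdapter.OpenStreamTopic("sample_topic");
```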

You need to connect to the DependencyClient and the DataFormatClient and persist the StreamPipelineBuilder that was created by the KafkaStreamAdapter. Using a Reader object takes care of all of this.
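In sketch form, with constructor parameters that are assumptions based on the sample's structure:

```csharp
// The Reader wires up the dependency and data format clients and holds the
// stream pipeline builder created by the KafkaStreamAdapter.
var reader = new Reader(dependencyServiceUri, atlasConfiguration, pipelineBuilder);
```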

You can start reading a stream with the Read method on your reader object. It takes two arguments: the first is simply the parameter id; the second is a user-specified method matching the TelemetryDataHandler delegate, which lets you handle the streamed telemetry data however you like. Another example shows how to link the data directly to another output topic.
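For example, with a handler that matches the TelemetryDataHandler delegate (assumed here to take the telemetry data batch as its argument; "vCar:Chassis" is a placeholder parameter id):

```csharp
// A user-specified handler; print a short trace of each batch received.
void HandleData(TelemetryData data)
{
    Console.WriteLine($"Received {data.TimestampsNanos?.Length ?? 0} samples");
}

reader.Read("vCar:Chassis", HandleData);
```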

Lastly, in our sample code we invoke the Write() method while the streaming session is live, so that there is input showing that the streaming is working. Our sample delegates, called Models, are in Models.cs; in this example we use the TraceData method to trace the details of the streamed telemetry data.