Import Datastream Values

This feature enables you to import historical data for a device's datastream. It's useful for scenarios where data isn't streamed in real time but is instead collected on the device and sent periodically as a dataset. For instance, your device may be offline most of the time and connect only intermittently to upload a batch of data points.
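That workflow might look like the following minimal sketch. The sensor values and buffering logic are hypothetical; only the {value, ts} shape and the millisecond timestamps come from the API described below.

import time

buffered_values = []          # points collected while the device is offline

def record_reading(value):
    # Store the reading together with its creation time as Unix epoch milliseconds,
    # which is the timestamp format the import endpoint expects.
    buffered_values.append({"value": value, "ts": int(time.time() * 1000)})

# Example: a few readings taken while offline; they are uploaded later
# in a single request to the import endpoint documented below.
for sample in (5, 2, 7):
    record_reading(sample)
    time.sleep(1)

print(buffered_values)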

Import DataStream Values

POST https://{server_address}/api/v1/organization/device/import/batch

Headers

| Name | Type | Description |
| --- | --- | --- |
| Authorization* | String | Bearer {access_token} |
| Content-Type* | String | application/json |

Request Body

| Name | Type | Description |
| --- | --- | --- |
| deviceId* | Number | Device identifier. |
| dataStreamId* | Number | Device datastream identifier. |
| values* | Array | Datastream values to import. |
| values[x].value* | Number | Datastream value. |
| values[x].ts* | String | Unix epoch time in milliseconds representing datastream value creation time. |

Note: Timestamps can be at most one month in the past. Older timestamps won't be accepted.
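A minimal sketch of that restriction, assuming "one month" is approximated as 30 days: values older than the cutoff are filtered out locally before the import request is built.

import time

ONE_MONTH_MS = 30 * 24 * 60 * 60 * 1000          # "one month", approximated as 30 days
now_ms = int(time.time() * 1000)
cutoff_ms = now_ms - ONE_MONTH_MS

values = [
    {"value": 5, "ts": now_ms - 5_000},                      # recent, will be accepted
    {"value": 2, "ts": now_ms - 40 * 24 * 60 * 60 * 1000},   # ~40 days old, would be rejected
]

# Keep only values the server will accept.
importable = [v for v in values if v["ts"] >= cutoff_ms]
print(importable)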

Error response example (returned when the device can't be resolved):

{
    "error": {
        "message": "Device with identifier 1 is not found or belong to another organization."
    }
}

Request examples:

# curl command example:
$ curl -X POST -H "Content-Type: application/json" \
      -H "Authorization: Bearer {accessToken}" \
      -d '{"deviceId":1,"dataStreamId":1,"values":[{"value":5,"ts":1702656423136},{"value":2,"ts":1702656428136}]}' \
      https://fra1.blynk.cloud/api/v1/organization/device/import/batch

# the same request with a sample access token:
$ curl -X POST -H "Content-Type: application/json" \
      -H "Authorization: Bearer eIdWHQqRfFmvP5LDDh-IGxPUzi7I27HthzCPAVmS" \
      -d '{"deviceId":1,"dataStreamId":1,"values":[{"value":5,"ts":1702656423136},{"value":2,"ts":1702656428136}]}' \
      https://fra1.blynk.cloud/api/v1/organization/device/import/batch
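
The same request can also be issued from a short Python script. This is a sketch using the requests library; the server region, access token, and identifiers are placeholders to be replaced with your own values.

import requests

SERVER = "https://fra1.blynk.cloud"
ACCESS_TOKEN = "{accessToken}"   # replace with a real organization access token

payload = {
    "deviceId": 1,
    "dataStreamId": 1,
    "values": [
        {"value": 5, "ts": 1702656423136},
        {"value": 2, "ts": 1702656428136},
    ],
}

response = requests.post(
    f"{SERVER}/api/v1/organization/device/import/batch",
    json=payload,                                   # also sets Content-Type: application/json
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)

if response.ok:
    print("Import accepted")
else:
    # On failure the API returns an error object, e.g. the "not found" example above.
    print(response.status_code, response.text)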
