LNS duplicates upload message

Hi, I am facing an issue with duplicate messages sent by the LNS (LoRaWAN Network Server): sometimes the frame counter is repeated once.

It would be easier to implement a check in my parser that ignores the message when the frame counter is the same as in the previous upload. For that, though, I need to save the previous upload's information. Is it possible to do this in the parser script?

If not, any other suggestion?

Hi @gadotti,

Deduplication (dedupe) is usually performed on the Network Server side, as that is the appropriate architectural location for this functionality. If that is not possible, an Analysis-based solution would be more appropriate than attempting it in the Payload Parser, which does not have the resources to store and compare previous frame counter values.

The Payload Parser is designed for data decoding, not for stateful operations such as deduplication, which require access to historical data.

Hello @gadotti,

Payload Parsers can’t access stored data, but they do have access to device configuration parameters and tags.

If you really need to involve the payload parser in this, the only viable approach is to pair it with an Analysis that updates the frame counter in a device configuration parameter.

A Payload Parser could look like this:

// Read the last-seen frame counter from the device configuration parameters.
const frameCounterParam = device.params.find((param) => param.key === "frame_counter");
// Read the frame counter from the incoming payload.
const myData = payload.find((data) => data.variable === "frame_counter");

// If the counter has not advanced, treat this upload as a duplicate.
if (Number(frameCounterParam?.value) >= Number(myData?.value)) {
  payload = []; // Empty the payload; do not store any data.
}

Be careful with the rate limits (RPM, requests per minute) when updating configuration parameters.

Hi @vitorfdl ,

that sounds good.

I tested the code you proposed, and it reads the device parameter. It works fine when I manually go to the device tab and enter a parameter key/value.

But what would the JS code look like to update the device parameter with the current frame counter value, so it can be read the next time the parser is executed?

Thank you.

Updating the configuration parameter requires you to use an Analysis or the TagoIO API to perform such an update.

You need to get the configuration parameter list, find the configuration parameter ID that you want to update (or create a new one if not found), then perform the update.

In your case, if you want to use an Analysis, you can set up an Action to run it every time you receive a new frame_count variable, for example.

const { Analysis, Resources } = require("@tago-io/sdk");

async function upsertFrameCount(context, scope) {
  try {
    // Expecting: scope contains the device id (from the trigger) and a variable `frame_count`.
    // If triggered by an Action on data, scope will include the data records and device info.
    const device_id = scope?.[0]?.device; // adapt this if your trigger passes the device differently
    const frameCountPoint = scope.find((x) => x.variable === "frame_count");
    if (!device_id || !frameCountPoint) {
      console.warn("Missing device_id or frame_count in scope");
      return;
    }

    const newValue = String(frameCountPoint.value);

    // Get current configuration parameters for this device
    const params = await Resources.devices.paramList(device_id);

    // Find existing parameter by key
    const existing = params.find((p) => p.key === "frame_count");

    // If found, pass its id to update; otherwise paramSet without id will create it.
    await Resources.devices.paramSet(
      device_id,
      {
        key: "frame_count",
        value: newValue,
        sent: false, // keep as not-sent so your device/parser can act if you use this flag
      },
      existing?.id // optional paramID for edit
    );

    console.info(`frame_count ${existing ? "updated" : "created"} with value: ${newValue}`);
  } catch (error) {
    console.error("Error upserting frame_count parameter:", error);
  }
}

Analysis.use(upsertFrameCount);


Hi @vitorfdl ,

I am not an expert in JavaScript, but I tried an implementation following your guidance.

  1. In the parser I can read the device parameter and make the comparison with the current frame counter received in the payload. This works fine.

  2. I created an analysis under the Node.js runtime environment, correct?

  3. I do not understand where the Analysis gets the last received frame counter variable. Is it from the freshly received payload, or from the variable stored in the device bucket? In the parser I ignore most of the variables and store only my useful data. Should I store the received frame counter variable as well, so it can be accessed in the Analysis after the parser runs?

P.S. It would be great to have a command in the parser to update/write a device parameter; that would solve my problem with no need to run an Analysis. As far as I know, I can only read, not write.

Thanks.

Hi @gadotti,

Great questions, and you're on the right track.

  1. Runtime choice
  • Your choice of Node.js works. If possible, I recommend using the Deno runtime for new analyses. The only change you need is the import:
// Deno
import { Analysis, Resources } from "jsr:@tago-io/sdk";

For Node.js, your original require("@tago-io/sdk") is fine:

// Node.js
const { Analysis, Resources } = require("@tago-io/sdk");
  2. Where the Analysis gets frame_count
  • You're correct: if your Action triggers the Analysis "on data," the scope passed to the Analysis comes from the data that made the Action fire.
  • If you remove frame_count in the Payload Parser (i.e., you don't store it), it won't be present in the Analysis scope. So yes, you should store the frame_count variable if your Analysis depends on it. Otherwise, the Analysis won't see it when triggered by data.
  • An alternative is to add frame_count to the metadata of a variable that is already being stored. That way you won't be charged for any additional data input, and you'll still have access to the information in your Analysis.
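The metadata alternative could look like this in the Payload Parser. This is a minimal sketch under assumptions: the variable names temperature and frame_count are illustrative, and payload is the parsed data array as in the earlier parser example.

```javascript
// Illustrative sketch: attach frame_count to an already-stored variable's metadata.
let payload = [
  { variable: "temperature", value: 21.5 }, // a variable you already store
  { variable: "frame_count", value: 42 },   // parsed, but not stored on its own
];

const frameCount = payload.find((d) => d.variable === "frame_count");
const keeper = payload.find((d) => d.variable === "temperature");

if (frameCount && keeper) {
  // Attach the counter to the stored variable's metadata so it reaches
  // the Analysis scope without a separate (billable) data record.
  keeper.metadata = { ...keeper.metadata, frame_count: frameCount.value };
  // Drop the standalone frame_count record.
  payload = payload.filter((d) => d.variable !== "frame_count");
}
```

In your Analysis you would then read the counter from the stored variable's metadata instead of looking for a separate frame_count record in the scope.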

I understand and share your wish to write Device Parameters directly in the Payload Parser.

The reason it isn’t supported is architectural: the Payload Parser is optimized to run very fast and stateless, without waiting for database/network calls. That’s why we can offer it without additional billing costs. Allowing dynamic DB access there would require the same architecture we use for Analysis (with associated runtime, scaling, and cost implications).
