How to export a complete bucket?

I am using an analysis in combination with an action, as described in this tutorial.

It doesn’t work as expected so far. In line 25 of the script mentioned in the link I’m setting “qty: 1000000”, but it only exports 10,000 lines, and my bucket holds more than 750k data records.

My service profile allows 25,000 transactions/h in “Data Output”. Is that what limits me?

Is there any way to use the API to extract all records from the bucket at once?

You can only get a maximum of 10,000 data records per request.

You can perform multiple requests in combination with the skip parameter, which skips the first X records.

device.getData({ skip: 10000, qty: 10000 })

This will return 10,000 records after skipping the first 10,000 results.
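
For context, the device object in that call comes from the TagoIO SDK. A minimal sketch of setting it up, assuming the @tago-io/sdk package and a placeholder device token:

    // Sketch only: fetch the second page of a device's bucket, 10,000 records at a time.
    // 'MY-DEVICE-TOKEN' is a placeholder; use the token of the device that holds the data.
    const { Device } = require('@tago-io/sdk');

    async function fetchSecondPage() {
        const device = new Device({ token: 'MY-DEVICE-TOKEN' });
        return device.getData({ skip: 10000, qty: 10000 });
    }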


Thanks for the prompt reply, Vitor.

In that case, if I understood correctly, would one CSV file be generated for each request?
For example, if I have 750k records, that would be 75 CSV files to retrieve all the data.


Well, I’m assuming you’re changing the scripts to perform as you expect them to.

There is no need to generate several CSV files. Edit the analysis so that it pages through all the data and stores it in a variable:

    // Page through the bucket until a request returns no more data.
    let data_list = [];
    while (true) {
        const data = await device.getData({ variable: ['temperature', 'humidity'], start_date: '10 year', qty: 9999, skip: data_list.length });
        if (!data.length) {
            break;
        }
        data_list = data_list.concat(data);
    }

    // Build the CSV in memory, one row per record.
    let csv = 'variable,value,unit,time';
    for (const x of data_list) {
        csv = `${csv}\n${x.variable},${x.value},${x.unit},${x.time}`;
    }
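
From there you can send out a single file. Here is a minimal sketch of emailing the CSV as one attachment, based on the TagoIO Services email pattern; the recipient address and filename are placeholders:

    // Sketch only: email the assembled CSV as a single attachment.
    // context.token comes from the analysis context; recipient and filename are placeholders.
    const { Services } = require('@tago-io/sdk');

    const email = new Services({ token: context.token }).email;
    await email.send({
        to: 'client@company.com',
        subject: 'Bucket export from TagoIO',
        message: 'The full bucket export is attached as a CSV file.',
        attachment: {
            archive: csv,
            filename: 'bucket_export.csv',
        },
    });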

There are some things you need to keep in mind, though:

  • Analyses have a memory limit. I believe it is set to 5 MB, so 750k records will certainly throw a memory error if you run your analysis inside TagoIO. That means you will only be able to run this analysis externally (see the sketch below): Running Analysis as External using Node.JS - TagoIO
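
For reference, a minimal sketch of the external-analysis wrapper, following the pattern in the TagoIO docs; the @tago-io/sdk package and both tokens are assumptions, not part of this thread:

    // Sketch only: run the export as an external analysis with Node.JS.
    // Both tokens are placeholders; get the real ones from the TagoIO admin.
    const { Analysis, Device } = require('@tago-io/sdk');

    async function exportBucket(context) {
        const device = new Device({ token: 'MY-DEVICE-TOKEN' });
        let data_list = [];
        while (true) {
            const data = await device.getData({ qty: 9999, skip: data_list.length });
            if (!data.length) {
                break;
            }
            data_list = data_list.concat(data);
        }
        context.log(`Fetched ${data_list.length} records.`);
    }

    module.exports = new Analysis(exportBucket, { token: 'MY-ANALYSIS-TOKEN' });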

Now I understand. Thanks, Vitor!
