API Paging in HTTP bindings

I am attempting to pull a large amount of data and filter it down into a custom binding in one of my projects using an HTTP binding. I have already applied all the filters available for this specific endpoint, and individual calls are limited to 500 values. Does Ignition's HTTP binding support any paging abilities to make multiple calls with predefined offsets and then combine that data before transforming it? Or is the best bet here to have a separate property binding for each page offset and then combine them as needed? Or make the calls from a gateway-scoped script or something?
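Something like this gateway-script sketch is what I have in mind (the endpoint and its offset/limit query parameters are hypothetical; `system.net.httpClient` is the 8.1+ scripting client):

```python
# Minimal sketch of paged HTTP calls from a gateway-scoped script.
# Assumes a hypothetical endpoint that accepts offset/limit query
# parameters and returns a JSON array; adjust to your API's contract.
PAGE_SIZE = 500

def fetchAllPages(baseUrl):
	client = system.net.httpClient()  # reusable client, Ignition 8.1+
	results = []
	offset = 0
	while True:
		response = client.get(baseUrl, params={"offset": offset, "limit": PAGE_SIZE})
		page = response.json  # parsed JSON body
		results.extend(page)
		if len(page) < PAGE_SIZE:
			break  # a short page means we've reached the end
		offset += PAGE_SIZE
	return results
```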

This.


I have a requirement along the same lines. I am planning to expose historical data points through an API to external systems. The volume of data is really large, on the order of lakhs. Here are my questions:
1. Can Ignition handle such a volume?
2. If so, how do I incorporate pagination mechanisms for data handling in gateway-scoped scripts?

Exposing an API to outside users requires the WebDev module, and the details of the API operation are entirely up to you. (Ignition can handle substantial data traffic on suitable hardware, both gateway and DB, but I am unsure what "lakhs" are.)

I think a lakh = 100,000 (= 10^5) and is in common use in Afghanistan, Bangladesh, Bhutan, India, Myanmar, Nepal, Pakistan, and Sri Lanka. It is often used in Bangladeshi, Indian, Pakistani, and Sri Lankan English (according to Lakh - Wikipedia).

I don't think that lakhobytes are a thing, though.


Yes, 100,000 rows of data need to be exposed, and I want to do it through pagination.

Ok. Presumably your API will define a start-row offset and a quantity in its request payload to accompany the data-selection parameters. You would run your query in WebDev, construct an abbreviated dataset with the desired rows, then encode that (presumably as JSON) in the reply.

You will want to enable caching on your named query so follow-up pages can skip the actual query in common cases.
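A minimal sketch of what that handler could look like, assuming a hypothetical cached named query called "HistoryExport" and offset/limit passed in the query string:

```python
# Sketch of a WebDev doGet handler. "HistoryExport" is a hypothetical
# named query with caching enabled; offset/limit come from the URL.
def doGet(request, session):
	params = request["params"]
	offset = int(params.get("offset", 0))
	limit = min(int(params.get("limit", 500)), 500)  # cap the page size

	# The named query returns the full (cached) result; page it here so
	# follow-up pages hit the cache instead of the database.
	full = system.db.runNamedQuery("HistoryExport", {})
	columns = list(full.getColumnNames())
	rows = []
	for r in range(offset, min(offset + limit, full.getRowCount())):
		rows.append([full.getValueAt(r, c) for c in range(full.getColumnCount())])

	return {"json": {
		"total": full.getRowCount(),
		"offset": offset,
		"columns": columns,
		"rows": rows,
	}}
```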

Show your actual API definition if you want more specific assistance. (Also, there are many WebDev topics on this forum that can help you.)

Yes, sure. I have historized data in InfluxDB using the Kymera module and am trying to expose this data through API endpoints using the WebDev module.
The volume of data I estimate is really huge, and I am planning to use system.tag.queryTagHistory to pull data from the Influx historian.

Here I want to understand the limitations of:
1. system.tag.queryTagHistory
2. The WebDev response
in terms of the volume of data each can handle.

You are blazing a trail, I think. You will simply have to test. (I expect the bulk of the workload will be conversions to/from JSON.)
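One note on the pagination side: system.tag.queryTagHistory has no row-offset parameter, so paging by time window is the usual approach. A rough sketch, with the window size as a placeholder to tune to your data density:

```python
# Rough sketch: page tag history by time window, since queryTagHistory
# has no row-offset concept. Window size is a placeholder to tune.
from java.util import Date

WINDOW_MS = 60 * 60 * 1000  # one-hour pages

def historyPage(tagPath, startMillis):
	"""Return one page of raw history for the window starting at startMillis."""
	return system.tag.queryTagHistory(
		paths=[tagPath],
		startDate=Date(startMillis),
		endDate=Date(startMillis + WINDOW_MS),
		returnSize=-1,        # -1 returns values as they changed (raw)
		returnFormat="Wide",
	)
```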