Datasets and External APIs with Report Builder

At our site, most information that doesn't come from a PLC tag or OPC device connection is pulled via getApi requests to a custom API used for datalogging and product-info storage. The way we do this returns the needed info in the form of large dataset tag values, but I have recently learned this is a less-than-ideal acquisition method for what the company is most interested in: reports. What would be a better way, or where should I start looking to learn one? This is an example of what our current process looks like in Ignition's scripting project library.

def PackingLineB24():

    # API request to get information from the custom datalogging API
    rawdata = CAP.apiGet(13)

    # Track the maximum number of analysis results in any data entry.
    # A one-element list is used so the nested function can mutate it
    # (Jython 2.x has no `nonlocal`).
    max_analysis_results = [0]

    # Flatten one API entry and update the running max of analysis results
    def sortPacking(data):
        lot = data['LotNumber']
        prod = data['Product']['ProductKey']
        prodid = data['Product']['ID']
        owner = data['Product']['Owner']['Value']
        date = data['Date']
        cust = data['Customer']['Name']

        analysisResults = data['AnalysisResults']

        if analysisResults and len(analysisResults) > 0:
            max_analysis_results[0] = max(max_analysis_results[0], len(analysisResults))

            # Prepare the basic data
            base_data = [lot, prod, prodid, owner, date, cust]

            # Extract `hlvs` values
            hlvs_data = [result['Value'] for result in analysisResults]

            return base_data, hlvs_data

        return None, None

    rawPacking = [sortPacking(data) for data in rawdata]

    # Drop entries that had no analysis results
    rawPacking = [item for item in rawPacking if item[0] is not None]

    # Finalize headers: "Halves", then "Halves 2" .. "Halves N"
    headers = ["Lot Number", "Product Type", "Product ID", "Owner", "Date", "Customer"]
    headers.append("Halves")
    for i in range(1, max_analysis_results[0]):
        headers.append("Halves {}".format(i + 1))

    # Prepare rows of data for the dataset
    packing = []
    for base_data, hlvs_data in rawPacking:
        # Backfill `hlvs` values with None so every row matches the header count
        row = base_data + hlvs_data + [None] * (max_analysis_results[0] - len(hlvs_data))
        packing.append(row)

    # Debug: print each assembled row
    for pack in packing:
        print(pack)

    # Create a new dataset of updated information to save to the tag
    currentSet = system.dataset.toDataSet(headers, packing)

    # Write to the tag
    system.tag.writeBlocking(['[default]CWS Perspective/PackingLineB24'], [currentSet])

To clarify, are you asking how you should shape API returns to present on dashboards in Perspective, or do you want to use the Report Builder? I see you have tagged Perspective, but your question is a tad confusing.

Oh dang it, I keep tagging Perspective when I don't mean to.

I want to use the Report Builder, so I am trying to learn how to form API requests in a way that lets me use them in a report, since datasets are evidently not very easy to use with reports.

I've removed the Perspective tag, thank you.

I do not understand this; datasets are easy to use in a report. If you want to use an API request in a report, create a script data source, call your project script, then assign the resulting dataset to a key (or process it further and then assign it).

data['packLineB24'] = project.script.PackingLineB24()
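
One thing to note: for that assignment to work, PackingLineB24() has to actually return the dataset rather than only writing it to a tag. As a rough sketch, assuming the report script data source's standard updateData(data, sample) wrapper and that you add a return to the end of your function:

def updateData(data, sample):
    # Call the project library function; for this to work, the function must
    # end with `return currentSet` rather than (or in addition to) the tag write.
    data['packLineB24'] = project.script.PackingLineB24()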

I was not familiar with this, sorry. Yesterday I was struggling with a history-enabled dataset tag in Ignition and was told that I would need to unpack each value of the dataset into individual rows.
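
For context, the unpacking I was told about amounts to something like this (a rough sketch, using the tag path from my script above):

# Read the dataset tag and convert it for row-by-row iteration
ds = system.tag.readBlocking(['[default]CWS Perspective/PackingLineB24'])[0].value
pyds = system.dataset.toPyDataSet(ds)

for row in pyds:
    # Rows of a PyDataSet can be indexed by column name
    print("{}: {}".format(row['Lot Number'], row['Halves']))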

To clarify (apologies): instead of pointing a tag history query data source at the tag that the API request writes to, I should use a script data source and simply call the API request function itself?

Thank you.

Yes, that's what I would do, but it depends on how often the data changes and how intensive the call is. You could also just read the tag in the script data source instead of calling the function.
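
Reading the tag in the script data source would look something like this (a sketch, reusing the tag path from your script):

def updateData(data, sample):
    # Read the last dataset the API script wrote, instead of re-calling the API
    qv = system.tag.readBlocking(['[default]CWS Perspective/PackingLineB24'])[0]
    data['packLineB24'] = qv.value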

Depends on what you are trying to do. A table component makes it easy to display a dataset.

The data unfortunately (and frustratingly) does not change in any kind of consistent manner; some days it doesn't get updated at all, and other days it gets updated hourly. That's why I'm unsure whether this acquisition practice is intelligent in the slightest, and I wish I didn't have to interface with the API at all, lol.

What triggers you to run the API call? Is it the API side that is inconsistent with data being updated, or is it whatever triggers the API call?

The API itself is inconsistent. It is essentially an Excel-like manual entry form that ties multiple entry fields into a single "data aggregate", which then gets dumped in as part of the getApi request.