Ping from Ignition in Linux

Hi Everybody,

I’m having some trouble here. I am retrieving data from Vorne XL devices located at various facilities using an Ignition gateway running in Linux. This data is retrieved via an http-get call to the device’s IP address, which returns some xml I can parse and store, etc etc. All is well, except when the device is unreachable. The call takes about 3 minutes to time out, and with many dozens of devices to query, even a handful of unreachable units (which is almost always the case) can really throw off the script’s execution.

So I browsed the forum for some sort of ping functionality, and came up with this:

      from java.net import InetAddress
      timeout = 5000
      device = InetAddress.getByName(address)
      ping = device.isReachable(timeout)
      if not ping:  #isReachable returns a boolean; and so forth

So I put a little ping function in my script module. Again, all is well, IF this code is run from a client application (in Windows). If I create a SQLTag to set high once a day as an automatic trigger to load the devices, and run the same code from a gateway tag change event script, the ping always fails. If I want the script to run automatically, I have to ignore the ping and let the httpGet calls time out on devices that are down.

It looks like the issue is that ICMP ping on Linux requires root privileges, so isReachable can’t actually ping from the gateway process.

So my question is, is there any other way to ping an address, or even just a way to set the timeout for an httpGet/Post? I’d rather not go down the path of Linux shell scripts updating the database with ping results before Ignition starts the load, but I can’t have the load taking upwards of an hour or two when it should only take 10 minutes.



You should be able to set the HTTP GET timeout by doing it “by hand”:

[code]from java.net import URL
from org.apache.commons.io import IOUtils

con = URL(url).openConnection()	#url: the device address you want to fetch
con.setConnectTimeout(5000)	#milliseconds
con.setReadTimeout(5000)
stream = con.getInputStream()
xml = IOUtils.toString(stream, "UTF-8")

print xml[/code]
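As for avoiding the long timeout on dead units: since the devices serve HTTP anyway, a plain TCP connect to port 80 will tell you a unit is up without needing ICMP (and therefore without root). This is only a sketch, not an Ignition API; `is_reachable`, the port, and the timeout default are my own choices:

```python
import socket

def is_reachable(host, port=80, timeout=5.0):
    """True if a TCP connection to host:port succeeds within timeout seconds.

    Unlike ICMP ping this needs no root privileges; it assumes the
    device listens on the given port (80 for the XL's web interface).
    """
    try:
        s = socket.create_connection((host, port), timeout)
        s.close()
        return True
    except (socket.error, socket.timeout):
        return False
```

Run it ahead of each httpGet and skip units that fail; a 5-second cap on dozens of dead devices beats a 3-minute HTTP timeout for each one.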

Hope this helps,

[quote=“dep05d”]Hi Everybody,
I am retrieving data from Vorne XL devices located at various facilities using an Ignition gateway running in Linux. This data is retrieved via an http-get call to the device’s IP address, which returns some xml I can parse and store, etc etc. [/quote]

Hi Dan,

Can I ask how you are getting XML out of the Vorne XL (or post a snippet)? I can get garbage out of mine, but nothing very useful.

EDIT: Whoops! We’re running firmware v here, if that helps… :blush:

Carl, I can’t get this import to go. Throws back an exception that there’s no module named apache. Dunno if it’s because I’m running this on a Windows installation or not…

Sorry about the minor hijack, but I did keep things mostly on topic… :laughing:

Thanks guys!


Here is a script to run a command-line ping. This has been written for Windows, but I’ll work on getting it to run in Linux as well.

Here is the command execution script. Put this in a script module:

[code]def execute(command):
	from java.io import BufferedReader, InputStreamReader
	from java.lang import Runtime

	rt = Runtime.getRuntime()

	pr = rt.exec(command)

	output = BufferedReader(InputStreamReader(pr.getInputStream()))

	data = []

	line = output.readLine()

	while line != None:
		data.append(line)
		line = output.readLine()

	exitVal = int(pr.waitFor())

	return data, exitVal[/code]

Now, to determine if the ping was successful, we need to check the output from the command. (This is the routine for Windows.)

[code]out = execute("ping -n 1 " + ipAddress)	#ipAddress: the host to ping; qualify execute() with your module path

for row in out[0]:
	if row.rfind("Received = ") != -1:
		row = row[row.find("Received = ")+11:]
		success = int(row[:row.find(",")])[/code]

This will return either 1 or 0 based on whether we were successful. If we changed the command to, say, ping -n 3, it would return the number of successful replies from the ping.

For Ubuntu Linux (might work for more):

[code]out = execute("ping -c 1 " + ipAddress)	#ipAddress: the host to ping

for row in out[0]:
	if row.rfind("received") != -1:
		row = row[row.find(", ")+2:]
		success = int(row[:row.find("received")])[/code]

Obviously, the rfind will change based on what the command outputs, but using this as a template, you should have no problem figuring it out.
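If counting character offsets feels fragile, the same check can be done with one regex helper that handles both summary formats; `count_replies` and the patterns are my own sketch (feed it the joined output lines from execute):

```python
import re

def count_replies(ping_output):
    """Number of successful replies reported in ping's summary line."""
    # Windows summary: "Packets: Sent = 1, Received = 1, Lost = 0"
    m = re.search(r"Received = (\d+)", ping_output)
    if m:
        return int(m.group(1))
    # Linux summary: "1 packets transmitted, 1 received, 0% packet loss"
    m = re.search(r"(\d+)\s+(?:packets\s+)?received", ping_output)
    if m:
        return int(m.group(1))
    return 0
```

Usage would be something like `success = count_replies("\n".join(out[0]))`, the same on either platform.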

Hey Jordan,

Here you go, but it’s kind of sloppy. I don’t know what the firmware requirements are, but this works for me. I’m collecting data from dozens of these devices once a day and it’s working well so far.

The xml returned from the http call is simple, but doesn’t give you column names in each row returned. I have to know what columns are in my query to be able to keep track of my processing in each row. Also, Vorne has some custom SQL functions for enum lookups and formatting dates.

I store the query strings, column lists, date formatting, device names, and device IPs in the database, but hopefully this will give you a good enough idea.

To start, you can query like this:,average_cycle_time,average_rate_total,down_time,efficiency,eighth_user_number,datetime_to_iso_string%28end_time%29,end_time_offset,end_time_day,fifth_user_number,first_user_number,first_user_string,fourth_user_number,fourth_user_string,goal_count,good_count,ideal_cycle_time,interval_id,oee,part_id,percent_down,percent_run,percent_setup,percent_standby,performance,quality,reject_count,run_time,second_user_number,second_user_string,sequence_number,setup_time,seventh_user_number,sixth_user_number,slow_cycles,slow_cycles_time,small_stops,small_stops_time,standard_cycles,standard_cycles_time,standby_time,datetime_to_iso_string%28start_time%29,start_time_offset,start_time_day,takt_time,target_count,third_user_number,third_user_string,total_count,total_time,type%20from%20interval_stream%20where%20start_time%20%3E=%20iso_string_to_datetime%28%222011-01-04T00:00:00Z%22%29%20and%20start_time%20%3C%20iso_string_to_datetime%28%222011-01-05T00:00:00Z%22%29

You should be able to use your device IP address, paste that into your browser, and see the data returned. Basically then you just change your SQL query, and know how to use some of the Vorne SQL functions.
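If you’d rather not hand-edit those %20/%28/%22 escapes, urllib’s quote can build them; `encode_query` is my own sketch, with commas, colons, and = left literal to match the example URL:

```python
try:
    from urllib import quote        # Python 2 / Jython
except ImportError:
    from urllib.parse import quote  # Python 3

def encode_query(query):
    """Percent-encode a Vorne SQL query for use in the device URL.

    Leaves , : = literal (as in the hand-built example URL); everything
    else outside letters/digits/_.- gets percent-encoded.
    """
    return quote(query, safe=",:=")
```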

The list of tables can be found at

You can see what tables and columns are available to you there. Here is some example code that makes the call and parses out the data for the two tables I care about.

from java.io import File
from javax.xml.parsers import DocumentBuilder
from javax.xml.parsers import DocumentBuilderFactory
from org.w3c.dom import Document
from org.w3c.dom import Element
from org.w3c.dom import Node
from org.w3c.dom import NodeList
from java.io import ByteArrayInputStream
from java.io import InputStream
import system
import app
import sys
import time


##Table INTERVAL_STREAM - shift and job related data
##note the use of datetime_to_iso_string.  I think if you don't use it, it returns seconds since 1900 or something rather unusable.  I actually build this string in the db based on the table, but you should get the idea
#queryString='select availability,average_cycle_time,average_rate_total,down_time,efficiency,eighth_user_number,datetime_to_iso_string(end_time),end_time_offset,end_time_day,fifth_user_number,first_user_number,first_user_string,fourth_user_number,fourth_user_string,goal_count,good_count,ideal_cycle_time,interval_id,oee,part_id,percent_down,percent_run,percent_setup,percent_standby,performance,quality,reject_count,run_time,second_user_number,second_user_string,sequence_number,setup_time,seventh_user_number,sixth_user_number,slow_cycles,slow_cycles_time,small_stops,small_stops_time,standard_cycles,standard_cycles_time,standby_time,datetime_to_iso_string(start_time),start_time_offset,start_time_day,takt_time,target_count,third_user_number,third_user_string,total_count,total_time,type from interval_stream where start_time >= iso_string_to_datetime("2011-01-04T00:00:00Z") and start_time < iso_string_to_datetime("2011-01-05T00:00:00Z")'

#Table TIMELINE_STREAM - run and downtime event data
#uses datetime_to_iso_string like the above query, also vorne_enum_lookup to get meaningful values
queryString='select duration,vorne_enum_lookup("timeline_stream","reason",reason),sequence_number,datetime_to_iso_string(start_time),start_time_offset,start_time_day,vorne_enum_lookup("timeline_stream","state",state) from timeline_stream where start_time >= iso_string_to_datetime("2011-01-09T00:00:00Z") and start_time < iso_string_to_datetime("2011-01-10T00:00:00Z")'

#I do this because the XML the unit spits back does not identify columns.  They are all called "cell".  Having this data allows me to keep track of what I am reading in each row
cols = columnList.split(',')	#columnList: the comma-separated column names, loaded from the database

#the spaces will cause an error in the SQL call
queryString = queryString.replace(' ','%20')

try:
	deviceCall = "http://" + deviceIP + "/" + queryString
	response = system.net.httpGet(deviceCall)
except:
	#this can take 3 minutes to fail.  Hence why a ping before the call would be nice
	print "Device not responding to query!"
	#return -1

#parse xml returned from http call
dbf = DocumentBuilderFactory.newInstance()
db = dbf.newDocumentBuilder()
doc = db.parse(ByteArrayInputStream(response))
nodeLst = doc.getElementsByTagName("row")
insertCount = 0
errorCount = 0

#for each <row>
for s in range(nodeLst.getLength()):
	fstNode = nodeLst.item(s)
	if (fstNode.getNodeType() == Node.ELEMENT_NODE):
		fstElmnt = fstNode
		fstNmElmntLst = fstElmnt.getElementsByTagName("cell")
		colcount = 0
		datalist = []
		#for each column (<cell>) in <row>
		for z in cols:
			fstNmElmnt = fstNmElmntLst.item(colcount)
			colcount = colcount + 1
			fstNm = fstNmElmnt.getChildNodes()
			#do this to avoid error on null <cell/> values
			if fstNm.item(0):
				datalist.append(fstNm.item(0).getNodeValue())
			else:
				datalist.append(None)
		#now at end of row
		#append what we will use as asset_id, or any other data you may wish to identify the item by
		datalist.append(deviceID)	#deviceID: however you identify this unit
		try:
			#I would insert rows into the database here.
			#app.db.insertRow(tableName,cols, datalist)
			insertCount = insertCount + 1
		except:
			#database error handling here
			errorCount = errorCount + 1
print str(insertCount) + " rows inserted"
print str(errorCount) + " rows did not insert"
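For what it’s worth, the same row/cell walk can be written in pure Python with xml.dom.minidom (bundled with Jython), which avoids the org.w3c.dom imports; `parse_rows` is my own sketch:

```python
from xml.dom.minidom import parseString

def parse_rows(xml_text, cols):
    """Parse the <row>/<cell> structure into a list of dicts keyed by column.

    A null <cell/> yields None, matching the null check in the DOM version.
    """
    doc = parseString(xml_text)
    rows = []
    for row in doc.getElementsByTagName("row"):
        cells = row.getElementsByTagName("cell")
        record = {}
        for name, cell in zip(cols, cells):
            child = cell.firstChild
            record[name] = child.nodeValue if child is not None else None
        rows.append(record)
    return rows
```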

It’s a little crude, and a good advertisement for using Ignition instead :thumb_left: , but if they are out there and you want data collected centrally, you are able to do it.

I hope this helps.


Sorry, throw this up at the top:

[code]import sys
sys.add_package("org.apache.commons.io")[/code]

Hope this helps,

Thanks for the info, guys. I agree that Ignition is still the best way to go, but I’ve got several of these XLs scattered around the shop. It’d be nice to get at least a minimal amount of info from them until I can get the equipment they’re attached to onto the network.