Spooler not sending QoS data from robot

Article ID: 35021


Products

DX Unified Infrastructure Management (Nimsoft / UIM)

Issue/Introduction

The spooler fails to send QoS data. The <nimsoft>\robot\q1.rdb file continues to grow without passing data on.

spooler.log contains the same events repeating in a loop, such as:

Jun 11 15:06:19:772 [5308] spooler: checkphead: header not ok: HTTP/1.0 4
Jun 11 15:06:19:772 [5308] spooler: sockParse: illegal phead received
Jun 11 15:06:19:772 [5308] spooler: nimSessionWaitMsg: got error on client session: 0
Jun 11 15:06:19:772 [5308] spooler: FlushMessages - failed to flush message (communication error)
Jun 11 15:06:19:772 [5308] spooler: FlushMessages -    0 messages sent to ##.##.##.##:48001
Jun 11 15:06:24:842 [5308] spooler: FlushMessages - out-queue contains 1 records - continue
Jun 11 15:06:24:842 [5308] spooler: nimSessionConnect - host = ##.##.##.##, port = 48001, secWait = 15
Jun 11 15:06:24:842 [5308] spooler: sockConnect - to host ##.##.##.##, port 48001
Jun 11 15:06:24:842 [5308] spooler: SREQUEST: hubpost ->##.##.##.##/48001
Jun 11 15:06:24:842 [5308] spooler:  head   mtype=100 cmd=hubpost seq=0 ts=1339448784 frm=##.##.##.##/52865
Jun 11 15:06:24:842 [5308] spooler:  head   tout=10 addr=
Jun 11 15:06:24:842 [5308] spooler:  data   nimid=XM58589581-45592 nimts=1339023600 tz_offset=21600 source=##.##.##.##
Jun 11 15:06:24:842 [5308] spooler:  data   domain=<domain> origin=85780 pri=1 subject=QOS_MESSAGE prid=cdm
Jun 11 15:06:24:842 [5308] spooler:  data   dev_id=xxxxxxxxxxxxxx met_id=xxxxxxxxxxxxx
Jun 11 15:06:24:842 [5308] spooler:  data   udata=PDS(198)


(The same QoS object, dev_id, met_id, etc. loops repeatedly through the spooler log.)
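The queue-file symptom above (q1.rdb growing while q2.rdb stays tiny, as described in the Resolution below) can be checked programmatically. The following is a minimal Python sketch, not part of the product; the function name, the ~2 KB threshold, and the use of throwaway files in place of the real <nimsoft>\robot directory are all illustrative assumptions.

```python
import os
import tempfile

def queue_looks_stuck(robot_dir, small_threshold=2048):
    """Return True when q2.rdb is tiny while q1.rdb has accumulated data,
    matching the symptom described above (threshold is illustrative)."""
    q1 = os.path.getsize(os.path.join(robot_dir, "q1.rdb"))
    q2 = os.path.getsize(os.path.join(robot_dir, "q2.rdb"))
    return q2 <= small_threshold and q1 > small_threshold

# Demo against throwaway files standing in for <nimsoft>\robot:
with tempfile.TemporaryDirectory() as robot_dir:
    with open(os.path.join(robot_dir, "q1.rdb"), "wb") as f:
        f.write(b"\x00" * 500_000)   # large and still growing
    with open(os.path.join(robot_dir, "q2.rdb"), "wb") as f:
        f.write(b"\x00" * 1024)      # ~1 KB, suspiciously small
    print(queue_looks_stuck(robot_dir))  # True for this pattern
```

On a real robot, point the function at the actual robot directory instead of the demo files.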

Environment

UIM any version

Resolution

1)  Check the sizes of the <nimsoft>\robot\q1.rdb and q2.rdb files. If q2.rdb is small (~1 KB) while q1.rdb continues to grow, q2.rdb likely contains corrupt data that is causing QoS messages to queue up on the spooler.

2)  Stop the robot.

3)  Move q2.rdb to a temporary folder.

4)  Restart the robot.

On restart, the robot creates a new q2.rdb file, which should allow q1.rdb to begin flushing. Analysis of the archived q2.rdb may reveal corrupt data (illegal characters) that was disrupting the normal data flow.
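Steps 3 and 4 above amount to archiving the suspect queue file rather than deleting it, so it stays available for analysis. The shell sketch below simulates that move with throwaway directories; the NIMSOFT_HOME and ARCHIVE paths are placeholders, not real install locations, and on an actual robot you would stop the robot service first (step 2) and use the path and service commands appropriate to your OS (the product paths in this article are shown Windows-style).

```shell
# Stand-ins for the real install root and queue file (illustrative only):
NIMSOFT_HOME="$(mktemp -d)"
mkdir -p "$NIMSOFT_HOME/robot"
touch "$NIMSOFT_HOME/robot/q2.rdb"

# Temporary folder to preserve q2.rdb for later analysis:
ARCHIVE="$(mktemp -d)"

# Move (not delete) the suspect file, timestamping the archived copy:
mv "$NIMSOFT_HOME/robot/q2.rdb" "$ARCHIVE/q2.rdb.$(date +%Y%m%d%H%M%S)"

# The robot directory no longer contains q2.rdb; the robot will
# recreate it on restart, and the archived copy can be inspected.
ls "$ARCHIVE"
```

Keeping the archived copy is what makes the follow-up analysis for illegal characters possible.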