
RE: duplicate calls to web service



Your NEP (S36 talk!) could work either way (do the request in the NEP or submit a discrete job), unless you are worried about a long timeout causing the queue to back up, in which case submitting the job to a separate job queue should work well. But if the requests are all going to time out anyway, it probably doesn't matter as much.
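To make that concrete, here is a rough CL sketch of the submit-a-discrete-job approach. The library, queue, subsystem, and program names (MYLIB, WSCALLCL, WSJOBQ, MYSBS) and the &RETRY variable are placeholders, so adjust for your shop:

     /* From the NEP: hand the web service call off to its own job  */
     /* queue so a long HTTP timeout can't hold up the listener.    */
     SBMJOB     CMD(CALL PGM(MYLIB/WSCALLCL) PARM(&DATA)) +
                  JOB(WSREQUEST) JOBQ(MYLIB/WSJOBQ)

     /* One-time setup: the job queue entry on the subsystem caps   */
     /* how many of these run at once (the "max active X" idea from */
     /* my earlier note).                                           */
     ADDJOBQE   SBSD(MYLIB/MYSBS) JOBQ(MYLIB/WSJOBQ) MAXACT(5)

     /* On an error, resubmit the same call with a schedule time an */
     /* hour or so out. &RETRY would be the current time plus one   */
     /* hour in HHMMSS form, computed by the caller.                */
     SBMJOB     CMD(CALL PGM(MYLIB/WSCALLCL) PARM(&DATA)) +
                  JOB(WSRETRY) JOBQ(MYLIB/WSJOBQ) SCDTIME(&RETRY)

Either way, the listener goes right back to waiting on the data queue while the request runs elsewhere.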

It's been a while since I worked with DDM data queues. From a quick read of the documentation, entries should be removed from the queue when they are read. A few things to check:

- Have you verified that the entries really are being removed?
- Is it possible you are reading or processing the same entry twice?
- Do you have a timeout on the receive from the queue? If so, do you check that it timed out rather than assuming there is an entry to process? (See the sketch below.)
- How much time passes between the duplicate attempts?
- If you look in the queue, are you only seeing one request per item?
- I don't recall: are you using separate debug logs for each request? Do the duplicate requests show up in the same debug log or in different ones?
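On the timeout question in particular: a timed-out receive returns a data length of zero and leaves whatever was there before in the data variable, so a listener that processes without checking the length can easily run the same request twice. A bare-bones CL sketch of the listener loop, assuming a 256-byte entry and placeholder names (WSDTAQ, MYLIB, PROCESS):

     PGM
     DCL        VAR(&QNAME) TYPE(*CHAR) LEN(10) VALUE('WSDTAQ')
     DCL        VAR(&QLIB)  TYPE(*CHAR) LEN(10) VALUE('MYLIB')
     DCL        VAR(&LEN)   TYPE(*DEC)  LEN(5 0)
     DCL        VAR(&DATA)  TYPE(*CHAR) LEN(256)
     DCL        VAR(&WAIT)  TYPE(*DEC)  LEN(5 0) VALUE(60)

     /* Wait up to 60 seconds for an entry. QRCVDTAQ removes the   */
     /* entry from the queue when it hands one back.               */
LOOP:       CALL       PGM(QRCVDTAQ) PARM(&QNAME &QLIB &LEN &DATA &WAIT)

     /* &LEN = 0 means the wait expired and nothing was received,  */
     /* so &DATA still holds the previous entry. Only call the     */
     /* second CL when an entry actually arrived.                  */
            IF         COND(&LEN *GT 0) THEN(DO)
               CALL       PGM(MYLIB/PROCESS) PARM(&DATA)
            ENDDO
            GOTO       CMDLBL(LOOP)
            ENDPGM

If your listener looks roughly like that already, the next place I'd look is the debug logs to see whether both requests come from the same receive.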

Lots of questions...hope something triggers a thought. 
 

> -----Original Message-----
> From: ftpapi-bounces@xxxxxxxxxxxxxxxxxxxxxx [mailto:ftpapi-
> bounces@xxxxxxxxxxxxxxxxxxxxxx] On Behalf Of Kim Gibson
> Sent: Tuesday, February 15, 2011 10:20 AM
> To: ftpapi@xxxxxxxxxxxxxxxxxxxxxx
> Subject: RE: duplicate calls to web service
> 
> Hello Mike,
> 
> All requests are made in the same job. Here is the scenario:
> 
> On our web server, a submitted CL job is running that listens to a DDM
> data queue. The remote data queue is on our production box. When data
> is placed in the data queue, a second CL is called with the data
> passed in as parameters.
> 
> The request is made from an RPG program called from the second CL. Each
> time a request is made, the second CL gets called. The files created in
> the IFS all have unique names.
> 
> Suddenly it's hitting me---should I be submitting this second CL to a
> jobq instead of calling it directly?
> 
> Kim Gibson
> 
> 
> >From: Mike Krebs <mkrebs@xxxxxxxxxxxxxxxxxx>
> >To: HTTPAPI and FTPAPI Projects <ftpapi@xxxxxxxxxxxxxxxxxxxxxx>
> >Subject: RE: duplicate calls to web service
> >Date: Mon, 14 Feb 2011 17:27:12 -0600
> 
> >So long as you use unique named IFS files, you should be able to
> >submit the jobs to a jobq with max active X (say 5). This would let
> >several jobs run at once. On error, you could resubmit with a
> >schedule time of say an hour or two.
> 
> >Are the duplicate requests coming in the same job at the same time?
> 
> 
> 
-----------------------------------------------------------------------
This is the FTPAPI mailing list.  To unsubscribe, please go to:
http://www.scottklement.com/mailman/listinfo/ftpapi
-----------------------------------------------------------------------