subprocess - Calling BCP from Python throws timeout exception, then immediately completes
I'm running a Python script which makes a subprocess call. The call times out, and then immediately afterwards I see the subprocess (in this case a BCP call) finish. In fact, I can see that the data is in the database. In addition, I can run the BCP command directly on the command line and it works just fine.
Here's what my Python script produces at the command prompt:
    C:\FaceIAPS\StudyDataFiles> py .\RUN_DATA.py
    80
    Syncing Subjects
    Subject 11
    Initial Copy...
    Traceback (most recent call last):
      File ".\RUN_DATA.py", line 261, in <module>
        bulk_import(upload_file, 'Facet_Data')
      File ".\RUN_DATA.py", line 171, in bulk_import
        subprocess.check_call("bcp BehaviorResearch.." + table_to_upload_to + " in " + filename_to_be_uploaded + " -T -c -S PBB-C202B-2\BEHAVIORRESEARCH -e bulk_copy_errors.log", shell=True, timeout=5)
      File "C:\Python34\lib\subprocess.py", line 554, in check_call
        retcode = call(*popenargs, **kwargs)
      File "C:\Python34\lib\subprocess.py", line 537, in call
        return p.wait(timeout=timeout)
      File "C:\Python34\lib\subprocess.py", line 1157, in wait
        raise TimeoutExpired(self.args, timeout)
    subprocess.TimeoutExpired: Command 'bcp BehaviorResearch..Facet_Data in _temp_ -T -c -S PBB-C202B-2\BEHAVIORRESEARCH -e bulk_copy_errors.log' timed out after 5 seconds

    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    PS C:\FaceIAPS\StudyDataFiles>
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000

    16102 rows copied.
    Network packet size (bytes): 4096
    Clock Time (ms.) Total     : 5164   Average : (3118.13 rows per sec.)

As you can see, BCP kept writing output to the command prompt even after my script had timed out and the prompt had returned.
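For readability, the failing call buried in that traceback boils down to the pattern below. This is my own reconstruction, with the literal values taken from the exception message; the key detail, as the answer below explains, is the single command string plus shell=True combined with a short timeout.

    import subprocess

    # Values as they appear in the TimeoutExpired message above.
    table_to_upload_to = "Facet_Data"
    filename_to_be_uploaded = "_temp_"

    # One command string run through the shell: cmd.exe is the direct child,
    # so the 5-second timeout applies to (and kills) cmd.exe, not bcp itself.
    subprocess.check_call(
        "bcp BehaviorResearch.." + table_to_upload_to
        + " in " + filename_to_be_uploaded
        + r" -T -c -S PBB-C202B-2\BEHAVIORRESEARCH -e bulk_copy_errors.log",
        shell=True,
        timeout=5,
    )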
What's going on, and how can I fix it?
EDIT: How I fixed it
I changed the subprocess call to:
argument = ["BCP", "BehaviorResearch .. "+ table_to_upload_to ,, filename_to_be_uploaded," -T "" in "" -c "," S PBB-C202B-2 \ BEHAVIORRESEARCH "," -e bulk_copy_errors.log "] subprocess.call (logic, timeout = 30) for an unlimited, as a FYI, "in" it's own logic. Documents for
From the subprocess.call() documentation:
The timeout argument is passed to Popen.wait(). If the timeout expires, the child process will be killed and then waited for again.
And here is how call() is implemented:

    def call(*popenargs, timeout=None, **kwargs):
        with Popen(*popenargs, **kwargs) as p:
            try:
                return p.wait(timeout=timeout)
            except:  # including TimeoutExpired
                p.kill()
                p.wait()
                raise

In other words, what you see is the expected behavior: if the timeout expires, the shell (%COMSPEC%, i.e. cmd.exe) that runs the BCP command is killed immediately, which may or may not terminate the BCP process itself. The subprocess's output has probably already been flushed, or you are seeing output buffered by the console (I'm not sure), which is why you still see output from the live grandchild BCP process while its parent cmd.exe is already dead (and a new prompt is shown). Remove shell=True to avoid the unnecessary intermediate cmd.exe process, so that .kill() kills the BCP process directly instead of the shell process.
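To make that last point concrete, here is a small sketch of my own (run_bcp is a hypothetical helper, not from the post): because bcp is started directly rather than through cmd.exe, the Popen object refers to bcp itself, so the kill() issued on timeout terminates the actual upload rather than an intermediate shell.

    import subprocess

    def run_bcp(arguments, timeout):
        """Start bcp directly (no shell=True) and enforce a timeout ourselves."""
        p = subprocess.Popen(arguments)      # arguments is a list, e.g. ["bcp", "db..table", "in", ...]
        try:
            return p.wait(timeout=timeout)   # returns bcp's exit code on success
        except subprocess.TimeoutExpired:
            p.kill()                         # kills bcp itself, not a wrapping cmd.exe
            p.wait()                         # reap the killed process
            raise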