Home | Switchboard | Unix Administration | Red Hat | TCP/IP Networks | Neoliberalism | Toxic Managers |
(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and bastardization of classic Unix |
Because Python pushed a lot of functionality into libraries, capturing the output of a shell command is fairly tricky and has changed from one version to another. This is not just many ways to do the same thing; it is simply a mess: solutions for Python 2.7 do not work in Python 3.7 and vice versa.
Python also provides one facility for capturing the output of a single command and a different one for the case when you have a pipe, which doubles the confusion.
For Python 2.7 the following works
import os
list_of_ls = os.popen("ls ../..").read().split('\n')
In Python 3:
import subprocess
output = subprocess.getoutput("ls -l")
print(output)
As a result you can also find the following workaround to the problem :-)

import os
os.system('sample_cmd > tmp')
print open('tmp', 'r').read()  # Python 2 print syntax
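One sketch that runs unchanged on both 2.7 and 3.x uses subprocess.check_output, which exists in both (assuming a Unix `ls` is available; note that the result is bytes in Python 3, hence the decode):

```python
import subprocess

# check_output exists in both Python 2.7 and Python 3.x.
# In Python 3 it returns bytes, so decode before splitting into lines.
out = subprocess.check_output(["ls", ".."])
list_of_ls = out.decode().split('\n')
```

Passing the command as a list avoids the shell entirely, which also sidesteps quoting problems.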
See the following tutorial:

By DavidMuller, July 30, 2020

Now that we can invoke an external program using subprocess.run, let's see how we can capture output from that program. For example, this could be useful if we wanted to use git ls-files to output all the files currently stored under version control.
Note: The examples shown in this section require Python 3.7 or higher. In particular, the capture_output and text keyword arguments were added in Python 3.7 when it was released in June 2018.
Let’s add to our previous example:
import subprocess
import sys
result = subprocess.run(
[sys.executable, "-c", "print('ocean')"], capture_output=True, text=True
)
print("stdout:", result.stdout)
print("stderr:", result.stderr)
If we run this code, we’ll receive output like the following:
stdout: ocean
stderr:
This example is largely the same as the one introduced in the first section: we are still running a subprocess to print ocean. Importantly, however, we pass the capture_output=True and text=True keyword arguments to subprocess.run.

subprocess.run returns a subprocess.CompletedProcess object that is bound to result. The subprocess.CompletedProcess object includes details about the external program's exit code and its output. capture_output=True ensures that result.stdout and result.stderr are filled in with the corresponding output from the external program.

By default, result.stdout and result.stderr are bound as bytes, but the text=True keyword argument instructs Python to instead decode the bytes into strings. In the output section, stdout is ocean (plus the trailing newline that print adds implicitly), and we have no stderr.
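To see the difference text=True makes, here is the same call sketched without it; the captured streams come back as raw bytes:

```python
import subprocess
import sys

# Without text=True, result.stdout and result.stderr are bytes objects
# and must be decoded manually if strings are needed.
result = subprocess.run(
    [sys.executable, "-c", "print('ocean')"], capture_output=True
)
print(result.stdout)           # b'ocean\n'
print(result.stdout.decode())  # ocean
```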
Let's try an example that produces a non-empty value for stderr:
import subprocess
import sys
result = subprocess.run(
[sys.executable, "-c", "raise ValueError('oops')"], capture_output=True, text=True
)
print("stdout:", result.stdout)
print("stderr:", result.stderr)
If we run this code, we receive output like the following:
Output
stdout: 
stderr: Traceback (most recent call last):
  File "<string>", line 1, in <module>
ValueError: oops
This code runs a Python subprocess that immediately raises a ValueError. When we inspect the final result, we see nothing in stdout and a Traceback of our ValueError in stderr. This is because by default Python writes the Traceback of the unhandled exception to stderr.
Sometimes it's useful to raise an exception if a program we run exits with a bad exit code. Programs that exit with a zero code are considered successful, but programs that exit with a non-zero code are considered to have encountered an error. As an example, this pattern could be useful if we wanted to raise an exception in the event that we run git ls-files in a directory that wasn't actually a git repository.

We can use the check=True keyword argument to subprocess.run to have an exception raised if the external program returns a non-zero exit code:
import subprocess
import sys
result = subprocess.run([sys.executable, "-c", "raise ValueError('oops')"], check=True)
If we run this code, we receive output like the following:
Output
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ValueError: oops
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.8/subprocess.py", line 512, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['/usr/local/bin/python', '-c', "raise ValueError('oops')"]' returned non-zero exit status 1.
This output shows that we ran a subprocess that raised an error, which is printed in stderr in our terminal. Then subprocess.run dutifully raised a subprocess.CalledProcessError on our behalf in our main Python program.
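If we want to handle a bad exit code instead of letting the program crash, we can catch that exception; a minimal sketch, reusing the inline-Python trick from the examples above:

```python
import subprocess
import sys

# SystemExit(2) makes the child exit with status 2, so check=True raises
# a subprocess.CalledProcessError that we can catch and inspect.
try:
    subprocess.run(
        [sys.executable, "-c", "raise SystemExit(2)"], check=True
    )
except subprocess.CalledProcessError as exc:
    code = exc.returncode
    print("command failed with exit code", code)  # command failed with exit code 2
```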
Alternatively, the subprocess module also includes the subprocess.CompletedProcess.check_returncode method, which we can invoke for similar effect:
import subprocess
import sys
result = subprocess.run([sys.executable, "-c", "raise ValueError('oops')"])
result.check_returncode()
If we run this code, we'll receive:
Output
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ValueError: oops
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.8/subprocess.py", line 444, in check_returncode
    raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['/usr/local/bin/python', '-c', "raise ValueError('oops')"]' returned non-zero exit status 1.
Since we didn't pass check=True to subprocess.run, we successfully bound a subprocess.CompletedProcess instance to result even though our program exited with a non-zero code. Calling result.check_returncode(), however, raises a subprocess.CalledProcessError because it detects the completed process exited with a bad code.
subprocess.run includes the timeout argument to allow you to stop an external program if it is taking too long to execute:
import subprocess
import sys
result = subprocess.run([sys.executable, "-c", "import time; time.sleep(2)"], timeout=1)
If we run this code, we'll receive output like the following:
Output
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.8/subprocess.py", line 491, in run
    stdout, stderr = process.communicate(input, timeout=timeout)
  File "/usr/local/lib/python3.8/subprocess.py", line 1024, in communicate
    stdout, stderr = self._communicate(input, endtime, timeout)
  File "/usr/local/lib/python3.8/subprocess.py", line 1892, in _communicate
    self.wait(timeout=self._remaining_time(endtime))
  File "/usr/local/lib/python3.8/subprocess.py", line 1079, in wait
    return self._wait(timeout=timeout)
  File "/usr/local/lib/python3.8/subprocess.py", line 1796, in _wait
    raise TimeoutExpired(self.args, timeout)
subprocess.TimeoutExpired: Command '['/usr/local/bin/python', '-c', 'import time; time.sleep(2)']' timed out after 0.9997982999999522 seconds
The subprocess we tried to run used the time.sleep function to sleep for 2 seconds. However, we passed the timeout=1 keyword argument to subprocess.run to time out our subprocess after 1 second. This explains why our call to subprocess.run ultimately raised a subprocess.TimeoutExpired exception.

Note that the timeout keyword argument to subprocess.run is approximate. Python will make a best effort to kill the subprocess after the timeout number of seconds, but it won't necessarily be exact.
Sometimes programs expect input to be passed to them via stdin. The input keyword argument to subprocess.run allows you to pass data to the stdin of the subprocess. For example:
import subprocess
import sys
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.stdin.read())"], input=b"underwater"
)
We'll receive output like the following after running this code:
underwater
In this case, we passed the bytes underwater to input. Our target subprocess used sys.stdin to read the passed-in stdin (underwater) and printed it out in our output.

The input keyword argument can be useful if you want to chain multiple subprocess.run calls together, passing the output of one program as the input to another.
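That chaining idea can be sketched as follows: capture the stdout of one run and feed it to the stdin of the next (bytes in, bytes out, since text=True is not set here):

```python
import subprocess
import sys

# First run produces some output...
first = subprocess.run(
    [sys.executable, "-c", "print('underwater')"], capture_output=True
)

# ...which becomes the stdin of the second run via input=.
second = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.stdin.read().strip().upper())"],
    input=first.stdout,
    capture_output=True,
)
print(second.stdout)  # b'UNDERWATER\n'
```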
The subprocess module is a powerful part of the Python standard library that lets you run external programs and inspect their outputs easily. In this tutorial, you have learned to use subprocess.run to control external programs, pass input to them, parse their output, and check their return codes.

The subprocess module exposes additional classes and utilities that we did not cover in this tutorial. Now that you have a baseline, you can use the subprocess module's documentation to learn more about other available classes and utilities.
Jan 01, 2014 | stackoverflow.com
subprocess.check_output return code
Juicy ,
I am using:
grepOut = subprocess.check_output("grep " + search + " tmp", shell=True)

to run a terminal command. I know that I can use a try/except to catch the error, but how can I get the value of the error code?
I found this on the official documentation:
exception subprocess.CalledProcessError
    Exception raised when a process run by check_call() or check_output() returns a non-zero exit status.
returncode
    Exit status of the child process.

But there are no examples given and Google was of no help.
jfs ,
"Google was of no help": the first link is almost there (it shows e.output), the second link is the exact match (it shows e.returncode); the search term: CalledProcessError. – jfs May 2 '14 at 15:06

DanGar , 2014-05-02 05:07:05
You can get the error code and results from the exception that is raised.
This can be done through the fields returncode and output. For example:

import subprocess
try:
    grepOut = subprocess.check_output("grep " + "test" + " tmp", shell=True)
except subprocess.CalledProcessError as grepexc:
    print "error code", grepexc.returncode, grepexc.output

DanGar ,
Thank you, exactly what I wanted. But now I am wondering, is there a way to get a return code without a try/except? IE just get the return code of the check_output, whether it is 0 or 1 or other is not important to me and I don't actually need to save the output. – Juicy May 2 '14 at 5:12

jfs , 2014-05-02 16:09:20
is there a way to get a return code without a try/except?

check_output raises an exception if it receives a non-zero exit status because it frequently means that a command failed. grep may return a non-zero exit status even if there is no error -- you could use .communicate() in this case:

from subprocess import Popen, PIPE
pattern, filename = 'test', 'tmp'
p = Popen(['grep', pattern, filename], stdin=PIPE, stdout=PIPE, stderr=PIPE, bufsize=-1)
output, error = p.communicate()
if p.returncode == 0:
    print('%r is found in %s: %r' % (pattern, filename, output))
elif p.returncode == 1:
    print('%r is NOT found in %s: %r' % (pattern, filename, output))
else:
    assert p.returncode > 1
    print('error occurred: %r' % (error,))

You don't need to call an external command to filter lines; you could do it in pure Python:

with open('tmp') as file:
    for line in file:
        if 'test' in line:
            print line,

If you don't need the output, you could use subprocess.call():

import os
from subprocess import call
try:
    from subprocess import DEVNULL  # Python 3
except ImportError:  # Python 2
    DEVNULL = open(os.devnull, 'r+b', 0)
returncode = call(['grep', 'test', 'tmp'],
                  stdin=DEVNULL, stdout=DEVNULL, stderr=DEVNULL)

mkobit , 2017-09-15 14:52:56
Python 3.5 introduced the subprocess.run() method. The signature looks like:

subprocess.run(
    args, *,
    stdin=None, input=None, stdout=None, stderr=None,
    shell=False, timeout=None, check=False
)

The returned result is a subprocess.CompletedProcess. In 3.5, you can access the args, returncode, stdout, and stderr from the executed process. Example:

>>> result = subprocess.run(['ls', '/tmp'], stdout=subprocess.DEVNULL)
>>> result.returncode
0
>>> result = subprocess.run(['ls', '/nonexistent'], stderr=subprocess.DEVNULL)
>>> result.returncode
2

Dean Kayton ,
I reckon this is the most up-to-date approach. The syntax is much more simple and intuitive and was probably added for just that reason. – Dean Kayton Jul 22 '19 at 11:46

Noam Manos ,
In Python 2 - use the commands module:

import commands  # note: the module is "commands", not "command"
rc, out = commands.getstatusoutput("ls missing-file")
if rc != 0:
    print "Error occurred: %s" % out

In Python 3 - use the subprocess module:

import subprocess
rc, out = subprocess.getstatusoutput("ls missing-file")
if rc != 0:
    print("Error occurred:", out)

Error occurred: ls: cannot access missing-file: No such file or directory
simfinite , 2019-07-01 14:36:06
To get both output and return code (without try/except) simply use subprocess.getstatusoutput (Python 3 required)
Aug 19, 2020 | stackoverflow.com
This question already has answers here: subprocess.check_output return code (5 answers). Closed 5 years ago.
While developing python wrapper library for Android Debug Bridge (ADB), I'm using subprocess to execute adb commands in shell. Here is the simplified example:
import subprocess
...

def exec_adb_command(adb_command):
    return subprocess.call(adb_command)

If the command executed properly, exec_adb_command returns 0, which is OK.
But some adb commands return not only "0" or "1" but also generate some output which I want to catch also. adb devices for example:
D:\git\adb-lib\test>adb devices
List of devices attached
07eeb4bb        device

I've already tried subprocess.check_output() for that purpose, and it does return output but not the return code ("0" or "1").
Ideally I would want to get a tuple where t[0] is return code and t[1] is actual output.
Am I missing something in subprocess module which already allows to get such kind of results?
Thanks!

asked Jun 19 '15 at 12:10 by Viktor Malyi
Padraic Cunningham ,
Popen and communicate will allow you to get the output and the return code.
from subprocess import Popen, PIPE, STDOUT

out = Popen(["adb", "devices"], stderr=STDOUT, stdout=PIPE)
t = out.communicate()[0], out.returncode
print(t)
('List of devices attached \n\n', 0)

check_output may also be suitable; a non-zero exit status will raise a CalledProcessError:
from subprocess import check_output, CalledProcessError

try:
    out = check_output(["adb", "devices"])
    t = 0, out
except CalledProcessError as e:
    t = e.returncode, e.output  # e.output holds the captured stdout

You also need to redirect stderr to store the error output:
from subprocess import check_output, CalledProcessError
from tempfile import TemporaryFile

def get_out(*args):
    with TemporaryFile() as t:
        try:
            out = check_output(args, stderr=t)
            return 0, out
        except CalledProcessError as e:
            t.seek(0)
            return e.returncode, t.read()

Just pass your commands:

In [5]: get_out("adb","devices")
Out[5]: (0, 'List of devices attached \n\n')
In [6]: get_out("adb","devices","foo")
Out[6]: (1, 'Usage: adb devices [-l]\n')
Thank you for the broad answer! – Viktor Malyi Jun 19 '15 at 12:48
Jan 01, 2008 | stackoverflow.com
How do you call an external command (as if I'd typed it at the Unix shell or Windows command prompt) from within a Python script?

freshWoWer , 2008-09-18 01:35:30
Look at the subprocess module in the standard library:

import subprocess
subprocess.run(["ls", "-l"])

The advantage of subprocess vs. system is that it is more flexible (you can get the stdout, stderr, the "real" status code, better error handling, etc...).

The official documentation recommends the subprocess module over the alternative os.system():

The subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using this function [os.system()].

The "Replacing Older Functions with the subprocess Module" section in the subprocess documentation may have some helpful recipes.

For versions of Python before 3.5, use call:

import subprocess
subprocess.call(["ls", "-l"])

David Cournapeau , 2008-09-18 01:39:35
Is there a way to use variable substitution? IE I tried to do echo $PATH by using call(["echo", "$PATH"]), but it just echoed the literal string $PATH instead of doing any substitution. I know I could get the PATH environment variable, but I'm wondering if there is an easy way to have the command behave exactly as if I had executed it in bash. – Kevin Wheeler Sep 1 '15 at 23:17

Eli Courtwright ,
Here's a summary of the ways to call external programs and the advantages and disadvantages of each:

- os.system("some_command with args") passes the command and arguments to your system's shell. This is nice because you can actually run multiple commands at once in this manner and set up pipes and input/output redirection. For example:

os.system("some_command < input_file | another_command > output_file")

However, while this is convenient, you have to manually handle the escaping of shell characters such as spaces, etc. On the other hand, this also lets you run commands which are simply shell commands and not actually external programs. See the documentation.

- stream = os.popen("some_command with args") will do the same thing as os.system except that it gives you a file-like object that you can use to access standard input/output for that process. There are 3 other variants of popen that all handle the i/o slightly differently. If you pass everything as a string, then your command is passed to the shell; if you pass them as a list then you don't need to worry about escaping anything. See the documentation.

- The Popen class of the subprocess module. This is intended as a replacement for os.popen but has the downside of being slightly more complicated by virtue of being so comprehensive. For example, you'd say:

print subprocess.Popen("echo Hello World", shell=True, stdout=subprocess.PIPE).stdout.read()

instead of:

print os.popen("echo Hello World").read()

but it is nice to have all of the options there in one unified class instead of 4 different popen functions. See the documentation.

- The call function from the subprocess module. This is basically just like the Popen class and takes all of the same arguments, but it simply waits until the command completes and gives you the return code. For example:

return_code = subprocess.call("echo Hello World", shell=True)

See the documentation.

- If you're on Python 3.5 or later, you can use the new subprocess.run function, which is a lot like the above but even more flexible and returns a CompletedProcess object when the command finishes executing.

- The os module also has all of the fork/exec/spawn functions that you'd have in a C program, but I don't recommend using them directly. The subprocess module should probably be what you use.

Finally, please be aware that for all methods where you pass the final command to be executed by the shell as a string, you are responsible for escaping it. There are serious security implications if any part of the string that you pass cannot be fully trusted. For example, if a user is entering some/any part of the string. If you are unsure, only use these methods with constants. To give you a hint of the implications, consider this code:

print subprocess.Popen("echo %s " % user_input, stdout=PIPE).stdout.read()

and imagine that the user enters something like "my mama didnt love me && rm -rf /", which could erase the whole filesystem.
tripleee ,
Nice answer/explanation. How is this answer justifying Python's motto as described in this article? fastcompany.com/3026446/ "Stylistically, Perl and Python have different philosophies. Perl's best known motto is "There's More Than One Way to Do It". Python is designed to have one obvious way to do it." Seems like it should be the other way! In Perl I know only two ways to execute a command - using back-tick or open. – Jean May 26 '15 at 21:16
Typical implementation:

import subprocess
p = subprocess.Popen('ls', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in p.stdout.readlines():
    print line,
retval = p.wait()

You are free to do what you want with the stdout data in the pipe. In fact, you can simply omit those parameters (stdout= and stderr=) and it'll behave like os.system().

EmmEff ,
answered Sep 18 '08 at 18:20
.readlines() reads all lines at once, i.e., it blocks until the subprocess exits (closes its end of the pipe). To read in real time (if there are no buffering issues) you could:

for line in iter(p.stdout.readline, ''):
    print line,

– jfs Nov 16 '12 at 14:12

newtover , 2010-02-12 10:15:34
Some hints on detaching the child process from the calling one (starting the child process in background).Suppose you want to start a long task from a CGI script. That is, the child process should live longer than the CGI script execution process.
The classical example from the subprocess module documentation is:
import subprocess
import sys
# Some code here
pid = subprocess.Popen([sys.executable, "longtask.py"])  # Call subprocess
# Some more code here

The idea here is that you do not want to wait in the line 'call subprocess' until the longtask.py is finished. But it is not clear what happens after the line 'some more code here' from the example.
My target platform was FreeBSD, but the development was on Windows, so I faced the problem on Windows first.
On Windows (Windows XP), the parent process will not finish until the longtask.py has finished its work. It is not what you want in a CGI script. The problem is not specific to Python; in the PHP community the problems are the same.
The solution is to pass DETACHED_PROCESS Process Creation Flag to the underlying CreateProcess function in Windows API. If you happen to have installed pywin32, you can import the flag from the win32process module, otherwise you should define it yourself:
DETACHED_PROCESS = 0x00000008
pid = subprocess.Popen([sys.executable, "longtask.py"],
                       creationflags=DETACHED_PROCESS).pid

/* UPD 2015.10.27 @eryksun in a comment below notes, that the semantically correct flag is CREATE_NEW_CONSOLE (0x00000010) */
On FreeBSD we have another problem: when the parent process is finished, it finishes the child processes as well. And that is not what you want in a CGI script either. Some experiments showed that the problem seemed to be in sharing sys.stdout. And the working solution was the following:
pid = subprocess.Popen([sys.executable, "longtask.py"],
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE,
                       stdin=subprocess.PIPE)

I have not checked the code on other platforms and do not know the reasons of the behaviour on FreeBSD. If anyone knows, please share your ideas. Googling on starting background processes in Python does not shed any light yet.
maranas ,
i noticed a possible "quirk" with developing py2exe apps in pydev+eclipse. i was able to tell that the main script was not detached because eclipse's output window was not terminating; even if the script executes to completion it is still waiting for returns. but, when i tried compiling to a py2exe executable, the expected behavior occurs (runs the processes as detached, then quits). i am not sure, but the executable name is not in the process list anymore. this works for all approaches (os.system("start *"), os.spawnl with os.P_DETACH, subprocs, etc.) – maranas Apr 9 '10 at 8:09

Charlie Parker ,
you might also need the CREATE_NEW_PROCESS_GROUP flag. See "Popen waiting for child process even when the immediate child has terminated" – jfs Nov 16 '12 at 14:16

nimish , 2008-09-18 01:37:24
import os
os.system("your command")

Note that this is dangerous, since the command isn't cleaned. I leave it up to you to google for the relevant documentation on the 'os' and 'sys' modules. There are a bunch of functions (exec* and spawn*) that will do similar things.
tripleee ,
No idea what I meant nearly a decade ago (check the date!), but if I had to guess, it would be that there's no validation done. – nimish Jun 6 '18 at 16:01

Nikolay Shindarov ,
Note the timestamp on this guy: the "correct" answer has 40x the votes and is answer #1. – nimish Dec 3 '18 at 18:41

sirwart , 2008-09-18 01:42:30
I'd recommend using the subprocess module instead of os.system because it does shell escaping for you and is therefore much safer.

subprocess.call(['ping', 'localhost'])

Lie Ryan ,
If you want to create a list out of a command with parameters, a list which can be used with subprocess when shell=False, then use shlex.split for an easy way to do this: docs.python.org/2/library/shlex.html#shlex.split (it's the recommended way according to the docs: docs.python.org/2/library/subprocess.html#popen-constructor) – Daniel F Sep 20 '18 at 18:07
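A small sketch of what shlex.split does with quoted arguments:

```python
import shlex

# shlex.split turns a shell-style command string into the argv list that
# subprocess expects when shell=False; quoting is handled for you.
args = shlex.split("grep -F 'two words' tmp")
print(args)  # ['grep', '-F', 'two words', 'tmp']
```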
import os
cmd = 'ls -al'
os.system(cmd)

If you want to return the results of the command, you can use os.popen. However, this is deprecated since version 2.6 in favor of the subprocess module, which other answers have covered well.

Alexandra Franks ,
answered Sep 18 '08 at 1:37

Stefan Gruenwald ,
popen is deprecated in favor of subprocess. – Fox Wilson Aug 8 '14 at 0:22
There are lots of different libraries which allow you to call external commands with Python. For each library I've given a description and shown an example of calling an external command. The command I used as the example is ls -l (list all files). If you want to find out more about any of the libraries, I've listed and linked the documentation for each of them.

Sources:
- subprocess: https://docs.python.org/3.5/library/subprocess.html
- shlex: https://docs.python.org/3/library/shlex.html
- os: https://docs.python.org/3.5/library/os.html
- sh: https://amoffat.github.io/sh/
- plumbum: https://plumbum.readthedocs.io/en/latest/
- pexpect: https://pexpect.readthedocs.io/en/stable/
- fabric: http://www.fabfile.org/
- envoy: https://github.com/kennethreitz/envoy
- commands: https://docs.python.org/2/library/commands.html
These are all the libraries:
Hopefully this will help you make a decision on which library to use :)
subprocess
Subprocess allows you to call external commands and connect them to their input/output/error pipes (stdin, stdout, and stderr). Subprocess is the default choice for running commands, but sometimes other modules are better.
subprocess.run(["ls", "-l"])  # Run command
subprocess.run(["ls", "-l"], stdout=subprocess.PIPE)  # This will run the command and return any output
subprocess.run(shlex.split("ls -l"))  # You can also use the shlex library to split the command

os
os is used for "operating system dependent functionality". It can also be used to call external commands with os.system and os.popen (note: there is also a subprocess.Popen). os will always run the shell and is a simple alternative for people who don't need to, or don't know how to, use subprocess.run.

os.system("ls -l")  # run command
os.popen("ls -l").read()  # This will run the command and return any output

sh
sh is a subprocess interface which lets you call programs as if they were functions. This is useful if you want to run a command multiple times.
sh.ls("-l")  # Run command normally
ls_cmd = sh.Command("ls")  # Save command as a variable
ls_cmd()  # Run command as if it were a function

plumbum
plumbum is a library for "script-like" Python programs. You can call programs like functions as in
sh
. Plumbum is useful if you want to run a pipeline without the shell.

ls_cmd = plumbum.local["ls"] # Get the command
ls_cmd() # Run the command

pexpect
pexpect lets you spawn child applications, control them and find patterns in their output. This is a better alternative to subprocess for commands that expect a tty on Unix.
pexpect.run("ls -l") # Run command as normal
child = pexpect.spawn('scp foo user@example.com:.') # Spawn child application
child.expect('Password:') # When this is the output
child.sendline('mypassword')

fabric
fabric is a Python 2.5 and 2.7 library. It allows you to execute local and remote shell commands. Fabric is a simple alternative for running commands in a secure shell (SSH).

fabric.operations.local('ls -l') # Run command as normal
fabric.operations.local('ls -l', capture=True) # Run command and receive output

envoy
envoy is known as "subprocess for humans". It is used as a convenience wrapper around the
subprocess
module.

r = envoy.run("ls -l") # Run command
r.std_out # Get output

commands
commands
contains wrapper functions for os.popen, but it has been removed from Python 3 since subprocess is a better alternative.

The edit was based on J.F. Sebastian's comment.
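As a minimal, hedged baseline for the overview above, here is the standard-library route written out in full (this sketch assumes a Unix-like system with ls on PATH; universal_newlines is used rather than the 3.7+ text alias so it runs on Python 3.5+):

```python
import subprocess

# Run "ls -l" and capture its output as text.
# universal_newlines=True decodes bytes to str; on Python 3.7+
# you could write text=True instead.
result = subprocess.run(
    ["ls", "-l"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,
)
print(result.returncode)   # 0 on success
print(result.stdout)       # the directory listing
```

Each third-party library below is essentially a convenience layer over this same mechanism.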
Tom Fuller, answered Oct 29 '16 at 14:02; edited May 28 '17 by Peter Mortensen
I always use fabric for things like this:

from fabric.operations import local
result = local('ls', capture=True)
print "Content:\n%s" % (result,)

But this seems to be a good tool too: sh (a Python subprocess interface). Look at an example:
from sh import vgdisplay
print vgdisplay()
print vgdisplay('-v')
print vgdisplay(v=True)

Jorge E. Cardona,
answered Mar 13 '12 at 0:12; edited Nov 29 '19 by Peter Mortensen
Check the "pexpect" Python library, too. It allows for interactive controlling of external programs/commands, even ssh, ftp, telnet, etc. You can just type something like:

child = pexpect.spawn('ftp 192.168.0.24')
child.expect('(?i)name .*: ')
child.sendline('anonymous')
child.expect('(?i)password')

athanassis, 2010-10-07 07:09:04
With the standard library

Use the subprocess module (Python 3):

import subprocess
subprocess.run(['ls', '-l'])

It is the recommended standard way. However, more complicated tasks (pipes, output, input, etc.) can be tedious to construct and write.

Note on Python version: if you are still using Python 2, subprocess.call works in a similar way.

ProTip: shlex.split can help you parse the command for run, call, and other subprocess functions in case you don't want (or can't!) provide them in the form of lists:

import shlex
import subprocess
subprocess.run(shlex.split('ls -l'))

With external dependencies

If you do not mind external dependencies, use plumbum:
from plumbum.cmd import ifconfig
print(ifconfig['wlan0']())

It is the best subprocess wrapper. It's cross-platform, i.e. it works on both Windows and Unix-like systems. Install by pip install plumbum.

Another popular library is sh:

from sh import ifconfig
print(ifconfig('wlan0'))

However, sh dropped Windows support, so it's not as awesome as it used to be. Install by pip install sh.
Honza Javorek (community wiki, 6 revs, 2 users), edited Nov 29 '19 at 21:54
If you need the output from the command you are calling, then you can use subprocess.check_output (Python 2.7+):

>>> subprocess.check_output(["ls", "-l", "/dev/null"])
'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'

Also note the shell parameter. If shell is True, the specified command will be executed through the shell. This can be useful if you are using Python primarily for the enhanced control flow it offers over most system shells and still want convenient access to other shell features such as shell pipes, filename wildcards, environment variable expansion, and expansion of ~ to a user's home directory. However, note that Python itself offers implementations of many shell-like features (in particular, glob, fnmatch, os.walk(), os.path.expandvars(), os.path.expanduser(), and shutil).

Facundo Casco, 2011-04-28 20:29:29
answered Apr 28 '11 at 20:29; edited Jun 3 '18 by Peter Mortensen

Note that check_output requires a list rather than a string. If you don't rely on quoted spaces to make your call valid, the simplest, most readable way to do this is subprocess.check_output("ls -l /dev/null".split()). – Bruno Bronosky Jan 30 '18 at 18:18
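To make the caveat in that comment concrete — a plain str.split breaks quoted arguments apart, while shlex.split keeps them intact. A small sketch, using a hypothetical path with a space:

```python
import shlex

cmd = 'ls -l "/your/path/with spaces/"'

# Naive whitespace split: the quoted path is broken into two argv entries.
print(cmd.split())       # ['ls', '-l', '"/your/path/with', 'spaces/"']

# shlex understands shell quoting rules, so the path stays whole.
print(shlex.split(cmd))  # ['ls', '-l', '/your/path/with spaces/']
```

So .split() is fine only when no argument contains whitespace; otherwise use shlex.split or build the list by hand.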
This is how I run my commands. This code has pretty much everything you need:

from subprocess import Popen, PIPE
cmd = "ls -l ~/"
p = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
out, err = p.communicate()
print "Return code: ", p.returncode
print out.rstrip(), err.rstrip()

Usman Khan, 2012-10-28 05:14:01

I think it's acceptable for hard-coded commands, if it increases readability. – Adam Matan Apr 2 '14 at 13:07
Update: subprocess.run is the recommended approach as of Python 3.5 if your code does not need to maintain compatibility with earlier Python versions. It's more consistent and offers similar ease-of-use as Envoy. (Piping isn't as straightforward, though; see this question for how.)

Here are some examples from the documentation.
Run a process:
>>> subprocess.run(["ls", "-l"])  # Doesn't capture output
CompletedProcess(args=['ls', '-l'], returncode=0)

Raise on failed run:

>>> subprocess.run("exit 1", shell=True, check=True)
Traceback (most recent call last):
  ...
subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status 1

Capture output:

>>> subprocess.run(["ls", "-l", "/dev/null"], stdout=subprocess.PIPE)
CompletedProcess(args=['ls', '-l', '/dev/null'], returncode=0, stdout=b'crw-rw-rw- 1 root root 1, 3 Jan 23 16:23 /dev/null\n')

Original answer: I recommend trying Envoy. It's a wrapper for subprocess, which in turn aims to replace the older modules and functions. Envoy is subprocess for humans.

Example usage from the README:

>>> r = envoy.run('git config', data='data to pipe in', timeout=2)
>>> r.status_code
129
>>> r.std_out
'usage: git config [options]'
>>> r.std_err
''

Pipe stuff around too:

>>> r = envoy.run('uptime | pbcopy')
>>> r.command
'pbcopy'
>>> r.status_code
0
>>> r.history
[<Response 'uptime'>]

Joe, 2012-11-15 17:13:22
Use subprocess ... or, for a very simple command:

import os
os.system('cat testfile')

Ben Hoffstein, 2008-09-18 01:43:30
Calling an external command in Python

Simple: use subprocess.run, which returns a CompletedProcess object:

>>> import subprocess
>>> completed_process = subprocess.run('python --version')
Python 3.6.1 :: Anaconda 4.4.0 (64-bit)
>>> completed_process
CompletedProcess(args='python --version', returncode=0)

Why? As of Python 3.5, the documentation recommends subprocess.run:
The recommended approach to invoking subprocesses is to use the run() function for all use cases it can handle. For more advanced use cases, the underlying Popen interface can be used directly.
Here's an example of the simplest possible usage - and it does exactly as asked:
>>> import subprocess
>>> completed_process = subprocess.run('python --version')
Python 3.6.1 :: Anaconda 4.4.0 (64-bit)
>>> completed_process
CompletedProcess(args='python --version', returncode=0)
run waits for the command to successfully finish, then returns a CompletedProcess object. It may instead raise TimeoutExpired (if you give it a timeout= argument) or CalledProcessError (if it fails and you pass check=True).

As you might infer from the above example, stdout and stderr both get piped to your own stdout and stderr by default.
We can inspect the returned object and see the command that was given and the returncode:
>>> completed_process.args
'python --version'
>>> completed_process.returncode
0

Capturing output

If you want to capture the output, you can pass subprocess.PIPE to the appropriate stderr or stdout:

>>> cp = subprocess.run('python --version', stderr=subprocess.PIPE, stdout=subprocess.PIPE)
>>> cp.stderr
b'Python 3.6.1 :: Anaconda 4.4.0 (64-bit)\r\n'
>>> cp.stdout
b''

(I find it interesting and slightly counterintuitive that the version info gets put to stderr instead of stdout.)
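As an aside, on Python 3.7 and later the same capture can be written more compactly with the capture_output and text keyword arguments. A sketch; sys.executable is used instead of a bare 'python' string so it runs regardless of PATH:

```python
import subprocess
import sys

# capture_output=True is shorthand for stdout=PIPE, stderr=PIPE;
# text=True decodes the captured bytes to str (both added in 3.7).
cp = subprocess.run(
    [sys.executable, "--version"],
    capture_output=True,
    text=True,
)
# Older interpreters print the version to stderr, newer ones to stdout,
# so check both streams.
print(cp.stdout or cp.stderr)
```

On 3.5/3.6 you would keep the explicit stdout=/stderr= form shown above.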
Pass a command list

One might easily move from manually providing a command string (like the question suggests) to providing a string built programmatically. Don't build strings programmatically. This is a potential security issue. It's better to assume you don't trust the input.

>>> import textwrap
>>> args = ['python', textwrap.__file__]
>>> cp = subprocess.run(args, stdout=subprocess.PIPE)
>>> cp.stdout
b'Hello there.\r\n      This is indented.\r\n'

Note: only args should be passed positionally.

Full Signature

Here's the actual signature in the source and as shown by help(run):

def run(*popenargs, input=None, timeout=None, check=False, **kwargs):

The popenargs and kwargs are given to the Popen constructor. input can be a string of bytes (or unicode, if you specify an encoding or universal_newlines=True) that will be piped to the subprocess's stdin.

The documentation describes timeout= and check=True better than I could:

The timeout argument is passed to Popen.communicate(). If the timeout expires, the child process will be killed and waited for. The TimeoutExpired exception will be re-raised after the child process has terminated.
If check is true, and the process exits with a non-zero exit code, a CalledProcessError exception will be raised. Attributes of that exception hold the arguments, the exit code, and stdout and stderr if they were captured.
and this example for check=True is better than one I could come up with:

>>> subprocess.run("exit 1", shell=True, check=True)
Traceback (most recent call last):
  ...
subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status 1

Expanded Signature

Here's an expanded signature, as given in the documentation:
subprocess.run(args, *, stdin=None, input=None, stdout=None, stderr=None, shell=False, cwd=None, timeout=None, check=False, encoding=None, errors=None)

Note that this indicates that only the args list should be passed positionally. So pass the remaining arguments as keyword arguments.
Popen

When should you use Popen instead? I would struggle to find a use-case based on the arguments alone. Direct usage of Popen would, however, give you access to its methods, including poll, send_signal, terminate, and wait.

Here's the Popen signature as given in the source. I think this is the most precise encapsulation of the information (as opposed to help(Popen)):

def __init__(self, args, bufsize=-1, executable=None,
             stdin=None, stdout=None, stderr=None,
             preexec_fn=None, close_fds=_PLATFORM_DEFAULT_CLOSE_FDS,
             shell=False, cwd=None, env=None, universal_newlines=False,
             startupinfo=None, creationflags=0,
             restore_signals=True, start_new_session=False,
             pass_fds=(), *, encoding=None, errors=None):

But more informative is the Popen documentation:

subprocess.Popen(args, bufsize=-1, executable=None, stdin=None, stdout=None, stderr=None, preexec_fn=None, close_fds=True, shell=False, cwd=None, env=None, universal_newlines=False, startupinfo=None, creationflags=0, restore_signals=True, start_new_session=False, pass_fds=(), *, encoding=None, errors=None)

Execute a child program in a new process. On POSIX, the class uses os.execvp()-like behavior to execute the child program. On Windows, the class uses the Windows CreateProcess() function. The arguments to Popen are as follows.
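A minimal sketch of the extra control those Popen methods give you over run — polling a child while it executes and terminating it early (the sleep durations here are arbitrary):

```python
import subprocess
import sys
import time

# Start a child process that would sleep for 30 seconds.
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(30)"])

print(proc.poll())    # None while the child is still running
time.sleep(0.5)
proc.terminate()      # SIGTERM on POSIX, TerminateProcess on Windows
proc.wait()           # reap the child; sets proc.returncode
print(proc.returncode)
```

run cannot do this: it blocks until the child exits (or the timeout fires), so mid-flight control is exactly the niche Popen still fills.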
Understanding the remaining documentation on Popen will be left as an exercise for the reader.

Aaron Hall, 2017-10-18 16:37:52
A simple example of two-way communication between a primary process and a subprocess can be found here: stackoverflow.com/a/52841475/1349673 – James Hirschorn Oct 16 '18 at 18:05
os.system is OK, but kind of dated. It's also not very secure. Instead, try subprocess. subprocess does not call sh directly and is therefore more secure than os.system.

Get more information here.
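The security point can be made concrete. With an argument list and the default shell=False, an untrusted string reaches the program as a single verbatim argument; with shell=True the same string would be parsed by /bin/sh and the embedded command would run. A sketch, assuming ls is on PATH; the malicious filename is hypothetical:

```python
import subprocess

filename = "harmless.txt; rm -rf ~"   # hypothetical untrusted input

# Here the ";" is just a character inside one argv entry, so no second
# command executes -- ls merely fails to find the odd filename.
# With shell=True the same string would be shell-parsed and the rm would run.
result = subprocess.run(["ls", "-l", filename])
print(result.returncode)  # nonzero: file not found, nothing else ran
```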
Martin W , 2008-09-18 01:53:27
answered Sep 18 '08 at 1:53; edited Dec 10 '16 by Dimitris Fasarakis Hilliard

While I agree with the overall recommendation, subprocess does not remove all of the security problems, and has some pesky issues of its own. – tripleee Dec 3 '18 at 5:36
There is also Plumbum:

>>> from plumbum import local
>>> ls = local["ls"]
>>> ls
LocalCommand(<LocalPath /bin/ls>)
>>> ls()
u'build.py\ndist\ndocs\nLICENSE\nplumbum\nREADME.rst\nsetup.py\ntests\ntodo.txt\n'
>>> notepad = local["c:\\windows\\notepad.exe"]
>>> notepad()   # Notepad window pops up
u''             # Notepad window is closed by user, command returns

stuckintheshuck, answered Oct 10 '14 at 17:41
It can be this simple:

import os
cmd = "your command"
os.system(cmd)

Samadi Salahedine, 2018-04-30 13:47:17

This fails to point out the drawbacks, which are explained in much more detail in PEP 324. The documentation for os.system explicitly recommends avoiding it in favor of subprocess. – tripleee Dec 3 '18 at 5:02
Use:

import os
cmd = 'ls -al'
os.system(cmd)

os – This module provides a portable way of using operating system-dependent functionality. For more os functions, here is the documentation.

Priyankara, answered Jun 29 '15 at 11:34

It's also deprecated. Use subprocess. – Corey Goldberg Dec 9 '15 at 18:13
I quite like shell_command for its simplicity. It's built on top of the subprocess module.

Here's an example from the documentation:

>>> from shell_command import shell_call
>>> shell_call("ls *.py")
setup.py  shell_command.py  test_shell_command.py
0
>>> shell_call("ls -l *.py")
-rw-r--r-- 1 ncoghlan ncoghlan  391 2011-12-11 12:07 setup.py
-rw-r--r-- 1 ncoghlan ncoghlan 7855 2011-12-11 16:16 shell_command.py
-rwxr-xr-x 1 ncoghlan ncoghlan 8463 2011-12-11 16:17 test_shell_command.py
0

mdwhatcott, answered Aug 13 '12 at 18:36
There is another difference here which is not mentioned previously. subprocess.Popen executes the <command> as a subprocess. In my case, I need to execute file <a> which needs to communicate with another program, <b>.

I tried subprocess, and execution was successful. However, <b> could not communicate with <a>. Everything is normal when I run both from the terminal.

One more: (Note: kwrite behaves differently from other applications. If you try the below with Firefox, the results will not be the same.)

If you try os.system("kwrite"), program flow freezes until the user closes kwrite. To overcome that I tried instead os.system("konsole -e kwrite"). This time the program continued to flow, but kwrite became a subprocess of the console.

Does anyone know how to run kwrite so that it is not a subprocess (i.e., in the system monitor it must appear at the leftmost edge of the tree)?

Atinc Delican, answered Jan 8 '10 at 21:11

What do you mean by "Anyone runs the kwrite not being a subprocess"? – Peter Mortensen Jun 3 '18 at 20:14
os.system does not allow you to store results, so if you want to store results in some list or something, subprocess.call works.

Saurabh Bangad, answered Jun 11 '12 at 22:28

subprocess.check_call is convenient if you don't want to test return values. It throws an exception on any error.

cdunn2001, 2011-01-18 19:21:44
I tend to use subprocess together with shlex (to handle escaping of quoted strings):

>>> import subprocess, shlex
>>> command = 'ls -l "/your/path/with spaces/"'
>>> call_params = shlex.split(command)
>>> print call_params
["ls", "-l", "/your/path/with spaces/"]
>>> subprocess.call(call_params)

Emil Stenström, 2014-04-30 14:37:04

Shameless plug, I wrote a library for this :P https://github.com/houqp/shell.py

It's basically a wrapper for popen and shlex for now. It also supports piping commands, so you can chain commands more easily in Python. So you can do things like:

ex('echo hello shell.py') | "awk '{print $2}'"

houqp, 2014-05-01 20:49:01
You can use Popen, and then you can check the procedure's status:

from subprocess import Popen
proc = Popen(['ls', '-l'])
if proc.poll() is None:
    proc.kill()

Check out subprocess.Popen.

admire, answered Jul 16 '12 at 15:16
In Windows you can just import the subprocess module and run external commands by calling subprocess.Popen(), subprocess.Popen().communicate() and subprocess.Popen().wait() as below:

# Python script to run a command line
import subprocess

def execute(cmd):
    """
    Purpose  : To execute a command and return its output
    Argument : cmd - command to execute
    Return   : result
    """
    process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    (result, error) = process.communicate()
    rc = process.wait()
    if rc != 0:
        print "Error: failed to execute command:", cmd
        print error
    return result

command = "tasklist | grep python"
print "This process detail: \n", execute(command)

Output:

This process detail:
python.exe    604 RDP-Tcp#0    4    5,660 K

Swadhikar C, answered Jun 17 '16 at 9:14
To fetch the network id from the OpenStack Neutron:

#!/usr/bin/python
import os
netid = "nova net-list | awk '/ External / { print $2 }'"
temp = os.popen(netid).read()  # Here temp also contains the newline (\n)
networkId = temp.rstrip()
print(networkId)

Output of nova net-list:

+--------------------------------------+------------+------+
| ID                                   | Label      | CIDR |
+--------------------------------------+------------+------+
| 431c9014-5b5d-4b51-a357-66020ffbb123 | test1      | None |
| 27a74fcd-37c0-4789-9414-9531b7e3f126 | External   | None |
| 5a2712e9-70dc-4b0e-9281-17e02f4684c9 | management | None |
| 7aa697f5-0e60-4c15-b4cc-9cb659698512 | Internal   | None |
+--------------------------------------+------------+------+

Output of print(networkId):

27a74fcd-37c0-4789-9414-9531b7e3f126

IRSHAD, answered Jul 20 '16 at 9:50

You should not recommend os.popen() in 2016. The Awk script could easily be replaced with native Python code. – tripleee Dec 3 '18 at 5:49

Yuval Atzmon,
Under Linux, in case you would like to call an external command that will execute independently (it will keep running after the Python script terminates), you can use a simple queue such as task spooler or the at command.

An example with task spooler:

import os
os.system('ts <your-command>')

Notes about task spooler (ts):

- You can set the number of concurrent processes to be run ("slots") with: ts -S <number-of-slots>
- Installing ts doesn't require admin privileges. You can download and compile it from source with a simple make, add it to your path, and you're done.
ts is not standard on any distro I know of, though the pointer to at is mildly useful. You should probably also mention batch. As elsewhere, the os.system() recommendation should probably at least mention that subprocess is its recommended replacement. – tripleee Dec 3 '18 at 5:43
Nov 22, 2012 | stackoverflow.com
Vartec's answer doesn't read all lines, so I made a version that did:
import subprocess

def run_command(command):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return iter(p.stdout.readline, b'')

Usage is the same as the accepted answer:
command = 'mysqladmin create test -uroot -pmysqladmin12'.split()
for line in run_command(command):
    print(line)

answered Oct 30 '12 at 9:24 by Max Ekman; edited May 23 '17
- You could use return iter(p.stdout.readline, b'') instead of the while loop. – jfs Nov 22 '12 at 15:44
- That is a pretty cool use of iter, didn't know that! I updated the code. – Max Ekman Nov 28 '12 at 21:53
- I'm pretty sure stdout keeps all output; it's a stream object with a buffer. I use a very similar technique to deplete all remaining output after a Popen has completed, and in my case, using poll() and readline during execution to capture output live also. – Max Ekman Nov 28 '12 at 21:55
- I've removed my misleading comment. I can confirm that p.stdout.readline() may return non-empty previously-buffered output even if the child process has exited already (p.poll() is not None). – jfs Sep 18 '14 at 3:12
- This code doesn't work. See here stackoverflow.com/questions/24340877/ – thang May 3 '15 at 6:00
Nov 13, 2019 | unix.stackexchange.com
Execute shell commands in Python
fooot ,Nov 8, 2017 at 21:39
I'm currently studying penetration testing and Python programming. I just want to know how I would go about executing a Linux command in Python. The commands I want to execute are:

echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -t nat -A PREROUTING -p tcp --destination-port 80 -j REDIRECT --to-port 8080

If I just use
binarysubstrate ,Feb 28 at 19:58
You can use os.system(), like this:

import os
os.system('ls')

Or in your case:

os.system('echo 1 > /proc/sys/net/ipv4/ip_forward')
os.system('iptables -t nat -A PREROUTING -p tcp --destination-port 80 -j REDIRECT --to-port 8080')

Better yet, you can use subprocess's call; it is safer, more powerful, and likely faster:

from subprocess import call
call('echo "I like potatos"', shell=True)

Or, without invoking the shell:

call(['echo', 'I like potatos'])

If you want to capture the output, one way of doing it is like this:

import subprocess
cmd = ['echo', 'I like potatos']
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
o, e = proc.communicate()
print('Output: ' + o.decode('ascii'))
print('Error: ' + e.decode('ascii'))
print('code: ' + str(proc.returncode))

I highly recommend setting a timeout in communicate, and also capturing the exceptions you can get when calling it. This is very error-prone code, so you should expect errors to happen and handle them accordingly.

jordanm, Oct 23, 2015 at 15:43
The first command simply writes to a file. You wouldn't execute that as a shell command because python can read and write to files without the help of a shell:

with open('/proc/sys/net/ipv4/ip_forward', 'w') as f:
    f.write("1")

The iptables command is something you may want to execute externally. The best way to do this is to use the subprocess module:

import subprocess
subprocess.check_call(['iptables', '-t', 'nat', '-A', 'PREROUTING',
                       '-p', 'tcp', '--destination-port', '80',
                       '-j', 'REDIRECT', '--to-port', '8080'])

Note that this method also does not use a shell, which is unnecessary overhead.
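A hedged sketch of what check_call buys you here: a nonzero exit status raises CalledProcessError instead of failing silently. This example assumes a POSIX false binary on the PATH:

```python
import subprocess

try:
    subprocess.check_call(["false"])   # always exits with status 1
except subprocess.CalledProcessError as exc:
    print("command failed with exit status", exc.returncode)
```

For a real iptables invocation you would catch the same exception and inspect exc.returncode rather than checking a return value by hand.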
Tom Hunt ,Oct 23, 2015 at 15:41
The quickest way:

import os
os.system("your command here")

This isn't the most flexible approach; if you need any more control over your process than "run it once, to completion, and block until it exits", then you should use the subprocess module instead.

jordanm, Apr 5, 2018 at 9:23
As a general rule, you'd better use Python bindings whenever possible (better exception catching, among other advantages).

For the echo command, it's obviously better to use Python to write to the file, as suggested in @jordanm's answer.

For the iptables command, maybe python-iptables (PyPi page, GitHub page with description and doc) would provide what you need (I didn't check your specific command).

This would make you depend on an external lib, so you have to weigh the benefits. Using subprocess works, but if you want to use the output, you'll have to parse it yourself, and deal with output changes in future iptables versions.
A Python version of your shell. Be careful, I haven't tested it:

from subprocess import run

def bash(command):
    run(command.split())

>>> bash('find / -name null')
/dev/null
/sys/fs/selinux/null
/sys/devices/virtual/mem/null
/sys/class/mem/null
/usr/lib/kbd/consoletrans/null
Oct 07, 2019 | linuxhandbook.com
... ... ...
You can also store the output of the shell command in a variable in this way:
import os
myCmd = os.popen('ls -la').read()
print(myCmd)

If you run the above program, it will print the content of the variable myCmd, and it will be the same as the output of the ls command we saw earlier.
Now let's see another way of running a Linux command in Python.
Execute shell command in Python with subprocess module
A slightly better way of running shell commands in Python is using the
subprocess module.

If you want to run a shell command without any options and arguments, you can call subprocess like this:

import subprocess
subprocess.call("ls")

The call method will execute the shell command. You'll see the content of the current working directory when you run the program:
python prog.py agatha.txt count1.txt file1.txt prog.py target count count2.txt file2.txt sherlock.txtIf you want to provide the options and the arguments along with the shell command, you'll have to provide them in a list.
import subprocess
subprocess.call(["ls", "-l", "."])

When you run the program, you'll see the content of the current directory in list format.
Now that you know how to run shell command with subprocess, the question arises about storing the output of the shell command.
For this, you'll have to use the Popen function. It returns a Popen object, which has a communicate() method that can be used to get the standard output and error as a tuple. You can learn more about the subprocess module here.

import subprocess
MyOut = subprocess.Popen(['ls', '-l', '.'],
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT)
stdout, stderr = MyOut.communicate()
print(stdout)
print(stderr)

When you run the program, you'll see the stdout and stderr (which is None in this case).
python prog.py total 32 -r--r--r-- 1 abhishek abhishek 456 Dec 11 21:29 agatha.txt -rw-r--r-- 1 abhishek abhishek 0 Jan 17 12:11 count -rw-r--r-- 1 abhishek abhishek 14 Jan 10 16:12 count1.txt -rw-r--r-- 1 abhishek abhishek 14 Jan 10 16:12 count2.txt --w-r--r-- 1 abhishek abhishek 356 Jan 17 12:10 file1.txt -rw-r--r-- 1 abhishek abhishek 356 Dec 17 09:59 file2.txt -rw-r--r-- 1 abhishek abhishek 212 Jan 17 16:54 prog.py -rw-r--r-- 1 abhishek abhishek 356 Dec 11 21:35 sherlock.txt drwxr-xr-x 3 abhishek abhishek 4096 Jan 4 20:10 target None... ... ...
Mar 27, 2014 | cmdlinetips.com
In Python, you may often want to execute a Linux command and get its output as a string variable. There are multiple ways to execute a shell command and get its output using Python. A naive way to do that is to execute the Linux command, save the output in a file, and parse the file.
Get output from shell command using subprocess
import os
cmd = 'wc -l my_text_file.txt > out_file.txt'
os.system(cmd)
A better way to get the output from executing a linux command in Python is to use Python module "subprocess". Here is an example of using "subprocess" to count the number of lines in a file using "wc -l" linux command.
Let us first import the subprocess module
# import subprocess library
import subprocess
Launch the shell command that we want to execute using the subprocess.Popen function. The arguments to this call are the shell command as a list, plus the stdout and stderr specifications.
out = subprocess.Popen(['wc', '-l', 'my_text_file.txt'],
                       stdout=subprocess.PIPE,
                       stderr=subprocess.STDOUT)
The output from subprocess.Popen is a subprocess.Popen object. This object has a number of methods associated with it, and we will be using the communicate() method to get the standard output and error as a tuple.
Here the standard output contains the result of the wc -l command, and stderr contains None as there are no errors.
>>> stdout, stderr = out.communicate()
>>> print(stdout)
3503 my_text_file.txt
>>> print(stderr)
None
We can then parse the stdout to get the result from the shell command in Python, in whatever way we want. For example, if we just want the number of lines in the file, we split the stdout:
>>> stdout.split()[0]
'3503'
Oct 13, 2019 | stackoverflow.com
Look at the subprocess module in the standard library:
import subprocess
subprocess.run(["ls", "-l"])

The advantage of subprocess vs. system is that it is more flexible (you can get the stdout, stderr, the "real" status code, better error handling, etc.).

The official documentation recommends the subprocess module over the alternative os.system():

The subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using this function [os.system()].

The "Replacing Older Functions with the subprocess Module" section in the subprocess documentation may have some helpful recipes.

For versions of Python before 3.5, use call:

import subprocess
subprocess.call(["ls", "-l"])
- Is there a way to use variable substitution? I.e., I tried to do echo $PATH by using call(["echo", "$PATH"]), but it just echoed the literal string $PATH instead of doing any substitution. I know I could get the PATH environment variable, but I'm wondering if there is an easy way to have the command behave exactly as if I had executed it in bash. – Kevin Wheeler Sep 1 '15 at 23:17
- @KevinWheeler You should NOT use shell=True; for this purpose Python comes with os.path.expandvars. In your case you can write os.path.expandvars("$PATH"). @SethMMorton please reconsider your comment -> Why not to use shell=True – Murmel Nov 11 '15 at 20:24
- To simplify, at least conceptually: call("ls -l".split()) – slehar Jun 16 '18 at 17:15
- If you want to create a list out of a command with parameters, a list which can be used with subprocess when shell=False, then use shlex.split for an easy way to do this: docs.python.org/2/library/shlex.html#shlex.split – Daniel F Sep 20 '18 at 18:05
- You forgot to say that it needs at least Python 3.5. It doesn't work on Python 3.4.3, for example, which is the default for Ubuntu 14.04 LTS. – pulse Feb 2 at 22:50
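As the comments point out, shell features like $PATH expansion are not available without shell=True; os.path.expandvars covers variable expansion, and shlex.split builds the argument list. A small sketch (the GREETING variable is made up for the demo):

```python
import os
import shlex
import subprocess

# Expand environment variables in Python instead of relying on the shell.
os.environ["GREETING"] = "hello"                  # demo variable
expanded = os.path.expandvars("$GREETING world")  # 'hello world'

# shlex.split turns a command string into the list form subprocess expects.
args = shlex.split('echo -n "two words"')         # ['echo', '-n', 'two words']

result = subprocess.run(args, stdout=subprocess.PIPE)
```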
Jean ,Dec 3, 2018 at 6:00
Here's a summary of the ways to call external programs and the advantages and disadvantages of each:
- os.system("some_command with args") passes the command and arguments to your system's shell. This is nice because you can actually run multiple commands at once in this manner and set up pipes and input/output redirection. For example:

    os.system("some_command < input_file | another_command > output_file")

However, while this is convenient, you have to manually handle the escaping of shell characters such as spaces, etc. On the other hand, this also lets you run commands which are simply shell commands and not actually external programs. See the documentation.
- stream = os.popen("some_command with args") will do the same thing as os.system except that it gives you a file-like object that you can use to access standard input/output for that process. There are 3 other variants of popen that all handle the i/o slightly differently. If you pass everything as a string, then your command is passed to the shell; if you pass them as a list then you don't need to worry about escaping anything. See the documentation.
- The Popen class of the subprocess module. This is intended as a replacement for os.popen but has the downside of being slightly more complicated by virtue of being so comprehensive. For example, you'd say:

    print subprocess.Popen("echo Hello World", shell=True, stdout=subprocess.PIPE).stdout.read()

instead of:

    print os.popen("echo Hello World").read()

but it is nice to have all of the options there in one unified class instead of 4 different popen functions. See the documentation.
- The call function from the subprocess module. This is basically just like the Popen class and takes all of the same arguments, but it simply waits until the command completes and gives you the return code. For example:

    return_code = subprocess.call("echo Hello World", shell=True)

See the documentation.
- If you're on Python 3.5 or later, you can use the new subprocess.run function, which is a lot like the above but even more flexible and returns a CompletedProcess object when the command finishes executing.
- The os module also has all of the fork/exec/spawn functions that you'd have in a C program, but I don't recommend using them directly.

The subprocess module should probably be what you use.

Finally, please be aware that for all methods where you pass the final command as a string to be executed by the shell, you are responsible for escaping it. There are serious security implications if any part of the string that you pass cannot be fully trusted; for example, if a user is entering some or any part of the string. If you are unsure, only use these methods with constants. To give you a hint of the implications, consider this code:

    print subprocess.Popen("echo %s" % user_input, stdout=PIPE).stdout.read()

and imagine that the user enters something like "my mama didnt love me && rm -rf /", which could erase the whole filesystem.
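The injection hazard disappears when the arguments are passed as a list, because no shell ever parses the string; when a shell really is required, shlex.quote can neutralize untrusted input. A sketch:

```python
import shlex
import subprocess

user_input = "harmless && echo PWNED"  # hostile-looking input

# Safe: list form, no shell involved; the string is a single literal argument.
safe = subprocess.run(["echo", user_input], stdout=subprocess.PIPE)

# If a shell is unavoidable, quote the untrusted part first.
cmd = "echo %s" % shlex.quote(user_input)
shelled = subprocess.run(cmd, shell=True, stdout=subprocess.PIPE)
```

In both cases the hostile input is printed verbatim instead of being executed.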
jfs ,Dec 3, 2018 at 5:39
Typical implementation:

    import subprocess
    p = subprocess.Popen('ls', shell=True,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    for line in p.stdout.readlines():
        print line,
    retval = p.wait()

You are free to do what you want with the stdout data in the pipe. In fact, you can simply omit those parameters (stdout= and stderr=) and it'll behave like os.system().

maranas ,Feb 24 at 19:05
Some hints on detaching the child process from the calling one (starting the child process in the background). Suppose you want to start a long task from a CGI script; that is, the child process should live longer than the CGI script's own execution.
The classical example from the subprocess module docs is:
    import subprocess
    import sys
    # some code here
    pid = subprocess.Popen([sys.executable, "longtask.py"])  # call subprocess
    # some more code here

The idea here is that you do not want to wait in the line 'call subprocess' until longtask.py is finished. But it is not clear what happens after the line 'some more code here' from the example.
My target platform was freebsd, but the development was on windows, so I faced the problem on windows first.
On windows (win xp), the parent process will not finish until the longtask.py has finished its work. It is not what you want in CGI-script. The problem is not specific to Python, in PHP community the problems are the same.
The solution is to pass DETACHED_PROCESS Process Creation Flag to the underlying CreateProcess function in win API. If you happen to have installed pywin32 you can import the flag from the win32process module, otherwise you should define it yourself:
    DETACHED_PROCESS = 0x00000008
    pid = subprocess.Popen([sys.executable, "longtask.py"],
                           creationflags=DETACHED_PROCESS).pid

/* UPD 2015.10.27: @eryksun notes in a comment below that the semantically correct flag is CREATE_NEW_CONSOLE (0x00000010) */
On freebsd we have another problem: when the parent process is finished, it finishes the child processes as well. And that is not what you want in CGI-script either. Some experiments showed that the problem seemed to be in sharing sys.stdout. And the working solution was the following:
    pid = subprocess.Popen([sys.executable, "longtask.py"],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.PIPE)

I have not checked the code on other platforms and do not know the reasons for the behaviour on freebsd. If anyone knows, please share your ideas. Googling on starting background processes in Python does not shed any light yet.
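On POSIX systems, Python 3.2+ offers start_new_session=True, which runs setsid() in the child and is a tidier way to detach than the redirection tricks above. A sketch, using a short sleep as a stand-in for longtask.py:

```python
import subprocess
import sys

# The child gets its own session, so it is not killed with the parent's group.
child = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(1)"],
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True)   # POSIX only; calls setsid() in the child

running = child.poll() is None  # still alive right after launch
child.wait()                    # demo only; a detached script would not wait
```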
Daniel F ,Dec 4, 2018 at 8:36
I'd recommend using the subprocess module instead of os.system because it does shell escaping for you and is therefore much safer: http://docs.python.org/library/subprocess.html

    subprocess.call(['ping', 'localhost'])

Fox Wilson ,Nov 7, 2017 at 23:19
    import os
    cmd = 'ls -al'
    os.system(cmd)

If you want to return the results of the command, you can use os.popen. However, this is deprecated since version 2.6 in favor of the subprocess module, which other answers have covered well.

Peter Mortensen ,Dec 3, 2018 at 18:41
    import os
    os.system("your command")

Note that this is dangerous, since the command isn't cleaned. I leave it up to you to google for the relevant documentation on the 'os' and 'sys' modules. There are a bunch of functions (exec* and spawn*) that will do similar things.
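For completeness, the exec*/spawn* family mentioned above can run a program directly without a shell; a minimal sketch with os.spawnv:

```python
import os
import sys

# P_WAIT blocks until the child exits and returns its exit status.
status = os.spawnv(os.P_WAIT, sys.executable,
                   [sys.executable, "-c", "raise SystemExit(0)"])
```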
Tom Fuller ,Oct 29, 2016 at 14:02
There are lots of different libraries which allow you to call external commands with Python. For each library I've given a description and shown an example of calling an external command. The command I used as the example is ls -l (list all files). If you want to find out more about any of the libraries, I've listed and linked the documentation for each of them.

Sources:
- subprocess: https://docs.python.org/3.5/library/subprocess.html
- shlex: https://docs.python.org/3/library/shlex.html
- os: https://docs.python.org/3.5/library/os.html
- sh: https://amoffat.github.io/sh/
- plumbum: https://plumbum.readthedocs.io/en/latest/
- pexpect: https://pexpect.readthedocs.io/en/stable/
- fabric: http://www.fabfile.org/
- envoy: https://github.com/kennethreitz/envoy
- commands: https://docs.python.org/2/library/commands.html
These are all the libraries; hopefully this will help you make a decision on which one to use :)
subprocess

Subprocess allows you to call external commands and connect them to their input/output/error pipes (stdin, stdout, and stderr). Subprocess is the default choice for running commands, but sometimes other modules are better.

    subprocess.run(["ls", "-l"])  # run command
    subprocess.run(["ls", "-l"], stdout=subprocess.PIPE)  # run the command and return any output
    subprocess.run(shlex.split("ls -l"))  # you can also use the shlex library to split the command

os

os is used for "operating system dependent functionality". It can also be used to call external commands with os.system and os.popen (note: there is also a subprocess.Popen). os will always run the shell and is a simple alternative for people who don't need to, or don't know how to, use subprocess.run.

    os.system("ls -l")  # run command
    os.popen("ls -l").read()  # run the command and return any output

sh
sh is a subprocess interface which lets you call programs as if they were functions. This is useful if you want to run a command multiple times.
    sh.ls("-l")  # run command normally
    ls_cmd = sh.Command("ls")  # save command as a variable
    ls_cmd()  # run command as if it were a function

plumbum
plumbum is a library for "script-like" Python programs. You can call programs like functions as in
sh
. Plumbum is useful if you want to run a pipeline without the shell.

    ls_cmd = plumbum.local("ls -l")  # get command
    ls_cmd()  # run command

pexpect
pexpect lets you spawn child applications, control them and find patterns in their output. This is a better alternative to subprocess for commands that expect a tty on Unix.
    pexpect.run("ls -l")  # run command as normal
    child = pexpect.spawn('scp foo [email protected]:.')  # spawn child application
    child.expect('Password:')  # when this is the output
    child.sendline('mypassword')

fabric
fabric is a Python 2.5 and 2.7 library. It allows you to execute local and remote shell commands. Fabric is a simple alternative for running commands in a secure shell (SSH).

    fabric.operations.local('ls -l')  # run command as normal
    fabric.operations.local('ls -l', capture=True)  # run command and receive output

envoy
envoy is known as "subprocess for humans". It is used as a convenience wrapper around the
subprocess
module.

    r = envoy.run("ls -l")  # run command
    r.std_out  # get output

commands
commands contains wrapper functions for os.popen, but it has been removed from Python 3 since subprocess is a better alternative.

The edit was based on J.F. Sebastian's comment.
Jorge E. Cardona ,Mar 13, 2012 at 0:12
I always use fabric for things like:

    from fabric.operations import local
    result = local('ls', capture=True)
    print "Content:\n%s" % (result, )

But sh (a Python subprocess interface) also seems to be a good tool. Look at an example:

    from sh import vgdisplay
    print vgdisplay()
    print vgdisplay('-v')
    print vgdisplay(v=True)

athanassis ,Oct 7, 2010 at 7:09
Check the "pexpect" Python library, too. It allows for interactive controlling of external programs/commands, even ssh, ftp, telnet, etc. You can just type something like:

    child = pexpect.spawn('ftp 192.168.0.24')
    child.expect('(?i)name .*: ')
    child.sendline('anonymous')
    child.expect('(?i)password')

Bruno Bronosky ,Jan 30, 2018 at 18:18
If you need the output from the command you are calling, then you can use subprocess.check_output (Python 2.7+).

    >>> subprocess.check_output(["ls", "-l", "/dev/null"])
    'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'

Also note the shell parameter.
If shell is True, the specified command will be executed through the shell. This can be useful if you are using Python primarily for the enhanced control flow it offers over most system shells and still want convenient access to other shell features such as shell pipes, filename wildcards, environment variable expansion, and expansion of ~ to a user's home directory. However, note that Python itself offers implementations of many shell-like features (in particular, glob, fnmatch, os.walk(), os.path.expandvars(), os.path.expanduser(), and shutil).

community wiki
5 revs ,Sep 19, 2018 at 13:55

With Standard Library

Use the subprocess module (Python 3):

    import subprocess
    subprocess.run(['ls', '-l'])

It is the recommended standard way. However, more complicated tasks (pipes, output, input, etc.) can be tedious to construct and write.
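Pipes are indeed the tedious part: without shell=True, the equivalent of a shell pipeline means chaining two Popen objects by hand. A sketch of piping a producer into sort (the producer here is just a Python one-liner):

```python
import subprocess
import sys

# Producer: prints two lines out of order.
p1 = subprocess.Popen([sys.executable, "-c", "print('b'); print('a')"],
                      stdout=subprocess.PIPE)

# Consumer: sort reads the producer's stdout directly.
p2 = subprocess.Popen(["sort"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()          # allow p1 to receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
p1.wait()
```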
Note on Python version: If you are still using Python 2, subprocess.call works in a similar way.
ProTip: shlex.split can help you to parse the command for run, call, and other subprocess functions in case you don't want (or you can't!) provide them in the form of lists:

    import shlex
    import subprocess
    subprocess.run(shlex.split('ls -l'))

With External Dependencies

If you do not mind external dependencies, use plumbum:
    from plumbum.cmd import ifconfig
    print(ifconfig['wlan0']())

It is the best subprocess wrapper. It's cross-platform, i.e. it works on both Windows and Unix-like systems. Install by pip install plumbum.

Another popular library is sh:

    from sh import ifconfig
    print(ifconfig('wlan0'))

However, sh dropped Windows support, so it's not as awesome as it used to be. Install by pip install sh.

Eric ,Apr 2, 2014 at 13:07
This is how I run my commands. This code has pretty much everything you need:

    from subprocess import Popen, PIPE
    cmd = "ls -l ~/"
    p = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
    out, err = p.communicate()
    print "Return code: ", p.returncode
    print out.rstrip(), err.rstrip()

Joe ,Nov 15, 2012 at 17:13
Update:
subprocess.run
is the recommended approach as of Python 3.5 if your code does not need to maintain compatibility with earlier Python versions. It's more consistent and offers similar ease-of-use as Envoy. (Piping isn't as straightforward though. See this question for how .)Here's some examples from the docs .
Run a process:

    >>> subprocess.run(["ls", "-l"])  # doesn't capture output
    CompletedProcess(args=['ls', '-l'], returncode=0)

Raise on failed run:

    >>> subprocess.run("exit 1", shell=True, check=True)
    Traceback (most recent call last):
      ...
    subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status 1

Capture output:

    >>> subprocess.run(["ls", "-l", "/dev/null"], stdout=subprocess.PIPE)
    CompletedProcess(args=['ls', '-l', '/dev/null'], returncode=0,
    stdout=b'crw-rw-rw- 1 root root 1, 3 Jan 23 16:23 /dev/null\n')

Original answer: I recommend trying Envoy. It's a wrapper for subprocess, which in turn aims to replace the older modules and functions. Envoy is subprocess for humans.

Example usage from the readme:

    >>> r = envoy.run('git config', data='data to pipe in', timeout=2)
    >>> r.status_code
    129
    >>> r.std_out
    'usage: git config [options]'
    >>> r.std_err
    ''

Pipe stuff around too:

    >>> r = envoy.run('uptime | pbcopy')
    >>> r.command
    'pbcopy'
    >>> r.status_code
    0
    >>> r.history
    [<Response 'uptime'>]

tripleee ,Dec 3, 2018 at 5:33
Without the output of the result:

    import os
    os.system("your command here")

With output of the result:

    import commands
    commands.getoutput("your command here")
    # or
    commands.getstatusoutput("your command here")

Ben Hoffstein ,Sep 18, 2008 at 1:43
https://docs.python.org/2/library/subprocess.html

...or for a very simple command:

    import os
    os.system('cat testfile')

stuckintheshuck ,Oct 10, 2014 at 17:41
There is also Plumbum:

    >>> from plumbum import local
    >>> ls = local["ls"]
    >>> ls
    LocalCommand(<LocalPath /bin/ls>)
    >>> ls()
    u'build.py\ndist\ndocs\nLICENSE\nplumbum\nREADME.rst\nsetup.py\ntests\ntodo.txt\n'
    >>> notepad = local["c:\\windows\\notepad.exe"]
    >>> notepad()  # Notepad window pops up
    u''            # Notepad window is closed by user, command returns

tripleee ,Dec 3, 2018 at 5:36
os.system is OK, but kind of dated. It's also not very secure. Instead, try subprocess. subprocess does not call sh directly and is therefore more secure than os.system.

Get more information here.
James Hirschorn ,Dec 3, 2018 at 5:16
Calling an external command in Python

Simple: use subprocess.run, which returns a CompletedProcess object:

    >>> import subprocess
    >>> completed_process = subprocess.run('python --version')
    Python 3.6.1 :: Anaconda 4.4.0 (64-bit)
    >>> completed_process
    CompletedProcess(args='python --version', returncode=0)

Why? As of Python 3.5, the documentation recommends subprocess.run:
The recommended approach to invoking subprocesses is to use the run() function for all use cases it can handle. For more advanced use cases, the underlying Popen interface can be used directly.
Here's an example of the simplest possible usage - and it does exactly as asked:
    >>> import subprocess
    >>> completed_process = subprocess.run('python --version')
    Python 3.6.1 :: Anaconda 4.4.0 (64-bit)
    >>> completed_process
    CompletedProcess(args='python --version', returncode=0)

run waits for the command to successfully finish, then returns a CompletedProcess object. It may instead raise TimeoutExpired (if you give it a timeout= argument) or CalledProcessError (if it fails and you pass check=True).

As you might infer from the above example, stdout and stderr both get piped to your own stdout and stderr by default.
We can inspect the returned object and see the command that was given and the returncode:

    >>> completed_process.args
    'python --version'
    >>> completed_process.returncode
    0

Capturing output

If you want to capture the output, you can pass subprocess.PIPE to the appropriate stderr or stdout:

    >>> cp = subprocess.run('python --version',
    ...                     stderr=subprocess.PIPE,
    ...                     stdout=subprocess.PIPE)
    >>> cp.stderr
    b'Python 3.6.1 :: Anaconda 4.4.0 (64-bit)\r\n'
    >>> cp.stdout
    b''

(I find it interesting and slightly counterintuitive that the version info gets put to stderr instead of stdout.)
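On Python 3.7+, the same capture can be spelled more briefly with capture_output=True, and text=True decodes the bytes for you. A sketch using the running interpreter so the example is self-contained:

```python
import subprocess
import sys

cp = subprocess.run(
    [sys.executable, "-c", "print('ocean')"],
    capture_output=True,  # shorthand for stdout=PIPE, stderr=PIPE (3.7+)
    text=True)            # decode bytes to str
```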
Pass a command list

One might easily move from manually providing a command string (like the question suggests) to providing a string built programmatically. Don't build strings programmatically. This is a potential security issue. It's better to assume you don't trust the input.

    >>> import textwrap
    >>> args = ['python', textwrap.__file__]
    >>> cp = subprocess.run(args, stdout=subprocess.PIPE)
    >>> cp.stdout
    b'Hello there.\r\n      This is indented.\r\n'

Note: only args should be passed positionally.

Full Signature

Here's the actual signature in the source and as shown by help(run):

    def run(*popenargs, input=None, timeout=None, check=False, **kwargs):

The popenargs and kwargs are given to the Popen constructor. input can be a string of bytes (or unicode, if you specify encoding or universal_newlines=True) that will be piped to the subprocess's stdin.

The documentation describes timeout= and check=True better than I could:

The timeout argument is passed to Popen.communicate(). If the timeout expires, the child process will be killed and waited for. The TimeoutExpired exception will be re-raised after the child process has terminated.
If check is true, and the process exits with a non-zero exit code, a CalledProcessError exception will be raised. Attributes of that exception hold the arguments, the exit code, and stdout and stderr if they were captured.
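Both behaviours are easy to exercise directly; a sketch using the running interpreter as the child:

```python
import subprocess
import sys

# check=True turns a non-zero exit status into CalledProcessError.
try:
    subprocess.run([sys.executable, "-c", "raise SystemExit(3)"], check=True)
    failed_code = None
except subprocess.CalledProcessError as exc:
    failed_code = exc.returncode

# timeout= kills the child and raises TimeoutExpired.
try:
    subprocess.run([sys.executable, "-c", "import time; time.sleep(10)"],
                   timeout=1)
    timed_out = False
except subprocess.TimeoutExpired:
    timed_out = True
```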
and this example for check=True is better than one I could come up with:

    >>> subprocess.run("exit 1", shell=True, check=True)
    Traceback (most recent call last):
      ...
    subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status 1

Expanded Signature

Here's an expanded signature, as given in the documentation:

    subprocess.run(args, *, stdin=None, input=None, stdout=None, stderr=None,
                   shell=False, cwd=None, timeout=None, check=False,
                   encoding=None, errors=None)

Note that this indicates that only the args list should be passed positionally. So pass the remaining arguments as keyword arguments.
Popen

When to use Popen instead? I would struggle to find a use case based on the arguments alone. Direct usage of Popen would, however, give you access to its methods, including poll, send_signal, terminate, and wait.

Here's the Popen signature as given in the source. I think this is the most precise encapsulation of the information (as opposed to help(Popen)):

    def __init__(self, args, bufsize=-1, executable=None,
                 stdin=None, stdout=None, stderr=None,
                 preexec_fn=None, close_fds=_PLATFORM_DEFAULT_CLOSE_FDS,
                 shell=False, cwd=None, env=None, universal_newlines=False,
                 startupinfo=None, creationflags=0,
                 restore_signals=True, start_new_session=False,
                 pass_fds=(), *, encoding=None, errors=None):

But more informative is the Popen documentation:

    subprocess.Popen(args, bufsize=-1, executable=None, stdin=None,
                     stdout=None, stderr=None, preexec_fn=None,
                     close_fds=True, shell=False, cwd=None, env=None,
                     universal_newlines=False, startupinfo=None,
                     creationflags=0, restore_signals=True,
                     start_new_session=False, pass_fds=(),
                     *, encoding=None, errors=None)

Execute a child program in a new process. On POSIX, the class uses os.execvp()-like behavior to execute the child program. On Windows, the class uses the Windows CreateProcess() function. The arguments to Popen are as follows.

Understanding the remaining documentation on Popen will be left as an exercise for the reader.

Corey Goldberg ,Dec 9, 2015 at 18:13
Use:

    import os
    cmd = 'ls -al'
    os.system(cmd)

os - This module provides a portable way of using operating system-dependent functionality.

For more os functions, here is the documentation.

tripleee ,Dec 3, 2018 at 5:02
It can be this simple:

    import os
    cmd = "your command"
    os.system(cmd)

tripleee ,Dec 3, 2018 at 4:59
Use the os module:

    import os
    os.system("your command")

e.g.:

    import os
    os.system("ifconfig")

Peter Mortensen ,Jun 3, 2018 at 20:14
There is another difference here which is not mentioned previously. subprocess.Popen executes the <command> as a subprocess. In my case, I need to execute file <a>, which needs to communicate with another program, <b>.

I tried subprocess, and execution was successful. However, <b> could not communicate with <a>. Everything is normal when I run both from the terminal.

One more: (NOTE: kwrite behaves differently from other applications. If you try the below with Firefox, the results will not be the same.)

If you try os.system("kwrite"), program flow freezes until the user closes kwrite. To overcome that I tried os.system("konsole -e kwrite") instead. This time the program continued to flow, but kwrite became a subprocess of the console.

Can anyone say how to run kwrite so that it is not a subprocess (i.e., so that in the system monitor it appears at the leftmost edge of the tree)?
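The blocking difference described here is easy to demonstrate without a GUI program: os.system waits for the child to finish, while Popen returns immediately. A sketch (timings are approximate):

```python
import os
import subprocess
import sys
import time

# os.system blocks until the command completes.
t0 = time.time()
os.system('"%s" -c "import time; time.sleep(1)"' % sys.executable)
blocking_elapsed = time.time() - t0   # roughly 1 second

# Popen returns at once; the child runs concurrently.
t0 = time.time()
p = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(1)"])
popen_elapsed = time.time() - t0      # nearly 0
p.wait()
```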
cdunn2001 ,Jan 18, 2011 at 19:21
subprocess.check_call is convenient if you don't want to test return values. It throws an exception on any error.

Emil Stenström ,Apr 30, 2014 at 14:37
I tend to use subprocess together with shlex (to handle escaping of quoted strings):

    >>> import subprocess, shlex
    >>> command = 'ls -l "/your/path/with spaces/"'
    >>> call_params = shlex.split(command)
    >>> print call_params
    ['ls', '-l', '/your/path/with spaces/']
    >>> subprocess.call(call_params)

mdwhatcott ,Aug 13, 2012 at 18:36
I quite like shell_command for its simplicity. It's built on top of the subprocess module. Here's an example from the docs:

    >>> from shell_command import shell_call
    >>> shell_call("ls *.py")
    setup.py  shell_command.py  test_shell_command.py
    0
    >>> shell_call("ls -l *.py")
    -rw-r--r-- 1 ncoghlan ncoghlan  391 2011-12-11 12:07 setup.py
    -rw-r--r-- 1 ncoghlan ncoghlan 7855 2011-12-11 16:16 shell_command.py
    -rwxr-xr-x 1 ncoghlan ncoghlan 8463 2011-12-11 16:17 test_shell_command.py
    0

Saurabh Bangad ,Jun 11, 2012 at 22:28
os.system does not allow you to store results, so if you want to store results in some list or something, subprocess.call works.

houqp ,May 1, 2014 at 20:49
Shameless plug: I wrote a library for this :P https://github.com/houqp/shell.py

It's basically a wrapper for popen and shlex for now. It also supports piping commands, so you can chain commands more easily in Python. So you can do things like:

    ex('echo hello shell.py') | "awk '{print $2}'"

tripleee ,Dec 3, 2018 at 5:43
Under Linux, in case you would like to call an external command that will execute independently (i.e., will keep running after the Python script terminates), you can use a simple queue such as task spooler, or the at command.

An example with task spooler:

    import os
    os.system('ts <your-command>')

Notes about task spooler (ts):

- You can set the number of concurrent processes to be run ("slots") with: ts -S <number-of-slots>
- Installing ts doesn't require admin privileges. You can download and compile it from source with a simple make, add it to your path, and you're done.

> ,Apr 16, 2013 at 20:23
You can use Popen, and then you can check the procedure's status:

    from subprocess import Popen
    proc = Popen(['ls', '-l'])
    if proc.poll() is None:
        proc.kill()

Check out subprocess.Popen.
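Expanding on that, poll, terminate, and wait cover the usual Popen lifecycle; a sketch using a sleeping child instead of ls so there is something to stop:

```python
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(30)"])

still_running = proc.poll() is None  # None means "not finished yet"
proc.terminate()                     # sends SIGTERM on POSIX
returncode = proc.wait()             # reap the child; negative = killed by signal
```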
python - Running shell command and capturing the output - Stack Overflow -- useful discussion that shows how misguided Python designers are
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: September 07, 2020