Pipe list to Python instead of string


I finally figured out how to pipe a string to Python and convert it to a list.

do shell script "echo 'a, b c, d\\c' | python -c '
import sys
x = sys.stdin.read()
y = x.split(\", \")
print y'"

Does anyone know how to send Python a list instead of a string? Something other than ‘echo’? I’m not sure yet whether Python can take a list as standard input. Still working on that.
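One thing I’m experimenting with is skipping stdin entirely and passing each item as a separate shell argument, since arguments arrive in Python as the sys.argv list. A rough sketch (subprocess just stands in for the shell call here, and it’s modern Python 3 syntax rather than the Python 2 used above):

```python
import subprocess
import sys

# Each shell argument arrives as a separate element of sys.argv, so
# the receiving script gets a real list without parsing stdin at all.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1:])", "a", "b c", "d"],
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # ['a', 'b c', 'd']
```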


That’s alright. I’m quite sure that Python only takes stdin as a string. I tried piping a list to Python, and stdin came through as an empty string.

Thanks anyway,

Just confirming your statement.

Pipes (FIFOs) are files where one process writes data in at one end and another process reads it out at the other. Because pipes send byte streams rather than data packets, the receiving (reading) process always has to parse (analyse) the data it reads. Since the received data must be parsed anyway, it makes sense to treat it as a string, just as command-line (text) interpreters do. There may be a few commands I’m not aware of, but generally every command parses stdin as a string.
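A minimal sketch of that in modern Python 3: two separate writes on a pipe come out as one undifferentiated byte stream, and the reader has to parse the structure back out itself.

```python
import os

# Two separate writes on a pipe arrive as one undifferentiated byte
# stream: the reader sees no boundary between them and must parse.
r, w = os.pipe()
os.write(w, b"first,")
os.write(w, b"second")
os.close(w)

chunks = []
while True:
    chunk = os.read(r, 1024)
    if not chunk:          # empty read means the writer closed its end
        break
    chunks.append(chunk)
os.close(r)

data = b"".join(chunks)
print(data)                # b'first,second' -- one blob, not two messages
fields = data.decode().split(",")
print(fields)              # ['first', 'second'] -- recovered by parsing
```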

For the record: there are also unix domain sockets, which are basically pipes 2.0. They provide two-way communication and can send data packets instead of only a byte stream. The advantage of a unix domain socket over a pair of pipes is that the receiver can interpret the received data immediately and doesn’t need to parse it first.
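To illustrate the difference (a sketch using Python 3’s socketpair with a datagram-type unix socket; any packet-oriented socket behaves this way):

```python
import socket

# A datagram socket pair preserves message boundaries: each send()
# comes back out of recv() as one discrete packet, no parsing needed.
left, right = socket.socketpair(socket.AF_UNIX, socket.SOCK_DGRAM)
left.send(b"first")
left.send(b"second")
msg1 = right.recv(1024)
msg2 = right.recv(1024)
print(msg1)   # b'first'  -- one packet
print(msg2)   # b'second' -- the next packet, still separate
left.close()
right.close()
```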

Hi DJ,

Thanks for the confirmation. I wasn’t sure if you could normally only send strings.

I have decided to use tab-delimited fields and carriage-return-delimited lines, since those characters are rarely used in the data itself. Or maybe linefeeds instead of returns, because ‘do shell script’ has problems with returns.
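A quick sketch of how such data would come apart on the Python side (modern Python 3; the payload is just illustrative):

```python
# Sketch: fields separated by tabs, records by linefeeds. Both
# characters are rare in ordinary text, so splitting is unambiguous.
payload = "a\tb c\td\nsecond\tline\t!"
records = [line.split("\t") for line in payload.split("\n")]
print(records)  # [['a', 'b c', 'd'], ['second', 'line', '!']]
```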

Thanks a lot,


You can configure a terminal device, and thereby a pty indirectly, to read, say, binary data via the stty command, which has a -raw flag for not interpreting input into lines; usually we operate in cooked mode. This doesn’t pertain to do shell script, of course, since it is set up with a preconfigured pty which you probably can’t override: many ptys really “hardcode” their settings, so they know they have a sane device beneath them.

If you should ever play with the stty command, it is important to remember the sane setting (stty sane), which puts the terminal back into a state that should work, if not awesomely.

As an alternative to using sockets, one can use mkfifo to have two processes simultaneously read and write the same named pipe. (The protocol for how they should communicate is for you to implement.)
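A minimal Python 3 sketch of two processes meeting at a FIFO (the path and message are made up for the example; the "protocol" here is simply one message, then EOF):

```python
import os
import tempfile

# Sketch: the child writes into a named pipe, the parent reads it out.
path = os.path.join(tempfile.mkdtemp(), "demo.fifo")
os.mkfifo(path)

pid = os.fork()
if pid == 0:                          # child: the writer
    with open(path, "wb") as fifo:
        fifo.write(b"hello through the fifo")
    os._exit(0)

with open(path, "rb") as fifo:        # parent: open blocks until writer opens
    received = fifo.read()
os.waitpid(pid, 0)
os.unlink(path)
print(received)                       # b'hello through the fifo'
```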

Since sockets were just mentioned, I thought I’d bring it up here: I have posted a thread at Code Exchange (Cheapest interprocess communication ever) that describes this approach. There are also some great blog posts out there delving into mkfifo, should you need them.

That’s a great approach, because tab-delimited files, like CSV files, can be read, parsed and interpreted at the same time, unlike XML for example. Personally I prefer CSV files with quoted fields, so there is no limitation on which characters the data can contain. But you can implement quoted fields in tab-delimited files as well.
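For example, Python’s csv module keeps its quoting rules independent of the delimiter, so quoted fields combine with tab separation just as they do with commas (a sketch in modern Python 3):

```python
import csv
import io

# The csv module's quoting rules work with any delimiter, so a quoted
# field may even contain the tab that otherwise separates fields.
data = 'plain\t"has\ta tab inside"\t"has ""quotes"""\n'
row = next(csv.reader(io.StringIO(data), delimiter="\t"))
print(row)   # ['plain', 'has\ta tab inside', 'has "quotes"']
```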

I would use linefeeds, because your system is a unix system (linefeed-terminated lines). The reason the do shell script command coerces linefeeds into returns is that AppleScript originally comes from classic Mac OS, where the carriage return was the line terminator. To suppress this you can add “without altering line endings” to the end of the do shell script command, and the output comes back into your script exactly as your Python command printed it.

So with a tab-delimited file or CSV you would have different line terminators on different systems:
Mac OS - carriage return
Unix (Mac OS X) - linefeed
DOS (Windows) - carriage return followed by a linefeed
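In modern Python 3, str.splitlines() happens to recognise all three conventions at once, which is handy when a file’s origin is unknown (a sketch):

```python
# str.splitlines() handles all three line-ending conventions.
mac = "one\rtwo\rthree"       # classic Mac OS: carriage return
unix = "one\ntwo\nthree"      # unix: linefeed
dos = "one\r\ntwo\r\nthree"   # DOS/Windows: CR followed by LF
results = [text.splitlines() for text in (mac, unix, dos)]
print(results)  # each is ['one', 'two', 'three']
```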

Hi DJ,

Quoted fields sound like a great idea! I was just thinking about scripts that use leading tabs. And with only one field you don’t even need a comma.

What do you mean by read, parse, and interpret at the same time?


Oh, I think you mean with line editors! That’s right.

After some thinking I remembered that there is a csv module in Python.

set csvData to "Hello World!;\"quoted data\"
\"quoted data with \"\"-symbol\";and another field"

do shell script "echo " & quoted form of csvData & " | python -c '
import csv
import sys

reader = csv.reader(sys.stdin, delimiter='\\'';'\\'')
for row in reader:
    print \"%r\" % row'"

As you see, it handles the quoting perfectly. The code behind the csv parser is a stream parser written in C; it’s as fast as you can get and there is almost no delay. In other words, reading the data into a string versus directly into a list via the csv module makes only a marginal difference.

edit: In your example you want a single list:

do shell script "echo 'a, b c, d' | python -c '
import sys
import csv

reader = csv.reader(sys.stdin)
theList = reader.next()
print theList'"

csv looks good to me. I couldn’t figure out a good way to remove those quotes myself. :slight_smile: Also, you can write CSV files if needed. Strange how Python sometimes lumps tabs and spaces together; I guess it just looks at whitespace.

Edited: ran your script and it works perfectly.

Edited: BTW, great examples.

Thanks a lot,