Command-line data pipes


Stefan Urbanek

Apr 4, 2012, 7:41:17 PM
to datab...@googlegroups.com
Hi,

I've implemented experimental command-line data pipes to the brewery command:

pipe - Create and run a simple, non-branched pipe stream. Each argument is either a node or a node attribute. Attributes have the form: attribute_name=value. At least one node has to be specified. If there is no source node, CSV on standard input is assumed; if there is no target node, CSV on standard output is assumed.
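The argument handling described above can be sketched roughly like this. This is a simplified illustration of the parsing idea, not the actual brewery implementation; the function name and the exact error behavior are my assumptions:

```python
def parse_pipe_args(args):
    """Split command-line arguments into (node_name, attributes) pairs.

    An argument containing '=' is treated as attribute_name=value for the
    most recently seen node; any other argument starts a new node.
    (Hypothetical helper - not part of brewery itself.)
    """
    nodes = []
    for arg in args:
        if "=" in arg:
            if not nodes:
                raise ValueError("attribute given before any node: %s" % arg)
            name, _, value = arg.partition("=")
            nodes[-1][1][name] = value
        else:
            nodes.append((arg, {}))
    return nodes

# Example: brewery pipe csv_source resource=data.csv pretty_printer
print(parse_pipe_args(["csv_source", "resource=data.csv", "pretty_printer"]))
# → [('csv_source', {'resource': 'data.csv'}), ('pretty_printer', {})]
```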

Example - audit a CSV:

$ cat data.csv | brewery pipe audit

Make the output nicer:

$ cat data.csv | brewery pipe audit pretty_printer

Read CSV from a file and store it in a newly created SQLite database table:

$ brewery pipe csv_source resource=data.csv \
             sql_table_target \
                url=sqlite:///data.sqlite \
                table=data  \
                create=1 \
                replace=1
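For comparison, here is roughly what that pipe does, sketched with just the Python standard library (this is my plain-Python approximation, not brewery's node implementation; all columns are loaded as TEXT, matching the "no type conversion" caveat below):

```python
import csv
import sqlite3

def csv_to_sqlite(csv_path, db_path, table, replace=True):
    """Load a CSV file into an SQLite table, all columns as TEXT.

    Hypothetical helper approximating csv_source -> sql_table_target
    with create=1 and replace=1.
    """
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)       # first row is the header
        rows = list(reader)

    conn = sqlite3.connect(db_path)
    if replace:
        conn.execute('DROP TABLE IF EXISTS "%s"' % table)
    cols = ", ".join('"%s" TEXT' % c for c in header)
    conn.execute('CREATE TABLE "%s" (%s)' % (table, cols))
    placeholders = ", ".join("?" for _ in header)
    conn.executemany('INSERT INTO "%s" VALUES (%s)' % (table, placeholders), rows)
    conn.commit()
    conn.close()
```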

WARNING: This command is experimental and you might run into issues. There is no type conversion of values, which might cause problems. There is no way to specify non-scalar values (arrays, dictionaries). Some nodes might not have properly implemented attributes, so you might get a non-existing attribute error even though the attribute is there.

Let me know what you think.

Enjoy,

Stefan Urbanek
data analyst and data brewmaster

Twitter: @Stiivi


