I'm new to Python (mostly) and I want to use it for a new project
I'm faced with.
I have a machine (PLC) that is dumping its test results into a fixed-
length text file. I need to pull this data into a database (MySQL
most likely) so that I can access it with Crystal Reports to create
daily reports for my engineers.
I've been reading the Python manual for about a week now and I'm
learning a lot. Unfortunately, I was given a deadline today that I
cannot meet without a little help.
I need to know how to write a script that will DAILY pull this text
file into a MySQL database.
Can anyone show me how to do this?
Thanks
Stacey
Just use MySQL's built-in LOAD DATA INFILE statement to import the
information. You don't really need a Python script. LOAD DATA can handle
fixed-width records (as well as CSV, etc.).
-Larry
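For reference, a sketch of what that could look like. The table and
column names here are made up; the key detail is that when both FIELDS
TERMINATED BY and ENCLOSED BY are empty strings, MySQL switches to
fixed-row format and sizes each field by the column's display width:

```sql
-- Hypothetical staging table matching five fixed-width fields
CREATE TABLE raw_results (
    serial  CHAR(16),
    station CHAR(4),
    v1      CHAR(8),
    v2      CHAR(8),
    v3      CHAR(8)
);

-- Empty terminator/encloser strings make LOAD DATA read fixed-row
-- format, using each column's display width as the field width
LOAD DATA INFILE '/somewhere/somefile.txt'
INTO TABLE raw_results
FIELDS TERMINATED BY '' ENCLOSED BY ''
LINES TERMINATED BY '\n';
```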
If the input data has to be pre-processed before storing it in the
database, a Python script would be needed.
Just in case somebody needs a module for reading fixed-length files in
the spirit of module csv:
http://www.stroeder.com/pylib/fixedlenfields.py
For the MySQL part:
http://mysql-python.sourceforge.net/
Ciao, Michael.
Show us code & data, and we'll show you how to improve the code.
Or would you be able to write the script with the above information?
Diez
import MySQLdb

con = MySQLdb.connect(host='127.0.0.1',
                      port=4321, user='joe', passwd='shmoe', db='tst')
cursor = con.cursor()
sql = 'select * from YOUR_TABLE'
cursor.execute(sql)
results = cursor.fetchall()   # note: fetch() is not a DB-API method
con.close()
=====================================================
hope that helps. good luck
Edwin
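For the OP's direction (inserting parsed rows), the same DB-API pattern
works with executemany. A minimal runnable sketch, using the stdlib
sqlite3 module in place of MySQLdb so it runs without a server; the
table name and rows are made up, and with MySQLdb the placeholder would
be %s rather than ?:

```python
import sqlite3

# Same DB-API 2.0 pattern as MySQLdb; sqlite3 used here for illustration.
con = sqlite3.connect(':memory:')
cursor = con.cursor()
cursor.execute('CREATE TABLE results (serial TEXT, station TEXT, value TEXT)')

rows = [('SN0001', 'ST01', 'PASS'),
        ('SN0002', 'ST01', 'FAIL')]
# executemany inserts one row per tuple; placeholders quote values safely
cursor.executemany('INSERT INTO results VALUES (?, ?, ?)', rows)
con.commit()

cursor.execute('SELECT COUNT(*) FROM results')
count = cursor.fetchone()[0]
print(count)   # 2
con.close()
```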
While you are correct, that is not what the OP asked. There is no
reference to processing data prior to insertion into the MySQL database.
Also, the OP said they had a one-day deadline.
-Larry
While it has nothing to do with python, I found that creating a MySQL
table with the proper fixed length char() fields and using 'load data
infile' was the easiest way to deal with that sort of scenario. The
python script is the secondary part, that handles the normalization
and proper typing of the first table to the second, permanent storage
area. But in this case, the more advanced bits are the database and
SQL details, and python is just a very convenient way to build the SQL
statements and execute them.
I'm really not sure what the best way to deal with fixed-length data
is in Python. I might define a list with the field lengths and use
string slicing to get the items. As a first thought:
myfile = '/somewhere/somefile.txt'
sizes = [16, 4, 8, 8, 8]
fd = open(myfile, 'r')   # the mode must be the string 'r', not a bare name
data = []
for line in fd.readlines():
    a = []
    idx1 = 0
    for l in sizes:
        idx2 = idx1 + l
        a.append(line[idx1:idx2])
        idx1 += l
    data.append(a)
fd.close()
print(data)
This isn't tested, and there are probably more elegant ways to do it,
but for quick and dirty I think it should work.
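Another quick way to slice fixed-width lines is the stdlib struct
module: build a format string from the field widths and unpack the
whole line at once. A sketch, assuming the same example widths:

```python
import struct

sizes = [16, 4, 8, 8, 8]
# Build a struct format like '16s4s8s8s8s' from the field widths
fmt = ''.join('%ds' % n for n in sizes)

line = '123456789012345678901234567890123456789012345678901234567890'
# struct.unpack wants exactly sum(sizes) bytes, so trim the line first
fields = [f.decode() for f in struct.unpack(fmt, line[:sum(sizes)].encode())]
print(fields)
# ['1234567890123456', '7890', '12345678', '90123456', '78901234']
```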
Larry, having a bad day?
I'm confident that the OP is able to sort out for *himself* what he
needs. Also, the one-day deadline would not be an obstacle. Would it
for you?
Ciao, Michael.
def sizes2fields(sizes):
    # (start, end) pairs from the field widths -- this helper was
    # missing from the original post
    fields, start = [], 0
    for size in sizes:
        fields.append((start, start + size))
        start += size
    return fields

def slicestring(s, fields):
    d = []
    for i in fields:
        d.append(s[i[0]:i[1]])
    return tuple(d)

sizes = [16, 4, 8, 8, 8]
s = '123456789012345678901234567890123456789012345678901234567890'
print(slicestring(s, sizes2fields(sizes)))
==========================================================
prints out:
('1234567890123456', '7890', '12345678', '90123456', '78901234')
hope it helps.
thanks Edwin
Those who are not paid by the keystroke and/or prefer to expend
keystrokes on meaningful names might like an alternative like this:
def sizes2offsets(sizes):
    offsets = []
    begin = 0
    for size in sizes:
        end = begin + size
        offsets.append((begin, end))
        begin = end
    return offsets
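If a newer Python is available, the same offset list can be built from
a running sum with itertools.accumulate (available from Python 3.2; the
thread's code is 2.x-era, so this is just an alternative sketch):

```python
from itertools import accumulate

def sizes2offsets(sizes):
    # (begin, end) pairs from a running sum of the field widths
    ends = list(accumulate(sizes))
    begins = [0] + ends[:-1]
    return list(zip(begins, ends))

offsets = sizes2offsets([16, 4, 8, 8, 8])
print(offsets)
# [(0, 16), (16, 20), (20, 28), (28, 36), (36, 44)]
```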
This is even shorter (and, IMHO, clearer):

def split(s, fields):
    ret = []
    for field in fields:
        s, d = s[field:], s[:field]
        ret.append(d)
    return ret

sizes = [16, 4, 8, 8, 8]
s = '123456789012345678901234567890123456789012345678901234567890'
print(split(s, sizes))
Alternatively, if what you've got is the partition positions instead of
the field widths:

def split(s, sizes):
    ret = []
    start = sizes[0]
    for end in sizes[1:]:
        r, start = s[start:end], end
        ret.append(r)
    return ret

sizes = [0, 16, 20, 28, 36, 44]
s = '123456789012345678901234567890123456789012345678901234567890'
print(split(s, sizes))
Michael Stroder:
> If the input data has to be pre-processed before storing it into the
> database a Python script would be needed.
But not for converting the fixed-length fields: you could just have
SQL process the fixed-length string and then retrieve it back as
separate fields.
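For example, a sketch of that SQL-side conversion, assuming the raw
line was loaded whole into a single column named raw in a staging
table (all table and column names here are made up):

```sql
-- Split a 44-character fixed-width line with SUBSTRING(str, pos, len);
-- positions are 1-based, widths are 16, 4, 8, 8, 8
INSERT INTO results (serial, station, v1, v2, v3)
SELECT SUBSTRING(raw,  1, 16),
       SUBSTRING(raw, 17,  4),
       SUBSTRING(raw, 21,  8),
       SUBSTRING(raw, 29,  8),
       SUBSTRING(raw, 37,  8)
FROM raw_lines;
```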