[Fwd: [python-sybase] memory usage]

Andrew Thomson andrewjt at applecomm.net
Mon, 17 Jan 2005 10:23:59 +1100


Sorry to be a bit of a Pete-and-Repeat here... but should the python-sybase
module be able to handle 2,000,000 stored procedure calls...?

I'm kind of at a loose end here.

Regards,

ajt.

-------- Forwarded Message --------
From: Andrew Thomson <andrewjt@applecomm.net>
To: python-sybase@www.object-craft.com.au
Subject: [python-sybase] memory usage
Date: Mon, 20 Dec 2004 12:22:11 +1100

I did notice another post about high memory usage. I too am experiencing
the same thing. However, my machine eventually runs out of physical and
swap memory, and then the script is terminated! :(

Basically I have a script that is parsing around 2,000,000 rows of data.
For every row, a field is extracted and then passed to a stored
procedure to get some information back.

Watching the process in top, I can see the memory increasing steadily.
If I take out the Sybase stored procedure stuff and just hard code a
value of the stored procedure output, the script runs to completion
never using more than 4mb of memory.

However with the Sybase query in, memory just steadily increases.

  PID USERNAME  PRI NICE   SIZE    RES STATE    TIME   WCPU    CPU COMMAND
89567 ajt       112    0 21604K 19320K RUN      1:42 28.81% 28.81% python

I'm running my script on FreeBSD 5.3 using the port: py24-sybase-0.36_1

My script does the following:

import fileinput
import Sybase

# connect to db
db = Sybase.connect('dbhost', 'user', 'pass', 'db')

# loop over the data file
for text in fileinput.input("datafile"):

    # extract the field to look up
    msisdn = text.strip()

    # get a cursor
    mnpcursor = db.cursor()

    # run the stored proc
    mnpcursor.callproc('MnpMsisdn', [msisdn])

    # get the output
    row = mnpcursor.fetchone()

    # do stuff with row

    # close the cursor
    mnpcursor.close()
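One thing I've been wondering is whether allocating a fresh cursor per row
is part of the problem, and whether reusing a single cursor across the loop
would behave differently. Here's the shape of that variant, sketched with
sqlite3 from the standard library just so it's easy to run anywhere (the
table, column names, and values are made up for illustration, and I don't
yet know whether this actually changes python-sybase's memory behaviour):

```python
import sqlite3

# Stand-in database so the sketch is self-contained.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE mnp (msisdn TEXT, carrier TEXT)")
db.executemany("INSERT INTO mnp VALUES (?, ?)",
               [("0411000000", "CarrierA"), ("0422000000", "CarrierB")])

# One cursor, created outside the loop and reused for every row,
# with each result fully drained before the next call.
cur = db.cursor()
results = []
for msisdn in ["0411000000", "0422000000"]:
    cur.execute("SELECT carrier FROM mnp WHERE msisdn = ?", (msisdn,))
    row = cur.fetchone()      # drain the result before reusing the cursor
    results.append(row[0])    # do stuff with row

cur.close()
db.close()
print(results)  # ['CarrierA', 'CarrierB']
```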

Is there a better way to be doing this, or something I'm missing?

Kind regards,

ajt.

-- 
Andrew Thomson <andrewjt@applecomm.net>