I modified the Java code to fix this error: ORA-01000: maximum open cursors exceeded.


ggrabbe

Jun 23, 2010, 5:14:08 PM
to mybatis-user
FYI, I made some code changes to the POOLED datasource to fix an error
I was getting from Oracle:

ORA-01000: maximum open cursors exceeded


I think the problem was that connections were being reused after too many
checkout requests. So I modified the pooled datasource properties to use a new
property that I added:


<property name="poolMaximumConnectRequests" value="150" />


Then, in the PooledConnection Java code, I added a request counter
that gets incremented each time the connection is checked out of the pool.

private int numRequests = 0;

If that number exceeds the maximum for the PooledDataSource then it
does not get checked back into the pool.

if (conn.isValid()) {
  if (conn.getNumRequests() >= this.poolMaximumConnectRequests) {
    if (log.isDebugEnabled()) {
      log.debug("A connection (" + conn.getRealHashCode()
          + ") has reached the maximum requests, closing.");
    }
    ...


That fixed the errors and now the pool is very fast.
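
In case anyone wants to try the same thing, the other half is just a counter
bump at checkout. A sketch, not the actual patch: the incrementRequests()
accessor name and the popConnection()/pushConnection() placement are my own,
based on the description above.

// In PooledConnection: accessors for the counter shown above (sketch).
public int getNumRequests() {
  return numRequests;
}

public void incrementRequests() {
  numRequests++;
}

// In PooledDataSource.popConnection(), right before handing the
// connection out: count the checkout.
conn.incrementRequests();

// In PooledDataSource.pushConnection(), instead of returning a "worn out"
// connection to the idle list, close the real JDBC connection so the pool
// will create a fresh one later (sketch).
conn.getRealConnection().close();
conn.invalidate();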



Ellis Miller

Jun 23, 2010, 5:27:18 PM
to mybati...@googlegroups.com
There is also an Oracle database parameter related to this error:

open_cursors

SQL> ALTER SYSTEM SET open_cursors = 500;

Also, every time a connection is established and/or SQL is executed against the database, a cursor is opened (either explicitly or implicitly). And if you are using Java to call a PL/SQL stored procedure that uses an explicit cursor, for example, then even after the Java connection is closed the cursor opened by the stored procedure can remain open.

Your fix makes complete sense, but I thought I'd augment the discussion: you could, perhaps, look at the database parameter settings in conjunction with the maximum connection pool size.
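
If it helps, here is a quick way to see from JDBC how close each session is to
the open_cursors limit. Treat it as a sketch: it uses the standard Oracle
dynamic performance views, and the querying user needs SELECT privilege on
V$SESSTAT, V$STATNAME and V$SESSION.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

// Print current open-cursor counts per session, highest first.
public static void printOpenCursorCounts(Connection con) throws Exception {
    String sql =
        "SELECT s.sid, s.username, st.value AS open_cursors "
      + "FROM v$sesstat st "
      + "JOIN v$statname sn ON sn.statistic# = st.statistic# "
      + "JOIN v$session s ON s.sid = st.sid "
      + "WHERE sn.name = 'opened cursors current' "
      + "ORDER BY st.value DESC";
    Statement stmt = con.createStatement();
    try {
        ResultSet rs = stmt.executeQuery(sql);
        while (rs.next()) {
            System.out.println(rs.getInt(1) + "\t" + rs.getString(2) + "\t" + rs.getInt(3));
        }
        rs.close();
    } finally {
        stmt.close();   // releases the cursor used by this query itself
    }
}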

Thanks;) 

Ellis R. Miller
937.985.3816
937.830.6027


Mundo Nulla Fides

Guy Rouillier

Jun 24, 2010, 1:21:27 AM
to mybati...@googlegroups.com
On 6/23/2010 5:14 PM, ggrabbe wrote:
> FYI, I made some code changes to the POOLED datasource to fix an error
> I was getting from Oracle:
>
> ORA-01000: maximum open cursors exceeded
>
>
> I think the problem was the connections were being used after too many
> connection requests.

No, this is a very common Oracle problem. It is unrelated to how many
times a connection from a pool is used. Oracle doesn't know or care
about client-side connection pools; from Oracle's perspective, they are
all open connections.

What causes the "maximum open cursors exceeded" error is failure to
close result sets.
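
The usual culprit looks like this (illustrative only; the "users" table and
the con/ids variables are made up): a statement or result set created in a
loop and never closed. Always closing in a finally block keeps the per-session
cursor count flat.

// Leaky pattern: each executeQuery() opens a cursor on the Oracle side,
// and nothing ever closes it, so open cursors pile up in the session.
for (String id : ids) {
    PreparedStatement ps = con.prepareStatement("SELECT name FROM users WHERE id = ?");
    ps.setString(1, id);
    ResultSet rs = ps.executeQuery();
    // ... read from rs, but rs and ps are never closed -> ORA-01000 eventually
}

// Safe pattern: close in finally, which releases the cursor even on exceptions.
PreparedStatement ps = con.prepareStatement("SELECT name FROM users WHERE id = ?");
try {
    ps.setString(1, id);
    ResultSet rs = ps.executeQuery();
    try {
        while (rs.next()) {
            // use rs ...
        }
    } finally {
        rs.close();
    }
} finally {
    ps.close();
}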

--
Guy Rouillier

Daniel López

Jun 24, 2010, 5:19:02 AM
to mybati...@googlegroups.com
2010/6/24 Guy Rouillier <gu...@burntmail.com>:

In Oracle, not exactly. As Ellis pointed out, if you call PL/SQL
procedures through JDBC and reuse the connections, you will end up with
such an exception sooner or later, no matter whether you close all your
result sets or not. Some kinds of implicit cursors are not closed in
PL/SQL, so you end up reaching the limit no matter what. We
implemented a "number of usages" limit in our pool a long time ago
precisely because of that.

On a related note, there is also another problem when calling PL/SQL
procedures with Oracle, as the calls are "cached" at the user/session
level, not the connection level. So even if you physically close a
connection, the PL/SQL calls are kept in the cache to be reused,
consuming memory. That cache is only flushed when you close all the
connections from the pool at the same time, so we also implemented a
timeout: when no connection has been used for a while, we close them
all to reset the cache. That, and making really sure that the strings
sent to the driver use PreparedStatement and are properly reusable, to
consume the minimum amount of cache memory necessary.
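
For what it's worth, the "close everything after a quiet period" part does not
need much code. A sketch only: our pool was home-grown, but with MyBatis you
could lean on PooledDataSource.forceCloseAll(), which closes every active and
idle connection at once. The lastUsed bookkeeping below is an assumption; hook
it to whatever last-activity signal your application already has.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;
import org.apache.ibatis.datasource.pooled.PooledDataSource;

public class IdleFlusher {
    private final PooledDataSource dataSource;
    private final AtomicLong lastUsed = new AtomicLong(System.currentTimeMillis());
    private final long idleLimitMillis = 15 * 60 * 1000L;   // e.g. 15 minutes

    public IdleFlusher(PooledDataSource dataSource) {
        this.dataSource = dataSource;
        ScheduledExecutorService ses = Executors.newSingleThreadScheduledExecutor();
        ses.scheduleWithFixedDelay(new Runnable() {
            public void run() {
                if (System.currentTimeMillis() - lastUsed.get() > idleLimitMillis) {
                    // Dropping every connection lets Oracle release the
                    // session-level cursor/statement cache for this user.
                    IdleFlusher.this.dataSource.forceCloseAll();
                }
            }
        }, 1, 1, TimeUnit.MINUTES);
    }

    // Call this from the data-access layer whenever a connection is used.
    public void touch() {
        lastUsed.set(System.currentTimeMillis());
    }
}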

Isn't it fun playing with Oracle? :).
D.

Larry Meadors

Jun 24, 2010, 7:10:01 AM
to mybati...@googlegroups.com
Hm, odd. I've used Oracle for 10+ years and never had that problem. Ever. :-/

Larry

Bhaarat Sharma

Jun 24, 2010, 7:41:51 AM
to mybati...@googlegroups.com
We're also having problems similar to this.  In production, where there is a lot of load on the application, we intermittently get "Closed Connection" errors.  I'm beginning to wonder whether all this is related.

Daniel, would you mind sharing some code that you used to implement things like time out and resetting the cache?

Thanks
-bhaarat

Giovanni Cuccu

Jun 24, 2010, 7:58:17 AM
to mybati...@googlegroups.com
Hi,
I'm currently using Oracle and I do not have the problems you
describe. I worked on some applications that were using big
packages with implicit cursors and I never saw an ORA-01000.
Oracle does cache the cursors, but it closes them if they are out of
scope and more are needed.
On the MyBatis side you should never encounter this error, because
(prepared) statements are closed by the framework after use.
You can see an ORA-01000 only if you issue a high number of truly
concurrent queries (i.e. you open some cursors before closing other
ones); the maximum number, as Ellis Miller said, is the open_cursors
server parameter.
The big drawback is that the database views (the V$ ones) do not let you
find cached cursors that are out of scope (true for 10g; I did not check 11g).
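
In other words, as long as you go through the framework in the usual way,
statement cleanup is not your problem. A sketch (the
"UserMapper.selectActiveUsers" statement id is made up for the example):

import java.util.List;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

// Typical MyBatis usage: the framework prepares, executes and closes the
// statement for you; closing the session returns the connection to the pool.
public List<Object> loadActiveUsers(SqlSessionFactory factory) {
    SqlSession session = factory.openSession();
    try {
        return session.selectList("UserMapper.selectActiveUsers");
    } finally {
        session.close();   // always close the session, even on exceptions
    }
}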
ciao,
Giovanni

--------------------------------------------------------------------
"You don't know the power of dark side" - Darth Vader

Daniel López

Jun 24, 2010, 9:02:25 AM
to mybati...@googlegroups.com
Well, I went with what the DB guys told me was happening on the DB
side, so it might have been a bug that slipped through the cracks: once
the implicit cursors were offered as an explanation, no thorough
checking was performed. It was a very traffic-intensive site, so it might
have been the number of concurrent queries you pointed out, as they
were just calling procedures with prepared statements and very few
cursors were being opened at the client side in code. And even though it
was not my code, it was not too difficult to check that they were being
closed properly, so I trusted the guys doing it.
All in all, it is good to know it's not like that, as it made some things harder.

OTOH, the thing about the prepared statement cache being shared among
all open connections and not being flushed unless all of them
are closed at the same time was something I saw myself with the DB
guys, recycling connections and all. Was that also a confusion?

Thanks all for the correction.
D.

PS: Note to self: don't trust others' code without seeing it :).

2010/6/24 Giovanni Cuccu <giovann...@gmail.com>:

François Schiettecatte

Jun 24, 2010, 9:07:45 AM
to mybati...@googlegroups.com
And I don't trust my code unless someone else has seen it :)

Giovanni Cuccu

Jun 24, 2010, 9:15:52 AM
to mybati...@googlegroups.com
I'm not sure I understand your question and what you saw, but if you use
a connection pool and each connection executes the same statements (as
usually happens), then on the database side you see those statements in the
corresponding V$ views until you stop all of them. That is because the
same statement (i.e. the same SQL) is executed by all the connections, so the
cursor disappears only when you close all the connections in the pool.
Hope it helps,

Daniel López

Jun 24, 2010, 1:15:43 PM
to mybati...@googlegroups.com
Well, actually what we saw is that the full string for the callable
statement was being cached in DB memory, on the server side, and even
when we closed a connection in the pool those statements were
still kept in memory, as long as there was a single connection
open. As soon as we closed them all and re-opened them again, the
memory was flushed.

The problem in that case was that, even though we were using parameters
appropriately for the PL/SQL procedures, the full string contained some
initialization parameters that could be slightly different for each
user. So, as far as the cache was concerned, almost every call was slightly
different from the previous one, and the memory kept filling up. And you can
end up killing your DB like that :(.

As soon as we detected it, we made the init parameters part of
the callable statement parameters as well, in order to have just one version
of each call no matter which user was calling it. But what surprised us
was that the first workaround we tried, recycling the connections
more often, did not work because of that "shared pool of statements
among all open connections".
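
To make the before/after concrete (the procedure and variable names are
invented for the example, and con is assumed to be an open java.sql.Connection):
the problem was call text that varied per user, and the fix was making
everything a bind parameter so the text is identical for every caller.

// Before: the per-user init values are concatenated into the call text, so
// Oracle sees a "different" statement for almost every user and the shared
// pool fills up with near-duplicates.
CallableStatement bad = con.prepareCall(
    "{call init_and_search('" + userLanguage + "', '" + userRegion + "', ?, ?)}");

// After: every value is a bind parameter, so the call text is byte-for-byte
// identical for all users and Oracle can reuse one cached cursor.
CallableStatement good = con.prepareCall("{call init_and_search(?, ?, ?, ?)}");
good.setString(1, userLanguage);
good.setString(2, userRegion);
good.setString(3, searchTerm);
good.registerOutParameter(4, java.sql.Types.INTEGER);
good.execute();
int resultCount = good.getInt(4);
good.close();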

S!
D.


2010/6/24 Giovanni Cuccu <giovann...@gmail.com>:

Chema

Jun 25, 2010, 7:19:04 AM
to mybati...@googlegroups.com
> but what surprised us
> was that the first workaround that we tried, recycling the connections
> more often, did not work because of that "shared pool of statements
> among all connections open".
>

Well, Oracle maintains a "shared SQL area" for all users, caching all
SQL statements, so I can understand this behaviour.

What I don't know is why that area gets flushed when you close all the
connections of your pool.
Isn't there any session still active?

Regards

Daniel López

Jun 25, 2010, 7:43:30 AM
to mybati...@googlegroups.com
AFAIK, the only thing that Oracle flushes when you close all the
sessions of a given user is the SQL used by that user. So,
just at the user level.
That's what we had to do (close all the sessions of a user at the same
time) until we found out that we could reuse the statements much better,
so we did not fill up the cache.

S!
D.

2010/6/25 Chema <demab...@gmail.com>:

Daniel López

Jun 25, 2010, 12:23:57 PM
to mybati...@googlegroups.com
Experimentally. We were seeing the cache grow with the statements
being sent by the different connections of the application user. We
recycled them one by one and the cache would not empty, even after
recycling them all a couple of times. The only way we could get the
cache to "forget" those queries was to have no connection open for the
user, and then start the pool again.

It might have been an issue with that specific version of Oracle, as
that was a while ago, working on a very high-traffic site. Since
then I've had no problems, because none of our current applications
are really accessed 24/7, so the pool shuts itself down after 15 minutes
of inactivity.

S!
D.

2010/6/25 Chema <demab...@gmail.com>:
> 2010/6/25 Daniel López <d.lo...@gmail.com>:


>> AFAIK, the only part that Oracle flushes when you close all the
>> sessions of a given user are the SQL sentences used by this user. So,
>> just at the user level.
>

>  is this what you saw (experimentally), or is this behaviour
> documented by Oracle?
>
> I'm not an expert on Oracle, but I've only found information about
> cached SQL statements for all users (the shared SQL area).
>
> Thanks and regards
>
