Tomcat using c3p0 Datasource, maxPoolSize exceeded


I have configured Tomcat to use a ComboPooledDataSource by adding the following to my context.xml.


<Resource
    name="jdbc/abcdatasource"
    auth="Container"
    type="com.mchange.v2.c3p0.ComboPooledDataSource"
    factory="org.apache.naming.factory.BeanFactory"
    maxPoolSize="20"
    minPoolSize="5"
    maxIdleTime="3000"
    acquireIncrement="1"
    user="abc"
    password="pw_here"
    driverClass="oracle.jdbc.driver.OracleDriver"
    jdbcUrl="jdbc:oracle:thin:@abc.def.ghi.net:1521:BLAH"
/>
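
For completeness, this is roughly how the application obtains that DataSource at runtime (a minimal sketch, not the actual application code; it assumes the standard Tomcat JNDI path java:comp/env):

import java.sql.Connection;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class DataSourceLookup {
    public static Connection openConnection() throws Exception {
        // Tomcat publishes container-managed resources under java:comp/env
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup("java:comp/env/jdbc/abcdatasource");
        // Callers must close the Connection so it is returned to the c3p0 pool
        return ds.getConnection();
    }
}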

I want to limit the db connections to 20, as defined in the maxPoolSize, but my open connections are exceeding this number. Yesterday, for example, I had 35 open connections. My logs are capturing the following.


2014-09-11 00:37:47,077  INFO [com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2] NewPooledConnection:725 -  - [c3p0] Exceptions occurred while trying to close a PooledConnection's resources normally.
2014-09-11 00:37:47,077  INFO [com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2] NewPooledConnection:735 -  - [c3p0] NewPooledConnection close Exception.
java.sql.SQLException: Io exception: Connection timed out
        at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
        at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:146)
        at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:255)
        at oracle.jdbc.driver.T4CConnection.logoff(T4CConnection.java:481)
        at oracle.jdbc.driver.PhysicalConnection.close(PhysicalConnection.java:1203)
        at com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:549)
        at com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:234)
        at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.destroyResource(C3P0PooledConnectionPool.java:470)
        at com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask.run(BasicResourcePool.java:964)
        at com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:547)
2014-09-11 00:37:47,078  WARN [com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2] BasicResourcePool:972 -  - Failed to destroy resource: com.mchange.v2.c3p0.impl.NewPooledConnection@3ca5d95d
java.sql.SQLException: Some resources failed to close properly while closing com.mchange.v2.c3p0.impl.NewPooledConnection@3ca5d95d
        at com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:571)
        at com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:234)
        at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.destroyResource(C3P0PooledConnectionPool.java:470)
        at com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask.run(BasicResourcePool.java:964)
        at com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:547)

Given this information, what is a likely cause for my open connections to be in excess of my maxPoolSize?


I know there are related stackoverflow threads, such as here and here, but it's not clear what the issue could be in my case. In those similar threads there were multiple connection pools, but I would expect a single connection pool when I configure my web application container to use the c3p0 ComboPooledDataSource. Maybe I'm missing something there...

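For reference, here is a minimal sketch of how the number of live c3p0 pools in the JVM could be checked from inside the application (illustrative only, not code from my application; C3P0Registry and PooledDataSource are part of the c3p0 API):

import java.sql.SQLException;
import com.mchange.v2.c3p0.C3P0Registry;
import com.mchange.v2.c3p0.PooledDataSource;

public class PoolAudit {
    public static void logPoolCounts() throws SQLException {
        // One entry per live c3p0 pool in this JVM; more than one entry for the
        // same jdbcUrl would suggest the DataSource was instantiated more than once.
        for (Object o : C3P0Registry.getPooledDataSources()) {
            PooledDataSource pds = (PooledDataSource) o;
            System.out.println(pds
                    + " total=" + pds.getNumConnectionsDefaultUser()
                    + " busy=" + pds.getNumBusyConnectionsDefaultUser()
                    + " idle=" + pds.getNumIdleConnectionsDefaultUser());
        }
    }
}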

Any help would be greatly appreciated!


Edit 1: For some additional context, this is not a new application. It has been up and running for years. What is new, however, is using Tomcat with the c3p0 datasource. This issue occurred after transitioning from WebLogic to Tomcat.


Edit 2: I failed to provide some potentially useful logging information when I originally created this post. The APPARENT DEADLOCK issue captured by log4j is shown below. I have been closely monitoring the connections since I noticed this issue, and the increase in open connections appears to coincide with the deadlock. If I had to guess, I would say a new connection pool is being created after the deadlock.


2014-09-11 11:36:45,320  WARN [Timer-0] ThreadPoolAsynchronousRunner:608 -  - com.mchange.v2.async.ThreadPoolAsynchronousRunner$DeadlockDetector@1c203f2b -- APPARENT DEADLOCK!!! Creating emergency threads for unassigned pending tasks!
2014-09-11 11:36:45,323  WARN [Timer-0] ThreadPoolAsynchronousRunner:624 -  - com.mchange.v2.async.ThreadPoolAsynchronousRunner$DeadlockDetector@1c203f2b -- APPARENT DEADLOCK!!! Complete Status:
        Managed Threads: 3
        Active Threads: 3
        Active Tasks:
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask@4a50503d (com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2)
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask@5ec3fae8 (com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#0)
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask@148dc5f3 (com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#1)
        Pending Tasks:
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask@67bedf0f
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask@242686ff
                com.mchange.v2.resourcepool.BasicResourcePool$AcquireTask@5d68e1e3
                com.mchange.v2.resourcepool.BasicResourcePool$AcquireTask@10c62dfe
                com.mchange.v2.resourcepool.BasicResourcePool$AcquireTask@42f5874c
                com.mchange.v2.resourcepool.BasicResourcePool$AcquireTask@585b0ec
                com.mchange.v2.resourcepool.BasicResourcePool$AcquireTask@c2258c9
Pool thread stack traces:
        Thread[com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2,5,main]
                java.net.SocketInputStream.socketRead0(Native Method)
                java.net.SocketInputStream.read(SocketInputStream.java:129)
                oracle.net.ns.Packet.receive(Unknown Source)
                oracle.net.ns.DataPacket.receive(Unknown Source)
                oracle.net.ns.NetInputStream.getNextPacket(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java:1104)
                oracle.jdbc.driver.T4CMAREngine.unmarshalSB1(T4CMAREngine.java:1075)
                oracle.jdbc.driver.T4C7Ocommoncall.receive(T4C7Ocommoncall.java:106)
                oracle.jdbc.driver.T4CConnection.logoff(T4CConnection.java:465)
                oracle.jdbc.driver.PhysicalConnection.close(PhysicalConnection.java:1203)
                com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:549)
                com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:234)
                com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.destroyResource(C3P0PooledConnectionPool.java:470)
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask.run(BasicResourcePool.java:964)
                com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:547)
        Thread[com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#0,5,main]
                java.net.SocketInputStream.socketRead0(Native Method)
                java.net.SocketInputStream.read(SocketInputStream.java:129)
                oracle.net.ns.Packet.receive(Unknown Source)
                oracle.net.ns.DataPacket.receive(Unknown Source)
                oracle.net.ns.NetInputStream.getNextPacket(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java:1104)
                oracle.jdbc.driver.T4CMAREngine.unmarshalSB1(T4CMAREngine.java:1075)
                oracle.jdbc.driver.T4C7Ocommoncall.receive(T4C7Ocommoncall.java:106)
                oracle.jdbc.driver.T4CConnection.logoff(T4CConnection.java:465)
                oracle.jdbc.driver.PhysicalConnection.close(PhysicalConnection.java:1203)
                com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:549)
                com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:234)
                com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.destroyResource(C3P0PooledConnectionPool.java:470)
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask.run(BasicResourcePool.java:964)
                com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:547)
        Thread[com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#1,5,main]
                java.net.SocketInputStream.socketRead0(Native Method)
                java.net.SocketInputStream.read(SocketInputStream.java:129)
                oracle.net.ns.Packet.receive(Unknown Source)
                oracle.net.ns.DataPacket.receive(Unknown Source)
                oracle.net.ns.NetInputStream.getNextPacket(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.net.ns.NetInputStream.read(Unknown Source)
                oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java:1104)
                oracle.jdbc.driver.T4CMAREngine.unmarshalSB1(T4CMAREngine.java:1075)
                oracle.jdbc.driver.T4C7Ocommoncall.receive(T4C7Ocommoncall.java:106)
                oracle.jdbc.driver.T4CConnection.logoff(T4CConnection.java:465)
                oracle.jdbc.driver.PhysicalConnection.close(PhysicalConnection.java:1203)
                com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:549)
                com.mchange.v2.c3p0.impl.NewPooledConnection.close(NewPooledConnection.java:234)
                com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.destroyResource(C3P0PooledConnectionPool.java:470)
                com.mchange.v2.resourcepool.BasicResourcePool$1DestroyResourceTask.run(BasicResourcePool.java:964)
                com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:547)

The c3p0 version being used is 0.9.1.2.

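All three of c3p0's helper threads in the trace above are stuck in Connection.close() on the Oracle socket, which is what triggers the APPARENT DEADLOCK warning. As an illustration only (these are standard c3p0 settings, but whether they would help here is an assumption, and they are not the change described in the solution below), the Resource could expose more helper threads and a cap on how long an administrative task may run:

<Resource
    name="jdbc/abcdatasource"
    auth="Container"
    type="com.mchange.v2.c3p0.ComboPooledDataSource"
    factory="org.apache.naming.factory.BeanFactory"
    maxPoolSize="20"
    minPoolSize="5"
    maxIdleTime="3000"
    acquireIncrement="1"
    numHelperThreads="6"
    maxAdministrativeTaskTime="60"
    user="abc"
    password="pw_here"
    driverClass="oracle.jdbc.driver.OracleDriver"
    jdbcUrl="jdbc:oracle:thin:@abc.def.ghi.net:1521:BLAH"
/>

numHelperThreads raises c3p0's default of three worker threads, and maxAdministrativeTaskTime (in seconds) lets c3p0 try to interrupt administrative tasks that appear hung, such as the close() calls above, instead of letting them occupy a helper thread indefinitely.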

1 Solution

#1



For what it's worth, I updated Tomcat's server.xml to set autoDeploy="false", and I have not had any issues since.


My c3p0 connection pool settings are in Tomcat's context.xml, and any modification to this file at runtime causes the application to be redeployed and an additional connection pool to be created. Setting autoDeploy to false prevents this Tomcat behavior, which had the unwanted side effect of spawning extra c3p0 connection pools.

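For anyone else hitting this, the change was to the Host element in conf/server.xml, roughly as below (name and appBase are the stock Tomcat defaults, not necessarily what your server.xml contains):

<!-- conf/server.xml (illustrative; only autoDeploy was changed from the default) -->
<Host name="localhost" appBase="webapps"
      unpackWARs="true" autoDeploy="false">
  <!-- existing Valve / Context entries stay as they are -->
</Host>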

