A script that was working fine on 100 records per batch choked when I fed it 1,000 records.
It gave:

mysql.connector.errors.OperationalError: 1040 (08004): Too many connections

So I went ahead and made sure to connect.close() everywhere that I opened a connection, and the script made it much further, but it eventually crashed with the same error.
Obviously, increasing the connection limit would only encourage sloppy coding, right? It would be better to develop a methodology that prevents this type of crash altogether. I admit that if this were PHP, it wouldn't be happening.

How can I prevent this type of error from happening again, short of switching to PHP?
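
For concreteness, a boiled-down version of the kind of loop that triggers it (not my actual code; names invented):

```python
import mysql.connector

# Boiled-down failure mode (not the real script; names invented): a fresh
# connection per record that never gets closed. Each one ties up a server
# thread, and MySQL's default max_connections is 151, so a 1,000-record
# batch blows past the limit and raises error 1040.
handles = []
for i in range(1000):
    conn = mysql.connector.connect(host="localhost", user="app",
                                   password="secret", database="warehouse")
    cur = conn.cursor()
    cur.execute("INSERT INTO scans (data) VALUES (%s)", ("row-%d" % i,))
    conn.commit()
    handles.append(conn)   # kept alive somewhere, never conn.close()
```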

Are you connecting to MySQL with persistent connections? What happens if you don't use persistent connections?

Also, I'm confused: why would it complain about too many connections simply because you changed how many records a single connection works with?

commented: responded below, thanks +2

So I went ahead and made sure to connect.close() everywhere that I opened a connection

How many times do you have a single script opening and closing a connection to MySQL? Is this a web script?

Are you connecting to MySQL with persistent connections?

Most coders close connections (remember, these are local connections) after each transaction; that was how I was taught. I'm very open to looking at other ways of doing it, but for what it's worth, one would be ill-advised to try to recycle connections in any program anywhere near a loop. I haven't a notion whether the Python MySQL connector API is thread-safe. A high rate of DB connection recycling is contraindicated in any case.
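
For completeness, the recycling approach I'm wary of would presumably go through the connector's built-in pooling module; a minimal sketch (connection parameters are placeholders):

```python
from mysql.connector import pooling

# Connection recycling via the connector's built-in pool; parameters are
# placeholders. pool_size caps how many real connections ever exist.
pool = pooling.MySQLConnectionPool(
    pool_name="scanner", pool_size=5,
    host="localhost", user="app", password="secret", database="warehouse")

conn = pool.get_connection()   # borrow a connection from the pool
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM scans")
    print(cur.fetchone())
finally:
    conn.close()               # returns the connection to the pool, not to MySQL
```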

Also, I'm confused: why would it complain about too many connections simply because you changed how many records a single connection works with?

Well, it was you coders who convinced me to try batched transactions rather than handling each and every component of each user request with its own transaction. In hindsight it was obvious, but this not only fixed the performance bottleneck it was designed to mend, it also let me implement an ad-hoc resource scheduling system in the software. Which was fortuitous, because I won't have to do this later and therefore shall probably never have to refactor the code for performance reasons.
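
Something like this, roughly (table and credentials changed):

```python
from contextlib import closing

import mysql.connector

# A minimal sketch of the batched-transaction idea, assuming
# mysql-connector-python; table and credentials are placeholders.
def write_batch(batch):
    with closing(mysql.connector.connect(host="localhost", user="app",
                                         password="secret",
                                         database="warehouse")) as conn:
        with closing(conn.cursor()) as cur:
            conn.start_transaction()
            cur.executemany("INSERT INTO scans (data) VALUES (%s)",
                            [(r,) for r in batch])
            conn.commit()   # one commit per batch instead of one per record
```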

How many times do you have a single script opening and closing a connection to MySQL?
Hmmm…

```
grep -r 'mysql.connect' ../daniwebisthebest | wc
     41     220    3928
```

Is this a web script?

Basically? No. It’s a data warehouse. It will have a web user interface, but I’m not anywhere close to putting that in. Specifically it was choking on a scanning algorithm.

Generally it is hard to offer SQL suggestions without seeing the code. Are you closing the connection manually, or are you using a context manager?
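
If it helps, a minimal sketch of the context-manager version, assuming mysql-connector-python (connection details and query are placeholders):

```python
from contextlib import closing

import mysql.connector

# closing() guarantees conn.close() runs even when a query raises, which a
# manual close at the end of the function does not. (Newer versions of
# mysql-connector-python also let you use the connection itself directly
# in a `with` statement.)
def fetch_batch(limit):
    with closing(mysql.connector.connect(host="localhost", user="app",
                                         password="secret",
                                         database="warehouse")) as conn:
        with closing(conn.cursor()) as cur:
            cur.execute("SELECT id, data FROM scans LIMIT %s", (limit,))
            return cur.fetchall()
```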

commented: Manually. Thanks, this solved it!! I will switch to a context manager. +2

I can only speak for the limited experience I have in coding DaniWeb. What we do differs mainly based on whether the HTTP request for the PHP page uses GET or POST. In the case of a POST request, we're most likely signing up, creating a post, editing a user profile, etc. In the case of a GET request, we're most likely just reading content and not modifying the database.

GET Requests:

  • Open a persistent MySQL connection at the beginning of the PHP script
  • Execute n number of MySQL queries (typically 0-3) throughout the PHP script

POST Requests (see the Python sketch after this list):

  • Open a non-persistent MySQL connection at the beginning of the PHP script
  • Start a transaction
  • Execute n number of MySQL queries (typically 3-5) throughout the PHP script
  • End the transaction
  • Close the MySQL connection
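
Not PHP, but mapped onto the Python connector the OP is using, that POST lifecycle looks roughly like this (queries and names invented for illustration):

```python
import mysql.connector

# Rough Python analogue of the POST flow above; queries and credentials
# are invented for illustration.
def handle_post(body, user_id):
    # Non-persistent connection: opened for this request only.
    conn = mysql.connector.connect(host="localhost", user="app",
                                   password="secret", database="site")
    try:
        conn.start_transaction()
        cur = conn.cursor()
        cur.execute("INSERT INTO posts (body) VALUES (%s)", (body,))
        cur.execute("UPDATE users SET post_count = post_count + 1 "
                    "WHERE id = %s", (user_id,))
        conn.commit()      # end the transaction
    except Exception:
        conn.rollback()    # undo everything on failure
        raise
    finally:
        conn.close()       # always release the connection
```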

On average, a rough guess is that each script takes about 120 ms to execute for an end-user. HTTP latency then brings that number closer to 200 ms (of course, depending on where in the world you are, this is often much higher) from HTTP request to response.

We also run PHP scripts in cron jobs to do things like send emails and perform some heavy calculations. We do this much the same way as our GET requests, in that we open a persistent MySQL connection. The cron job often executes a couple of really complex MySQL queries that might take a second or more each to execute. There's also a lot of PHP logic in these cron jobs, and sometimes they run for 30 minutes or more. So far, this has never been an issue.

It's hard for a plain Python script to create too many connections to a DB: you don't have many concurrent clients, and you don't have web requests. Is this a server of some kind? From your line "Specifically it was choking on a scanning algorithm," I suspect it creates a new connection for each query (or for each group of queries). Have you abstracted the DB connection and its functionality, i.e., do you select through a connection object? If so, it is easy to log to a file what is going on; if not, I could provide you with my Python code for this (although I am not an expert in Python... but A.I. made me dig deeper).
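
Something like this hypothetical wrapper (names invented) is what I mean: every open and close goes through one place, so the log shows exactly when the count climbs:

```python
import logging

import mysql.connector

logging.basicConfig(filename="db_connections.log", level=logging.INFO)

# Hypothetical abstraction layer (names invented): every connection open
# and close is routed through one object and logged with a running count.
class TrackedDB:
    def __init__(self, **config):
        self._config = config
        self._open = 0

    def connect(self):
        conn = mysql.connector.connect(**self._config)
        self._open += 1
        logging.info("opened connection, %d now open", self._open)
        return conn

    def close(self, conn):
        conn.close()
        self._open -= 1
        logging.info("closed connection, %d now open", self._open)
```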
