
I'm playing with big data in Python and MySQL.

I've got a huge table, and I need to insert new rows while I'm fetching the query's results.

I've got this error:

Traceback (most recent call last):
  File "recsys.py", line 53, in <module>
    write_cursor.executemany(add_sim, data_sims)
  File "/Library/Python/2.7/site-packages/mysql/connector/cursor.py", line 603, in executemany
    self._connection.handle_unread_result()
  File "/Library/Python/2.7/site-packages/mysql/connector/connection.py", line 1057, in handle_unread_result
    raise errors.InternalError("Unread result found")
mysql.connector.errors.InternalError: Unread result found

The code is the following:

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import logging
import mysql.connector

import numpy
import scipy
from scipy.spatial import distance

logging.basicConfig(filename='recsys.log', level=logging.INFO)

cnx = mysql.connector.connect(user='...', password='...', database='...')

cursor = cnx.cursor(dictionary=True)
write_cursor = cnx.cursor()


query = ("...")

cursor.execute(query)

while True:
    rows = cursor.fetchmany(100)
    if not rows:
        break

    add_sim = ("...")
    data_sims = []

    for row in rows:
        f1 = row['f1']
        f2 = row['f2']
        v1 = [[row['a1'], row['b1'], row['c1']]]
        v2 = [[row['a2'], row['b2'], row['c2']]]

        # cosine similarity; cdist returns a 1x1 array
        c = 1 - scipy.spatial.distance.cdist(v1, v2, 'cosine')

        if c > 0.8:
            data_sim = (f1, f2, c)
            data_sims.append(data_sim)

    # executemany() raises "Unread result found" here: the read cursor on
    # this same connection still has unfetched rows pending.
    write_cursor.executemany(add_sim, data_sims)
    cnx.commit()

cursor.close()

cnx.close()

I know I could use a buffered connection to MySQL, but it's not a good choice in my case because of how big the table really is!
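(For reference, the buffered variant is just a flag on the cursor, but it pulls the entire result set into client memory up front, which is exactly what I can't afford:)

cursor = cnx.cursor(dictionary=True, buffered=True)  # buffers ALL rows client-side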

4 Answers


This is a documented behaviour of cursor.fetchmany():

You must fetch all rows for the current query before executing new statements using the same connection.

To overcome this issue, you can establish a new connection to be used by write_cursor:

cnx_write = mysql.connector.connect(...)
write_cursor = cnx_write.cursor()
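
A minimal sketch of the whole loop with a dedicated write connection (credentials and queries are elided as in the question; cnx_write is the only new name):

cnx = mysql.connector.connect(user='...', password='...', database='...')
cnx_write = mysql.connector.connect(user='...', password='...', database='...')

cursor = cnx.cursor(dictionary=True)
write_cursor = cnx_write.cursor()

cursor.execute(query)

while True:
    rows = cursor.fetchmany(100)
    if not rows:
        break
    # ... build data_sims from rows as in the question ...
    # The unread rows are pending on cnx, so writing via cnx_write is allowed:
    write_cursor.executemany(add_sim, data_sims)
    cnx_write.commit()  # note: commit on the write connection

cursor.close()
write_cursor.close()
cnx.close()
cnx_write.close()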



I had a similar problem when I was threading, and just had to open (and close) a new connection in each thread separately, instead of leaving one shared connection open.
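
A rough sketch of that pattern (the names worker and batches are illustrative, not from the original code):

import threading
import mysql.connector

def worker(batch):
    # Each thread opens and closes its own short-lived connection
    # instead of sharing one connection object across threads.
    cnx = mysql.connector.connect(user='...', password='...', database='...')
    try:
        cur = cnx.cursor()
        cur.executemany("...", batch)  # the INSERT statement is elided
        cnx.commit()
        cur.close()
    finally:
        cnx.close()

threads = [threading.Thread(target=worker, args=(b,)) for b in batches]
for t in threads:
    t.start()
for t in threads:
    t.join()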



Put LIMIT 100 in your query:

query = ("... LIMIT 100")  # limit 100 here

Then use fetchall() instead of fetchmany(100), so the whole result set is consumed before the next statement.
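
As written this processes only the first 100 rows; to cover the whole table you would presumably page through it, for example with OFFSET (a sketch reusing the question's placeholders):

offset = 0
while True:
    # fetchall() drains the result set completely, so no unread result
    # is left on the connection before the next statement runs.
    cursor.execute("... LIMIT 100 OFFSET %s", (offset,))
    rows = cursor.fetchall()
    if not rows:
        break
    # ... build data_sims from rows as in the question ...
    write_cursor.executemany(add_sim, data_sims)
    cnx.commit()
    offset += 100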



In my case, I had accidentally closed the MySQL connection before executing the query.

