Slow MySQL queries in Python but fast elsewhere


I'm having a heck of a time dealing with slow MySQL queries in Python. In one area of the application, "LOAD DATA INFILE" goes quickly. In another area, the SELECT queries are slow.

Executing the same query in phpMyAdmin and Navicat (as a second test) yields a response ~5x faster than in Python.

A few notes...

  • I switched the MySQLdb connector to using SSCursor. No performance increase.
  • The database is optimized, indexed, etc. I'm porting this application to Python from PHP/CodeIgniter, where it ran fine (I foolishly thought that getting out of PHP would speed things up).
  • PHP/CodeIgniter executes the SELECT queries swiftly. For example, one key aspect of the application takes ~2 seconds in PHP/CodeIgniter, but is taking 10 seconds in Python, before any of the analysis of the data is done.

My connection to the database is standard...

import MySQLdb.cursors

dbconn = MySQLdb.connect(host="127.0.0.1", user="*", passwd="*", db="*",
                         cursorclass=MySQLdb.cursors.SSCursor)

Any insights/help/advice would be appreciated!

Update

In terms of fetching/handling the results, I've tried a few ways. The initial query is standard...

# run the query
cursor.execute(query)

I removed all of the code within the loop to make sure it wasn't the bottleneck, and it's not. I put dummy code in its place. The entire process did not speed up at all.

db_results = "test"

# loop through the results
for row in cursor:
    a = 0  # this is the dummy code I put in to test

return db_results

The query returns 501 rows (with a large number of columns). It took 0.029 seconds outside of Python, but is taking much longer within Python.
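One way to see where the time goes: because SSCursor streams rows, execute() returns before the rows are actually transferred, so timing the execute and the fetch separately shows which side is slow. A rough sketch, reusing the cursor and query objects from the snippets here:

import time

t0 = time.time()
cursor.execute(query)   # with SSCursor this returns quickly;
t1 = time.time()        # the rows are transferred during the fetch
rows = cursor.fetchall()
t2 = time.time()

print("execute: %.3fs  fetch: %.3fs  rows: %d" % (t1 - t0, t2 - t1, len(rows)))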

The project is related to horse racing, and the query is done within this function. The query is long, but it runs fine outside of Python. I commented out the code within the loop on purpose for testing, and the print(query) is there in hopes of figuring this out.

# get pps
def get_pps(race_ids):

    # comma-separated race list
    race_list = ','.join(map(str, race_ids))

    # pps query
    query = ("select raceindex.race_id, entries.entry_id, entries.prognum, runlines.line_id, runlines.track_code, runlines.race_date, runlines.race_number, runlines.horse_name, runlines.line_date, runlines.line_track, runlines.line_race, runlines.surface, runlines.distance, runlines.starters, runlines.race_grade, runlines.post_position, runlines.c1pos, runlines.c1posn, runlines.c1len, runlines.c2pos, runlines.c2posn, runlines.c2len, runlines.c3pos, runlines.c3posn, runlines.c3len, runlines.c4pos, runlines.c4posn, runlines.c4len, runlines.c5pos, runlines.c5posn, runlines.c5len, runlines.finpos, runlines.finposn, runlines.finlen, runlines.dq, runlines.dh, runlines.dqplace, runlines.beyer, runlines.weight, runlines.comment, runlines.long_comment, runlines.odds, runlines.odds_position, runlines.entries, runlines.track_variant, runlines.speed_rating, runlines.sealed_track, runlines.frac1, runlines.frac2, runlines.frac3, runlines.frac4, runlines.frac5, runlines.frac6, runlines.final_time, charts.raceshape "
             "from hrdb_raceindex raceindex "
             "inner join hrdb_runlines runlines on runlines.race_date = raceindex.race_date and runlines.track_code = raceindex.track_code and runlines.race_number = raceindex.race_number "
             "inner join hrdb_entries entries on entries.race_date = runlines.race_date and entries.track_code = runlines.track_code and entries.race_number = runlines.race_number and entries.horse_name = runlines.horse_name "
             "left join hrdb_charts charts on runlines.line_date = charts.race_date and runlines.line_track = charts.track_code and runlines.line_race = charts.race_number "
             "where raceindex.race_id in (" + race_list + ") "
             "order by runlines.line_date desc;")

    print(query)

    # run the query
    cursor.execute(query)

    # get the query fields
    fields = [i[0] for i in cursor.description]

    # pps list
    pps = []

    # loop through the results
    for row in cursor:
        a = 0
        #this_pp = {}
        #for i, value in enumerate(row):
        #    this_pp[fields[i]] = value
        #pps.append(this_pp)

    return pps
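As an aside, the IN list can also be passed as query parameters instead of being concatenated into the SQL, which lets the driver handle escaping. A minimal sketch of that pattern, with the column list trimmed for brevity:

def get_race_ids_param(race_ids):
    # one %s placeholder per id; MySQLdb escapes the values itself
    placeholders = ','.join(['%s'] * len(race_ids))
    query = ("select raceindex.race_id "
             "from hrdb_raceindex raceindex "
             "where raceindex.race_id in (" + placeholders + ")")
    cursor.execute(query, race_ids)
    return cursor.fetchall()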

One final note... I haven't considered the ideal way to handle the results. I believe one cursor class allows the results to come back as a set of dictionaries. I haven't made it to that point yet, as the query and return themselves are this slow.
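(For reference, that cursor class is MySQLdb.cursors.DictCursor, or SSDictCursor for the streaming variant. A minimal sketch, assuming the same connection settings as above:)

import MySQLdb
import MySQLdb.cursors

dbconn = MySQLdb.connect(host="127.0.0.1", user="*", passwd="*", db="*",
                         cursorclass=MySQLdb.cursors.SSDictCursor)
cursor = dbconn.cursor()
cursor.execute("select 1 as one")
for row in cursor:
    print(row["one"])  # each row arrives as a dict keyed by column name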

Though you have only 501 rows, it looks like you have over 50 columns. How much total data is being passed from MySQL to Python?

501 rows x 55 columns = 27,555 cells returned.

If each cell averaged "only" 1k, that would be close to 27MB of data returned.

To get a sense of how much data MySQL is pushing, you can add this to your query:

show session status like "Bytes_sent";
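Bytes_sent is a cumulative per-session counter, so sampling it before and after the query and taking the difference gives a rough measure. A sketch, reusing the cursor and query from the question:

def bytes_sent(cursor):
    cursor.execute('show session status like "Bytes_sent"')
    return int(cursor.fetchone()[1])

before = bytes_sent(cursor)
cursor.execute(query)
rows = cursor.fetchall()  # SSCursor requires draining the result first
print("approx. bytes for this query: %d" % (bytes_sent(cursor) - before))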

Is your server well-resourced? Is its memory allocation well configured?

My guess is that when you are using phpMyAdmin, you are getting paginated results. That masks the issue of MySQL returning more data than your server can handle (I don't use Navicat, so I'm not sure how it returns results).

Perhaps the Python process is memory-constrained and, when faced with this large result set, it has to page out to disk to handle the result set.

If you reduce the number of columns called and/or constrain the query to, say, LIMIT 10, do you see improved speed?

Can you see if the server running Python is paging to disk when this query is called? Can you see what memory is allocated to Python, how much of it is used during the process, and how that allocation and usage compare to the same values in the PHP version?
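On the Python side, peak memory use can be checked with the standard-library resource module (Unix only; ru_maxrss is reported in kilobytes on Linux and bytes on macOS):

import resource

# peak resident set size of this Python process so far
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak RSS: %s" % peak)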

Can you allocate more memory to the constrained resource?

Can you reduce the number of columns or rows called, through pagination or asynchronous loading?
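If pagination helps, it can also be done on the client side without changing the SQL. A sketch using fetchmany to process the result in fixed-size chunks, reusing the cursor and query from the question:

cursor.execute(query)
while True:
    chunk = cursor.fetchmany(100)  # pull 100 rows at a time
    if not chunk:
        break
    for row in chunk:
        pass  # handle one row here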

