Message Boards » Tech Talk » PHP/MySQL script just stops (times out?)
bous
All American
11215 Posts

If i break the script up into sections it all runs fine (the code is not the problem as i've tested, unless memory is an issue or something), but when i run it all at once, it just stops after about 20 seconds.

it stops at different places in the code, but after the same amount of time each run.

Where would I begin to change a setting to allow a script to not time out so fast?

running this particular script on a godaddy linux hosted server, not dedicated.

5/9/2007 4:06:33 PM

qntmfred
retired
40559 Posts

look up set_time_limit()


[Edited on May 9, 2007 at 4:27 PM. Reason : and find out which part is taking so long and make sure it's optimized]
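For reference, a minimal sketch of what's being suggested here (the 300-second value is just an example, and many shared hosts cap or ignore both settings):

set_time_limit(300);                 // allow up to 5 minutes; set_time_limit(0) removes the limit entirely
ini_set('max_execution_time', 300);  // same limit via an ini override, where the host allows it

// ... long-running import loop goes here ...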

5/9/2007 4:21:05 PM

bous
All American
11215 Posts

can't really optimize it because it's reading large files from the disk. it's pretty optimized as is.

i'm reading all data into a database for faster access once this script is run.



max_execution_time = 60 in php.ini didn't work
set_time_limit(60) didn't work (still quit in 25 seconds)

phpinfo() shows all timeouts at 60s, but it's quitting in 25

5/9/2007 4:41:59 PM

qntmfred
retired
40559 Posts

try using 0 as the parameter

[Edited on May 9, 2007 at 4:45 PM. Reason : and post the code if you can]

5/9/2007 4:45:23 PM

bous
All American
11215 Posts

can't post the code.

the whole thing is basically a loop with 3 tiers of inner loops... it's not getting hung up anywhere and i'm constantly outputting data when i enable debugging.

i'm going to try and make a simple script to loop and sleep(1) and see how long that lasts.



hmmm i made a simple script that lasted 100 seconds... maybe a memory issue then?
i did a script using a database and not using one... lasted 100 seconds+.




how can i have php spit out ALL errors possible that could say why it suddenly quits?

[Edited on May 9, 2007 at 4:55 PM. Reason : ]

5/9/2007 4:48:04 PM

qntmfred
retired
40559 Posts

when i get timing issues, i put in debug statements like

print date("h:i:s") . " - which part of the loop i'm in<BR>";


then you can quickly see which step is taking so long

[Edited on May 9, 2007 at 4:54 PM. Reason : code]

5/9/2007 4:54:14 PM

bous
All American
11215 Posts

every single loop has tons of debugging outputs depending on which debug level i choose.

there is never a delay in output.

NOTE: THE BROWSER SAYS DONE. IT IS NOT A HUNG CONNECTION (I.E. TRAPPED IN A LOOP)


however, it stops at AROUND the same place each time, but not the exact place.

the place it stops has been accessed many times before it stops, so it's not a bug in the code from what i can see right now.



memory? remember, i can split the code to run in segments (i.e. 10 loops at a time and run it 4 separate times to get the 40 results needed). right now i'm trying to get all 40 results to save time.

after each loop everything is reset to NULL;

[Edited on May 9, 2007 at 4:58 PM. Reason : ]

5/9/2007 4:57:36 PM

BigMan157
no u
103352 Posts

does putting "or die(mysql_error());" on a fetch object statement cause anything to show up?

[Edited on May 9, 2007 at 5:22 PM. Reason : also i think it'll tell you your memory limit in phpinfo()]

[Edited on May 9, 2007 at 5:23 PM. Reason : also turn on error reporting - http://us2.php.net/error_reporting]
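A rough sketch combining both suggestions (the connection details, database name, and table name are made up; the mysql_* functions fit the PHP 4/5 era of this thread):

// surface every PHP warning/notice as it happens
error_reporting(E_ALL);
ini_set('display_errors', 1);

// hypothetical connection details -- substitute your own
$link = mysql_connect('localhost', 'db_user', 'db_pass') or die(mysql_error());
mysql_select_db('my_db', $link) or die(mysql_error());

$result = mysql_query('SELECT id FROM items') or die(mysql_error());
while ($row = mysql_fetch_object($result)) {
    // ... process $row->id ...
}
mysql_free_result($result);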

5/9/2007 5:19:21 PM

bous
All American
11215 Posts

there aren't any mysql errors.

i actually connect and get id's, then disconnect and free the result, so i have ruled out a mysql issue and am back to a memory thing i think.

5/9/2007 5:24:16 PM

bous
All American
11215 Posts

i turned on maximum errors and don't get a single one throughout the program.

would it warn me about memory?




ALSO, when i switch up the loops it still stops after the same point in time/memory which leads me to think it's memory or something again.




report_memleaks is On (both local and master value) in phpinfo() ... only memory-related setting in there

[Edited on May 9, 2007 at 5:36 PM. Reason : ]
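One way to test the memory theory is to watch usage climb as the loop runs; a sketch (the $files loop is a stand-in for the real outer loop, and memory_get_usage() is available on most PHP 5 builds):

echo 'memory_limit = ' . ini_get('memory_limit') . '<BR>';

foreach ($files as $file) {   // stand-in for the real outer loop
    // ... existing processing ...
    echo date('h:i:s') . ' - mem: ' . memory_get_usage() . ' bytes<BR>';
}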

5/9/2007 5:35:08 PM

Noen
All American
31346 Posts

Well looks like you ran into the same problem I did a year or so ago.

Max execution time. For most servers and browsers, your shit is going to time out after 60 seconds no matter what you do.

How do you get around it for processing big shit? (I was scraping a website and converting the data from HTML 4.0 to XHTML 1.1.)

You're gonna have to split the processing into two parts.

Part 1: Your frontend script launches a backend PHP script. Look up the exec commands for launching a command-line PHP app. Make sure the backend script creates a file, let's call it "working" for now. Into that file it should write some indication of its progress through the task, say "Working on file 2912 of 300020" for this case. The exec call starts the backend running as a separate process in the background.

Part 2: Your frontend script is simple: on first load it checks for the working file, doesn't see it, and loads the form to run it (as simple as a "run me" submit button, but you can add whatever parameter data you need). Submit the page; on the next load it sees the working file and does two things:

Part 2a: It displays the current progress.
Part 2b: It writes a Meta refresh tag to auto-refresh the page after X seconds. Throw in a little animated GIF processing/loading icon to keep yourself happy, and you're done.

Part 3 (optional): If you want, you can add a cancel/stop feature. I personally avoid killing the process directly, because that's treading in dangerous waters if someone exploits it. I just have the frontend create a file, let's call it "stop", and on each iteration the backend checks for that stop file.

When the backend finishes, it deletes the working file; your frontend doesn't find it and knows the processing is done. You can add a "finished" log to display whatever result you might want.
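A rough sketch of that working-file idea (the filenames, the exec invocation, and the 40-item loop are all illustrative, not anything GoDaddy-specific):

// backend.php -- launched in the background, does the real work
$total = 40;                                   // stand-in for the real job size
for ($i = 1; $i <= $total; $i++) {
    file_put_contents('working', "Working on item $i of $total");
    if (file_exists('stop')) break;            // optional cancel feature
    // ... one slice of the heavy processing ...
}
unlink('working');                             // done: the frontend will see it's gone

// frontend.php -- hit by the browser, refreshes itself while the work runs
if (file_exists('working')) {
    echo '<meta http-equiv="refresh" content="5">';
    echo 'Progress: ' . file_get_contents('working');
} elseif (isset($_POST['run'])) {
    // launch backend.php as a separate process so this request can return immediately
    exec('php backend.php > /dev/null 2>&1 &');
    echo '<meta http-equiv="refresh" content="2">';
    echo 'Started...';
} else {
    echo '<form method="post"><input type="submit" name="run" value="run me"></form>';
}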

5/9/2007 5:46:02 PM

qntmfred
retired
40559 Posts

^ ewww. i understand you gotta do what you gotta do, but that's nasty

5/9/2007 7:12:07 PM

bous
All American
11215 Posts

the thing that bothers me is:

i can run a normal script with just 1 for loop (using mysql or not) and loop it for 120 seconds and it works.

once i start having a lot of code and a lot of memory usage in those loops, that's when it ends abruptly.

any ideas on that?

5/9/2007 8:20:16 PM

robster
All American
3545 Posts

is this a shared hosting package that you are using??

If so, then the amount of memory and CPU they give you is very limited.

5/9/2007 10:37:02 PM

bous
All American
11215 Posts

shared hosting

i'm setting up an apache/php/mysql server on my own system to see if it's them or me



maybe it's time to go dedicated

5/9/2007 10:46:29 PM

Noen
All American
31346 Posts

^^ahhh yes, the CPU time limits. Good call robster

5/10/2007 11:43:20 AM

bous
All American
11215 Posts

i setup a local apache/mysql/php environment w/ all the latest versions.

for some reason the output doesn't come as it's printed through php... it waits until the script is done then displays output...

any ideas on that? (for all php scripts).



also, the script that doesn't work on godaddy (quits) doesn't display anything in the local environment either, since output is held back until the script is done and finished okay.

[Edited on May 10, 2007 at 2:45 PM. Reason : ]

5/10/2007 2:45:00 PM

Stein
All American
19842 Posts

Do you have mod_gzip enabled on the server? That'd do it.

5/10/2007 2:59:29 PM

bous
All American
11215 Posts

only thing i added besides default install is the mysql extension...




so i tested it locally and my script runs just fine after i got around how php5 handles globals differently than php4...


so i guess it's a shared hosting limitation or something. i'll either have to execute the script on the backend somehow or just upgrade hosting packages.
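(The thread doesn't say which globals change caused the trouble; one common php4-to-php5 difference is register_globals defaulting to off, so purely as an assumption, the usual fix looks like this, with $username as a made-up variable:)

// old code that relied on register_globals creating $username from the query string:
// echo $username;

// php5-safe version: read it from the superglobal explicitly
$username = isset($_GET['username']) ? $_GET['username'] : '';
echo htmlspecialchars($username);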

5/10/2007 3:26:52 PM

qntmfred
retired
40559 Posts

output buffering

http://us.php.net/manual/en/ref.outcontrol.php

5/10/2007 4:09:13 PM

bous
All American
11215 Posts

changed php.ini to output_buffering = 0 (instead of 4096) and that didn't work... i'll try something else

[Edited on May 10, 2007 at 4:21 PM. Reason : ]

5/10/2007 4:18:25 PM

Noen
All American
31346 Posts

you need to explicitly disable output buffering
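A sketch of doing that explicitly at the top of the script (note that server-level compression like mod_gzip or zlib.output_compression can still hold output back; the 40-step loop is just a stand-in):

while (ob_get_level() > 0) {
    ob_end_flush();            // close any buffers opened by output_buffering or ob_start()
}
ob_implicit_flush(true);       // flush automatically after every echo/print

for ($i = 0; $i < 40; $i++) {  // stand-in loop
    echo date('h:i:s') . " - step $i<BR>\n";
    flush();                   // push the output through the web server to the browser
}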

5/10/2007 6:06:12 PM
