GoDaddy: undocumented 20-second CPU time maximal execution limit? (python, ELF, etc)
16th October 2007
Today, setting up a relatively serious (in terms of CPU resources needed) web system, I ran into a weird problem of Python scripts ending prematurely. After some investigation, it looked like any process that uses more than 20 seconds of CPU time is automatically killed. To verify this, I wrote an infinite loop in C:

int main() {
    unsigned int i;
    for (i = 0; i < 2; i++) { i = 0; }  /* never terminates: i is reset on every iteration */
    return 0;
}

I compiled it and executed it several times on the GoDaddy shared hosting server. I observed the program running for a maximum of 20 seconds of CPU time, not a second more.

Please note that 20 seconds of CPU time can be much more "real" time if the script isn't using 100% of the CPU, which is often the case on shared hosting. Thus if max_execution_time in your php.ini is set to, say, 60 seconds, your PHP script may indeed run for as long as one minute; but I'm pretty sure that if your script has lots of CPU-intensive procedures, it will be terminated as soon as it uses 20 seconds of CPU time (however, this statement still needs checking – anyone?). To verify further, I also created a cron job with the same file. It ran for 30 seconds of CPU time.

Strangely, this behaviour is not documented anywhere. This limit may also explain a number of other problems with heavy web applications: they might simply be killed before they finish, causing errors. I do understand the reason for this limitation, and am sure similar limitations exist in other shared hosting environments. The only important thing here is that this limit should have been documented, and even put up front somewhere in the hosting plan descriptions. I also wonder whether the limit is the same for all GoDaddy shared hosting plans, or whether it differs. The 20 seconds when executed from PHP, and the 30 seconds when executed as a cron job, were observed on the Deluxe Linux Hosting plan. Extensions, additions and comments are welcome.
October 19th, 2007 at 16:18
[...] GoDaddy: undocumented 20-second CPU time maximal execution limit? (python, ELF, etc) [...]
June 4th, 2008 at 18:51
[...] it’s better to have a 125MHz-clamp on CPU, than have a 20-seconds maximal CPU time limit [...]
October 19th, 2008 at 7:01
I guess you could probably try this out with a PHP script which has code like below:
sleep(10);
echo 'The script has been running for 10 seconds';
sleep(10);
echo 'The script has been running for 20 seconds';
sleep(10);
echo 'The script has been running for 30 seconds';
Etcetera? Not sure, just an idea.
Cheers,
Sid
October 19th, 2008 at 10:46
Sid,
that won't show anything: "sleeping" doesn't consume CPU time the way an infinite loop does, so that PHP script would not be measuring the CPU time in question at all.
Moreover, since my little experiment with an infinite loop did work, there is no need for any additional testing.
October 19th, 2008 at 15:55
Ah, I did not realise the difference.
I'm on GoDaddy right now, and my scripts seem to be running fairly well; WordPress 2.6 runs with decent speed too. I did notice that the Deluxe plan had better performance than my Economy hosting plan, although that might have been a placebo effect.
Anyways,
Cheers,
Sid
February 13th, 2009 at 18:30
Thanks guys for this info,
I have been trying to find the bug for a while now. I had a CPU intensive cron job that never finished, now I know why.
If you have SSH access, you can sometimes see “terminated” being printed after 20 seconds or so when you run the script.
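That "Terminated" message is consistent with a per-process CPU-time limit (RLIMIT_CPU): when a process exceeds it, the kernel kills it with SIGXCPU and the shell reports the death. Assuming an ordinary Linux shell, the effect can be reproduced locally with a much smaller limit:

```shell
# In a subshell, cap CPU time at 1 second, then busy-loop.
# The kernel kills the loop with SIGXCPU once the second is used up.
bash -c 'ulimit -t 1; while :; do :; done'
echo "exit status: $?"   # 128 + signal number marks death by signal
```

Whether GoDaddy enforces its limit via ulimit or via a watchdog process is an open question; the observable symptom is the same.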
February 13th, 2009 at 21:45
Stephan, you are welcome.
I haven't yet tried the new SSH access GoDaddy has on offer – to enable it, I have to accept a phone call from the US.
March 8th, 2009 at 0:02
FYI: BE FOREWARNED!!! If you enable GoDaddy SSH, you had better back up your databases first. They move you to a different hosting server behind a different firewall with a different DB host, so you have to restore your DB to that new DB host and update your PHP scripts/config files to use the new DB server. You can probably expect a minimum of several hours of downtime. The reason they call you first is to verify that you actually want to do this and are prepared for the changeover.
March 8th, 2009 at 15:27
Randy,
thanks for sharing.
I guess DB restoration is very easy with their “backup/restore” functions (well, it should be).
May 12th, 2009 at 11:47
Thanks for this. I've worked it into my PHP download wrapper. A must-have for resumable streams and downloads when you don't want errors in the download data.
eg.
define('START_TIME', time());
....
....
$handle = @fopen($filename, "r");
if ($handle)
{
    dlog("fseek: {$this->getStart()}");
    fseek($handle, $this->getStart());
    $count = 0;
    while (!feof($handle))
    {
        $buffer = fgets($handle, 8192);
        echo $buffer;
        ob_flush();
        // stop well before the host's limit: check every 20th chunk,
        // bail out once 540 seconds of wall-clock time have passed
        if ((!(++$count % 20)) && ((time() - START_TIME) > 540))
        {
            exit;
        }
    }
    fclose($handle);  // only close a handle that was actually opened
}
...
June 14th, 2009 at 10:54
It seriously would not surprise me at all. I'm actually quite annoyed that GoDaddy crams close to 4000 virtual hosts onto one IP. I say an "IP" because they may have one IP distributed across several physical servers; I really hope it's not just one physical machine, no matter how hybrid. What happens if the IP of the server your site is hosted on gets DDoSed? Probability-wise (a one in 3500–4000 chance) it will be directed at someone else's site running behind the same IP as you. To get back to the topic at hand, GoDaddy HAS to think about CPU delegation, or else their processors just won't be capable of handling the sheer number of diverse sites they run. GoDaddy does have some of its server variables set to modest limits; I have, for instance, checked certain server variables in order to increase them for my particular needs. GoDaddy, if you read this: I know you're making goooood money now. Why don't you be more reasonable with your schemes? Scale back on the marketing maybe, and put the money into quality of infrastructure. I know, I know... the colos are more expensive to run if you have more machines, ya ya ya. Somebody's got to do it!
July 6th, 2009 at 15:49
Scale back on growing your business maybe? So that I can get something for nothing. I'm so annoyed at you for offering a service that makes my life more convenient and then not meeting my every little need. You're making all this money that I should have, and then you have the audacity to cram me onto an IP with 4000 other accounts. How dare you. After all, I'm paving your great halls with my $5.00 a month... You owe me an explanation!
You should do what I say because you’ve got more money than me and mommy says that’s not fair.
So quit trying to grow your business and gimme gimme gimme.
Instead of complaining, maybe choose a different hosting plan?
Hosting is a commodity and is subject to market prices. If you want more CPU time, pay for a dedicated server and stop thinking that other people are obligated to serve you with no or minimal profit. They are offering you more options and hence abundance in your life. Be grateful instead of complaining that they missed one of your little needs.
I’ve got nothing against you personally but please do think from their perspective.
Note: comment edited by blog owner to remove/soften foul language.
July 6th, 2009 at 17:58
Michael, your point is evident. Quoting the post text: "I do understand the reason for this limitation, and am sure similar limitations exist in other shared hosting environments."
Also, please don't use foul or offensive language – you know, children nowadays get access to the Internet much earlier in their lives than we did. I've slightly edited your comment to make it PG- or even G-compatible.
August 29th, 2010 at 5:10
I struggled with this "Terminated" problem for quite a while, but I finally found that if I run potentially long-running commands using "nice", the GoDaddy server will let me get away with it. This worked for large database imports and un-tarring large archive files, both of which failed without "nice". For instance:
nice tar -xzvf myHugeFile.tar.gz &
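For anyone adapting the trick, the pattern is: lower the job's scheduler priority, push it into the background, and capture its output. The file and log names below are placeholders, and whether GoDaddy's limiter really spares nice'd jobs rests only on the observation above:

```shell
# Run the extraction at the lowest priority, in the background, with a log.
nice -n 19 tar -xzvf myHugeFile.tar.gz > untar.log 2>&1 &
JOB=$!                     # PID of the background job
# renice -n 19 -p "$JOB"   # alternative: lower priority of an already-running job
wait "$JOB"                # block until the job finishes; propagates its status
```

A plausible explanation is that the watchdog targets processes hogging the CPU at normal priority, so a heavily nice'd process stays under its radar.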
August 29th, 2010 at 17:45
In the end they suspended my account for using too much processing power. That was 6 months ago or more, and they still bill me. GoDaddy Go! No, really, go.
January 19th, 2011 at 4:01
When I first joined Godaddy, about 4 years ago, a PHP process was allowed to run for 25 minutes. This allowed for quite a lot of non-standard development and interesting cron jobs.
About 2 years ago, that got shortened to 10 minutes. This was OK, and still allowed for reasonable development.
In the last few days, it appears to have been shortened to 40 seconds. (At least on the server I'm on.)
This is such a drastic clampdown, that it’s barely enough to run even dumbo-standard PHP scripts, let alone some more adventurous jobs.
I hope there is a sizable backlash to this one.
January 19th, 2011 at 7:52
Oops, it’s back to 600 seconds again. It was just some server overload problem.
May 1st, 2013 at 18:00
[...] I saw the claim that GoDaddy has a CPU quota in this blog post (the blogger himself had a process killed for exceeding the quota); it also links an analysis in another post pointing out that the quota is 20 seconds of CPU time. Assuming the process gets about 50 ms of CPU per scheduling round, that works out to roughly 400 seconds of real time. But that analysis was written six years ago; going by Moore's law, GoDaddy should by now be able to relax the quota by a factor of 16. [...]
November 6th, 2014 at 5:00
Damn, I came across this just now. I've been trying to run a process that lasts 6–7 hours. Running it on my MacBook would exhaust the machine, so I wanted to run it on GoDaddy. I SSHed in and tested this, and it is sadly true. Does anybody know a workaround?
November 10th, 2014 at 22:49
I guess AWS/EC2 could be a good “workaround” for your case, especially if you already know RAM requirements.
November 12th, 2014 at 12:34
Thanks for the response. I’ve got it up and running on Heroku. I did consider Amazon’s EC2 but they require credit card details even for the free tier usage whereas Heroku allows my program to run 24×7 without even the need for a card!
November 12th, 2014 at 13:58
The EC2 free tier (the "micro" instance they offer for free) is not fit for any kind of computation, only for static web pages or low-traffic dynamic web pages.
I think Heroku actually uses EC2 as its back-end, so there should be no price benefit of Heroku over EC2, unless Heroku's payment model is much more convenient.