25th June 2007
Googling for “practical artificial intelligence” gives only two (somewhat) relevant links:
It seems it isn’t widely acknowledged that AI is, in fact, in quite wide practical use, though primarily in OCR, TTS, STT :), and NLP (including machine translation).
Posted in Artificial Intelligence, Links, Programming, Science | 5 Comments »
25th June 2007
If you happen to need to check your Linux filesystem while it is mounted read-write, and for some reason you do not want to reboot, the simple sequence of commands below should help. Note that these commands put you into single-user mode, which kills daemons such as the web server and MySQL.
Running e2fsck on a live (read-write-mounted) filesystem isn’t recommended, and e2fsck asks whether you really want to check such a filesystem (be sure to answer ‘n’). If the filesystem you want to check is the root (/), you also cannot simply remount it read-only; you first have to drop to single-user mode:
init 1
Now you can re-mount your FS read-only:
mount -o ro,remount /dev/cobd0
(/dev/cobd0 is my device; replace it with your own when following these steps.)
The filesystem is now mounted read-only, and it’s safe to run e2fsck:
e2fsck -D -C 0 -f -t -v /dev/cobd0
All the e2fsck options here are optional:
-D: optimize directory structure
-C 0: show progress
-f: force check (use if you get “volume is clean” with no check)
-t: print e2fsck timing statistics
-v: verbose mode
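If you want to be sure how the check went, have a look at e2fsck’s exit code: 0 means no errors were found, 1 means errors were found and corrected, 2 means errors were corrected but a reboot is recommended, and higher values indicate remaining problems. For example (same device as above):
e2fsck -D -C 0 -f -t -v /dev/cobd0
echo "e2fsck exit code: $?"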
Finally, return to your previous runlevel (normal multi-user mode):
init 3
Posted in *nix, Notepad, OS | No Comments »
31st May 2007
Earlier, in one of my posts (Using PEAR HTTP_Client or HTTP_Request with HTTP proxy), I gave an example of using PEAR HTTP_Client and/or HTTP_Request from behind an HTTP proxy. However, I didn’t explain how to make PEAR itself work properly from behind an HTTP proxy (e.g. for online operations like “pear upgrade-all”).
So here’s that tiny missing bit of information.
Windows:
Launch regedit, navigate to HKEY_CURRENT_USER\Environment, and create a string value named PHP_PEAR_HTTP_PROXY. Set its value to a string like http://proxy_username:proxy_password@proxy_server_address:proxy_port.
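If you prefer the command line, the setx utility (built into Windows Vista and later, and available for earlier versions via the Support Tools) writes the same per-user environment variable; the proxy details below are placeholders:
setx PHP_PEAR_HTTP_PROXY "http://proxy_username:proxy_password@proxy_server_address:proxy_port"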
Linux:
In a terminal (Konsole, xterm, etc.), execute (for a system-wide PEAR configuration):
sudo pear config-set http_proxy http://proxy_username:proxy_password@proxy_server_address:proxy_port
If your proxy password contains characters that are special to the shell (e.g. a question mark or exclamation mark), enclose the full proxy specification in single quotes, e.g.
sudo pear config-set http_proxy 'http://proxy_username:proxy_password@proxy_server_address:proxy_port'
If your HTTP proxy server does not require authentication, then use http://proxy_server_address:proxy_port instead.
I think the strings are completely self-explanatory; nevertheless, here’s an example of a proxy specification with authentication: http://john.smith:CrAzYP433WoRd@192.168.0.1:3128.
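To verify that PEAR has actually picked up the proxy setting, you can read it back:
pear config-get http_proxy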
Posted in Misc, PHP, Programming | No Comments »
28th May 2007
I came across (and enjoyed) this article while trying to make the R environment on Windows process some 16 microarrays with a rather memory-intensive algorithm. After several hours of processing and steadily growing memory use, the R script failed with a memory-allocation error.
Interestingly, running the same script in Topologilinux, launched from within Windows, did the trick. Being a simulated environment, it was slower, and page swapping was really heavy with only 0.5 GB of memory.
Posted in Links, OS, Programming | No Comments »
23rd May 2007
Refreshing my scarce knowledge of Apache’s mod_rewrite, I read through the mod_rewrite guide and found an extremely interesting section, titled
On-the-fly Content-Regeneration
Here’s the theoretical problem:
- we are building a high-traffic site with lots of once-per-(hour|day) updated items
- we have a CMS with all the features we need, but it is really CPU- and DB-hungry and slow (does this sound familiar?)
- for performance, we really want to serve static files
And here’s the ‘radical alternative’ solution:
- install the CMS of choice
- tweak the CMS’s output layer so that it both writes (or updates) static HTML files on disk and sends those same pages directly to the browser
- use the “On-the-fly Content-Regeneration” mod_rewrite rule set
That’s it, in short. With the “On-the-fly Content-Regeneration” rules, Apache serves the static files when they exist; otherwise it queries the CMS, which creates/updates the static files and outputs the requested page. You can also set up a cron job that removes static files older than XX minutes, to force a content refresh. A rough sketch of both ideas is shown right below.
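For illustration only, here is a minimal hypothetical version of such a setup (.htaccess-style rules; the /cms/regenerate.php handler name is an assumption, and the actual recipe from the guide is quoted further below):
RewriteEngine on
# If the requested .html file is missing or empty, hand the request over to the CMS,
# which regenerates the static file and sends the page to the browser itself.
RewriteCond %{REQUEST_FILENAME} !-s
RewriteRule ^(.+)\.html$ /cms/regenerate.php?page=$1 [L]
The cron clean-up could then be as simple as (path and age limit assumed):
find /var/www/site -name '*.html' -mmin +60 -delete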
Below is the copy of “On-the-fly Content-Regeneration” from the mod_rewrite guide.
Read the rest of this entry »
Posted in CMS, Links, Notepad, Programming, Web | No Comments »
23rd May 2007
Yesterday I needed to put together a rather simple PHP script: it would read the contents of a single pre-configured directory and randomly select up to a pre-configured number of files. These files were images, and they were simply dumped into the web page as IMG tags. The solution I came up with is shown below.
The script is simple, but it’s still easier to take a ready-made solution than to write your own :).
It is heavily commented, and should be easy to understand.
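Just to illustrate the general idea before the full listing (a minimal hypothetical sketch, not the commented script from this post; the directory name, the limit of 5 images and the extension list are assumptions):
<?php
// Sketch: pick up to $max random images from $dir and print them as IMG tags.
$dir = './images';   // pre-configured directory (assumed)
$max = 5;            // pre-configured maximum number of images (assumed)

$files = array();
foreach (scandir($dir) as $f) {
    if (preg_match('/\.(jpe?g|png|gif)$/i', $f)) {
        $files[] = $f;
    }
}

shuffle($files);
foreach (array_slice($files, 0, $max) as $f) {
    echo '<img src="' . htmlspecialchars($dir . '/' . $f) . '" alt="" />' . "\n";
}
?>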
Read the rest of this entry »
Posted in PHP, Programming, Web | No Comments »
23rd May 2007
Found a nicely illustrated method for running a background shell command from PHP and continuously checking if the process is still running.
Here’s sample code without explanations:
function run_in_background($Command, $Priority = 0)
{
    // Launch the command detached from PHP: nohup keeps it alive after the
    // shell exits, stderr is discarded, and "echo $!" returns the new PID.
    if ($Priority)
        $PID = shell_exec("nohup nice -n $Priority $Command 2> /dev/null & echo $!");
    else
        $PID = shell_exec("nohup $Command 2> /dev/null & echo $!");
    return $PID;
}

function is_process_running($PID)
{
    // "ps PID" prints a header line plus one line per matching process,
    // so two or more lines of output mean the process is still running.
    exec("ps $PID", $ProcessState);
    return count($ProcessState) >= 2;
}
To run something like hmmsearch from the HMMER package, you’d do this:
echo("Running hmmsearch. . .")
$ps = run_in_background("hmmsearch $hmmfile $fastafile > $outfile");
while(is_process_running($ps))
{
echo(" . ");
ob_flush();flush();
sleep(1);
}
Posted in Links, PHP, Programming | 8 Comments »