Autarchy of the Private Cave

Tiny bits of bioinformatics, [web-]programming etc.

    Archive for May, 2007

    How to make PEAR work from behind an HTTP proxy (Windows and Linux)

    31st May 2007

    Earlier, in one of my posts (Using PEAR HTTP_Client or HTTP_Request with HTTP proxy), I gave an example of using PEAR HTTP_Client and/or HTTP_Request from behind an HTTP proxy. However, I didn’t explain how to make PEAR itself work properly from behind an HTTP proxy (e.g. for online operations like “pear upgrade-all”).

    So here’s that tiny missing bit of information.

    Windows:
    Launch regedit, navigate to HKEY_CURRENT_USER\Environment, and create a string value named PHP_PEAR_HTTP_PROXY. Set that new value to a string like http://proxy_username:proxy_password@proxy_server_address:proxy_port.
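
    Equivalently, you can set the same per-user environment variable from a Command Prompt, assuming your version of Windows ships the setx utility (otherwise stick to regedit):

    setx PHP_PEAR_HTTP_PROXY http://proxy_username:proxy_password@proxy_server_address:proxy_port

    Either way, re-open any console windows afterwards so that new processes pick the variable up.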

    Linux:
    In the Terminal/Konsole, execute (for a system-wide pear configuration)

    sudo pear config-set http_proxy http://proxy_username:proxy_password@proxy_server_address:proxy_port

    If your proxy password contains characters that are special to the shell (e.g. a question mark or an exclamation mark), enclose the full proxy specification in single quotes, e.g.

    sudo pear config-set http_proxy 'http://proxy_username:proxy_password@proxy_server_address:proxy_port'

    If your HTTP proxy server does not require authentication, then use http://proxy_server_address:proxy_port instead.

    The placeholders should be self-explanatory; nevertheless, here’s a complete example of a proxy specification with authentication: http://john.smith:CrAzYP433WoRd@192.168.0.1:3128.
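
    To verify that PEAR picked the setting up, you can query it back:

    pear config-get http_proxy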


    Posted in Misc, PHP, Programming | No Comments »

    Windows memory management unveiled

    28th May 2007

    I enjoyed this article while trying to make the R environment on Windows process some 16 microarrays using a memory-intensive algorithm. After several hours of processing, with memory use gradually increasing, the R script failed with a memory-allocation error.

    Interestingly, running the same script in Topologilinux, launched from within Windows, did the trick. Being a virtualized environment, it was slower, and page-swapping was really heavy with only 0.5 GB of memory.


    Posted in Links, OS, Programming | No Comments »

    On the use of Artificial Neural Networks for AI

    28th May 2007

    I came across a list of postulates (link removed – content disappeared) which define the space for creating strong artificial intelligence. One of the postulates, which says that AI can be implemented only using ANNs, does not appear to be proven clearly enough to be a real requirement.

    Consciousness is not necessarily a derivative of complexity; it can rather be a derivative of a world model and the subject’s placement in that model, which is what causes consciousness to arise. (In other words, consciousness equals the subject’s ability to place itself within a constantly self-revalidating model of the environment.) Thus, the requirement to use ANNs is not convincing: one can ensure that the appropriate world model is created without ANNs. I would even say that ANNs are just a kind of “black box” we use to sidestep complexity that is plainly there, but which can nevertheless be handled purely algorithmically, without the extra overhead of ANN-like simulators and wrappers.

    There appears to be a specific double dichotomy in the ANN versus algorithmic approaches to AI development: the ANN approach considers the brain to be a collection of individual neurons (or “perceptrons”, in this case), while the algorithmic approach considers the brain to be a collection of “modules”, each performing some quite narrow function. At the same time, we are told that algorithmic approaches cannot foresee unforeseen circumstances, and thus ANNs are better for AI development (that’s the second dichotomy). However, modern “intelligent” software (here I mean, first of all, cognitive-functions software) rather successfully uses “learning algorithms”, “pattern-matching algorithms”, “inference algorithms”, “prediction algorithms” and many other “algorithms”. At the same time, I’m unaware of any successful (or at least impressive) software tool built using ANNs.

    (Unwinding the above paragraph may lead to a controversy: ANN implementations are algorithms themselves. However, I would make a clear distinction here: I consider an ANN to be a rather generic simulator of inter-neuronal interactions and signal-response circuits; the same type of ANN could be applied to several different tasks (well, different instances of the same-type ANN). But if a generic ANN is trained for a specific task, then optimized for that task only, and then perhaps also simplified and extended with fixed-value tables to avoid recalculating static relations – that is no longer an ANN but an algorithm, as being task-specific it fits the definition of an algorithm much better than the definition of an ANN.)

    Bernardo Kastrup, the author of the “postulates”, referred me to Daniel Dennett. Daniel Dennett is, like me, a proponent of algorithmic approaches to AI. However, I haven’t read any of his works yet. As soon as I do, I’ll add more to the ANN vs. algorithms topic.


    Posted in Artificial Intelligence | No Comments »

    jpegtran and ffmpeg on GoDaddy in Gallery2

    28th May 2007

    Jpegtran is a command-line tool for lossless rotation and cropping of JPEG files. Ffmpeg is a tool for basic video processing and conversion. Gallery2 is a powerful and popular photo-gallery web application.

    By default, Gallery2 uses either PHP’s GD2 or the ImageMagick toolkit to rotate/crop images. However, you can install the jpegtran plugin to rotate/crop JPEGs without any loss of quality.

    On GoDaddy shared hosting, the path to jpegtran is /usr/bin/jpegtran. However, at least in my case, that binary failed the ‘crop’ test (but passed the ‘rotate’ test). So I downloaded another jpegtran binary (from this page), put it into one of my folders, and told Gallery2 to use that binary instead of /usr/bin/jpegtran. This worked perfectly.
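
    If you want to check a binary by hand before pointing Gallery2 at it, you can try a manual crop along these lines (the file names and geometry are arbitrary examples, and -crop is only available in crop-capable jpegtran builds):

    jpegtran -crop 120x120+10+10 -copy none input.jpg > cropped.jpg

    A binary that fails Gallery2’s crop test will typically error out or ignore the -crop switch here.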

    The same approach can be used to enable thumbnails for videos via the ffmpeg plugin and binary (unfortunately, I have no idea where I got ffmpeg from – it was quite a while ago). Just download the binary, put it into one of your folders, tell Gallery2 the absolute path to the binary, and you are done!
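
    For reference, the kind of command such a plugin runs to grab a thumbnail looks roughly like this (file names are placeholders, and exact options vary between ffmpeg versions):

    ffmpeg -i input_video.avi -ss 3 -vframes 1 -f image2 thumbnail.jpg

    This seeks 3 seconds into the video and writes a single frame out as a JPEG image.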

    Finally, here are the links to the two binaries mentioned above:
    jpegtran
    ffmpeg

    ffmpeg update: see here.


    Posted in Software, Web | 6 Comments »

    Radical alternative to caching: On-the-fly Content-Regeneration

    23rd May 2007

    While refreshing my scarce knowledge of Apache’s mod_rewrite, I read through the mod_rewrite guide and found an extremely interesting section, titled

    On-the-fly Content-Regeneration

    Here’s the theoretical problem:

    1. we are building a high-traffic site with lots of items updated once per hour or day
    2. we have a CMS with all the features we need, but it is really CPU/DB-hungry and slow (does it sound familiar? :) )
    3. we need to serve static files

    And here’s the ‘radical alternative’ solution:

    1. install the CMS of your choice
    2. tweak the CMS’s output layer to both write (or update) static HTML files on disk and dump those same pages directly to the browser (see the sketch below)
    3. use the “On-the-fly Content-Regeneration” mod_rewrite rule set

    That’s it, in short. The “On-the-fly Content-Regeneration” rules will serve the static files if they exist, or will query the CMS, which will create/update the static files and output the requested page. You can also set up a cron job to remove all static files older than XX minutes, to force a content refresh.
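
    Here is a minimal PHP sketch of step 2 – not from the guide, just an illustration; render_cms_page() is a hypothetical stand-in for whatever builds the page in your CMS:

    <?php
    // Derive the static file path from the requested URL.
    $staticFile = $_SERVER['DOCUMENT_ROOT']
                . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    ob_start();
    render_cms_page();           // hypothetical: the CMS builds the page
    $html = ob_get_contents();
    ob_end_flush();              // send the page to the browser as usual

    // Write the same bytes to disk; mod_rewrite serves this file next time.
    file_put_contents($staticFile, $html);
    ?>

    And the cron job mentioned above can be as simple as a one-liner (assuming GNU find), e.g. deleting static pages older than 60 minutes:

    find /path/to/static/files -name '*.html' -mmin +60 -delete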

    Below is a copy of the actual “On-the-fly Content-Regeneration” rule set from the mod_rewrite guide.
    Read the rest of this entry »


    Posted in CMS, Links, Notepad, Programming, Web | No Comments »

    Directory-based random image rotation PHP script

    23rd May 2007

    Yesterday I needed to put together a rather simple PHP script: it would read the contents of a single pre-configured directory and randomly select up to a pre-configured number of files. The files were images, and were simply dumped into the web page as IMG tags. I came up with a solution, shown below.

    The script is simple, but it’s still easier to use a ready solution than to write your own :).
    It is heavily commented and should be easy to understand.
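
    For a quick idea of the approach, here is a minimal sketch (the directory name and image count are my assumptions, not the original script’s configuration):

    <?php
    $imageDir  = 'images/rotation';  // pre-configured directory
    $maxImages = 3;                  // pre-configured number of images

    // Collect regular files from the directory, skipping subdirectories.
    $files = array();
    foreach (scandir($imageDir) as $f) {
        if (is_file("$imageDir/$f"))
            $files[] = $f;
    }

    // Randomize the order and output up to $maxImages IMG tags.
    shuffle($files);
    foreach (array_slice($files, 0, $maxImages) as $f) {
        echo '<img src="' . htmlspecialchars("$imageDir/$f") . '" alt="" />' . "\n";
    }
    ?>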
    Read the rest of this entry »


    Posted in PHP, Programming, Web | No Comments »

    Executing and checking background shell process from PHP

    23rd May 2007

    I found a nicely illustrated method for running a background shell command from PHP and continuously checking whether the process is still running.

    Here’s the sample code:

    function run_in_background($Command, $Priority = 0)
    {
        // nohup detaches the command from the PHP process; "echo $!"
        // returns the PID of the backgrounded process for later tracking.
        if ($Priority)
            $PID = shell_exec("nohup nice -n $Priority $Command 2> /dev/null & echo $!");
        else
            $PID = shell_exec("nohup $Command 2> /dev/null & echo $!");
        return $PID;
    }

    function is_process_running($PID)
    {
        // "ps $PID" prints a header line plus one line per matching process,
        // so two or more output lines mean the process is still alive.
        exec("ps $PID", $ProcessState);
        return count($ProcessState) >= 2;
    }

    To run something like hmmsearch from the HMMER package, you’d do this:

    echo("Running hmmsearch. . .");
    $ps = run_in_background("hmmsearch $hmmfile $fastafile > $outfile");
    while (is_process_running($ps))
    {
        echo(" . ");
        ob_flush(); flush();  // push the progress dots to the browser
        sleep(1);
    }

    Posted in Links, PHP, Programming | 8 Comments »