Author Topic: (HTTPD + CGI + ROOT) Better wikilocation search  (Read 2195 times)

Offline atomicdryad

  • Newbie
  • *
  • Posts: 4
  • Device: Samsung Epic4g
(HTTPD + CGI + ROOT) Better wikilocation search
« on: March 28, 2014, 15:03:11 »
Note: This is a .cgi script meant to run on a personal webserver and be used from a device with root access (see below for alternatives). Technically it's not an addon, but it improves upon Locus functionality in the following ways:

* The Wikipedia search function will return a descriptive POI containing the article's first paragraphs and its thumbnail. In addition, the rather frequent "List of ..." duplicate articles are prefixed with (L).
* Searching with the 'EO' locale will query the geonames database. Unlike the builtin geonames search, this one is based on coordinates.
* Searching with the 'GL' locale will query geonames and return OSM POIs, based on coordinates.

install (server)

This requires a webserver that can run Perl .cgi scripts; Perl needs the LWP and JSON modules. The server must direct queries for api.wikilocation.org to the script. With Apache, this means setting up a virtual host, or .htaccess rules like:
Code: [Select]
  RewriteCond %{HTTP_HOST} ^api\.wikilocation\.org$
  RewriteCond %{REQUEST_URI} articles
  RewriteCond %{QUERY_STRING} (.*)
  RewriteRule ^articles$ /wikiloc.cgi [L]
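Alternatively, a dedicated virtual host can hand the fake hostname to the script directly. This is an untested sketch; paths and the script location are illustrative, adjust them to your layout:

```apache
<VirtualHost *:80>
    ServerName api.wikilocation.org
    DocumentRoot /var/www/wikiloc

    # Map the wikilocation "articles" endpoint onto the CGI script;
    # [PT] passes the rewritten path on so ScriptAlias can handle it.
    RewriteEngine On
    RewriteRule ^/articles$ /cgi-bin/wikiloc.cgi [PT]

    ScriptAlias /cgi-bin/ /var/www/wikiloc/cgi-bin/
</VirtualHost>
```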

setup (android)

Unfortunately, since there's no functionality to change the search URLs in Locus directly, one needs to redirect traffic for api.wikilocation.org to your server. If you have root on your device, this is a simple matter of editing /system/etc/hosts and adding a line of the form "<server-ip> api.wikilocation.org", where "<server-ip>" is the IP address of your webserver.
If you don't have root, it's possible to make this work over wifi by fiddling with your router's DNS. Modifying the Locus .apk is another alternative (however, I'm new here and this is the dev's forum, so I won't assist with that).

Once done, simply search Wikipedia and you should see descriptive text. If there's an error, the script will return one POI titled "A script error has occurred", with debugging output in its description.

Final note: if you plan to use geonames, you'll want to change this config variable:
our $geonamesUsername="demo";
The "demo" user is always past its quota. You can obtain a username at geonames.org.

If you know perl, extending this script should be easy. To replace a locale with a function, see the following config mapping:

Code: [Select]
our %localeOverride = (
  'eo' => sub { snag_geonames('default'); }, # EO: geonames spatial search
  'gl' => sub { snag_geonames('osm'); },     # GL: OSM POI search
);

POIs are sent to Locus like so:

Code: [Select]
     "title":"Distance is in meters",
     "type":"<blink>I am a description</blink><br><img src=\"\">",

Quick howto: replace Wikipedia for the Russian locale with a POI over George W. Bush's house.
Add to %localeOverride:
Code: [Select]
  'ru' => sub { dummy_function(); },
Then:
Code: [Select]
sub dummy_function {
  push @{$data->{articles}}, mkentry(123,29.75786,-95.46454,0,"Uhm..","Hey wait this isn't $param{lat} $param{lng}...","");
  $out = encode_json($data);
}

Offline jusc

  • Global Moderator
  • Professor of Locus
  • *****
  • Posts: 1871
  • Thanked: 10 times
  • Device: Samsung Galaxy S2 and Note 2
Re: (HTTPD + CGI + ROOT) Better wikilocation search
« Reply #1 on: March 28, 2014, 20:05:46 »
Thank you for this.
If I understand it correctly, you have to set up your own server, for instance at home, to get this information directly into Locus?
If so, I fear that for most Locus users it's a lot of work just for the chance to get the information on tour, because you don't always have an internet connection everywhere in the world.
On the other hand, if it were a service with an addon for Locus, it would be interesting.
Regards J.

Offline atomicdryad

  • Newbie
  • *
  • Posts: 4
  • Device: Samsung Epic4g
Re: (HTTPD + CGI + ROOT) Better wikilocation search
« Reply #2 on: March 29, 2014, 09:37:40 »
Unfortunately there are serious roadblocks when it comes to making this available offline; Wikipedia's geolocation data has 1.6 million entries. Wikilocation's database on its own is 80 megs, which isn't big, but that only contains article titles. It's closer to 800 megs (ish) with article snippets, to say nothing of images.
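The back-of-envelope arithmetic behind those sizes (entry count and database sizes from above; the per-entry bytes are derived):

```python
entries = 1_600_000       # georeferenced Wikipedia articles

titles_db_mb = 80         # wikilocation DB, article titles only
snippets_db_mb = 800      # rough size with article snippets included

# Average bytes per entry in each case
per_title = titles_db_mb * 1024 * 1024 / entries       # ~52 bytes/entry
per_snippet = snippets_db_mb * 1024 * 1024 / entries   # ~524 bytes/entry
print(round(per_title), round(per_snippet))
```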

Carving that up by region / country / state would help; Colorado has around 7000 georeferenced articles, for example. But doing that entails an entirely different sort of pain: the article snippets are hosted on Wikipedia and referenced by numeric IDs in the wikilocation database, which means a script that either abuses the heck out of their API or downloads the entire Wikipedia database (and even that doesn't include images).

I'll probably do the latter just to experiment, but there's one last pitfall: I'm not sure Locus can handle hundreds of thousands of POIs. If exported to .kml it would try to render them all, and be limited to 5.5 megs unless imported. I've experienced issues importing lots of stuff into the waypoints database (the app crashed when selecting a category with 13k entries), and the database doesn't have a spatial index, so selecting by range would be troublesome.