March 24, 2010

Fuzzy Image Classification for GRASS

Please refer to my previous post on evidential reasoning based image classification for GRASS. We have extended this to fuzzy image classification.

In conventional classification, we assign each pixel to a single class, say road, water body, or forest. However, a pixel, say one of dimensions 12.5 m X 12.5 m, will rarely belong entirely to one class. Instead it may contain more than one class, say road and barren land. Fuzzy classification can derive the percentage of each class within a pixel.

Hence, fuzzy classification provides better results, but is computationally and conceptually more complex. A detailed explanation of fuzzy classification is beyond the scope of this post.
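To illustrate the core idea, here is a minimal sketch of turning per-class scores for one pixel into fractional class memberships. The class names and scores are illustrative only; they are not taken from the actual classifier.

```python
# Minimal sketch: normalise raw per-class scores for one pixel
# into membership fractions that sum to 1.
# Class names and score values below are hypothetical.

def fuzzy_memberships(scores):
    """Normalise per-class scores into fractional memberships."""
    total = sum(scores.values())
    return {cls: s / total for cls, s in scores.items()}

# One mixed 12.5 m X 12.5 m pixel: part road, part barren land.
pixel_scores = {"road": 3.0, "barren": 1.0}
print(fuzzy_memberships(pixel_scores))  # road 0.75, barren 0.25
```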

The code is available for download from github. Please contact me if you are interested in this, or wish to enhance it further.


March 20, 2010

i.erclassifier: A better image classification module for GRASS

Extracting information from satellite imagery is a major task in remote sensing. A number of image classification algorithms are available, suited to different purposes. However, the availability of open source solutions for this is very limited. The major one is GRASS, which has a maximum likelihood classifier (i.maxlik) built in. However, maximum likelihood is the most basic form of supervised image classification.

During my MTech course, I developed a new classifier for GRASS, based on the theory of evidential reasoning. Let me call it i.erclassifier. Theoretically, an evidential reasoning classification algorithm can handle uncertainties better, and is not subject to statistical assumptions. (The maximum likelihood classification algorithm, like other statistical classifiers, assumes that the data follow a normal distribution.)
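As a rough illustration of how evidential reasoning combines uncertain evidence, here is a sketch of Dempster's rule of combination applied to two sources of evidence. This is a simplified textbook version of the theory, not the actual i.erclassifier code, and the class names and mass values are made up.

```python
# Sketch of Dempster's rule of combination: two mass functions
# (dicts mapping frozensets of classes to belief mass) are combined
# by intersecting focal elements and renormalising away conflict.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # contradictory evidence
    norm = 1.0 - conflict
    return {s: v / norm for s, v in combined.items()}

# Hypothetical example: two bands give evidence over two classes.
W, F = frozenset({"water"}), frozenset({"forest"})
theta = W | F  # the full frame: total ignorance
m1 = {W: 0.6, theta: 0.4}
m2 = {W: 0.5, F: 0.3, theta: 0.2}
combined = dempster_combine(m1, m2)
# Agreement on "water" strengthens its belief to about 0.76.
```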

i.erclassifier is available for download from github. To use this classifier you also need to download the i.genevid module, which generates the evidence file (analogous to a signature file) required by the classifier.

February 28, 2010

Constraint Exclusion: A Python Script

For projects done at Keltron, almost all modules had “continuously growing data”. Consider, for example, a simple vehicle tracking application: every second we receive a packet from each vehicle, so the history data grows with time. As the volume of data grows, retrieval takes longer. I have seen many people use different methods for this pattern of problem, each with its own inherent issues. One significant issue I have observed is that indexing is nearly ineffective in such solutions: with an insertion into the table every second, the index must be updated constantly.

Constraint exclusion, provided by PostgreSQL, was our solution of choice for this pattern of problems. Please see the PostgreSQL documentation for the details of constraint exclusion. The major advantages we have observed are:

  • Fast response. Without constraint exclusion, response time increases significantly as the data volume grows.
  • Indexing can be used efficiently, by avoiding an index on the “child table” to which data is currently being added.
  • Incremental backup (removing data older than “x” days from the table and archiving it on tape) is now simple and straightforward.
  • There is no need to run vacuum after the incremental backup. Even if you do run vacuum, it won't take much time.
  • Restoration is now simple and straightforward. In our case, a user may ask the software to show data for date x. We can restore only that portion of the data and show it alongside the current data.

However, effective use of constraint exclusion requires proper planning and care. The following factors need to be taken care of:

  • The constraints should be set properly for constraint exclusion to be effective.
  • There should be no overlap between child tables.
  • An index may be created on the previous child table at the time of creating a new child table.
  • Child tables should be named properly. Do not reuse a child table name; instead, generate a unique name using a timestamp or similar.

The Python script we used for this purpose is available for download from github. We added a few lines of code to handle some special cases we encountered; otherwise it is a good generic tool. At the minimum, it will give you a clear idea of how to implement constraint exclusion for time-based data.
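To give a flavour of what such a script emits, here is a sketch of the DDL for one daily child table with a non-overlapping CHECK constraint and a date-stamped name. The parent table and column names are illustrative, not the actual Keltron schema.

```python
# Sketch: generate DDL for one daily child table of a hypothetical
# "vehicle_history" parent. The CHECK constraint gives PostgreSQL's
# constraint exclusion a non-overlapping, one-day range, and the
# date suffix ensures child table names are never reused.
from datetime import date, timedelta

def child_table_ddl(day):
    """Build CREATE TABLE DDL for the child table covering one day."""
    nxt = day + timedelta(days=1)
    name = "vehicle_history_%s" % day.strftime("%Y%m%d")
    return (
        "CREATE TABLE %s (\n"
        "    CHECK (logged_at >= '%s' AND logged_at < '%s')\n"
        ") INHERITS (vehicle_history);" % (name, day.isoformat(), nxt.isoformat())
    )

print(child_table_ddl(date(2010, 2, 28)))
```

A nightly cron job can run such a function to create tomorrow's child table, and at the same time build an index on the child table that has just stopped receiving inserts.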

Thanks to Febin TT and Nels of Keltron for their support. Please get back to me for any clarifications or support.

February 25, 2010

Django and GRASS

Filed under: My Scripts and Tools,Uncategorized — sajithvk @ 4:00 pm

FOSS-based WebGIS has witnessed significant growth during the last couple of years, thanks to OpenLayers, PostGIS, MapServer, etc.

At Keltron, we have utilised the FOSS WebGIS stack for several projects. For the front end, OpenLayers is our default choice. For map rendering we used UMN MapServer, and for back-end processing we used GeoDjango.

GeoDjango provides a lot of features along with the simplicity of Django. However, there are situations where we need more complex GIS analytics. GRASS is great for that, but how do you use it in a web environment? For example, we had to compute weighted shortest routes obeying one-way rules, and even scheduled one-way rules. The appropriate GRASS module can do this very easily, but we needed a way to present the result through the WebGIS interface.

PyWPS looked like the best choice here: well documented, and following the WPS standard. However, for us it was overkill. Moreover, we often found PyWPS difficult to debug when something went wrong.

Hence we developed a small script (shamelessly borrowing ideas from PyWPS) that allows us to execute a set of GRASS commands from Django. It is a simple script; have a look. The code is available for download from github.
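The core idea can be sketched as a small helper that invokes a GRASS command as a subprocess with the environment variables GRASS expects. The GISBASE and GISRC paths below are assumptions for illustration; adjust them to your GRASS installation, location, and mapset.

```python
# Sketch: run one GRASS command from Django via subprocess.
# The default gisbase/gisrc paths are hypothetical examples.
import os
import subprocess

def run_grass(cmd, gisbase="/usr/lib/grass64", gisrc="/tmp/grassrc"):
    """Run a command with GRASS environment variables set and
    return its stdout; raise if the command exits non-zero."""
    env = dict(os.environ,
               GISBASE=gisbase,
               GISRC=gisrc,
               PATH=os.pathsep.join([os.path.join(gisbase, "bin"),
                                     os.path.join(gisbase, "scripts"),
                                     os.environ.get("PATH", "")]))
    proc = subprocess.Popen(cmd, env=env,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
    out, err = proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError(err)
    return out
```

A Django view can then call such a helper with the GRASS command and parameters, and render the captured output (or a resulting map layer) back to the WebGIS front end.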

Thanks to Febin TT and Gopakumar for their support in developing this.

Drop me a mail if you find it useful. Contact me for any further clarifications / support required.

February 24, 2010

Hello world!

Filed under: Uncategorized — sajithvk @ 4:54 pm

