A few months back I came across an interesting website - http://www.technoprofits.org/volunteers/ - a great place for technology experts to lend a hand to non-profits. And a few days back, someone working on an NREGA project called me up asking for help. It turned out he was a fellow trek-mate and an old friend. Small world indeed! Now I have a weekends-only, non-paid but highly rewarding job :) Apart from myself, there's Nikhil, Suvikas, Milind & Mrinmayee on the team, and we expect a few more to come aboard soon! And of course there's Priyadarshan, who's going to be the first client for the application!
Now to the interesting part -
NREGA is the National Rural Employment Guarantee Act, passed by the Indian parliament in 2005. With this act, employment has become a right of rural citizens, as it guarantees 100 days of unskilled employment to every rural household. Last year's budgeted expenditure was Rs 40,000 Cr. The budgets and spending under this scheme are meticulously uploaded to the NREGA website. The current project aims to make the information from the NREGA website available in a user-friendly way to the target users - NGOs, grass-roots activists, administration - thereby empowering them with information, and, in a later phase, providing a collaboration platform for them to work with each other.
Though the NREGA site has a lot of data - state/district/taluka/gram panchayat level details of the works, expenditures etc. - it's presented poorly and is not easy to work with. They do offer an export to Excel, but that has the same issues. Here's an example link - http://188.8.131.52/netnrega/writereaddata/citizen_out/phy_fin_reptemp_Out_18_local_1011.html The data itself clearly comes from a database, but it's hard to get hold of that database, so an alternative means of fetching, storing, processing & displaying the data is needed.
Since there's no direct database access, we resort to good old scraping to get all the data and store it in a MySQL database. Next, we lay the data out in a neat, meaningful format with plenty of options to sort, filter and drill down into it. I learnt a lot about the power of Nooku at JandBeyond and am planning to put that knowledge to good use. Once the database & scraper are ready, we plan to implement a Nooku-based extension to display the data. Here's the mockup of the page done by one of the team members -
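To give an idea of what the scraping step involves, here's a minimal sketch using only the Python standard library. The table layout and column names below are hypothetical stand-ins - the real NREGA report pages would need their own column mapping, and the HTML would be fetched with urllib from a report URL rather than embedded as a string.

```python
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Collects the rows of every <table> in an HTML page."""
    def __init__(self):
        super().__init__()
        self.rows = []    # finished rows, each a list of cell strings
        self._row = None  # row currently being built
        self._cell = None # text fragments of the cell being built

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = []

    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data)

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self._cell is not None:
            self._row.append("".join(self._cell).strip())
            self._cell = None
        elif tag == "tr":
            if self._row:           # skip empty rows
                self.rows.append(self._row)
            self._row = None

# In the real scraper this HTML would come from
# urllib.request.urlopen() on a report page; this tiny
# sample (with made-up figures) stands in for one.
sample = """
<table>
  <tr><th>District</th><th>Works</th><th>Expenditure</th></tr>
  <tr><td>Pune</td><td>120</td><td>4500000</td></tr>
</table>
"""

scraper = TableScraper()
scraper.feed(sample)
print(scraper.rows)
```

Each parsed row would then be written into the MySQL tables with an ordinary parameterised INSERT; the parsing and the storage stay nicely separated that way.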
The Joomla platform gives us a robust and scalable base to build this application on. Extensions like Nooku and Joomfish will provide the power needed to implement the drill-downs and the multi-lingual facilities for non-English users.
Finalising the database structure is what's being done right now, and once that's in place, I hope to work up the UI pretty soon with some Nooku magic! I'll be writing more as we proceed through this project. Ideas, thoughts, any kind of help is appreciated!