The scientific Python community has been taken by storm by the IPython Notebook, which allows for 'exploratory programming': with your data already loaded, you can easily try out small changes to algorithms or methods and quickly see the result.
When working with large images in this setting, spatial analysis was simple and easy, but displaying the results was frustrating. An analysis algorithm might take only a couple of seconds to rerun when the image was represented as a NumPy array, but showing the results overlaid on the image could take upwards of a minute.
I solved this by loading the large image once in an external application that pulled analysis results from a database; the IPython Notebook updated the database whenever the analysis was rerun.
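As a minimal sketch of this pattern (the table name, schema, and file path are illustrative assumptions, not details from the talk), the notebook could publish each run's results to a SQLite file that the viewer application polls and redraws from:

```python
import sqlite3

# Hypothetical sketch: the notebook writes analysis results (e.g. detected
# object positions) to a SQLite database; the external viewer, which keeps
# the large image loaded, reads the same file to refresh its overlay.

def publish_results(db_path, results):
    """Replace the current result set so the viewer sees only the latest run."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS detections (x REAL, y REAL, score REAL)"
        )
        conn.execute("DELETE FROM detections")  # drop the previous run's results
        conn.executemany(
            "INSERT INTO detections (x, y, score) VALUES (?, ?, ?)", results
        )

def fetch_results(db_path):
    """What the viewer application would run to pull the latest overlay data."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute("SELECT x, y, score FROM detections").fetchall()

# In the notebook, after each rerun of the analysis:
publish_results("results.db", [(10.5, 20.0, 0.9), (300.2, 41.7, 0.7)])
```

Because only the small result table crosses the boundary, rerunning the analysis stays fast while the expensive image rendering happens once, in the external application.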
This strategy of delegating to applications that are very good at their specific task, while performing exploratory method development or analysis in an IPython Notebook, is useful across many problems in various fields.
In the talk I will go through my initial problem and solution, enhancements to that initial solution, and other examples of the same strategy being useful, including which approaches to talking to external applications worked better than those that failed.