Favourite Code

In general I use Python with NumPy/SciPy/Matplotlib for most of my experiments, so most of these are Python tools.

Here are some projects I find helpful:

Computer Vision:

scikit-image:
A Python library for image processing. It contains many elementary methods and algorithms as well as some descriptors. A fast-growing project that is worth a look.
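As a quick taste of the API (a minimal sketch, assuming a recent scikit-image where the filter functions live in the `filters` module), here is edge detection on one of the bundled test images:

```python
# Minimal scikit-image sketch: Sobel edge detection on a bundled test image.
# Assumes a recent scikit-image (module named `filters`, not the old `filter`).
from skimage import data, filters

image = data.camera()          # grayscale test image shipped with scikit-image
edges = filters.sobel(image)   # gradient-magnitude edge map, same shape as input

print(image.shape, edges.shape)
```

Everything operates on plain NumPy arrays, so the results plug directly into Matplotlib or SciPy code.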
 
VLFeat:
This library provides very fast SIFT computation, k-means, QuickShift, MSER and other tools. Very fast, very easy to use. Matlab, C++ and binaries.

There is also a GitHub clone that provides Python bindings for some of the functionality, as well as my updated Python bindings.

Damascene:
A CUDA implementation of the gPb operator by Malik's group.
It includes an implementation of textons and also normalized cuts on the GPU. I have Python and Matlab bindings if anyone needs them.

General Machine Learning:
Take a look at this post on many toolkits (not all of which I have used) for more.

scikit-learn:
A machine learning library for Python. I use this for nearly everything. It is mature, easy to use and has lots of algorithms: LibSVM wrappers, linear SGD classifiers, k-means, cross-validation and grid-search tools, manifold learning, Gaussian processes, spectral clustering, ICA, dictionary learning, decision trees... you name it.
Very well documented, with many examples and a cool example gallery!
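A small sketch of the cross-validation and grid-search workflow mentioned above, on synthetic data (the parameter grid is illustrative, not a recommendation; in current scikit-learn versions these tools live in `sklearn.model_selection`):

```python
# Sketch: cross-validated grid search over SVM hyperparameters in scikit-learn.
# Synthetic data; C/gamma values are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 3-fold cross-validation over a small parameter grid
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}, cv=3)
grid.fit(X_train, y_train)

print(grid.best_params_, grid.score(X_test, y_test))
```

The same `fit`/`predict`/`score` interface is shared by essentially every estimator in the library, which is what makes swapping algorithms so painless.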

Other Programming Tools:

CUV Library:
A C++/CUDA/Python library with basic matrix, vector and convolution routines, developed in my group. It was intended for running restricted Boltzmann machines and neural networks on the GPU.
The idea is to call CUDA functions from Python, combining the speed of CUDA with the ease of development of Python.
If only simple matrix computations are involved, an algorithm can be transferred to the GPU simply by replacing Matlab or Python code line by line with library calls.


joblib:
A Python package for parallelization and persistence. It is very easy to use for
parallelizing across multiple CPUs; it does not work across multiple machines, though.
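The parallel part really is a one-liner (a minimal sketch; the worker count and the toy function are illustrative):

```python
# Sketch: parallelizing an embarrassingly parallel loop over local CPU cores
# with joblib. n_jobs=2 and the toy sqrt task are illustrative only.
from math import sqrt
from joblib import Parallel, delayed

results = Parallel(n_jobs=2)(delayed(sqrt)(i ** 2) for i in range(10))
print(results)
```

`delayed` wraps the function call into a task description, and `Parallel` distributes the tasks over worker processes while keeping the results in the original order.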