How to run the training code.

Short version:
1) Set everything in globals.m
2) Make sure the libraries in libs/ are set up properly by running compile.m
   This may work out of the box; if not, all you need is a working copy of
        a) libsvm
        b) fconv and features from DPM
   on your MATLAB path.

3) Run TDPMaster.m -- this does the discriminative clustering

   This is a whole-day process or more, depending on how many iterations
   are done. This can be accelerated by launching instances of TDPWorker.

   In the paper, we set numIterations in globals.m to 3, but for testing
   purposes (since it's faster to do fewer iterations), this is set to 1.

4) Run process.m -- this extracts the clusters from the generated data and
   writes out various files.

   This is a ~1-2 hour process.

5) Run runDetectors.m -- this can be run in parallel (i.e., multiple copies)

   This is probably a ~1-2 hour process.

6) Finally, run calibrateOnTraining.m -- this can take advantage of parfor 
   (but beware of memory requirements: MATLAB makes copies of some objects,
    taking up a great deal of space)

   This is probably also a ~1-2 hour process.

That's it! The resulting folder (set by patchesTarget in the globals file) 
contains 3D Primitives that can be read by the inference code.

You will need to ensure that the rescaled normals data (produced by the data
setup script) is in the result folder. Alternatively, if you've provided your own
surface normal data, just create quarter-sized versions of the surface normal
images with the same file structure as the training data.
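That rescaling step can be sketched in MATLAB as follows. This is a sketch under assumptions, not code from this repository: it assumes your full-resolution normals follow the nm_%06d.mat convention described in the long version below; srcDir, dstDir, and numImages are placeholders you must set, and imresize requires the Image Processing Toolbox.

```matlab
% Sketch: produce quarter-sized surface normal files.
% srcDir, dstDir, and numImages are hypothetical names -- set them yourself.
srcDir = 'normals_full/';
dstDir = 'normals_quarter/';
if ~exist(dstDir, 'dir'), mkdir(dstDir); end
for i = 1:numImages
    S = load(fullfile(srcDir, sprintf('nm_%06d.mat', i)));
    nx = imresize(S.nx, 0.25);
    ny = imresize(S.ny, 0.25);
    nz = imresize(S.nz, 0.25);
    % Renormalize after interpolation so each pixel is a unit vector again.
    n = sqrt(nx.^2 + ny.^2 + nz.^2) + eps;
    nx = nx ./ n;  ny = ny ./ n;  nz = nz ./ n;
    % Nearest-neighbor keeps the validity mask binary.
    depthValid = imresize(double(S.depthValid), 0.25, 'nearest') > 0.5;
    save(fullfile(dstDir, sprintf('nm_%06d.mat', i)), ...
         'nx', 'ny', 'nz', 'depthValid');
end
```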

--------------------

Long version:

1) Set each of the variables in globals.m to the correct locations.

a) trainBase should point to a folder containing the following subfolders:
    - trainTrain: this folder contains the positive samples. You should put
      ~80% of your available data here.
    - trainHN: this folder contains images used for hard-negative mining, in
      addition to the natural world dataset.
    - test: this can be empty and doesn't have to exist.
   Each folder should contain JPEG images named rgb_%06d.jpg
   The corresponding depth data will be stored in depthBase, etc.
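For concreteness, a trainBase layout might look like the following (the file indices shown are purely illustrative):

```
trainBase/
    trainTrain/
        rgb_000001.jpg
        rgb_000002.jpg
        ...
    trainHN/
        rgb_000101.jpg
        ...
    test/            (optional; may be empty or absent)
```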

b) natWorldBase: this is just the location of the natural world images. Once 
   you unzip this, you won't have to touch anything.

c) normalBase: this is the location of the surface normals. These have the
   filename nm_%06d.mat, and each should contain the variables nx, ny, nz, and
   depthValid. If these are generated from the depth images, you do not need
   to worry about creating them yourself.

d) depthBase: this is the location of the depth images. Each has a filename 
   depth_%06d.mat and contains a depth image in meters named depth.
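If you are converting depth data from your own source, each file just needs a single variable named depth, in meters. A minimal sketch, where rawDepth (assumed here to be a depth map in millimeters, a common sensor convention) and the frame index i are hypothetical names of your own:

```matlab
% Hypothetical conversion: write depth_%06d.mat containing one variable,
% 'depth', holding a depth image in meters.
depth = double(rawDepth) / 1000;   % rawDepth in mm is an assumption
save(fullfile(depthBase, sprintf('depth_%06d.mat', i)), 'depth');
```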

e) processingDir: this is where temporary files created during processing are
   stored. It will be created automatically if it doesn't exist.

f) numIterations: This is the number of iterations to be done. 1 is a good idea for
   starters and produces reasonable results. In the paper, we used 3.

g) patchesTarget: this is where the final product will get stored.


2) The following libraries may need to be recompiled or updated. These are in libs/

    a) libsvm-mat

    b) voc-release4

3) The main code is called via TDPMaster.m. This program does all the I/O and
file management. If you have spare CPUs on the machine, you can run TDPWorker.m
as well; it will help out with parallel tasks such as hard-negative mining and
SVM training.

4) Run process.m -- this handles a bunch of tasks for pushing files around and precaching 
   things (boring but necessary).

5) Run runDetectors.m -- this can be run in parallel (i.e., multiple copies)
   and everything will be kept in sync.
   
   Alternatively, run runDetectorsParfor, which does this via parfor instead.

6) Finally, run calibrateOnTraining.m. This calibrates the detectors to ensure
   that the independently trained detectors make sense together. One step will
   automatically use parfor, but beware of memory blow-up: when variables are
   transferred to the workers, they often get copied, resulting in a large
   temporary increase in memory use.
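The blow-up is standard parfor broadcast behavior rather than anything specific to this code: any large variable referenced inside the loop body is copied to every worker. A toy illustration (not from this codebase):

```matlab
bigData = rand(5000);              % ~200 MB of doubles
partialSums = zeros(1, 4);
parfor k = 1:4
    % bigData is a broadcast variable: each worker receives its own copy,
    % so peak memory is roughly (numWorkers + 1) times the size of bigData.
    partialSums(k) = sum(bigData(:, k));
end
```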



