
The big refactor #67

Merged
merged 460 commits into from
Aug 26, 2015

Conversation

patricksnape
Contributor

OK, so this is the big refactoring PR. I rebased this version from the refactor branch so that we have a consistent history. Some of the commits might be a little borked from this, but it won't matter much. The end result of that final merge is that this branch is now 100% consistent with the refactor branch.

I want to get this into master ASAP so that people can start to play with it. Given that everything at least appears to work, I think this is OK. It also has the advantage of producing master conda builds that we can share with people.

The major missing components are:

  • Testing
  • Documentation
  • Fitting results refactoring
  • New algorithms

However, the major components of the SDM, CLM, AAM, ATM and LK algorithms are now all fairly consistent and appear to be working.

Brief Tour Of Changes

  • New CLMs and AAMs. Adds all of @jalabort's work on new AAM and CLM implementations. There are now many new algorithms and features, such as priors for AAMs. CLMs in particular have had a lot of work done to them, including fancy FFT convolution.
  • Goodbye builders. The base classes are now builders themselves. They no longer have train methods and instead build in the constructor. This was necessary for the next update...
  • Incremental models. SDM, ATM, AAM and CLM can all be trained incrementally. This is great for building large models, which also enables...
  • Batched building. You can now build models using a generator, exhausting a finite number of inputs at a time. Where possible, we have attempted to perform an exact incremental update. However, this is not always possible for methods like the SDM - where small batch sizes will likely cause divergence.
  • More sensible initialization schemes. @jalabort updated the way in which perturbations are generated to a much more sensible scheme that is more easily parametrized.
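The batched-building idea above can be sketched with a plain generator. This is an illustrative sketch only: `image_batches` and the way batches are consumed are assumptions for demonstration, not menpofit's actual API.

```python
# Hedged sketch: batched model building from a generator. Only one batch
# of inputs needs to be held in memory at a time; `image_batches` is a
# hypothetical helper, not part of menpofit.

def image_batches(items, batch_size):
    """Yield lists of at most `batch_size` items, exhausting `items` lazily."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final, possibly smaller, batch
        yield batch

# Usage: a trainer would perform one (ideally exact) incremental update
# per batch instead of loading the whole dataset up front.
paths = [f"image_{i}.png" for i in range(10)]
for batch in image_batches(paths, batch_size=4):
    print(len(batch))  # 4, 4, 2
```

As the description notes, an exact incremental update is not always possible (e.g. for the SDM), so very small batch sizes can cause divergence.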

patricksnape and others added 30 commits August 4, 2015 14:30
Conflicts:
	menpofit/aam/base.py
	menpofit/aam/builder.py
	menpofit/aam/fitter.py
	menpofit/builder.py
	menpofit/checks.py
	menpofit/fitter.py
	menpofit/fittingresult.py
	menpofit/sdm/fitter.py
Interfaces are now passed as concrete instances rather than as classes.
This makes the objects much less heavyweight
No more exceptions from iterations or error types.
Because we just deepcopy the shape model, we were throwing
the components away too early. Now we build all the shape models,
then truncate them at the end.
Also, add HolisticAAM as AAM alias
After speaking to @jalabort - he found this more useful.
Working on refactoring the ATMs
This fix is like not modelling the noise at all - it's a reasonable
fix but we should think carefully about it.
Handle images and shapes
Helps visualize just work at the moment
Now ATMs act like AAMs - including the new interface usage so
that the algorithms just take a single interface. A lot of the
code is similar to the AAM - but not identical!
Just use the correct ordering for scales etc. Also, use
_prepare_image rather than _prepare_template to make sure
everything is consistent
Since we use Patch all the time in SDM etc, we should
probably use it for the 'parts' AAM which is more similar to
SDM. Since 'Patch' was being used merely for an AAM where
the appearance model has a disjoint mask applied - we change
the name to 'Masked'
AAM/ATM/LK/SDM Refactoring
Now every method has essentially the same _train method.
Then, this method calls a _train_batch method that is different
for each method and actually performs training.
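The shared `_train` / per-method `_train_batch` split described here is essentially the template method pattern. A minimal sketch, assuming hypothetical class names and a list-based batching scheme (the real signatures may differ):

```python
# Hedged sketch: a common _train drives batching, while each subclass
# supplies the method-specific _train_batch. Names are illustrative.

class BaseTrainer:
    def _train(self, images, batch_size=None):
        # Shared logic: optionally split the data into batches, then
        # delegate the actual training to the subclass.
        if batch_size is None:
            self._train_batch(images)
        else:
            for i in range(0, len(images), batch_size):
                self._train_batch(images[i:i + batch_size])

    def _train_batch(self, batch):
        raise NotImplementedError

class SDMTrainer(BaseTrainer):
    def __init__(self):
        self.seen = 0

    def _train_batch(self, batch):
        # Method-specific training would happen here; we just count.
        self.seen += len(batch)

t = SDMTrainer()
t._train(list(range(10)), batch_size=4)
print(t.seen)  # 10
```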
Consistency for training
Wasn't passing parameters up to super class properly.
When holistic feature is no_op, don't copy images
or even bother running through the loop, just skip it.
Includes non-verbose bug for CLMs
This renames the fitting methods and adds a new method that
can fit from bounding boxes.
This is consistent with the previous renaming of parts aam
to patch aam
This is consistent with Menpo's extract_patches method and with
CLM
features -> holistic_features
normalize_parts -> patch_normalisation
normalise_callable -> patch_normalisation
patch_features -> patch_features

At the moment, only SDMs have patch_features due to the lack
of distinction on menpo between image and vector features.
Consistent with other renaming
Conflicts:
	menpofit/aam/algorithm/sd.py
	menpofit/checks.py
	menpofit/fitter.py
@jabooth
Member

jabooth commented Aug 13, 2015

Fantastic work guys.

As you've already been doing code reviews on the refactor branch, I see no reason why we would delay getting this into master - as you've said, once there it will be built with conda and be easier to test internally before release.

+1 after CI builds succeed.

@nontas
Member

nontas commented Aug 13, 2015

Awesome!! +1

Empty strings are no longer allowed as colours - use None instead.
Important to stop appveyor from hanging
Fixes Windows builds.
Removal of *Widget classes
This helps when you have a different number of features per patch,
otherwise you get an object array back. (SDM)
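The object-array issue mentioned in this commit is easy to demonstrate with plain NumPy. This sketch is illustrative and unrelated to menpofit's actual patch-extraction code:

```python
import numpy as np

# When patches yield different numbers of features, the rows cannot be
# stacked into a numeric array - only an object-dtype array can hold them.
ragged = [np.zeros(3), np.zeros(5)]          # differing features per patch
obj_arr = np.empty(len(ragged), dtype=object)
obj_arr[:] = ragged
print(obj_arr.dtype)                         # object

# With a uniform feature count, stacking produces a proper numeric array.
uniform = [np.zeros(3), np.zeros(3)]
num_arr = np.asarray(uniform)
print(num_arr.shape, num_arr.dtype)          # (2, 3) float64
```

Object arrays lose vectorized arithmetic and contiguous storage, which is why a uniform feature count per patch is preferable for methods like the SDM.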
@patricksnape
Contributor Author

No one is saying anything about this... so! MERGING

patricksnape pushed a commit that referenced this pull request Aug 26, 2015
@patricksnape patricksnape merged commit aa056af into master Aug 26, 2015
@patricksnape patricksnape deleted the refactor_merge branch August 26, 2015 14:05