Added a SubsampledHonestForest scikit-learn extension: a regression forest that implements honesty and constructs each tree on a subsample of the data rather than a bootstrap sample. It also offers predict_interval, based on the bootstrap-of-little-bags approach and an asymptotic normal characterization of the prediction estimate.
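A minimal sketch of the honesty-plus-subsampling idea, built from plain scikit-learn pieces. The helper names (`build_honest_tree`, `honest_predict`) are illustrative, not part of the SubsampledHonestForest API, and the sketch shows a single tree rather than a full forest:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

def build_honest_tree(X, y, subsample_frac=0.5):
    """Illustrative honest tree: subsample without replacement, then use
    one half of the subsample to choose splits and the other half to
    estimate leaf values."""
    n = len(X)
    # Subsampling instead of bootstrap: draw without replacement.
    idx = rng.choice(n, size=int(subsample_frac * n), replace=False)
    half = len(idx) // 2
    struct, est = idx[:half], idx[half:]
    tree = DecisionTreeRegressor(min_samples_leaf=5, random_state=0)
    tree.fit(X[struct], y[struct])      # splits chosen on the structure half
    leaves = tree.apply(X[est])         # leaf ids for the estimation half
    leaf_values = {leaf: y[est][leaves == leaf].mean()
                   for leaf in np.unique(leaves)}  # honest leaf means
    return tree, leaf_values

def honest_predict(tree, leaf_values, X, fallback=0.0):
    # Leaves with no estimation-half samples fall back to a default value.
    return np.array([leaf_values.get(l, fallback) for l in tree.apply(X)])

X = rng.uniform(-1, 1, size=(2000, 2))
y = X[:, 0] ** 2 + 0.1 * rng.standard_normal(2000)
tree, leaf_values = build_honest_tree(X, y)
preds = honest_predict(tree, leaf_values, X[:10])
```

Because leaf values come from samples that played no role in choosing the splits, the leaf means are unbiased conditional on the tree structure, which is what enables the asymptotic normal characterization used for intervals.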
Added NonParamDMLCateEstimator, essentially another meta-learner whose final stage can be any estimator that supports fit and predict (the fit method must accept sample_weight). It is based on the observation that, when the treatment is single-dimensional or binary, the RLearner problem can be viewed as a weighted regression.
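A sketch of the weighted-regression view: after residualizing Y and T on X, minimizing the RLearner objective sum_i (Ytilde_i - theta(X_i) * Ttilde_i)^2 is the same as regressing Ytilde/Ttilde on X with sample weights Ttilde^2, so any learner whose fit accepts sample_weight can serve as the final stage. This is illustrative code, not the estimator's implementation; cross-fitting is simplified to out-of-fold residualization, and the first-stage models are arbitrary choices:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LassoCV
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 4000
X = rng.uniform(-1, 1, size=(n, 3))
T = rng.binomial(1, 0.5, size=n).astype(float)
true_cate = 1.0 + X[:, 0]
Y = true_cate * T + X[:, 1] + rng.standard_normal(n)

# First stage: out-of-fold residualization of Y and T on X.
Y_res = Y - cross_val_predict(LassoCV(cv=3), X, Y, cv=3)
T_res = T - cross_val_predict(LassoCV(cv=3), X, T, cv=3)

# Final stage: weighted regression of Y_res / T_res on X, weights T_res^2.
eps = 1e-12
target = Y_res / np.where(np.abs(T_res) < eps, eps, T_res)
final = GradientBoostingRegressor(random_state=0)
final.fit(X, target, sample_weight=T_res ** 2)
cate_pred = final.predict(X)
```

The weights T_res^2 downweight observations where the treatment residual is near zero, exactly where the ratio target is unstable, which is what makes the reformulation equivalent to the original objective.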
Added ForestDMLCateEstimator, essentially a causal forest implemented slightly differently: it views the problem as a weighted non-parametric regression and piggybacks on scikit-learn's tree construction, and it supports bootstrap-of-little-bags based inference. It is essentially a NonParamDMLCateEstimator with a SubsampledHonestForest final model.
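The composition works because forests accept sample_weight in fit. Below, a plain RandomForestRegressor stands in for the honest forest, and the first-stage nuisances are taken as known for brevity; this is a conceptual sketch, not the estimator's code:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 3000
X = rng.uniform(-1, 1, size=(n, 2))
T = rng.binomial(1, 0.5, size=n).astype(float)
Y = (0.5 + X[:, 0]) * T + 0.1 * rng.standard_normal(n)

# Pretend the first stage produced these residuals (true nuisances known
# here: E[T|X] = 0.5 and E[Y|X] = (0.5 + X0) * 0.5).
T_res = T - 0.5
Y_res = Y - (0.5 + X[:, 0]) * 0.5

# Forest as the weighted final stage (stand-in for SubsampledHonestForest).
forest_final = RandomForestRegressor(n_estimators=100, min_samples_leaf=20,
                                     random_state=0)
forest_final.fit(X, Y_res / T_res, sample_weight=T_res ** 2)
cate = forest_final.predict(X)
```

Swapping the honest, subsampled forest into this final-stage slot is what turns the weighted-regression meta-learner into a causal forest with valid bootstrap-of-little-bags intervals.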
Also added ForestDRLearner, which takes the doubly robust approach with an honest forest for each pseudo-outcome regression, bringing non-parametric confidence intervals to the doubly robust estimation classes. It is essentially a DRLearner with a SubsampledHonestForest final model.
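A sketch of the doubly robust pseudo-outcome for binary treatment: Y_DR(t) = mu(t, X) + 1{T = t} * (Y - mu(T, X)) / e_t(X), with the CATE obtained by regressing Y_DR(1) - Y_DR(0) on X. Here the nuisances (outcome model mu and propensity e) are taken as known, and a plain RandomForestRegressor stands in for the honest forest; both are simplifications for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 4000
X = rng.uniform(-1, 1, size=(n, 2))
e = 0.5                                   # known propensity P(T=1|X)
T = rng.binomial(1, e, size=n)
mu0 = X[:, 1]                             # E[Y|T=0, X]
mu1 = X[:, 1] + (1.0 + X[:, 0])          # E[Y|T=1, X]; CATE = 1 + X0
Y = np.where(T == 1, mu1, mu0) + 0.2 * rng.standard_normal(n)

# Doubly robust pseudo-outcomes (oracle nuisances for brevity).
mu_T = np.where(T == 1, mu1, mu0)
y_dr1 = mu1 + (T == 1) * (Y - mu_T) / e
y_dr0 = mu0 + (T == 0) * (Y - mu_T) / (1 - e)
pseudo = y_dr1 - y_dr0                    # conditional mean equals CATE(x)

# Final stage: regress the pseudo-outcome on X with a forest.
final = RandomForestRegressor(n_estimators=200, min_samples_leaf=50,
                              random_state=0)
final.fit(X, pseudo)
cate = final.predict(X)
```

The pseudo-outcome's conditional mean equals the CATE if either the outcome model or the propensity model is correct, which is why plugging an honest forest into this regression yields valid non-parametric intervals.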
Side additions:
re-organized the inference class hierarchy to maximize code re-use.
added a monte_carlo folder with Monte Carlo experiments for LinearDMLCateEstimator and SubsampledHonestForest.