Commit graph

921 commits

Author SHA1 Message Date
Evan Shelhamer c028d09ca6 rewrite MKL flag note, polish makefile 2014-03-21 13:52:36 -07:00
    add MKL dirs conditioned on USE_MKL
    include libraries before making LD_FLAGS
Rowland Depp a8c9b66b7f major refactoring allow coexistence of MKL and non-MKL cases 2014-03-21 13:52:36 -07:00
Kai Li 1cf822e53b Replace atlas with multithreaded OpenBLAS to speed up on multi-core CPU 2014-03-21 13:52:35 -07:00
    issue: #79
Jeff Donahue 4f6b26632a fix bernoulli generator bug 2014-03-21 13:52:35 -07:00
Jeff Donahue 6cbf9f189b add bernoulli rng test to demonstrate bug (generates all 0s unless p == 1) 2014-03-21 13:52:35 -07:00
Jeff Donahue b3e4ac55fe change all Rng's to use variate_generator for consistency 2014-03-21 13:52:35 -07:00
Jeff Donahue 4b1fba7be3 use boost variate_generator to pass tests w/ boost 1.46 (Gaussian filler previously filled in all NaNs for me, making many tests fail) 2014-03-21 13:52:35 -07:00
Jeff Donahue 93c9f151dc make uniform distribution usage compatible with boost 1.46 2014-03-21 13:52:35 -07:00
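The RNG commits above pair a single random engine with each distribution (the boost::variate_generator pattern) so that samplers behave consistently across boost versions. As a minimal sketch of that pattern, using the C++ standard library analogue (std::mt19937 plus std::bernoulli_distribution) rather than Caffe's actual wrapper, which is not shown in this log:

```cpp
#include <cassert>
#include <cmath>
#include <random>

// Sketch (assumed, not Caffe's code): one engine feeding a distribution,
// mirroring the boost::variate_generator pattern adopted above. A correct
// Bernoulli sampler returns 1 with probability roughly p; the bug the
// test commit demonstrates produced all 0s unless p == 1.
double bernoulli_sample_mean(double p, int n) {
  std::mt19937 engine(1701);            // fixed seed for reproducibility
  std::bernoulli_distribution dist(p);  // P(true) = p
  int ones = 0;
  for (int i = 0; i < n; ++i) {
    ones += dist(engine) ? 1 : 0;
  }
  return static_cast<double>(ones) / n;
}
```

A quick sanity check like the one in the test commit is to confirm the sample mean tracks p instead of collapsing to 0.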
Alejandro Dubrovsky b9257396d6 mean_bound and sample_mean need referencing with this 2014-03-21 13:52:35 -07:00
Alejandro Dubrovsky 2ae2683fb8 nextafter templates off one type 2014-03-21 13:52:35 -07:00
Evan Shelhamer d37a995b96 relax precision of MultinomialLogisticLossLayer test 2014-03-21 13:52:35 -07:00
Kai Li 788f070d06 Fix math funcs, add tests, change Eigen Map to unaligned for lrn_layer 2014-03-21 13:52:34 -07:00
    [shelhamer: removed math function tests, since they were merged via other branches]
Kai Li 38457e1c1f Fix test stochastic pooling stepsize/threshold to be same as max pooling 2014-03-21 13:52:34 -07:00
Kai Li d666bdc9d3 Fixed FlattenLayer Backward_cpu/gpu having no return value 2014-03-21 13:52:34 -07:00
Kai Li 04ca88ac15 Fixed uniform distribution upper bound to be inclusive 2014-03-21 13:52:34 -07:00
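The inclusive-upper-bound fix above (and the related nextafter templating fix earlier in this log) addresses the fact that floating-point uniform distributions sample from the half-open interval [a, b). A minimal sketch of the technique using the standard library (assumed analogue; Caffe's version uses boost): widen the range to the next representable value above b, taking care that both arguments to nextafter have the same type.

```cpp
#include <limits>
#include <random>

// Sketch (assumed, not Caffe's exact code): uniform_real_distribution
// draws from [a, b). To make b itself reachable, pass the next
// representable float above b as the upper bound. Note both nextafter
// arguments are float, avoiding the mixed-type template pitfall the
// "nextafter templates off one type" commit fixed.
float uniform_inclusive(float a, float b, std::mt19937& engine) {
  std::uniform_real_distribution<float> dist(
      a, std::nextafter(b, std::numeric_limits<float>::max()));
  return dist(engine);
}
```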
Rodrigo Benenson e4e93f4d12 compile caffe without MKL (dependency replaced by boost::random, Eigen3) 2014-03-21 13:52:34 -07:00
    - examples, test and pycaffe compile without problem (matcaffe not tested)
    - tests show some errors (on cpu gradient tests), to be investigated
    - random generators need to be double checked
    - mkl commented code needs to be removed
Jeff Donahue 510b3c028f Merge pull request #247 from jeffdonahue/loss-in-forward-window-data-layer 2014-03-21 13:11:36 -07:00
    Loss in forward pass fix for window data layer
Jeff Donahue a123130cb3 loss in forward pass fix for window data layer 2014-03-21 13:05:59 -07:00
Jeff Donahue e6ef9ca4c8 Merge pull request #209 from jeffdonahue/loss-in-forward-pass 2014-03-21 12:52:16 -07:00
    Compute loss in the forward pass
Evan Shelhamer 50862880dd Back-merge documentation and script fixes 2014-03-19 22:34:01 -07:00
    fix script path incantation
    convert css indentation to spaces
    fix cifar10 leveldb creation path
    wget without checking certificate for dropbox (dodge complaint on linux)
    docs: added list of contributors
    minor style update of docs
Evan Shelhamer 19a7e23ccc fix script path incantation 2014-03-19 22:31:59 -07:00
Sergey Karayev aa0a52f9e6 convert css indentation to spaces 2014-03-19 22:17:28 -07:00
Evan Shelhamer 9312fe0be2 fix cifar10 leveldb creation path 2014-03-19 22:13:24 -07:00
Evan Shelhamer 62a4ba88f7 wget without checking certificate for dropbox (dodge complaint on linux) 2014-03-19 22:03:40 -07:00
Sergey Karayev 45b31bf5c9 docs: added list of contributors 2014-03-19 21:39:52 -07:00
Sergey Karayev fd47362deb minor style update of docs 2014-03-19 21:27:22 -07:00
Sergey Karayev 3b51aab66d Fix to #161 2014-03-19 21:25:44 -07:00
    - significantly change the documentation file
    - link to it from index.md
    - remove the image resizing script, since (a) it does not work, (b) it is obviated by using ImagesLayer
    - add sample prototxt that uses ImagesLayer.
Sergey Karayev c10ba54f0f Merge pull request #161 from kloudkl/simplify_feature_extraction 2014-03-19 21:21:28 -07:00
    Feature extraction, feature binarization and image retrieval examples
Jeff Donahue ec6f3b8466 minor cleanup in rcnn-finetuning -- rcnn feature computation tested at this commit (in addition to all caffe unit tests passing) 2014-03-19 20:50:13 -07:00
Ross Girshick dfec9471ea cleanup matlab demo 2014-03-19 19:17:40 -07:00
Ross Girshick 9c7a993947 add initialization key for verifying state 2014-03-19 19:12:57 -07:00
Ross Girshick ba10066aff demo on how to get net weights using the matlab interface 2014-03-19 19:12:57 -07:00
Ross Girshick dee9ce76ff return model weights 2014-03-19 19:12:54 -07:00
Evan Shelhamer b68cf5ee13 keep DLOG (revert accidental switch to LOG) 2014-03-19 19:09:57 -07:00
Evan Shelhamer 32cea65d8a file pascal finetuning prototxt examples and fix paths 2014-03-19 19:09:57 -07:00
Ross Girshick 6f653798f0 set default to the best value 2014-03-19 19:09:54 -07:00
Ross Girshick f1a64b7bc1 some cleanup 2014-03-19 19:09:20 -07:00
Ross Girshick 458fa2b6b4 fix paths 2014-03-19 19:08:30 -07:00
Ross Girshick 13392972d9 support for tightest square mode while finetuning 2014-03-19 19:08:28 -07:00
Ross Girshick 5cb7c23367 10x learning rate for fine tuning makes a big difference 2014-03-19 19:07:23 -07:00
Ross Girshick 17a0c1686b support for adding padding to windows in the window_data_layer 2014-03-19 19:07:20 -07:00
Ross Girshick bb0e72c817 Code that was used to finetune with reasonable success 2014-03-19 19:06:05 -07:00
Ross Girshick d98ce8fc48 some major bug fixes (includes some to-be-removed debugging code) 2014-03-19 19:06:05 -07:00
Ross Girshick f40c30bdd5 adjustments to try to match the setup for fine tuning with cuda-convnet 2014-03-19 19:06:05 -07:00
Ross Girshick f4a2c14b13 define pascal finetuning models 2014-03-19 19:06:05 -07:00
Ross Girshick fc79306482 add window data layer 2014-03-19 19:05:57 -07:00
Jeff Donahue a6ae5be95e post rebase fixes: images layer and padding layer compute loss in forward 2014-03-19 12:37:31 -07:00
Jeff Donahue 0551d93831 null pointer defaults for forward loss outputs 2014-03-19 12:37:31 -07:00
Jeff Donahue 44fbb82f47 loss in forward pass for concat layer (thought i'd rebased to latest dev but apparently not) 2014-03-19 12:37:31 -07:00
Jeff Donahue ed23b68906 fix softmax loss layer bug; all tests pass 2014-03-19 12:37:31 -07:00