Commit graph

2121 commits

Author SHA1 Message Date
Nick Carlevaris-Bianco d149c9a98d Added contrastive loss layer, associated tests, and a siamese network example using shared weights and the contrastive loss. 2014-09-08 16:14:58 -04:00
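The contrastive loss added in this commit follows the formulation of Hadsell, Chopra, and LeCun (2006): similar pairs are pulled together, dissimilar pairs are pushed apart up to a margin. Below is a minimal C++ sketch of the forward computation only, assuming flat per-pair feature vectors and a margin hyperparameter; the function name and data layout are illustrative, not the layer's actual interface.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Hedged sketch of a contrastive loss forward pass (Hadsell et al., 2006).
// For each pair (a_i, b_i) with label y_i (1 = similar, 0 = dissimilar):
//   loss_i = y_i * d_i^2 + (1 - y_i) * max(margin - d_i, 0)^2,  d_i = ||a_i - b_i||_2
// The result is averaged over the batch; names and layout are illustrative only.
float ContrastiveLossForward(const std::vector<float>& a,
                             const std::vector<float>& b,
                             const std::vector<int>& labels,
                             std::size_t dim, float margin) {
  const std::size_t n = labels.size();
  float loss = 0.0f;
  for (std::size_t i = 0; i < n; ++i) {
    float dist_sq = 0.0f;
    for (std::size_t k = 0; k < dim; ++k) {
      const float diff = a[i * dim + k] - b[i * dim + k];
      dist_sq += diff * diff;
    }
    if (labels[i]) {  // similar pair: penalize squared distance
      loss += dist_sq;
    } else {          // dissimilar pair: penalize margin violation
      const float gap = std::max(margin - std::sqrt(dist_sq), 0.0f);
      loss += gap * gap;
    }
  }
  return loss / (2.0f * static_cast<float>(n));
}
```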
Evan Shelhamer fc921bf9d6 Back-merge to dev for slides 2014-09-08 12:47:47 +02:00
Evan Shelhamer 7353e3d774 Merge pull request #1052 from shelhamer/caffe-presentation: Caffe tutorial slides 2014-09-08 12:46:21 +02:00
Evan Shelhamer 64c8dcbd1d [docs] replace intro slides with caffe tutorial 2014-09-08 12:44:21 +02:00
Jeff Donahue 63bad310c1 Revert "call __signbit for CUDA >= 6.5 implementation" -- doesn't compile on OSX w/ CUDA 6.5 (this reverts commit 8819f5953b) 2014-09-08 11:02:31 +02:00
Jeff Donahue 8cfd587803 Merge pull request #1050 from jeffdonahue/linecount-more: linecount counts more dirs than just src/ 2014-09-08 10:49:06 +02:00
Jeff Donahue e855bb91e9 Merge pull request #1044 from jeffdonahue/no-tmpnam: change uses of tmpnam to mkstemp/mkdtemp 2014-09-08 10:48:49 +02:00
Jeff Donahue 2d88103215 linecount counts more dirs than just src/ 2014-09-08 10:42:55 +02:00
Evan Shelhamer 99c4ed5bb7 [lint] cuDNN conv declaration 2014-09-08 10:03:55 +02:00
Evan Shelhamer 3bafe2fcbb Merge pull request #1046 from shelhamer/cudnn: cuDNN acceleration 2014-09-08 09:57:44 +02:00
Jeff Donahue ae8599655b Merge pull request #1049 from niuzhiheng/dev: Fixed CMake script of FindOpenBLAS. 2014-09-08 09:28:49 +02:00
ZhiHeng NIU 68e26576d8 Fixed CMake script of FindOpenBLAS. 2014-09-08 14:46:43 +08:00
Jeff Donahue adaad52861 Merge pull request #1045 from akosiorek/origin/dev: Fixed CMake building test objects multiple times 2014-09-08 08:41:38 +02:00
Jeff Donahue 5ab3d97609 Merge pull request #1048 from jyegerlehner/conv_layer-init-weight: Conv layer: fix crash by setting weight pointer 2014-09-08 07:34:46 +02:00
J Yegerlehner a739cdadbb Fix more lint. 2014-09-07 23:10:33 -05:00
J Yegerlehner 396da71569 Repair crash in conv_layer due to weight pointer being NULL. 2014-09-07 21:52:11 -05:00
Evan Shelhamer 359197b039 [docs] include cuDNN in installation and performance reference 2014-09-07 19:56:45 +02:00
Evan Shelhamer c65d5a0357 report cuDNN error string 2014-09-07 19:56:45 +02:00
Evan Shelhamer 9e3d86f3b6 CUDNN_CHECK 2014-09-07 19:56:42 +02:00
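The CUDNN_CHECK and error-string commits above wrap cuDNN calls in a status check that reports failures in human-readable form. A hedged sketch of such a macro, using the real cudnnGetErrorString API but otherwise illustrating the pattern rather than the exact code in the tree:

```cpp
#include <cudnn.h>

#include <cstdio>
#include <cstdlib>

// Hedged sketch of a cuDNN status-check macro: evaluate the call once,
// and on failure print the error string with file/line context and abort.
// cudnnGetErrorString() is the real cuDNN API; the macro body is illustrative.
#define CUDNN_CHECK(condition)                                        \
  do {                                                                \
    cudnnStatus_t status = (condition);                               \
    if (status != CUDNN_STATUS_SUCCESS) {                             \
      std::fprintf(stderr, "cuDNN error at %s:%d: %s\n", __FILE__,    \
                   __LINE__, cudnnGetErrorString(status));            \
      std::abort();                                                   \
    }                                                                 \
  } while (0)

// Example use (handle creation):
//   cudnnHandle_t handle;
//   CUDNN_CHECK(cudnnCreate(&handle));
```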
Evan Shelhamer 84bd1f5e98 strategize cuDNN softmax 2014-09-07 19:56:15 +02:00
Evan Shelhamer 14a9198ae1 strategize cuDNN activations: ReLU, Sigmoid, TanH 2014-09-07 19:25:23 +02:00
Evan Shelhamer 00f5fa6bf9 strategize cuDNN pooling 2014-09-07 19:25:23 +02:00
Evan Shelhamer d1b38ee630 strategize cuDNN convolution 2014-09-07 19:25:23 +02:00
Evan Shelhamer 8819f5953b call __signbit for CUDA >= 6.5 implementation 2014-09-07 19:25:23 +02:00
Evan Shelhamer 77d91242cd add cuDNN to build 2014-09-07 19:25:23 +02:00
Adam Kosiorek 9086df9e20 added common.cpp explicitly to tests 2014-09-07 19:22:43 +02:00
Adam Kosiorek 37e55faef8 cpp and cu files processed separately in test build 2014-09-07 19:22:43 +02:00
Adam Kosiorek 1cb704025e enabled object file reusing in test builds 2014-09-07 19:22:43 +02:00
Jeff Donahue 3182b1c330 add <cuda>/lib64 only if exists to suppress linker warnings 2014-09-07 11:50:58 +02:00
Jeff Donahue fb0a3d0275 remove uses of tmpnam 2014-09-07 11:14:51 +02:00
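The tmpnam removal above avoids the race between generating a temporary name and actually creating the file. A minimal sketch of the safer POSIX replacements, mkstemp and mkdtemp, with illustrative template paths:

```cpp
#include <unistd.h>   // close()
#include <cstdlib>    // mkstemp(), mkdtemp() (POSIX)
#include <string>

// Hedged sketch: unlike tmpnam(), mkstemp()/mkdtemp() create the file or
// directory atomically, so no other process can claim the name first.
// The "/tmp/caffe_test.XXXXXX" templates are illustrative.
std::string MakeTempFilename() {
  char buf[] = "/tmp/caffe_test.XXXXXX";
  const int fd = mkstemp(buf);  // creates and opens the file, fills in XXXXXX
  if (fd == -1) return "";
  close(fd);                    // caller reopens by name as needed
  return std::string(buf);
}

std::string MakeTempDirname() {
  char buf[] = "/tmp/caffe_test.XXXXXX";
  return mkdtemp(buf) ? std::string(buf) : "";
}
```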
Jeff Donahue 3cf3df829e fix transform_param in mnist_autoencoder.prototxt 2014-09-07 09:44:58 +02:00
Jonathan L Long b37f4f9f68 [docs] tutorial/layers: fix inner product sample 2014-09-06 22:10:29 -07:00
Jonathan L Long cbc50e1c66 [docs] tutorial/layers: describe some more data layers 2014-09-06 22:09:13 -07:00
Jonathan L Long bd13f32123 [docs] tutorial/layers: clean up sample markdown 2014-09-06 21:42:35 -07:00
Jonathan L Long 15456286da [docs] tutorial/layers: brief descriptions of some loss layers 2014-09-06 21:39:56 -07:00
Jonathan L Long 40fa5be9b1 [docs] in tutorial/layers, Options -> Parameters (it sounds funny to have optional options, and "parameters" is more in line with the internal usage) 2014-09-06 21:22:23 -07:00
Jonathan L Long 853d65a8a5 [docs] split layer params in required/optional (also, make the parameter name come first; this makes it much easier to find/scan parameters) 2014-09-06 21:20:36 -07:00
Jonathan L Long 85c93659fc [docs] add LRN layer to tutorial/layers 2014-09-06 21:05:56 -07:00
Jonathan L Long 4f977d0514 [docs] fix pooling markdown and add some comments in tutorial 2014-09-06 20:28:02 -07:00
Jonathan L Long c099fd8b2e [doc] minor edits to convolution layer in tutorial 2014-09-06 20:23:55 -07:00
Jonathan L Long 68849e4aab [docs] fixup the MathJax notation in tutorial/layers 2014-09-06 20:14:28 -07:00
Evan Shelhamer 4a425283b4 Merge pull request #1022 from shelhamer/engine: Add engine switch to pick computational backend 2014-09-07 03:47:46 +02:00
Evan Shelhamer e922d119b2 revert separate strategies: engines will extend the caffe standards 2014-09-07 03:34:53 +02:00
Evan Shelhamer e05428f1af revert engine switch for build to always include caffe engine 2014-09-07 03:27:07 +02:00
Evan Shelhamer d5605ec466 default engine to Caffe in case config is missing 2014-09-07 03:27:07 +02:00
Evan Shelhamer 347fdbd609 default engine to Caffe according to compile flag 2014-09-07 03:27:07 +02:00
Evan Shelhamer 791243f4a0 grooming: drop pointless overrides, stub layer comments 2014-09-07 03:27:07 +02:00
Evan Shelhamer dd958e0410 strategize softmax 2014-09-07 03:27:07 +02:00
Evan Shelhamer 8e8872ddfd strategize relu, sigmoid, tanh 2014-09-07 03:27:07 +02:00
Evan Shelhamer 63323765f5 strategize pooling: scaffold engine switching for pooling (the Caffe pooling is instantiated without regard for engine in LRNLayer and in the PoolingLayer, StochasticPoolingLayer, and MaxPoolingDropout tests) 2014-09-07 03:27:06 +02:00
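The engine series above (#1022 and the cuDNN commits) scaffolds a per-layer switch between the native Caffe implementation and a cuDNN-backed one, defaulting to Caffe when no engine is configured or when cuDNN is not compiled in. A hedged sketch of that selection pattern with hypothetical type and function names (Engine, CreatePoolingLayer), not Caffe's actual layer factory:

```cpp
#include <memory>
#include <stdexcept>

// Hypothetical engine enum and layer classes; the real code selects the
// engine from a layer parameter and a compile-time USE_CUDNN flag.
enum class Engine { DEFAULT, CAFFE, CUDNN };

struct PoolingLayerBase { virtual ~PoolingLayerBase() = default; };
struct CaffePoolingLayer : PoolingLayerBase {};
#ifdef USE_CUDNN
struct CuDNNPoolingLayer : PoolingLayerBase {};
#endif

// Hedged sketch: fall back to the Caffe engine when no engine is configured,
// and only offer the cuDNN engine when the build enables it.
std::unique_ptr<PoolingLayerBase> CreatePoolingLayer(Engine engine) {
  if (engine == Engine::DEFAULT) {
    engine = Engine::CAFFE;   // "default engine to Caffe in case config is missing"
#ifdef USE_CUDNN
    engine = Engine::CUDNN;   // prefer cuDNN when compiled in
#endif
  }
  switch (engine) {
    case Engine::CAFFE:
      return std::make_unique<CaffePoolingLayer>();
#ifdef USE_CUDNN
    case Engine::CUDNN:
      return std::make_unique<CuDNNPoolingLayer>();
#endif
    default:
      throw std::runtime_error("Unknown pooling engine");
  }
}
```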