1. CPU version
(1) Install dependencies
sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev libboost-all-dev protobuf-compiler
sudo apt-get install libatlas-base-dev libgflags-dev libgoogle-glog-dev liblmdb-dev
All of these libraries installed without trouble except HDF5, which caused an error later when running make all for Caffe; more on that below.
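If you want to double-check the HDF5 packages right away, one quick sanity check (a sketch that relies on Ubuntu's standard package naming) is to list what dpkg has installed:
dpkg -l | grep hdf5
The serial variant (libhdf5-serial-dev) installed above is the one the Caffe build will be pointed at later.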
(2) Install Caffe
In a terminal, run:
git clone git://github.com/BVLC/caffe.git
Then go into the caffe folder; inside it you will find a file named Makefile.config.example.
Copy Makefile.config.example to a new file named Makefile.config, open it, and find the line
# CPU_ONLY := 1
Remove the leading "#" to enable the CPU-only build.
Open a terminal in the caffe folder and run make all. At this point the missing-HDF5 problem shows up:
./include/caffe/util/hdf5.hpp:6:18: fatal error: hdf5.h: No such file or directory
Solution:
sudo find / -name hdf5.h
Find the path of the matching file and add it to Makefile.config.
The search result is
/usr/include/hdf5/serial/hdf5.h
Open Makefile.config, find the line
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
and append
/usr/include/hdf5/serial
to it. Any similar missing-file error while compiling Caffe can be fixed the same way. Mind the path, though: there are two variables,
INCLUDE_DIRS and LIBRARY_DIRS, so check whether the missing file lives under /usr/include or /usr/lib and add the path to the matching one.
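As a sketch of that lookup in both directions (assuming the stock Ubuntu 16.04 layout used throughout this post):
find /usr/include -name 'hdf5.h' 2>/dev/null            # header location, goes into INCLUDE_DIRS
find /usr/lib -name 'libhdf5*.so*' 2>/dev/null | head   # library location, goes into LIBRARY_DIRS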
Continue with make all, and another problem appears:
/usr/bin/ld: cannot find -lhdf5_hl
/usr/bin/ld: cannot find -lhdf5
In the Makefile in the caffe root directory, find the line
LIBRARIES += glog gflags protobuf boost_system boost_filesystem m hdf5_hl hdf5
and change its last two entries so that it reads:
LIBRARIES += glog gflags protobuf boost_system boost_filesystem m hdf5_serial_hl hdf5_serial
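Before re-running make, you can confirm that the dynamic linker actually knows the libraries under the serial-suffixed names (a quick check, assuming the Ubuntu packages installed earlier):
ldconfig -p | grep hdf5
At minimum the serial runtime libraries (libhdf5_serial and libhdf5_serial_hl) should show up in the output.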
Then run:
make all
make test
make runtest
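These can take a while; make accepts a parallel-jobs flag, so an equivalent but faster variant (a sketch; nproc just reports the number of CPU cores) is:
make all -j"$(nproc)"
make test -j"$(nproc)"
make runtest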
Links related to problems encountered during installation:
http://blog.csdn.net/goofysong/article/details/52116265
http://blog.csdn.net/striker_v/article/details/51615197
http://blog.csdn.net/lkj345/article/details/51280369
(3) Install Anaconda and configure the Python interface
First, the environment before this step: Ubuntu 16.04, gcc version 5.4.0.
1. Install Anaconda, choosing the Python 2.7 build:
bash Anaconda2-4.3.0-Linux-x86_64.sh
2. Pay attention to the prompts during installation; do not just hold Enter. Accepting the defaults is fine, it is a straightforward installer. At the end it asks whether to add the path to .bashrc; answer "yes".
3. Open the .bashrc file. It sits in your home directory; enable "show hidden files" to see it. Add the following at the end:
export PATH=/home/(your username)/anaconda2/bin:$PATH        (adjust the path to wherever you installed Anaconda)
export PYTHONPATH=/home/(your username)/caffe/python:$PYTHONPATH        (likewise, adjust to your caffe path)
export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libstdc++.so.6
After changing .bashrc it has to be re-read: reboot, or run source ~/.bashrc. It also does not hurt to refresh the shared-library cache:
sudo ldconfig
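A quick way to confirm the new environment variables took effect (a sketch; the expected paths assume the default locations used above):
source ~/.bashrc      # reload the edited file in the current shell
which python          # should now point at ~/anaconda2/bin/python
echo "$PYTHONPATH"    # should contain ~/caffe/python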
4. Edit Makefile.config in the caffe directory. After my changes it looks like this:
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
# USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
# CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 lines for compatibility.
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#        -gencode arch=compute_20,code=sm_21 \
#        -gencode arch=compute_30,code=sm_30 \
#        -gencode arch=compute_35,code=sm_35 \
#        -gencode arch=compute_50,code=sm_50 \
#        -gencode arch=compute_50,code=compute_50

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas

# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#        /usr/lib/python2.7/dist-packages/numpy/core/include

# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/anaconda2
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
        $(ANACONDA_HOME)/include/python2.7 \
        $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#        /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
# PYTHON_LIB := /usr/lib
PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/include/hdf5/serial /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
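Before rebuilding, it may be worth confirming that the two headers the config's comments mention (Python.h and numpy/arrayobject.h) really exist where PYTHON_INCLUDE points; the paths below assume the default $HOME/anaconda2 install:
ls "$HOME/anaconda2/include/python2.7/Python.h"
ls "$HOME/anaconda2/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h"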
5. In a terminal, enter the command
python
This prints your default Python version; if the banner mentions Anaconda, the earlier Anaconda install succeeded, otherwise reinstall it.
Next, run
make pycaffe
Once it has compiled successfully, running it again does nothing and just prints Nothing to be done for 'pycaffe'. If you want to rebuild from scratch, run
make clean
and then, still in the caffe directory, run
make all
The build then fails at
Makefile:626: recipe for target '.build_release/tools/convert_imageset.bin'
with a long list of "undefined reference" errors, most of them mentioning "std::__cxx11::",
so the problem is a gcc mismatch: gcc 5.4 is too new here. The underlying cause (the libstdc++ dual ABI introduced with gcc 5) is explained on the GCC site:
https://gcc.gnu.org/onlinedocs/gcc-5.2.0/libstdc++/manual/manual/using_dual_abi.html
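Before picking one of the fixes below, a small diagnostic sketch (assuming the default paths from earlier, and that Anaconda ships its own copy of libstdc++) is to compare the newest GLIBCXX symbol versions exported by the system library and by Anaconda's one; a copy that stops short of GLIBCXX_3.4.21 (gcc 5.1) lacks the new-ABI symbols the errors above refer to:
strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX | tail -n 3
strings ~/anaconda2/lib/libstdc++.so.6 | grep GLIBCXX | tail -n 3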
1) Downgrade the system gcc; it has to go below gcc 5.1 for this to work. Not recommended.
2) Follow the GCC documentation and set _GLIBCXX_USE_CXX11_ABI to 0 when compiling:
CXXFLAGS += -D_GLIBCXX_USE_CXX11_ABI=0
I could not work out where this change should go, so I gave up on this option.
3) Upgrade the gcc runtime that ships with Anaconda.
Run the command
conda install libgcc
Then go back to the caffe directory and run the build again.
Once it succeeds, start python and type
import caffe
If this complains
No module named google.protobuf
then run
conda install protobuf
Once that finishes, try import caffe again; if everything works there should be no output at all.
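As a final smoke test (a sketch; caffe.set_mode_cpu() just selects the CPU backend, and the printed path should sit under ~/caffe/python if PYTHONPATH was set as above):
python -c "import caffe; caffe.set_mode_cpu(); print(caffe.__file__)"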
2. GPU version
To be updated.
3. Testing Caffe with the MNIST dataset
From the caffe root directory, run in a terminal:
./data/mnist/get_mnist.sh
./examples/mnist/create_mnist.sh
After these two scripts finish, ./examples/mnist/ contains two directories, mnist_test_lmdb and mnist_train_lmdb,
holding the test and training data respectively.
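A quick check that the conversion really produced the LMDB databases (each directory should contain a data.mdb and a lock.mdb file):
ls examples/mnist/mnist_train_lmdb examples/mnist/mnist_test_lmdb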
I use the LeNet model for training here. First change the last line of
./examples/mnist/lenet_solver.prototxt to
solver_mode: CPU
and then run
./examples/mnist/train_lenet.sh
and the training starts.
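For reference, train_lenet.sh is just a thin wrapper; assuming the standard repository layout it boils down to calling the caffe binary built earlier with that solver file:
./build/tools/caffe train --solver=examples/mnist/lenet_solver.prototxt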
Original article: https://www.f2er.com/ubuntu/354831.html