Do I still need to learn Caffe, or can I get by with Tensorflow and Keras?

Hey folks, I am new to the deep learning world, but my background is in statistics. I had a question about libraries. I love DIGITS and have been able to get the demos working, which is good. But I am having no end of frustration getting Caffe models to run in DIGITS because of some protobuf errors, etc. (Yes, I have consulted lots of Stack Exchange, NVIDIA forum, and GitHub posts on the issue.)

Anyhow, this effort got me thinking about whether I really need to learn all of these environments. I think DIGITS supports Caffe, Torch, and TensorFlow now, but even without DIGITS there is no reason to learn every library out there. A lot of my applications focus on CNNs for object detection in images, along with some textual analysis.

So I was not sure if I really needed to go through all the trouble of getting every library installed and working if I was not going to use them all. Perhaps this is a good application of Zipf's law :). Of course, the folks on the forums have more experience with these libraries than I do, so I would defer to their judgement. Are there any big problems or limitations I will run into if I focus on just TF and Keras? And if I had to pick one additional library beyond those, which should I choose?

Well, seeing as TF 1.7 & Keras use CUDA 9 & cuDNN 7.0.5 (which is currently unavailable), switching to TF might be a large waste of time. Perhaps PyTorch instead?

All griping aside, Keras is pretty useful for image-recognition tasks, and from what I understand less robust for RNNs / LSTMs, but it is a good gateway drug. From there you can head into pure TF, PyTorch, or the other flavors if you prefer.
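To give a flavor of why Keras makes a good gateway, here is a minimal sketch of a small image-classification CNN in the Keras Sequential API (via `tensorflow.keras`). The input shape, layer sizes, and class count are illustrative assumptions, not tied to any particular dataset:

```python
# Minimal Keras CNN sketch for image classification.
# Assumes TensorFlow is installed; 64x64 RGB inputs and
# 10 output classes are placeholder choices.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(64, 64, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),  # 10 classes, illustrative
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The whole model is a handful of declarative lines; training is then just a `model.fit(x, y)` call, which is a big part of Keras's appeal for getting started.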