CUDA natural language processing

Hi all,

I'm new to CUDA programming, and after talking with one of the NVIDIA people at SC17 today, I wanted to see if anyone could direct me to coding examples for doing NLP with CUDA.

I have a K20, a 1060, and a 1080. I'm not sure whether the K20 would be ideal for this, or whether the 1060 would have enough resources to do the processing in real time, but until I have some code to work with I just wanted to list the hardware I have available.

Basically, I’m looking to replace the Google API that currently does my speech-to-text processing with an open-microphone setup running on my own hardware.
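
To make the goal concrete, here is a rough sketch of what I mean by an open-microphone loop. It is only a placeholder, not a CUDA solution: it assumes the Python speech_recognition package with its CPU-based PocketSphinx backend, and the transcribe() helper is just illustrative. The idea would be to swap a GPU/CUDA-accelerated recognizer into that function once I have one.

```python
# Minimal open-microphone sketch (assumptions: SpeechRecognition, pocketsphinx,
# and PyAudio installed, e.g. "pip install SpeechRecognition pocketsphinx pyaudio").
# PocketSphinx runs fully offline on the CPU; it stands in for a future
# CUDA-accelerated recognizer.
import speech_recognition as sr

def transcribe(recognizer, audio):
    # Illustrative helper: replace the PocketSphinx call with a GPU-backed
    # recognizer later.
    try:
        return recognizer.recognize_sphinx(audio)
    except sr.UnknownValueError:
        return None  # speech was unintelligible

def main():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # calibrate the noise floor
        print("Listening (Ctrl+C to stop)...")
        while True:
            audio = recognizer.listen(source, phrase_time_limit=10)
            text = transcribe(recognizer, audio)
            if text:
                print("Heard:", text)

if __name__ == "__main__":
    main()
```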

Thanks in advance for any help or suggestions.

James

Google Scholar finds a number of papers about accelerating various aspects of NLP with GPUs and specifically CUDA. Have you checked whether any of those are close to the kind of processing you are envisioning? You could always try contacting the authors directly to see whether you can get access to code if they don’t have their projects on GitHub or similar sites.

The number of seemingly relevant papers I find is not exactly huge, which may indicate that there isn’t much interest (perhaps because CPUs are plenty fast for what most people want to do in NLP?), or that GPUs aren’t well suited to this kind of processing (e.g., lack of sufficient parallelism, issues with real-time requirements?). In other words, with the right idea(s), you might be the person to drive GPU-accelerated NLP forward.