Video decoding on Tesla V100
I'm doing video analysis on H.264-encoded videos, and I currently have the problem that the decoding is done on the CPU instead of the GPU.
The special thing about my setup is that I'm using an NVIDIA Tesla V100 graphics card. It is properly installed, so the Windows 10 Device Manager can identify the card. MATLAB also identifies the graphics card via gpuDevice as the NVIDIA Tesla V100 32 GB version. In addition, I downloaded and ran gpuBench from the File Exchange and verified with GPU-Z that the calculations are done on the GPU.
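For reference, the check looked roughly like this in MATLAB (the matrix size is arbitrary, I just watched the load in GPU-Z while it ran):
g = gpuDevice;            % g.Name should report the Tesla V100
A = gpuArray.rand(4000);  % arbitrary test matrix created on the GPU
B = A * A;                % GPU load is visible in GPU-Z during this
wait(g);                  % wait for the GPU computation to finish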

So I'm sure that MATLAB is able to use my Tesla V100. But when it comes to video decoding, the calculations are done on the CPU. I have also tested my program on a different desktop PC with a GTX 1070 graphics card, where GPU-Z shows that the GPU is doing the work (the video engine load goes up to 100%).
The main part of my program is the for loop with readFrame on the VideoReader object. Together with adjusting the CurrentTime of the video object, this is also the part that takes the longest to compute.
vidObj = VideoReader(videoFile);
for t = startFrame:1/frameRate:vidObj.Duration
    vidObj.CurrentTime = t;         % seek to the next sample time
    img = readFrame(vidObj);        % decode the frame at that time
    [box, score, label] = detect(frcnn, img);
    % some calculations
end
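One workaround I am considering is reading the frames sequentially with hasFrame and skipping the ones I don't need, instead of seeking with CurrentTime on every iteration, since seeking in H.264 forces a re-decode from the previous keyframe. A rough sketch (the frame-skipping arithmetic is only an illustration, the variables are the same as above):
vidObj = VideoReader(videoFile);
vidObj.CurrentTime = startFrame;                          % jump to the start once
frameStep = max(1, round(vidObj.FrameRate / frameRate));  % decoded frames per analysed frame
count = 0;
while hasFrame(vidObj)
    img = readFrame(vidObj);        % sequential decode, no per-frame seek
    count = count + 1;
    if mod(count - 1, frameStep) ~= 0
        continue                    % skip frames between samples
    end
    [box, score, label] = detect(frcnn, img);
    % some calculations
end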
I'm now looking for a solution to speed up the video processing by letting the GPU decode the video. Is there any way to force the video decoding to be done on the GPU?
Or is it a different problem related to the Tesla card? Does it have something to do with the WDDM capability or something similar, because the Tesla card is in TCC mode and supports WDDM only if I pay for an NVIDIA GRID license? Has anyone ever gotten something like this to run?
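To double-check which driver model the card is currently in, I can call nvidia-smi from MATLAB; on Windows its summary table lists TCC or WDDM next to each card (this only queries the state, it does not switch modes):
[status, out] = system('nvidia-smi');   % query the NVIDIA driver from MATLAB
disp(out)                               % the table shows TCC or WDDM for the V100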
For several days now I have searched for a solution and tried to find one myself, without any success. So I would really appreciate any hint or trick on how I can speed up the video analysis using my GPU.
Thank you very much, I'm really looking forward to your help!
Best regards
Nils
1 Comment
Jason Ross
on 15 Mar 2019
I don't know about the code parts of your question, but the best practice for GPU calculations on Windows is to have the GPU in TCC mode. If it's in TCC mode, that's where it should be.
Answers (0)