Hi!

I'm starting to work with OpenCV to capture video from a camera (I'm using C++ in a Linux environment; the OpenCV version is 3.something) and I have an issue that I can't get around:

I open the camera via a URL on a LAN. I need to grab frames at moments determined by a Java application on the same machine (it sends me a request). At first I was creating the VideoCapture object inside the reading method and everything was working just fine, but because of the time this object takes to open the camera (about 1 second), I decided to make the VideoCapture object a member of the class and open it in the constructor, so I wouldn't need to reopen it every time the reading method is called, saving that precious second.
The problem is that the frames now lag by about 1 minute. I couldn't believe it, but it's true. In the camera manufacturer's application everything runs fine, with no lag.
I have done a lot of research on this issue, and it seems other people suffer from it too, but the solutions they propose don't work for me. I tried:

1. cap.set(CV_CAP_PROP_BUFFERSIZE, 1); to make the camera's buffer just 1 frame long; it didn't work.
2. cap.set(cv::CAP_PROP_POS_AVI_RATIO, 0) to get the last frame first; it didn't work.
3. cap.set(cv::CAP_PROP_POS_MSEC, 0) to get the last frame first (in milliseconds); it didn't work.
4. Before reading the requested frame, I tried reading all the frames that should exist in the "buffer"... is there even a buffer? That's news to me, since this is a stream...


Is this an OpenCV issue?
When I was reopening the camera every time I needed a frame, it worked fine; now that the camera stays open all the time, there is this 1-minute lag.

Does anyone know about this problem, and how to solve it?

Any help will be appreciated

Thank you!!