How to reduce high CPU usage caused by receiver thread of CAN messages? - Page 2

Thread: How to reduce high CPU usage caused by receiver thread of CAN messages?

  1. #16
    Join Date
    Nov 2003
    Posts
    1,815

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    What message rate (msgs/s) can the current code actually achieve?

    If your parsing/processing code can't handle more than 25 messages a second, then no amount of buffering is going to help.
    Code:
    case Can_Msgs.mMAP_SPEED_INFO.msg_id: 
        RawCanData = xDataConv.data_to_ui64(CanData);
        i = filt_CParsDbcFile.Search4Message(uMsgId);
        if ( i != -1 )
        {
            Can_Msgs.mMAP_SPEED_INFO.last_receive_time = GetTickCount();
            j = filt_CParsDbcFile.Search4Signal(uMsgId,"MapCrntPstSpdLimVld" );
            if( j != -2 )
            {
                Startbit = (63-((int)filt_CParsDbcFile.m_pSigDescs[j].startByte*8))-(7-(int)filt_CParsDbcFile.m_pSigDescs[j].startBit);
                Can_Msgs.mMAP_SPEED_INFO.MapCrntPstSpdLimVld = xDataConv.get_bool(RawCanData, Startbit);
            }
        }
    break;
    That code needs to be profiled. If it can't handle 25 msgs/s, which part of it is taking so long?

    gg

  2. #17
    Join Date
    Nov 2003
    Posts
    1,815

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    >> I tried commenting whole parse messages but still the result was the same
    Code:
    while (vThread_exit.Rcv_Msg)
    {
        RcvVal = receive_data(CanData, MsgId, g_xlPortHandle_global);
    }
    So you're saying that that loop, by itself, has the same CPU usage? If so, then Vector's XL Driver library simply requires that much CPU to get the job done.

    But again, CPU usage itself isn't necessarily a problem - as long as real work is being done. Is there still a perceived "problem" with the amount of CPU usage? Or will it suffice to say "Vector needs this much CPU to get the job done"?

    gg

  3. #18
    Join Date
    Nov 2003
    Posts
    1,815

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    >> yes that code has the same CPU usage.
    How did you confirm that it is the Rcv_Msg() thread that is actually using the CPU? What are the other threads in the application doing?

    >> The Vectors XL driver library for the old code worked well.
    Same version of the library? Same hardware setup? The same CAN functions? What are the differences between the new and old code?

    gg

  4. #19
    Join Date
    Apr 2000
    Location
    Belgium (Europe)
    Posts
    3,981

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    I'm not sure anything can be said about that XL lib so far.

    What I can say is that the suggested receive_data() in #12 has several potential problem areas.

    1) It uses a bool to check for 'thread running'.
    Not necessarily a problem, but this doesn't typically mesh well with uses of WaitFor...(). A bool works well enough when you're not waiting/blocking. If you are, you really should use an event and wait on your normal handle AND the event handle in a single WaitForMultipleObjects() call.

    2) There is no need (or should be no need, at least) to call the set-notification function on every call of the receive function.
    This IS a potential bottleneck if it creates sync objects (which it seems to do).

    3) I don't know this lib, but I would expect there to be a call that stops the notification and releases its resources. Without it you could be leaking sync objects (or not, depending on how it works). This could be part of the "getting slower over time" problem: you're running out of memory and potentially other system resources.

    4) "Arbitrary" timeouts, used without really understanding what's behind them, tend to be sources of problems. Either you don't need a timeout at all (use INFINITE), or there is a clear reason why the wait must be finite.
    If the timeout is there so you can periodically check the "thread running" flag, then that's further indication you want an event, not a bool, for this purpose.

    5) Multithreading isn't easy. Make sure you have a proper "plan of attack" before you begin coding the thread, including properly identifying and documenting all possible synchronisation issues and how you'll deal with them, rather than fumbling along and making a mess of things.
    If you don't know this XL library, write test applications to exercise the communication system it's based on first, then design the multithreading around how it works. Don't make a thread and then try to hack your way into making it do more or less what you want.

  5. #20
    Join Date
    Feb 2013
    Posts
    14

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    Quote Originally Posted by Codeplug View Post
    How did you confirm that it is the Rcv_Msg() thread that is actually using the CPU? What are the other threads in the application doing?
    I used Process Explorer (procexp) to see which threads are running and how much CPU each one consumes. The other threads each use less than 1% of the CPU.

    Quote Originally Posted by Codeplug View Post
    Same version of library? Same hardware setup? What are the differences between new and old?
    Yes, same version of the library and the same hardware setup. A few CAN functions differ: the old code implemented the CAN DLL inside the application itself, whereas I implemented the CAN DLL separately.

  6. #21
    Join Date
    Feb 2013
    Posts
    14

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    Quote Originally Posted by OReubens View Post
    What I can say is that the suggested receive_data() in #12 has several potential problem areas. [...]
    I have changed my code in #12 as per the code in #13, and I am sure that the threads I have implemented work fine, except for this receive thread.

  7. #22
    Join Date
    Nov 2003
    Posts
    1,815

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    It sounds like you're getting more messages than you expect - i.e., way more than the 25 msgs/s that you're interested in.

    Do you see any differences in how the port is opened and setup between the new and old code?

    Looking at the CAN Application flowchart in the manual, there are setup functions like xlCanSetReceiveMode() and xlCanSetChannelMode() that seem to affect what will be received. Or perhaps it's a settings difference in the "Vector Hardware Config" tool.

    You could write some code that logs all the messages coming across that you are not interested in. Knowing what those are may help with figuring out how to exclude those so that xlReceive() never returns them.

    >> 2) there is no need (or should be no need at least) to call the setnotification every call in the receive function.
    The manual implies this as well - since the flowchart has this being called in the Setup phase only once.

    gg

  8. #23
    Join Date
    Feb 2013
    Posts
    14

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    Quote Originally Posted by Codeplug View Post
    It sounds like you're getting more messages than you expect - ie, way more than the 25 msgs/s that you're interested in. [...]
    All 11 messages have to be received within 40 milliseconds - roughly one message every 4 ms. But instead of each of the 11 messages arriving once per 40 ms cycle, each one arrives 10+ times. I don't see much difference in how the port is opened and set up between the old and new code.

  9. #24
    Join Date
    Apr 1999
    Posts
    27,434

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    Quote Originally Posted by bobby2387 View Post
    If I add the same printf statement for xlstatus, the CPU usage drops from 50% to 7%. I am not getting the reason why.
    So you added a benign "printf" statement, and all of a sudden the behaviour changes?

    That sounds like a memory overwrite bug to me, and you just moved the memory corruption to another location in the program. Corrupting memory can cause things to go "slower" due to stepping on important variables (for example, a loop that is supposed to only go 100 iterations, and the variable that holds the number of iterations gets stepped on and now is equal to 2349473927).

    I'm going to put odds on it that you were always overwriting memory, and the new code exposed the bug to you. If you have gone through all of these steps of commenting things out, not knowing where things are slow, etc. and you're using the same libraries, etc., then it is almost assured you've corrupted memory, and the new code is showing this error.

    That is the only reasonable explanation why a simple "printf" which should have no bearing on the CPU usage would all of a sudden change the behaviour of the program. You changed the binary executable around, therefore the corruption moves to another area of the program. My advice is to take out the printf() and actually fix the problem.

    Regards,

    Paul McKenzie
    Last edited by Paul McKenzie; February 19th, 2013 at 02:06 PM.

  10. #25
    Join Date
    Feb 2013
    Posts
    14

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    Yes, Paul, I have removed the printf. Thanks for the information.

  11. #26
    Join Date
    Feb 2013
    Posts
    14

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    Quote Originally Posted by Codeplug View Post
    You could write some code that logs all the messages coming across that you are not interested in. Knowing what those are may help with figuring out how to exclude those so that xlReceive() never returns them. [...]
    Can you please tell me how to set a filter on the CAN driver so that it receives only the message IDs that are required?

  12. #27
    Join Date
    Nov 2003
    Posts
    1,815

    Re: How to reduce high CPU usage caused by receiver thread of CAN messages?

    I wish I could help more, but this thread and the other represent my only exposure to CAN bus programming. In other words, I'm just pulling what I can from the manuals.

    You say the old code runs with near 0% CPU utilization. What is the difference in functionality between that and your new code?

    gg
