  1. #1
    Join Date
    Aug 2014
    Posts
    6

    Shared Memory to make objects of an application persistent?

    Hello,

    I have a question concerning shared memory (in a Linux environment). In our company we currently have an application that is restarted once in a while. There are multiple instances of this application running (on different physical machines), and all access the same centralized database. Because of I/O and network bottlenecks the start is very slow and takes about 10 minutes. Apart from some new data, most of the data stays the same, so reading everything from the database again is quite redundant.

    The idea is to write all relevant objects to a shared memory segment when the application is shut down. A second dummy process attaches to the same shared memory (just so that some process is still attached). When the application is restarted, it attaches to its shared memory again and reads the objects back into its own address range. The data may have changed in the meantime, so it might be necessary to load some new data afterwards, but that is a different problem.

    The current idea is to serialize all objects (could be about 1-2 gigabytes) with Apache Thrift and write them into shared memory. With Thrift the data has a defined layout, so recreating the objects is possibly easier (I am not sure here). Roughly what I have in mind for the save path is sketched below.
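    A minimal sketch only, assuming Linux POSIX shared memory (shm_open/mmap); the segment name "/app_state" is a placeholder and the blob stands in for whatever the Thrift serializer produces. (Link with -lrt on older glibc.)

    Code:
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>
    #include <cstdint>
    #include <cstring>
    #include <stdexcept>
    #include <string>

    // Write an already-serialized blob into a named POSIX shared memory segment.
    // Note: on Linux the segment persists in /dev/shm until shm_unlink() or a
    // reboot, even with no process attached, so the dummy holder process may
    // not even be strictly necessary.
    void save_to_shm(const std::string& blob)
    {
        int fd = shm_open("/app_state", O_CREAT | O_RDWR, 0600);
        if (fd == -1)
            throw std::runtime_error("shm_open failed");

        // size the segment: an 8-byte length header followed by the payload
        const size_t total = sizeof(uint64_t) + blob.size();
        if (ftruncate(fd, total) == -1)
            { close(fd); throw std::runtime_error("ftruncate failed"); }

        void* p = mmap(nullptr, total, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        close(fd);                        // the mapping stays valid after close
        if (p == MAP_FAILED)
            throw std::runtime_error("mmap failed");

        const uint64_t len = blob.size();
        std::memcpy(p, &len, sizeof(len));                  // length first
        std::memcpy(static_cast<char*>(p) + sizeof(len),    // then the data
                    blob.data(), blob.size());
        munmap(p, total);
    }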

    My questions are:
    - Does it even make sense to consider shared memory in this scenario? I've read a lot about it in the last few days, and so far I don't see big disadvantages (except that if the application crashes, I have to read from the database again). On the other hand, I don't know how to actually implement this functionality (I am not an experienced developer).
    - Should I aim for Boost::Interprocess, perhaps even with memory-mapped files, or stay with the traditional shmat (and friends)?
    - I guess 1-2 gigabytes of shared memory will be necessary. This amount is only needed in the gap between application shutdown and restart. Will the sheer amount of shared memory needed be a problem? (All examples I found used only a few bytes or kilobytes.)
    - Does it make sense to use Thrift in this scenario?

    If I am missing any information, please inform me.

    Thanks in advance

    Michael

  2. #2
    Join Date
    Apr 2000
    Location
    Belgium (Europe)
    Posts
    4,626

    Re: Shared Memory to make objects of an application persistent?

    Shared memory only works for applications running on the same physical machine. Typically the apps also need to be running under the same instance of the OS if you're virtualizing multiple OSes on a single PC, though there are a few exceptions where it's possible to use shared memory across OS boundaries (I know this can be done with Windows; I don't know whether Linux has something similar).

    If the database is updated infrequently, it might be feasible to have a separate instance of the DB on each computer and use a synchronisation system to keep all the DBs on the different computers in check. This can be anything from very simple to complex, depending on how 'hard' or 'tight' you want the synchronisation to be and whether updates need to be transaction-safe across all instances.

    1-2GB of shared memory could be problematic if it needs to be available as a single contiguous chunk on a 32-bit OS, where a process typically has only 2-3GB of usable address space.


    Another alternative might be some kind of client-side caching, so you don't actually fetch any data from the remote DB unless the cached data you have has become invalid (this requires some mechanism to validate/update the cache, as sketched below).
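    A minimal sketch of that caching idea; fetch_db_version() and fetch_all_rows() are hypothetical stand-ins for whatever DB client API you use.

    Code:
    #include <cstdint>
    #include <string>
    #include <vector>

    // Hypothetical stand-ins for the real DB client calls.
    uint64_t fetch_db_version()               { return 42; } // e.g. SELECT MAX(updated_at) ...
    std::vector<std::string> fetch_all_rows() { return {}; } // the expensive full load

    struct Cache {
        uint64_t version = 0;            // DB version last seen
        std::vector<std::string> rows;   // the cached data
    };

    const std::vector<std::string>& get_data(Cache& c)
    {
        uint64_t v = fetch_db_version(); // one cheap round trip to validate
        if (v != c.version) {            // refetch only when the cache is stale
            c.rows    = fetch_all_rows();
            c.version = v;
        }
        return c.rows;
    }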

  3. #3
    Join Date
    Aug 2014
    Posts
    6

    Re: Shared Memory to make objects of an application persistent?

    Thanks for the fast answer.
    Although there are multiple instances of the application on different machines, each machine would use its own shared memory (with its own dummy process attached to it), so the communication only occurs within a machine. The distributed database approach has already been discarded (I guess we don't have the time to implement it in a satisfying way), and client-side caching is the other alternative we are currently discussing.

    Shared memory doesn't have to be in one big chunk (as I learned a few hours ago); I just have to make sure an object is not split between two chunks. If I record all my keys somewhere, I can iteratively attach to the shared memory areas again and deserialize my data back into my application, right? So technically shared memory would be a working alternative?
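    For the restart path I picture something like this (again just a sketch; it pairs with a save step that writes an 8-byte length header followed by the serialized bytes into a segment named "/app_state"):

    Code:
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>
    #include <cstdint>
    #include <stdexcept>
    #include <string>

    // Reattach to the segment written at shutdown and return the serialized
    // bytes, ready for the Thrift deserializer. Throws if no saved state
    // exists, in which case we fall back to the full database load.
    std::string load_from_shm()
    {
        int fd = shm_open("/app_state", O_RDONLY, 0);
        if (fd == -1)
            throw std::runtime_error("no saved state");

        uint64_t len = 0;
        if (read(fd, &len, sizeof(len)) != (ssize_t)sizeof(len))
            { close(fd); throw std::runtime_error("corrupt header"); }

        void* p = mmap(nullptr, sizeof(len) + len, PROT_READ, MAP_PRIVATE, fd, 0);
        close(fd);
        if (p == MAP_FAILED)
            throw std::runtime_error("mmap failed");

        std::string blob(static_cast<const char*>(p) + sizeof(len), len);
        munmap(p, sizeof(len) + len);
        shm_unlink("/app_state");   // the data is back in the process; free it
        return blob;
    }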

  4. #4
    Join Date
    Oct 2008
    Posts
    1,456

    Re: Shared Memory to make objects of an application persistent?

    Quote Originally Posted by micbeh View Post
    In our company we currently have an application that is restarted once in a while.
    Wouldn't it be easier to avoid restarting the process altogether? That is, why do you need to restart in the first place? For example, if you need to restart in order to update some parts of the program, then some kind of "component" dynamic linking mechanism (a la COM) would be more appropriate, wouldn't it? I mean, that dummy process looks more like a hack than a real solution, especially in the long run...
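    Just to illustrate the idea with a rough sketch (plain dlopen, not COM itself; "create_handler" and the factory shape are made-up names, and a real design would add interface versioning and reference counting; link with -ldl):

    Code:
    #include <dlfcn.h>
    #include <stdexcept>

    typedef void* (*factory_fn)();

    // Load (or later re-load) a component from a shared object and create an
    // instance through an exported factory function. Keep the handle so the
    // old version can be dlclose()d after traffic has switched to the new one.
    void* load_component(const char* path, void** out_handle)
    {
        void* h = dlopen(path, RTLD_NOW | RTLD_LOCAL);
        if (!h)
            throw std::runtime_error(dlerror());

        factory_fn create = (factory_fn)dlsym(h, "create_handler");
        if (!create)
            { dlclose(h); throw std::runtime_error("factory symbol not found"); }

        *out_handle = h;
        return create();
    }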

  5. #5
    Join Date
    Aug 2014
    Posts
    6

    Re: Shared Memory to make objects of an application persistent?

    Hello,

    It is a high-performance application, and as such, dynamic linking proved to be slower than our current solution (as far as I am informed). Additionally, from an organizational point of view, the already quite complex application would become much more complex with dynamic linking.

  6. #6
    Arjay's Avatar
    Arjay is offline Moderator / EX MS MVP Power Poster
    Join Date
    Aug 2004
    Posts
    13,490

    Re: Shared Memory to make objects of an application persistent?

    Can you start the program up and load parts of the database in the background?

  7. #7
    Join Date
    Aug 2014
    Posts
    6

    Re: Shared Memory to make objects of an application persistent?

    Asynchronous loading from the database is already done for some objects. However, the application is an HTTP server and needs to answer requests based on the current state of the database. If it takes 10 minutes to load the data asynchronously, the server would send wrong HTTP responses during this period. We allow this misbehaviour under special circumstances, but loading everything in the background would make it worse.

  8. #8
    2kaud's Avatar
    2kaud is offline Super Moderator Power Poster
    Join Date
    Dec 2012
    Location
    England
    Posts
    7,824

    Re: Shared Memory to make objects of an application persistent?

    Isn't the first question that needs to be answered why you have to keep restarting the process(es), as Superbonzo asked in post #4? If this is due to planned maintenance, couldn't it be scheduled for a time when 10 minutes of downtime would be acceptable? If it is due to other issues, wouldn't time be better spent investigating why and fixing these so that unplanned restarts are not needed?

  9. #9
    Join Date
    Aug 2014
    Posts
    6

    Re: Shared Memory to make objects of an application persistent?

    I have to agree that restarting the process is not the best idea in the first place. However, as we all know, bugs may occur and lead to a crashing process. Although the main problem is then the bug itself and has to be fixed ASAP, restarting the process is inevitable. Some kind of maintenance plan would be a good idea for standard updates, though it would not help in the case of bugs. See this more as an addition for special problems: even if you only have to restart once per month (because you have something like a maintenance plan), it is still quite annoying if you have to restart 100 applications and every restart takes 5-10 minutes.

  10. #10
    Arjay's Avatar
    Arjay is offline Moderator / EX MS MVP Power Poster
    Join Date
    Aug 2004
    Posts
    13,490

    Re: Shared Memory to make objects of an application persistent?

    Quote Originally Posted by micbeh View Post
    Asynchronous loading from the database is already done for some objects. However, the application is an HTTP server and needs to answer requests based on the current state of the database. If it takes 10 minutes to load the data asynchronously, the server would send wrong HTTP responses during this period.
    Another approach might be a layered architecture: rather than having your HTTP server access the database directly, you could move the data access into a web service pool (with multiple nodes). Under this approach, a node can be taken offline for updates and isn't brought back into the pool until it completes its initial cache load. That way, the HTTP server always has access to data from one or more web service nodes.
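    The gating logic might look something like this (a sketch only; the HTTP plumbing and the load balancer's probe endpoint are assumed, not shown):

    Code:
    #include <atomic>
    #include <thread>

    std::atomic<bool> cache_ready{false};

    // Runs once in the background at node startup.
    void warm_cache()
    {
        // ... long initial load from the database ...
        cache_ready.store(true, std::memory_order_release);
    }

    // Handler for the load balancer's health probe: 503 keeps the node out
    // of the pool, 200 lets traffic in only after the cache is warm.
    int health_check()
    {
        return cache_ready.load(std::memory_order_acquire) ? 200 : 503;
    }

    int main()
    {
        std::thread(warm_cache).detach();  // don't block startup on the load
        // ... run the HTTP server; the node receives real requests only once
        //     the load balancer sees health_check() return 200 ...
    }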
