January 28th, 2014, 02:51 AM
Re: C++ Memory Mapped Files Trouble
Hmm, that's an interesting idea actually, 2kaud: creating one folder containing thousands of files...
But won't creating one file for each data item, writing the data into it and saving it be costly from a performance point of view? I mean, is it possible to meet a performance requirement of, let's say, 15 packages per second with each data item 2 MB in size?
And what about the reader applications searching the files by name (timestamp)?...
I don't have this hardware setup now; it will only exist in the real system, so I have to develop it first on my personal computer.
Do you have any estimate of the performance?
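Maybe something like this quick sketch is what I should run first on my own machine (the file names and counts are just placeholders I made up): it writes a bunch of dummy 2 MB files and reports the rate.

Code:
// Rough write-throughput test: N files of 2 MB each, one file per
// data item, named by a unique id (here just the loop index).
#include <chrono>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    const std::size_t kPackageSize = 2 * 1024 * 1024;  // 2 MB per data item
    const int kPackages = 100;                         // number of test files
    std::vector<char> buffer(kPackageSize, 'x');       // dummy payload

    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < kPackages; ++i)
    {
        std::ofstream out("data_" + std::to_string(i) + ".bin",
                          std::ios::binary);
        out.write(buffer.data(), buffer.size());
    }
    double elapsed = std::chrono::duration<double>(
        std::chrono::steady_clock::now() - start).count();

    std::cout << kPackages / elapsed << " packages/sec ("
              << (kPackages * kPackageSize) / (elapsed * 1024 * 1024)
              << " MB/s)\n";
}

Of course the OS write cache will flatter these numbers; forcing a flush per file would be closer to the sustained disk speed.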
Furthermore, if writing and reading a file from the hard drive doesn't cost too much and I decide to go that way, I can also create a circular array in memory (shared memory, perhaps). While writing the data to its file, I can add a new element to my "metadata array" with a timestamp and a unique id attribute, and use this unique id in the file name. The reader applications can then search this metadata array first (without going to the hard drive), and if the data is found they can fetch the file from the hard drive by this unique id. (How much time does it take to find a file by id (name) in a repository of, for example, 100,000 files?)
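Something like this is what I have in mind for the metadata array: a rough POSIX sketch, where names like "/meta_ring" are just made up, and real code would still need an interprocess lock or atomics between the writer and the readers.

Code:
// Fixed-size circular array of {timestamp, id} records kept in POSIX
// shared memory, so reader processes can scan it without touching disk.
#include <fcntl.h>      // shm_open
#include <sys/mman.h>   // mmap
#include <unistd.h>     // ftruncate, close
#include <cstddef>
#include <cstdint>

struct MetaEntry
{
    std::uint64_t timestamp;  // when the package was written
    std::uint64_t id;         // unique id, also used as the file name
};

struct MetaRing
{
    static const std::size_t kCapacity = 4096;  // tune to taste
    std::uint64_t head;                         // next slot to overwrite
    MetaEntry entries[kCapacity];
};

// Map (or create) the shared ring; both writer and readers call this.
MetaRing* open_ring()
{
    int fd = shm_open("/meta_ring", O_CREAT | O_RDWR, 0666);
    if (fd < 0) return nullptr;
    ftruncate(fd, sizeof(MetaRing));
    void* p = mmap(nullptr, sizeof(MetaRing),
                   PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);
    return p == MAP_FAILED ? nullptr : static_cast<MetaRing*>(p);
}

// Writer side: append one metadata entry, overwriting the oldest slot.
void append(MetaRing* ring, std::uint64_t ts, std::uint64_t id)
{
    MetaEntry& slot = ring->entries[ring->head % MetaRing::kCapacity];
    slot.timestamp = ts;
    slot.id = id;
    ring->head++;  // real code needs interprocess synchronization here
}

(And if the readers get the exact file name from the id, finding the file itself should just be a direct open by name rather than a scan of all 100,000 directory entries, as far as I understand.)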
Yeah, it seems really cool; the only question is whether I can match that writing speed. I'll have to try and see, I think...
Thank you very much again.
Last edited by vecihi; January 28th, 2014 at 03:22 AM.