
Scanning a large library seems to time out

James Richardson 9 years ago, updated by Jake Glashofer 7 years ago
I may be an edge case here, but my comics collection is somewhere just north of 700 GB. I have things organized (painstakingly) by publisher, and I have noticed that the server app seems to stop scanning about a quarter of the way through the source folder. I've set the automatic scan period out to a week and left it alone, but the scan appears to stall or stop before it can complete the initial pass. As a result, I expect, the scan starts over from scratch when the scan period restarts. Right now I have access to a fraction of my collection through the public IP, but I would love to see the rest in the browser.

Would it be possible to run multiple instances of Ubooquity to divide the workload and access the collection as smaller libraries? Is there a setting I am missing? Thanks for the help.
Under review
File information is written to the database after every file during the scan, so even if the scan stops abruptly, what has already been scanned is not lost.
Ubooquity does have to check file names and modification dates the next time a scan is launched, but that step is very, very fast (the time-consuming operations are thumbnail extraction and generation, as well as metadata extraction when metadata exists).
As for the scan period, it does not really matter for the initial scan, as a new scan won't be launched until the first one is finished.
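
To get a feel for why that check is cheap, you can time a pass that only reads file names and modification times, which is roughly the bookkeeping a rescan repeats before deciding to skip a file (a sketch for a Unix-like shell with GNU find; /path/to/comics is a placeholder):

    # Walk the library reading only each file's modification time and name.
    # Even for tens of thousands of files this typically finishes in seconds,
    # unlike thumbnail generation or metadata extraction.
    time find /path/to/comics -type f \( -name '*.cbz' -o -name '*.cbr' \) \
        -printf '%T@ %p\n' | wc -l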

Now, you can launch multiple Ubooquity instances (see the sketch after this list) as long as:
  • they are launched in different folders
  • they run on different ports
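A minimal sketch of what that looks like, assuming two hypothetical folders /opt/ubooquity-a and /opt/ubooquity-b (each instance keeps its own settings and database in the folder it is launched from, and the listening port has to be changed in each instance's settings so the two don't collide):

    # First instance, pointed at part of the collection;
    # run in the background so both can start from one shell.
    cd /opt/ubooquity-a
    java -jar Ubooquity.jar &

    # Second instance, launched from its own folder; configure a
    # different port in this instance's settings before starting it.
    cd /opt/ubooquity-b
    java -jar Ubooquity.jar &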
But it would be interesting to understand why the scan fails at some point.
Do you see any error in the log file when the scan stops?

I'm encountering a memory issue, I think? From today's log:

In the meantime, I'll try setting up another instance pointing to one of the subfolders that isn't being reached by the first instance. I attempted this before, but the second instance seemed to have difficulty connecting to an external IP...

Thanks for getting back to me so quickly.
James, it looks like the Java process is running out of its allotted memory. The next time you start the Ubooquity server from the command line, add this argument:
-Xmx1024m
This will give Java a total of 1 GB of memory to work with. Your Java may be defaulting to a small amount of RAM, like 256 MB. Of course, you can raise that to something even higher if you have the RAM to spare; for example, if you have 2 GB free, use -Xmx2048m.

Your command will look something like this:
java -Xmx1024m -jar Ubooquity.jar
Try to run the server with this extra memory and let Tom know if you still get the out of memory error.
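
If you want to see what your Java defaults to before adding the flag, this prints the JVM's maximum heap size (Unix-like shell; pipe to findstr instead of grep on Windows):

    # Print the default maximum heap the JVM picks on this machine
    java -XX:+PrintFlagsFinal -version | grep -i maxheapsize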

Tom: do you think the server might be keeping a record, in memory, of every file it runs across? Maybe that is why he's running out of RAM.
Ubooquity's memory usage is not supposed to depend on the number of files scanned (at least not enough to cause memory problems).

Increasing the memory dedicated to Ubooquity is indeed an adequate workaround, although not a definitive solution (if there is a bug, it still has to be found and fixed).
Hi, I have the same issue. When the number of books is small, everything runs fine, but with a large collection I get OutOfMemory errors (Java heap space). Also, when changing a Ubooquity setting, I get the message 'server restarting' (in a different format than normal), and then the message 'server down'. After that there is no way I can get it to work normally: I have to uninstall and reinstall the software and rebuild the database, and a rebuild of the database takes more than 8 hours, ending with the memory error.
I run the software on a Synology, and as said, with a small collection of books (fewer than 8,000 or so) everything runs fine, and I really like it.
I'm running the software on a Synology as well, with about 35,000 items, and haven't seen any memory issues. The scan seemed to have issues one time when I added a few new file paths, but it resumed after restarting Ubooquity.
What do you consider a large collection?
I have most of my collection in separate paths: a path per publisher in the comics section, and a separate path for each letter of the alphabet for the ebooks.
It stops scanning at item 18,196. All my books are comic books.
I used the package ubooquity_x64-5.0_1.7.0-1.spk to install Ubooquity on the Synology, but I cannot find where to change the memory setting to -Xmx1024m. Do you know where to set/change it in DSM5?
All comics are grouped in a main folder with subfolders per series; in total I currently have about 2,500 subfolders.

I am having the out-of-memory issue, but I do not normally run from the command line; I just have the jar file start. I tried running the command, hoping that would change the setting so I could continue running normally, but it did not start with any of my settings/libraries, and the setting was back to the default the next time I started the program from the jar file. Is it possible to set this through the interface or some other way?
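
One way to make the flag stick without typing it each time is a small launcher script, so the flag and the right working directory are applied on every start (a sketch, assuming a Unix-like system with the script saved next to Ubooquity.jar; on Windows, a .bat file containing the same java line does the job):

    #!/bin/sh
    # launch-ubooquity.sh: start Ubooquity with a 1 GB heap.
    # Running from the jar's own folder keeps the existing settings
    # and database, unlike launching from a different directory.
    cd "$(dirname "$0")"
    exec java -Xmx1024m -jar Ubooquity.jar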

I have had very similar issues... my current startup has -Xms4096m and -Xmx4096m.

Rather than the size of the files, I suspect the count: I have in excess of 100k cbr/cbz files.

Keep watch on the size of the Ubooquity database. On my system, after too many stops/starts/restarts, the database grew to a file size in excess of 2 GB; once it even exceeded 4 GB.
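
To keep an eye on it, check the database file in the folder Ubooquity was launched from; assuming the default H2 file name ubooquity.h2.db (the exact name may vary by version), something like:

    # From the folder containing Ubooquity.jar
    ls -lh ubooquity*.db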


Depending on __your__ setup.. 16/32/64 bit mode...will be a possible limiting factor...
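
You can check which one you have; a 32-bit JVM cannot address a 4 GB heap, so -Xmx4096m would refuse to start on one:

    # The version banner reports "64-Bit" for a 64-bit JVM
    java -version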


At the moment I have a 64-bit system with 8 GB of memory, 4 GB of it hardwired to Ubooquity, and it has NOT generated any out-of-memory errors. When it's finished adding all of these journals, I'll have it add 100k epub files; if the DB stays under 1 GB, I'll configure either a Raspberry Pi or a Pogoplug (either one running Debian Jessie) to be the server. At that point the heavy lifting is done and all it has to do is serve.

Running in 128 MB might be tough, but it should work under 256.

Thoughts?


I have a similar-sized comic collection, probably close to 100k files. I've set mine to 4096 and am currently running it, hoping for success.

To be honest, I have no idea how Ubooquity performs with hundreds of thousands of files (apparently not that well); you're in uncharted territory.

Just got Ubooquity set up, and had my database stop with the memory issue at roughly 45,000. My database is well in excess of 5 TB. Going to try the command line fix. Wish me luck!

By "data base", you mean the one created by Ubooquity or the total size of your files ?

Because for 45k books, the database will take around 200 MB, not more.

I meant my hard drive that has all my comic files. I was able to get them all scanned by Ubooquity, totaling 91,105 comic files (cbr/cbz). Now when I try to access my server on my LAN via my cellphone, it seems to give me the out-of-memory issue that James Richardson posted above in the thread. So I've completed the scan and everything, with most of my library accessible (still more than the 91k to go actually, possibly 10k more), but I lose the connection when the memory problem is encountered.