Under review

Mega import crash

Guillermo Catalano 4 years ago updated by Nuno Bastos 4 years ago

I was running Ubooquity very well until I imported 30,000 books and it crashed. Now, even after a reboot, when I try to access it I only see the spinning circle. Do you know if there is a fix, or a limit on how many books you can import?
Regards.

Hi there! I have a similar problem. After importing around 12k books/comics, the main page works perfectly and I can browse/read the books in my library, but I cannot access the admin page. In fact, I find it crashing very often, and by crashing I mean trying to load without any result, displaying only the blue bar "Not connected to server". Even if I try to reload the page, I have no luck.

I have to stop and start the app for the admin page to be functional again.

Any help?

Also, my complete library has around 250k books/booklets/papers/documents etc. I wonder whether the app will manage to handle them all or whether it will crash. Any thoughts?

UPDATE:

I forgot to mention before, but I'm running Ubooquity on a Debian-based server (headless) and manage it mostly through SSH. From time to time, though, I access it over SAMBA from a Mac (OS X). Even though they are hidden on OS X, that operating system leaves ._file_name files behind every time it writes a file to another filesystem.

After inspecting the log, I found a couple of warnings stating Ubooquity couldn't handle such files.

I don't believe this is the main reason for the crashes, but I'll eliminate these files anyway and start over (delete the caches and DB and rescan everything).
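For anyone in the same boat, a one-liner like this can sweep those macOS leftovers out of the library before a rescan. The library path is a placeholder, and the `._*` / `.DS_Store` patterns assume the usual names macOS uses for its hidden metadata files; check your own logs for the exact names it complained about.

```shell
# Remove the hidden metadata files macOS leaves behind on SMB shares.
# /path/to/library is a placeholder -- adjust it to your setup.
find /path/to/library -type f \( -name '._*' -o -name '.DS_Store' \) -delete
```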


Apart from the warnings you mentioned, any specific error in the logs?

Nothing related to memory ?

Hi Tom, thank you for your reply.

Now that you've mentioned it, I went to take a look at the logs (I didn't realise until now that the bottom lines were the most recent; I wasn't paying attention to the timestamps).

I found out that, prior to rebooting the server and restarting the app (killing the app or its PID wasn't working), I got a bunch of these errors:

java.lang.OutOfMemoryError: Java heap space

Some appear in the form of an error, others in the form of a warning.

Also, in between these errors and warnings there are files that were scanned successfully.

How can I overcome this error? Kill any other tasks the server might be running, and then do a full scan?

Thanks

Edit: 

I also realized that the files that produced warnings and errors were all above 512 MB, and on Ubooquity startup the log shows:

Max heap size available: 512 MB

How can this value be changed?

I'm writing this information in a new post for those interested in the solution to this problem.

A brief description: I have around 250k items in my library, between books and magazines/comics, all separated in a folder structure according to my own way of arranging them. After loading only 10k-12k of those items, I started to notice that the admin page would crash, showing only the blue bar "Not connected to server", and no refresh would bring the page back until the Ubooquity process was killed and relaunched.

Tom (above) called my attention to a memory problem. The root cause was that Ubooquity was trying to load files bigger than the maximum memory available to the Java app.

After checking the log file I found a couple of warnings and errors with the same statement: java.lang.OutOfMemoryError: Java heap space

For those who launch Ubooquity from a .sh file (like run-ubooquity.sh), there is a variable MEM_OPT, set by default to 512m (512 MB). You can either change this variable to, for instance, 1 GB (1g) or more, or, if you're launching Ubooquity from another shell file or the command line, pass -Xmx<size> (e.g. -Xmx1g) to java to specify (and increase) the maximum heap size. This flag controls how much RAM the Java app may use, so be careful and conscious with it.
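In practice the two options look something like this. The jar path below is an assumption for illustration; point it at your own install, and note that the exact way your launcher script uses MEM_OPT may differ.

```shell
# Option 1: in the launcher script (e.g. run-ubooquity.sh), raise MEM_OPT
MEM_OPT="1g"    # default is 512m

# Option 2: when launching the jar directly, pass -Xmx to java yourself
# (the jar path below is a placeholder)
java -Xmx1g -jar /opt/ubooquity/Ubooquity.jar
```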

Hope it was helpful and enlightening.

Thanks a lot! In fact, Komga (which I use for comics) also recommends using the -Xmx argument.

Now it would be useful to know roughly how much memory a given number of books requires.

From my noobieness (if that word even exists), I'm fairly sure it is related to the biggest file you will scan. That is, it seems to load files into memory one by one, individually, but feel free to correct me if I'm wrong.
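If that file-by-file theory holds, the heap only needs to be as large as the biggest file in the library, which you can check in advance with something like the following (the library path is a placeholder; adjust it to your setup):

```shell
# List any library files larger than the default 512 MB heap -- under the
# file-by-file theory, these are the ones likely to trigger the
# java.lang.OutOfMemoryError during a scan.
find /path/to/library -type f -size +512M -exec ls -lh {} \;
```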

Making it practical: where did my logs give me errors while scanning? On some anatomy compendiums and atlases, which are huge PDF files of over 600 MB and up to nearly 1 GB (they contain a lot of high-res anatomical images, so quality isn't lost).

As soon as I increased the available memory to 1 GB, Ubooquity was able to scan them successfully and no more errors appeared.

I've only used Calibre before, and now I'm certain I gave it way too many chances. Don't get me wrong, Calibre is a great tool: it allows file format conversion and downloading metadata for any filetype with lots of criteria. It just never met my needs, not even when my library was 300 GB, much less now that it is over 1 TB, and especially because of its file structure. My library is arranged in a bunch of folders and subfolders that meet my own criteria and allow me, in case of a software failure, to easily grab any book I need via SFTP, and Calibre would just mess it all up.

Ubooquity, on the other hand, respects my folder structure, is very lightweight, and has a very appealing UI! :)


To be honest, I'd never heard of Komga, but I'll dig into it.