Thanks for posting. It's certainly an interesting topic, getting Files Gallery to work more effectively on slower systems and/or heavy folder structures. That is one reason we have a cache implementation: to bypass heavy processing unless it's strictly necessary. There are a few more existing options for you as well.
netmaster Server is old and with spinning disks but it is not particularly slow.
This is the reason I mentioned SSD disks. You can have the fastest CPU and loads of memory, but that won't help if speed is limited by the disk's read access speed. Spinning disks aren't much slower than SSDs when it comes to serving images, large files, or a handful of files at a time ... but when it comes to intense data collection, for example reading the data for the main menu, an HDD will be much slower than an SSD. The server (PHP) basically needs the disk to look up every folder, and that's a lot of spinning for the number of folders you are serving.
netmaster but directory tree in the left still not working.
In your case, you might want to consider limiting the menu depth with the config option menu_max_depth, which is set to 5 by default. This will speed up menu creation dramatically, because the script no longer needs to loop through all dirs on the drive, only those down to the configured depth. I would try setting it to 1 or 2 first for testing. You could even disable the main menu entirely, as it's perfectly fine to browse through folders from the main layout. After all, this is how most other file browsers work, because preloading an entire menu structure from the root can be an expensive operation.
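In the config file, that could look something like this. This is a sketch assuming a default install with the config stored as a PHP array (check your own install for the exact config file location); 5 is the documented default for menu_max_depth:

```php
<?php
// Sketch of a Files Gallery config fragment (assumed default-install layout).
// Limit how deep the main menu scans for subfolders; default is 5.
return array(
  'menu_max_depth' => 2, // try 1 or 2 first for testing
  // The main menu can also be disabled entirely; see the Files Gallery
  // docs for the exact option name in your version.
);
```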
For starters, I would perhaps try to disable the menu entirely and see how everything works. Do single folders load at acceptable speed? Do resized images get processed without extreme delays? These operations should not be affected too much by the hard drive speed.
netmaster first time it took 1 minute, second time 1 second. Because directory structure is cached in ram, files.gallery works like expected.
Files Gallery uses a couple of levels of cache. The first time a folder is processed, a cache file in JSON format gets stored on the server, but the data also gets cached in browser JavaScript (current session) and localStorage (reload). To test how the Files Gallery server caching works, you should probably reload in a new private browser window, so you bypass the browser's own localStorage cache. This way you know what it will look like for other users.
Indeed, caching is very effective, especially in your case, but we can only cache folders that did not change. And for the main menu, we still need to validate the cache to make sure it is current (that no folders changed). To achieve this, we still need to loop through all dirs on disk and compare their "modified time", and this might still be quite slow on your server, although it should be faster than re-generating.
There is another option you might want to try: menu_cache_validate false. When set to false, the menu cache will be used without validation (beyond the first level), and should therefore load much faster in your case. Keep in mind that with this option, the menu cache might become outdated if you create/delete/rename folders at a deep level. The cache would then need to be purged manually.
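As a config fragment, again a sketch assuming the PHP-array config format of a default install:

```php
<?php
// Sketch: serve the cached menu without deep validation.
// Trade-off: the menu may go stale if deep folders change,
// in which case the server-side cache must be purged manually.
return array(
  'menu_cache_validate' => false,
);
```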
Of course, regardless of the options above, you could also pre-create the menu cache with the tasks plugin. This would require assigning a massive timeout, and I can't really see this as a solution, because if the menu is to remain updated, it will still need to refresh after you make changes.
Apart from the options above, there is nothing else we can do to speed up initial menu creation before it gets cached. PHP must loop through all dirs on disk, and that takes as long as the disk takes to respond. Your options are to disable the menu or limit the number of subfolders, and to keep the cache in use with menu_cache_validate false.
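Putting the two suggestions together, a combined config might look like this (a sketch, not a definitive config; option defaults and file layout are assumed from a default install):

```php
<?php
// Sketch: combining both mitigations discussed above.
return array(
  'menu_max_depth'      => 2,     // limit how deep the menu scans
  'menu_cache_validate' => false, // reuse the menu cache without deep validation
);
```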
netmaster One solution would be somehow ensure, that directory structure stays in filesystem cache. Not sure how to do that.
Yes, so this is kinda what menu_cache_validate does. It allows the current menu cache to be used without validating whether anything changed deep inside the folder structure. It will still validate top-level folders (if they changed, the menu must get re-created). We could perhaps have another option that creates a static menu cache that doesn't validate at all and will always load, which would have to be purged manually when you make changes. This is kinda like a "publish" option that creates a menu snapshot and always uses it, even if your folders change, in which case there would be errors.
netmaster Another solution would be switch off menu_cache_validation and update side menu cache offline after adding new files. Is this possible somehow?
Yes, I think I explained this above. The menu still needs to get created initially, and it will still validate menu cache vs top-level dirs (meaning, if you edit/rename/delete a top-level dir, the menu will still need to get entirely re-created). It will however cause the cached menu to load much faster for all users.
Technically, it wouldn't be a problem "adding new files" either, because this doesn't affect the menu itself. However, since folders get cached in browser based on folder "modified date", the old date might remain in the menu cache, causing the browser to serve old folder data when you navigate to a folder. It would work from "new browser window", but might cause problems in your own browser ... We might need another option to disable localStorage caching for this.
netmaster Anyway, would be nice to have some official way to generate all sorts of caches in background when needed. This would solve all problems with
- big directory structures
- too many files or thumbnails
- slow computer or file system
I'm not sure it's that simple. Besides, that's already what the task.php plugin does, so it's unclear how you want this to be achieved differently. My initial suggestion would be to either disable the menu or set menu_max_depth to 1 or 2. I would assume that your system should be able to handle folder loading and image processing without too much delay.
I guess in your case, the optimal solution would be to have one big "publish" button that simply creates all cache and resizes all images. It might be an interesting solution, as technically it would mean we could publish a gallery to a server without even using PHP. Something to consider.