So, I don't think this would be especially slow by itself, but there are some issues:
- There is no way to use the CACHE for this. You are basically triggering a PHP file-system scan for each visitor on every single page.
- Is the "7075 images" a count of all images inside the child folders of the current dir? If so, this value needs to be re-calculated recursively for every folder that is visited.
- With `$(".dir").each(function(i) { ... url: '/dirinfo.php', ... })`, you are triggering multiple separate requests to the server, one for each dir, all at the same time. For example, if there are 50 directories, the page will fire 50 x dirinfo.php simultaneously. It might work, and it might break, but it's certainly not efficient.
- Although it might feel fast for one visitor, it won't be pretty when a hundred visitors are each loading a hundred PHP scripts, all at the same time.
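To illustrate why the cache point bites: every one of those requests has to walk the whole directory tree again, because a change anywhere in a subfolder would invalidate the stored total. A hypothetical recursive counter (function name and image extensions are my assumptions, not your code) would look something like:

```php
<?php
// Hypothetical recursive image count. This walk repeats on EVERY request,
// for EVERY visitor, because nothing tells us whether a subfolder changed.
function countImagesRecursive(string $dir): int {
    $total = 0;
    foreach (scandir($dir) as $entry) {
        if ($entry === '.' || $entry === '..') continue;
        $path = $dir . '/' . $entry;
        if (is_dir($path)) {
            $total += countImagesRecursive($path); // descend into subfolder
        } elseif (preg_match('/\.(jpe?g|png|gif)$/i', $entry)) {
            $total++; // count only image files
        }
    }
    return $total;
}
```

With thousands of files, that is a lot of disk I/O to repeat on every page view.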
In your screenshot, is the "7075 images" the count of all images in all sub-directories, recursively? "21 files • 7075 images" doesn't really make sense ... one number is for the current dir and the other covers all subfolders, but how is the visitor supposed to know? Also, in the folder preview, does "6812 images" include all images in all subfolders of that dir, or only the images directly inside it?
Anyway, it is possible to make something like this work, but it needs to be properly planned and as efficient as possible. I applaud your efforts 👏 but this solution could make both browser and server really slow if there is a huge amount of visitors. As noted earlier, there is no way to use the CACHE when counting files, because we don't know if anything in the subfolders has changed.
If I were going to do something like this, first of all I would make sure to send only a SINGLE request to PHP for each folder visited. The PHP would then do all the processing in that one request, and return the file counts for all listed dirs in one response.
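A minimal sketch of that single-request idea, assuming a hypothetical dirinfo.php that receives the current folder as a `dir` parameter (the function name, parameter, and image extensions are illustrative, not your actual code):

```php
<?php
// Hypothetical batched endpoint (dirinfo.php): return file counts for ALL
// subfolders of the requested dir in a single JSON response, so the page
// needs one request total instead of one request per directory.
function imageCountsPerSubdir(string $dir): array {
    $counts = [];
    foreach (glob($dir . '/*', GLOB_ONLYDIR) as $sub) {
        // count image files directly inside each subfolder
        $counts[basename($sub)] = count(glob($sub . '/*.{jpg,jpeg,png,gif}', GLOB_BRACE));
    }
    return $counts;
}

header('Content-Type: application/json');
echo json_encode(imageCountsPerSubdir($_GET['dir'] ?? '.'));
```

The page would then need just one call, e.g. `$.getJSON('/dirinfo.php', { dir: path }, fn)`, and could fill in every dir's count from that one response, instead of 50 parallel requests.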