The real test is on now: the index server is reconstructing its index. It does this every 6 hours if there are new pages, and it usually takes half an hour or so.
It's supposed to be able to handle searches at the same time, but jeepers, it's gonna have to chew through nearly 400 GB of data while dealing with over 1 request per second.
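For what it's worth, one common way to serve queries through a rebuild is double-buffering: build the fresh index off the query path, then swap it in atomically. Here's a minimal Java sketch of that pattern; the `Index` class and all its method names are hypothetical, not Marginalia's actual code.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicReference;

    // Hypothetical sketch: rebuild every 6 hours if there are new pages,
    // while queries keep hitting whichever index is currently live.
    class IndexServer {
        private final AtomicReference<Index> liveIndex =
                new AtomicReference<>(Index.loadExisting());
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        void start() {
            scheduler.scheduleAtFixedRate(this::rebuildIfNeeded, 6, 6, TimeUnit.HOURS);
        }

        private void rebuildIfNeeded() {
            if (!Index.hasNewPages()) return;  // skip the rebuild if nothing changed
            Index fresh = Index.rebuild();     // the ~30 min job, off the query path
            liveIndex.set(fresh);              // readers see old or new, never half-built
        }

        String search(String query) {
            return liveIndex.get().query(query);  // searches work during rebuilds
        }
    }

    // Placeholder index type; the real one would manage the ~400 GB on disk.
    class Index {
        static Index loadExisting() { return new Index(); }
        static boolean hasNewPages() { return true; }
        static Index rebuild()      { return new Index(); }
        String query(String q)      { return "results for " + q; }
    }

The `AtomicReference` swap is what lets in-flight searches finish against the old index while new ones pick up the fresh one, with no locking on the read path.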
Is your site/code on GitHub? I would be happy to offer performance tips/tweaks. Also, FYI, https://marginalia.nu/ gives a certificate error (I know that's not the search site).