You could have a lidar coming in at 15Hz, a camera at 30Hz, odometry at 60 or 100Hz - but typically you'll want to plan within that same range, at least for navigation (20-50Hz). "Vastly different" is a bit of a stretch.
Also - we've been using queues to deal with different time scales for a really long time. They work fine here too.
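To make that concrete, here's a toy sketch (all names and rates are mine, picked for illustration): a bounded queue decouples a fast sensor thread from a slower planner thread, evicting stale readings so neither side blocks the other.

```python
import queue
import threading
import time

# Hypothetical rates; the point is only that a small bounded queue
# decouples a fast producer from a slower consumer.
q = queue.Queue(maxsize=4)
plans = []

def sensor(stop, rate_hz=30):
    seq = 0
    while not stop.is_set():
        try:
            q.put_nowait(("scan", seq))
        except queue.Full:
            try:
                q.get_nowait()          # evict the stalest reading
            except queue.Empty:
                pass
            q.put_nowait(("scan", seq))
        seq += 1
        time.sleep(1 / rate_hz)

def planner(stop, rate_hz=10):
    while not stop.is_set():
        latest = None
        while True:                     # drain: plan only on the freshest data
            try:
                latest = q.get_nowait()
            except queue.Empty:
                break
        if latest is not None:
            plans.append(latest)        # stand-in for an actual planning step
        time.sleep(1 / rate_hz)

stop = threading.Event()
threads = [threading.Thread(target=sensor, args=(stop,)),
           threading.Thread(target=planner, args=(stop,))]
for t in threads:
    t.start()
time.sleep(0.5)
stop.set()
for t in threads:
    t.join()
```

The drain-and-keep-latest pattern is the usual choice for navigation: you'd rather plan on one fresh scan than work through a backlog of stale ones.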
For higher-level behaviors around grasping or manipulation, your point is super valid though. I suppose I'm mostly focusing on navigation-type tasks.
You aren't thinking broadly enough. Algorithms can run at megahertz, sensors can run anywhere from tens of kilohertz down to tens of Hz, control loops can run at 5Hz. Remote database calls can of course take much longer than that, and then you have very long range planning tasks that can cycle over days or weeks depending on deployment. I'd say that's quite the range.
And you mention queues, yes exactly. Abstract a little more and you get pub/sub. Abstract a little more and you have the actor model, which is a lovely way of building resilient, reliable, fault-tolerant systems: exactly what we want out of robots.
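A minimal sketch of the idea (all class and message names are mine): each actor owns a private mailbox and processes messages one at a time on its own thread, so its state never needs locking. Real actor frameworks add supervision and restart-on-failure on top of this core.

```python
import queue
import threading

class Actor:
    """Each actor owns a mailbox; state is only touched from its own thread."""
    def __init__(self):
        self.mailbox = queue.Queue()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill: shut down cleanly
                break
            self.receive(msg)

    def send(self, msg):
        self.mailbox.put(msg)

    def receive(self, msg):
        raise NotImplementedError

class Odometry(Actor):
    """Hypothetical actor that accumulates pose messages."""
    def __init__(self):
        self.poses = []
        super().__init__()

    def receive(self, msg):
        self.poses.append(msg)

odo = Odometry()
for i in range(3):
    odo.send(("pose", i))
odo.send(None)
odo.thread.join()
print(odo.poses)   # the three pose messages, processed in order
```

The nice property for robots is that each subsystem can run at its own rate and fail independently; the mailbox is just the queue from the previous comment, formalized.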
Control loops also need to run at kilohertz rates, and if you can't schedule them without jitter the whole system is useless. Real-time systems need an understanding of time budgets; otherwise they will never be reliable enough to run in places where suboptimal behavior costs money.
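As a sketch of what "understanding time budgets" means in the loop itself (a real kHz loop would live on an RTOS or under `SCHED_FIFO`, not in Python; the timescale here is slowed down so plain `time.sleep` suffices): a fixed-period loop that anchors each deadline to the previous one, so work time doesn't accumulate as drift, and counts every cycle that blows its budget.

```python
import time

def control_loop(period_s, budget_s, iterations, work):
    """Fixed-rate loop that flags any cycle exceeding its time budget."""
    overruns = 0
    next_deadline = time.monotonic() + period_s
    for _ in range(iterations):
        start = time.monotonic()
        work()
        elapsed = time.monotonic() - start
        if elapsed > budget_s:
            overruns += 1            # in a real system: log, degrade, or abort
        # Sleep until the absolute next deadline, keeping the period fixed
        # instead of drifting by the work time each cycle.
        time.sleep(max(0.0, next_deadline - time.monotonic()))
        next_deadline += period_s
    return overruns

overruns = control_loop(period_s=0.01, budget_s=0.005,
                        iterations=50, work=lambda: None)
print("overruns:", overruns)
```

The absolute-deadline scheme (sleep until `next_deadline`, not for `period_s`) is the standard way to keep long-run rate stable; the overrun counter is the hook where a production system would enforce its budget.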