The world is going to end if Y doesn’t happen. We’ve heard it a thousand times before. Despite this, people fall for half-baked political proposals time and time again.
A good Bayesian should hyperbolically discount the concerns of people who regularly get concerned to excessive degrees. If someone consistently displays poor epistemic practices, adopting their viewpoints into your framework for viewing the world makes you more credulous, not better informed.
The question becomes “what do I do with new concerns about world-shaking events or situations?” I think we should start by looking at internal consistency. If the argument falls apart under scrutiny even when its premises are granted, it should be disbelieved. Similarly, if the premises are incorrect, based on circular reasoning, or overly complicated, one should discount the reasoning. A short chain like “A, therefore B” is more believable than a long one like “A, therefore B, therefore C, … therefore Z,” because each additional inferential link is another point of possible failure.
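The penalty for long chains can be made concrete with a little arithmetic. This is a minimal sketch, assuming (hypothetically) that each inferential step holds with some fixed independent probability; the function name and numbers are illustrative, not from the original text.

```python
def chain_credence(step_prob: float, n_steps: int) -> float:
    """Probability that every link in an n-step argument chain holds,
    assuming each step is independent with the same probability."""
    return step_prob ** n_steps

# A single fairly solid inference: "A, therefore B".
short = chain_credence(0.9, 1)    # 0.9

# A 25-step chain of equally solid inferences: "A, therefore B, ... therefore Z".
long = chain_credence(0.9, 25)    # roughly 0.07

print(f"short chain: {short:.2f}, long chain: {long:.2f}")
```

Even when every individual step looks strong, the conjunction of many steps is weak, which is why the elaborate doomsday argument deserves steeper discounting than the simple one.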