One of those things that's worth keeping in the back of your mind:
Things tend to change at order-of-magnitude boundaries.
What you need at 99% reliability is not the same thing you need at 99.9% reliability, which is not the same thing you need at 99.99998% reliability.
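To make the gap concrete, here's a quick back-of-envelope calculation (the traffic volume is illustrative, not from the original): the number of expected failures per million requests at each reliability level.

```python
# Back-of-envelope: expected failures per one million requests
# at each reliability level mentioned above.
for reliability in (0.99, 0.999, 0.9999998):
    failures_per_million = (1 - reliability) * 1_000_000
    print(f"{reliability:.5%} -> {failures_per_million:g} failures per million requests")
```

That last level allows roughly one failure in five million requests, which is why a technique that's "usually right" can be fine at the first level and disqualifying at the last.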
So when you see people confidently assert "let's just throw a neural network at it," you should immediately ask, "what is the cost of a failure?"