Replace LLM with employee in your argument - what changes? Unless everyone at your workplace owns the system they're working on, this is a very high bar: maybe 50% of the devs I've worked with are capable of owning a piece of non-trivial code, especially if they didn't write it.
Reality is you don't solve these problems by relying on everyone to be perfect - everyone slips up. To achieve results consistently you need processes/systems to assure quality.
Safety-critical systems should be even better equipped to adopt this, because they already have the systems in place to promote correct outputs.
The problem is those systems weren't built for LLMs specifically, so the unexpected failure cases and the volume might not be a perfect fit - but then you work on adapting the quality control system.
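To make "adapting the quality control system" concrete, here's a minimal sketch of one such process: treat LLM output like any untrusted contribution and gate it behind checks written independently of the generator. All names here are hypothetical, and the "LLM output" is just a stand-in function.

```python
# Sketch: an acceptance gate for untrusted generated code.
# The gate never trusts the candidate; it only trusts checks that
# were authored independently of whoever (or whatever) wrote it.

def llm_generated_sort(items):
    # Stand-in for code an LLM produced; the gate treats it as opaque.
    return sorted(items)

def independent_gate(candidate, cases):
    """Accept the candidate only if it passes every independently
    written input/expected-output check; otherwise report failures."""
    failures = [(inp, expected, candidate(inp))
                for inp, expected in cases
                if candidate(inp) != expected]
    return (len(failures) == 0, failures)

# Checks written by someone other than the candidate's author.
cases = [([3, 1, 2], [1, 2, 3]), ([], []), ([5, 5], [5, 5])]

ok, failures = independent_gate(llm_generated_sort, cases)
print(ok)
```

The point of the design is separation of duties: the same process that produced the artifact never gets to be the one that signs off on it.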
>>replace LLM with employee in your argument - what changes?
I mentioned this part in my comment. You cannot trust an automated process to do a thing and then expect the same process to verify whether it did it right. This applies to any automated process, not just code.
This is not the same as manufacturing, where you make the same part thousands of times. In code, the automated process makes a specific customised thing only once, and it has to be right.
>>The problem is those systems weren't built for LLMs specifically so the unexpected failure cases ...
We are not talking only about outright failures. There is a space between success and failure that the LLM can slip into easily.