ChatGPT safety systems can be bypassed to get weapons instructions

NBC News found that OpenAI’s models repeatedly provided answers on making chemical and biological weapons.

Oct 10, 2025 - 16:30
