"The chatbot also showed that it understood combining drugs like Kratom and Xanax with alcohol."
No, it did not. LLMs do not understand. Anything. These are well-known combinations that appear often in the training data, and that is why they were mentioned in the output. The LLM does not know or understand why these combinations are dangerous.
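The point that frequent co-occurrence, not understanding, is enough to produce such output can be sketched with a toy co-occurrence model. Everything below is invented for illustration (the tiny corpus, the `complete` function); real LLMs are vastly more sophisticated, but the underlying objective is still statistical pattern completion:

```python
from collections import Counter
from itertools import combinations

# Toy "training data": the model sees only surface co-occurrence,
# not pharmacology. (Corpus contents are made up for illustration.)
corpus = [
    "mixing kratom and alcohol is dangerous",
    "combining xanax with alcohol can be fatal",
    "kratom and alcohol interactions reported",
    "xanax and alcohol warning",
]

# Count which words co-occur in the same sentence.
pair_counts = Counter()
for sentence in corpus:
    for a, b in combinations(sorted(set(sentence.split())), 2):
        pair_counts[(a, b)] += 1

def complete(word):
    """Return the words most often seen alongside `word`.
    Pure frequency; zero knowledge of *why* the pairing matters."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == word:
            scores[b] += n
        elif b == word:
            scores[a] += n
    return [w for w, _ in scores.most_common(2)]

print(complete("kratom"))  # "alcohol" surfaces by co-occurrence alone
```

A model like this will happily "mention" kratom-plus-alcohol, yet there is plainly nothing in it that could be said to understand the interaction.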
