Having worked in computer graphics myself, I can say it's spot on that this shit is uncontrollable.
I think the reason is fundamental: if you could control it more, you would push it too far from any of the training samples.
That being said, video enhancement along the lines of applying this as a filter to 3D-rendered CGI or to other footage could (to some extent) work. I think the perception of realism will fade as the look becomes more familiar: it is pretty bad at lighting, but in a new way.


Other funny thing: it only became a fully automatic plagiarism machine when it claimed that it wrote the code (referring to itself by name, which is a dead giveaway that the system prompt tells it to).
I wonder if code is where they will ultimately get nailed to the wall for willful copyright infringement. Code is too brittle for their standard approach of "we sort of blurred a lot of works together so it's ours now, transformative use, fuck you, prove that you don't just blur other people's work together, huh?".
But also, for a piece of code you can very easily test whether two pieces have the same "meaning": implement a parser that converts code to an expression graph, then compare the graphs. Which also makes it far easier to output code that is functionally identical to the code they are plagiarizing but looks very different.
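A minimal sketch of the comparison side, using Python's stdlib `ast` module as a stand-in for a real expression graph. Everything here (the `fingerprint` helper, the two sample snippets) is invented for illustration; the point is just that erasing identifiers leaves a structural signature that survives wholesale renaming:

```python
import ast


class _EraseNames(ast.NodeTransformer):
    """Replace every identifier with '_' so only the code's structure remains."""

    def visit_Name(self, node):
        return ast.copy_location(ast.Name(id="_", ctx=node.ctx), node)

    def visit_arg(self, node):
        node.arg = "_"
        return node

    def visit_FunctionDef(self, node):
        self.generic_visit(node)
        node.name = "_"
        return node


def fingerprint(source):
    """Parse source and dump the name-erased AST as a comparable string."""
    return ast.dump(_EraseNames().visit(ast.parse(source)))


a = """
def total(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc
"""

b = """
def sum_list(items):
    s = 0
    for item in items:
        s += item
    return s
"""

print(fingerprint(a) == fingerprint(b))  # True: same structure, different names
```

A real system would compare graphs up to statement reordering, constant folding, etc., but even this toy version catches the "rename everything" style of laundering.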
But also, I estimate approximately 0% probability that the assholes working on that don't banter among themselves about copyright laundering.
edit: Another thing: since it has no conception of its own of what "correct" behavior is for the code it's plagiarizing, it will plagiarize all the security exploits along with it.
This hasn't been a big problem for the industry so far, because only short snippets were being cut and pasted (how to make some stupid API call, etc.), but with generative AI whole implementations are going to get plagiarized wholesale.
Unlike any other kind of work, code comes with its own built-in, essentially irremovable "watermark" in the form of security exploits. In several thousand lines of code, there would be enough "watermark" for identification.
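To make that concrete, here's a hypothetical toy case (both function names and the bug are invented for the example): the same off-by-one defect survives a wholesale rename, so it tags the second snippet as a copy of the first far more reliably than its surface style ever could.

```python
# Both functions mean to return everything after the first element, but the
# shared upper bound silently drops the last one -- the same defect in both.

def original_tail(items):
    return items[1:len(items) - 1]   # off-by-one: loses the final element

def laundered_rest(seq):             # renamed wholesale, bug intact
    return seq[1:len(seq) - 1]

print(original_tail([1, 2, 3, 4]))   # [2, 3] -- the 4 is gone
print(laundered_rest([1, 2, 3, 4]))  # [2, 3] -- same defect, same "watermark"
```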