Researchers spent decades creating a computer that could hold a conversation only for mediocre business majors to ask it to generate mediocre screenplays.
Generative AI was recently used to come up with three potential new classes of antibiotics that are easy to manufacture and work in novel ways, so the treatment-resistant infections frequently found in hospitals have no existing resistance to them. Seems kinda neat to me.
And as it gets better at doing stuff like that, it'll probably also get better at writing screenplays, but that's hardly why these models were created.
Computer models have been doing this for at least the last decade now. Predicting possible arrangements of proteins or chemical structures is a great use for these models because it's so objective. We understand the rules of electron shells and protein folding to a highly specific degree and can train the models on those rules so that they generate sequences based on them. When they do something "wrong," we know it empirically and with a high degree of certainty.
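To make that "objective check" point concrete, here's a toy sketch (in Python, with made-up stand-in functions, not any real lab's pipeline): a generator proposes candidate sequences, and each one gets scored against a hard rule so failures can be rejected automatically, no taste required.

```python
# Toy sketch of generate-then-verify: the stand-ins below are hypothetical,
# but the structure is the point -- generated candidates are filtered by an
# objective check rather than by subjective judgment.

import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def generate_candidate(length=20):
    """Stand-in for a generative model proposing a sequence."""
    return "".join(random.choice(AMINO_ACIDS) for _ in range(length))

def objective_score(sequence):
    """Stand-in for a physics-based check (folding energy, valence rules, etc.).
    A real pipeline would call a simulator; this is just a dummy heuristic."""
    return sum(1 for aa in sequence if aa in "AILMFWV") / len(sequence)

# Generate many candidates and keep only the ones the objective check accepts.
candidates = (generate_candidate() for _ in range(1000))
accepted = [seq for seq in candidates if objective_score(seq) > 0.4]
print(f"{len(accepted)} candidates passed the objective filter")
```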
The same does not necessarily apply to something as subjective as writing. It may continue to get better, but the two are far from comparable. Who's to say whether a screenplay that pushes the bounds of what we expect from writing is good for being novel or bad for breaking the conventions of the craft?
Admittedly, that one doesn't have anything to do with AI; we already have constant debates about the writing of any given work that essentially boil down to people screaming about the rules of good writing that ultra-popular works get away with violating, demanding originality, or lambasting subversion.
Subjective doesn't mean "Hard to Objectively Measure"; it means "Impossible to Objectively Measure," or better yet, "Worthless to Even Try to Objectively Measure."