While waiting for pizza this evening, I read an article by David Alan Grier in IEEE Computer about the ways in which technology has changed entertainment, particularly the theatre, over the last 40 years or so.
In particular, he discusses how automated lighting, sound and so forth afford a stage manager the opportunity to calibrate the audience's response by controlling the timing of cues much more closely, much as a live television producer does. What this has meant is that show production, in addition to being a massive organizational exercise, is now a performance unto itself.
Later, he goes on to talk about the ways in which producers of other media gauge audience reaction and adapt accordingly – focus groups for TV and movies, golden ears for music, and now, with technology, learning systems based on customer profiling and crowd-sourcing that can supplement socially driven recommendations from friends or local record store owners – last.fm being a prominent example.
So inspired, here’s an interesting extension that occurred to me:
What if specialized AI, running locally, could be injected into traditionally mass-produced media like music, TV, or movies to act as a kind of virtual stage manager? It could observe you, the audience, a focus group of one, then tweak the timing, the content, the tone, and even the script of media to better suit your current mood, your tastes, to stimulate you in ways to which you are more sensitive, or even to better fit your available time.
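To make the idea a little more concrete, here is a toy sketch of that feedback loop in Python. Everything in it is hypothetical – the `Cue` type, the engagement signal, and the scaling factors are all invented for illustration; a real system would need actual sensing of the viewer, and far subtler adaptation than stretching delays.

```python
from dataclasses import dataclass

@dataclass
class Cue:
    """A single timed event in the media, e.g. a lighting change or a beat
    before a punchline. Both fields are illustrative, not a real API."""
    name: str
    base_delay: float  # seconds before the cue fires at neutral engagement

def adjusted_delay(cue: Cue, engagement: float) -> float:
    """Shorten delays for a restless viewer, stretch them for a rapt one.

    engagement is a hypothetical signal in [0.0, 1.0]:
    0.0 = bored, 1.0 = fully absorbed.
    """
    # Clamp the scaling factor so pacing never collapses or stalls entirely.
    factor = max(0.5, min(1.5, 0.5 + engagement))
    return cue.base_delay * factor

cues = [Cue("lights_down", 2.0), Cue("punchline", 4.0)]
for engagement in (0.1, 0.9):
    print([round(adjusted_delay(c, engagement), 2) for c in cues])
# → [1.2, 2.4] then [2.8, 5.6]
```

The point of the sketch is only the shape of the loop: observe one viewer, map the observation to a pacing decision, apply it per cue – the "stage manager as performance" idea, collapsed to a focus group of one.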