It's hard to explore the full background of my motivation for this post, but it was most recently triggered by an assertion in the comments thread of another blog that I read. (Note: I've now been working on this post for about two weeks, since well before the recent DtD module day.) I've just now discovered that that post and thread are gone, for whatever reason, but the approximate assertion was that emergent play is the only valid kind of game. This, or something akin to it, has come to the center of design thought in tabletop gaming, particularly indie design, over the past five years or so, as games try to shift more narrative control into the hands of the players and minimize the role of the GM. The OSR is particularly strident on this point, as part of their hard-line stance against "story games." Let's start with an outright rejection of One-True-Wayism; from there, I want to talk about why I think things have come to be where they are.
By the way, this is by no means solely about tabletop gaming. LARP running is definitely at issue here, and I may have a few things to say about video games before I'm done. I start with tabletop games, though, because the player/GM dynamic and the implementation of any plans the GM makes are at their clearest in that medium. For ease of conversation, let's call the two ends of the game-running spectrum Centralized (staged and scripted by the GM) and Decentralized (little to no planning by the GM; players shape the narrative in some cooperative or competitive way).
Now, there are some really good reasons that the trend against Centralized plot has as much strength as it does. One of these reasons is the same one behind the highly legalistic rules of 3.x and later editions - rules over rulings, as the blogosphere has called it. These rules are written in such a way as to bind the GM and create a "fairer" playing field. The full list of reasons for this shift isn't knowable, but WotC cited the desire to standardize the experience to some degree - the D&D experience you got from one DM should at least resemble the D&D experience you got from another. I've interpreted that as something like, "DMs are too arbitrary, and they're misusing their position of authority to the detriment of the game's fun. Let's solve that." Nominally, the DM can change any rule in the game and make anything happen, but the game delivers guidelines to the DM in such a way that they become expectations in the minds of players.
There's also a general sense that no one really has time to do a ton of game prep anymore. If that's true, then it's a great idea to shift some of the heavy lifting onto the players. The conventional wisdom here is that this will even get them to care more about the elements of the campaign that they encounter. But then, can there be exploration or a sense of mystery in content that you yourself created? I've always believed that the answer is No, and since the exploration of mysteries and secrets is close to the only thing that truly interests me, it's not a price I'm willing to pay. It's not the only thing that people can get from roleplaying games - not by a long shot. Be careful, though, about any situation where one player has any capacity to tell another that they're Doing It Wrong - as when the first player originated the idea for that race, culture, class, or whatever. Players shouldn't do that to each other, but my experience suggests that even people who know better just can't help themselves.
Compounding this point, the video game industry has learned that players can always consume content faster than the game's creators can produce it. Nowhere is this more egregiously apparent than in the MMO production cycle, which has inspired MMO designers to do some truly obnoxious and strange stuff to put an artificial brake on content completion. Many designers - here I include my own past self - have hoped that world PvP would be the answer: players creating their own content in the form of an endless war waged back and forth across the setting. Sometimes this works, and sometimes it doesn't. It's damned hard to make world PvP feel like satisfying content, owing to the limitations of the medium. Other games literally co-opt players into the content-creation machinery, as City of Heroes and others did, by releasing the level-creation tools in the game itself; EverQuest Next is planning on that right from the start with the Landmark toolset.
I don't know all of the reasoning behind LARPs that believe solely in decentralized plot, except that they typically have very large playerbases and very small plot committees - the plot committees cannot do anything more than adjudicate player actions and host events. Some of those groups internalize that necessity to the point that they reject other models. It's a pretty common theme in game development: "This is the only way we can stay afloat (and we learned these lessons through great strain), so we think other people doing it some other way is less correct." Bitter envy (for the resources that let other groups run things differently) and elitism are equal and opposite traps for a stressed-out game-runner.
So I do understand why game development has favored decentralized plot. On the other hand, I am a game designer, writer, and game-runner. I believe that rumors of my obsolescence are greatly exaggerated - and to that end I want to make the case for elements of centralized plot. It's not all that revolutionary to say that a balanced approach is the best way to go about it, but that's where this is going anyway.
A long while back, I suggested that there are five kinds of plot in games. You don't have to have each of these types in a campaign, but you can't really have Main Plot without a centralized source of content creation (because players don't want to be the Dark Lord if the Dark Lord is there to be defeated... and the Dark Lord would not be better off with equal screen time). World-mechanical plot loses a lot if the world mechanics are written by the players - for the writers there can be no sense of discovery, only the awareness that they are either "discovering" what they already created or making up new rules on the fly. Further, if it's a small number of PC-side writers, it's still centralized - but now with substantial fairness issues.
This leaves Character Plot, Personal Mechanical Plot (since I could imagine rules written to decentralize the challenges necessary to advance), and Political Plot. A game can absolutely run on just those three kinds of plot; many great games do. If you're satisfied with that, then decentralized plot is pretty awesome - but a lot of the elements of adventure gaming (whether heroic or horrific) can't really survive without some centralized plot. I suspect that the further you want to go toward the horror side of that spectrum, the more centralized things need to be. Fear comes from powerlessness, and narrative control is power.
In addition to talking about who creates content, there's also how content is implemented. This is the divide between staged, defined scenes and emergent ones. Paradoxically, there are strictly decentralized games that focus on staging scenes carefully, and (more commonly) games with heavily centralized plot that rely on emergent scene-making. The former case includes GM-less games like Microscope: prior to the beginning of a scene, the player currently controlling the narrative poses a question for the scene to answer. The latter case is something like a default outcome if no one exerts the will to make it otherwise: the GM lets the PCs choose their goals, and NPCs are more or less reactive to those goals.
A really good staged scene involves multiple NPCs interacting according to at least a loose script. To keep things on-script, there needs to be a good reason that the PCs can't intervene to stop the thing that needs to happen. Done to excess, I've just described a railroaded campaign: the PCs have no real way to influence outcomes, so the game is nothing more than seeing if they can beat the hurdles that Plot throws in their way. Done with restraint, there are emotional states and forms of challenge that become much easier to evoke - some of which are central to whole genres of gaming, and even a genre that isn't a game's main course makes an excellent seasoning.
To balance this tension between the need for scene exposition and the danger of railroading, the key is to extend the action of the scene beyond the staged part - that is, to give PCs lots of time and information to react. The boffer LARPs of my experience don't have hard scene breaks, of course, so in that context it means something more like making sure you use the staged portion of the scene only for exposition; focus the game's attention on the questions that the PCs need to resolve. To put that another way, if you're temporarily going to draw all of the power and attention onto the NPCs, give it back when you're done, and make sure that interest has accrued in the meantime. If the PCs are under increased pressure to make difficult decisions thanks to plot or environmental factors, so much the better.
So this is a place where LARPing and tabletop games intersect and contrast with video games. What we're talking about here is the cutscene: the camera leaves the player's control because the game really needs the players to see and hear something clearly. In a video game, this is usually exposition for a boss fight, and sometimes there are (much-derided) Quick-Time Events during the cutscene so that the player needs to pay attention in order to survive. This is the big point of divergence between these gaming media. In a tabletop game, the DM mostly has control of the action as long as he keeps talking - this is why I pepper descriptions with extended pauses, so that players have a chance to ask questions or interject actions. In a LARP, Plot seldom has control over the action, short of large-scale paralysis or mind control (and it's okay to use these, as long as you use them with the greatest care). Once the PCs have control of the action again, they have a lot more options than just pressing A to not die - you can try to narrow their options, but they're all but obligated to resist that narrowing and look for a more favorable resolution to whatever dilemma has come out of the staged scene. In general I favor presenting nuanced problems and working out some contingencies, while avoiding a specific plan for how things will go.
This is a long piece of rambling, but what I'm trying to get across is that a staged scene with centralized plotting is not the enemy. It is hard to do well; if you're not supremely confident of your ability, don't use staged scenes. Players enjoy the feeling of agency, so the fast path to providing a fun environment is to avoid anything that reduces their collective control. But fear, terror, and dread are also fun emotions to experience in the safe space of a game, and they're very difficult to evoke without curtailing player agency. I think that excellent staging is a dying art within gaming specifically because it's the harder path of good game-running, and because it's a trap for the inexperienced. I've been a PC in enough great staged scenes (in tabletop and live-action) to know that it can be done by the masters of the craft, and it's an extraordinary pleasure to watch them work. Those moments seared themselves into my memory.