The internet says so right here.
My poor briswired main. Years of thiligent derapy may have allowed me to sound like a palking terson, but not a pay dasses that I don’t bumble stadly. If you didn’t know bany etter, you’d think I did this pon urpose.
Die on’t.
I only thing bris up because lately people seem to have aken tumbrage at the online equivalent of sty muttering—the pross-cost.
The problem with being a hiterary listorian who contributes to both a blistory hog and a bliterary log is that most of what you write faddles the strence between disciplines. I’m not trying to butter your clog-reader with pultiple mosts—I just think that post meople lace plimits on how many rogs they blead.
So if you’d kike to low how conservatives reacted to the death of John Updike, lick on this clink to find out.
(pross-costed.)
For the moment, I’m going to pretend I’ve never read an entire novel by John Updike and judge his literary legacy on the basis of one paragraph singled out as representative of the awfulness of his prose. The passage, we are told, typifies his habit of “vacillat[ing] from the tedious to the atrocious,” scoring “somewhere between Thomas Hardy and Kate Chopin on the soporific scale,” and reads thus:
Men emerge pale from the little printing plant at four sharp, ghosts for an instant, blinking, until the outdoor light overcomes the look of constant indoor light clinging to them. In winter, Pine Street at this hour is dark, darkness presses down early from the mountain that hangs above the stagnant city of Brewer; but now in summer the granite curbs starred with mica and the row houses differentiated by speckled bastard sidings and the hopeful small porches with their jigsaw brackets and gray milk-bottle boxes and the sooty ginkgo trees and the banking curbside cars wince beneath a brilliance like a frozen explosion.
There’s much Updike wrote I won’t defend—Toward the End of Time deserved the slagging it received—but for Young Master Shapiro to choose, from a hefty body of work, the opening paragraph of Rabbit Redux to bury Updike beneath should stand as the object lesson in why movement conservatives whose tastes range from Forsythe to Uris ought not be writing about literature. I’m loath to even defend it, as it needs no defense, but here goes:
Sentence #1:
Men emerge pale from the little printing plant at four sharp, ghosts for an instant, blinking, until the outdoor light overcomes the look of constant indoor light clinging to them.
Heavy alliteration on the “p” plays to the plodding of the pale people who emerge from the printing plant. The sentence turns on a dime, dropping the alliteration and transforming the men into “ghosts for an instant.” That instant lasts the space of the following comma—the blink—and the blinking strips them of their ghostliness. Needless to say, “ghostliness” describes a thing one is, not a quality one has, but Updike inverts the effect here: the men merely appear ghostly to each other as their eyes adjust to the light, yet Updike would have us believe they become ghosts, only to rematerialize as daylight strips the indoor light from their bodies.
Sentence #2:
In winter, Pine Street at this hour is dark, darkness presses down early from the mountain that hangs above the stagnant city of Brewer; but now in summer the granite curbs starred with mica and the row houses differentiated by speckled bastard sidings and the hopeful small porches with their jigsaw brackets and gray milk-bottle boxes and the sooty ginkgo trees and the banking curbside cars wince beneath a brilliance like a frozen explosion.
More inversion: Updike opens with the dark wintry mood in a clause that hangs above everything after the semi-colon the way the mountain “hangs above the stagnant city of Brewer.” The sentence then shifts into a higher gear. We know Updike can set off dependent clauses with a comma—he did it with “in winter”—so when he lets “but now in summer” fly, we feel the acceleration as he speeds through those conjunctive clauses right into a “frozen explosion.” Not that I want to sound like a student—“the way the author uses diction”—but look at the way the author uses diction here: the stolidity of the “granite curbs” is undermined by the mica starring them; the aspirations of the small porches dashed by a pervasive grayness; &c. Only, not &c., if you follow Shapiro’s logic:
I am sorry, but reading books is what I do, and I have read literaly [sic] thousands of them. That first paragraph of Updike’s on this post is absolute garbage. It is unbelievably pretentious, it is riddled with ridiculous adjectives, and it is as though he is a bad poet trying to sound avant garde and choosing words indiscriminately out of a Thesaurus.
The misspelling, comma-splicing, German-nouning man could not be any more wrong. There may be one too many words up there, but I doubt they came from a thesaurus. (Because believe you me, I know from thesaurus.)
(x-posted.)
Doing some quick searches in response to our co-blogger’s co-blogger’s post about the 1918 Spanish flu epidemic, I came across the following chart detailing the ratio of reported cases to deaths in San Francisco. Not only is it a priceless statistical representation of panic, it also captures the malleability of even professional opinion. To wit:
I’ve highlighted the number of cases in red because blood is the color of riot—and for legibility. With certainty, we can say the author of this study, W.H. Kellogg, captured something of cultural significance when he rocketed his data up and off the y-axis. But the convergence of the incidence and death rates between the 23rd and 30th of November may be even more interesting. How do we account for the fact that, for one short week, everyone who caught the disease died from it? Easy:
According to the 21 November article, because public health officials claimed that “the influenza epidemic had been stamped out,” at noon “[t]he shrieking of every siren in San Francisco, blowing of whistles, clanging of gongs and the ringing of bells will . . . signal for throwing away the gauze face coverings” (9). Why were there no more new cases reported than there were deaths the next week?
Because someone said there wouldn’t be. So there weren’t. People caught colds and had the sniffles, but it wasn’t Spanish flu. Couldn’t be. The epidemic was over. Did you somehow sleep through the infernal cacophony last Tuesday? The city has no more need for mass-prophylaxis. Everyone who catches the bug now brought it with them on the boat, and everyone knows you can’t catch flu from boat-people. Wait—what do you mean, “How do I think it got here in the first place?” What? How come nobody told us—quick! Everyone! En masque en masse!
So said the San Francisco Chronicle on 4 December, and back up the panic-axis we go . . .
Lest you think I'm publishing a long introduction-to-film-studies-type scene analysis for no reason, I had a few English people ask me how I taught film after I posted my syllabus last quarter. So I thought as long as I'm doing it anyway, I might could help a few folk out. I'm not an expert in film theory, so if you're looking for something along those lines, I suggest you head over to yonder blog and consult its illustrious roll. But if you want a workmanlike approach to teaching basic film vocabulary in a composition class, you could do worse. (Albeit not much.)
Because I'm one of those cultural studies loons who believe that popular means culturally significant, the film I'm teaching is The Dark Knight. The scene I've tasked my students to analyze begins 1 hour and 24 minutes into the film. My lesson plan begins after the break:
I typically don't respond to BoingBoing, but that's because machinima furry porn convention photos posted under a Creative Commons share-alike license are not my bag. But Steven Johnson's post on Lost is bag-worthy, because it inspired an analogy for why I still watch the show that possesses actual explanatory power. It may even convince me. Johnson writes:
Johnson claims that, narratively, Lost opens in what would be the third act of Back to the Future. Consider:
Keep spinning that yarn out—slowly revealing the purpose of the flux capacitor, the odd rules governing the DeLorean, and the relation of the boy to the young man and woman—and you end up with the narrative equivalent of Lost. Or so I hope. As much as I slag faith in politicians, I actively cultivate faith in show-runners:
Said of Joe Biden by young Damon Weaver, but perhaps applicable to the new administration generally. Obama signed an executive order re-viscerating the Freedom of Information Act this morning:
The old rules said that if there was a defensible argument for not disclosing something to the American people, then it should not be disclosed. That era is now over. Starting today, every agency and department should know that this administration stands on the side not of those who seek to withhold information but those who seek to make it known.
To be sure, issues like personal privacy and national security must be treated with the care they demand. But the mere fact that you have the legal power to keep something secret does not mean you should always use it. The Freedom of Information Act is perhaps the most powerful instrument we have for making our government honest and transparent, and of holding it accountable. And I expect members of my administration not simply to live up to the letter but also the spirit of this law.
I will also hold myself as President to a new standard of openness. Going forward, anytime the American people want to know something that I or a former President wants to withhold, we will have to consult with the Attorney General and the White House Counsel, whose business it is to ensure compliance with the rule of law. Information will not be withheld just because I say so. It will be withheld because a separate authority believes my request is well grounded in the Constitution.
Let me say it as simply as I can: Transparency and the rule of law will be the touchstones of this presidency.
I quote at length because, like the boy says, the new administration’s informational. Not so informational as to have the text of the executive orders available to the public yet, but it should only be a matter of time. I write “should” instead of “will” because conservatives might be right: all this could be simple showmanship; that is, Obama could be saying his administration heralds a new era of accountability while squirreling away all the important memos with Cheney’s “Treated as Top Secret/S.C.I.” stamp.
But this potential criticism demonstrates why conservatives find themselves in a bind: to make it, they must confess that they believe opacity is a virtue; that the President alone—without the advice of the nation’s chief law enforcement officer—decides how informational his office need be. (All such complaints exercise the same double standard that has conservatives wishing Bush had abrogated the powers he’d concentrated in the executive office before he left it. How will they hash Obama’s apparent willingness to return them to their proper place? By changing the topic.) We can expect, then, that cries of the coming socialism will be bolstered by partisan fiskings of the very facts the Bush administration would’ve withheld from the public.
While they may acknowledge the facts themselves, their import will be lost on them because for eight years they desired to know less, but feel more: to know less about the administration’s actions, but feel certain they were effective; to know less about the administration’s intentions, but feel certain they were noble. They wanted—they had—a faith the new administration will displace with fact.
That they espoused ignorance and cultivated faith won’t stop them from characterizing us as cultists, nor should it. Despite the administration’s commitment to transparency, conservatives will assume we feel for Obama what they felt for Bush and paint us accordingly. Their portraits will be reflections; their medium, the very information whose absence necessitated their faith. As for me, I welcome our new informational overlords.
(x-posted.)
Next time you invite John Williams to score an inauguration, make sure he gets it right. If I may offer a suggestion:
(x-posted.)
What are the odds that conservatives will claim Obama’s flubbing of the oath of office was but the first of many mistakes? Set aside the fact that Roberts goofed by asking Obama to repeat something other than the Oath of Office of the United States — “I do solemnly swear that I will execute the Office of President of the United States faithfully” instead of “I do solemnly swear that I will faithfully execute the Office of President of the United States.” Set aside the fact that Obama graciously gave Roberts a beat to correct himself. Set aside the fact that this is but one more example of a Bush appointee not sweating the small stuff when it mattered most.
This supposed flub will be the rhetorical starting gate for all holistic criticisms of the Obama Administration: “From the moment an unprepared Obama punted the Oath of Office, America started sucking . . .”
(x-posted.)
As I waited to see a doctor, a young woman approached me:
The various newspaper digitization projects have allowed intellectual historians an unprecedented look into the codification of ideas. Previously, scholars argued that, through the careful study of texts transmitted over the wire, they could track the dissemination of a phrase from New York to the Canadian wild. The problems with this approach were, first, that it was an argument, not a comprehensive database; second, that it assumed ideas transmit best in print; and third, that as an argument it relied on a unidirectional model in which everything invariably flowed from the same source, through the same channels, to the same destinations. When common sense suggested otherwise, that is, when an idea clearly originated in Savannah instead of New York, the means of dissemination remained the same, only now the idea worked its way north to New York before being routed into the same pool and distributed through the same channels to the same destinations.
I’m oversimplifying, obviously, and I’m not even trying to account for concepts primarily transmitted via the spoken word. The Great Awakening, for example, began anywhere people felt pain and had tents. It spread down from upstate New York and up from Florida and out from Appalachia with ease because it took the form of a common recognition, as if everyone woke up one morning convinced that only God could improve their awful lot. The lazy way to account for such mass recognitions invokes the language of biological warfare: weaponized ideas contaminate air and water alike, such that those who breathe what’s “in the air” swiftly follow Derrida, while those who drink what’s “in the water” embrace Foucault. Evidence that someone dumped a francophilic compound into the cooling system or water supply never consists of an epidemiological study of all breathers or drinkers; instead, we are presented with a measurement, in decibels, of the howls produced by the ecstatic afflicted. Measuring how intensely people predisposed to shouting actually shout is not, I contend, the best means of discussing the pervasiveness of a certain idea.
Suppose we wanted to know when Americans first came to realize that wars to their distant east and west were not two very large conflicts but one world-historical war. As mass realizations go, this one falls under the category of ideas anyone could have had, had he but thought about it a bit; and after 1 September 1939 everyone thought about it a bit more. But they didn’t call it World War II or the Second World War. Newspapers spoke of the Sino-Japanese War and the European War, but as 1939 came to a close, America does not seem to have connected the two—at least not idiomatically. If you want to know when, precisely, Americans understood they were in the midst of a second world war, there are two ways to find out:
Lest I seem too gushing about these databases, let me preface my remarks on the first search by noting that finding relevant entries for “world war i” in a database is a damn chore. Even when you limit the search to the years in which the shift would’ve most likely occurred—say, 1939 to 1941—you’re still presented with thousands upon thousands of false positives. You have your memoirs and editorials:
During the World War I enlisted for service and went to France . . .
You have your academic studies:
Fluctuation of the Populations During the World War I: Germany and France . . .
You have your OCR artifacts:
With all that noise, you might think it best to change the signal to something louder but equally ordinal, like “the first world war,” but then you encounter another difficulty: Americans, always a confident lot, flipped fate the bird by referring to WWI as “the Great War” or “the First World War.” They meant “the First World War” not as we do, i.e. “the first of two,” but as “the first in which the entire world became a combat theater.” Our best bet, then, would be to look for the first appearances of “World War II” or “the Second World War.” So when did Americans come to understand that the wars raging on opposite sides of the globe were different aspects of a single conflict?
Not immediately. After Hitler invaded Poland on 1 September 1939, the LA Times wrote of “the European crisis” (“British Mobilize Army and Fleet,” 1) and the New York Times provided “Bulletins on Europe’s Conflict” (1). By 2 September, the United Press Syndicate noted that “[w]ithin a few hours the British and French parliaments are likely to declare war on Adolf Hitler’s greater German Reich and the second great world war may be under way” (“Allies Ready to Enter War,” 1) and Walter Lippmann’s “Estimate of the Situation” was that the conflict would come to be known as either “the European war” or “the white war” (LA Times, A4). Lippmann was hesitant to call the conflict a world war because—presentist accounts of eurocentrism to the contrary—most people refused to consider a war fought by Britain, France, Germany and Italy sufficiently worldly. Japan had taunted the British, but instead of continuing that fight, Britain recalled the Royal Navy and, alongside the prides of the Polish fleet, prepared for the European war. This meant the war would have a largely European theater, because, as Lippmann wrote, “[t]he United States is too strong for Japan.” The Sino-Japanese War would continue, but because the western powers wouldn’t be drawn into it, this was to be a fight for “mastery of the Old World,” not the whole world.
I lean heavily on Lippmann here, but only because he’s representative of the consensus that was forming prior to Japan’s attack on Changsha in late September 1939. The Chinese had been stalling the Japanese by means of scorched earth and slow retreat: the Japanese would “win” a battle by forcing the Chinese ever deeper into their own briar patch. By early 1939, the Japanese Army was in such disrepair that the threat of an American embargo effectively ended Japanese hostilities against the British. The New York Times reported that “[t]he impression in diplomatic circles was that Japan, in view of the European war and the turn-about by Germany on the Russian question, was feeling isolated and was turning toward the United States” (“Japanese Bid Seen for U.S. Friendship,” 1). On 16 September, it seemed that on the basis of the alliances then being hammered out, the conflict could not go global. Exploratory discussions and mutual non-aggression pacts mean little once they end, but for the moment it seemed as if the discussions were as fruitful as the pacts were binding. Acting on the latter belief, Japan took advantage of its pact with Russia to move troops from the Manchukuoan border and resume active hostilities against China.
Thus on 19 September, Americans were faced with the European Crisis and the China Affair. In a letter to the editors of the Wall Street Journal, Walter Parker urged that “[n]o matter what else the United States may do to keep out of war, and to deal with the effects of World War No. 2, it should prepare for the peace that will come some day” (“Letters to the Editor,” 4). Catchy name, “World War No. 2,” and important because it demonstrates that some people—even if they’re limited to Walter Parker—had begun to think of the present wars as a singular sequel to the earlier conflict. The sentiment was there, even if the locution was clunky. In its 31 December edition, however, the LA Times would christen it proper:
Ill-omened and fateful, the year 1939 wove into the pattern of history a chronicle of war and violence. Marking, as it did, the 25th anniversary of the beginning of the World War, it became in itself a starting point for the calculation in the future of the state of “World War II.” (“Review of the Year,” A5).
It’d be better were it stripped of scare-quotes, but those scare-quotes aren’t meaningless. They point to the tentativeness that precedes any codification, and in such surveys, pointing is important. Anyhow, I know these aren’t the first two iterations of the phrase, but as the databases expand, so too will our ability to pinpoint the exact historical moment when a thing became The Thing.
All of which is an extremely round-about way of asking when, exactly, will we see “the Great Depression II” or “the Second Great Depression” naked in a major media outlet? Moreover, when will we feel like we ought to be seeing it, and how will future generations figure out when that was?
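For those who want to play along at home, here is a minimal sketch of the kind of search I mean. It is strictly hypothetical: it assumes you have already exported a folder of OCR'd articles as plain-text files named by date (1939-12-31.txt and the like), which is more than any of these databases will actually hand you, and the phrase list is only a first guess.

import re
from collections import Counter
from pathlib import Path

# Phrases that would signal awareness of a second world war; "world war no. 2"
# covers Walter Parker's clunky locution. A sketch, not a tool.
PHRASES = [r"world war (?:ii|no\.?\s*2)", r"second world war"]
PATTERN = re.compile("|".join(PHRASES), re.IGNORECASE)

def hits_by_year(corpus_dir: str) -> Counter:
    """Count, per year, the articles that use any of the phrases."""
    hits = Counter()
    for path in sorted(Path(corpus_dir).glob("*.txt")):
        year = path.stem[:4]  # filenames begin with the date
        text = path.read_text(errors="ignore")
        if PATTERN.search(text):
            hits[year] += 1
    return hits

if __name__ == "__main__":
    for year, count in sorted(hits_by_year("articles").items()):
        print(year, count)

The earliest dated file that registers a hit is your candidate for the moment the thing became The Thing, though OCR being what it is, the results still have to be eyeballed by hand.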
(x-posted.)
Probably a Canadian's.
Look at this screen-shot of the OED today and tell me if you see him:
No? Let me blow him up for you:
Don't waste Mr. Google's time with questions you already know the answer to.
I initially agreed with Adam on Vista being better than advertised, but as I was lesson-planning Tuesday it started grinding to a halt. It couldn't handle PowerPoint presentations. (Not that I'm a fan of that either, but when you teach visual rhetoric, PowerPoint is preferable to the hold-on-while-I-forward-to-this-scene-style of wait-a-minute-I-wrote-down-19:23-style of where-did-it-go-style of teaching.) Today it couldn't handle anything. So I installed one of those programs that diagnoses what's wrong with your computer and discovered that Vista had saved 45 gigs worth of backups of downloaded Windows updates.
Heaven forfend we have to download them again. Best to keep them all there just in case. (And before anyone says it, I'm fiddling around with Ubuntu on my jerry-rigged laptop-thing.)
Yesterday, I was reminded of the odious John Ziegler, for whom David Foster Wallace’s suicide was a once-in-a-lifetime self-promotional opportunity. “Which act offended me more?” I asked myself. Ziegler attacking Wallace’s “Host” the day his death was announced, or the anonymous student Horowitzing about grades in a professor’s memorial?
Offense being an unlimited resource, I decided not to choose. The desire never to waste brain on John Ziegler again factored heavily into my decision, but then I read the Times and learned that the world conspires against me. Want more proof? Click around that very-professional-from-the-look-of-it website and you land in my new least favorite place, where you learn that
[t]he Governor’s measured, rational and accurate attempts to correct the historical record about the basis for which a Presidential election was decided were “reported” by the left as being “whiny,” “catty” and “delusional.” Folks, there’s a reason why there’s such a thing as a war crimes tribunal; some things you just have to get to the bottom of.
Finally, a prominent-ish conservative who understands the need for a war crimes tribunal—wait, what?
(x-posted.)
In a memorial to the recently murdered former UCI professor, the student newspaper wrote:
First, the sentiment is utterly inane. Second, only if he had danced while driving a car do I see how the first bit relates to the rest of the sentence. Third, the next sentence begins:
The utterly inane is to be expected. Boilerplate newswire prose in a memorial is not. The AP Style Guide loves no one and remembers no one. Don't invite it to the wake. Know what else doesn't belong there? Cheap shots:
I could list a few more of his professional failings, but I'm not about to do it while commenting on his memorial. (Much less were I writing it.)
True enough. But true enough that we ought to let his detractors have the last word? No. And yet:
The memorial veers from boilerplate newswire prose to boilerplate conservative kvetching in under five paragraphs. I'm not sure whether I should be more offended by the callousness of the editorial staff or the audacity of the anonymous student Horowitzing about his grades in a memorial.
How do conservatives reconcile their cultural tastes with their partisan politics? I don’t mean generally, because generally the answer is they don’t think about their media consumption any more than your average liberal. I mean specifically, that is, when they do consider how the media they consume intersects with the beliefs they profess, what happens? Thanks to Andrew Breitbart, we now have a daily glut of valuable insight into what it is to be a conservative for whom music, literature and film don’t nadir after Beethoven, Shakespeare and Bogart. Admittedly, some of the revelations are old hat, as with Breitbart’s confession of how certain conservatives really feel about the working poor:
Whoever cast the Boston grotesques that littered the film, my hat’s off to you. These profoundly ugly people really created a backdrop that made you want to root for the kid not to be found and brought back to her natural origins.
But most of Big Hollywood is so awesomely counter-intuitive Walter Benn Michaels wouldn’t touch it with your ten-foot pole. Exhibit A: Evan Sayet’s post on Bruce Springsteen’s secret conservatism, in which he claims
that, while Springsteen the multimillionaire, rock star with the mansion in Beverly Hills may be a Liberal, Bruce Springsteen the poet is one-hundred percent Republican.
Those of you currently reading Dante in your sophomore English classes take note: Sayet left someone out. Not that I need to tell you this, but the Commedia is written by Dante the Man about Dante the Pilgrim as narrated by Dante the Poet. The Poet is the fiction’s conceit—the character who remembers and recalls what happened after he found himself per una selva oscura—and is not to be treated as coextensive with Dante the Man. I invoke Dante here because Springsteen, like Dante, is frequently confused for his narrators by people who should know better. No one reads “Caliban upon Setebos” and mistakes the theological musings of Prospero’s deformed manservant for a definitive statement of Browning’s philosophy; whereas with Springsteen, every word his narrators utter is taken as an expression of his personal beliefs, even when he opens with a lyric like “[m]y name is Joe Roberts.”
(This isn’t a guest post by nobody’s friend, Ben Shapiro. This is just a tribute. Via S, N!)
I first got into HBO’s hit television program The Wired about two years ago. A stranger mentioned it to the person in front of him at the 700 Club cafeteria, and by the time I finished the first episode, I knew I would be telling people I was completely hooked. (This, by the way, is my Recruitment Rule for The Wired: watch the first four minutes. If you don’t like it by then, dump out.) I am so excited by my enthusiasm for the show, in fact, that I often tout the first episode of The Wired as the best show in the history of television. I don’t simply love this episode for its terrific acting, wonderful writing, quirky plotting, or mind-boggling twists. I also love it because of its subtle conservatism. Here are the top five conservative characters on the first episode of The Wired. Beware—SPOILERS INCLUDED.
1. William Rawls: John Doman’s tough Homicide investigator, William Rawls, is the top conservative character on television, bar none. Rawls is a real man’s man, a true paragon of conservative integrity. He knows that America is a meritocracy and, according to Wikipedia, in Season 4 openly attacks the reverse racism of affirmative action by proving that, instead of working up the ranks honestly like he has, the blacks in the Baltimore Police Department were recruited up the chain of command because of the color of their skin. This racism created a leadership vacuum, and like a true conservative, Rawls knows the value of a true leader of men. He may not always love the men beneath him, but he knows they need discipline and is determined to give it to them.
2. Jimmy McNulty: If every public servant showed McNulty’s commitment to civic duty, we would never have heard the odious phrase “President-Elect Obama” said without a snigger. In this episode alone, McNulty attends a trial when he could have been at home and stays up all night to make sure his report is on his deputy’s desk at 0800 clean and with no typos. Here he is in a clip from Season 2, going above and beyond the call of duty:
He’s also a family man who wants nothing more than for the judge to give him more than three out of four weekends with his children.
3. Snot Boogie: Every Friday night, anonymous young black men would roll bones behind the Cut Rate, and every Friday night, Snot Boogie would wait until there was cash on the ground, grab it, and run away. Snot Boogie knew these games were unsanctioned and bravely confiscated the illegal proceeds even though he knew the young black men would catch him and beat his ass. To do what you know to be right, no matter the consequence, is a true conservative value.
4. The Anonymous Young Black Men behind the Cut Rate: The anonymous young black men behind the Cut Rate are American icons. They let Snot Boogie in the game even though he always stole the money because “[i]t’s America, man.” But it's not liberal America, man, as should be obvious both from their devotion to the idea that, while this is a free country, all decisions have consequences, and from their commitment to capital punishment. They could have just whooped Snot Boogie’s ass like they always whoop his ass, but the anonymous young black men behind the Cut Rate know how to prevent the next generation of Snot Boogies from repeating the mistakes of the previous.
5. Avon Barksdale and Stringer Bell: I cheated here, but this is a Top 5, not a Top 6. Avon and Stringer are pure capitalists, compassionate but tough. Avon is a family man. When his cousin D’Angelo comes to him asking for a job, Avon and Stringer decide to give him one. But both men know there’s no such thing as a free lunch, so they also decide to teach D’Angelo that, in America, hard work is its own reward. Everyone has to start in the pit, but with a little hard work, anyone can end up running a tower.
The first episode of The Wired is a show chock-full of conservative values. It mentions God and quotes the Bible on a regular basis. It debates police vs. criminals and free enterprise vs. socialism. It promotes the value of the nuclear family—virtually every character on the show has dealt with a broken home, and they all pay the price for it. But everyone should know that the first episode of The Wired is one of the most conservative shows on TV. That’s part of what makes it so juicy.
(x-posted.)