ETA: I told myself several times yesterday to remember to mention this when the Souciant piece went live, then forgot about it today. This piece wouldn't have happened without Charlie Bertsch and his excellent editing skills. Or, if it had happened, it would have been half as long, one-quarter as interesting, and would really have been a rehash of the blog posts.
Last week, Alison Krauss turned 40. Because she was a child prodigy, I always thought of her as about the same age as, say, Anna Paquin. She was playing violin by the age of 5, and Wikipedia tells us she already had her own band when she was 10. She made her first recorded appearance at 14, and released her solo debut when she was 16. (She is often backed by Union Station, although her contract for a long time stipulated she alternate between Union Station albums and solo work, which seemed to matter more for the name on the cover than it did for the music.) Her fiddle playing may have been the hook that got people’s attention, but it was her voice of crystal that kept us around.
Her popularity grew, twice leaping forward. In 1995, she released Now That I’ve Found You, an anthology of her work to that date (it’s a sign of how early she started that her well-deserved collection came when she was only 24). It was a country hit, and it crossed over into pop. In the early 2000s, her work on the O Brother, Where Art Thou? soundtrack raised her profile even more.
If there is a criticism of her later work, it’s that there is a consistency, a sameness to it. It’s not exactly a formula, though. And throughout her career, she has worked with other artists, which adds variety to her entire body of work. (This was obvious most recently in her successful partnership with Robert Plant.)
I’m thinking about Alison Krauss because of Pandora. I created a Pandora station for my wife that we listen to on weekend mornings. I worked obsessively on this station, and I have to say, a month might go by before a song comes on she doesn’t like. Pandora has some rule about the number of times you’ll hear a particular artist. But sometimes I think I should rename my wife’s station to “Alison Krauss.” I’m just guessing, but it seems like Pandora goes by the artist’s name when deciding how often they can be played. Well, Krauss records under her own name. She records as Alison Krauss and Union Station. She cut an album with Robert Plant. One album was by “Alison Krauss and the Cox Family.” On the O Brother soundtrack, she is listed three different ways, depending on who she is singing with.
The point is, in an hour of Pandora listening, we might hear Alison Krauss five times. And this is a good station, diverse in line with my wife’s taste. But Krauss’ work is spread across so many artist names that she tricks Pandora.
Finally, here’s a fact known to her fans, but perhaps surprising to those who haven’t fallen under her spell. Think of all the great female artists in recorded history. Aretha comes to mind, or maybe you like Barbra Streisand, or Diana Ross and the Supremes, or Patti Smith, or Pink. Well, Alison Krauss has won more Grammys than any of them. In fact, her 26 Grammys so far are more than any other female artist, ever.
“When You Say Nothing at All” was a bit of an anomaly when it was first released. Keith Whitley had recorded it in 1988, and took it to the top of the country charts. Whitley died young, and in 1994, Krauss and Union Station recorded a version of the song for a Keith Whitley tribute album. Radio stations picked up on it, and it was released as a single in early 1995, then included on the Now That I’ve Found You anthology. It was a hit, and won a CMA award for Single of the Year. I imagine she’s gotten a little tired of singing it over the years, but truth is, I’ve never heard any bad ones. She sang it on TV, she sang it in concerts, she sang it for her live CD/DVD, she sang it when I finally got around to seeing her in concert. And I’m going to listen to it again as I dig up the following YouTube versions (admittedly, they all sound pretty much the same, i.e. great, so there’s no reason to watch them all, other than to note when Krauss changed her hair color). Here’s the official video:
Here she is with the band on Austin City Limits … hair is blond, but she’s still downplaying the glamour:
This is from their live DVD, and is in HD. She talks at the end of this one:
And if you like those ghoulish tracks where “duets” are created between a dead person and a living one, here’s Krauss and Keith Whitley:
Finally, as a reward for anyone who actually made it this far, here’s a real old TV appearance with Krauss playing the fiddle:
The basic structure of Google+ continues to be enticing and marginally useful. I say “marginally” because, despite an impressive growth in the number of G+ users during this “field trial,” most people I know are either not participating yet in Google+, or are still posting the vast majority of their online output on Facebook or Twitter. I haven’t changed my mind about the pleasures of using Google+, nor have I changed my opinion that what a service like this needs is more users.
Meanwhile, though, Google has its own ideas about which users it wants. Specifically, Google is not happy with the use of pseudonyms, and has closed the accounts of people it finds who aren’t using their real names. Two examples:
Dr. Kiki is an established online presence, a neurophysiologist who has moved from academia to science journalism. She makes no effort to hide her real name (Kirsten Sanford), but it is as Dr. Kiki that most of us know her and her work. She has a weekly netcast on TWiT, “Dr. Kiki’s Science Hour” (TWiT is not only one of the most popular producers of tech-related netcasts, but has spent a lot of time the last month talking about Google+, largely on “This Week in Google,” with that talk being objective but very positive). Yesterday, Dr. Kiki made a guest appearance on TWiG, explaining that Google had cut off her G+ account. The story was a bit complicated … I wasn’t sure if she was canned for simply using a pseudonym, or for having “Dr.” in her name (apparently that isn’t allowed, either). The point is that someone clearly not trying to scam anyone, someone with a following, someone who appears on a popular online network, was booted from Google+ because her real name isn’t Dr. Kiki.
A friend of mine has had a pseudonymous presence online for as long as I have known her, and that’s a long time. She does a good enough job of keeping her “real” life and her pseudonymous life separate … I don’t think I realized they were the same person for quite awhile. She has good reasons for using a pseudonym … as I recall, these reasons have changed over time, and at this point, many of her online companions know her by her pseudonym. She climbed on the Google+ bandwagon, and I looked forward to seeing her there. But she didn’t last long, deciding to preemptively remove herself from G+ before her pseudonym led Google to eject her. I have more than 300 people in my G+ circles, but only a couple of dozen of them are actual friends (most are semi-famous techies of interest). Now I have one less G+ friend.
There are reasons for requiring real names in online environments. Pseudonymous comments sections often devolve into useless bastions of vicious screeds (I’ve given up reading comments on SF Gate, for example). But there are important reasons for maintaining a pseudonym as well, and thus far Google seems unwilling to judge users by their individual behavior rather than by the name they choose to use.
On a different note, here’s a good, short article on being a newcomer to Google+. It’s written by someone with extensive online experience; she’s not a newbie to the tech world. And she likes what she’s seen so far on G+. But she also finds it is still a bit intimidating: “Stumbling through Google+ with two left feet.” (It is perhaps noteworthy in the above context that the author is Amber MacArthur, a respected, veteran Canadian reporter and online personality who goes by the pseudonym “Amber Mac.” That is the name she uses in the above-mentioned article, which appears in The Globe and Mail, and is the name she uses on Google+.)
I’ve been all over Google+ the last few weeks, but events in baseball today have reminded me that Twitter is the place to go for breaking news (and this holds true for more than sports, of course).
The Giants are on the verge of a huge trade. Details are unknown, although speculation is rampant. I wanted to follow this story as closely as I could, and that meant one thing: Twitter. No one from the team is talking, and fans are merely jabbering about what little we know. But we get that information from the writers who are on the story. They are the ones keeping us up to date. It goes without saying in 2011, I guess, but there is no need to wait for tomorrow’s morning paper, and I bet even the guys on the Giants’ radio station are mostly just repeating what they are reading on Twitter.
The sources for the story are interesting, too. Someone with a marginal connection to someone else in the Giants organization knows about so-and-so’s flight schedule. The kind of source that needs to be checked and double-checked, but in the meantime, it sits there for us to peruse.
And in the middle of all of this, a tweet comes through that Ervin Santana has a no-hitter through 8 innings. So I turn on the MLB Network and watch the historic 9th inning (Santana did it).
Google+ would be a good place to have a hangout where a bunch of us could have a group video chat about the above (although I don’t have enough friends online with G+ who want to do that, so it’s kind of irrelevant to me). It would be a good place to have a discussion about everything. But Twitter is where the news is breaking.
I watched the video of the Giants’ appearance at the White House today. A day doesn’t go by lately without me grumbling under my breath about what the President is up to. But seeing the World Champs, watching the President give his joshing, casual speech, seeing Willie Mays sitting behind Obama, damn if I didn’t get choked up for approximately the 8,234th time since last November 1.
I think my favorite part might have been when President Obama was shaking everyone’s hand, and he got to Mike Murphy. Mike Murphy shaking hands with the President of the United States at the White House!
I know it’s for show, I know tomorrow I’ll return to muttering about the President. But today, for 20 minutes or so, we all got a break.
Wagon Master (John Ford, 1950). I must say, I find the appearance of this film on best-of lists to be surprising. I could see some John Ford fan appreciating this mellow entry from the old man (OK, that’s cheating, he was younger when he made this than I am now), but that’s about it. A lot of the praise for the film comes from the way Ford gives the screen over to actors who were usually second-tier … the absence of John Wayne meant Ben Johnson, Ward Bond, and (to a less enjoyable extent) Harry Carey Jr. had nice moments in the sun. But Joanne Dru is wasted; Red River came only two years earlier, but Dru’s feistiness is buried here (Ford doesn’t give us “Hawksian” women). The movie is slow, largely gentle, doesn’t demonize the Indians for a change, and is over in 86 minutes. A genial time-passer, but nothing more. #625 on the They Shoot Pictures, Don’t They list of the top 1000 films of all time, which is a bit of a head-scratcher. 6/10.
Rango (Gore Verbinski, 2011). An animated western from Industrial Light & Magic and the guy who directed all of those crappy Pirates of the Caribbean movies. It’s a busy movie, nice to look at, and geared towards an adult audience. Roger Ebert, who gave it his highest rating, compared it favorably to another western he loves, Blazing Saddles, and that might be my problem right there, because I don’t much like Blazing Saddles. As is often the case in contemporary movies, homage is paid to various films of the past, from Fear and Loathing in Las Vegas to Apocalypse Now, from Sergio Leone westerns to Chinatown (Ned Beatty does John Huston about as well as Daniel Day-Lewis did in There Will Be Blood). All of this allows for an entertaining parlor game, “Hey, I saw that movie!” 6/10.
The Passion of Joan of Arc (Carl Dreyer, 1928). #15 on my Facebook Fave Fifty list. #20 on the TSPDT list. 10/10.
The Earrings of Madame de … (Max Ophüls, 1953). #14 on my Facebook list, #85 on the TSPDT list. 10/10.
This week's song is inspired by a video I saw earlier in the week. I don't remember who had it first; I have a feeling it's viral and everyone has already seen it except me. It's so good, though, I don't mind being a johnny-come-lately.
“Black Dog” was the lead track from Led Zeppelin IV (or whatever you want to call it), arguably the greatest album from the greatest metal band ever. Of course, being Led Zeppelin, there’s a lot more than metal going on, with their folk roots showing and their blues spooky and magnificent. I’m guilty of ascribing all of the monster Zep riffs to Jimmy Page, who could seemingly pull them out of his ass at will, but apparently it was John Paul Jones who came up with this one. In true Zeppelinese, Jones said the intricate rhythms were intended to make the song impossible to dance to.
Led Zeppelin have been gone for decades now, yet they retain the ability to inspire an audience. This video shows what I mean … the vast majority of people here were likely not born when Led Zeppelin broke up:
Last week, I noted that educated guesses had the number of Google+ users at around 10 million. The same person thinks it’s up to around 18 million now. Still a paltry sum compared to Facebook, but the increase is notable. All of my siblings are in, and the number of friends and acquaintances is up to around 40. Many of them are starting to post on G+, and I even just finished a brief hangout with two others. It’s all about baby steps, but it’s a lot different than even two weeks ago, when I spent most of my time reading posts by Scoble.
I’ve opened a second G+ account tied to the Gmail account I use for school, and so far, I’ve been able to make it work (although I use a different browser to help separate my two selves). I’m not sure yet how I’ll use it in the classroom; it will most certainly be an option for office hours and the like, but I don’t think at this early stage in the life of G+ that I can require my students to use it. Meanwhile, Liz Dwyer has a few words on the topic: “Why Google+ Is an Education Game Changer.”
Edd Dumbill has written the most interesting Plus-related piece of the week, “Google+ Is the Social Backbone.” He doesn’t waste time getting to the point:
The launch of Google+ is the beginning of a fundamental change on the web. A change that will tear down silos, empower users and create opportunities to take software and collaboration to new levels.
Social features will become pervasive, and fundamental to our interaction with networked services. Collaboration from within applications will be as natural to us as searching for answers on the web is today.
He asks the question, “Why not Facebook?”, and follows it with “Why Google+?”.
Facebook's chief flaw is that it is a closed platform. Facebook does not want to be the web. It would like to draw web citizens into itself, so it plays on the web, but in terms that leave no room for doubt where the power lies. … If you want to use Facebook's social layer, you must be part of and subject to the Facebook platform. … This is not to set up a Facebook vs Google+ discussion, but to highlight that Facebook doesn't meet the needs of a global social backbone. …
Google+ is … the first system to combine a flexible enough social model with a widespread user base, and a company for whom exclusive ownership of the social graph isn't essential to their business.
And he concludes:
Obstacles notwithstanding, Google+ represents the promise of a next generation of social software. Incorporating learnings from previous failures, a smattering of innovation, and a close attention to user need, it is already a success.
It requires only one further step of openness to take the Google+ product into the beginnings of a social backbone. By taking that step, Google will contribute as much to humanity as it has with search.
The entire piece is worth reading. As you can see from the excerpts above, Google+ has already inspired some utopian thinking. It’s not clear why this is happening. Charlie Bertsch noted the enthusiasm of early users and admitted he was surprised. I replied:
Some of us are Google fanboys, so we can't be trusted. More objectively, G+ has made a good start, it is in many ways better than the alternatives, and geeks don't care if they are the only ones using it right now ... in fact, some of them prefer it that way (there has already been joking, but negative, talk about the release of the iPhone app, for the simple reason that it will bring lots and lots of users).
I also don't think you can discount the frustration many people feel with Facebook.
Ultimately, though, it might be enough for the first batch of users that Google+ is not Wave or Buzz.
There’s much more to it than that, but I sure can’t put my finger on it. I know this: I never felt the need or desire to devote blog posts to things I’ve said and read on Twitter or Facebook, with a few notable exceptions. But three weeks in, and I’m already filling half of this post with quotes from or about Google+.
A discussion has started on our Facebook film group (still plenty of time to join, we’ve got 40+ movies to go, email me if you’d like in) that represents the best of what such a group can offer. I’ve tried for the most part to separate what we do over there from what I post here, but the discussion is too good for me not to bring it to the blog.
I put The Passion of Joan of Arc at #15 on my list, and claimed that it was “a justifiably great classic film … I don’t mean great like The Social Network but I mean great like Hamlet or The Great Gatsby or Born to Run.” Phil Dellio contested this … I’m hoping he doesn’t mind my quoting him … “To me, great is a very fluid, very subjective concept that can apply to anything, and I really don’t think of my favourite music and films and books as belonging to different categories of great.”
I love this comment, because I’ve long agreed with it. But a bit of self-analysis points out that I’m fooling myself to some extent. Here’s an edited version of some of my various replies:
In grad school, I was known as the "anti-canon" guy. But I realized two things. One was that, despite all my efforts, in my younger days I'd gotten a fairly traditional film education, and I've never escaped it (hence a lot of my choices on this list). The other was that I spent an entire chapter of my dissertation on Mickey Spillane. I just ignored any of my mentors who were dismissive of the idea that Spillane belonged in an English dissertation. But, finally, in a footnote, I confessed that working on that chapter had led me to an inescapable conclusion: Mickey Spillane was a crappy writer. Entertaining, canny about his understanding of his audience, worthy of serious examination ... yes to all of those things. But his writing wasn't any good "as writing." It did me no good to explicate this or that sentence (though I tried, I really tried). What was fruitful was looking at the worldview of the Mike Hammer books. …
I am not used to being the one defending the canon. It really does have something to do with watching six weeks of silent Ukrainian films when I was 19 ... I didn't know what a canon was, I just sucked it up (well, I didn't much like Ukrainian silent films, but that's for another chat). Once I hit grad school, I had to develop a feisty attitude because, with my years of auto-didactic learning as a steelworker, and my B.A. in American Studies, I was WAY behind everyone else on the canon of British literature. So I rebelled. In my first semester, after keeping my mouth shut for weeks, I finally asked, "why are we reading four books by John Bunyan?"
But then I took a film class, populated mostly with undergraduate film majors. They were the feisty ones, a great bunch of whippersnappers ... I loved that class. But, for all their brightness and energy, I soon realized I had something they didn't: a solid grounding in the canon. And I couldn't help understanding that maybe my background was actually useful.
I still never got around to reading Milton, and I rarely assign "classics" to my students now that I teach. But I'm not surprised my movie list is so stodgy. It's embarrassing, but what can I do?
Re-reading this, one thing stands out: I make it sound as if watching six weeks of silent Ukrainian movies was one of those key moments in a person’s life that affects us forever. And all this time I’ve been blaming my mom for my faults …
I suppose it’s not exactly the series premiere, since it apparently was a web series first (I admit I didn’t know this until I read the early reviews). Anyway, I’m a bit of an oddball when it comes to liking Lisa Kudrow. I never saw Friends … and by never, I mean I never saw a full episode, and never saw much more than a minute or two in any event. This was not by design, or because I thought it was a bad show. I had no idea if it was good. I just never got around to watching it.
But Lisa Kudrow … she put together one of the great lost series of our time, The Comeback, which was so excruciating in a self-aware way that it made Curb Your Enthusiasm look like child’s play. I loved The Comeback, and so I loved Lisa Kudrow. There is no doubt about it: if it wasn’t for The Comeback, I wouldn’t be here talking about Web Therapy.
Web Therapy mines that same area of excruciating comedy. But, as Hank Stuever pointed out, while Valerie Cherish in The Comeback was at times a pitiable character, Fiona Wallace in Web Therapy comes across as “a stupid, despicable person.” Kudrow once said that the genius of The Comeback grew out of the fact that Valerie knew that what she was doing was demeaning, but she couldn’t break out of the habit of acting in stupid ways, because she didn’t see any options. It was more excruciating, knowing that Valerie got the joke but continued on, anyway, allowing herself to be humiliated. In the first episode of Web Therapy, there is no sense that Fiona is self-aware.
Still, the style of the series is intriguing (it takes place entirely on webcams), critics who have seen more episodes say it improves, and Lisa Kudrow deserves some slack while her show grows into itself. The premiere was interesting enough for me to come back next week.