I’ve been a subscriber to Newsweek for several years now, and had been an occasional reader years before that.  In the past two years, there have been two redesigns of the venerable but struggling newsmagazine.  The latest, which debuted last week to poor reviews, came after Newsweek was sold off and then ignominiously merged with the (also struggling) Daily Beast website.

As I began reading the “new Newsweek”, I did a little tangential exploration on my iPod Touch as to what it was all about.  It seems a high-profile editor-in-chief, Tina Brown, was tapped to replace the departed Jon Meacham, raising expectations in some quarters.  On the other hand, some observers noted that the last time Ms. Brown was given a big magazine to run, a decade ago, her Talk magazine (launched amid great fanfare on the cusp of the millennium) went belly up after two years.

Meanwhile, there are others who see print magazines as a dead industry generally, due mostly to competition from the Web.  It seems generally understood that the “last big magazine launch” (ever, it is implied) was Portfolio, rolled out in 2007 by Conde Nast (where I worked briefly in its ’90s heyday); and it, like Talk, folded within a couple of years, taking with it $125 million or more from the company’s coffers.  Here’s where I ran into a remarkable coincidence, one that also illustrates the limitations of print magazines–and that is fundamentally what spurred me to write this post.

Portfolio was helmed by Joanne Lipman, who like Brown had been feted as a wunderkind with a track record of breathing new life into stodgy old journalistic institutions (in Brown’s case, the New Yorker; in Lipman’s, the Wall Street Journal).  It appears (as far as I can discover) that Lipman has been “between jobs” (other than serving on charitable boards) since Portfolio went under.  Thus, her appearance in the premiere issue of Tina Brown’s Newsweek redesign would seem to announce the next phase of her career, as a business columnist.

I don’t know the relationship between Brown and Lipman, but it’s easy to imagine that Brown feels a sympathetic kinship with Lipman, given that both started massively expensive magazines for Conde Nast and both failed so spectacularly.  Brown is getting another chance, and in so doing, she is providing one (of more modest proportions) for Lipman.  All well and good, except for a terribly unfortunate intervention of events.

For you see, this eagerly awaited issue of Newsweek, Brown’s first at the helm and containing Lipman’s column, arrived in my mailbox on Friday, March 11 (although I personally didn’t read most of it until a few days later).  March 11, of course, was the day the massive earthquake and tsunami struck Japan.  So, sure, right off the bat this highlighted the inherent “old-newsiness” of a print magazine, arriving in people’s hands just as everyone’s attention was focused on the disaster in Japan, which of course was nowhere addressed in the magazine.  But that’s just garden variety bad luck (I’m sure they were hoping for a couple of slow news days between sending the issue to printers and its arrival at subscribers’ homes).  The bad luck Lipman got in launching her columnist career was truly epic.

For you see, Lipman’s column was about–wait for it–nuclear power.  I’ll quote some choice bits, which are technically her characterisation of the views of Anne Lauvergeon, the French nuclear energy titan she is profiling, but make no mistake–Lipman does not hide the fact that she’s solidly in Lauvergeon’s corner:

“[N]uclear’s next big global moment has arrived…”

(Well, yeah–I guess you could say that.)

“If Lauvergeon is correct that now is a turning point, the timing couldn’t be better for her.”

(Euuuhhhh…I think you misspelled “worse”.)

“Bombs, missiles, commercial airplane crashes, terrorism, whatever happens, you will have no leak on the air or in the ground.”

I won’t take a cheap shot against the “one of her major pushes has been in the Middle East” line, as cringe-inducing as it is, since she goes on to insist that “she won’t do business with regimes she considers unstable. ‘Nuclear is made for countries that are stable,’ where it can be ‘managed in a rational way … that doesn’t mean a democratic way, but with some rationality,’ she says.”  So, does that mean geologically stable?  I guess not, as she lauds “major expansion into the U.S., China, Japan…” Oops.

Fish in a barrel, really.  I just marvel at the rotten luck in her first impression to readers here (most of whom will not have done the research I did and will not know her from Eve).  Brown and Lipman seem like smart, interesting women, and I’d hate to see them fall on their faces yet again–so I hope they will find greener pastures this week and in weeks to come.

This NYT article, about people (like me) who record TV shows (usually on DVRs, though I currently use a DVD recorder and timer), is from 2009, but was just brought to my attention:

Against almost every expectation, nearly half of all people watching delayed shows are still slouching on their couches watching messages about movies, cars and beer. According to Nielsen, 46 percent of viewers 18 to 49 years old for all four networks taken together are watching the commercials during playback, up slightly from last year. Why would people pass on the opportunity to skip through to the next chunk of program content?
The most basic reason, according to Brad Adgate, the senior vice president for research at Horizon Media, a media buying firm, is that the behavior that has underpinned television since its invention still persists to a larger degree than expected.
“It’s still a passive activity,” he said.

“Against almost every expectation” is a great way to put it.  I would not have dreamed in a million years that the percentage would be that high.  What are these people thinking?!?  Beyond being obviously annoying, commercials are a huge time suck.  If you are like the average person and watch 35 hours of TV per week (though I suspect those who use DVRs watch even more), about 11 hours of that is commercials!  Over a year’s time that’s nearly 600 hours of commercial viewing–as if you spent over three months out of every year watching commercials as a full-time job, 40 hours per week.
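The figures above hold up on the back of an envelope, under one assumption of my own (not from the NYT piece): roughly 19 minutes of commercials per broadcast hour.

```python
# Sanity check of the commercial-time figures above.
# Assumption (mine, not the article's): ~19 ad-minutes per broadcast hour.
ADS_PER_HOUR_MIN = 19
WEEKLY_TV_HOURS = 35  # the "average person" figure cited above

weekly_ad_hours = WEEKLY_TV_HOURS * ADS_PER_HOUR_MIN / 60   # hours of ads per week
yearly_ad_hours = weekly_ad_hours * 52                      # hours of ads per year
full_time_weeks = yearly_ad_hours / 40                      # as 40-hour work weeks

print(round(weekly_ad_hours, 1), round(yearly_ad_hours), round(full_time_weeks, 1))
# → 11.1 576 14.4
```

About 11 hours a week, nearly 600 hours a year, and more than fourteen 40-hour weeks: the "over three months as a full-time job" claim checks out.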

Even if you wouldn’t rather have that extra time to do something else (read a book, have a conversation, play tennis, take a walk, play a game with your kids, surf the Internet, canoodle with your sweetheart), and you’d rather spend that same amount of time vegging out in front of the TV, skipping commercials would at least allow you to watch more shows in the same amount of time.  Let’s say your 35 hours a week are composed of ten half-hour shows (news or comedies) and thirty one-hour shows (dramas, dramedies, newsmagazines).  If you skip commercials, you could add a dozen more one-hour shows and a half dozen more half-hour shows in the same amount of time.  It just strikes me as a no-brainer.
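That show-swap works out almost exactly if you assume (my numbers, not gospel) 18 ad-minutes per broadcast hour, so a one-hour slot carries 42 minutes of actual show and a half-hour slot carries 21:

```python
# Assumptions (mine): 18 ad-minutes per broadcast hour, so a one-hour
# show is 42 min of content and a half-hour show is 21 min.
CONTENT_HOUR, CONTENT_HALF = 42, 21

# Baseline schedule from above: ten half-hour shows plus thirty one-hour shows.
broadcast_minutes = 10 * 30 + 30 * 60  # 2100 min = 35 hours including ads
freed_minutes = broadcast_minutes - (10 * CONTENT_HALF + 30 * CONTENT_HOUR)

# The claimed bonus: a dozen more one-hour shows, a half dozen more half-hours.
extra_minutes = 12 * CONTENT_HOUR + 6 * CONTENT_HALF

print(freed_minutes, extra_minutes)
# → 630 630
```

The skipped-ad time exactly covers the extra shows under that assumption; with slightly heavier ad loads the bonus shrinks a bit, but the order of magnitude stands.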

“No-brainer”…hmm…it would sure be interesting to do an IQ test on the two groups (those who skip commercials and those who do not).  As shocked as I am that there are so many people not skipping, it’d be even more surprising to me if there weren’t a significantly lower average IQ in the non-skipping group.

And that 46% is a very interesting proportion.  Stunningly large though it is, it’s nevertheless still a minority.  In a binary choice election, 46% is the mark of a soundly defeated loser.  I have a vague, difficult-to-pin-down feeling that a proportion like this would be difficult to maintain in society in a stable way.  I just think eventually, the group that sits still for the commercials will decline to a more believable number (less than 20%, perhaps less than 10%).  The NYT article says the number had actually increased from ’08 to ’09, though; could it actually be heading for the majority?  I just can’t see it.  As DVRs get more popular, friends and relatives with DVRs will more and more often visit other DVR households.  Surely the non-skippers will be met with incredulity from the skippers, and will be shamed into changing their ways (what counterargument could the non-skippers possibly mount?).

This might also be a generational thing.  I can imagine the 35-49 portion of that group, who have been watching commercials for decades, being set in their ways and not being comfortable with grabbing a remote and hitting a button when the commercials pop up.  Although maybe young people find the ads entertaining, as with the Super Bowl?  Who knows.  I’m open to suggestions here, as I still find this puzzling even after chewing on it for a while.

I make no secret of the fact that I’m a slacker.  I think it even says something about it in my “About Me” section.  I take a sort of impish pride in my slacker status, but it’s starkly apparent how uncomfortable this makes other people (at least here in the U.S., which is where I spend most of my time).  People look at me like I’ve just outed myself as a pedophile or ax murderer, nervously murmuring something along the lines of “well, surely you don’t mean it”.  I think it’s especially disconcerting for people to hear this “admission” from the father of three kids.

But it sure looks like Germans (and, more broadly, Europeans) are doing a lot of slacking–even those who have kids, one presumes–and enjoying a high quality of life in spite of it (in fact, in part because of it).  We could stand to learn a lot from them.

Now, this is not exactly breaking news: any member of the American intelligentsia (and I count myself one of that set) has seen any number of mentions of this disparity over the years.  But what makes this article (an interview with an author who is hawking a book on the subject) different is how it shatters some of the conventional wisdom about why the disparity exists.  To wit:

There aren’t any historical or cultural reasons for it. Americans famously had more leisure time than the Japanese back in the 1960s. I would say if you did a survey of most people who are in their late 50s or 60s, they will tell you that they take fewer vacations than their parents did. Now why did that change? It wasn’t because of the Pilgrims. People work hard in America, but there was a period where leisure time was increasing. I quoted Linda Bell and Richard Freeman in an article they wrote about what happened during the ‘90s. There was nobody to stop you from working longer. There was no government check, there was no union check as there is on excessive work as there is in Germany or elsewhere in Europe. These institutional checks are gone. So people feel like lab rats: “If I work an extra 10 minutes over the person in the cubicle next to me, then I’m less likely to get laid off.”

So what it comes down to is a more subtle version of the classic capitalistic method of squeezing more productivity out of workers: inexorably turning up the speed on assembly lines, raising quotas bit by bit, or holding a sales contest in which first prize is a new Cadillac Eldorado, second prize is a set of steak knives, and “third prize is you’re fired”.

Another element of this great divergence between Europe and the U.S., which is not directly mentioned in the article but which is near and dear to my heart, is paid maternity/paternity leave, and work/life balance more broadly.  When it comes to parental leave, it’s not just that the U.S. simply lags behind other nations; we are in a whole different category from everyone else.  A USA Today article sums it up well:

With little public debate, the United States has chosen a radically different approach to maternity leave than the rest of the developed world.

That was written in 2005, but sadly nothing has changed since then.  This is in fact one of my greatest disappointments with Obama and Congressional Democrats (whom I generally support strongly): that they didn’t do something about this.

For low-wage employees to get time off with their babies is going to take a government mandate, or at least a strong role for unions.  Low-wage workers just don’t have enough clout on their own to pressure employers to change their policies.  And in the current employment climate, even workers whose degrees or skills are more in demand at higher wages are apt to be reluctant to insist on work/life balance when someone else will gladly take their job, balance be damned.

But before the “Great Recession” hit, there were signs that millennials in particular were beginning to insist on more work-life balance, and companies were slowly, grudgingly beginning to respond.  I’d expect to see a return to this dynamic once unemployment rates drop to normal levels.

This is, of course, the same generation that has been supporting Democrats so strongly, and that accepts the idea of activist government far more than older generations do.  In a sense, they are looking for the U.S. to more closely resemble Europe, which is just what drives GenXers (my own generation, and sadly the most conservative of them all) like Glenn Beck into conniptions.

Seen through this lens, the hand-wringing by Xers and Boomers over their perception of Millennials as lazy may be just another expression of the older set dragging their heels about moving into the 21st century that Europeans already inhabit.  So I frankly rejoice that every year, Millennials join the ranks of the workforce and the electorate, replacing the Boomers and their elders who are holding us back.  It’s not going to be a smooth transition, that much is clear; but our society will be better off in the end.

That’s not a joke: I really am going to discuss my very own penis–so don’t say you weren’t warned!

Last week, the New York Times reported some very good news indeed: although 80% of all American men are still circumcised, the rate of neonatal circumcision plummeted from 56% in 2006 (which was already down about ten points from the rate in the ’80s and ’90s) to just 32.5% in 2009.  This qualifies as a big time trend (hopefully not just a “fad”).

Those in the pro-circumcision camp, including many male doctors who are no doubt themselves circumcised, have understandably gone into a tizzy upon learning this news.  They know that a 33% rate is not stable in American society.  Either they have to arrest the trend ASAP and get circumcising back into the majority position, or it will continue to drop precipitously until it’s in single digits.  Because let’s face it: most Americans are not very invested in either side of this issue.  They just want to stay with the herd, and do what is “normal” (that is, what the majority does, even if the minority has good arguments for why it is not normal at all, biologically speaking).  This was the strongest weapon in the pro-circ camp for many years, until just this moment in modern history.  Now that the majority has flipped decisively to the other side, and this has been made public via the NY Times and other media outlets (although in the case of NPR, in a despicably unbalanced way), there is likely to be a stampede away from the knife, unless the pro-circ side in their desperation can come up with some way to stem the tide.

One of the tactics they have been trying is to insist that there are myriad health benefits to circumcision.  Hanna Rosin, a vocal proponent of circumcision (she had her son circumcised, which is where her emotional investment comes in), has even warned of a “potential public health crisis” if circumcision rates continue to drop.  As I told her at her blog, the only problem with that is that the four healthiest countries in the world are Iceland, Sweden, Finland, and Germany, all nations in which almost no one is circumcised.  So much for that theory!

One argument I’ve seen bouncing around the blogosphere is a protest from Jewish men that the anti-circumcision campaign is “anti-Semitic”, or that at the very least, preserving circumcision has a “social benefit” by–get this–preventing Jewish (and Muslim) boys from being ostracised in the locker room!  Wow.  That’s just…no.  I’m sorry, but that doesn’t cut it (no pun intended).  I should be clear: I’m against circumcision no matter what your religion is.   I’ve seen some fellow “intactivists” make an exception for religion, but I utterly reject that.  A boy can grow up and decide to do that for his religion when he is 18; at age zero he has no ability to consent and may not even ultimately join the religion of his birth–which should be his right.

But wait, you ask: wasn’t this supposed to be about my penis?  Okay, yes: I am an American man with a rare attribute at my age (GenX): I did not get any part of my genitals cut off when I was a baby (or at any time since, for that matter). Let me tell you that while I don’t want to get graphic, I know how my body works, and I can see how the equipment of guys in pornos works, and let’s just say they are not just missing some irrelevant bit of skin. The whole natural way it’s supposed to function is not possible with them (as graphic as I’ll get is to suggest Googling “gliding action” and say that given this latest news, a good financial tip would be to short the stock for companies that make “personal lubricants” in about fifteen or twenty years). So I am ecstatic about this trend, and like to feel that I had a small but significant part in it by giving testimonials like this one online over the past decade.

I have often wondered how insane circumcision must look to men in non-circumcising countries. Now, as long as the momentum continues, we’ll get to see how insane a younger generation thinks it was that people did this “in the olden days”.  Arthur C. Clarke has a nice bit about this in his novel 3001, in fact.

For me, an intact American man in 2010, listening to those who still desperately insist that circumcision is a good idea is like living in a country where most people have traditionally had one eyeball removed at birth, but a growing number of people start questioning the wisdom of this tradition.  The defenders of routine neonatal eyeball removal would make defensive comments like “you don’t need that ‘extra’ eyeball; I can see just fine–and my risk of eye cancer is cut in half”.  Well, sure, you can get by pretty well without it, certainly much better than with zero eyeballs!  But to get the full range of stereoscopic vision, you need both eyes.  And it’s really the same with the foreskin.  We are naturally formed the way we are for a reason, and routinely removing part of a boy’s healthy genitals (the most sensitive part, by the way) is a holdover of a barbaric religious rite being awkwardly shoehorned into modern times by desperate, defensive medical rationalisations.

The voters of the Academy of Motion Picture Arts and Sciences (AMPAS) are a strange, unpredictable group, not least when selecting the winner for their premiere award, Best Picture.  To some extent, the unpredictability of their choices can be an asset.  Unlike the Grammys, they usually do not reward the biggest moneymakers (although they won’t reflexively avoid crowning blockbusters either: in the last dozen years, Titanic, Gladiator, and The Lord of the Rings: The Return of the King have nabbed the prize).  Heavy, serious dramas predominate, as one might expect; but lighter fare like Shakespeare in Love wins now and then, as do actioners like the aforementioned Gladiator and LOTR.  2003’s award to Chicago reassured musical fans that their genre was still in the running, even in the 21st century.  (Only comedies have won so rarely that one wonders whether it might not be fair to give them their own category.)

But there’s another way in which it is hard to predict the winner of the award, one that is not so welcome: quality.  Now, of course, the quality of a film is inherently subjective; I understand that.  But frankly, my wife and I have good enough cinematic tastes that even if we might quibble over whether another film was slightly more deserving, we can at least understand how the film in question was in the conversation for “best of the year”.  Or, even more minimally, we ought to find the movie watchable enough that we aren’t fidgeting and checking our watches all the way through, hoping the agony will soon end.

Just recently, she and I got around, via Netflix, to catching up on the last two Best Picture winners.  Neither seemed appealing enough that we would have made a point of seeing it absent that imprimatur of awesomeness from AMPAS, but we felt we ought to see what the fuss was all about.  We went in reverse order, watching The Hurt Locker a few weeks back.  We were so phenomenally bored by that film that I must confess we did not in fact finish it–I think we only got thirty or forty minutes in before hitting the eject button.  Then last night it was Slumdog Millionaire’s turn.  The same fate almost befell it; but I convinced Brittany to watch it all the way through, as I did want to see how it tied up, as well as the dance number I knew was coming at the end.  It still got a definite thumbs-down from both of us.

The film that preceded those two, No Country for Old Men, is in my opinion one of the greatest works of art in cinematic history.  So it’s only a two-year dry spell, but it is worrisome nevertheless.  And though some past winners have been sketchy (I’m thinking of Forrest Gump here), none was both dull and offensive the way Slumdog was (The Hurt Locker was just dull).

So I’ll be very interested in seeing what wins next time!  Anyone want to nominate anything that’s been out so far this year?

In my very first post on this blog, I said about parents who use “cry it out” (CIO) sleep training methods that we who oppose it “need to turn up the heat, tighten the screws” because “I want parents to feel that if they CIO, it’s going to be something they have to feel nervous and at least a bit guilty about.”

Now today’s Globe and Mail features an article that, first of all, is helpful (despite the lame headline calling the alternative to CIO “coddling”, ugh) because it notes that “new research on infant sleep appears to deal a blow to those in the cry-it-out camp” and quotes a Penn State researcher as saying “Quite frankly, not too many researchers advocate that any more”.

But what really caught my attention was the closing paragraphs of the piece:

Although the method worked for them and their daughter, now eight months old, Mr. Reynolds is reluctant to discuss it with all the parents he knows.

“With some friends, we don’t really bring it up, as there is a lot of criticism out there.”

Toronto mom Carolyn Weaver [says:] “It’s gotten so controversial…People who are opposed truly believe that you are torturing and tormenting your child.”

Yes!  Just exactly what I was talking about.  This is a great sign of progress.  Most parents, I think, are reluctant to do something they have to hide like a dirty secret from the rest of the world.  Another couple decades, and new parents will wonder how this could ever have even been a debate.

Until recently, I confess that I had a fairly simplistic view of those who formula feed their infants: basically, they were either uneducated/ignorant/brainwashed, or they just couldn’t be bothered to do what is best for their babies.  And there are certainly many formula feeders who do fit one of these descriptions to a T.  My experience with breastfeeding, furthermore, has been with the mothers of my three children (my ex-wife and my currently breastfeeding wife), both of whom found breastfeeding easy; and the women I’ve met through AP groups who also seem to have no problems.

But more and more, I’ve had my eyes opened to the fact that there is a sizable chunk of the maternal population that really sincerely intended to breastfeed, who tried to do it, and who just didn’t manage to succeed.  As a result, they feel frustrated, bitter, guilt-ridden, and angry at those who dismiss or minimise their efforts.  And I’m increasingly concerned that lactivists do not shine enough of a light on this part of the story. 

There are, as with most things, blurry lines, shades of grey, involved.  How much of a factor are violations of WHO codes?  Non-baby friendly hospitals with their formula samples and spotty or nonexistent lactation assistance?  Lack of support and guidance from extended family, the broader kinship group, society generally?  Normative cultural and media portrayals of bottlefeeding?  Pathetically inadequate maternity (and paternity) leave?  I could go on, but a group called Best for Babes has compiled a great list of “Breastfeeding Booby Traps” and I’ll invite my readers to take a look at that. 

Still, I’m increasingly seeing signs that a not-insignificant number of women evade these “traps” pretty successfully, have the knowledge and motivation, seek professional lactation help when needed, but their breasts just don’t produce the milk.  Even famous lactivists, it turns out, are not immune.

As this blogger notes in a well-researched post, the problem seems to be on the rise; yet very few researchers are trying to develop treatments to help women overcome insufficient supply.  And as another blogger at the same site points out, the options for “crunchy”, whole-foods-eating, Michael Pollan-reading types are not good: nearly every brand of formula contains corn syrup solids as its primary ingredient (eccchhh).  I lay the blame for this directly at the feet of our megacapitalistic, factory-farmed, monoculture processed-food system.  We as a society should insist on higher standards for formula.  Maybe the government should step in and subsidise the cost for those with lactation failure, while working hard to make sure that those who can lactate, do.

So what do I feel we as lactivists have to do differently?  First and foremost, the mantra that every woman’s body is perfectly designed (by evolution, or by God, if you believe in her/him/it) to feed her baby is clearly not always accurate, and thus is unfair to those with lactation failure.  A variation on this theme is the arched eyebrow and the question: “how did babies survive before formula?” (well, they didn’t, always; and closer kinship groups provided aunts to be wet nurses if needed).  So instead of just insisting that women need to “trust their bodies”, lactivists should be beating the drum for more research on lactation failure, and improving formula for those who can’t breastfeed. 

At the same time, though, some women who have gone through lactation failure don’t always make it easy for lactivists to be their allies.  There is an understandable but troublesome tendency of human nature to gravitate toward “sour grapes” rationalisations as a coping mechanism to reduce the mental anguish that comes from being bitterly disappointed at not being able to achieve something, only to see others around you able to do it with no problems.  Specifically, I’m talking about comments minimising the considerable scientific evidence for the nutritional superiority of breastmilk, like “my child THRIVES on formula” or “no one can tell the difference ten years later between kids who were FF and those who were BF”. 

To a lactivist, these statements are always going to be like fingernails on a chalkboard.  We know the importance of breastfeeding, and to hear it minimised like that feels deeply wrong.  Furthermore, when it’s on a public forum, we feel we have to object and correct misinformation, in case any fence-sitters are reading.

But it’s my hope that lactivists (including me) can try to better focus our efforts on educating the ignorant, removing “booby traps”, taking on unethical formula marketers, pushing for research into lactation failure, and reaching out to those women who wanted as much as anyone to breastfeed but were unable to, and maybe even enlisting them as allies.
