The great privacy debate comes home to roost

I recently came face-to-face with the murky issues of social media and hiring in the workplace.

Days before their interview, a candidate to join my organization started sending Twitter messages addressed to my username.  So I got a notification from Twitter that I was “mentioned” in someone else’s Tweet.  (Twitter is designed in such a way that another user cannot contact me privately unless I explicitly choose to “follow” that user.  Which I hadn’t.)

What was strange was that the person was addressing questions to me about the upcoming job interviews.  Asking something about my company’s benefits package, if I recall correctly.  In other words, questions that could, and should, have been addressed in the setting of an interview.  Even more curious, the person sent those same Tweets, with the same questions, to another employee.  And that other employee didn’t even work in my department.

A flood of questions came to mind…

  • Could I read this person’s Tweets?
  • Should I read them?
  • Are somebody’s Tweets in the public domain?  With over 100 million users, and the ability of any user to see the Tweets of any other, one could argue yes
  • Are things in the public domain fair game for evaluation in a hiring process?  Beyond the obvious off-limits discriminators such as age, gender, sexual preference, etc.
  • Even if such information wasn’t in the public domain, was it still something I could, or should, use in my evaluation?  In many parts of the world, employers are entitled to collect information about an employee beyond what is offered by the employee him/herself

I found that the answers to these questions were far from obvious.  I suspect it will be years before the law and business behavioral norms will catch up with these issues.

I won’t tell you what I did in answer to these questions, out of respect for the privacy of the individual and the fear of opening up a legal can of worms.

What I will say is that I approach online life as if everything I say can be read by others, and thus used by them to form some judgment of me: my blog, Facebook account, Twitter, LinkedIn, etc.  Is this a form of self-censorship?  Yes, to a degree.  I guess there’s still a role for offline communications and “antique” forms of online communication like email or SMS.

“Creative destruction”: Netflix gets it

About 15 years ago (!) I worked for Reed Hastings.  He was not one for standing pat.  At the time, he realized that the software category of Software Quality Assurance (SQA) was going to consolidate.  And that you could either embrace that eventuality, or hold on to the past.

He embraced it, by seeking a merger between Pure Software, the company he founded, and Atria.  Soon thereafter, the combined company was part of Rational and is now a product suite at IBM.

Reed is at it again.  Netflix is separating its DVD-by-mail service from its live streaming service.  I loved the quote from Engadget today:

What really happened here is quite simple: Reed Hastings just put a gun to the side of his DVD-by-mail business and pulled the trigger. Given that he aimed for the ankle, though, it’ll probably take a while for it to completely bleed out. But hey — proactively putting a fading business out of its misery sure beats bleeding for it on the balance sheet.

Joseph Schumpeter and more recently Clayton Christensen have written about creative destruction and disruptive technologies.  Reed is one of a few high tech leaders who have the courage to implement what the rest of us know: do unto oneself before it’s done to you.

Here’s the bit that stops others from doing the same: Netflix’s share price, and perhaps even near-term revenue, could suffer.  Most of the industry can’t tolerate the thought of taking a step back in order to take two forward.  And hence the balking at such bold moves, for fear of the reaction of others.  Like shareholders or pundits.

Perhaps the definition of “courage” is not fearing the reaction of others?  Game on, Reed.

I’m having a case of “TED envy”

The TED Conference is going on this week.  I wish I were there, even as I’m consoled by the fact that the weather in Tel Aviv is gorgeous during my business trip.

While I’ve never been to the TED Conference, I have adopted the habit of watching TED Talks online.  The premise of the talks is that a (presumed) expert gives the “talk of their life” in 20 minutes on their area of expertise.

An aside: I noted that humorist John Hodgman is speaking there this week.  He wrote a wickedly funny and strange book called “The Areas of My Expertise”.  Maybe that’s why he was invited.  The book is highly recommended.

While TED and TED Talks have been pretty interesting stuff, I thought about the undercurrent of TED.  Which seems to be the unspoken: “let’s all get together, call each other smart, and be confident that the high cost of conference admission weeds out the others”.  This type of self-referential, self-reinforcing elitism usually brings out the contrarian and cynic in me.  As in, “the really smart people probably avoid this type of conference like the plague”.

But when you look at the caliber of the speakers, you have to ask yourself: is there a still-higher caliber of people left out?  If so, what percentage of the “smart people” population are they?  Probably very small indeed.

In the end, I decided TED people are way smarter than me.  Ergo, I’ll keep watching TED Talks and wishing I was there.

My parents are on Facebook; I’m outta here!

I’m not the first to write about how parents’ arrival on Facebook has driven their children elsewhere.  I wouldn’t be surprised to see some other social media site overtake Facebook in popularity amongst teens and young adults.  Just as Facebook did to MySpace years ago.

But for people over 25, Facebook is here to stay.

Being a teenager is about experimentation.  Teens go through phases of trying on “personas” through the cliques they belong to, the way they dress, their tastes in music, TV, movies, books etc.  A lot of those experiments are best forgotten, even if they form some facet of the future adult.

For example, I had the nickname “Bambi” in college when I wore my hair shaved to half an inch. I also wore Stranglers t-shirts with swear words on them.  The hairstyle and the t-shirts are gone now, but the music remains in my collection.

If you’re leaving evidence of one of those experiments online, you might prefer to forget about it later.  And you might also prefer others (read: parents) not to see the details along the way.  Hence, the reluctance to share Facebook with your parents.

So if being a teen is about forgetting, is being an adult about remembering?

I think the attraction of Facebook to adults is the ease of remembering by staying connected.  As an adult, friendships get left behind not out of embarrassment but out of practical necessity.  Getting married?  Your single friends might be superseded by couples.  Having kids?  You’ll probably hang out with other parents.  Moving cities?  It’s hard to keep in touch with your friends in the prior city.  Changing jobs?  Your old work friends will drift away.

Facebook helps keep you connected.  100 years ago, people were less mobile and had circles of friends that didn’t change much over time.  Today we change so fast.  But that doesn’t mean we want to divorce ourselves from the past, and Facebook plays a valuable role in bridging it.

Global workforces: when will we get it right?

From time to time, my company looks at its strategic initiatives and asks the question: “where in the world should we locate the people for this work?”  Every company does it (or should), so what follows should in no way be construed as being about my current employer.

But these internal conversations led me to reflect on how we still don’t seem to have the formula right in high tech.  And perhaps by extension in other industries that can have globally distributed workforces.

It seems like every country in the universe of high tech choices is stereotyped and typecast.  The USA is a place to export work from.  India or the Czech Republic are places to export work to.  What about the inverse options?  Or the distribution of work between two (stereotypically) low cost countries?

I hate to say it, but management consultants’ use of the term “value chain” is apropos.  We need to get better at framing our choices and thinking about the integrated whole.

Let me suggest a simple scorecard.  Maybe each element can be rated on a 5-point scale.

First is labor costs.  For a given type of work, what’s the fully burdened cost in each locale?  And what’s the rate of wage inflation that forecasts costs in 2-3 years?  For example, I think a lot of people were disappointed in their near-term return on investment in India given the wage inflation, especially in high tech.  Those who have taken a long-run view of costs in India have been rewarded with an ever-deepening talent pool.

Second is critical mass of the labor pool.  Does the locale in question have an ecosystem to tap into?  Such as large companies you can raid.  Or lots of venture-based start-ups.  Or strong universities.  Bottom line: will there be enough people to choose from?

Third is critical mass of the team. Will the work being performed reach critical mass to form a team?  When teams feel ownership over their assignment, they can accomplish great things together.  Individuals working on teams located elsewhere perhaps not so much.

Fourth is distance from dependent resources.  This could be the distance from key executives.  Or peer departments.  Or subordinate functions.  It’s a hassle for people to stay late at work, or make calls from home at night, or get up really early, all to simply interact with the rest of a team.  Hence the critical mass comment above.

But it’s also a huge tax on managers’ and executives’ time and energy to travel long distances to oversee a distant operation.  After a few trips to a new site, the number of visits from “headquarters” starts to dwindle and the sites become divergent.  I had the pleasure of working for a company that was truly committed to its global locations, and the executives traveled accordingly.  I’m not sure this is the norm.

Last is infrastructure.  What kind of communication infrastructure do you need to link these sites?  This is not trivial if you want to maximize the flow of information, as you must.  For example, Cisco’s 3-screen Telepresence is a wonderful product.  But at $100,000 per site, most people will opt for something inferior.
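To make the five elements concrete, here’s a minimal sketch of how such a scorecard might be tallied.  All locale names and ratings are invented for illustration; a real exercise would also weight the elements by strategic importance.

```python
# Hypothetical location scorecard: rate each of the five elements on a
# 5-point scale (5 = best) and total them per candidate locale.
# Every name and number below is made up for illustration.

CRITERIA = [
    "labor_costs",
    "labor_pool_critical_mass",
    "team_critical_mass",
    "distance_from_dependencies",
    "infrastructure",
]

# Example ratings for two made-up locales
locales = {
    "Locale A": {"labor_costs": 4, "labor_pool_critical_mass": 3,
                 "team_critical_mass": 4, "distance_from_dependencies": 2,
                 "infrastructure": 3},
    "Locale B": {"labor_costs": 2, "labor_pool_critical_mass": 5,
                 "team_critical_mass": 3, "distance_from_dependencies": 4,
                 "infrastructure": 5},
}

def score(ratings: dict) -> int:
    """Sum the five ratings; a weighted sum would work just as well."""
    return sum(ratings[c] for c in CRITERIA)

# Rank locales from best to worst total score
for name, ratings in sorted(locales.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(ratings)} / {5 * len(CRITERIA)}")
```

The point isn’t the arithmetic, of course; it’s forcing each element to be rated explicitly instead of letting one stereotype (usually labor cost) dominate the decision.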

I’ll sum it up.  People need to feel that they belong.  That they “own” the work they do.  That they don’t have a target on their back because the company is unsure if they are good “value for money”.  That they have easy access to others on whom they depend.

If we get this right, good things will surely happen.

Is a Private Internet Coming at Last?

Maybe I’m the last to the party of prognosticators that have long predicted the emergence of multiple or private internets over time.

But it does feel like something big is brewing in the world of consumers and the internet.  Time spent online is growing, at the expense of television viewing.  Time spent online is shifting, to social media.  Big user bases of 100 million and larger are emerging, from Facebook’s nearly 600 million to Twitter to Zynga to Last.fm to LinkedIn.  And let’s not forget Yahoo’s and Google’s hundreds of millions of monthly users that have been with them for years.  Let’s call these sites the Cabal.

And there’s the dimension of capitalism.  Today, about $30b is spent annually on online advertising.  Google earns a whopping 87% of that money.  Meanwhile, Facebook has rocketed from 9% of U.S. online advertising revenue share a year ago to 23% today, overtaking Yahoo! along the way.  The Cabal is gearing up for a fight over users’ money, and it’s going to get nasty.

So, we’re converging as users on these social media sites.  And these sites are vying for big advertising bucks in turn.

Where does that leave the rest of the internet?  You know, the billions of other web pages.  Has it been rendered less relevant?  Or does it contain the classic “long tail” of content that makes it valuable to each of us in highly personal ways?

The idealist in me says that the Internet in whole – this embarrassment of riches – is deeply rewarding to us all and will continue.  The capitalist in me says that with so much money at stake, the Cabal will shape the future.

Where do you stand?

San Francisco days and the “failure stigma”

I just wrapped up a week-long trip to Silicon Valley.  Despite traveling here probably 70 times or more over the years, I always feel the special energy of the place.  And it energizes me in turn.

Lots has been written about Silicon Valley’s culture of innovation and the appreciation of good ideas and smart people.  But what distinguishes this place the most, in my opinion, is the acceptance of failure.

Failure is the inevitable by-product of innovation.  After all, most innovation fails to live up to its commercial promise.  But nowhere else in the world is there a systemic lack of the “failure stigma”.   And somehow this unleashes a form of creativity that is less constrained by concerns about eventual success or failure.

An example: have you ever been in a brainstorming meeting with colleagues?  Where you came out of the meeting with some really good ideas?  Did you notice that few if any of those ideas were implemented?  Maybe it’s the “failure stigma” that stood in the way.

I think managers and executives are the source of the stigma.  And those who punish failure and reward success in binary terms are losing the subtleties of two things.  First, why was a success a success?  We often don’t actually know.  So how do we know how to replicate success?

Second, what can be learned from the failures to apply to the future?  It’s not “don’t screw up again”.  Though these are the signals we tend to send.

So, should the rest of us turn into wanton risk-takers?  Not exactly, given the cultures of the many other places in which we live.  But a good start would be to create a culture that inspects past failures and seeks to learn.  Without punishment.

Awkward times at the RSA Conference

I was at the RSA Conference in London this week.  As a recently-departed employee of the company that hosts the conference, it was a bit awkward.  On the one hand, these people were my colleagues and friends.  It was great to see them and have a beer.

On the other hand, both they and I have moved on.  Lots has changed at RSA since I left, so our recent histories have diverged.  And I’m working in a market sector mostly unrelated to theirs, so there are fewer shared business topics to talk about.

Perhaps the most awkward bit was trying to strike a balance between being friendly and not spending too much time lingering.  I don’t know about you, but in the past I’ve encountered ex-employees at conferences where you get the sense that their lingering equals longing.  As if they regretted leaving your company and yearned for the good old days.

I’m completely at peace with my choices.  I wouldn’t trade my time at RSA for anything, nor would I second guess my move to a new & exciting company where I’m constantly challenged and learning new things.  But I certainly miss my friends.

There is a slightly antique word “gadfly” that comes to mind.  Seems apropos, at least the annoying bit.  As in, don’t be a gadfly at the RSA Conference.

My media holiday

I’ve been on a U.S. media sabbatical for six months.  I don’t miss it.  And my perspective is slowly changing about my country.

Before I left Boston this year, I was a pretty voracious consumer of news media.  I spent 30-45 minutes reading the Boston Globe cover to cover every day.  I watched the morning newscast before work.  I read my Yahoo! portal page.

Was I a media junkie?  I didn’t consider myself one, if only because I didn’t watch political commentators on Fox News, CNN or MSNBC.  I guess on reflection I was pretty close though.

Today, I don’t have cable or satellite television service (yet).  As I wrote earlier, there’s so much content on the internet that my television functions mostly as a giant monitor for watching iTunes, or streaming TV shows from websites, or watching DVDs.  The amount of news content I consume has dwindled to a daily skim through the Yahoo! portal and occasional visits to Boston.com.

Having disengaged for a while, it now seems like U.S. media is a tempest in a teapot.  For example, there is a hysteria with which journalists and commentators focus on even the most minute differences between the parties on the issues.  It’s divisive.

It’s also unfortunate because it serves to distract the citizenry from the real issues.  The big issues.  Such as?

For one, that the U.S. is a huge net debtor to China, given its addiction to inexpensive Chinese goods (note I didn’t say cheap, or shoddy).  China will surely use its vast holdings of U.S. currency to exercise its interests, at the expense of the U.S.

Or, that the cost of health care is materially affected by the degree to which patients receive preventative care.  Waiting until you need to go to the emergency room because you couldn’t afford medical coverage is the best way to ensure costly care of what would then be an acute or chronic condition.  On principle, why can’t the government incent preventative care as a form of industrial policy?  After all, every country has an industrial policy with incentives designed to influence private sector behavior.  This is not the orientation of the current healthcare debate.

Or, the fact that small businesses employ most of the workforce, are the source of most job creation, and are the primary means to grow out of a recession.  But can you find a powerful small business lobby in Washington?

I could go on.

Americans are perplexed why other countries see America differently than it sees itself.  No doubt cultural and societal differences account for part of it.  But could the reason also lie with Americans’ media habits?

Has Twitter replaced your newspaper? And a corollary: search is dead?

I used to be very dismissive of Twitter.  “What’s its purpose?”, I asked.  Until I had the following thought…

A quick inspection of the Tweets I get suggests that about 70% of them contain URLs.  Meaning, they are designed for me to go somewhere else for the full read.

Which led me to thinking: are Tweets replacing headlines in the newspaper industry?

Think about how we use headlines.  To quickly scan what we want to read.  And we probably read a small percentage of the articles in full.

What does this mean?  If I’m Google, I’d be concerned.

Google is the front door to a huge percentage of online content.  And it earns a lot of advertising revenue for being so.  But it’s a gateway only to the specific content you’re searching for.  While search engines are tremendously useful, they’re not the only way we want to encounter online content.

Using the same analogy, Twitter is the front door to content you might want to read but aren’t searching for specifically.  Like how you scan newspaper headlines for something to read.

This would put Twitter in a powerful position if your business is to get your content read.  It’s not clear to me how Twitter intends to monetize its large base of users and volume of messages.  But you can imagine how it occupies a position between readers and writers, professional or otherwise.

For example, I use Twitter as one way to inform people about a new blog post.  And I noticed a lot of other bloggers doing the same.  If I was making a living from blogging (or journalism, or online marketing), Twitter would be awfully important as a means to communicate with readers.  And I might pay for the privilege.

Since I started writing this post, I came across this article on the decline in search traffic.  Which seems to be the corollary.  Perhaps people’s use of the Web to acquire information is shifting, to one where the role of search is diminishing and the role of “headlines” (Tweets) is rising.  If so, Google’s dominance is ending.