Two big, new business influences

One of my personal delights is to come across new concepts and ideas that advance my professional thinking.

I’m usually suspicious of “form over substance” business books.  I’m attracted to the esoteric, favoring concepts that are meaty and somewhat hard to grasp at first. To me, the really valuable stuff is probably hard.

Two things I discovered in the past year have really got me thinking.

Amazon’s recipe for success is…APIs?

Amazon’s success is hard to argue with.  Much has been written by the business media deciphering Amazon’s culture and values, such as the two-pizza rule for organizational design (which I’m a believer in, too).

This blog post by Steve Yegge takes you much deeper into a seemingly arcane Amazon mandate that might have more to do with Amazon’s success than anything else.  Seriously.

I’ll tease you a bit:

So one day Jeff Bezos issued a mandate. He’s doing that all the time, of course, and people scramble like ants being pounded with a rubber mallet whenever it happens. But on one occasion — back around 2002 I think, plus or minus a year — he issued a mandate that was so out there, so huge and eye-bulgingly ponderous, that it made all of his other mandates look like unsolicited peer bonuses.

Before you click through and read it (as you should), know that it was written by a technical leader who spent several years at Amazon before several more at Google.  He wrote this as an internal memo at Google, trying to explain why Amazon was succeeding in public cloud services where Google was not.

The gist of the memo was that Jeff Bezos himself was resolute in requiring that every Amazon system interact with every other system only through defined interfaces (APIs).

This is profound because:

  1. it meant that each team could operate autonomously from the others, maximizing each team’s agility
  2. it forced the creation of a technology services catalog well before Amazon Web Services was ever launched. Amazon was Customer 1.0 of AWS “for real” before anyone else. When AWS launched, it was truly prepared to satisfy the needs of its customer base, and it has sustained that ability over time
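To make the first point concrete, here’s a toy sketch (hypothetical names, nothing like Amazon’s real code) of what “teams interact only through defined interfaces” looks like in practice:

```python
# Hypothetical sketch of the "communicate only through interfaces" rule.
# The Orders team exposes a small, documented API; its storage stays private.

class OrdersService:
    """Public API: other teams may call only these methods."""

    def __init__(self):
        self._db = {}  # private storage; no other team touches this directly

    def place_order(self, order_id: str, item: str) -> None:
        self._db[order_id] = {"item": item, "status": "placed"}

    def get_status(self, order_id: str) -> str:
        return self._db[order_id]["status"]


# A consuming team depends only on the public methods, never on _db
orders = OrdersService()
orders.place_order("o-1", "book")
print(orders.get_status("o-1"))  # -> placed
```

The point is the boundary: the consuming team can’t reach into `_db`, so the Orders team is free to change its storage however it likes without breaking anyone.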

These days, business and IT are inextricably linked. If you can “grok” this article, you’ll be ahead of most in understanding how and why.

Wardley Maps: peeling a very large & useful onion

I’ve been interested in, and performed, strategic planning at times in my career.

However, strategy has gotten a bad reputation, and for good reason.  Most of it does little to determine whether a company succeeds or fails.

Like so much innovation, the most interesting thinking I’ve seen about strategy came from an outsider.  Simon Wardley’s experience was not in the strategy industry of consultants, MBAs and the like.  Rather, it came from his functioning (and failing?) as the leader of a technology business.

His “Wardley Maps” resonate with me because they pinpointed why my past work on strategy was flawed in ways that I couldn’t quite articulate at the time.

Here are short and long versions introducing his work:

Wardley’s work is like peeling an onion. You can explain Wardley Maps succinctly as “value chain meets the dimension of time & evolution”.  But I’ve spent many, many hours with his writings and I still feel like I’m only peeling the outer layers. He writes about gameplay, team behavioral types, and much more.

Grand unification

What’s really interesting is how Steve Yegge’s memo and Wardley’s writings relate to each other.

Wardley has used his maps to explain AWS’ success.  Through that lens, you start to understand why Amazon’s “API edict” in Yegge’s memo was so important in unleashing the company’s business agility.  That agility has made it awfully hard for anyone to catch them in e-commerce or cloud computing.

If you’re in the tech industry, I hope you find the time to explore and enjoy these authors as much as I have.

Two security products you should use now

Here are two products you should use to get a lot of extra security without a lot of hassle.

LastPass

LastPass is a browser plugin that stores your online passwords.

It eliminates the need to remember usernames and passwords for your online accounts because it auto-fills the forms whenever you log in.  Super convenient.

LastPass generates very complex passwords, which helps avoid having your password guessed by hackers’ programs.  You don’t need to remember them, because LastPass does.
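The generator idea is easy to sketch.  Here’s a minimal stand-in using Python’s `secrets` module; an illustration only, not LastPass’s actual algorithm:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run, e.g. 'k#8Qv$...'
```

Because each character is drawn at random from a large alphabet, a 20-character result is far beyond practical guessing by hackers’ programs.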

LastPass has a great report that inspects all of your stored passwords.  It quickly shows you duplicate passwords you’ve used in two or more accounts.  And it rates each password for its strength.  Don’t like what it found?  Use LastPass to generate new ones.

LastPass runs on Windows and Mac PCs as well as all smartphones.

LastPass also has a sister product, Xmarks.  It’s a bookmark manager that syncs across all of your devices.  This is great when you change devices, such as buying a new smartphone or PC.  One click and all of your bookmarks show up on the new device.

One drawback to LastPass: you must use a master password to access the LastPass repository that contains all the other passwords.  Even though they encrypt everything, you’ll need to choose (and remember) a sufficiently complex password for your LastPass account.  There are some extra features that can make access to LastPass even more secure, if you’re so inclined.

AnchorFree

AnchorFree establishes a secure, encrypted connection from your laptop to a wifi network.  This is important if you use a public wifi network, and is useful even on a password-protected one.

Like LastPass, AnchorFree gives you a lot of security without a lot of hassle.  Once you install it, it auto-connects over your network connection.  You don’t need to remember to turn it on.

AnchorFree doesn’t noticeably slow down your connection, at least here in the U.S.  They have built out their global network over the years to the point that it should perform well from most countries.

Customers also purchase AnchorFree for two other reasons:

  • it provides some anonymity due to IP address obfuscation and encrypted communications;  people living in countries where their internet use is subject to surveillance find this useful
  • its IP address obfuscation also enables people living in one country to access online entertainment content in another. For example, streaming Netflix from a country other than the U.S.

One drawback: I had trouble using AnchorFree on my iPhone.  The Mac and PC versions work fine for me.

Salesforce.com created a revolution, but not the one you think

IT departments in large and medium corporations face extinction thanks to SaaS, IaaS and PaaS vendors.  But it’s got nothing to do with “on-premise versus the Cloud”.

Rather, IT’s role in managing business applications is ending.  Business users can do for themselves in minutes what used to take an IT programmer hours or days.

What’s radical about SaaS business applications like Salesforce.com is their configurability.  The fact that they run in another data center called “the Cloud” is less significant, imho.

Think about it.  Using a browser, business users with admin permissions can do lots of stuff to tailor how the SaaS application behaves:

  • bulk import of data
  • add new fields to the database
  • create templates for workflows and business processes
  • provision new users
  • modify the role-based access model
  • design dashboards
  • …. and more

Contrast this with legacy on-premise applications like SAP.  Any change in application behavior required source-code programming in ABAP.  Dozens of IT people would feed and care for the beast, accumulating a long list of modification requests from the end-user community.  Upgrade cycles required re-implementing all of those changes against the new release.  Slow.  Expensive.  Brittle.

Thanks to the power of configuration, business department leaders are gradually and systematically dismantling on-premise ERP suites in favor of a portfolio of SaaS applications.  They are happy to be freed from the grip of a centralized IT organization.  And they’re voting with their feet (or, budgets) by consuming SaaS, IaaS and PaaS at an accelerating rate.

If IT organizations don’t re-invent themselves they’ll face extinction.  More in a future post on what re-invention might look like.

Are we underestimating the Cloud? One person’s story

I spent some time over the last couple of months getting my new company’s tooling and systems in place.  Why?

Because when the full engineering team is here soon, we’ll be in heads-down development mode along with our early customers.  No time for other stuff.

The results are pretty staggering:

  • Everything we implemented is software-as-a-service; it lives in the cloud
  • Everything is “industrial strength” in terms of feature/functionality; we’re not going to outgrow these tools and apps anytime soon
  • Everything was implemented within minutes or hours.  Enter your credit card number and go.  Tweak the configurations now or later
  • Little or no installed software on laptops
  • No servers required
  • Everything is licensed as a monthly or yearly subscription (often I had a choice of either).  Easy on the cash-flow and easy to budget for growth

All of this was done without an IT employee or consultant.  All of this was done without owning a server.  All this was done without installing (and maintaining!) software.

We’ve become immune to the hype surrounding the Cloud, but this experience reinforced the immense power of this trend.  Think about what this means to small businesses and their ability to “act big” on a budget.  Or, what this means to the IT department of a mid-size or larger corporation.

Massive change is underway and we might be underestimating it.

For the curious, here’s what we deployed so far:

  • Salesforce.com
  • Webex for conference calls and web meetings
  • Accompa for product requirements management
  • Rally for Agile product delivery
  • Jira for defect tracking
  • Github for source code control
  • Basecamp for general-purpose project management
  • Box for file repository
  • QuickBooks Online for accounting
  • ExpenseCloud for expense report management
  • Google AdWords for keyword advertising
  • Google Analytics
  • Algentis for outsourced HR, benefits and payroll administration

As we get closer to market launch, we’ll take the same approach for everything else:

  • Website content management
  • Marketing campaign management
  • Various web analytics tools
  • e-commerce and/or customer billing
  • Various software development tools

Kudos to New Relic for writing about their toolset and inspiring me to write this post.

There’s an elephant in the Big Data room, and it ain’t Hadoop

The Strata conference is this week.  It’s the seminal conference on all things Big Data.

What’s notably missing?  Any talk on data quality and ways to deal with it.

I’m shocked, given my past and current experiences and the widely circulated anecdote that “80% of an analyst’s / data scientist’s time is spent preparing data to be analyzed”.  In other words, dealing with inbound data quality.

One explanation could be that the Big Data world is still focused on single-source click-stream data.  This is the cleanest data available.

But many of the best insights come from fusing many data sets together to paint a more comprehensive picture of a subject, such as a user or customer.  And this is when it gets messy.

How do you link multiple data sets together to know it’s the same user or customer across the various sources?  How do you deal with CRM and transactional data, which is rife with duplicate records, incorrect categorizations, missing values, etc.?
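To make the linking problem concrete, here’s a toy sketch (hypothetical field names and records) that joins two sources on a normalized email key — exactly the kind of unglamorous cleanup that eats an analyst’s time:

```python
def normalize_email(email: str) -> str:
    """Crude normalization so ' Jane.Doe@Example.COM' matches 'jane.doe@example.com'."""
    return email.strip().lower()

# Two sources that describe the same person with slightly different keys
web_visits = [{"email": "Jane.Doe@Example.com", "pages": 12}]
crm = [{"email": " jane.doe@example.com", "account": "Acme"}]

# Index the CRM records by normalized key, then link the visit data
crm_by_key = {normalize_email(r["email"]): r for r in crm}

linked = []
for visit in web_visits:
    key = normalize_email(visit["email"])
    if key in crm_by_key:
        linked.append({**visit, **crm_by_key[key]})

print(linked[0]["account"])  # -> Acme
```

Real-world linkage is far messier (typos, multiple emails per person, no email at all), which is why this step consumes so much of the analytic workflow.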

If we’re to take the next step in generating value from the Big Data ecosystem, the old problems still need solving.  Hopefully Strata 2014 will be a different story.

Where are the women in tech? Updated.

Having recently co-founded my own company, I get a big role in defining its culture.  And one of the things my partner and I agree on is the need for diversity in our team across cultures, genders and everything else.

Why?  Because it makes for a more inclusive culture.  And because bringing many different points of view to bear on important decisions yields….better decisions.

To that end, I started asking friends of mine the following question: “where can we find communities of women in technology?”.  This was for the purpose of including such groups in our recruiting outreach.

The results thus far have been, to put it mildly, underwhelming.  Mostly in the form of non-responses.  And a suggestion to search on meetup.com, which is like starting mostly cold.

What’s going on?

I know there aren’t many women in engineering roles, especially relative to the number of men.  But they do exist.  And they exist in even larger numbers in roles like user experience design, another role we’re hiring for.

(We interrupt this post to note that as I’m writing this, in the lobby of a hotel, James Brown is singing “It’s a Man’s World” in the background music.  You can’t get more ironic than that!)

But despite the size of the community, why can’t I tap into this network of professional women the way I have done with so many other communities of interest?

Rather than put forward my hypotheses, I’m interested in yours.  And any connections you could make.  The journey of a thousand miles – gender equality in tech – begins with but a single step.

UPDATE

I’m pleased to report that the first two hires we’ve made are women.  Not because we went looking for them in women-specific networks; it just happened.  But it’s a big step toward preventing our early culture from being defined by a homogeneously (young) male team that mirrors the tech workforce as a whole.

UPDATE TWO

One of the women we hired resigned just weeks after joining.  Balancing a commute, parenting duties such as pickup from child care, and the demands of a startup was just too much for her.  It certainly strengthens the argument that it’s tough to “have it all”, at least at certain stages of one’s career.

Big Data 2012: The “trough of disillusionment” and how to get past it

I spent some time at the Hadoop Summit this week, and spent lots of time in the prior weeks with entrepreneurs, practitioners and VCs in this space.  My prediction: we are entering what Gartner would call the “trough of disillusionment” right about now.  The hype has left reality behind.

This is not a special insight of mine.  All emerging technologies go through stages of hype just as the Big Data movement is now.  Rather, I’d like to focus on why people will become disillusioned, and how to get past it.

Your data sucks

Big Data can deal with less-than-perfect data in many cases.  On collection, Hadoop doesn’t require parsing data into a schema, so you can leave it unparsed and de-normalized at first.

On analysis, lots of Big Data use cases are based on non-financial data.  So there’s tolerance for approximations (or, “confidence intervals” if you’re into stats).  For example, can I deal with a predictive model that says a user is 95% likely to churn?  You bet.
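Acting on such a probabilistic score is straightforward once you accept the approximation.  A toy sketch, with made-up scores and a hypothetical business threshold:

```python
# Toy sketch: acting on probabilistic churn scores rather than certainties
churn_scores = {"user-1": 0.95, "user-2": 0.40, "user-3": 0.97}

THRESHOLD = 0.9  # hypothetical cutoff chosen by the business

# Flag everyone at or above the threshold for a retention campaign
at_risk = [user for user, p in churn_scores.items() if p >= THRESHOLD]
print(at_risk)  # -> ['user-1', 'user-3']
```

The business decision is the threshold, not the individual prediction; that reframing is what lets imperfect data still drive action.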

But the old adage “garbage in, garbage out” still applies.  I see lots of cases where joining data sets remains a challenge because of a lack of serialized keys.  For example, is a visitor to your web site the same one who went to your community forums for help?  By the way, you’d better avoid using IP address as your key, because it’s Personally Identifiable Information in many jurisdictions.  So it remains tough to develop a single view of the customer or user, especially when web-based touchpoints are everywhere, including in Enterprise business models.
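One common workaround — my suggestion, not legal advice — is to join on a salted (keyed) hash of a stable identifier rather than the raw value, so cooperating systems agree on a key without storing the identifier in the clear:

```python
import hashlib
import hmac

SECRET_SALT = b"rotate-me-regularly"  # hypothetical; keep out of source control

def surrogate_key(identifier: str) -> str:
    """Deterministic pseudonymous key: same input -> same key, raw value not stored."""
    return hmac.new(SECRET_SALT, identifier.encode(), hashlib.sha256).hexdigest()

# Both systems compute the same key from the same email, so records join
key_a = surrogate_key("jane.doe@example.com")
key_b = surrogate_key("jane.doe@example.com")
print(key_a == key_b)  # -> True
```

Whether a salted hash counts as de-identified varies by jurisdiction, so treat this as an engineering pattern to raise with your privacy counsel, not a compliance guarantee.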

Organizations also get hung up on the approximation game.  The qualifier “likely” sends a chill through some people faced with making important business decisions based on analytic insights.  Yet it is inherent when dealing with imperfect data.  So people wait for perfect data to arrive in their analytics systems.  And wait.  And wait.

Your analytics platform requires programmers to operate

This seems innocuous enough.  Aren’t programmers available to hire?

Let’s draw a comparison.  Legacy analytics platforms, namely those built on SQL databases and BI tools, don’t need much programming anymore; the platforms have matured since they emerged.  An Oracle DBA has lots of tools to configure and manage that database, and doesn’t need to do command-line programming thanks to those toolsets.

Compare that to Hadoop platforms, where programming is often required just to extract data from the data store.  Until the platform matures and these tasks are abstracted away by good tools, you’re faced with the prospect of hiring programmers.  The people who know how to do this command huge salaries and juggle multiple job offers.  Paying these market rates is not an easy conversation with your boss.
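To illustrate “programming just to extract data”: in that era, even a simple count over raw logs meant hand-writing a map step and a reduce step.  A pure-Python sketch of the streaming pattern, with a made-up log format:

```python
from collections import defaultdict

# Hypothetical raw log lines sitting in the data store, schema-on-read style
raw_logs = [
    "GET /home 200",
    "GET /cart 500",
    "GET /home 200",
]

# Map: emit (status_code, 1) for every log line
mapped = [(line.split()[-1], 1) for line in raw_logs]

# Shuffle + reduce: sum the counts per key, as a Hadoop reducer would
counts = defaultdict(int)
for status, n in mapped:
    counts[status] += n

print(dict(counts))  # -> {'200': 2, '500': 1}
```

On a SQL platform this is one `GROUP BY`; on early Hadoop it was custom code like the above, distributed across a cluster — hence the demand for programmers.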

Your data requires statisticians to make sense of it

Statisticians, like the programmers above, are in scarce supply.  And they are commanding their own big salaries and lots of offers.

But to be fair, part of what makes Big Data exciting is the use of statistical analysis methods on business data as a mainstream discipline.  As hard as the work is today, lots of business insights are coming from new ways of looking at data.  So let’s not “throw the baby out with the bathwater”.

The bad news in sum: it’s going to take longer – and more money – to capture the promise of Big Data

What to do?

First, and most important, paint a vision for Big Data that is compelling, and creates unwavering executive support.  This is a marathon, not a sprint.  You will need executive support for a long time.

Second, make that support conditional on interim results.  Chunk the journey into phases where you can declare victory against interim milestones.  No executive likes to take risks that take years to prove out.  So make sure the phased plan delivers good news along the way, and early detection when things go awry.

Speaking of continuous wins, don’t forget to visualize the results.  Pictures are vital in getting the results across and sustaining the excitement over long periods of time.  If you haven’t thought about hiring a visualization specialist for your team, do it.

Get the job done without experienced programmers and statisticians.  This means finding talent absent experience: the kind of talent that can learn these tools and methods in a self-directed way.  Someone with a good computer science background can learn Hadoop provided they have the curiosity and the will.  Just be a little patient while they get up to speed, and link your phased deployment to their ramp-up so nothing crashes and burns along the way.

The same could be said for stats skills.  I recently hired a master’s graduate in marketing who had a basic command of math.  But she learned the statistical tools on the fly to get the job done.  You can test for math aptitude by giving assignments during the interview stage, or even to existing employees.  This stuff can be learned.  Like the programmer role above, stage your initiative according to the learning journey of your analysts, so that the tasks don’t outstrip their developing capabilities.

All is not lost

Getting the maximum value out of Big Data is hard, and it’s a long journey.  Data quality is never a quick fix.  Nor is it quick or easy to hire the specialist skills presently required.

It will get easier.  Eventually, the vendor community will deliver point-and-click capability that abstracts away much of the coding required today of Hadoop admins and statisticians.

In parallel, sell the vision.  Deliver interim results.  Pay attention to visualization.  And look for latent talent.  Do these and you’ll be a Big Data hero.