Two big, new business influences

One of my personal delights is to come across new concepts and ideas that advance my professional thinking.

I’m usually suspicious of “form over substance” business books.  I’m attracted to the esoteric, favoring concepts that are meaty and somewhat hard to grasp at first. To me, the really valuable stuff is probably hard.

Two things I discovered in the past year have really got me thinking.

Amazon’s recipe for success is… APIs?

Amazon’s success is hard to argue with.  Much has been written by the business media about their formula for success, deciphering Amazon’s culture and values such as the 2-pizza rule for organizational design (which I am a believer in, too).

This blog post by Steve Yegge takes you much deeper into a seemingly arcane Amazon mandate that might have more to do with Amazon’s success than anything else.  Seriously.

I’ll tease you a bit:

So one day Jeff Bezos issued a mandate. He’s doing that all the time, of course, and people scramble like ants being pounded with a rubber mallet whenever it happens. But on one occasion — back around 2002 I think, plus or minus a year — he issued a mandate that was so out there, so huge and eye-bulgingly ponderous, that it made all of his other mandates look like unsolicited peer bonuses.

Before you click away and read it (as you should), it’s written by a technical leader who worked at Amazon for several years before working at Google for several years. He wrote this as an internal memo at Google, trying to explain why Amazon was succeeding where Google was not in the realm of public cloud services.

The gist of the memo was that Jeff Bezos himself was resolute in requiring that every Amazon system interact with every other system only through defined interfaces (APIs).

This is profound because:

  1. It meant that each team could operate autonomously of the others, maximizing each team’s agility
  2. It forced the creation of a technology services catalog well before Amazon Web Services was ever launched. Amazon was Customer 1.0 of AWS “for real” before anyone else. When AWS launched, it was truly prepared to satisfy the needs of its customer base, and it has sustained that ability over time
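To make the mandate concrete, here’s a minimal sketch (my own illustration, not Amazon’s actual design): one team’s service exposes a single JSON request/response entry point, and other teams call only that interface, never the underlying data store.

```python
import json

class InventoryService:
    """Owned by one team; its internal dict is a private implementation detail."""

    def __init__(self):
        self._stock = {"widget": 12, "gadget": 0}  # nobody outside reads this directly

    def handle(self, request: str) -> str:
        """The ONLY entry point other teams may use: JSON in, JSON out."""
        req = json.loads(request)
        if req.get("op") == "in_stock":
            item = req.get("item")
            return json.dumps({"item": item, "in_stock": self._stock.get(item, 0) > 0})
        return json.dumps({"error": "unknown op"})

# Another team talks to inventory only over the interface, as if over a network.
def can_fulfill(order_item: str, inventory: InventoryService) -> bool:
    reply = json.loads(inventory.handle(json.dumps({"op": "in_stock", "item": order_item})))
    return reply.get("in_stock", False)

svc = InventoryService()
print(can_fulfill("widget", svc))  # True
print(can_fulfill("gadget", svc))  # False
```

Because callers depend only on the message contract, the inventory team can swap databases, languages, or hosts without breaking anyone, which is exactly the autonomy point 1 describes.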

These days, business and IT are inextricably linked. If you can “grok” this article, you’ll be ahead of most in understanding how and why.

Wardley Maps: peeling a very large & useful onion

I’ve been interested in, and performed, strategic planning at times in my career.

However, strategy has gotten a bad reputation, and for good reason: most of it fails to lead companies to success.

Like so much innovation, the most interesting thinking I’ve seen about strategy came from an outsider.  Simon Wardley’s experience was not from the strategy industry of consultants, MBAs and the like.  Rather, it came from his functioning (and failing?) as the leader of a technology business.

His “Wardley Maps” resonate with me because they pinpointed why my past work on strategy was flawed in ways that I couldn’t quite articulate at the time.

Here are short and long versions introducing his work:

Wardley’s work is like peeling an onion. You can explain Wardley Maps succinctly as “value chain meets the dimension of time & evolution”.  But I’ve spent many, many hours with his writings and I still feel like I’m only peeling the outer layers. He writes about gameplay, team behavioral types, and much more.

Grand unification

What’s really interesting is how Steve Yegge’s memo and Wardley’s writings relate to each other.

Wardley has written about AWS’s success, using Wardley Maps to explain why it happened.  You start to understand why Amazon’s “API edict” in Yegge’s memo was so important in unleashing their business agility.  That agility has made it awfully hard for anyone to catch them in e-commerce or cloud computing.

If you’re in the tech industry, I hope you find the time to explore and enjoy these authors as much as I have.

Two security products you should use now

Here are two products you should use to get a lot of extra security without a lot of hassle.


LastPass is a browser plugin that stores your online passwords.

It eliminates the need to remember usernames and passwords for your online accounts because it auto-fills the forms whenever you log in.  Super convenient.

LastPass generates very complex passwords, which helps avoid having your password guessed by hackers’ programs.  You don’t need to remember them, because LastPass does.

LastPass has a great report that inspects all of your stored passwords.  It quickly shows you duplicate passwords you’ve used in two or more accounts.  And it rates each password for its strength.  Don’t like what it found?  Use LastPass to generate new ones.
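The two ideas above, generating high-entropy passwords and flagging reuse, can be sketched in a few lines.  This is my own toy illustration of the concepts, not LastPass’s actual algorithm:

```python
import secrets
import string
from collections import Counter

def generate_password(length: int = 20) -> str:
    """Generate a high-entropy password from a large alphabet,
    similar in spirit to a password manager's generator."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def find_duplicates(vault: dict) -> list:
    """Flag accounts whose password is reused elsewhere,
    like the duplicate report described above."""
    counts = Counter(vault.values())
    return [site for site, pw in vault.items() if counts[pw] > 1]

vault = {"bank": "hunter2", "email": "hunter2", "forum": generate_password()}
print(sorted(find_duplicates(vault)))  # ['bank', 'email']
```

The key design choice is using a cryptographically secure random source (Python’s `secrets`) rather than ordinary pseudo-randomness, which is what makes the output resistant to guessing programs.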

LastPass runs on Windows and Mac PCs as well as all smartphones.

LastPass also has a sister product, Xmarks: a bookmark manager that syncs across all of your devices.  This is great when you change devices, such as buying a new smartphone or PC.  One click and all of your bookmarks show up on the new device.

One drawback to LastPass: you must use a master password to access the LastPass repository that contains all the other passwords.  Even though they encrypt everything, you’ll need to choose (and remember) a sufficiently complex password for your LastPass account.  There are some extra features that can make access to LastPass even more secure, if you’re so inclined.


AnchorFree establishes a secure, encrypted connection from your laptop to a wifi network.  This is important if you use a public wifi network, and is useful even on a password-protected one.

Like LastPass, AnchorFree gives you a lot of security without a lot of hassle.  Once you install it, it auto-connects over your network connection.  You don’t need to remember to turn it on.

AnchorFree doesn’t noticeably slow down your connection, at least here in the U.S.  They have built out their global network over the years to the point that it should perform well from most countries.

Customers also purchase AnchorFree for two other reasons:

  • it provides some anonymity due to IP address obfuscation and encrypted communications;  people living in countries where their internet use is subject to surveillance find this useful
  • its IP address obfuscation also enables people living in one country to access online entertainment content in another. For example, streaming Netflix from a country other than the U.S.

One drawback: I had trouble using AnchorFree on my iPhone.  The Mac and PC versions work fine for me.

created a revolution, but not the one you think

IT departments in large and medium corporations face extinction thanks to SaaS, IaaS and PaaS vendors.  But it’s got nothing to do with “on-premise versus the Cloud”.

Rather, IT’s role in managing business applications is ending.  Business users can do for themselves in minutes what used to require an IT programmer hours and days.

What’s radical about SaaS business applications is their configurability.  The fact that they run in another data center called “the Cloud” is less significant, imho.

Think about it.  Using a browser, business users with admin permissions can do lots of stuff to tailor how the SaaS application behaves:

  • bulk import of data
  • add new fields to the database
  • create templates for workflows and business processes
  • provision new users
  • modify the role-based access model
  • design dashboards
  • …. and more
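The essence of that list is that behavior lives in data, not in source code.  Here’s a hedged sketch of the idea (a hypothetical mini-app of my own, not any vendor’s real admin API): an admin “adds a field to the database” as configuration, with no programmer involved.

```python
class SaaSApp:
    """Toy model of a configurable SaaS application."""

    def __init__(self):
        self.schema = {"name": str, "email": str}   # base fields shipped by the vendor
        self.records = []

    def add_custom_field(self, field: str, ftype: type) -> None:
        """An admin adds a field with a click: pure configuration, no code change."""
        self.schema[field] = ftype

    def create_record(self, **values):
        """Records are validated against the *current* schema, whatever it is."""
        for field, value in values.items():
            if field not in self.schema:
                raise KeyError(f"unknown field: {field}")
            if not isinstance(value, self.schema[field]):
                raise TypeError(f"{field} must be {self.schema[field].__name__}")
        self.records.append(values)

app = SaaSApp()
app.add_custom_field("region", str)              # configuration, not programming
app.create_record(name="Acme", email="a@b.com", region="EMEA")
print(len(app.records))  # 1
```

Contrast this with a compiled, on-premise system: the same change there would mean source code, a build, and a deployment cycle.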

Contrast this to legacy on-premise applications like SAP.  Any change in application behavior required source code programming in “ABAP”.  Dozens of IT people would care for and feed the beast, accumulating a long list of modification requests from the end-user community.  Upgrade cycles would require re-implementation of all of these changes against the new release.  Slow.  Expensive.  Brittle.

Thanks to the power of configuration, business department leaders are gradually and systematically dismantling on-premise ERP suites with a group of SaaS applications.  They are happy to be freed from the grip of a centralized IT organization.  And they’re voting with their feet (or, budgets) by consuming SaaS, IaaS and PaaS at an accelerating rate.

If IT organizations don’t re-invent themselves they’ll face extinction.  More in a future post on what re-invention might look like.

Are we underestimating the Cloud? One person’s story

I spent some time in the last couple of months getting my new company’s tooling and systems in place.  Why?

Because when the full engineering team is here soon, we’ll be in heads-down development mode along with our early customers.  No time for other stuff.

The results are pretty staggering:

  • Everything we implemented is software-as-a service; it lives in the cloud
  • Everything is “industrial strength” in terms of feature/functionality; we’re not going to outgrow these tools and apps anytime soon
  • Everything was implemented within minutes or hours.  Enter your credit card number and go.  Tweak the configurations now or later
  • Little or no installed software on laptops
  • No servers required
  • Everything is licensed as a monthly or yearly subscription (often I had a choice of either).  Easy on the cash-flow and easy to budget for growth

All of this was done without an IT employee or consultant.  All of this was done without owning a server.  All this was done without installing (and maintaining!) software.

We’ve become immune to the hype surrounding the Cloud, but this experience reinforced the immense power of this trend.  Think about what this means to small businesses and their ability to “act big” on a budget.  Or, what this means to the IT department of a mid-size or larger corporation.

Massive change is underway and we might be underestimating it.

For the curious, here’s what we deployed so far:

  • Webex for conference calls and web meetings
  • Accompa for product requirements management
  • Rally for Agile product delivery
  • Jira for defect tracking
  • Github for source code control
  • Basecamp for general-purpose project management
  • Box for file repository
  • QuickbooksOnline for accounting
  • ExpenseCloud for expense report management
  • Google AdWords for keyword advertising
  • Google Analytics
  • Algentis for outsourced HR, benefits and payroll administration

As we get closer to market launch, we’ll take the same approach for everything else:

  • Website content management
  • Marketing campaign management
  • Various web analytics tools
  • e-commerce and/or customer billing
  • Various software development tools

Kudos to New Relic for writing about their toolset and inspiring me to write this post.

There’s an elephant in the Big Data room, and it ain’t Hadoop

The Strata conference is this week.  It’s the seminal conference on all things Big Data.

What’s notably missing?  Any talk on data quality and ways to deal with it.

I’m shocked, given my past and current experiences and the widely circulated anecdote that “80% of an analyst’s / data scientist’s time is spent preparing data to be analyzed”.  In other words, dealing with inbound data quality.

One explanation could be that the Big Data world is still focused on single-source click-stream data.  This is the cleanest data available.

But many of the best insights come from fusing many data sets together to paint a more comprehensive picture of a subject, such as a user or customer.  And this is when it gets messy.

How do you link multiple data sets together to know it’s the same user or customer across the various sources?  How do you deal with CRM and transactional data, which is rife with duplicate records, incorrect categorizations, missing values, etc.?
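Linking records across sources usually starts with deriving a normalized key, because the raw identifiers rarely match byte-for-byte.  A hedged sketch of that first step (toy data and a deliberately crude normalizer of my own; real record linkage goes much further):

```python
def normalize(email: str) -> str:
    """Crude canonical key: lowercase, strip whitespace, drop '+tag' aliases."""
    local, _, domain = email.strip().lower().partition("@")
    local = local.split("+", 1)[0]
    return f"{local}@{domain}"

# Two sources describing the same person with differently-formatted keys.
crm = {" Jane.Doe@Example.COM ": {"segment": "enterprise"}}
web = {"jane.doe+news@example.com": {"visits": 42}}

merged = {}
for source in (crm, web):
    for raw_key, attrs in source.items():
        merged.setdefault(normalize(raw_key), {}).update(attrs)

print(merged)  # one unified record keyed on 'jane.doe@example.com'
```

Even this trivial case shows why analysts burn so much time here: every pair of sources needs its own matching rules, and the rules are never perfect.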

If we’re to take the next step in generating value from the Big Data ecosystem, the old problems still need solving.  Hopefully Strata 2014 will be a different story.

Where are the women in tech? Updated.

Having recently co-founded my own company, I get a big role in defining its culture.  And one of the things my partner and I agree on is the need for diversity in our team across cultures, genders and everything else.

Why?  Because it makes for a more inclusive culture.  And because bringing many different points of view to bear on important decisions yields….better decisions.

To that end, I started asking friends of mine the following question: “where can we find communities of women in technology?”.  This was for the purpose of including such groups in our recruiting outreach.

The results thus far have been, to put it mildly, underwhelming.  Mostly in the form of non-responses.  And a suggestion to search on, which is like starting mostly cold.

What’s going on?

I know there aren’t many women in engineering roles, especially in proportion to the percent of men.  But they do exist.  And they exist in even larger numbers in roles like user experience design, another role we’re looking for.

(We interrupt this post to note that as I’m writing this, in the lobby of a hotel, James Brown is singing “It’s a Man’s World” in the background music.  You can’t get more ironic than that!)

But despite the size of the community, why can’t I tap into this network of professional women the way I have done with so many other communities of interest?

Rather than put forward my hypotheses, I’m interested in yours.  And any connections you could make.  The journey of a thousand miles – gender equality in tech – begins with but a single step.


I’m pleased to report that the first two hires we’ve made are women.  Not because we went looking for them in women-specific networks; it just happened.  But it’s a big step toward preventing our early culture from being defined by a homogeneously (young) male team that mirrors the tech workforce as a whole.


One of the women we hired resigned just weeks after joining.  Balancing a commute to work, parenting duties such as pickup from child care, and the demands of a startup was just too much for her.  It certainly supports the argument that it’s tough to “have it all”, at least at certain stages in one’s career.

Big Data 2012: The “trough of disillusionment” and how to get past it

I spent some time at the Hadoop Summit this week, and spent lots of time in the prior weeks with entrepreneurs, practitioners and VCs in this space.  My prediction: we are entering what Gartner would call the “trough of disillusionment” right about now.  The hype has left reality behind.

This is not a special insight of mine.  All emerging technologies go through stages of hype just as the Big Data movement is now.  Rather, I’d like to focus on why people will become disillusioned, and how to get past it.

Your data sucks

Big Data can deal with less-than-perfect data in many cases.  On collection, Hadoop doesn’t require parsing data into a schema, so you can leave it unparsed and de-normalized at first.

On analysis, lots of Big Data use cases are based on non-financial data.  So there’s tolerance for approximations (or, “confidence intervals” if you’re into stats).  For example, can I deal with a predictive model that says a user is 95% likely to churn?  You bet.

But the old adage “garbage in, garbage out” still applies.  For example, I see lots of cases where joining data sets remains a challenge because of a lack of serialized keys.  For example, is a visitor to your web site the same one who went to your community forums for help?  By the way, you’d better avoid using IP address as your key because it’s Personally Identifiable Information in many jurisdictions.  So it remains tough to develop a single view of the customer/user, especially when web-based touchpoints are everywhere including in Enterprise business models.
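One common workaround for the missing-key problem is to derive a keyed, salted surrogate key, so data sets can be joined without storing the raw identifier.  A sketch under assumptions (the salt name and data are hypothetical; whether a keyed hash satisfies a given privacy regime is a legal question, not just a technical one):

```python
import hashlib
import hmac

SECRET_SALT = b"rotate-me-regularly"   # hypothetical key, kept out of the data sets

def surrogate_key(identifier: str) -> str:
    """Keyed hash (HMAC-SHA256) of an identifier: joinable, but not the raw value."""
    return hmac.new(SECRET_SALT, identifier.encode(), hashlib.sha256).hexdigest()

# Two data sets keyed on the surrogate instead of the raw IP address.
site_visits = {surrogate_key("203.0.113.7"): ["home", "pricing"]}
forum_posts = {surrogate_key("203.0.113.7"): ["how do I reset my password?"]}

shared = set(site_visits) & set(forum_posts)
print(len(shared))  # 1 -- the two sources join without either storing the raw IP
```

Using an HMAC rather than a plain hash matters: without the secret key, anyone could hash the (small) space of possible IPs and reverse the mapping.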

Organizations also get hung up with the approximation game.  Use of the qualifier “Likely” sends a chill through some people when faced with making important business decisions based on analytic insights.  Yet this is inherent when dealing with imperfect data.  So people wait for perfect data to arrive into their analytics systems.  And wait.  And wait.

Your analytics platform requires programmers to operate

This seems innocuous enough.  Aren’t programmers available to hire?

Let’s draw a comparison.  Legacy analytics platforms, namely those built on SQL databases and BI tools, have matured to the point that they don’t need much programming.  An Oracle DBA has lots of tools to configure and manage that database, and doesn’t need to do command-line programming thanks to those toolsets.

Compare that to Hadoop platforms, where programming is often required to even extract data from the data store.  Until the platform matures and these tasks are abstracted away by good tools, you’re faced with the prospect of hiring programmers.  The people that know how to do this are commanding huge salaries and multiple job offers.  Paying these market rates is not an easy conversation with your boss.

Your data requires statisticians to make sense of it

Statisticians, like the programmers above, are in scarce supply.  And they are commanding their own big salaries and lots of offers.

But to be fair, part of what makes Big Data exciting is the use of statistical analysis methods on business data as a mainstream discipline.  As hard as the work is today, lots of business insights are coming from new ways of looking at data.  So let’s not “throw the baby out with the bathwater”.

The bad news in sum: it’s going to take longer – and more money – to capture the promise of Big Data

What to do?

First, and most important, paint a vision for Big Data that is compelling, and creates unwavering executive support.  This is a marathon, not a sprint.  You will need executive support for a long time.

Second, make the support conditional on interim results.  Chunk up the journey into phases, where you can declare victory against interim milestones.  No executive likes to take risks that will take years to prove.  So make sure the phased plan delivers good news along the way, and early detection when things go awry.

Speaking of continuous wins, don’t forget to visualize the results.  Pictures are vital in getting the results across and sustaining the excitement over long periods of time.  If you haven’t thought about hiring a visualization specialist for your team, do it.

Get the job done without experienced programmers and statisticians.  This gets to finding talent absent experience; the kind of talent that can learn these tools and methods in a self-directed way.  Someone with a good computer science background can learn Hadoop provided they have the curiosity and the will.  Just be a little patient while they get up to speed, and link your phased deployment to their ramp-up so nothing crashes and burns along the way.

The same could be said for stats skills.  I recently hired a masters graduate in marketing, who had a basic command of math.  But she learned the statistical tools on the fly to get the job done.  You can test for math aptitude by giving assignments during the interview stage, or even to existing employees.  This stuff can be learned.  Like the programmer role above, stage your initiative according to the learning journey of your analysts, so that the tasks don’t outstrip their developing capabilities.

All is not lost

Getting the maximum value out of Big Data is hard, and it’s a long journey.  Data quality is never a quick fix.  Nor is it quick or easy to hire the specialist skills presently required.

It will get easier.  Eventually, the vendor community will deliver point and click capability that abstracts away much of the coding required today by Hadoop admins and statisticians.

In parallel, sell the vision.  Deliver interim results.  Pay attention to visualization.  And look for latent talent.  Do these and you’ll be a Big Data hero.

My hero, the Software Architect

In my many years doing product management or managing the function, the number one blocker to getting the features I want (and users need) is… software architecture.

Reading Mike Driscoll’s recent blog on software craftspeople reminded me that this architecture topic has been stewing in my brain for a while now.  Time to write about it.

“Too hard”, “too complex”, “too long” are the persistent reasons behind engineers’ resistance to feature requests or major product pivots.  What I realized is that in every case, it was the software architecture holding us back.  More specifically, the lack of componentization and modularity.

And the pattern spans every experience I’ve had; across lots of different products, across lots of different market sectors, across lots of different architectures (from client-side tools to client/server apps to SaaS/cloud apps), and across lots of different company sizes (pre-revenue to behemoths like SAP and EMC).

Need a new UI presentation tier?  Sorry, that code is co-mingled with the underlying business logic.  Need a new data management tier?  Sorry, the file system is bound to the rest of the code.  Need new business objects to show up in the schema?  Sorry, we can’t split our giant table and it’s already too big to extend.
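The first complaint above, a presentation tier co-mingled with business logic, is the easiest to illustrate.  Here’s a minimal sketch (my own toy example) of the separation a good architect insists on: business rules that know nothing about how they’re displayed, so either tier can be replaced alone.

```python
class CartLogic:
    """Pure business rules: no printing, no HTML, no file I/O."""

    def __init__(self):
        self.items = []

    def add(self, name: str, price: float) -> None:
        self.items.append((name, price))

    def total(self) -> float:
        return sum(price for _, price in self.items)

def render_text(cart: CartLogic) -> str:
    """One of many possible presentation tiers; swapping it touches no logic."""
    lines = [f"{name}: ${price:.2f}" for name, price in cart.items]
    lines.append(f"TOTAL: ${cart.total():.2f}")
    return "\n".join(lines)

cart = CartLogic()
cart.add("widget", 9.50)
cart.add("gadget", 4.50)
print(render_text(cart))
```

When the tiers are co-mingled, replacing the text renderer with, say, a web UI means re-testing the pricing rules too; when they’re separated, a new `render_html` can be added without touching `CartLogic` at all.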

One can understand how this predicament arose.  When new products are built, what’s required is focus on solving the user problem at hand.  You have neither the time nor the money to design for unknown needs and future flexibility.  So why pay for abstraction and modularity without any present-day reason?

The bigger problem is when products mature and the user needs outgrow or diverge from the capabilities of the original architecture.  What to do next?  Re-factor and modularize?  Re-build from scratch?  Limp along by stuffing new features into the code but with huge effort each time?

Nobody knows the magic formula for how to make these decisions.  Re-factoring scares the crap out of engineers lest they “break something”.  After all, by the time this discussion arises, you’ve got spaghetti for code.  And the folks who wrote it might not be around anymore.

Re-write scares the crap out of the business leaders, since it appears to be paying twice for the same product.  And there’s the inherent risk of missing deadlines.  Oh yeah, and you just put your legacy product version on life support so you can afford to staff the engineers on the re-write project.  And you’re losing ground to competitors along the way, since you’ve stopped new feature development to pay for a better architecture.

No wonder products whose architecture devolved to something bad, or started that way, never get fixed.

This vicious cycle is what creates the opportunity for “innovation” in the form of a start-up that has the benefit of a clean sheet of paper: fresh, elegant code using state-of-the-art languages, components and tools.  That seems like a wasteful way to solve the problem.

Enter the architect.  If you have a great architect, every problem is reduced in magnitude.

With a great architect, new products have some modularity and flexibility designed in.  A little bit of future-proofing goes a long way. Existing products can be selectively modularized and modernized so the new functional capabilities are delivered without breakage.  And if the time comes for a re-write, you have confidence that all of the lessons learned from the legacy code base are applied to the new design.  Thus, a greater chance of success, especially in meeting a deadline.

So, what makes a great architect?  In many respects, a lot of the same characteristics that make a great product manager: curiosity, an ability to translate what users and salespeople need into technical terms, abstract thinking that enables one to imagine new possibilities, etc.  Of course, the architect also needs the deep technical experience too.

Back to the premise of Mike Driscoll’s article: the best software is being built by people with, dare I say it, experience.  Experience to avoid pitfalls because she messed something up before.  Experience to choose the right tools for the job, much like a fine craftsman that builds furniture, or houses, or bespoke clothing. Experience to know what degree of flexibility to design in, without paying for needless flexibility that feels more like insurance against every conceivable future requirement.

I’ve known some good architects and probably only one or two great ones.  With the great ones, we have had some huge debates, thanks to the force of personality that seems to come with greatness.  But in the end, despite the strong personalities, great architects are worth having.  And the great product companies know this, which is why they spend a lot of money on them.

I say it’s money well spent.

“Creative Destruction and Netflix”: Part Two

I wrote a while ago in admiring terms about how Reed Hastings was trying to disrupt his own business before others did it to him.  As in, splitting the mail-order DVD business from the online streaming business at Netflix.

The backlash to this announcement was pretty huge.  To the point that Netflix had to “undo” the announcement.  Talk about a black eye.

The timing of the decision is certainly up for debate given how customers reacted.  So let’s say it was premature.  But how premature?

I still contend this is the right decision.  Eventually.  Just like the Pony Express was rendered obsolete by the telegraph, so too will mail-order media delivery be made obsolete by streaming delivery.  Who would believe otherwise?

It’s darned hard to time such changes, however.  Most companies never make the leap at all, hence books like The Innovator’s Dilemma.  For those who have the bravery to do so like Netflix, the timing is a perilous choice.  Too soon?  Investors punish you for cannibalizing current revenue and alienating happy, paying customers.  Too late?  You’ll probably never catch up.

If I knew the answer, I’d be a rich man.  My sense is that it’s impossible to make a formula.  Instead, it’s about getting the whole set of stakeholders on the same page.  So when it’s time to jump off the cliff, everyone is holding hands.  CEO, executive team, Board of Directors, large investors.  Not a small task.  At least you’re trying, Reed.

The “Measurement Wars”

This post is a bit long, but it ends with why your personal life will be spied upon by your company’s competitors.  Curious?  Read on.

Information – data turned into meaning – is going to (further) disrupt just about every industry there is.  Many of the strong will become weak.  Tiny upstarts, like Davids, will topple Goliaths.  And this could all happen at the expense of your personal privacy.

I know, I know, the “information revolution” has been predicted for decades.  Except that, like many predictions, exactly when and how they come true differs from what’s first assumed.  “Video calling” has been predicted as inevitable since the 1950s.  But nobody saw Skype as the inevitable means for it to come true.

To see the future, let’s look at what has happened on Wall Street and how the same is about to happen in other industries.

What’s happened on Wall Street

Fortunes on Wall Street are made through arbitrage.  Years ago, the winners were those who had better research teams: who had the better information about a company’s stock?  Currency arbitrage followed: who can spot temporary asymmetries in currency prices and execute a trade the fastest?

Across every tradable instrument, the arbitrage game on Wall Street has moved from high-latency (you know something I don’t, for days or weeks at a time) to near-zero-latency.  New fiber optic networks have been built to shave milliseconds from the average trade.  And entire data centers have been relocated to close proximity to these new networks.  Whatever advantage you have must now be exploited in real time.  Or it’s not an advantage.

As the arms race of zero-latency arbitrage has unfolded, two other trends have been its critical enablers.  The first is data acquisition.  Every trade on every exchange can be captured and analyzed, alongside reams of other data about the companies, markets and countries that pertain to those financial instruments.  The scope and scale is increasing by orders of magnitude.

  • An example of scope: the latest algorithms even ingest Twitter feeds in real time to discern investor sentiment.  Including your latest Tweet about some stock you like
  • An example of scale: banks and securities firms store more information per company than any other industry (see this great report from McKinsey)
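To make the Twitter example concrete, here is a deliberately toy sketch of the data flow: score incoming posts against word lists to get a crude sentiment signal.  (Real trading systems use far more sophisticated NLP; the word lists and data here are my own invention.)

```python
POSITIVE = {"love", "great", "bullish", "up"}
NEGATIVE = {"hate", "terrible", "bearish", "down"}

def sentiment(text: str) -> int:
    """Net count of positive minus negative words, after crude tokenization."""
    words = {w.strip(".,!?$'") for w in text.lower().split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

stream = [
    "I love $ACME, great quarter",
    "$ACME looking terrible, I'm bearish",
]
scores = [sentiment(t) for t in stream]
print(scores)  # [2, -2]
```

Even this cartoon version shows why scope matters: every new data source (tweets, filings, news wires) is just another stream feeding the same scoring pipeline.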

The second enabler is the ability to make sense of the collected data.  No longer do freshly-minted MBAs toil into the night to build Excel models with quaint calculations like “EBITDA” or “Return on Invested Capital”.  Today, PhD statisticians build algorithms so complex that they themselves have trouble making sense of them in use.  Michael Lewis’ “The Big Short” is a great read on this topic.

From Wall Street to Main Street?

Wall Street started its journey by going on a data collection binge, from which it developed its algorithms, from which it automated its trading such that milliseconds mattered.

It’s happening now in online consumer businesses.  Wonder why Google, Facebook et al are under such scrutiny for collecting your data to the point of privacy invasion?  Because they understand that data is the raw material that feeds their statisticians, that feeds their algorithms, that feeds their money-making.

Except “money-making” in this case is the price they can charge an advertiser for an ad on one of their web pages.  The more relevant the ad to a consumer’s interests, the higher a premium the ad will command.  How to discern consumer interest?  Profile the heck out of them.

It’s happening elsewhere too.

The “Measurement Wars”

New companies, or ones that successfully reinvent themselves, will start their innovation and disruption journey by gathering reams of data, then finding the relationships between that data.

The forthcoming data acquisition binge is going to amplify online privacy issues.  Collecting competitive intelligence will border on spying on other companies’ employees.

It will also tempt companies to sell their internal data to others.  Could we see new “data merchants” emerge?  Would you like to know the kilowatt-per-hour energy consumption for every household in America?  Someone would, and would like an electric utility to sell them that data for some commercial advantage.

Orwellian, or progress?

Every boundary that separates individuals, companies, cultures and countries will be subject to elimination or reduction.  Our ability to learn, empathize and understand differences across these boundaries will be exponentially enhanced.  Which is good.

The ability to use that same understanding to exploit others will also increase exponentially.  Which is bad.

My take?  We have dealt with many past innovations that could be used to exploit someone or something.  Each time, a new equilibrium was established and humankind moved on.

But it was the period of rapid transition, and resulting destabilization, that was the most dangerous.  We are in such a period now, I reckon.

Governments must understand the coming hunger for massive data collection, and act to mitigate the risks, if we are to emerge on the other side unscathed.  Or even emerge better off.  But what are the risks, and what are the remedies?

Spy vs. spy

We’re seeing the privacy issue play out now in the consumer sector.  Facebook has been scolded by the Federal Trade Commission for its privacy abuses.  Politicians in the European Union search for ways to legislate much more stringent consumer protections.  I won’t cover this ground because the media has done so many times over.

But we have yet to come to grips with the implications of how companies will compete with each other in the Measurement Wars.  The old rules were about the theft of intellectual property.  But what about when a company profiles its competitors’ employees?  Such as their Tweets, Facebook activities or LinkedIn profiles?  Individual persons will be deeply profiled as part of compiling a dossier of competitive intelligence gathering.

This is where it gets creepy.  Heck, if you were imaginative, you might think that things I write about in my blog pertain to issues I experience at work.

And you wouldn’t be all wrong.


I have borrowed from the great work done by others.  In particular, Michael Lewis and McKinsey as cited previously.  Also, O’Reilly Media for their extensive coverage of the underlying technologies of “Big Data” and its use in consumer online businesses.