
New vs. Upgrade Features: How to manage the process


Being a product manager has its perks, such as being able to create products from scratch. There really is no better feeling than seeing something go live for the first time and seeing the excitement from your users. When you hear customers love your product, you get that euphoric feeling of “Yes! We did it!”. However, that is not all of product management. In fact, it’s probably a much smaller subset than most people think. The less glamorous portion of product management is upgrading the old bits within your product.

If you’re a software company, there’s a good chance you’ve accrued tech debt and made decisions to forgo development on portions of your product. These are the “nice to have/RFP checkbox” features that you build to get into deals but neglect to keep up to date. Oftentimes this comes back to bite you in the ass later, but it’s a known tradeoff. A common challenge that many product managers face (myself included) is delivering new features while updating old ones. For example, if you’re creating a new product from scratch on a new front-end architecture, you’re going to want to port that architecture over to the rest of the product to reduce tech debt. However, doing that is a challenge because a large portion of the old platform is built with legacy code. Without rewriting the entire platform at once, how do you make incremental updates while delivering new features to stay competitive?

There’s no silver-bullet answer to this question. But since this is a blog post, I do have a potential answer! The method I’ve used in the past is incrementally changing the product over time, then making the full switch once most of the platform is complete. Obvious answer, right? In practice, it’s much more challenging to actually pull off.

For example: let’s assume we have 10 pages that each have a unique visualization on them. The visualizations use a myriad of different visual libraries to render the data we’re serving up. Additionally, the framework we’re on is a legacy one, and we want to decouple the front end from the back end using APIs. The wrong way to make this upgrade is to try to build it all at once. By committing all of your resources to the switch, you leave no room for new feature development that may be critical to closing deals.

There are 3 key items I look for when tackling an upgrade:

  • What area of the product has the highest customer pain point?
  • What area of the product has the most customer usage?
  • What area of the product has the highest reusability?

We want to look at each of those questions individually because they each hold their own weight when prioritizing what gets rebuilt and when. The item that scores highest across these 3 questions is the one that should be rebuilt first. By creating a ranked list of these items, it becomes much easier to build an upgrade path that allows new development on the new framework to be sprinkled into the roadmap (a simple scoring sketch follows below). Upgrading a platform is a process and should rarely be an abrupt switch. Some may disagree here, but the reasoning is that not all items in a platform hold equal value to the customer – especially in enterprise. This means we want to be extremely targeted about where we spend our time in order to create the biggest positive impact.
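
To make this concrete, here’s a minimal scoring sketch in Python. The candidate items, 1–5 scores, and equal weighting are all hypothetical placeholders, not a prescribed formula – swap in whatever items and weights fit your product:

```python
# Hypothetical scoring sketch: rank upgrade candidates by customer pain,
# usage, and reusability (1-5 scale, higher = stronger candidate).
candidates = {
    "reporting dashboard": {"pain": 5, "usage": 4, "reusability": 3},
    "admin settings":      {"pain": 2, "usage": 2, "reusability": 4},
    "data import wizard":  {"pain": 4, "usage": 5, "reusability": 2},
}

def score(item: dict) -> int:
    # Equal weights here; tune these if, say, customer pain
    # matters more to your roadmap than reusability.
    return item["pain"] + item["usage"] + item["reusability"]

ranked = sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, item in ranked:
    print(f"{name}: {score(item)}")
# The top-ranked item becomes the first thin, vertical slice to rebuild.
```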

In my experience, the best way to prove out the upgrade path is to scope the development down to a thin, vertical slice of the workflow. Meaning, when you come out of your prototype or MVP, your customers should be able to use the product end to end for the specific item you upgraded. The key is to build out this thin, vertical solution as fast as possible in order to get it in front of customers sooner. This shows customers that you are actively upgrading the areas of the product that need help, and it also validates your direction on the upgrades. From there, you can start to move either horizontally or vertically in upgrading. Horizontal means taking a component you built and reusing it in different areas of the platform to keep consistency. Vertical means taking the same type of thin workflow solution and replicating it across different parts of your platform.

Since the work is broken down into short bursts and compacted into bite-sized developments, you can continue delivering new features that adhere to the new design and engineering efforts. This means that every feature you introduce should use the new upgraded code or design. By doing this, you’re doubling down on the speed of delivering the new experience and getting your customers to a happier spot.

In my opinion, there’s never a “right” way to do platform-wide updates, primarily because it will always be painful (whether for you or your customers). There will be hurdles, there will be confused customers, and there will be short-term needs to support a larger code base. However, the method above reduces the risk and pain significantly by compartmentalizing the breadth of impact into smaller chunks while simultaneously providing more rapid feedback on your upgrades.

The Commoditization of Data: Where real value is moving to


I’ve been doing a lot of consulting lately for large groups around the personalization and analytics space. There seems to be a common trend amongst many of the questions I’m getting asked: How do all of these providers differentiate? Where is the value for enterprises looking to deploy the new software coming into this space?

In all fairness, it’s not an easy question to answer. There are a lot of moving parts, and competitive pressures are forcing enterprises (and software vendors) to think differently about how true value is delivered to the end user. On one hand, you have a highly competitive marketing landscape at the very top, where everyone is competing for the same business. These are analytics providers that typically collect torrents of data about your users, and oftentimes they are channel focused. On the other hand, you have IaaS providers, primarily Amazon AWS, who are completely commoditizing this space. Amazon makes it easy for in-house teams to create and deploy custom analytics applications, and it is starting to offer Business Intelligence tools and Machine Learning capabilities that are simple to deploy. This is putting significant pressure on software vendors to differentiate themselves.

With the marketing software market projected to grow on the order of 17% by 2019, to over $56B in total market cap, the need to provide real value will arrive much more rapidly than many vendors expect. Data collection, management, and querying are basically table stakes for enterprises evaluating these vendors. The real shift is towards actionable insight, predictive analytics, and utilizing machine learning to understand users at an implicit behavioral level.

To get a better perspective on where value is shifting, let’s break down the following image.

[Image: the value stack – Data Collection & Management at the bottom, up through Analytics and Segmentation & User Targeting, to Insights & Autonomous Reactions at the top]
At the very bottom, we have “Data Collection & Management”. I’m using this as a catch-all term for anything around database systems: data collection, management, transformation, and so on. While there are many criteria for having a solid data practice, we can safely say that this space is very commoditized at this point. Many analytics vendors are collecting the same or very similar data, and enterprises no longer want to wait until after the fact to decide what to do next (especially with digital marketing strategies). With Amazon making this layer of the stack so easy, it is no longer a competitive advantage or a value proposition for these software providers. Up the stack we go.

Pure analytics providers are getting eaten away by Amazon or startup disruptors. Vendors like Mixpanel or Swrve are pushing heavily against incumbents like Omniture to price more competitively and offer better differentiated value. Since Google Analytics is a robust offering for free, there’s pressure to provide a differentiated offering. Additionally, current software technology makes it easy to build graph visualizations (think d3.js), transform data, or pull data in from different sources, commoditizing this part of the stack along with the one below it. This isn’t to say that you can get by without the analytics or data parts of the stack – you absolutely have to have both in order to move into the areas where real value is built. Up the chain we go.

Segmentation and user targeting is getting closer to the point of commoditization but isn’t there yet. We’re still seeing new forms of targeting drive value and decision making when enterprises evaluate vendors. Oftentimes, these new vendors give enterprises the true ability to track and target customers across channels rather than within a single channel. This has been a big push by many of the leading research firms, such as Forrester and Gartner, who have termed it the “Unified Customer View”. In my opinion, this is most of the market today, and many of the vendors in this space are, in some form or fashion, able to accomplish it. There are additional features that strengthen the value proposition, such as CMS integrations, triggering events, and automated segmentation (a toy sketch of the latter follows below). However, the challenge is that the brunt of the work still falls on the marketer. The reality is that marketers are having to do more with less, and time is not on their side. This portion of the tech stack is where commoditization is currently focused, as machine learning moves into understanding and reacting to user behavior.
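
As an illustration of what “automated segmentation” can mean in practice, here’s a minimal sketch that clusters users on a few engagement features with scikit-learn. The features, data, and cluster count are assumptions for the example, not any particular vendor’s approach:

```python
# Minimal automated-segmentation sketch using k-means clustering.
# Feature names and data are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows: users. Columns: sessions/week, avg session minutes, purchases/month.
users = np.array([
    [12, 9.5, 2],
    [1,  2.0, 0],
    [8,  6.0, 1],
    [0,  0.5, 0],
    [15, 12.0, 3],
])

# Standardize features so no single column dominates the distance metric.
features = StandardScaler().fit_transform(users)
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(segments)  # e.g. two segments: high-engagement vs. low-engagement users
```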

This brings us up to the current day and the future, which is the meat of where I believe the value is moving.


The future is always unclear, but here is my prediction: software that automatically surfaces general insights, unique insights, predictive insights, and autonomous reactions is going to be king. Let’s break down each one of these items.

General Insights

In my eyes, general insights are the insights into an audience that users of a software vendor would check on a daily basis. This might be something like audience health (the DAU:MAU ratio or audience return frequency), which helps keep a pulse on the audience. In the future, I anticipate vendors will automatically surface a suite of insights, tailored by vertical, that users get as a “state of the union” dashboard, with the obvious ability to add their own automated metrics via a nice query builder of sorts.
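
For instance, the DAU:MAU “stickiness” ratio is simple to compute once you have a log of daily active-user events. A minimal sketch, with made-up event data:

```python
# DAU:MAU "stickiness" sketch: what fraction of monthly actives
# showed up on a given day? Event data below is hypothetical.
from datetime import date, timedelta

# (user_id, activity_date) event log
events = [
    ("u1", date(2017, 3, 1)), ("u2", date(2017, 3, 1)),
    ("u1", date(2017, 3, 15)), ("u3", date(2017, 3, 20)),
    ("u1", date(2017, 3, 31)), ("u2", date(2017, 3, 31)),
]

def stickiness(events, day):
    dau = {u for u, d in events if d == day}  # active on this day
    month_start = day - timedelta(days=29)    # trailing 30-day window
    mau = {u for u, d in events if month_start <= d <= day}
    return len(dau) / len(mau) if mau else 0.0

print(stickiness(events, date(2017, 3, 31)))  # 2 DAU / 3 MAU ≈ 0.67
```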

Unique Insights

Users don’t always know where to look for insights, nor do they always know the right questions to ask. Since marketers can’t constantly look at all of the data points they’re collecting, let alone the correlations between them, future software vendors will have to move here to meet demand. These insights would come from the machine crunching sophisticated machine learning models each night to surface what the model believes the user should know about. You can think of this as “show me what I don’t know”. An example might be something like: “Your broadcast marketing campaign to all users had 20% higher engagement with French male users in the Active segment of your audience.” The software surfaces the things the marketer doesn’t even know to think about.
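
Under the hood, the simplest version of “show me what I don’t know” is a brute-force scan for segments whose engagement deviates meaningfully from the campaign-wide baseline. A toy sketch with hypothetical campaign data:

```python
# Toy "unique insights" sketch: scan segments for engagement lift
# versus the campaign-wide baseline. Data is hypothetical.
campaign = {
    "all users":            0.10,  # engagement rate
    "FR / male / active":   0.12,
    "US / female / lapsed": 0.06,
    "DE / male / new":      0.10,
}

baseline = campaign["all users"]
threshold = 0.15  # surface only segments deviating from baseline by 15%+

for segment, rate in campaign.items():
    if segment == "all users":
        continue
    lift = (rate - baseline) / baseline
    if abs(lift) >= threshold:
        print(f"{segment}: {lift:+.0%} vs. baseline")
# FR / male / active: +20% vs. baseline
# US / female / lapsed: -40% vs. baseline
```

A real system would replace the flat threshold with statistical significance tests and run over thousands of segment combinations, but the shape of the problem is the same.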

Predictive Insights

As the name suggests, these are insights based on predictive modeling. Since these vendors already collect troves of data on user activity and behavior, there’s a world where machine learning models can predict the performance of campaigns, when to send them, when a user may be churning, and so on. These insights help the marketer be proactive when engaging with users or handling changes within their digital ecosystem. Lots of vendors play in this space right now as point solutions, but very few (if any) have a buttoned-up product that is making a significant difference yet. The reason is that it’s hard to build a generalized machine learning model that can find covariance between the different data points collected in a way that works across verticals. This isn’t to say it’s impossible, but it will take more time to get to a really good spot. This is where the marketer gets exceptionally high value, because it moves them from the reactive analytics world to the proactive “autonomous” world.
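
As a sketch of the mechanics (not a production model), here’s what a churn-scoring step might look like with a generic classifier. The features and labels are toy assumptions; a real model would need far richer behavioral data and proper validation:

```python
# Minimal churn-prediction sketch with logistic regression.
# Features and labels are hypothetical toy data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: sessions in last 30 days, days since last session, support tickets.
X = np.array([
    [20, 1, 0], [15, 2, 1], [2, 25, 3],
    [1, 40, 2], [18, 3, 0], [0, 60, 4],
])
y = np.array([0, 0, 1, 1, 0, 1])  # 1 = churned

model = LogisticRegression().fit(X, y)

# Score a currently active user: probability they churn.
new_user = np.array([[3, 21, 2]])
print(model.predict_proba(new_user)[0, 1])
# A high probability flags this user for proactive re-engagement.
```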

Autonomous Reactions

Building on the previous item, autonomous reactions are predominantly about automating many of the mundane tasks the marketer has to do today – for example, setting up an automated trigger to email a user when they perform a specific event on a channel. In the future, high-value world, software vendors will build machine learning models along the lines of developments in artificial intelligence. The system will know when to reach out to users, and with what type of messaging, based on variable inputs at each point of the user lifecycle. The user can pinpoint where positive or negative behaviors may be and assign value weightings to those data points, but it is the machine learning model that optimizes the user journey. This is along the lines of factorial experimentation with machine learning models that build on notions such as Markov Chain Monte Carlo (MCMC) simulations. It frees the marketer to focus on things that are much harder to automate, such as acquisition and churn-reduction/retention strategies.
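
To give a flavor of the mechanics, here’s a minimal Thompson-sampling sketch – a simpler Monte Carlo cousin of the MCMC-style machinery described above – that autonomously learns which message variant to send. The variants and conversion rates are simulated for illustration:

```python
# Thompson-sampling sketch: autonomously pick which message variant
# to send, learning from observed conversions. Data is simulated.
import random

variants = ["discount email", "feature tip", "win-back push"]
true_rates = {"discount email": 0.12, "feature tip": 0.05, "win-back push": 0.08}
wins = {v: 1 for v in variants}    # Beta prior (alpha)
losses = {v: 1 for v in variants}  # Beta prior (beta)

random.seed(0)
for _ in range(5000):
    # Sample each variant's conversion rate from its Beta posterior
    # and send the variant with the highest draw.
    choice = max(variants, key=lambda v: random.betavariate(wins[v], losses[v]))
    if random.random() < true_rates[choice]:  # simulate the user's response
        wins[choice] += 1
    else:
        losses[choice] += 1

for v in variants:
    print(v, wins[v] / (wins[v] + losses[v]))  # converges toward true_rates
```

Over many sends, the model shifts traffic toward the best-performing variant on its own, which is exactly the kind of mundane optimization the marketer no longer has to babysit.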


With the above points in play, all of this hinges on the ability of these sophisticated machine learning models to understand explicit and implicit user behavior. Today, we capture a lot of explicit behavior, since it’s easily tracked (session count, time on page, etc.). The real value is in understanding implicit user behavior. Implicit behavior is extremely valuable because you can tie it to personas, which help dictate how you market to those personas. For example, say I’m an unknown user on a travel site looking for a vacation that has sandy beaches and is sunny and warm. I start looking at different beach destinations in tropical regions, search for different locations, input different fields about what temperature range I want, and so on. I interact with the site on a more intimate basis. In the background, a model is crunching an analysis on who I am, which can inform the broader system – specifically the CMS – about what types of content I may like.

The reason this is so fundamentally important is that we can understand who our users are without explicitly knowing who they are on an authenticated basis. Additionally, we can serve up very specific content based on a granular understanding of what the user is interested in. This is exceptionally powerful because you’re able to achieve a deeply personalized connection with your user. A great example of this is Spotify’s “Discover Weekly” engine. People have described this weekly curated list of songs, based on who you are, as “creepy awesome”. What is effectively happening is that Spotify is crunching hundreds of intimate data points on your listening habits and building a playlist each week customized for you, and you only. Some examples of these data points (with a toy scoring sketch after the list):

  • Genres of songs frequently played
  • Sub-Genres of songs frequently played
  • Did user skip within first 30 seconds?
  • What type of track is it? (high energy, relaxing, etc.)
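
To make the idea concrete, here’s a toy sketch of folding implicit listening signals like those above into a per-track affinity score. The signals, weights, and caps are invented for illustration – this is not Spotify’s actual model:

```python
# Toy implicit-affinity sketch: weight listening signals into a
# per-track score, Discover-Weekly style. Weights are invented.
def track_affinity(play_count, completion_rate, skipped_early, genre_match):
    score = 0.0
    score += 0.4 * min(play_count / 10, 1.0)  # repeated plays, capped at 10
    score += 0.3 * completion_rate            # fraction of track listened to
    score += 0.3 * genre_match                # overlap with user's top genres
    if skipped_early:                         # skipped within first 30 seconds
        score -= 0.5
    return max(score, 0.0)

# A track played often, finished, and in a favorite genre scores high;
# an early skip tanks the score even when the genre matches.
print(track_affinity(8, 0.9, False, 1.0))  # 0.89
print(track_affinity(1, 0.1, True, 1.0))   # 0.0
```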

It’s a beautiful example of personalization that is damn near perfect. Much like the Discover Weekly engine, software vendors in the marketing space are going to need to get to the level where they can curate and deliver content at an extremely granular and personal level. As the stack moves upwards in value due to commoditization, the next battle will be fought in the world of insights, personalization, and proactive engagement. It’s an exciting time to be watching and building products in this line of work. I welcome the day when brands know me well enough to know when to engage with me, how to engage, in what form, at what frequency, in what context, with what content, and so much more.


Agree? Disagree? Let us hear your thoughts on the next generation of marketing software!