Unpacking the Stack

Christine Edmonds | April 23, 2020

Five key insights from ICONIQ’s recent study of the Modern Developer Technology Stack

Earlier this year, ICONIQ Growth performed an in-depth study of the developer technology stack to help us better understand emerging trends, the most commonly adopted tools, and the key questions assessed during decision-making processes.

The following is based on an external survey of 200+ IT decision makers at software companies, further qualified by primary interviews, input from ICONIQ’s Technical Advisory Board, and secondary research, including data from Sourcescrub and G2 Crowd.

In our external survey, we built our questioning around our view of the DevOps lifecycle, which includes seven key tool categories. Within each category, we sought to better understand which tools are most commonly used, why those tools were selected, how satisfied teams are and how frequently the tools are used today, and how this might impact future decision making.

The following represents a subset of the insights that emerged from this work; please visit our LinkedIn SlideShare for a copy of the full report.

1. The number of tools used by developers has proliferated…

With barriers to adoption falling, it is perhaps no surprise that companies have started to include an increasing number of tools in their developer stacks. In particular, emerging technology companies seem to have evolved to encourage experimentation with new tools, potentially driven both by the need for agility in high-growth stages and by an increased comfort with bottoms-up adoption in these environments.

2. As a result, security has overwhelmingly become the most important consideration in selecting a tool, even at the expense of both efficiency and scalability.

As the number of available tools related to the code development process continues to explode, the focal point in the design and assembly of technology stacks has evolved. With multiple potential points of failure across architectures, security has become a top priority, even at the expense of short-term efficiency.

3. Concurrently, integration capabilities have become increasingly important for effectively managing the interplay between tools, with project management and code development tools emerging as key ‘anchor points’.

These two tool categories are generally central to stack architecture given their role in bridging various teams and environments. Project management tools — spanning generic options like Airtable and Smartsheet as well as developer-specific choices like Asana and Jira — are a critical conduit between product and business teams. Meanwhile, development tools or IDEs (Integrated Development Environments) are intuitively central to any technology architecture, as they are typically where developers spend the majority of their time. Selection in this category is often driven by programming language, and common choices include Visual Studio and AQTime Pro, as well as free offerings like Notepad++.

While security remains a top priority across tool categories, code deployment and security tools themselves tend to be secondary from a decision-making standpoint: these tools are ubiquitous, but are often selected only once the bulk of the stack is in place.

4. Specific tool selection is often driven by factors idiosyncratic to the tool category, such as scale, coding language, or codebase environment

While tool adoption is generally driven in a bottoms-up fashion, final selection criteria continue to be defined by top-down decision makers. The cost of these tools is sometimes important, but ROI time horizon is often more so, indicating an appreciation for the value that even some of the more expensive tools can deliver.

In some cases, this appreciation for finding the most effective tool and optimizing for ROI comes with scale. For example, within the project management category, smaller companies tend to use more generic options like Airtable or Smartsheet, likely given the amorphous nature of their processes at earlier stages, which can be easier to handle in simple, self-defined tools. For companies at scale, however, a developer-specific tool like Jira is the more common choice, with deep engagement across the team.

Across categories deeply linked to the coding process itself (e.g., development tools), programming language is more likely to be the largest driver of decision making. But in categories that serve as anchor points for the broader stack architecture, such as code management or CI/CD, tried-and-true names like GitHub and multi-purpose tools such as GitLab are especially popular, given their strong ability to integrate across the ecosystem.

In the monitoring category, the ability to unify logs, metrics, and traces across a distributed infrastructure is key; this makes Datadog — a tool with 200+ integrations — the preferred choice.

Finally, there are some categories, such as defense, in which most companies will have at least two distinct tools (e.g., both Checkmarx, geared towards code scanning, and Exabeam, a SIEM solution), both for redundancy and given the distinct capabilities across this ‘best-of-breed’ group.

5. Challenges and organizational decisions related to machine learning resource allocation are top-of-mind

In a world where demand for this skillset has generally outpaced growth in the necessary talent pool, it comes as no surprise that decision makers continue to grapple with how best to organize and allocate machine learning resources.

Most companies currently have in-house ML teams and capabilities, while a smaller subset (~5–10% of respondents) primarily outsource machine learning needs on an ad-hoc basis. As companies scale from ~$10M to ~$250M of revenue, they increasingly deploy small teams of ML engineers dedicated to each function.

We hope these five insights provide a quick glimpse into the types of findings and recommendations generated by this study. Our full report includes a deep dive into each of the discrete tool categories, with detailed views across various dimensions including familiarity, usage, engagement, satisfaction, and churn propensity. Please visit our LinkedIn SlideShare for a copy of the full report.