The Future of Software

A VC View on Trends and Markets that Will be Big in 2016

If there is a single idea behind our predictions for 2016 it is the role reversal between humans and machines. Until now machines have augmented humans in their jobs. Now we step aside as the machines take over and ask for our help when they need it. We are still unquestionably smarter, but the competence of the machines is such that they do the job and we help when they get stuck. The machines are driving themselves, automating our data centers, mining for insight, securing our commerce, and finally, inviting us into their virtual world.

Below are six themes that we, at Scale, are thinking about every day. In each case we believe the change has already begun and will last for a decade or more. The majority of our investments in 2016 will be to entrepreneurs who understand and are capitalizing on the disruption from these trends.

Business Apps Will Get Smarter with Machine Learning via @staceycurry

When people think of AI, it conjures up images of science fiction novels where robots come alive and act like humans. However, for enterprise investing, the opportunity in AI is machine learning: machines that can process far more data than any human could and glean insights from it to make business decisions.

In 2016, machine learning will be applied horizontally across enterprise applications. Machine learning isn’t a new concept: finance, government and pharmaceuticals have long seen the value of using big data to make smarter business decisions. We now see it gaining traction in marketing, sales and HR, and believe most new business applications will utilize machine learning to make more intelligent decisions.

Why now? Three main reasons:

  1. Data: Vast amounts of data are being created today, and data is a requirement for effectively training machine learning models. Every website click, social media mention and LinkedIn profile change can now be tracked and analyzed.
  2. Processing: We now have the processing power to handle such large amounts of data in real time.
  3. Access: Applications can now talk to one another through APIs, so customer activity data can be easily shared across applications.
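To make the idea concrete, here is a minimal sketch of the kind of model a business application might train on activity signals like those above. Everything here is invented for illustration: the data is synthetic, the three "activity" features (clicks, mentions, profile changes) are hypothetical, and a real product would use a proper ML library rather than hand-rolled gradient descent.

```python
import math
import random

# Hypothetical lead-scoring example: each lead has three synthetic activity
# signals (website clicks, social mentions, profile changes) and a label
# indicating whether the lead converted.
random.seed(0)

def make_lead(converted):
    # Converted leads tend to show more activity in this synthetic data.
    base = 5.0 if converted else 1.0
    return [random.gauss(base, 1.0) for _ in range(3)], (1 if converted else 0)

data = [make_lead(i % 2 == 0) for i in range(200)]

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Logistic regression trained with plain stochastic gradient descent.
weights = [0.0, 0.0, 0.0]
bias = 0.0
lr = 0.1
for _ in range(300):
    for x, y in data:
        p = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
        err = p - y
        weights = [w - lr * err * xi for w, xi in zip(weights, x)]
        bias -= lr * err

def score(x):
    """Probability that a lead with activity vector x converts."""
    return sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)

hot_lead = [6.0, 5.5, 6.2]   # high activity across all three signals
cold_lead = [0.5, 1.2, 0.8]  # low activity
print(score(hot_lead) > score(cold_lead))  # high-activity leads score higher
```

The point is not the algorithm but the pattern: once click, mention and profile-change data flow into an application, even a simple trained model can rank leads automatically instead of relying on a human to eyeball the data.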

Software Moves to Autopilot, the Automation of DevOps via @avitus

The increasing complexity of software will ultimately require expansive automation. Think of airplanes as an example: early planes were simpler in design and were flown entirely by the pilot. As systems became more complex and flights grew longer, the autopilot emerged, allowing pilots to manage more tasks and avoid fatigue. Today, computers can “fly aircraft in virtually any situation”.

Software has likewise evolved from linear designs to much more complex systems, driven by: moving from one server to multi-server to multi-datacenter; expanding from a single programming language to multiple programming languages; moving from a simple three-tier stack to a distributed microservices architecture; and deploying not just to servers but to servers, mobile and IoT.

These four shifts, coupled with the need to release code multiple times per week or even per day, have created an intense need for automation. While we have seen some signs of automation to date, we are still at the early stages, and companies that support this transition will be a big focus in 2016.
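The autopilot analogy can be sketched in code: software remediates routine failures on its own and only asks for human help when it gets stuck. This is a toy illustration, not any vendor's product; the service names, health checks and restart thresholds are all invented.

```python
# Hypothetical "autopilot" sketch: a remediation loop that watches service
# health, restarts failing services automatically, and escalates to a human
# only when automated fixes do not work. All names and thresholds are invented.

MAX_RESTARTS = 2

def check_health(service):
    # Stand-in for a real probe (HTTP health endpoint, metrics query, ...).
    return service["healthy"]

def restart(service):
    service["restarts"] += 1
    # Simulate a restart fixing transient failures but not persistent faults.
    if not service["persistent_fault"]:
        service["healthy"] = True

def autopilot(services):
    """Remediate automatically; return the services needing human attention."""
    escalations = []
    for svc in services:
        while not check_health(svc) and svc["restarts"] < MAX_RESTARTS:
            restart(svc)
        if not check_health(svc):
            escalations.append(svc["name"])  # the machine asks for our help
    return escalations

fleet = [
    {"name": "api", "healthy": True, "restarts": 0, "persistent_fault": False},
    {"name": "worker", "healthy": False, "restarts": 0, "persistent_fault": False},
    {"name": "db", "healthy": False, "restarts": 0, "persistent_fault": True},
]
print(autopilot(fleet))  # → ['db']: the transient failure was self-healed
```

The transiently failing worker is restarted and recovers without anyone being paged; only the persistently broken database reaches a human. That inversion, machine first, human as fallback, is the role reversal this section describes.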

APIs – Not Just for Developers Anymore via @cackf

APIs (Application Programming Interfaces) have been around for years but are mostly known amongst developers in the context of data-based communications, both within an organization and externally. Amazon, and specifically Jeff Bezos, famously declared in the early 2000s that all internal data and functionality was to be exposed via API only, and that all teams had to design with an eventual external audience in mind. Fast forward: Amazon Web Services was released to provide storage and compute to third parties.

APIs are at the core of most Web 2.0 and tech-forward businesses today, enabling transparent, always-on, real-time, multi-device communication. APIs are even moving offline with the rise of the Internet of Things. As APIs become both more prominent and more prolific, the landscape will evolve in 2016. Business-side attention will emerge as stakeholders realize that APIs can enable broader adoption, data-driven partnerships, additional revenue streams, or enrichments to product. And as API usage increases, both intra-company and with third parties, we will see new challenges of complexity and security: how do you efficiently monitor these APIs as you constantly release and update? How do you manage all of the interactions? How do you secure the API channels to prevent hackers from exploiting the new attack surface?
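The monitoring question above can be sketched simply: wrap every API handler so each call is counted and timed per endpoint. This is a minimal in-memory illustration; the endpoint path and handler are hypothetical, and a real system would export these metrics to a monitoring backend rather than keep them in a dictionary.

```python
import time
from collections import defaultdict

# Per-endpoint metrics: call count, error count, cumulative latency.
metrics = defaultdict(lambda: {"calls": 0, "errors": 0, "total_ms": 0.0})

def monitored(endpoint):
    """Decorator that records calls, errors and latency for one endpoint."""
    def wrap(handler):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return handler(*args, **kwargs)
            except Exception:
                metrics[endpoint]["errors"] += 1
                raise
            finally:
                metrics[endpoint]["calls"] += 1
                metrics[endpoint]["total_ms"] += (time.perf_counter() - start) * 1000
        return inner
    return wrap

@monitored("/v1/orders")  # hypothetical endpoint
def get_orders(customer_id):
    if customer_id < 0:
        raise ValueError("bad customer id")
    return [{"order": 1, "customer": customer_id}]

get_orders(42)            # successful call
try:
    get_orders(-1)        # failing call, still counted
except ValueError:
    pass
print(metrics["/v1/orders"]["calls"], metrics["/v1/orders"]["errors"])  # prints "2 1"
```

Even this toy version shows why the problem gets hard at scale: every release can change handler behavior, so the instrumentation, not the handler author, has to be the source of truth for what each API is actually doing.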

Shift to Chip & Sign Will Accelerate Technological Adoption via @aniehenke

The new EMV standards quietly rolled out in October, shifting fraud risk from banks to merchants on all transactions not processed with chip and sign. Large retailers have mostly made the shift, but adoption in the mid- and long-tail has been surprisingly slow in the US. Merchants will likely see their first fraud "hits" in 2016 and will scramble to upgrade their technology. This will benefit the whole commerce industry as merchants not only add chip-and-sign technology but also re-evaluate their technology stacks and accelerate upgrade cycles. At the same time, card-present fraud will become increasingly difficult, pushing fraud toward card-not-present (read: online) channels and driving demand for fraud detection solutions from vendors in those categories.

An uptick in post-breach technology investment via @atseitlin

The evolution of security tools hasn’t kept pace with the rate at which we’ve introduced major shifts to enterprise infrastructure, mainly driven by the adoption of cloud and mobile. This dynamic has strained our security posture and allowed cyber-criminals to stay one step ahead of organizations. This is partially why we’re seeing more public data breaches.

In 2016, I expect to see more investment in tools and best practices that minimize exposure in the event of a data breach. Post-breach detection has already attracted significant investor funding and high valuations. We’re already seeing market consolidation here: Microsoft bought Adallom, Splunk acquired Caspida, and Blue Coat Systems acquired Elastica. The market is still ripe for innovation, though.

Specifically, tracking and remediating a breach has been under-invested in. We need better tools that provide guidance on what to do immediately after a breach has taken place: everything from the physical response, to internal process, to how and when to notify customers depending on regulatory requirements.

On the flip side of the coin, software that quantifies risk at the executive or board level is interesting. Companies universally want to show they’re spending the right resources on cybersecurity initiatives and that the spending is having a positive impact over time. Companies that can explain how they’re reducing security risk are much better positioned to respond to the public and to the customers they’re accountable to in the event of a breach. Measuring and tracking are very hard without the human component of consulting and understanding the business, so this area is particularly difficult to automate with software.

Virtual reality via @avitus

Finally, in 2016, the various components enabling virtual reality will reach maturity. Positional tracking (the ability to move around in a virtual scene) and mixed media (combining video with CG content) will be big advances that we're watching closely as they unfold. We are particularly interested in the business side of virtual reality: new computing interfaces and new ways of doing business. It is hard to envision the final disappearance of the desktop PC, but that could well be the endgame.