There is something to be said for a benign dictatorship. For all its faults, the world of the Microsoft PC desktop was an orderly one, and for IT administrators, an easy one to manage. The occasional “blue screen of death” was a reasonable price to pay for a compute environment where services like identity, security and monitoring could be provided efficiently across all applications using Active Directory, anti-virus and myriad PC management tools. By contrast, SaaS is like a democracy, lots of potential but messy, and to a control freak (and all great IT managers are a little bit controlling!) a little bit worrying.
End users are not going back to the PC desktop, so now IT has to re-create the desktop experience and the desktop management experience, but in the far more heterogeneous and disparate world of the cloud. Building the desktop in the cloud is a megatrend that will impact hundreds of tech companies. Google, Microsoft, Apple and Mozilla are duking it out for browser market share, because the browser is the new OS. At the back end, there are hundreds of cloud-based SaaS applications. In the middle is a great big mess. In the absence of a dominant vendor that sits between all clients and all apps (the role Microsoft used to fill in the LAN), there are a host of new security, management, monitoring and identity products, all selling to IT to help control the world of cloud without destroying productivity.
In the world of PCs, identity was important, but in the cloud it is vital. With PCs, LANs, and the always-hated VPN, IT could have a high degree of confidence in what applications were running and what devices they were running on. Today that is gone. Applications from SaaS providers are now purchased on credit cards and run outside of IT. End devices can be a tablet, a BYOD phone that the employee owns, or a PC. The IT “bag of tricks” for locking down devices and blocking access to applications is now largely irrelevant. What IT can still manage is identity, which is simply the list of current employees (plus vendors, customers and consultants), and what applications and what information those employees can access. If that list can be kept up to date via close integration with HR systems, and if that list can then be promulgated across all third-party applications in real time, then IT can use identity as the leverage point to seamlessly regain control. That is the business OneLogin is in.
In the new world, IT cannot make things worse for users just to make things better for IT. Users will go around them. In response, identity services and access control, which are what IT wants, have been cleverly packaged by vendors to appear to the user as Single Sign-On (SSO), which is what users want. The idea is simple: if you are a typical knowledge worker today, you use an average of 13 different SaaS applications that you sign into regularly. Thirteen applications means thirteen passwords and thirteen opportunities to forget a password. With Single Sign-On, you log in once, and you are automatically logged into all your applications.
For IT, this is a chance to roll out a user win that is also a huge long-term win for IT. Once users access all their applications through SSO, IT can use SSO as a central clearinghouse to enforce stronger authentication (not just a password but two-factor authentication), to do real-time provisioning of new employees (once an employee joins a company they get instant access to all relevant applications), and to do real-time de-provisioning (once an employee leaves or is asked to leave, they instantly lose access to all applications). This is not rocket science, but if the average employee has 10 SaaS applications and staff turnover is 15% per year, then a 1,000-person company has to provision and de-provision 1,500 different accounts annually.
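The arithmetic behind that claim is easy to check with a quick back-of-the-envelope sketch (the numbers mirror the example above and are purely illustrative):

```python
# Annual provisioning workload for the hypothetical 1,000-person company above.
EMPLOYEES = 1_000
APPS_PER_EMPLOYEE = 10
ANNUAL_TURNOVER = 0.15  # 15% of staff replaced each year

departures = int(EMPLOYEES * ANNUAL_TURNOVER)    # 150 leavers per year
deprovisioned = departures * APPS_PER_EMPLOYEE   # 1,500 accounts to shut off
provisioned = departures * APPS_PER_EMPLOYEE     # 1,500 accounts for replacements

print(deprovisioned, provisioned)  # 1500 1500
```

Centralizing this behind an identity provider turns thousands of manual account changes per year into a single HR-driven update propagated to every application.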
This is a competitive market with a few direct competitors and many incumbents and adjacent players that are starting to tell a “we manage cloud identity” story to stay relevant. In this market we have been consistently impressed with the execution of the OneLogin team since we started tracking the company two years ago. The founders, Thomas (CEO) and his brother Christian Pederson (CTO), have the classic immigrant entrepreneur story, literally building the first version of the product while living on ramen noodles in an apartment in Los Angeles. The company has since moved to Silicon Valley, and Thomas has built a strong go-to-market team around him. What we particularly like is the focus on being partner-friendly, including helping other SaaS vendors implement standards like SAML and now NAPPS (for mobile) by providing free toolkits for integrating these emerging standards.
The biggest challenge the company faces now is keeping up with the demands of an exploding marketplace. The company is hiring across sales, customer service and engineering. We are excited to be part of the team and look forward to working with Thomas, the team and the board to build the winner in the cloud-based identity market.
It’s no surprise that consumer and business buyers are more likely to buy something on the recommendation of a friend or peer – “earned advertising,” as Nielsen calls it. I noted this rise of referral marketing in a post called The New Marketing Funnel, which examined how buyers’ behaviors have changed in the wake of the digital revolution. Some of today’s most successful startups have made referral marketing the backbone of their demand generation activities. The reach that companies can achieve with referrals is wide: 39% of marketers use referral marketing regularly, and 43% of those acquire more than 35% of their new customers through referrals. That is why I am proud to announce our investment in Extole.
Building a sophisticated referral and advocacy program in-house can be costly and complicated. The Extole platform makes it possible for marketers to go beyond paid search and SEO – to amplify the passion of their own members and loyal base to acquire new, passionate, paying customers. Existing customers are a company’s biggest champions and people rely on their peers for suggestions on products and services. With the Extole platform advocates receive a unique referral link to share with friends. When a friend purchases through the link, both the friend and advocate get rewarded.
The appeal of Extole also comes from the fact that the marketing department, not IT, manages the platform. Extole provides the software to help companies test incentive programs, manage multiple campaigns, and easily access analytics. The platform helps to mitigate user fraud. Extole customers can even save companies money on customer service by deflecting calls for referral rewards.
We’ve witnessed a growth in essential platforms for acquisition and demand gen over email (ExactTarget, Constant Contact), search (Kenshoo, Marin Software), and marketing automation (Marketo, HubSpot). Until now, however, there’s been no enterprise platform for referrals. Extole is delivering the key “third channel” for online marketers.
A quick review of Extole’s customer list shows that many companies had a latent need for this platform. The company has already signed deals with companies including Boden, DocuSign, Fleetmatics, Intuit, and Kraft.
Extole is still a startup, but its management team isn’t new to this game, or to each other. Many of the senior executives worked together at a previous ScaleVP-backed company, Omniture/Offermatica, and I am thrilled to have them join the ScaleVP portfolio again. Welcome, Extole!
Customer retention is critical to any company’s business model. Every company spends good money attracting and acquiring customers, and that money simply goes to waste if you don’t retain them over time. At minimum, every company wants to recover its initial sales and marketing expense; customers, however, become more profitable over their lifetime, so the longer you retain a customer, the better the ROI and profitability of your business.
Many companies will assign customer retention responsibility to one functional leader. After all, product should create a product, business development should do deals, marketing should drive awareness, sales should sell, and G&A functions should keep a company humming. What is left over, i.e., managing the customer experience and the lifetime value of those customers, is often thrown to a support function, “Customer Success”, or to marketing (when there is significant opportunity to cross-sell additional products, such as in e-commerce or financial services). But having any one function own customer retention or lifetime value is a mistake.
A customer’s reaction to your product or service is valuable information that should guide every department. Without a customer-back mindset and incentives, functional leaders run the risk of optimizing near-term goals without doing what is in the best interest of the company’s long-term financial health…retaining customers.
CEOs I have spoken with are generally intrigued by and supportive of this “it takes a village” philosophy to maximizing customer lifetime value. After all, who doesn’t want a more successful and profitable business? Here are a few tactical approaches to driving this in your organization:
I would love to hear your thoughts, ideas and lessons-learned on this type of an approach to driving customer lifetime value.
Monica Adractas is VP, Customer Success at Box where she leads cross-company customer retention and churn initiatives. Previously, Monica was a Principal at McKinsey & Company where she served clients on growth, customer and digital strategy. You can find more from her on LinkedIn.
Amazon’s AWS re:Invent conference has become the must-attend conference of the year for anyone even tangentially involved with the cloud. This is now my third year attending re:Invent and each year it continues to improve and grow, with attendance growing roughly 50% every year. The sessions from Amazon and customers were incredibly insightful and I plan to spend many hours catching up on videos of sessions I couldn’t attend. For those not able to make it, I have shared my main takeaways below.
Offense over defense. In the past, Andy Jassy’s keynote was filled with references to other cloud providers and reactive moves like matching pricing cuts. With the exception of the EC2 Container Service coming on the heels of Google’s Container Engine, none of that happened this time around. Even though Google announced price cuts, nothing of the sort came from Amazon. In my conversations with AWS insiders, the mood is apparently shifting toward a focus on innovation rather than matching. I anticipate that price drops will continue, but it is clear those will be on Amazon’s schedule.
It’s all about the enterprise. Amazon started by appealing to developers. They are now focusing squarely on the enterprise. While past keynotes featured early adopters like Netflix and NASA, this year’s keynote moved into the early majority, highlighting companies like Philips, Johnson & Johnson, and Intuit. Nike was one interesting company to highlight. Aside from throwing an amazing after-party, they spoke about their journey to a cloud-native microservices architecture. They went from taking months to deploy a new feature in 2011 to a few hours in 2014. A shoe company not just moving into the cloud but fully embracing it shows us what’s ahead as the rest of the non-tech world realizes the full benefits of operating in the cloud.
AWS innovation engine shows no signs of slowing down. Amazon announced a truly impressive number of new services at the conference, and I was impressed by both their quantity and their quality.
Of all the announcements, Lambda was the one I found most interesting, and I think that many in the audience shared my sentiment. Lambda creates an entirely new programming model for event-driven, asynchronous programs that leverages what AWS does best: abstracting away the undifferentiated heavy lifting of building infrastructure and freeing developers to focus on the more differentiated application logic. I also found it the most unexpected of the menu of new services. The other product launches were natural incremental additions to existing services or use cases. Take, for example, the EC2 Container Service: it was expected in reaction to Google’s announcement a week prior. Lambda was creative, unique and unexpected, and it is that kind of innovation that will help Amazon maintain the wide lead it has established in the race to cloud dominance.
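To make the model concrete, here is a minimal Lambda-style handler in Python; the event shape follows the S3 notification format, and the handler itself is an illustrative sketch rather than an official AWS sample:

```python
def handler(event, context):
    # AWS invokes this function whenever the subscribed event fires
    # (here, an S3 object upload); the developer supplies only this
    # application logic and never provisions or manages a server.
    records = event.get("Records", [])
    keys = [r["s3"]["object"]["key"] for r in records]
    return {"processed": len(keys), "keys": keys}

# Simulated invocation with a single S3 upload event:
result = handler({"Records": [{"s3": {"object": {"key": "photo.jpg"}}}]}, None)
print(result)  # {'processed': 1, 'keys': ['photo.jpg']}
```

The appeal is exactly the abstraction described above: scaling, scheduling, and the machines underneath are all Amazon’s problem.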
What I find most rewarding about attending conferences such as re:Invent is the opportunity to meet with other people interested in similar spaces and technologies. It was wonderful to reconnect with everyone from AWS with whom I had previously worked at Netflix and to catch up with the rest of the clouderati in attendance. The Expo, hallway, and drink conversations alone are worth the price of admission.
Already looking forward to seeing you next year at re:Invent.
Today, I am proud to announce our investment in Agari, a provider of real-time, data-driven security solutions that detect and prevent advanced email cyberthreats. Why Agari? Consider this: sensitive information of over 110 million people in the U.S. was stolen in the Target breach, and it all started from a phishing email. The lack of email authentication has remained a source of massive vulnerability for users and companies in the many decades since email’s inception.
The foundation of the Internet that we all know and love today was formed in the 1960s and 1970s out of military research that built ARPANET, the first packet-switched network and the proving ground for what became the TCP/IP protocol. One of the first “killer apps” built on this new infrastructure was a way for the then small group of researchers to exchange digital messages with one another. At the heart of this electronic mail subsystem was SMTP, a protocol as fundamental for email as TCP/IP was for the Internet. But because email was then intended for a small trusted group, authentication was never built into the protocol.
Fast-forward to modern day and we find that the number of online users numbers in the billions, and more than half the email that is sent is spam. Phishing has become a major attack vector for the bad guys to gain an initial foothold past well-intentioned security defenses. Our recent security survey of over 100 CISOs revealed that one third of those surveyed list data breaches and malware outbreaks as a top concern for their business, both issues that involve email as the attack vector.
The most impactful breach of the past twelve months has been the highly publicized Target data breach, in which 40 million credit card numbers and the identities of 70 million Target shoppers, including names, addresses, email addresses and phone numbers, were stolen. How did the bad guys get in? A successful phishing email attack on Target’s HVAC vendor installed malware on the vendor’s computers that, due to improper security controls at Target, made its way into Target’s network. A phishing attack was the first point of compromise.
Thankfully, there is a solution. Protocols like SPF (authentication) and DKIM (integrity) add the missing security to email. The last missing piece was added as recently as 2012, when Agari, along with a group of leading internet companies including Google, Yahoo, PayPal, Twitter and Facebook, created a new standard called DMARC that finally allows email senders to instruct email receivers how to treat unauthenticated emails.
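In practice, a domain owner publishes its DMARC policy as a DNS TXT record at a well-known name; a representative record (the domain and reporting address are placeholders) looks like this:

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com"
```

Here `p=reject` tells receiving mail servers to discard messages that fail SPF/DKIM alignment with the sending domain, and `rua` specifies where aggregate failure reports should be sent, which is exactly the feedback stream that services like Agari collect and analyze.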
This fundamental leap forward in email architecture is what excited us about investing in Agari. However, making this transition won’t happen overnight; it’s complex and requires ongoing monitoring and management. Additionally, a new level of threat intelligence can be gathered from attempted attacks. That is exactly where Agari comes in, helping organizations protect their brands from email abuse and secure their email channel. The Agari cloud-based SaaS solution aggregates data from 2.5 billion mailboxes to help global brands eliminate email threats, protect customers and their personal data, and proactively guard brand reputation. Today, Agari analyzes over 6 billion messages per day, identifies over 2 million malicious URLs per month, and blocks over 200 million malicious emails per month.
We’re thrilled to join the Agari team and help them scale the business.
Software companies are acquiring and accumulating ever-larger sets of data, driven by reduced storage costs and new database paradigms. But as my colleague Cack Wilhelm wrote earlier this week, “‘Big data,’ however large, sitting idle in a data store is not adding value to an enterprise … Data must be consumerized easily for business stakeholders in order to uncover insights and drive predictions that are specific to solving each business problem.” Software companies are moving beyond retrospective analysis and are instead looking to leverage this data to better understand and predict what is to come, often by using machine learning. We call this trend among next-generation software companies “automation by algorithm,” and I predict that it is going to eat up software.
The technology of using machine learning for predictive analytics is not new. However, a quick glance at Google Trends shows increasing interest in predictive analytics, and it is both interesting and important to understand why this trend is growing.
Search Term: Predictive Analytics
The rise of big data (both the decreasing cost of storage and non-relational database formats) is likely the largest catalyst. But there are other factors as well, such as the proliferation of the cloud, distributed computing, and better hardware, which have made machine learning algorithms faster and easier to run. There is also an organizational shift at large, with cutting-edge software companies being the first to adopt; now, smaller organizations are following the trend as they acquire talent from larger companies. There has also been recent innovation in machine learning algorithms, with random forests (early 2000s) and modern deep learning techniques (mid-2000s) both emerging in roughly the past decade.
It is easiest to understand “automation by algorithm” by looking at examples such as our portfolio company Sailthru. It is common practice to optimize the frequency of email sends against open rates. This quickly becomes complex as you add dozens of possible variables (content of the email, time of day, subject, recipient demographic, pricing, etc.). A data scientist may be able to optimize one instance, but the ongoing complexity means that only a machine-learning algorithm can continually optimize for the best result, incorporating feedback as performance changes. Sailthru has developed such technology (and more), allowing any customer to easily integrate this functionality without requiring in-house knowledge of machine learning. There are many alternatives when seeking an email vendor, but the ability to automate the algorithm and deliver it as an application/service is what distinguishes the company from its peers.
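A toy sketch of that continual-optimization loop is an epsilon-greedy bandit over candidate send times (a generic heuristic, not Sailthru’s actual algorithm; all names and numbers below are invented):

```python
import random

def choose_send_time(open_counts, send_counts, epsilon=0.1):
    """Pick the next send time: explore a random arm with probability
    epsilon, otherwise exploit the best observed open rate so far."""
    times = list(send_counts)
    if random.random() < epsilon:  # explore occasionally
        return random.choice(times)
    # exploit: highest open rate observed to date (guard against zero sends)
    return max(times, key=lambda t: open_counts[t] / max(send_counts[t], 1))

# Invented performance data for three candidate send times:
sends = {"9am": 100, "1pm": 100, "8pm": 100}
opens = {"9am": 22, "1pm": 31, "8pm": 18}

best = choose_send_time(opens, sends, epsilon=0.0)  # pure exploitation
print(best)  # 1pm
```

As new opens and sends accumulate, the same call keeps re-balancing toward whichever arm currently performs best, which is the kind of feedback loop a one-off data-science analysis cannot sustain.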
Rather than broad, horizontal solutions, we believe the greatest near-term opportunity exists in specific use-case applications that leverage machine learning. This allows for the strongest selection and optimization of the machine-learning algorithm for the use case. Further, we have noticed that companies with the largest data sets have a clear advantage and therefore the most success. The data set can be accumulated, accessed through partners, or purchased; the source is less relevant than the volume (though the source is a potential for long-term defensibility). Early adoption of automation by algorithm is evident in data-rich areas such as marketing and finance. However, we’ve also seen great use cases in human resources, sales, and industry-specific vertical software. Without intending to be exhaustive, a few areas we have found particularly interesting are:
Back in April, my colleague Stacey Bishop blogged about our growing interest in this area. We saw that several of our portfolio companies had either too many leads or too few. Those with too many are left hopelessly trying to figure out which leads to focus on. Those with too few are often challenged with understanding where they should find their next lead. We’re really excited about companies using automation by algorithm to help companies focus on the right leads and identify new ones by leveraging existing customer and pipeline data.
Managing hundreds or thousands of customers is tough, and monitoring all the events that can lead to churn becomes impossible. Automation by algorithm can be used in retention software to predict customer behavior and churn events. We are particularly excited about companies that leverage not only direct sources of data such as payment schedule, engagement, and feature requests, but also external data about the customer. For example, monitoring headcount might reveal a sudden, significant reduction in force that indicates financial hardship (and therefore a temporary easement on payment may prevent a churn event).
Reviewing thousands of resumes is inefficient and suffers from human bias. Meanwhile candidates are putting all their career data online, publishing their work online, and demonstrating qualifications in dozens of other ways across the Internet. The resulting data footprint holds a potential for strong insights into the recruiting process. We believe that there is a large opportunity for recruiting software to leverage automation by algorithm and change the way managers recruit.
Online fraud is exploding. Many first and second generation software solutions are struggling as location has become mobile and online personas have become more complicated to verify. At the same time, more behavioral, social, and other third party data is available than ever before. We are very excited about companies that are leveraging all of this data into their software and using automation by algorithm to help companies combat fraudsters.
Note: We were very fortunate to have Xiaonan Zhao join us for the summer. Her previous experience in machine learning at Google and her enthusiasm for the area were crucial in the development of our thinking. She is currently a second-year at Harvard Business School and contemplating her post-graduation plans.
The term ‘big data’ has assumed different definitions as the industry has developed, with no universal consensus around whether it refers to the actual quantity and complexity of the data or the software and tools intended to make sense of the data – or a combination of both.
That said, most of us can agree that the infrastructure and tools focused on tackling big data challenges have progressed steadily. I worked at Cloudera in 2010 and 2011 and, at the time, in the eyes of our customers big data was Hadoop, and HDFS + MapReduce were the main attractions. Tech-savvy Cloudera customers were just beginning to experiment with 10 and 20 node clusters, nothing like the thousand node clusters you witness today.
Looking back over the past five years, Hadoop, NoSQL stores, and AWS have enabled very inexpensive data storage and compute, allowing companies to capture and retain more data. This data is as varied as video clips, call records, log files, machine-to-machine output, and social media streams, but taken together it is seen as an organizational asset and a competitive advantage.
As Hadoop and NoSQL took root, other data-related trends have emerged or persisted, increasing raw data volumes and reinforcing the need for cheap data storage, including:
Companies have ample internal and external data to store, and they have a cost-effective way to do so: Hadoop or NoSQL paired with inexpensive commodity hardware, made possible by the endurance of Moore’s law and the resulting capacity improvements. To borrow an idea from The Second Machine Age, the database layer has become the “general purpose technology,” and from here companies are building complementary innovations to put the data to use, everything from Paxata for data preparation to Looker for data exploration and interaction. The next five years are going to be focused not on storing the data but on how to leverage and monetize all of it.
‘Big data,’ however large, sitting idle in a data store is not adding value to an enterprise until tools exist to deliver analytics that access data in disparate locations, democratize the analysis currently performed by data scientists, and inform. Data must be consumerized easily for business stakeholders in order to uncover insights and drive predictions that are specific to solving each business problem; there is a dearth of data scientists and developers may not be informed of the underlying business questions.
Most agree that traditional business intelligence (BI) is being overshadowed by tools that are prescriptive and predictive in addition to being descriptive or diagnostic. Ad hoc analysis and advanced visualizations are increasingly utilized alongside (or instead of) reporting and monitoring. I predict that over time simple reporting and monitoring capabilities will come ‘free’ with many software applications as a standard function, just as APIs are now free for most tools.
With the general purpose database-level technology in place and the complementary data analysis tools maturing, the pivotal next step is closing the loop between data outputs and action. This is not action by humans but action triggered by a machine or computer. Machine learning, predictive analytics, and statistical analysis are a few emergent trends focused on making automation a reality. Just as ‘big data’ became a nebulous term, each of machine learning, statistical analysis, and predictive analytics suggests a different meaning to different constituents: for example, machine learning may relate more to algorithms or deep learning, whereas statistical analysis is associated with random forests and logistic regressions. Regardless of the nuances, each analytic approach aims to automate what was once a human-powered action made in reaction to data served up by computer analysis.
Today much of this is black box, in that the mathematical machinations are obscured from the user. In another couple of years, white-box, closed-loop analytics powered by machine learning algorithms may well be the new general purpose technology. Context Relevant and Domino Data Labs appear to be well on their way. This trend, in turn, will enable the next crop of complementary innovations to emerge.
In a business like venture capital, it’s all about people and how they work together as a team. That’s why I am thrilled to announce some new members of our team & two promotions!
Cack Wilhelm has joined us as Principal, with a focus on cloud infrastructure investments. Cack was with us last summer before finishing her MBA at the University of Chicago Booth School of Business. During her summer here at ScaleVP, Cack focused on the twin themes of business processes being automated by algorithms and big data being analyzed for insight. Cack draws on her recent experience in tech sales at the Hadoop company Cloudera and, before that, at Oracle. Her technology, sales and startup background is a perfect addition to our infrastructure team and our portfolio.
Cack is also a dedicated runner, racing professionally for Nike for two years at the 5,000m distance and competing in track & field and cross-country while an undergraduate at Princeton. She already has visions of us competing in corporate challenges so expect to see more of us on the track!
Next up, Rose Yuan joins us as Associate. Most recently Rose was an analyst with J.P. Morgan Chase in technology banking, where she worked on the advisory team for the sale of ScaleVP portfolio company ExactTarget to Salesforce. Rose will work with us on investments across the portfolio in the SaaS, Cloud and Mobile sectors.
And while we are excited about the new members of our team, we continue to invest in our existing team and are proud when they succeed, since they represent the future of our business.
Ariel Tseitlin has been promoted to Partner. He joined us almost a year ago from Netflix where he was Director of Cloud Solutions. Ariel has been a great addition to the team getting involved in all aspects of the business from joining portfolio boards, to working with Andy Vitus on further evolving our IT sector focus, and even recruiting an EIR for Scale. He has sourced a number of interesting new companies so expect us to be announcing a new investment with Ariel as the ScaleVP lead sometime soon.
I am also proud to announce that we have promoted Susan Liu to Senior Associate. Susan has become a key member of our investment team, working closely with our SaaS team on the sourcing and evaluation of deals. She has worked on a number of ScaleVP investments including Bill.com, Bizible, DemandBase and particularly WalkMe, which she sourced directly at an industry conference.
Finally, congrats to EIR Bill Burns, who recently accepted the new CISO position at Informatica. Ariel introduced us to Bill, who had been the Director of Information Security at Netflix. Bill was our most recent EIR, spearheading a research project in which he examined the priorities of 100 CISOs, what innovations they’re focused on, and how they are planning to help businesses take smart risks. He hosted weekly brown bag lunches to educate us on the inner workings of information security organizations and keep us all informed on what he was learning from the contacts he made through his work with the ISSA CISO Forum and RSA. The results from his research were insightful and continue to shape our investment thesis around security. As is always the case, our EIRs remain part of our network. Last week, Bill joined Ariel & Cack in co-hosting a roundtable dinner to talk about security with some of our closest IT relationships. Thanks, Bill, we wish you well!
It’s the longest-running joke in advertising, coined by the late John H. Wanamaker: “Half my advertising is wasted; I just don’t know which half.”
Enter Bizible. Today, ScaleVP is announcing our latest digital marketing investment in Bizible, which provides CMOs with performance management software to answer the question of where to direct their marketing budgets. ScaleVP has a long history of investing in successful digital marketing companies: we’ve funded HubSpot, Datasift, Sailthru, and Demandbase, and our exited investments include such names as Omniture, ExactTarget, and Vitrue.
Earlier this year, we announced that former ExactTarget CMO, Tim Kopp, was joining ScaleVP as an Executive-in-Residence. In that same blog post, we highlighted the key marketing pain points that CMOs, such as Tim, consistently share with us. Arguably, chief among these pain points is the ability to quantify the impact of marketing spend. Specifically, CMOs struggle to tie this spend – and the discrete programs it supports – to net new prospects and customers. They know that their marketing programs are driving results, but without the performance data, they’re unable to discern which programs deserve the credit. Bizible solves this problem.
Founded by Microsoft/Bing advertising veterans, Bizible sheds light on which online marketing programs are delivering the highest ROI.
Bizible connects all of a company’s digital marketing programs to its CRM system, enabling CMOs and demand generation teams to attribute revenue to paid search, social, email, display, and other campaigns.
Bizible surfaces this data through an elegant and easy-to-use interface that intuitively shows where marketing programs are working and where they’re not. In fact, today the company announced a new product offering targeted to business-to-business CMOs that allows them to quickly:
In short, Bizible clarifies what’s working and what’s not. Companies that implement Bizible can identify the source of their leads and demonstrate the value that marketing delivers. Most importantly, there’s no more “wasted” advertising.
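The attribution problem Bizible tackles can be made concrete with a toy example. Below is a minimal sketch (not Bizible’s actual method) of splitting one closed deal’s revenue across the channels that touched it under three common attribution models; the function and channel names are illustrative:

```python
from collections import defaultdict

def attribute_revenue(touchpoints, revenue, model="last_touch"):
    """Split a deal's revenue across the marketing channels that
    touched it, under a chosen attribution model.

    touchpoints: ordered list of channel names for one closed deal,
                 e.g. ["paid_search", "email", "display"]
    model: "first_touch", "last_touch", or "linear" (equal split)
    """
    credit = defaultdict(float)
    if not touchpoints:
        return dict(credit)
    if model == "first_touch":
        credit[touchpoints[0]] += revenue
    elif model == "last_touch":
        credit[touchpoints[-1]] += revenue
    elif model == "linear":
        share = revenue / len(touchpoints)
        for channel in touchpoints:
            credit[channel] += share
    else:
        raise ValueError(f"unknown model: {model}")
    return dict(credit)

# One $30,000 deal touched by three channels:
journey = ["paid_search", "email", "display"]
print(attribute_revenue(journey, 30000, "last_touch"))  # all credit to display
print(attribute_revenue(journey, 30000, "linear"))      # $10,000 per channel
```

The point of the sketch is that the same deal produces very different ROI pictures under different models, which is why tying CRM revenue back to touchpoint data, rather than guessing, matters.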
Advances in technology are being embraced both by security teams protecting sensitive corporate data and by the sophisticated criminals trying to disrupt business. Addressing security, privacy, and compliance concerns is often cited by businesses as a top priority before adopting Cloud or BYOD technologies. As a member of and advisor to Wisegate, I partnered with the IT advisory service to survey over a hundred security leads to learn what’s most important to CISOs, what innovations they’re focused on to address their most pressing problems, and how they’re planning to help businesses take smart risks.
New battlefields, same war. CISOs remain vigilant on the fundamentals – Malware Outbreaks and Data Breaches. Security teams confront growing risks on many fronts, from new technologies to external threat actors. Driving their security strategies are six technology trends and five top risks.
Security programs prioritize risks and business alignment, but lack tools to draw the big picture. Their risks are increasing, but only half can efficiently report risk status to their Boards and internal business partners.
As IT hands off infrastructure control, CISOs focus on the data, embracing shared-risk models – a nod to the expanding universe of user devices and the dissolving enterprise perimeter.
Automate all the things. CISOs push automation and orchestration to manage point-solution sprawl. Consolidation and automation are top areas of focus for improving security program maturity. Three-quarters of CISOs are building or integrating solutions to address their top risks, and APIs are among the most frequently requested features in modern security solutions.
Six Top Technology Trends
Businesses are embracing and realizing the productivity benefits of BYOD, Everything as a Service, and ubiquitous connectivity from their mobile devices. One consequence of these shared-risk models is losing visibility and manageability over endpoints, applications, and networks. These advances have impaired traditional security controls based on traffic inspection, blacklist signature-matching, and device management. Businesses have accepted these risks in exchange for their benefits, driving innovation in alternative solutions to secure corporate data. Advances in predictive and behavioral analytics, Cloud Access Security Brokers, and SecDevOps methodologies, for example, help security teams remain effective at addressing their top risks — harmful applications and the loss of sensitive corporate data.
Five Top-Of-Mind Risks
Five risks capture 51% of CISOs’ top concerns, and are increasing industry-wide. Two risks in particular – Malware Outbreak and Sensitive Data Breach – account for nearly one-third of all CISOs’ attention. They were more important to participants than the next six identified risks combined. Beyond these top five, risk priorities quickly become diffuse and follow a power-law curve – indicating that security priorities differ across industries, companies, and program maturities. See our earlier post for a “word cloud” version of top risks.
Although Malicious Insider Threats receive a lot of press and are harder to detect, external threat actors pose a greater overall concern. Verizon’s 2014 Data Breach Incident Report indicates that only 8% of reported data breaches involved malicious insiders. Our data suggests that although the “insider threat” is a concern, it’s not in participants’ top three. Encouragingly, behavior-based controls show promise at detecting anomalies in both endpoint application execution and activity logs — we look forward to more innovations in both of these top-risk areas.
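The behavior-based controls mentioned above generally work by baselining normal activity and flagging large deviations. As a hedged illustration (a deliberately simplified sketch, not any vendor’s actual algorithm), here is a z-score check over a user’s activity counts; the function name and data are invented for the example:

```python
import statistics

def flag_anomalies(baseline, observed, threshold=3.0):
    """Flag activity counts that deviate from a user's historical
    baseline by more than `threshold` standard deviations -- the
    basic idea behind behavior-based detection of both compromised
    accounts and malicious insiders.

    baseline: list of, say, daily file-download counts for one user
    observed: recent counts to score against that baseline
    Returns (count, z-score) pairs for the flagged observations.
    """
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline) or 1.0  # guard a flat baseline
    return [(x, round((x - mean) / stdev, 1))
            for x in observed
            if abs(x - mean) / stdev > threshold]

# A user who normally downloads ~20 files a day suddenly pulls 900:
history = [18, 22, 19, 25, 21, 17, 23, 20]
print(flag_anomalies(history, [19, 24, 900]))  # flags only the 900-download day
```

Real products layer in peer-group comparisons, seasonality, and many more signals, but the core convergence on “what is normal for this user” is the same.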
Risk-Based, Business-Aligned Programs
With so many possible ways for harm to affect a company and its data, how do information security programs prioritize what to focus on, which threats teams should address first, and when they should change their focus? Teams overwhelmingly follow “Risk-Based” approaches, and look forward about two years when reviewing their strategic roadmaps. As security products themselves implement more cloud-based controls (either directly or embedded within vendor products), security teams will stay agile amidst ever-evolving threats.
Q: How do you prioritize your security program?
Q: How far out do you plan your strategic roadmap?
But Metrics and Reporting Impact Are Lacking
Despite being able to identify their top risks, half of our participants admitted they didn’t have good ways to measure the status of these risks or how effective their programs were at addressing them. This is surprising and concerning – imagine trying to fly an airplane at night with your three most important cockpit indicators missing.
Security and risk management are becoming Board-level discussions, and government and industry regulations are requiring better risk monitoring and controls. While many security products do provide dashboards, those tend to be specific to that product’s threats and activities. What’s needed are efficient ways to map all of this event data into holistic, business-level perspectives.
CISOs are looking to put security controls as close as possible to enterprise data, versus focusing on specific device types or threats. Information Protection and Control products (“IPC”, including DLP/DRM/masking/encryption technologies) were the #1 desired control to apply on computers, at the infrastructure layer, within applications, and on Mobile endpoints.
Data-centric security will become increasingly important as emerging “Cloud Always” companies implement modern enterprise stacks, established enterprises refresh their technologies with “Cloud First” initiatives, and even “Cloud Cautious” companies realize the benefits of SaaS and IaaS. By focusing on capabilities and adherence to data-centric security controls instead of specific device types, security teams can support a wider range of BYOD endpoints and applications. Storing and processing corporate data within SaaS and PaaS providers becomes less risky if the enterprise manages the encryption keys. There are many operational considerations to address, such as enterprise search and key distribution, but this is a promising area to address a wide range of risks.
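One concrete form of the data-centric controls described above is tokenization: sensitive values are swapped for random tokens before records leave the enterprise, so the SaaS provider never holds the real data, and the token-to-value map stays on-premises, just as encryption keys would. The sketch below is purely illustrative (the class and field names are invented, and a production vault would add access control, persistence, and auditing):

```python
import secrets

class TokenVault:
    """Minimal sketch of data-centric protection via tokenization.
    The vault (token -> original value) never leaves the enterprise;
    only tokens travel to the SaaS provider. The same control model
    applies to encryption when the enterprise manages the keys.
    """
    def __init__(self):
        self._vault = {}  # token -> original value, kept on-premises

    def tokenize(self, value):
        token = "tok_" + secrets.token_hex(8)  # random, non-reversible token
        self._vault[token] = value
        return token

    def detokenize(self, token):
        return self._vault[token]

vault = TokenVault()
record = {"name": "Alice Example", "ssn": "123-45-6789", "plan": "gold"}

# Only the sensitive field is replaced before upload to the SaaS app:
outbound = dict(record, ssn=vault.tokenize(record["ssn"]))
print(outbound["ssn"])                    # an opaque token, e.g. tok_...
print(vault.detokenize(outbound["ssn"]))  # original value, recovered on-prem
```

Because the control travels with the data rather than the device, the same protection holds whether the record is accessed from a managed PC or a BYOD tablet.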
Security teams are consolidating and automating their controls
Finally, several participants remarked that they’re concerned about managing an ever-expanding set of security point solutions. Even if security teams could easily find qualified staff to run new controls, they get better efficiency by driving security initiatives via automation and APIs. This is consistent with our investment in Chef and the notion that good security posture is built on solid operational controls and consistent configuration management.
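The convergence model that configuration-management tools such as Chef popularized can be sketched in a few lines: describe the desired state, compute only the delta from the current state, and make the operation idempotent so re-running it changes nothing. The example below is a simplified illustration with invented rule names, not Chef’s actual resource model:

```python
def ensure_firewall_rules(current, desired):
    """Idempotent 'desired state' sketch: compute only the changes
    needed to converge a host's firewall rules on policy -- the
    convergence model configuration-management tools are built on.
    Running it a second time produces no further changes.
    """
    to_add = sorted(desired - current)     # rules missing from the host
    to_remove = sorted(current - desired)  # drift to be cleaned up
    converged = (current | desired) - set(to_remove)
    return converged, to_add, to_remove

policy = {"allow 443/tcp", "allow 22/tcp from mgmt"}
host = {"allow 443/tcp", "allow 23/tcp"}  # telnet open: drift from policy

state, added, removed = ensure_firewall_rules(host, policy)
print(added)    # ['allow 22/tcp from mgmt']
print(removed)  # ['allow 23/tcp']
state2, added2, removed2 = ensure_firewall_rules(state, policy)
print(added2, removed2)  # second run converges: [] []
```

Expressing security controls this way is what makes them automatable through APIs: the policy is data, the enforcement is a repeatable function, and audit is a diff.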
Q: For which risks did you need to build something in-house because there were no acceptable commercially available alternatives?
We met with nearly two dozen CISOs across more than 15 industries, asking about trends, externalities, and what they’re focusing on to manage their enterprise risks. We then partnered with Wisegate, who surveyed their members to get a broader, in-the-trenches perspective on security practices and strategies. This increased our total, usable sample size to just over 100.
There was strong consistency between the two data sets, and we saw some “spreading” amongst priorities, implying that both program maturity and product choice are alive and well within the Information Security market. We also collected attributes about InfoSec programs and heard glimpses of what makes security programs successful. I will be presenting more details on these findings at the Gartner Catalyst Conference in San Diego on August 11, 2014.
We hope these findings are helpful to both enterprise security teams and security startups contemplating new approaches. We’d like to thank our CISO survey participants for their time and insights, and Wisegate for their expertise. Future InfoSec opportunities are large, defenders are eager to gain new capabilities, and the market is ready for new innovations to disrupt the status quo.