Amazon’s AWS re:Invent conference has become the must-attend event of the year for anyone even tangentially involved with the cloud. This is now my third year attending re:Invent, and each year it continues to improve, with attendance increasing roughly 50% every year. The sessions from Amazon and customers were incredibly insightful and I plan to spend many hours catching up on videos of sessions I couldn’t attend. For those not able to make it, I have shared my main takeaways below.
Offense over defense. In the past, Andy Jassy’s keynote was filled with references to other cloud providers and reactive moves like matching pricing cuts. With the exception of the EC2 Container Service coming on the heels of Google’s Container Engine, none of that happened this time around. Even though Google announced price cuts, nothing of the sort came from Amazon. In my conversations with AWS insiders, the mood is apparently shifting toward a focus on innovation rather than matching. I anticipate that price drops will continue, but it is clear those will be on Amazon’s schedule.
It’s all about the enterprise. Amazon started by appealing to developers. They are now focusing squarely on the enterprise. While past keynotes featured early adopters like Netflix and NASA, this year’s keynote moved into the early majority, highlighting companies like Philips, Johnson & Johnson, and Intuit. Nike was one interesting company to highlight. Aside from throwing an amazing after-party, they spoke about their journey to a cloud-native microservices architecture, going from taking months to deploy a new feature in 2011 to a few hours in 2014. A shoe company not only moving into the cloud but fully embracing it shows us what’s ahead as the rest of the non-tech world realizes the full benefits of operating in the cloud.
AWS innovation engine shows no signs of slowing down. Amazon announced a truly impressive number of new services at the conference, impressive in both quantity and quality.
Of all the announcements, Lambda was the one I found most interesting, and I think that many in the audience shared my sentiment. Lambda creates an entirely new programming model for event-driven asynchronous programs that leverages what AWS does best: abstracting away the undifferentiated heavy lifting of building infrastructure and freeing developers to focus on the more differentiated application logic. I also found it the most unexpected of the menu of new services. The other product launches were natural incremental features to existing services or use cases. Take for example the EC2 Container Service: this was expected in reaction to Google’s announcement a week prior. Lambda was creative, unique and unexpected and it is that innovation which will help Amazon maintain the wide lead it has established in the race to cloud dominance.
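For readers unfamiliar with the model, the sketch below shows roughly what a Lambda-style handler looks like: code that runs once per event, with no servers to manage. Note that Lambda launched with Node.js support; this Python sketch is only meant to illustrate the shape of the model. The event follows the general structure of an S3 “object created” notification, and the bucket and key names are hypothetical.

```python
# A minimal sketch of the event-driven model Lambda popularized: the platform
# invokes your handler once per event; you write only the application logic.

def handler(event, context):
    """Invoked per event; returns a summary of the objects processed."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # The differentiated application logic goes here --
        # e.g., resize an image, index a log line, update a cache.
        processed.append(f"{bucket}/{key}")
    return {"processed": processed}
```

Everything outside the handler body (provisioning, scaling, routing events to code) is what AWS takes off the developer’s plate.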
What I find most rewarding about attending conferences such as re:Invent is the opportunity to meet with other people interested in similar spaces and technologies. It was wonderful to reconnect with everyone from AWS with whom I had previously worked at Netflix and to catch up with the rest of the clouderati in attendance. The Expo, hallway, and drink conversations alone are worth the price of admission.
Already looking forward to seeing you next year at re:Invent.
Today, I am proud to announce our investment in Agari, a provider of real-time, data-driven security solutions that detect and prevent advanced email cyberthreats. Why Agari? Consider this: sensitive information of over 110 million people in the U.S. was stolen in the Target breach, and it all started with a phishing email. The lack of email authentication has remained a source of massive vulnerability for users and companies in the decades since email’s inception.
The foundation of the Internet that we all know and love today was formed in the 1960s and 1970s out of military research that built ARPANET, the first packet-switched network (which later standardized on the TCP/IP protocol). One of the first “killer apps” built on this new infrastructure was a way for the then-small group of researchers to exchange digital messages with one another. At the heart of this electronic mail subsystem was SMTP, a protocol as fundamental to email as TCP/IP is to the Internet. But because email was then intended for a small, trusted group, authentication was never built into the protocol.
Fast-forward to the modern day and we find that online users number in the billions and more than half of all email sent is spam. Phishing has become a major attack vector for the bad guys to gain an initial foothold past well-intentioned security defenses. Our recent security survey of over 100 CISOs revealed that one-third of those surveyed list data breaches and malware outbreaks as a top concern for their business, both issues that involve email as the attack vector.
The most impactful breach of the past twelve months has been the highly publicized Target data breach, in which 40 million credit cards and 70 million identities of Target shoppers, including names, addresses, email addresses and phone numbers, were stolen. How did the bad guys get in? A successful phishing attack on Target’s HVAC vendor installed malware on the vendor’s computers that, due to improper security controls at Target, made its way into Target’s network. A phishing attack was the first point of compromise.
Thankfully, there is a solution. Protocols like SPF (authentication) and DKIM (integrity) add the missing security to email, and the last missing piece was added as recently as 2012. Agari, along with a group of leading internet companies including Google, Yahoo, PayPal, Twitter, and Facebook, created a new standard called DMARC that finally allows email senders to instruct email receivers how to treat unauthenticated email.
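To make the mechanics concrete: a DMARC policy is simply a DNS TXT record published at _dmarc.&lt;domain&gt; telling receivers what to do with mail that fails SPF/DKIM alignment. The small helper below just assembles such a record string; the domain and reporting address are hypothetical examples, not anything Agari-specific.

```python
# Sketch of a DMARC v1 policy record. "p=" is the policy receivers should
# apply to unauthenticated mail; "rua=" is where aggregate reports go.

def dmarc_record(policy="quarantine", rua="mailto:dmarc-reports@example.com"):
    """Build a DMARC TXT record value.

    policy: "none" (monitor only), "quarantine", or "reject".
    rua:    address receivers send aggregate authentication reports to.
    """
    assert policy in ("none", "quarantine", "reject")
    return f"v=DMARC1; p={policy}; rua={rua}"

# Published in DNS at, e.g., _dmarc.example.com:
print(dmarc_record("reject"))  # v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com
</antml_ignore>```

The reporting feedback loop ("rua") is also where the threat-intelligence angle comes from: senders learn who is attempting to spoof their domain.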
This fundamental leap forward in email architecture is what excited us about investing in Agari. However, making this transition won’t happen overnight. It’s complex and requires ongoing monitoring and management. Additionally, a new level of threat intelligence can be gathered from attempted attacks. That is exactly where Agari comes in, helping organizations protect their brands from email abuse and secure their email channel. The Agari cloud-based SaaS solution aggregates data from 2.5 billion mailboxes to help global brands eliminate email threats, protect customers and their personal data, and proactively guard brand reputation. Today, Agari analyzes over 6 billion messages per day, identifies over 2 million malicious URLs per month, and blocks over 200 million malicious emails per month.
We’re thrilled to join the Agari team and help them scale the business.
Software companies are acquiring and accumulating ever-increasing sets of data, driven by reduced costs and new database paradigms. But as my colleague Cack Wilhelm wrote earlier this week, “‘Big data,’ however large, sitting idle in a data store is not adding value to an enterprise … Data must be consumerized easily for business stakeholders in order to uncover insights and drive predictions that are specific to solving each business problem.” Software companies are moving beyond retrospective analysis and instead are looking to leverage this data to better understand and predict what is to come – often by using machine learning. We call this trend among next-generation software companies “automation by algorithm,” and I predict that it is going to eat up software.
The technology of using machine learning for predictive analytics is not new. However, a quick glance at Google Trends shows increasing interest in predictive analytics. It is both interesting and important to understand why this trend is growing.
Search Term: Predictive Analytics
The rise of big data (both the decreasing cost of storage and non-relational database formats) is likely the largest catalyst. But there are other factors as well, such as the proliferation of the cloud, distributed computing, and better hardware, which have made machine learning algorithms faster and easier to run. There is also an organizational shift at large, with cutting-edge software companies being the first to adopt; now smaller organizations are following the trend as they acquire talent from larger companies. There has also been recent innovation in machine learning algorithms, with random forests (early 2000s) and modern deep learning (mid-2000s) both emerging in roughly the past decade.
It is easiest to understand “automation by algorithm” by looking at examples such as our portfolio company Sailthru. It is common practice to optimize the frequency of email sends against open rates. This quickly becomes complex as you add dozens of possible variables (content of the email, time of day, subject, recipient demographic, pricing, etc.). A data scientist may be able to optimize one instance, but the ongoing complexity means that only a machine-learning algorithm can continually optimize for the best result, incorporating feedback as performance changes. Sailthru has developed such technology (and more), allowing any customer to easily integrate this functionality without requiring in-house knowledge of machine learning. There are many alternatives when seeking an email vendor, but the ability to automate the algorithm and deliver it as an application / service is what distinguishes the company from its peers.
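As a toy illustration of this kind of continual optimization (not Sailthru’s actual method, and deliberately simplified to a single variable), the sketch below uses an epsilon-greedy bandit to learn which send hour yields the best open rate, updating its estimate as feedback arrives:

```python
# Epsilon-greedy optimization of email send time: mostly exploit the hour
# with the best observed open rate, but occasionally explore other hours so
# the system keeps adapting as audience behavior changes.
import random

class SendTimeOptimizer:
    def __init__(self, hours, epsilon=0.1, seed=0):
        self.hours = list(hours)
        self.epsilon = epsilon          # fraction of sends used for exploration
        self.rng = random.Random(seed)
        self.sends = {h: 0 for h in self.hours}
        self.opens = {h: 0 for h in self.hours}

    def open_rate(self, hour):
        return self.opens[hour] / self.sends[hour] if self.sends[hour] else 0.0

    def choose_hour(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.hours)          # explore
        return max(self.hours, key=self.open_rate)      # exploit

    def record(self, hour, opened):
        """Feed back the outcome of a send so future choices improve."""
        self.sends[hour] += 1
        self.opens[hour] += int(opened)
```

A real system would condition on many more variables (content, subject, recipient demographics, pricing), but the core loop is the same: choose, observe, update, repeat.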
Rather than broad, horizontal solutions, we believe the greatest near-term opportunity exists in specific use-case applications that leverage machine learning. This allows for the strongest selection and optimization of the machine-learning algorithm for the use case. Further, we have noticed that companies with the largest data sets have a clear advantage and therefore the most success. The data set can be accumulated, accessed through partners, or purchased; the source is less relevant than the volume (though the source is a potential avenue for long-term defensibility). Early adoption of automation by algorithm is evident in data-rich areas such as marketing and finance. However, we’ve also seen great use cases in human resources, sales, and industry-specific vertical software. Without intending to be exhaustive, a few areas we have found particularly interesting are:
Back in April, my colleague Stacey Bishop blogged about our growing interest in this area. We saw that several of our portfolio companies had either too many leads or too few. Those with too many are left struggling to figure out which leads to focus on. Those with too few are often challenged to understand where they should find their next lead. We’re really excited about companies using automation by algorithm to help businesses focus on the right leads and identify new ones by leveraging existing customer and pipeline data.
Managing hundreds or thousands of customers is tough. Monitoring all the events that can lead to a churn event becomes impossible. Automation by algorithm can be used in retention software to predict customer behavior and likely churn events. We are particularly excited about companies that not only leverage direct sources of data such as payment schedule, engagement, and feature requests, but also external data about that customer. For example, monitoring headcount might reveal that a sudden, significant reduction in force could indicate financial hardship (and therefore a temporary easement on payment may prevent a churn event).
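To make this concrete, here is a deliberately simple churn-risk sketch; the feature names and weights are hypothetical, chosen only to show how direct product signals and an external signal like a headcount reduction might combine into one score:

```python
# Toy logistic churn-risk score. In practice the weights would be learned
# from historical churn data, not hand-set as they are here.
import math

WEIGHTS = {
    "days_since_last_login": 0.04,   # direct engagement signal
    "open_support_tickets": 0.3,     # direct product signal
    "headcount_drop_pct": 0.05,      # external signal: % reduction in force
}
BIAS = -3.0

def churn_risk(features):
    """Return a score in (0, 1); higher means greater churn risk."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))
```

A customer with no warning signs scores near zero, while one who stopped logging in, filed many tickets, and just announced layoffs scores high enough to trigger a proactive retention play.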
Reviewing thousands of resumes is inefficient and suffers from human bias. Meanwhile candidates are putting all their career data online, publishing their work online, and demonstrating qualifications in dozens of other ways across the Internet. The resulting data footprint holds a potential for strong insights into the recruiting process. We believe that there is a large opportunity for recruiting software to leverage automation by algorithm and change the way managers recruit.
Online fraud is exploding. Many first and second generation software solutions are struggling as location has become mobile and online personas have become more complicated to verify. At the same time, more behavioral, social, and other third party data is available than ever before. We are very excited about companies that are leveraging all of this data into their software and using automation by algorithm to help companies combat fraudsters.
Note: We were very fortunate to have Xiaonan Zhao join us for the summer. Her previous experience in machine learning while at Google and her enthusiasm for the area were crucial in the development of our thinking. She is currently a second-year at Harvard Business School and contemplating her post-graduation plans.
The term ‘big data’ has assumed different definitions as the industry has developed, with no universal consensus around whether it refers to the actual quantity and complexity of the data or the software and tools intended to make sense of the data – or a combination of both.
That said, most of us can agree that the infrastructure and tools focused on tackling big data challenges have progressed steadily. I worked at Cloudera in 2010 and 2011 and, at the time, in the eyes of our customers big data was Hadoop, and HDFS + MapReduce were the main attractions. Tech-savvy Cloudera customers were just beginning to experiment with 10 and 20 node clusters, nothing like the thousand node clusters you witness today.
Looking back over the past five years, Hadoop, NoSQL stores, and AWS have enabled very inexpensive data storage and compute, allowing companies to capture and retain more data. This data is as varied as video clips, call records, log files, machine-to-machine output, and social media streams, but taken together it is seen as an organizational asset and a competitive advantage.
As Hadoop and NoSQL took root, other data-related trends have emerged or persisted, increasing raw data volumes and reinforcing the need for cheap data storage, including:
Companies have ample internal and external data to store and they have a cost-effective way to do so: Hadoop or NoSQL paired with inexpensive commodity hardware made possible by the endurance of Moore’s law and the resulting capacity improvements. To borrow an idea from The Second Machine Age, the database layer has become the “general purpose technology,” and from here companies are building complementary innovations to put the data to use, everything from Paxata for data preparation to Looker for data exploration and interaction. The next five years are going to be focused not on storing the data but on how to leverage or monetize all of it.
‘Big data,’ however large, sitting idle in a data store is not adding value to an enterprise until tools exist to deliver analytics that access data in disparate locations, democratize the analysis currently performed by data scientists, and inform. Data must be consumerized easily for business stakeholders in order to uncover insights and drive predictions that are specific to solving each business problem; there is a dearth of data scientists and developers may not be informed of the underlying business questions.
Most agree that traditional business intelligence (BI) is being overshadowed by tools that are prescriptive and predictive in addition to being descriptive or diagnostic. Ad hoc analysis and advanced visualizations are increasingly utilized alongside (or instead of) reporting and monitoring. I predict that over time simple reporting and monitoring capabilities will come ‘free’ with many software applications as a standard function, just as APIs are now free for most tools.
With the general purpose database-level technology in place and the complementary data analysis tools maturing, the pivotal next step is closing the loop between data outputs and action. This is not action by humans but action triggered by a machine or computer. Machine learning, predictive analytics, and statistical analysis are a few emergent trends focused on making automation a reality. Just as ‘big data’ became a nebulous term, each of machine learning, statistical analysis, and predictive analytics suggests a different meaning to different constituents: for example, machine learning may relate more to algorithms or deep learning, whereas statistical analysis is associated with random forests and logistic regressions. Regardless of the nuances, each analytic approach aims to automate what was once a human-powered action made in reaction to data served up by computer analysis.
Today much of this is black box, in that the mathematical machinations are obscured from the user. In another couple of years, white-box, closed-loop analytics powered by machine learning algorithms may well be the new general purpose technology. Context Relevant and Domino Data Labs appear to be well on their way. This trend, in turn, will enable the next crop of complementary innovations to emerge.
In a business like venture capital, it’s all about people and how they work together as a team. That’s why I am thrilled to announce some new members of our team & two promotions!
Cack Wilhelm has joined us as Principal, with a focus on cloud infrastructure investments. Cack was with us last summer before finishing her MBA at the University of Chicago Booth School of Business. During her summer here at ScaleVP, Cack focused on the twin themes of business processes being automated by algorithms and big data being analyzed for insight. Cack draws from her recent experience in tech sales at the Hadoop company Cloudera and, before that, at Oracle. Her technology, sales, and startup background is a perfect addition to our infrastructure team and our portfolio.
Cack is also a dedicated runner, racing professionally for Nike for two years at the 5,000m distance and competing in track & field and cross-country while an undergraduate at Princeton. She already has visions of us competing in corporate challenges so expect to see more of us on the track!
Next up, Rose Yuan joins us as Associate. Most recently Rose was an analyst with J.P. Morgan Chase in technology banking where she worked as part of the advisory team on the sale of ScaleVP portfolio company, ExactTarget to Salesforce. Rose will work with us on investments across the portfolio in the SaaS, Cloud and Mobile sectors.
And while we are excited about the new members of our team, we continue to invest in our existing team and are proud when they succeed, since they represent the future of our business.
Ariel Tseitlin has been promoted to Partner. He joined us almost a year ago from Netflix, where he was Director of Cloud Solutions. Ariel has been a great addition to the team, getting involved in all aspects of the business, from joining portfolio boards to working with Andy Vitus on further evolving our IT sector focus, and even recruiting an EIR for Scale. He has sourced a number of interesting new companies, so expect us to be announcing a new investment with Ariel as the ScaleVP lead sometime soon.
I am also proud to announce that we have promoted Susan Liu to Senior Associate. Susan has become a key member of our investment team, working closely with our SaaS team on the sourcing and evaluation of deals. She has worked on a number of ScaleVP investments including Bill.com, Bizible, DemandBase and particularly WalkMe, which she sourced directly at an industry conference.
Finally, congrats to EIR Bill Burns who recently accepted the new CISO position at Informatica. Ariel introduced us to Bill, who had been the Director of Information Security at Netflix. Bill was our most recent EIR, spearheading a research project where he examined the priorities for 100 CISOs, what innovations they’re focused on, and how they are planning to help businesses take smart risks. He hosted weekly brown bag lunches to educate us on the inner workings of information security organizations and keep us all informed on what he was learning from the contacts he has made through his work with the ISSA CISO Forum and RSA. The results from his research were insightful and continue to shape our investment thesis around security. As is always the case, our EIRs remain part of our network. Last week, Bill joined Ariel & Cack in co-hosting a roundtable dinner to talk about security with some of our closest IT relationships. Thanks, Bill, we wish you well!
It’s the longest running joke in advertising, coined by the late John H. Wanamaker: “Half my advertising is wasted; I just don’t know which half.”
Enter Bizible. Today, ScaleVP is announcing our latest digital marketing investment in Bizible, which provides CMOs with performance management software to answer the question of where to direct their marketing budgets. ScaleVP has a long history of investing in successful digital marketing companies: we’ve funded HubSpot, Datasift, Sailthru, and Demandbase, and our exited investments include such names as Omniture, ExactTarget, and Vitrue. We are thrilled to add Bizible to this list.
Earlier this year, we announced that former ExactTarget CMO Tim Kopp was joining ScaleVP as an Executive-in-Residence. In that same blog post, we highlighted the key marketing pain points that CMOs such as Tim consistently share with us. Arguably, chief among these pain points is the difficulty of quantifying the impact of marketing spend. Specifically, CMOs struggle to tie this spend – and the discrete programs it supports – to net new prospects and customers. They know that their marketing programs are driving results, but without the performance data, they’re unable to discern which programs deserve the credit. Bizible solves this problem.
Founded by Microsoft/Bing advertising veterans, Bizible sheds light on which online marketing programs are delivering the highest ROI.
Bizible connects all of a company’s digital marketing programs to their CRM system, enabling CMOs and demand generation teams to attribute revenue to paid search, social, email, display, and other campaigns.
Bizible surfaces this data through an elegant and easy-to-use interface that intuitively shows where marketing programs are working and where they’re not. In fact, today the company announced a new product offering targeted to business-to-business CMOs that allows them to quickly:
In short, Bizible clarifies what’s working and what’s not. Companies that implement Bizible can identify the source of their leads and demonstrate the value that marketing delivers. Most importantly, there’s no more “wasted” advertising.
Advances in technology are being embraced both by security teams protecting sensitive corporate data and by the sophisticated criminals trying to disrupt business. Addressing security, privacy, and compliance concerns is often cited by businesses as a top priority before adopting Cloud or BYOD technologies. As a member of and advisor to Wisegate, I partnered with the IT advisory service to survey over a hundred security leads to learn what’s most important to CISOs, what innovations they’re focused on to address their most pressing problems, and how they’re planning to help businesses take smart risks.
New battlefields, same war. CISOs remain vigilant on the fundamentals – Malware Outbreaks and Data Breaches. Security teams confront growing risks on many fronts, from new technologies to external threat actors. Driving their security strategies are 6 technology trends and 5 top risks.
Security programs prioritize risks and business alignment, but lack tools to draw the big picture. Their risks are increasing, but only half can efficiently report risk status to their Boards and internal business partners.
As IT hands off infrastructure control, CISOs focus on the data. Shared risk models – a nod to the expanding universe of user devices and the dissolving enterprise perimeter.
Automate all the things. CISOs push automation, orchestration to manage point solution sprawl. Consolidation and automation are top areas of focus to improve security program maturity. Three-quarters of CISOs are building or integrating solutions to address their top risks. APIs are frequently requested features in modern security solutions.
Six Top Technology Trends
Businesses are embracing and realizing the productivity benefits of BYOD, Everything as a Service, and ubiquitous connectivity from their mobile devices. One of the consequences of these shared-risk models is losing visibility and manageability on endpoints, applications, and networks. These advances have impaired traditional security controls based on traffic inspection, blacklist signature-matching, and device management. Businesses have accepted these risks in exchange for their benefits, driving innovation for alternative solutions to secure corporate data. Advances in predictive and behavioral analytics, Cloud Access Security Brokers and SecDevOps methodologies, for example, help security teams remain effective at addressing their top risks — harmful applications and the loss of sensitive corporate data.
Five Top-Of-Mind Risks
Five risks capture 51% of CISOs’ top concerns, and are increasing industry-wide. Two risks in particular – Malware Outbreak and Sensitive Data Breach – account for nearly one-third of all CISOs’ attention. They were more important to participants than the next 6 identified risks combined. Beyond the top half, risk priorities quickly become diffuse and follow a power-law curve – indicating that security priorities differ across industries, companies, and program maturities. See our earlier post for a “word cloud” version of top risks.
Although Malicious Insider Threats receive a lot of press and are harder to detect, external threat actors pose a greater overall concern. Verizon’s 2014 Data Breach Incident Report indicates that only 8% of reported data breaches involved malicious insiders. Our data suggests that although the “insider threat” is a concern, it’s not in participants’ top-3. Coincidentally, behavior-based controls show promise at detecting anomalies in both endpoint application execution and activity logs — we look forward to more innovations in both of these top-risk areas.
Risk-Based, Business-Aligned Programs
With so many possible ways for harm to affect a company and its data, how do information security programs prioritize what to focus on, which threats to address first, and when to change focus? Teams overwhelmingly follow “Risk-Based” approaches, and look forward about 2 years when reviewing their strategic roadmaps. As security products themselves implement more cloud-based controls (either directly or embedded within vendor products), security teams will stay agile amidst ever-evolving threats.
Q: How do you prioritize your security program?
Q: How far out do you plan your strategic roadmap?
But Metrics and Reporting Impact Are Lacking
Despite being able to identify their top risks, one-half of our participants admitted they didn’t have good ways to measure the status of these risks or how effective their programs were at addressing them. This is surprising and concerning – imagine trying to fly an airplane at night with your three most important cockpit indicators missing.
Security and risk management systems are becoming Board-level discussions, and government and industry regulations are requiring better risk monitoring and controls. While many security products do provide dashboards, those tend to be specific to that product’s threats and activities. What’s needed are efficient ways to map all of this event data into holistic, business-level perspectives.
CISOs are looking to put security controls as close as possible to enterprise data, versus focusing on specific device types or threats. Information Protection and Control products (“IPC”, including DLP/DRM/masking/encryption technologies) were the #1 desired control to apply on computers, at the infrastructure layer, within applications, and on Mobile endpoints.
Data-centric security will become increasingly important as emerging “Cloud Always” companies implement modern enterprise stacks, established enterprises refresh their technologies with “Cloud First” initiatives, and even “Cloud Cautious” companies realize the benefits of SaaS and IaaS. By focusing on capabilities and adherence to data-centric security controls instead of specific device types, security teams can support a wider range of BYOD endpoints and applications. Storing and processing corporate data within SaaS and PaaS providers becomes less risky if the enterprise manages the encryption keys. There are many operational considerations to address, such as enterprise search and key distribution, but this is a promising area to address a wide range of risks.
Security teams are consolidating and automating their controls
And finally, several participants remarked that they’re concerned about managing an ever-expanding set of security point solutions. Even if security teams could easily find qualified staff to run new controls, they achieve better efficiency by driving security initiatives via automation and APIs. This is consistent with our investment in Chef and the notion that good security posture is based on solid operational controls and consistent configuration management.
Q: For which risks did you need to build something in-house because there were no acceptable commercially available alternatives?
We met with nearly two-dozen CISOs across more than 15 industries, asking about trends, externalities, and what they’re focusing on to manage their enterprise risks. We then partnered with Wisegate, which surveyed its members to get a broader, in-the-trenches perspective on security practices and strategies. This increased our total, usable sample size to just over 100.
There was strong consistency between the two data sets, and we saw some “spreading” amongst priorities, implying that both program maturity and product choice are alive and well within the Information Security market. We also collected attributes about InfoSec programs and heard glimpses of what makes security programs successful. I will be presenting more details on these findings at the Gartner Catalyst Conference in San Diego on August 11, 2014.
We hope these findings are helpful to both enterprise security teams and security startups contemplating new approaches. We’d like to thank our CISO survey participants for their time and insights, and Wisegate for their expertise. Future InfoSec opportunities are large, defenders are eager to gain new capabilities, and the market is ready for new innovations to disrupt the status quo.
In our last entry we discussed the benefit for early-product SaaS companies of initially targeting the SMB market with the goal of moving upmarket and capturing larger accounts over time. While this presents a smoother entry into the market, it creates a new set of challenges: when should you move upmarket, and when the timing is right, how do you do it? As investors and board members we work closely with our companies through this growth. It is a broad subject that could fill many chapters of a book. Instead, here, we have asked some of our portfolio executives to share lessons from their own experiences.
One executive summarized the upmarket timing simply with “you will know.” In board rooms we often get asked whether it is time to start pushing upmarket. In reality, it is a much more organic path: you get pulled upmarket by staying close to your customers, keeping an open line of communication internally and externally, and strategically investing in your product and organization as upmarket demand increases. This methodical approach also allows you to pace the disruption of an incumbent, making it harder for them to swiftly react.
During this past quarter our current EIR and security expert Bill Burns has been researching and building an investment thesis around the enterprise Information Security space.
We set out to determine which social and technological trends most affect information security programs, and how security organizations protect their corporate and customer information as companies evolve and adapt to these trends. We also wanted to see what changes in the threat landscape are impacting InfoSec teams and what perceived market gaps exist – e.g. problems without good solutions. It’s not uncommon for security to lag behind technology innovations, and different organizations have different risk appetites, budgetary constraints, and regulatory mandates. All of these factors affect how teams manage security and risk within a particular organization, and create opportunities for innovative startups.
I’m excited to share some of Bill’s early results and give back to the security community that has helped us so graciously with their time and insights.
To understand what market forces are at play within the security space, we reached out to nearly two-dozen CISOs across 15 industries. We wanted to know how their security programs are maturing and being disrupted, how effective CISOs are at dealing with substantial changes to their attack surface and sets of controls, and we wanted to know what “keeps them up at night” and how they’re meeting that challenge. What we heard in these interviews was refreshingly candid and insightful. They shared what controls are working, where they see concerns and opportunities ahead, and what they’re strategically focused on going forward.
What Keeps CISOs Up At Night?
Regarding these top-of-mind risks, the consensus was that teams want to approach security more as a “science” and less as an “art.” Security organizations were looking to:
We also asked CISOs to identify the external forces most likely to affect their security program strategy in the near future. It’s not surprising that Mobile Computing and Cloud technologies (both SaaS and IaaS) were the top security forcing functions, since they’re also top-of-mind for CIOs and organizations looking to improve employee productivity. Increases in Regulatory/Compliance pressure also ranked highly.
Securing Agile / DevOps methodologies also was top-of-mind for CISOs, which is encouraging. Our investment in Chef has shown that enterprises are adopting automation rapidly, and it’s great to see security teams embracing this trend to deploy security controls more consistently and continuously. We anticipated going into the research, and the participating CISOs confirmed, that automation and API-level access will become table stakes for security products going forward, to support a new level of interoperability and customization.
The use of BYOD and “Shadow IT” also scored as top concerns for many participants. These two technologies are managed by end users, which causes concern for those tasked with protecting the data being processed and managed. Data breach reports such as Verizon’s seminal Data Breach Investigations Report, along with post-mortems from high-profile incidents, frequently implicate poor security controls on endpoints and attackers socially engineering people to infiltrate companies and exfiltrate their data.
We also measured trends that have the least effect on security priorities, and those were more industry-specific. What matters least to a highly-regulated government contractor (that can afford to lock down or prohibit BYOD) is different from a fast food restaurant chain that doesn’t write custom applications. Choice is alive and well within the information security market!
We’ve just scratched the surface on the insights we’ve gleaned from our initial survey. We’ve partnered with Wisegate, a next-generation IT advisory company, to create a streamlined version of this security questionnaire and open it up to a broader audience. For more information on Wisegate and to see a list of their community-based public reports, please visit here.
Bill shared some of these early insights last month at the Rocky Mountain Information Security Conference. We’ll be sharing more of the results this quarter as the research project wraps up, including the security market segments that we find most interesting. Stay tuned!
A version of this article previously ran in Entrepreneur.
How do you know when you’re about to get to startup heaven, scaling up fast and reliably? A company’s growth curve can head in a variety of directions, not just the lucky, up-to-the-right hockey stick of explosive growth.
At ScaleVP, it is our business to figure out exactly when a company can use our money to grow to the heavens, and when it’s best to keep the burn low. The checklist below is based on insights we’ve drawn from our due diligence over more than a decade of helping to scale successful startups.
Do people want to buy what you have to sell? In Silicon Valley speak, that translates into having achieved “product-market fit.” At this point, you’ve tweaked your product (service, solution) and have a standard version, and maybe a few options that are appropriate for the bulk of your market. Signs that you haven’t yet achieved this state include products that are customized for each customer, need a lot of customer support, have negative margins, get poor customer reviews or do not get used after purchase/installation.
Do not confuse having a “minimum viable product” (MVP) with being ready to scale. Eric Ries of Lean Startup fame coined that phrase. An MVP is used to test-market your product before it is fully built, so you know what to build without wasting precious resources on the wrong features. You need a real product to scale, not a mock-up.
These three P’s are related to your product, and describe the bundle of attributes that you are taking to market. You better have these nailed down before scaling, too.
You have to figure out a cost-effective way to get your product to customers. Much of the money spent by startups goes towards sales and marketing (see the second chart here). Your people are relatively expensive, the programs cost real money, and oh, by the way, both people and programs interface with your prospects, so if they’re not right, you risk doing more harm than good. Scared now? Good! Do not spend a lot here before you have figured out what works.
Your “sales channel” is the path to a successful, cost-effective sale. It includes the methodology (telesales, direct sales, retail/distribution, value-added resellers, e-commerce, etc.) and the associated processes (telesales scripts, salespeople’s backgrounds, reseller training materials, etc.) that lead your customer to say “yes.” You don’t need every component to be polished and optimized to begin scaling, but you should have a clear idea what works, what doesn’t, and most importantly, what you would do more of if you had additional money.
ScaleVP often sees a pattern with companies on the verge of scaling: they move from the entrepreneurs themselves making the initial sales to adding a couple of salespeople, then a manager with another small handful of salespeople. The point where you have built a small, competent team that executes flawlessly is exactly when you know you have the recipe for success and are ready to scale.
Sales isn’t the only department that needs to prepare for scaling. As the entrepreneur and leader, you need to be ready to scale, too. This usually means you have hired top talent to support you in key functions, and the team as a whole has the capacity and clear direction to take on the challenges of extra business. Your company needs to be able to service existing customers consistently while simultaneously attracting new customers, and keep all of them happy.
Engineering needs enough leadership to keep driving innovation, while the business team is closing more deals. And if you’re making a real-world product, factories, shipping departments, customer service lines and finance departments all have to keep growing efficiently as the business expands. With modern software-as-a-service solutions, many of these functions can be automated inexpensively.
If part of your organization is fragile or broken, you can be in startup hell — frazzled, inefficient and stalled. You either won’t attract needed startup capital, or if you get it, you won’t spend it effectively. So ask yourself, “Are we all ready to scale?” using the checklist above. If you are, we could be seeing you soon.