• Our Investment in CloudHealth Technologies
    Posted by Ariel Tseitlin on January 21, 2015


    Netflix, a company that accounts for over a third of all downstream internet traffic in the US at peak, is widely regarded as a pioneer in the cloud. I had the privilege to manage the Cloud Solutions team at Netflix through 2013, looking after streaming operations and cloud tooling. I, along with others at Netflix, often spoke publicly about Netflix’s migration to the cloud, as we were one of the first to move major infrastructure to the cloud.

    Migrating from the data center into the cloud was no easy task, but the hard work really began once we were operating in the cloud. Only then did we fully appreciate the complexity of running a globally distributed, always-on end-user service on top of an elastic, software-defined infrastructure. From the start, the cloud-based service was superior to the same service running in Netflix data centers, but we quickly came to appreciate the added complexity of managing operations in the cloud.

    This led us to build tools like Asgard, Ice, Chaos Monkey, and the rest of the Simian Army, all of which are now part of NetflixOSS. Back then, Netflix was a trailblazer and early adopter of the cloud. Now, more and more “traditional” enterprises are going all-in on the cloud.

    Since leaving Netflix and joining Scale Venture Partners, I have been on the lookout for a company that encapsulates the best practices and tools we developed at Netflix for highly available and efficient cloud operations, because many companies prefer to buy instead of build. I’m thrilled to have found it in CloudHealth Technologies. Today, we announced a $12M investment in the company to help support customer acquisition and expansion of the platform.

    Cloud Adoption Lifecycle

    The typical cloud adoption lifecycle goes like this:

    • Teams move to the cloud to increase agility and velocity of feature delivery
    • At some point, typically about a year after initial cloud adoption, the organization realizes that the increased agility and velocity come at the cost of increased management complexity, with the side effect of reduced visibility into costs and cost drivers
    • As a result, a project is launched to reduce cloud costs and increase visibility into cost drivers. (For example, one use case is the desire to attribute infrastructure serving costs to each customer in order to determine per-customer margins; a small sketch of that calculation follows this list.) Because of the dynamic nature of the cloud, cost visibility and attribution is an ongoing need rather than a one-time analysis. Once costs are under control, broader cloud complexity needs to be tackled with increased visibility and automation. For visibility, the value lies in providing a single place that shows the health of the cloud infrastructure while maximizing information flow to enable decentralized decision making. For automation, management complexity is reduced by minimizing manual human workflows and allowing software to manage ongoing operations in the cloud.
    • Invariably, organizations hit a stage where cloud adoption matures beyond initial experimentation. At this stage, automation and visibility are needed to extract the true benefits of the cloud and make cloud operations manageable.
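
    As a rough illustration of the per-customer cost-attribution use case mentioned above, here is a minimal sketch. The tag names, usage split, and revenue figures are hypothetical; a real implementation would pull this data continuously from a billing feed rather than hard-code it.

```python
# Hypothetical sketch: attribute cloud spend to customers to estimate per-customer margin.
# Tag names, usage shares, and revenue figures are illustrative, not from any real system.

monthly_costs = {                      # cost per resource tag, e.g. from a billing export
    "customer:acme": 12_000.0,
    "customer:globex": 8_500.0,
    "shared:platform": 20_000.0,       # untagged/shared infrastructure
}
usage_share = {"acme": 0.6, "globex": 0.4}            # how shared cost is split
revenue = {"acme": 40_000.0, "globex": 18_000.0}

def cost_per_customer(customer: str) -> float:
    direct = monthly_costs.get(f"customer:{customer}", 0.0)
    shared = monthly_costs["shared:platform"] * usage_share[customer]
    return direct + shared

for customer in ("acme", "globex"):
    cost = cost_per_customer(customer)
    margin = 1 - cost / revenue[customer]
    print(f"{customer}: cost ${cost:,.0f}, gross margin {margin:.0%}")
```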

    When we look across early cloud adopters, we find that many built their own internal sets of management services for making cloud operations more automated, less costly, and more performant, available, and secure. Looking ahead, this need to automate and simplify cloud operations is a universal requirement for cost-effective cloud adoption and one that is very much not limited to early adopters. CloudHealth solves that need and can help any company, small or large, that is serious about maximizing the return of a cloud investment.

    What I liked about CloudHealth was that they have a holistic vision for what IT service management in the cloud should be. They deliver an easy-to-use, API-driven cloud analytics platform that addresses all aspects of traditional IT service management platforms without the heavy investment. The CloudHealth platform collects, integrates, correlates, and analyzes the massive amount of data available from all of the cloud-based platforms and services that companies use today, giving customers the context to develop business models, analyze trends, and report historically. They are laying the foundation for companies to succeed in the cloud. I look forward to working with the team on their next phase of growth.

     

  • Treasure Data allows massive torrents of data to be easily streamed into a cloud service for storage and analysis.
    Posted by Andy Vitus on January 16, 2015

    At some point during the past decade everyone forgot the hard-knock lessons from the pre-SaaS era and decided that running a highly available, on-premise big data infrastructure was a good idea. We heartily disagree. We have been looking for a cloud-based big-data-as-a-service offering for a few years. The need was clear: it should allow easy data ingest from any device, in real time, and with a simplicity that requires no more than 1/10th of an engineer to manage. A cursory glance at any ‘Big Data Market Map’ reveals that this is a very hard problem to solve. When we first met Treasure Data it was clear that the company occupies a unique position in the landscape. The Treasure Data offering has four characteristics that combine to form a very valuable service:

    1. Simplicity

    I added the Treasure Data software to one of my web applications and was streaming data into their cloud service within 5 minutes. In about the same amount of time I was able to connect from Tableau and generate visualizations of the data. Any company that empowers individuals to that extent has a strong wedge into a market. The service dramatically reduces both upfront capital expenditure and ongoing operational costs over time.
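
    For context, here is a minimal sketch of what that kind of ingestion can look like from application code, using the open-source fluent-logger Python library to emit events to a local Fluentd/td-agent process that forwards them on to the cloud service. The tag, field names, and port are illustrative assumptions, not a prescribed setup.

```python
# Minimal sketch: emit structured events from a web app into Fluentd/td-agent,
# which can then forward them to a cloud service such as Treasure Data.
# The tag ("td.web_app"), fields, and port below are illustrative only.
from fluent import sender

logger = sender.FluentSender("td.web_app", host="localhost", port=24224)

def track_pageview(user_id: str, path: str) -> None:
    ok = logger.emit("pageviews", {"user_id": user_id, "path": path})
    if not ok:
        print(logger.last_error)   # surface delivery problems instead of silently dropping

track_pageview("user-123", "/pricing")
logger.close()
```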

    2. Easy Ingest to the Cloud

    Treasure Data sponsors Fluentd, the open-source data collector that has become the standard in the industry. Streaming data from myriad devices with intermittent web connectivity is a hard problem, and Treasure Data has solved it elegantly.

    3. High Volume Data Streams

    Anytime engineers stream data at intervals approaching one second, the torrent of JSON blobs becomes staggeringly large. Handling a million incoming data records every second is a huge technical achievement. It is one that will provide a lot of value as corporations continue to connect sensors to the web, as mobile devices capture location and application usage data, and as web applications begin to capture not only system logs but also user activity logs.

    4. Real-time, SQL Access

    The team at Treasure Data has coupled a flexible schema with SQL access from all the usual business intelligence tools. Allowing easy query access into the data store is clever and will massively reduce the friction associated with adopting a big data platform.
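
    As an illustration of that SQL access, here is a rough sketch of issuing a query from Python. It assumes the td-client library’s documented interface and an example dataset, so treat the exact calls, database, and query as assumptions rather than a definitive recipe.

```python
# Sketch: run an ad-hoc SQL query against a cloud-hosted data store and iterate results.
# Assumes the td-client Python library; database, table, and API key are placeholders.
import os
import tdclient

with tdclient.Client(apikey=os.environ["TD_API_KEY"]) as td:
    job = td.query(
        "sample_datasets",
        "SELECT method, COUNT(1) AS hits FROM www_access GROUP BY method",
        type="presto",
    )
    job.wait()                      # block until the query finishes
    for row in job.result():
        print(row)                  # each row is a tuple of column values
```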

    It is far easier to describe these characteristics than it is to build them. At ScaleVP we like to invest in companies that are both well-aligned with technology trends and have a deep technical moat. It is wizardry to be able to handle large amounts of data and completely abstract away the complexity of the underlying architecture. The capital efficiency with which the Treasure Data team has accomplished this feat is rare and is a big part of our enthusiasm in partnering with them as an investor.

  • In the world of PCs, identity was important, but in the cloud it is vital
    Posted by Rory O'Driscoll on December 16, 2014

    The Lost World of the Microsoft Controlled Desktop

    There is something to be said for a benign dictatorship. For all its faults, the world of the Microsoft PC desktop was an orderly one, and for IT administrators, an easy one to manage. The occasional “blue screen of death” was a reasonable price to pay for a compute environment where services like identity, security and monitoring could be provided efficiently across all applications using Active Directory, anti-virus and myriad PC management tools. By contrast, SaaS is like a democracy: lots of potential, but messy, and to a control freak (and all great IT managers are a little bit controlling!) a little worrying.

    Building the Desktop in the Cloud

    End users are not going back to the PC desktop, so now IT has to re-create the desktop experience and the desktop management experience, but in the far more heterogeneous and disparate world of the cloud. Building the desktop in the cloud is a megatrend that will impact hundreds of tech companies. Google, Microsoft, Apple and Mozilla are duking it out for browser market share, because the browser is the new OS. At the back end, there are hundreds of cloud-based SaaS applications. In the middle is a great big mess. In the absence of a dominant vendor that comes between all clients and all apps (the role Microsoft used to fill in the LAN), there are a host of new security, management, monitoring and identity products, all selling to IT to help control the world of cloud without destroying productivity.

    Identity Matters

    In the world of PCs, identity was important, but in the cloud it is vital. With PCs, LANs, and the always hated VPN, IT could have a high degree of confidence in what applications were running and what devices they were running on. Today that is gone. Applications from SaaS providers are now purchased on credit cards and run outside of IT. End devices can be a tablet, a BYOD phone that the employee owns, or a PC. The IT “bag of tricks” for locking down devices and blocking access to applications is now completely irrelevant. What IT can still manage is identity, which is simply the list of current employees (plus vendors, customers and consultants), and what applications and what information these employees can have access to. If that list can be kept up to date via close integration with HR systems, and if that list can then be promulgated across all third-party applications in real time, then IT can use identity as the leverage point to seamlessly re-achieve control. That is the business OneLogin is in.

    Single Sign On for Market Entry

    In the new world, IT cannot make things worse for users just to make things better for IT. Users will go around them. In response, identity services and access control, which are what IT wants, have been cleverly packaged by vendors to appear to the user as Single Sign On (SSO), which is what users want. The idea is simple: if you are a typical knowledge worker today, you use an average of 13 different SaaS applications that you sign into regularly. Thirteen applications means thirteen passwords and thirteen opportunities to forget a password. With Single Sign On, you log in once, and you are automatically logged into all your applications.

    For IT, this is a chance to roll out a user win that is also a huge long-term win for IT. Once users access all their applications through SSO, IT can then use SSO as a central clearinghouse to enforce stronger authentication (not just password but two-factor authentication), to do real-time provisioning of new employees (once an employee joins a company they get instant access to all relevant applications), and to do real-time de-provisioning (once an employee leaves or is asked to leave, they instantly lose access to all applications). This is not rocket science stuff, but if the average employee has 10 SaaS applications, and staff turnover is 15% per year, then a 1,000 person company has to provision and de-provision 1,500 different accounts annually.
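
    To make that back-of-the-envelope arithmetic concrete, here is a tiny sketch; the figures are the illustrative ones from the paragraph above, not data from any particular company.

```python
# Back-of-the-envelope account churn, using the illustrative figures above.
employees = 1_000
annual_turnover = 0.15       # 15% of staff leave (and are replaced) each year
apps_per_employee = 10

leavers = hires = int(employees * annual_turnover)        # 150 each
deprovisioned = leavers * apps_per_employee               # 1,500 accounts shut off
provisioned = hires * apps_per_employee                   # 1,500 accounts created
print(f"{deprovisioned:,} accounts de-provisioned and {provisioned:,} provisioned per year")
```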

    Why OneLogin

    This is a competitive market with a few direct competitors and many incumbents and adjacent players that are starting to tell a “we manage cloud identity” story to stay relevant. In this market we have been consistently impressed with the execution of the OneLogin team since we started tracking the company two years ago. The founders, brothers Thomas (CEO) and Christian Pedersen (CTO), have the classic immigrant entrepreneur story, building the first version of the product while living on ramen noodles in an apartment in Los Angeles. The company has since moved to Silicon Valley, and Thomas has built a strong go-to-market team around him. What we particularly like is the focus on being partner-friendly, including helping other SaaS vendors implement standards like SAML and now NAPPS (for mobile) by providing free toolkits to integrate these emerging standards.

    The biggest challenge the company faces now is keeping up with the demands of an exploding marketplace. The company is hiring across sales, customer service and engineering. We are excited to be part of the team and look forward to working with Thomas, the team and the board to build the winner in the cloud-based identity market.


  • The power of referral is the next phase in digital marketing
    Posted by Stacey Bishop on December 8, 2014

    It’s no surprise that consumer and business buyers are more likely to buy something on the recommendation of a friend or peer – “earned advertising” as Nielsen calls it. I noted this rise of referral marketing in a post called The New Marketing Funnel, which examined how buyers’ behaviors have changed in the wake of the digital revolution. Some of today’s most successful startups have made referral marketing the backbone of their demand generation activities. The reach that companies can achieve with referrals is wide; 39% of marketers use referral regularly and 43% of these acquire more than 35% of new customers through referral. That is why I am proud to announce our investment in Extole.

    Optimizing the Referral Process

    Building a sophisticated referral and advocacy program in-house can be costly and complicated. The Extole platform makes it possible for marketers to go beyond paid search and SEO – to amplify the passion of their own members and loyal base to acquire new, passionate, paying customers. Existing customers are a company’s biggest champions and people rely on their peers for suggestions on products and services. With the Extole platform advocates receive a unique referral link to share with friends. When a friend purchases through the link, both the friend and advocate get rewarded.

    The appeal of Extole also comes from the fact that the marketing department, not IT, manages the platform. Extole provides the software to help companies test incentive programs, manage multiple campaigns, and easily access analytics. The platform helps to mitigate user fraud. Extole customers can even save companies money on customer service by deflecting calls for referral rewards.

    Fills a Void in Demand Gen and Customer Acquisition

    We’ve witnessed a growth in essential platforms for acquisition and demand gen over email (ExactTarget, Constant Contact), search (Kenshoo, Marin Software), and marketing automation (Marketo, HubSpot). Until now, however, there’s been no enterprise platform for referrals. Extole is delivering the key “third channel” for online marketers.

    A quick review of Extole’s customers shows that many companies had a latent need for this platform. The company has already signed deals with companies including Boden, DocuSign, Fleetmatics, Intuit, and Kraft.

    Extole is still a startup, but its management team isn’t new to this game, or to each other. Many of the senior executives worked together at a previous ScaleVP-backed company, Omniture/Offermatica, and I am thrilled to have them join the ScaleVP portfolio again. Welcome Extole!

  • Guest columnist Monica Adractas, VP, Customer Success at Box, a ScaleVP portfolio company
    Posted by Guest Columnist on December 2, 2014

    Customer retention is critical to any company’s business model. Every company spends good money attracting and acquiring customers, and this simply goes to waste if you don’t retain them over time. At minimum, every company wants to recover its initial sales and marketing expense; however, customers become more profitable over their lifetime, so the longer you retain a customer, the better the ROI and profitability of your business.

    Many companies will assign customer retention responsibility to one functional leader. After all, product should create a product, business development should do deals, marketing should drive awareness, sales should sell, and G&A functions should keep a company humming. What is left over, i.e., managing the customer experience and lifetime value of those customers, is often thrown to a support function, “Customer Success”, or marketing (when there is significant opportunity to cross-sell additional products, such as in e-commerce or financial services). But having any one function own customer retention or lifetime value is a mistake.

    A customer’s reaction to your product or service is valuable information that should guide every department. Without a customer-back mindset and incentives, functional leaders run the risk of optimizing near-term goals without doing what is in the best interest of the company’s long-term financial health…retaining customers.

    Examples of how customer retention touches each business function   

    • Marketing – For marketing groups that primarily focus on acquisition, the goal is to maximize awareness, consideration, and perhaps purchase in consumer businesses. In doing so, are you setting the right expectations for customers? You may believe your product and brand are amazing, but to a customer it is all about the best solution to the job they are trying to get done.
    • Sales – It can be particularly challenging to balance short- and long-term objectives when there are aggressive sales targets to be met. However, if you bring in customers who aren’t the right fit for your product or service, they will surely leave you sooner or later, and the financial investment made in sales and commissions is sunk. Do you understand what drives long-term customer retention? Are those drivers embedded in your pre-sales process?
    • Business Development – Deals are hard to get done. Strategic and channel partners often have their own set of objectives and sometimes this can create a misalignment of what is best for the customer. Entering into a deal, do you know how this will impact your existing customers? If the deal is to acquire new customers, will the sales channel allow for the right type of sale? Who manages the customer experience after initial acquisition? Does your partner have aligned metrics and financial incentives to drive long-term customer value?
    • Product – In my experience, product is often one of the most-aligned functions in terms of driving long-term customer value.  Generally high usage and satisfaction with a product will naturally translate into increased customer stickiness and loyalty. One potential misalignment in a world of limited resources is innovation vs. marginal improvements. Often small marginal improvements can seem trivial compared to launching (and announcing!) the next big feature or product. Take for example small improvements to the user interface of a software product – easy and “trivial” to implement from a product perspective, but potentially a big driver of user engagement.
    • Finance – Efficiency and precision matter in Finance, however at times this can be at odds with what is best for your customer. For example, does your collections process try to “win back” a customer – or focus strictly on collecting payment? Are your accounting policies flexible enough to enable true experimentation with concessions or gestures of gratitude? Do your planning goals take into account short-term investments that may be required for long-term customer ROI?

    Putting cross-functional customer retention into practice:

    CEOs I have spoken with are generally intrigued by and supportive of this “it takes a village” philosophy to maximizing customer lifetime value. After all, who doesn’t want a more successful and profitable business? Here are a few tactical approaches to driving this in your organization:

    • Make customer retention or lifetime value a company level goal. Most CEOs have a few goals they rally the company around – make customer value as important as sales growth, product engagement, profitability or employee engagement. Set targets and review progress regularly with your entire organization and shareholders. A previous post “Churn out Churn:  Five Steps to Serious” has some pointers for goal-setting.
    • Align incentives. Put your money where your mouth is and incentivize your entire top team on customer lifetime value. If a significant amount of compensation is in the form of company equity, demonstrate the impact customer lifetime value has on valuation over time.
    • Appoint a Chief Customer Officer, or another cross-functional retention leader. Let someone analyze, prioritize, and coordinate cross-functional efforts to drive customer value. I have seen many CCOs with limited effectiveness: they provide interesting customer insight but are not necessarily able to incite change or bring customer-back initiatives to the top of a functional leader’s agenda. Success will require real accountability and a mandate to drive change.
    • Give assignments. Have every functional leader identify at least one initiative their function owns that drives customer retention. Make it a core part of their strategic plan and drive accountability.

    I would love to hear your thoughts, ideas and lessons-learned on this type of an approach to driving customer lifetime value.

     

    Monica Adractas is VP, Customer Success at Box, where she leads cross-company customer retention and churn initiatives. Previously, Monica was a Principal at McKinsey & Company, where she served clients on growth, customer, and digital strategy. You can find more from her on LinkedIn.

     

  • An insider view on the news, events and happenings of re:Invent 2014
    Posted by Ariel Tseitlin on November 19, 2014

    Amazon’s AWS re:Invent conference has become the must-attend conference of the year for anyone even tangentially involved with the cloud. This is now my third year attending re:Invent and each year it continues to improve and grow, with attendance growing roughly 50% every year. The sessions from Amazon and customers were incredibly insightful and I plan to spend many hours catching up on videos of sessions I couldn’t attend. For those not able to make it, I have shared my main takeaways below.

    Offense over defense. In the past, Andy Jassy’s keynote was filled with references to other cloud providers and reactive moves like matching pricing cuts. With the exception of the EC2 Container Service coming on the heels of Google’s Container Engine, none of that happened this time around. Even though Google announced price cuts, nothing of the sort came from Amazon. In my conversations with AWS insiders, the mood is apparently shifting toward a focus on innovation rather than matching. I anticipate that price drops will continue, but it is clear those will be on Amazon’s schedule.

    It’s all about the enterprise. Amazon started by appealing to developers. They are now focusing squarely on the enterprise. While past keynotes featured early adopters like Netflix and NASA, this year’s keynote moved into the early majority, highlighting companies like Philips, Johnson & Johnson, and Intuit. Nike was one interesting company to highlight. Aside from throwing an amazing after-party, they spoke about their journey to a cloud-native microservices architecture. They went from taking months to deploy a new feature in 2011 to a few hours in 2014. A shoe company not only moving into the cloud but fully embracing it shows us what’s ahead as the rest of the non-tech world realizes the full benefits of operating in the cloud.

    AWS innovation engine shows no signs of slowing down. The number of new services Amazon announced at the conference was truly impressive, in both quantity and quality.

    Of all the announcements, Lambda was the one I found most interesting, and I think that many in the audience shared my sentiment. Lambda creates an entirely new programming model for event-driven asynchronous programs that leverages what AWS does best: abstracting away the undifferentiated heavy lifting of building infrastructure and freeing developers to focus on the more differentiated application logic. I also found it the most unexpected item on the menu of new services. The other product launches were natural incremental features to existing services or use cases. Take, for example, the EC2 Container Service: this was expected in reaction to Google’s announcement a week prior. Lambda was creative, unique, and unexpected, and it is that innovation which will help Amazon maintain the wide lead it has established in the race to cloud dominance.
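
    For readers who haven’t seen it, the model is roughly this: you upload a small function and the platform runs it whenever a matching event occurs. Below is a minimal Python sketch of a handler reacting to an object landing in an S3 bucket; the processing step is invented for illustration, and the event fields follow the S3 notification layout as I understand it rather than anything specific to this keynote.

```python
# Minimal sketch of the Lambda programming model: a function that the platform invokes
# whenever a matching event occurs (here, an object landing in an S3 bucket).
# The processing step is illustrative; the event layout follows S3 notifications.

def lambda_handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        size = record["s3"]["object"].get("size", 0)
        # Application logic goes here; provisioning, scaling, and scheduling
        # are handled by the platform rather than by the developer.
        print(f"processing s3://{bucket}/{key} ({size} bytes)")
    return {"records_processed": len(records)}
```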

    What I find most rewarding about attending conferences such as re:Invent is the opportunity to meet with other people interested in similar spaces and technologies. It was wonderful to reconnect with everyone from AWS with whom I had previously worked at Netflix and to catch up with the rest of the clouderati in attendance. The Expo, hallway, and drink conversations alone are worth the price of admission.

    Already looking forward to seeing you next year at re:Invent.

     

  • Two-thirds of all worldwide email is spam, including phishing
    Posted by Ariel Tseitlin on September 23, 2014

    Today, I am proud to announce our investment in Agari, a provider of real-time, data-driven security solutions that detect and prevent advanced email cyberthreats. Why Agari? Consider this: sensitive information of over 110 million people in the U.S. was stolen in the Target breach, and it all started with a phishing email. The lack of email authentication continues to be a source of massive vulnerability for users and companies, many decades after email’s inception.

    The foundation of the Internet that we all know and love today was formed in the 1960s and 1970s out of military research that built ARPANET, the first packet-switched network, which later adopted the TCP/IP protocol. One of the first “killer apps” built on this new infrastructure was a way for the then small group of researchers to exchange digital messages with one another. At the heart of this electronic mail subsystem was SMTP, a protocol as fundamental to email as TCP/IP is to the Internet. But because email was then intended for a small, trusted group, authentication was never built into the protocol.

    Fast-forward to the modern day and we find that the number of online users is in the billions and more than half of all email sent is spam. Phishing has become a major attack vector for the bad guys to gain an initial foothold and bypass well-intentioned security defenses. Our recent security survey of over 100 CISOs revealed that one-third of those surveyed list data breaches and malware outbreaks as a top concern for their business, both issues that involve email as the attack vector.

    The most impactful breach in the past twelve months has been the highly publicized Target data breach, in which 40 million credit cards and 70 million identities, including the name, address, email address, and phone number of Target shoppers, were stolen. How did the bad guys get in? A successful phishing email attack on Target’s HVAC vendor installed malware on the vendor’s computers that, due to improper security controls at Target, made its way into Target’s network. A phishing attack was the first point of compromise.

    Thankfully, there is a solution. New protocols like SPF (authentication) and DKIM (integrity) add the missing security to the email protocols. The last missing piece was added as recently as 2012, when Agari, along with a group of leading internet companies including Google, Yahoo, PayPal, Twitter, and Facebook, created a new standard called DMARC that finally allows email senders to instruct email receivers how to treat unauthenticated emails.
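
    As a small illustration of how DMARC fits alongside SPF and DKIM: a sending domain publishes its policy as a DNS TXT record that receiving mail servers consult when a message fails authentication. The sketch below looks one up using the dnspython library; that library choice and the example domain are my own assumptions, not anything Agari ships.

```python
# Sketch: read a domain's published DMARC policy, which tells receiving mail servers
# how to treat messages that fail SPF/DKIM authentication (none, quarantine, or reject).
# Uses the dnspython library; the domain below is illustrative.
import dns.resolver

def dmarc_policy(domain: str):
    try:
        answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None                      # domain publishes no DMARC policy
    for rdata in answers:
        record = b"".join(rdata.strings).decode()
        if record.startswith("v=DMARC1"):
            return record                # e.g. "v=DMARC1; p=reject; rua=mailto:..."
    return None

print(dmarc_policy("example.com"))
```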

    This fundamental leap forward in email architecture is what excited us about investing in Agari. However, making this transition won’t happen overnight. It’s complex and requires ongoing monitoring and management. Additionally, a new level of threat intelligence can be gathered from attempted attacks. That is exactly where Agari comes in, helping organizations protect their brands from email abuse and secure their email channel. The Agari cloud-based SaaS solution aggregates data from 2.5 billion mailboxes to help global brands eliminate email threats, protect customers and their personal data, and proactively guard brand reputation. Today, Agari analyzes over 6 billion messages per day, identifies over 2 million malicious URLs per month, and blocks over 200 million malicious emails per month.

    We’re thrilled to join the Agari team and help them scale the business.


  • The new era of machine learning is eating up software
    Posted by Alex Niehenke on September 11, 2014

    Software companies are acquiring and accumulating ever-increasing sets of data, driven by reduced costs and new database paradigms. But as my colleague Cack Wilhelm wrote earlier this week, “‘Big data,’ however large, sitting idle in a data store is not adding value to an enterprise … Data must be consumerized easily for business stakeholders in order to uncover insights and drive predictions that are specific to solving each business problem.” Software companies are moving beyond retrospective analysis and are instead looking to leverage this data to better understand and predict what is to come, often by using machine learning. We call this trend among the next generation of software companies “automation by algorithm,” and I predict that it is going to eat up software.

    The technology of using machine learning for predictive analytics is not new. However, a quick glance at Google Trends shows increasing interest in predictive analytics. It is both interesting and important to understand why this trend is growing.

    [Google Trends chart — search term: “Predictive Analytics”]

    The rise of big data (both the decreasing cost and non-relational database formats) is likely the largest catalyst. But there are other factors as well, such as the proliferation of the cloud, distributed computing, and better hardware, which have made machine learning algorithms faster and easier to run. There is also a broader organizational shift, with cutting-edge software companies being the first to adopt; smaller organizations are now following the trend as they acquire talent from larger companies. There has also been recent innovation in machine learning algorithms, with random forests (early 2000s) and deep learning (mid-2000s) both emerging in roughly the past decade.

    It is easiest to understand “automation by algorithm” by looking at examples such as our portfolio company Sailthru. It is common practice to optimize the frequency of email sends against open rates. This quickly becomes complex as you add dozens of possible variables (content of the email, time of day, subject, recipient demographic, pricing, etc.). A data scientist may be able to optimize one instance, but the ongoing complexity means that only a machine-learning algorithm can continually optimize for the best result, incorporating feedback as performance changes. Sailthru has developed such technology (and more), allowing any customer to easily integrate this functionality without requiring in-house knowledge of machine learning. There are many alternatives when seeking an email vendor, but the ability to automate the algorithm and deliver it as an application / service is what distinguishes the company from its peers.
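
    To make the “continually optimize, incorporating feedback” point concrete, here is a toy epsilon-greedy sketch over send-hour buckets. It is a generic illustration of this class of approach, not Sailthru’s actual algorithm, and every number in it is made up.

```python
# Toy epsilon-greedy optimizer for email send hour: keep exploring a little,
# but mostly send at the hour with the best observed open rate so far.
# This is a generic illustration, not any vendor's actual algorithm.
import random

hours = [8, 12, 18, 21]                         # candidate send-time buckets
sends = {h: 0 for h in hours}
opens = {h: 0 for h in hours}
EPSILON = 0.1                                   # fraction of traffic used for exploration

def choose_hour() -> int:
    untried = [h for h in hours if sends[h] == 0]
    if untried:
        return random.choice(untried)
    if random.random() < EPSILON:
        return random.choice(hours)             # explore
    return max(hours, key=lambda h: opens[h] / sends[h])   # exploit best open rate

def record_result(hour: int, opened: bool) -> None:
    sends[hour] += 1
    opens[hour] += int(opened)

# Simulated feedback loop: pretend 18:00 truly performs best.
true_rate = {8: 0.18, 12: 0.22, 18: 0.31, 21: 0.25}
for _ in range(5_000):
    h = choose_hour()
    record_result(h, random.random() < true_rate[h])

print({h: round(opens[h] / sends[h], 3) for h in hours})
```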

    Rather than broad, horizontal solutions, we believe the greatest near-term opportunity exists in specific use-case applications that leverage machine learning. This allows for the strongest selection and optimization of the machine-learning algorithm specific to the use case. Further, we have noticed that companies with the largest data sets have a clear advantage and therefore the most success. The data set can be accumulated, accessed through partners, or purchased; the source is less relevant than the volume (though source is a potential for long-term defensibility). Early adoption of automation by algorithm is evident in data-rich areas such as marketing and finance. However, we’ve also seen great use cases in human resources, sales, and industry-specific vertical software. Without intending to be exhaustive, a few areas we have found particularly interesting are:

    Predictive Pipeline Analytics

    Back in April, my colleague Stacey Bishop blogged about our growing interest in this area. We saw that several of our portfolio companies had either too many leads or too few. Those with too many are left hopeless trying to figure out which leads to focus on. Those with too few are often challenged with understanding where they should find their next lead. We’re really excited about companies using automation by algorithm to help businesses focus on the right leads and identify new leads by leveraging existing customer and pipeline data.

    Customer Retention

    Managing hundreds or thousands of customers is tough. Monitoring all the events that can lead to a churn event becomes impossible. Automation by algorithm can be used in retention software to predict customer behavior and churn events. We are particularly excited about companies that not only leverage direct sources of data such as payment schedule, engagement, and feature requests, but also external data about that customer. For example, monitoring headcount might reveal that a sudden, significant reduction in force could indicate financial hardship (and therefore a temporary easement on payment may prevent a churn event).
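
    A minimal sketch of what such churn scoring might look like, assuming scikit-learn and a handful of made-up feature names; a real system would obviously train on far richer internal and external signals than this.

```python
# Sketch: score accounts for churn risk from a few usage/billing signals.
# Feature names and data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# columns: logins_last_30d, days_since_last_payment, open_support_tickets, headcount_change_pct
X = np.array([
    [42,  5, 0,  0.10],
    [ 3, 60, 4, -0.35],
    [25, 10, 1,  0.02],
    [ 1, 90, 2, -0.50],
    [60,  2, 0,  0.20],
    [ 5, 45, 3, -0.10],
])
y = np.array([0, 1, 0, 1, 0, 1])        # 1 = churned

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

new_account = np.array([[8, 50, 2, -0.30]])
print(f"churn risk: {model.predict_proba(new_account)[0, 1]:.0%}")
```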

    Recruiting

    Reviewing thousands of resumes is inefficient and suffers from human bias. Meanwhile candidates are putting all their career data online, publishing their work online, and demonstrating qualifications in dozens of other ways across the Internet. The resulting data footprint holds a potential for strong insights into the recruiting process. We believe that there is a large opportunity for recruiting software to leverage automation by algorithm and change the way managers recruit.

    Credit Risk Monitoring

    Online fraud is exploding. Many first- and second-generation software solutions are struggling as users have become mobile and online personas have become more complicated to verify. At the same time, more behavioral, social, and other third-party data is available than ever before. We are very excited about companies that are pulling all of this data into their software and using automation by algorithm to help companies combat fraudsters.

    Note: We were very fortunate to have Xiaonan Zhao join us for the summer. Her previous experience in machine learning at Google and her enthusiasm for the area were crucial in the development of our thinking. She is currently a second-year at Harvard Business School and contemplating her post-graduation plans.

  • .
    Posted by Cack Wilhelm on September 9, 2014

    The term ‘big data’ has assumed different definitions as the industry has developed, with no universal consensus around whether it refers to the actual quantity and complexity of the data or the software and tools intended to make sense of the data – or a combination of both.

    That said, most of us can agree that the infrastructure and tools focused on tackling big data challenges have progressed steadily. I worked at Cloudera in 2010 and 2011 and, at the time, in the eyes of our customers big data was Hadoop, and HDFS + MapReduce were the main attractions. Tech-savvy Cloudera customers were just beginning to experiment with 10 and 20 node clusters, nothing like the thousand node clusters you witness today.

    Looking back over the past five years, Hadoop, NoSQL stores, and AWS have enabled very inexpensive data storage and compute, allowing companies to capture and retain more data. This data is as varied as video clips, call records, log files, machine-to-machine output, and social media streams, but taken together it is seen as an organizational asset and a competitive advantage.

    As Hadoop and NoSQL took root, other data-related trends have emerged or persisted, increasing raw data volumes and reinforcing the need for cheap data storage, including:

    • Widely accessible APIs (Programmable web)
    • Extensive public data (open government)
    • Data federation allowing data to be queried as part of a heterogeneous data ecosystem (without regard to data type, location, or functionality)
    • Data-as-a-Service (including companies such as DataSift or Clearstory Data)
    • Organizations demanding granular data, non-aggregated and non-summarized
    • Machine-generated data (anything from jet engines to Nest thermostats)

    Big Data as a General Purpose Technology

    Companies have ample internal and external data to store and they have a cost-effective way to do so: Hadoop or NoSQL paired with inexpensive commodity hardware made possible by the endurance of Moore’s law and resulting capacity improvements. To borrow an idea from The Second Machine Age, the database layer has become the “general purpose technology” and from here, companies are building complementary innovations to put the data to use, everything from Paxata for data preparation to Looker for data exploration and interaction. The next five years is going to be focused not on storing the data but on how to leverage or monetize all of this data.

    ‘Big data,’ however large, sitting idle in a data store is not adding value to an enterprise until tools exist to deliver analytics that access data in disparate locations, democratize the analysis currently performed by data scientists, and inform decisions. Data must be consumerized easily for business stakeholders in order to uncover insights and drive predictions that are specific to solving each business problem; there is a dearth of data scientists, and developers may not be aware of the underlying business questions.

    Most agree that traditional business intelligence (BI) is being overshadowed by tools that are prescriptive and predictive in addition to being descriptive or diagnostic. Ad hoc analysis and advanced visualizations are increasingly utilized alongside (or instead of) reporting and monitoring. I predict that over time simple reporting and monitoring capabilities will come ‘free’ with many software applications as a standard function, just as APIs are now free for most tools.

    Looking ahead to Automation

    With the general purpose database-level technology in place and the complementary data analysis tools maturing, the pivotal next step is closing the loop between data outputs and action. This is not action by humans but action triggered by a machine or computer. Machine learning, predictive analytics, and statistical analysis are a few emergent trends focused on making automation a reality. Just as ‘big data’ became a nebulous term, each of machine learning, statistical analysis, and predictive analytics suggests a different meaning to different constituents: for example, machine learning may relate more to algorithms or deep learning, whereas statistical analysis is associated with random forests and logistic regressions. Regardless of the nuances, each analytic approach aims to automate what was once a human-powered action made in reaction to data served up by computer analysis.

    Today much of this is black box, in that the mathematical machinations are obscured from the user. In another couple of years, white-box, closed-loop analytics powered by machine learning algorithms may well be the new general purpose technology. Context Relevant and Domino Data Labs appear to be well on their way. This trend, in turn, will enable the next crop of complementary innovations to emerge.

     


     

  • Welcome Cack Wilhelm and Rose Yuan; Congrats Ariel Tseitlin & Susan Liu
    Posted by Kate Mitchell on September 3, 2014

    In a business like venture capital, it’s all about people and how they work together as a team. That’s why I am thrilled to announce some new members of our team & two promotions!

    Cack Wilhelm has joined us as Principal, with a focus on cloud infrastructure investments. Cack was with us last summer before finishing her MBA at the University of Chicago Booth School of Business. During her summer here at ScaleVP, Cack focused on the twin themes of business processes being automated by algorithms and big data being analyzed for insight. Cack draws on her recent experience in tech sales at the Hadoop company Cloudera and, before that, at Oracle. Her technology, sales, and startup background is a perfect addition to our infrastructure team and our portfolio.

    Cack is also a dedicated runner, racing professionally for Nike for two years at the 5,000m distance and competing in track & field and cross-country while an undergraduate at Princeton. She already has visions of us competing in corporate challenges so expect to see more of us on the track!

    Next up, Rose Yuan joins us as Associate. Most recently, Rose was an analyst with J.P. Morgan Chase in technology banking, where she worked as part of the advisory team on the sale of ScaleVP portfolio company ExactTarget to Salesforce. Rose will work with us on investments across the portfolio in the SaaS, cloud, and mobile sectors.

    And while we are excited about the new members of our team, we continue to invest in our existing team and are proud when they succeed, since they represent the future of our business.

    Ariel Tseitlin has been promoted to Partner. He joined us almost a year ago from Netflix, where he was Director of Cloud Solutions. Ariel has been a great addition to the team, getting involved in all aspects of the business, from joining portfolio boards, to working with Andy Vitus on further evolving our IT sector focus, to recruiting an EIR for Scale. He has sourced a number of interesting new companies, so expect us to be announcing a new investment with Ariel as the ScaleVP lead sometime soon.

    I am also proud to announce that we have promoted Susan Liu to Senior Associate. Susan has become a key member of our investment team, working closely with our SaaS team on the sourcing and evaluation of deals. She has worked on a number of ScaleVP investments including Bill.com, Bizible, DemandBase and particularly WalkMe, which she sourced directly at an industry conference.

    Finally, congrats to EIR Bill Burns, who recently accepted the new CISO position at Informatica. Ariel introduced us to Bill, who had been the Director of Information Security at Netflix. Bill was our most recent EIR, spearheading a research project in which he examined the priorities of 100 CISOs, what innovations they’re focused on, and how they are planning to help businesses take smart risks. He hosted weekly brown-bag lunches to educate us on the inner workings of information security organizations and kept us all informed on what he was learning from the contacts he has made through his work with the ISSA CISO Forum and RSA. The results from his research were insightful and continue to shape our investment thesis around security. As is always the case, our EIRs remain part of our network. Last week, Bill joined Ariel & Cack in co-hosting a roundtable dinner to talk about security with some of our closest IT relationships. Thanks, Bill, we wish you well!