Wednesday August 21, 2019
 

Retail technology platform Relex raises $200M from TCV

Amazon’s formidable presence in the world of retail stems partly from the fact that it’s not just a commerce giant, it’s also a tech company — building solutions and platforms in-house that make its processes, from figuring out what to sell, to how much to have on hand, to how best to distribute it, more efficient and smarter than those of its competition. Now, one of the startups building retail technology to help those that are not Amazon compete better with it has raised a significant round of funding to meet that challenge.

Relex — a company out of Finland that focuses on retail planning solutions by helping both brick-and-mortar as well as e-commerce companies make better forecasts of how products will sell using AI and machine learning, and in turn giving those retailers guidance on how and what should be stocked for purchasing — is today announcing that it has raised $200 million from TCV. The VC giant — which has backed iconic companies like Facebook, Airbnb, Netflix, Spotify and Splunk — last week announced a new $3 billion fund, and this is the first investment out of it that is being made public.

Relex is not disclosing its valuation, but from what I understand it’s a minority stake, which would put it at between $400 million and $500 million. The company has been around for a few years but has largely been very capital-efficient, raising only between $20 million and $30 million before this from Summit Partners, with much of that sum still in the bank.

That lack of song and dance around VC funding also helped keep the company relatively under the radar, even while it has quietly grown to work with customers like supermarkets Albertsons in the U.S., Morrisons in the U.K. and a host of others. Business today is mostly in North America and Europe, with the U.S. growing the fastest, CEO Mikko Kärkkäinen — who co-founded the company with Johanna Småros and Michael Falck — said in an interview.

While the company has already been growing at a steady clip — Kärkkäinen said sales have been expanding by 50 percent each year for a while now — the plan now will be to accelerate that.

Relex competes with management systems from SAP, JDA and Oracle, but Kärkkäinen said that these are largely “legacy” solutions, in that they do not take advantage of advances in areas like machine learning and cloud computing — both of which form the core of what Relex uses — to crunch more data more intelligently.

“Most retailers are not tech companies, and Relex is a clear leader among a lot of legacy players,” said TCV general partner John Doran, who led the deal.

Significantly, that’s an approach that the elephant in the room pioneered and has used to great effect, becoming one of the biggest companies in the world.

“Amazon has driven quite a lot of change in the industry,” Kärkkäinen said (he’s very typically Finnish and understated). “But we like to see ourselves as an antidote to Amazon.”

Brick-and-mortar stores are an obvious target for a company like Relex, given that shelf space and real estate are costs that these kinds of retailers have to grapple with more than online sellers. But in fact Kärkkäinen said that e-commerce companies (given that’s also where Amazon primarily operates) have been an equal target and customer base. “For these, we might be the only solution they have purchased that has not been developed in-house.”

The funding will be used in two ways. First, to give the company’s sales a boost, especially in the U.S., where business is growing the fastest at the moment. And second, to develop more services on its current platform.

For example, the focus up to now has been on demand forecasting, Kärkkäinen said, and how that affects prices and supply, but the company would like to expand its coverage to labor optimisation alongside that; in other words, how best to staff a business according to forecasts and demand.
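Relex hasn’t published how its models work, but the basic idea of chaining a demand forecast into a staffing estimate can be sketched in a few lines. The moving-average forecast and the units-per-worker figure below are purely illustrative assumptions, not Relex’s actual approach:

```python
from statistics import mean

def forecast_demand(history, window=3):
    """Naive moving-average forecast of next-period unit sales."""
    return mean(history[-window:])

def staff_needed(forecast_units, units_per_worker=50):
    """Translate forecast demand into a staffing level (ceiling division)."""
    return -(-int(forecast_units) // units_per_worker)

weekly_sales = [480, 520, 510, 495, 530, 540]
predicted = forecast_demand(weekly_sales)   # average of the last three weeks
workers = staff_needed(predicted)           # people required to cover that demand
```

A production system would swap the moving average for a learned model, but the shape of the pipeline — forecast first, then derive downstream decisions like staffing from it — stays the same.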

Of course, while Amazon is the big competition for all retailers, they potentially also exist as a partner. The company regularly productizes its own in-house services, and it will be interesting to see how and if that translates to Amazon emerging as a competitor to Relex down the line.


vArmour, a security startup focused on multi-cloud deployments, raises $44M

As more organizations move to cloud-based IT architectures, a startup that’s helping them secure that data in an efficient way has raised some capital. vArmour, which provides a platform to help manage security policies across disparate public and private cloud environments in one place, is announcing today that it has raised a growth round of $44 million.

The funding is being led by two VCs that specialise in investments into security startups, AllegisCyber and NightDragon.

CEO Tim Eades said that also participating are “two large software companies” investing as strategics that vArmour works with on a regular basis, but which asked not to be named. (Candidates might include some of the big security vendors in the market, as well as the big cloud services providers.) This Series E brings the total raised by vArmour to $127 million.

When asked, Eades said the company would not be disclosing its valuation. That lack of transparency is not uncommon among startups, but perhaps especially should be expected at a business that operated in stealth for the first several years of its life.

According to PitchBook, vArmour was valued at $420 million when it last raised money, a $41 million round in 2016. That would put the startup’s valuation at $464 million with this round, if everything is growing at a steady pace, or possibly more if investors are keen to tap into what appears to be a growing need.

That growing need might be summarised like this: We’re seeing a huge migration of IT to cloud-based services, with public cloud services set to grow 17.3 percent in 2019. A large part of those deployments — for companies typically larger than 1,000 people — are spread across multiple private and public clouds.

This, in turn, has opened a new front in the battle to secure data amid the rising threat of cybercrime. “We believe that hybrid cloud security is a market valued somewhere between $6 billion and $8 billion at the moment,” said Eades. Cybercrime has been estimated by McAfee to cost businesses $600 billion annually worldwide. Accenture is even more bullish on the impact; it puts the impact on companies at $5.2 trillion over the next five years.

The challenge for many organizations is that they store information and apps across multiple locations — between seven and eight data centers on average for, say, a typical bank, Eades said. And while that may help them hedge bets, save money and reach some efficiencies, that lack of cohesion also opens the door to security loopholes.

“Organizations are deploying multiple clouds for business agility and reduced cost, but the rapid adoption is making it a nightmare for security and IT pros to provide consistent security controls across cloud platforms,” said Bob Ackerman, founder and managing director at AllegisCyber, in a statement. “vArmour is already servicing this need with hundreds of customers, and we’re excited to help vArmour grow to the next stage of development.”

vArmour hasn’t developed a security service per se, but it is among the companies — Cisco and others are also competing with it — that are providing a platform to help manage security policies across these disparate locations. That could either mean working on knitting together different security services as delivered in distinct clouds, or taking a single security service and making sure it works the same policies across disparate locations, or a combination of both of those.

In other words, vArmour takes something that is somewhat messy — disparate security policies covering disparate containers and apps — and helps to handle it in a more cohesive and neat way by providing a single way to manage and provision compliance and policies across all of them.

This not only helps to manage the data but potentially can help halt a breach by letting an organization put a stop in place across multiple environments.
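vArmour hasn’t published its API here, but the “define a policy once, enforce it everywhere” pattern described above can be sketched briefly; every class, method and provider name below is an illustrative assumption, not vArmour’s actual interface:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    name: str
    allow_ports: tuple  # everything else is implicitly denied

class CloudBackend:
    """Hypothetical adapter; each real environment has its own policy API."""
    def __init__(self, provider):
        self.provider = provider
        self.applied = []

    def apply(self, policy):
        self.applied.append(policy.name)
        return f"{self.provider}: applied {policy.name}"

def enforce_everywhere(policy, backends):
    """One policy definition, pushed to every environment at once."""
    return [b.apply(policy) for b in backends]

clouds = [CloudBackend("aws"), CloudBackend("azure"), CloudBackend("on-prem")]
results = enforce_everywhere(Policy("deny-all-but-443", (443,)), clouds)
```

The value of the adapter layer is exactly the breach scenario above: one call can push a lockdown policy to every environment at once, instead of logging into each cloud’s console separately.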

“From my experience, this is an important solution for the cloud security space,” said Dave DeWalt, founder of NightDragon, in a statement. “With security teams now having to manage a multitude of cloud estates and inundated with regulatory mandates, they need a simple solution that’s capable of continuous compliance. We haven’t seen anyone else do this as well as vArmour.”

Eades said that one big change for his company in the last couple of years has been that, as cloud services have grown in popularity, vArmour has been putting in place a self-service version of the main product, the vArmour Application Controller, to better target smaller organizations. It’s also been leaning heavily on channel partners (Telstra, which led its previous round, is one strategic partner of this kind) to help with the heavy lifting of sales.

vArmour isn’t disclosing revenues or how many customers it has at the moment, but Eades said that it’s been growing at 100 percent each year for the last two and has “way more than 100 customers,” ranging from hospitals and churches through to “8-10 of the largest service providers and over 25 financial institutions.”

At this rate, he said the plan will be to take the company public in the next couple of years.


Gong.io nabs $40M investment to enhance CRM with voice recognition

With traditional CRM tools, sales people add basic details about the companies to the database, then a few notes about their interactions. AI has helped automate some of that, but Gong.io wants to take it even further using voice recognition to capture every word of every interaction. Today, it got a $40 million Series B investment.

The round was led by Battery Ventures, with existing investors Norwest Venture Partners, Shlomo Kramer, Wing Venture Capital, NextWorld Capital and Cisco Investments also participating. Battery general partner Dharmesh Thakker will join the startup’s board under the terms of the deal. Today’s investment brings the total raised so far to $68 million, according to the company.

Indeed, $40 million is a hefty Series B, but investors see a tool that has the potential to have a material impact on sales, or at least give management a deeper understanding of why a deal succeeded or failed using artificial intelligence, specifically natural language processing.

Company co-founder and CEO Amit Bendov says the solution starts by monitoring all customer-facing conversations and giving feedback in a fully automated fashion. “Our solution uses AI to extract important bits out of the conversation to provide insights to customer-facing people about how they can get better at what they do, while providing insights to management about how staff is performing,” he explained. It takes this one step further by offering strategic input, like how your competitors are trending or how customers are responding to your products.
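Gong’s models are far more sophisticated than this, but the basic shape of mining a call transcript for competitor mentions and deal-risk signals can be illustrated with a keyword sketch; the watch lists here are hypothetical:

```python
import re

COMPETITORS = {"acme", "globex"}  # hypothetical competitor watch list
RISK_PHRASES = ("too expensive", "not sure", "need approval")

def extract_insights(transcript):
    """Flag competitor mentions and deal-risk phrases in a call transcript."""
    lowered = transcript.lower()
    words = set(re.findall(r"[a-z']+", lowered))
    return {
        "competitors_mentioned": sorted(COMPETITORS & words),
        "risk_flags": [p for p in RISK_PHRASES if p in lowered],
    }

call = "The Acme quote was lower, and honestly this feels too expensive for Q3."
insights = extract_insights(call)
```

A real system replaces the keyword lists with speech-to-text plus NLP models, but the output is the same kind of structured signal that then rolls up into the management dashboards Bendov describes.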

Screenshot: Gong.io

Bendov says he started the company because of an experience he had at previous startups: he wanted to know more about why he lost a sale, but there was no insight to be found in the data in the CRM database. “CRM could tell you what customers you have, how many sales you’re making, who is achieving quota or not, but never give me the information to rationalize and improve operations,” he said.

The company currently has 350 customers, a number that has more than tripled since the end of 2017 when it had 100. He says it’s not only that it’s adding new customers, existing ones are expanding, and he says that there is almost zero churn.

Today, Gong has 120 employees, with headquarters in San Francisco and a 55-person R&D team in Israel. Bendov expects the number of employees to double over the next year with the new influx of money to keep up with the customer growth.


Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two services are moving into general availability: the second generation of Azure Data Lake Storage for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: the ability to map data flows.

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
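That “ingest anything, structure later” approach is often called schema-on-read. A toy sketch of the idea, with an in-memory `lake` dict standing in for the actual storage service and all paths and payloads invented for illustration:

```python
import csv, io, json

# Schema-on-read: land everything as raw bytes, apply structure only at query time.
lake = {
    "sales/2019-02-06.csv": b"sku,qty\nA1,3\nB2,5\n",
    "logs/app.json": b'{"level": "error", "msg": "timeout"}',
}

def read_csv(path):
    """Impose tabular structure on a raw object on demand."""
    return list(csv.DictReader(io.StringIO(lake[path].decode())))

def read_json(path):
    """Parse a semi-structured object on demand."""
    return json.loads(lake[path])

rows = read_csv("sales/2019-02-06.csv")
event = read_json("logs/app.json")
```

The contrast with the “rigid structures” White mentions is that nothing about the data’s shape was decided at ingest time; each reader applies its own schema when the query runs.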

Likewise, White argued that while many enterprises used to keep these services on their own on-premises servers, many of those deployments are still appliance-based. She believes the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises saw analytics as $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.


Carbonite to acquire endpoint security company Webroot for $618.5M

Carbonite, the online backup and recovery company based in Boston, announced late yesterday that it will be acquiring Webroot, an endpoint security vendor, for $618.5 million in cash.

The company believes that by combining its cloud backup service with Webroot’s endpoint security tools, it will give customers a more complete solution. Webroot actually predates the cloud, having launched in 1997. The private company reported $250 million in revenue for fiscal 2018, according to data provided by Carbonite. That will combine with Carbonite’s $296.4 million in revenue for the same period.

Carbonite CEO and president Mohamad Ali saw the deal as a way to expand the Carbonite offering. “With threats like ransomware evolving daily, our customers and partners are increasingly seeking a more comprehensive solution that is both powerful and easy to use. Backup and recovery, combined with endpoint security and threat intelligence, is a differentiated solution that provides one, comprehensive data protection platform,” Ali explained in a statement.

The deal not only enhances Carbonite’s backup offering, it gives the company access to a new set of customers. While Carbonite sells mainly through Value Added Resellers (VARs), Webroot sells mainly through its 14,000 Managed Service Providers (MSPs). That lack of overlap could extend Carbonite’s market reach into the MSP channel. Webroot has 300,000 customers, according to Carbonite.

This is not the first Carbonite acquisition. It has acquired several other companies over the last several years, including buying Mozy from Dell a year ago for $145 million. The acquisition strategy is about using its checkbook to expand the capabilities of the platform to offer a more comprehensive set of tools beyond core backup and recovery.

Graphic: Carbonite

The company announced it is using cash on hand and a $550 million loan from Barclays, Citizens Bank and RBC Capital Markets to finance the deal. Per usual, the acquisition will be subject to regulatory approval, but is expected to close this quarter.


Google open sources ClusterFuzz

Google today announced that it is open sourcing ClusterFuzz, a scalable fuzzing tool that can run on clusters with more than 25,000 machines.

The company has long used the tool internally, and if you’ve paid particular attention to Google’s fuzzing efforts (and you have, right?), then this may all seem a bit familiar. That’s because Google launched the OSS-Fuzz service a couple of years ago and that service actually used ClusterFuzz. OSS-Fuzz was only available to open-source projects, though, while ClusterFuzz is now available for anyone to use.

The overall concept behind fuzzing is pretty straightforward: you basically throw lots of data (including random inputs) at your application and see how it reacts. Often, it’ll crash, but sometimes you’ll be able to find memory leaks and security flaws. Once you start anything at scale, though, it becomes more complicated and you’ll need tools like ClusterFuzz to manage that complexity.
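That straightforward concept fits in a few lines of code. The `fragile_parser` target below and its magic-byte bug are invented purely for illustration; real fuzzers like ClusterFuzz add coverage guidance, corpus management and deduplication on top of this loop:

```python
import random

def fragile_parser(data: bytes):
    """Toy target: chokes on any input that starts with the magic byte 0x42."""
    if data and data[0] == 0x42:
        raise ValueError("unexpected magic byte")
    return len(data)

def fuzz(target, iterations=20_000, seed=1):
    """Throw random byte strings at a target and collect the inputs that crash it."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 8)))
        try:
            target(blob)
        except Exception:
            crashes.append(blob)  # a saved crasher becomes a regression test later
    return crashes

found = fuzz(fragile_parser)
```

Even this naive loop eventually stumbles onto the bad byte pattern; the hard part at scale is triaging thousands of such crashers across thousands of machines, which is exactly what ClusterFuzz manages.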

ClusterFuzz automates the fuzzing process all the way from bug detection to reporting — and then retesting the fix. The tool itself also uses open-source libraries like the libFuzzer fuzzing engine and the AFL fuzzer to power some of the core fuzzing features that generate the test cases for the tool.

Google says it has used the tool to find more than 16,000 bugs in Chrome and 11,000 bugs in more than 160 open-source projects that used OSS-Fuzz. Since so much of the software testing and deployment toolchain is now generally automated, it’s no surprise that fuzzing is also becoming a hot topic these days (I’ve seen references to “continuous fuzzing” pop up quite a bit recently).


Someone could scoop up Slack before it IPOs

Earlier this week, Slack announced that it has filed the paperwork to go public at some point later this year. The big question is, will the company exit into the public markets as expected, or will one of the technology giants swoop in at the last minute with buckets of cash and take them off the market?

Slack, which raised more than $1 billion on an other-worldly $7 billion valuation, is an interesting property. It has managed to grow and be successful while competing with some of the world’s largest tech companies — Microsoft, Cisco, Facebook, Google and Salesforce. Not coincidentally, these deep-pocketed companies could be the ones that come knock, knock, knocking at Slack’s door.

Slack has managed to hold its own against these giants by doing something in this space that hadn’t been done effectively before. It made it easy to plug in other services, effectively making Slack a work hub where you could spend your day because your work could get pushed to you there from other enterprise apps.
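The plumbing behind that hub model is simple: external apps push messages into a channel, for instance via Slack’s incoming webhooks, which accept a JSON body with a `text` field. A dry-run sketch, where the webhook URL is a placeholder and the app/event names are invented:

```python
import json
from urllib import request

WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder, not a real hook

def build_notification(app, event):
    """Incoming webhooks accept a simple JSON body with a `text` field."""
    return {"text": f"[{app}] {event}"}

def push_to_slack(app, event, url=WEBHOOK_URL, dry_run=True):
    payload = json.dumps(build_notification(app, event)).encode()
    if dry_run:  # no real workspace to post to in this sketch
        return payload
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req).status  # Slack returns 200 on success

body = push_to_slack("JIRA", "Ticket OPS-101 moved to Done")
```

The low barrier of that integration surface — one HTTP POST — is a big part of why so many enterprise apps learned to push work into Slack, and why the hub stuck.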

The great enterprise chat race

As I’ve discussed before, this centralized hub has been a dream of communications tools for most of the 21st century. It began with enterprise IM tools in the early 2000s, and progressed to Enterprise 2.0 tools in the 2007 time frame. That period culminated in 2012 when Microsoft bought Yammer for $1.2 billion, the only billion-dollar exit for that generation of tools.

I remember hearing complaints about Enterprise 2.0 tools. While they had utility, in many ways they were just one more thing employees had to check for information beyond email. The talk was these tools would replace email, but a decade later email’s still standing and that generation of tools has been absorbed.

In 2013, Slack came along, perhaps sensing that Enterprise 2.0 never really got mobile and the cloud, and it recreated the notion in a more modern guise. By taking all of that a step further and making the tool a kind of workplace hub, it has been tremendously successful, growing to 8 million daily users in roughly 4 years, around 3 million of which were the paying variety, at last count.

Slack’s growth numbers as of May 2018

All of this leads us back to the exit question. While the company has obviously filed its IPO paperwork, that might not be the way it ultimately exits. Just the other day, CNBC’s Jay Yarrow posited this question on Twitter:

Not sure where he pulled that number from, but if you figure 3x valuation, that could be the value for a company of this ilk. There would be symmetry in Microsoft buying Slack six years after it plucked Yammer off the market, and it would remove a major competitive piece from the board, while allowing Microsoft access to Slack’s growing customer base.

Nobody can see into the future, and maybe Slack does IPO and takes its turn as a public company, but it surely wouldn’t be a surprise if someone came along with an offer it couldn’t refuse, whatever that figure might be.

Can Slack transform enterprise communication once and for all?


Big companies are not becoming data-driven fast enough

I remember watching MIT professor Andrew McAfee years ago telling stories about the importance of data over gut feeling, whether it was predicting successful wines or making sound business decisions. We have been hearing about big data and data-driven decision making for so long, you would think it has become hardened into our largest organizations by now. As it turns out, new research by NewVantage Partners finds that most large companies are having problems implementing an organization-wide, data-driven strategy.

McAfee was fond of saying that before the data deluge we have today, the way most large organizations made decisions was via the HiPPO — the highest paid person’s opinion. Then he would chide the audience that this was not the proper way to run your business. Data, not gut feelings, even those based on experience, should drive important organizational decisions.

While companies haven’t failed to recognize McAfee’s advice, the NVP report suggests they are having problems implementing data-driven decision making across organizations. There are plenty of technological solutions out there today to help them, from startups all the way to the largest enterprise vendors, but the data (see, you always need to go back to the data) suggests that it’s not a technology problem, it’s a people problem.

Executives can have farsighted vision that their organizations need to be data-driven. They can acquire all of the latest solutions to bring data to the forefront, but unless they combine that with a broad cultural shift and a deep understanding of how to use that data inside business processes, they will continue to struggle.

The study’s authors, Randy Bean and Thomas H. Davenport, wrote about the people problem in their study’s executive summary. “We hear little about initiatives devoted to changing human attitudes and behaviors around data. Unless the focus shifts to these types of activities, we are likely to see the same problem areas in the future that we’ve observed year after year in this survey.”

The survey found that 72 percent of respondents have failed in this regard, reporting they haven’t been able to create a data-driven culture, whatever that means to individual respondents. Meanwhile, 69 percent reported they had failed to create a data-driven organization, although it would seem that these two metrics would be closely aligned.

Perhaps most discouraging of all is that the data is trending the wrong way. Over the last several years, the report’s authors say, the share of organizations calling themselves data-driven has actually dropped each year, from 37.1 percent in 2017 to 32.4 percent in 2018 to 31.0 percent in the latest survey.

This matters on so many levels, but consider that as companies shift to artificial intelligence and machine learning, these technologies rely on abundant amounts of data to work effectively. What’s more, every organization, regardless of its size, is generating vast amounts of data, simply as part of being a digital business in the 21st century. They need to find a way to control this data to make better decisions and understand their customers better. It’s essential.

There is so much talk about innovation and disruption, and understanding and affecting company culture, but so much of all this is linked. You need to be more agile. You need to be more digital. You need to be transformational. You need to be all of these things — and data is at the center of all of it.

Data has been called the new oil often enough to be cliché, but these results reveal that the lesson is failing to get through. Companies need to be data-driven now, this instant. This isn’t something to be working toward at this point. This is something you need to be doing, unless your ultimate goal is to become irrelevant.

Digital Transformation Requires Total Organizational Commitment


Google doubles down on its Asylo confidential computing framework

Last May, Google introduced Asylo, an open source framework for confidential computing, a technique favored by many of the big cloud vendors because it allows you to set up trusted execution environments that are shielded from the rest of the (potentially untrusted) system. Workloads and their data basically sit in a trusted enclave that adds another layer of protection against network and operating system vulnerabilities.

That’s not a new concept, but as Google argues, it has been hard to adopt. “Despite this promise, the adoption of this emerging technology has been hampered by dependence on specific hardware, complexity and the lack of an application development tool to run in confidential computing environments,” Google Cloud Engineering Director Jason Garms and Senior Product Manager Nelly Porter write in a blog post today. The promise of the Asylo framework, as you can probably guess, is to make confidential computing easy.

Asylo makes it easier to build applications that can run in these enclaves and can use various software- and hardware-based security back ends like Intel’s SGX and others. Once an app has been ported to support Asylo, you should also be able to take that code with you and run in on any other Asylo-supported enclave.

Right now, though, many of these technologies and practices around confidential computing remain in flux. Google notes that there are no set design patterns for building applications that use the Asylo API and run in these enclaves, for example. The different hardware manufacturers also don’t necessarily work together to ensure their technologies are interoperable.

“Together with the industry, we can work toward more transparent and interoperable services to support confidential computing apps, for example, making it easy to understand and verify attestation claims, inter-enclave communication protocols, and federated identity systems across enclaves,” write Garms and Porter.

And to do that, Google is launching its Confidential Computing Challenge (C3) today. The idea here is to have developers create novel use cases for confidential computing — or to advance the current state of the technologies. If you do that and win, you’ll get $15,000 in cash, $5,000 in Google Cloud Platform credits and an undisclosed hardware gift (a Pixelbook or Pixel phone, if I had to guess).

In addition, Google now also offers developers three hands-on labs that teach how to build apps using Asylo’s tools. Those are free for the first month if you use the code in Google’s blog post.


Google’s still not sharing cloud revenue

Google has shared its cloud revenue exactly once over the last several years. Silence tends to lead to speculation to fill the information vacuum. Luckily, there are some analyst firms who try to fill the void, and it looks like Google’s cloud business is actually trending in the right direction, even if the company isn’t willing to tell us an exact number.

When Google last reported its cloud revenue, around this time last year, it indicated it had earned $1 billion in revenue for the quarter, which included Google Cloud Platform and G Suite combined. Diane Greene, who was head of Google Cloud at the time, called it an “elite business,” but in reality it was pretty small potatoes compared with Microsoft’s and Amazon’s cloud numbers, which were pulling in $4 billion-$5 billion a quarter between them at the time. Google was looking at a $4 billion run rate for the entire year.

Google apparently didn’t like the reaction it got from that disclosure so it stopped talking about cloud revenue. Yesterday when Google’s parent company, Alphabet, issued its quarterly earnings report, to nobody’s surprise, it failed to report cloud revenue yet again, at least not directly.

Google’s Diane Greene says billion-dollar cloud revenue already puts them in elite company

Google CEO Sundar Pichai gave some hints, but never revealed an exact number. Instead he talked in vague terms calling Google Cloud “a fast-growing multibillion-dollar business.” The only time he came close to talking about actual revenue was when he said, “Last year, we more than doubled both the number of Google Cloud Platform deals over $1 million as well as the number of multiyear contracts signed. We also ended the year with another milestone, passing 5 million paying customers for our cloud collaboration and productivity solution, G Suite.”

OK, it’s not an actual dollar figure, but it’s a sense that the company is actually moving the needle in the cloud business. A bit later in the call, CFO Ruth Porat threw in this cloud revenue nugget. “We are also seeing a really nice uptick in the number of deals that are greater than $100 million and really pleased with the success and penetration there. At this point, not updating further.” She is not updating further. Got it.

Former Oracle exec Thomas Kurian to replace Diane Greene as head of Google Cloud

That brings us to a company that guessed for us, Canalys. While the firm didn’t share its methodology, it did come up with a figure of $2.2 billion for the quarter. Given that the company is closing larger deals and was at a billion last year, this figure feels like it’s probably in the right ballpark, but of course it’s not from the horse’s mouth, so we can’t know for certain. It’s worth noting that Canalys told TechCrunch that this is for GCP revenue only, and does not include G Suite, so that would suggest that it could be gaining some momentum.

Frankly, I’m a little baffled that Alphabet’s shareholders let the company get away with this complete lack of transparency. You would think people would want to know exactly what it is making on that crucial part of the business, wouldn’t you? As a cloud market watcher, I know I would, and if the company is truly beginning to pick up steam, as the Canalys data suggests, the lack of openness is even more surprising. Maybe next quarter.

Google in the cloud
