
Honeycomb.io raises $11.4M to help developers observe and debug their apps

As companies continue to expand the number of cloud-based tools and apps they use to run their businesses, DevOps continues to grow as a field of IT to help developers meet those demands. In one of the more recent moves, Honeycomb.io, whose product lets developers observe code running in live apps, microservices and other processes to pinpoint where something is failing, is today announcing that it has raised an $11.4 million Series A to expand its sales and support efforts for existing customers.

The funding is being led by Scale Venture Partners, with Storm Ventures, eVentures, NextWorld Capital, and Merian Ventures also participating. Honeycomb has now raised $26.9 million.

Paul Graham, the co-founder of Y Combinator, once famously described how a startup (Stripe) grew in part by building a tool (in payments) that other startups found useful and needed. Honeycomb itself is an embodiment of that model: its story is one of engineers building tools that engineers need. Charity Majors and Christine Yen came to Facebook by way of Parse, where they were both engineers, and in that much larger environment they found that the mix of apps and services, both those built in-house and those interacting with Facebook’s platform, created a minefield when it came to getting everything to work together.

“Things were just going down, or [even worse] looked like they were going down, all the time,” Majors said, noting that one of the big issues was that “you couldn’t look at things at a finer level” to figure out what was going wrong, and to identify issues behind why things were not working.

“Testing platforms can only cover the things you predict in advance, things you know might go wrong,” Yen noted. “Observability is about capturing what is going wrong,” a critical piece of data that will subsequently help an engineer figure out how to best fix it, rather than spending time trying to identify where the actual problem is.
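Yen’s distinction can be illustrated with a minimal sketch (this is not Honeycomb’s actual API; the field names and helper are hypothetical): rather than a fixed pass/fail check written in advance, each unit of work emits one wide, structured event carrying arbitrary context, which can then be sliced by any field after the fact to locate what went wrong.

```python
import json
import time
import uuid

def capture_event(service, endpoint, **fields):
    """Build one wide, structured event describing a single unit of work.

    Rich per-request context (user id, timings, upstream errors) is what
    lets an engineer later query by any field to find *where* something
    broke, instead of relying only on predefined checks.
    """
    event = {
        "timestamp": time.time(),
        "trace_id": uuid.uuid4().hex,  # correlates related events
        "service": service,
        "endpoint": endpoint,
    }
    event.update(fields)  # arbitrary, high-cardinality context
    return json.dumps(event)

# One event per request, carrying whatever context might matter later.
line = capture_event(
    "checkout",
    "/cart/pay",
    user_id="u-4821",
    duration_ms=137,
    status_code=502,
    upstream="payments-gateway",
)
```

The design choice here is to record everything that might be relevant at the time the event happens, because which field turns out to matter is only known once something has gone wrong.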

With no performance monitoring product on the market able to provide insight into real-time activity and the interactions between apps, and with much of the debugging process requiring yet more code to be deployed just to search for and fix problems, Majors (who is now the CTO of Honeycomb) mapped out a way to do this by observing the environment as a whole. When she decided to leave Facebook to work further on the idea, she teamed up with Yen (now the CEO) to build Honeycomb. (The internal tool Majors built as an infrastructure engineer, she said, is still being used, and you can see more on the structure behind how Honeycomb works here.)

Honeycomb has resonated with developers at both smaller startups and very large tech companies (which prefer not to be named). The high correlation between those who trial the product and those who end up buying it speaks both to the demand for Honeycomb’s solution and to its impact on developers’ work.

The company says it has doubled ARR in the last six months, doubled its number of six-figure contracts, and is on track to triple ARR by the end of 2019.

“Honeycomb is enabling a long-overdue shift in the way developers interact with and operate the software they build,” said Ariel Tseitlin, Partner at Scale Venture Partners, who is also joining the board with this round. “As production systems become more complex and distributed, the company is taking advantage of the massive market opportunity and establishing itself as a leader in real-time observability. It’s no wonder developers say they can’t live without it after they try it.”



from TechCrunch https://ift.tt/2lYI8uk
via IFTTT
