
Writer deploys home-cooked large language models to power up enterprise copy

There’s a lot of noise right now about how generative AIs like ChatGPT and Bard are going to revolutionize various aspects of the web, but companies targeting narrower verticals are already finding success. Writer is one such company, and it just announced a new trio of large language models to power its enterprise copy assistant.

The company lets customers fine-tune these models on their own content and style guides, from which point forward the AI can write, help write, or edit copy so that it meets internal standards. More than just catching typos and recommending the preferred word, Writer’s new models can evaluate style and write content themselves, even doing a bit of fact-checking when they’re done.

But the real draw is that the whole thing can be done internally, from fine-tuning to hosting, at least when it comes to the smaller two of the Palmyra series of models.

“No enterprise leader wants their data to be fodder for someone else’s foundation model, including ours,” said CEO May Habib in a press release. “We give customers all the benefits of the AI application layer without any of the risks of other AI applications and commercial models. Enterprise leaders want to invest in solutions that will essentially give them their own LLM.”

Palmyra comes in three sizes: 128 million, 5 billion, and 20 billion parameters respectively for Small, Base, and Large. They’re trained on business and marketing writing, not Reddit posts and Project Gutenberg, so there are fewer surprises to begin with. Then you load up its maw with the last ten years of annual reports, financials, blog posts, and so on to make it yours. (This and any derived data do not filter back to Writer, to be clear.)

Having written my share of enterprise and marketing copy, I can say this isn’t the most exciting of applications. But what it lacks in thrills it makes up for in practicality: companies need to do lots of this kind of writing and editing, and tend to actually pay for it. Writer already hooks into lots of development and productivity suites, so there’s not much friction added.

Mockup of Writer generating a product description.

The business model is similar to other generative AI companies: you get all set up and fine-tuned for free, then pay a penny per thousand tokens, which gets you about 750 words. (This article is just over 500, as a quick reference.)
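The stated pricing works out to a fraction of a cent for a typical article. Here’s a quick back-of-envelope sketch using only the figures above (the tokens-to-words ratio is the article’s approximation, not an official conversion):

```python
# Back-of-envelope cost estimate for Writer's stated pricing:
# one cent per 1,000 tokens, where ~1,000 tokens covers roughly 750 words.
# The words-per-token ratio is an approximation from the article.

PRICE_PER_1K_TOKENS = 0.01  # USD
WORDS_PER_1K_TOKENS = 750   # approximate

def estimated_cost(words: int) -> float:
    """Estimate the USD cost of generating `words` words of copy."""
    tokens = words * 1000 / WORDS_PER_1K_TOKENS
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# A ~500-word article like this one comes in well under a cent:
print(round(estimated_cost(500), 4))  # ≈ 0.0067
```

In other words, per-article generation cost is negligible; the value proposition rests on the fine-tuning and style enforcement, not raw text volume.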

Alternatively, you can self-host the Small or Base models free of charge if you have the compute.

A few dozen companies have been using the models since late last year, and we haven’t heard about any egregious problems like we did on day one of Microsoft and Google’s attempts at popularizing generative AI… so that’s a good sign. This is the success of which I spoke earlier. While ChatGPT is certainly impressive, as a generalist or dilettante AI it’s hard to say what it’s actually capable of being used for. The next year or two will see more targeted plays like Writer’s while Microsoft and Google kick the tires on their latest toy.

Writer deploys home-cooked large language models to power up enterprise copy by Devin Coldewey originally published on TechCrunch



source https://techcrunch.com/2023/02/13/writer-deploys-home-cooked-large-language-models-to-power-up-enterprise-copy/
