Adobe beefs up developer tools to make it easier to build apps on Experience Cloud

Adobe has had a developer program for years called Adobe.io, but today at the Adobe Developers Live virtual conference, the company announced some new tools with a fresh emphasis on helping developers build custom apps on the Adobe Experience Cloud.

Jason Woosley, VP of developer experience and commerce at Adobe, says the pandemic has forced companies to build enhanced digital experiences much more quickly than they otherwise might have, and the tools announced today are at least partly aimed at speeding up the development of better online experiences.

“Our focus is very specifically on making the experience generation business something that’s very attractive to developers and very accessible to developers, so we’re announcing a number of tools,” Woosley told TechCrunch.

The idea is to build out, over time, a more complete framework that makes it easier to create applications and connect to data sources that take advantage of Experience Cloud tooling. For starters, Project Firefly is designed to help developers build applications more quickly by providing a higher level of automation than was previously available.

“Project Firefly creates an extensibility framework that reduces the boilerplate that a developer would need to get started working with the Experience Cloud, and extends that into the customizations that we know every implementation eventually needs to differentiate the storefront experience, the website experience or whatever customer touch point as these things become increasingly digital,” he said.
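
Adobe didn’t walk through code in the announcement, but the underlying model is public: Firefly apps compose serverless actions on Adobe I/O Runtime, each of which exports a `main` function that receives invocation parameters and returns a response. A minimal sketch in TypeScript, with illustrative parameter and message names, might look like this:

```ts
// Hypothetical sketch of an Adobe I/O Runtime action, the kind of
// serverless building block a Project Firefly app is composed of.
// The parameter and message names here are illustrative only.
interface ActionParams {
  name?: string;
  [key: string]: unknown;
}

export async function main(params: ActionParams) {
  // Runtime merges query-string, body and default parameters into
  // `params`; the returned object becomes the HTTP response.
  return {
    statusCode: 200,
    body: { message: `Hello, ${params.name ?? 'developer'}` },
  };
}
```

The boilerplate reduction Woosley describes is the scaffolding around building blocks like this, so developers start from the customization rather than the plumbing.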

In order to make those new experiences open to all, the company is also announcing React Spectrum, an open source set of libraries and tools designed to help members of the Adobe developer community build more accessible applications and websites.

“It comes with all of the accessibility features that often get forgotten when you’re in a race to market, so it’s nice to make sure that you will be very inclusive with your design, making sure that you’re bringing on all aspects of your audiences,” Woosley said.
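
React Spectrum’s component API is published on npm as `@adobe/react-spectrum`; here’s a minimal sketch (the sign-up form itself is a made-up example) of how accessibility comes along by default:

```tsx
// Minimal React Spectrum sketch. The Provider applies Adobe's theme,
// and components like TextField ship with labeling, keyboard handling
// and focus management wired up for accessibility by default.
import React from 'react';
import {
  Provider,
  defaultTheme,
  Form,
  TextField,
  Button,
} from '@adobe/react-spectrum';

export function SignupForm() {
  return (
    <Provider theme={defaultTheme}>
      <Form maxWidth="size-3600">
        <TextField label="Email" type="email" isRequired />
        {/* onPress (rather than onClick) normalizes mouse, touch and
            keyboard interactions across devices */}
        <Button variant="cta" onPress={() => console.log('submitted')}>
          Sign up
        </Button>
      </Form>
    </Provider>
  );
}
```

Because the label is passed as a prop rather than as free-floating markup, the library can associate it with the input for screen readers without extra work from the developer.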

Finally, a big part of interacting with Experience Cloud is taking advantage of all the data it makes available, since that data is what enables more customized interactions with customers. To that end, the company is announcing new web and mobile software development kits (SDKs) designed to make it simpler to link to Experience Cloud data sources as you build your applications.
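
Adobe didn’t detail the SDK surface in the announcement, but as one point of reference, its Experience Platform Web SDK (published as `@adobe/alloy`) uses a small command-style API. A hedged sketch with placeholder IDs; whether this is one of the SDKs referenced here is an assumption:

```ts
// Hedged sketch using the Experience Platform Web SDK ("alloy").
// The configuration values below are placeholders, not real credentials.
import { createInstance } from '@adobe/alloy';

const alloy = createInstance({ name: 'alloy' });

// Point the SDK at your organization's Experience Edge configuration.
alloy('configure', {
  edgeConfigId: '00000000-placeholder',
  orgId: 'PLACEHOLDER@AdobeOrg',
});

// Send an XDM-formatted event into Experience Cloud data sources.
alloy('sendEvent', {
  xdm: {
    eventType: 'web.webpagedetails.pageViews',
    web: { webPageDetails: { name: 'Home' } },
  },
});
```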

Project Firefly is available in developer preview starting today. Several React Spectrum components and some data connection SDKs are also available today. The company intends to keep adding to these various pieces in the coming months.



from TechCrunch https://ift.tt/30gfJB1
via IFTTT
