
Alphabet’s DeepMind achieves historic new milestone in AI-based protein structure prediction

DeepMind, the AI technology company that’s part of Google parent Alphabet, has achieved a significant breakthrough in AI-based protein structure prediction. The company announced today that its AlphaFold system has officially solved a protein folding grand challenge that has flummoxed the scientific community for 50 years. The advance in DeepMind’s AlphaFold capabilities could lead to a significant leap forward in areas like our understanding of disease, as well as future drug discovery and development.

The test that AlphaFold passed essentially shows that the AI can correctly figure out, to a very high degree of accuracy (accurate to within the width of an atom, in fact), the structure of proteins in just days – a very complex task that is crucial to figuring out how diseases can best be treated, as well as to solving other big problems like working out how to break down ecologically dangerous material such as toxic waste. You may have heard of ‘Folding@Home,’ the program that lets people contribute their own home computing (and formerly, game console) processing power to protein folding experiments. That massive global crowdsourcing effort was necessary because, using traditional methods, protein folding prediction takes years and is extremely expensive in terms of both raw cost and computing resources.

DeepMind’s approach involves using an “attention-based neural network system” (basically a neural network that can focus on the most relevant parts of its input in order to increase efficiency). It’s able to continually refine its own predictive graph of possible protein folding outcomes based on the proteins’ folding history, and provide highly accurate predictions as a result.
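To make the “attention” idea concrete, here is a minimal, illustrative sketch of scaled dot-product self-attention in Python. This shows the general mechanism attention-based networks use to weight the most relevant inputs; it is not DeepMind’s actual AlphaFold architecture, which is far more involved.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Weight each value by how relevant its key is to each query."""
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)           # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ values                            # focus on the relevant inputs

# Toy example (hypothetical numbers): 4 sequence positions, 8-dimensional features
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
attended = scaled_dot_product_attention(x, x, x)       # self-attention over x
print(attended.shape)  # (4, 8)
```

Each output row is a weighted mix of all input rows, with the weights learned from how strongly the positions relate to one another – which is what lets such a network “focus” on the parts of a protein sequence that matter most for a given prediction.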

How proteins fold – or go from being a random string of amino acids when originally created, to a complex 3D structure in their final stable form – is key to understanding how diseases are transmitted, as well as how common conditions like allergies work. If you understand the folding process, you can potentially alter it, halting an infection’s progress mid-stride, or conversely, correct mistakes in folding that can lead to neurodegenerative and cognitive disorders.

DeepMind’s technological leap could make accurately predicting these folds a much less time- and resource-consuming process, which could dramatically change the pace at which our understanding of diseases and therapeutics progresses. That could come in handy for addressing major global threats, including future potential pandemics like the COVID-19 crisis we’re currently enduring: by predicting viral protein structures to a high degree of accuracy early in the appearance of any new threat like SARS-CoV-2, the technique could speed up the development of effective treatments and vaccines.



from TechCrunch https://ift.tt/3ltQgfm
via IFTTT
