Research at Microsoft 2023: A year of groundbreaking AI advances and discoveries

It isn’t often that researchers at the cutting edge of technology see something that blows their minds. But that’s exactly what happened in 2023, when AI experts began interacting with GPT-4, a large language model (LLM) created by researchers at OpenAI that was trained at unprecedented scale. 

“I saw some mind-blowing capabilities that I thought I wouldn’t see for many years,” said Ece Kamar, partner research manager at Microsoft, during a podcast recorded in April.

Throughout the year, rapid advances in AI came to dominate the public conversation, as technology leaders and eventually the general public voiced a mix of wonder and skepticism after experimenting with GPT-4 and related applications. Could we be seeing sparks of artificial general intelligence, informally defined as AI systems that “demonstrate broad capabilities of intelligence, including reasoning, planning, and the ability to learn from experience”?

While the answer to that question isn’t yet clear, we have certainly entered the era of AI, and it’s bringing profound changes to the way we work and live. In 2023, AI emerged from the lab and delivered everyday innovations that anyone can use. Millions of people now engage with AI-based services like ChatGPT. Copilots, AI that helps with complex tasks ranging from search to security, are being woven into business software and services.

Underpinning all of this innovation are years of research, including the work of hundreds of world-class researchers at Microsoft, aided by scientists, engineers, and experts across many related fields. In 2023, AI’s transition from research to reality began to accelerate, creating more tangible results than ever before. This post looks back at the progress of the past year, highlighting a sampling of the research and strategies that will support even greater progress in 2024.

Strengthening the foundations of AI

AI with positive societal impact is the sum of several integral moving parts: the AI models themselves, how those models are applied, and the infrastructure and standards that support their development and the development of the larger systems they underpin. Microsoft is redefining the state of the art across these areas with improvements to model efficiency, performance, and capability; new frameworks and prompting strategies that make models more usable; and best practices that contribute to sustainable and responsible AI.

Advancing models 

  • Researchers introduced Retentive Networks (RetNet), an alternative to the dominant transformer architecture in language modeling. RetNet supports training parallelism and strong performance while making significant gains in inference efficiency. 
  • To contribute to more computationally efficient and sustainable language models, researchers presented a 1-bit transformer architecture called BitNet.
  • Microsoft expanded its Phi family of small language models with the 2.7 billion-parameter Phi-2, which raises the bar in reasoning and language understanding among base models with up to 13 billion parameters. Phi-2 also matches or exceeds the performance of models 25 times its size on complex benchmarks.
  • The release of the language models Orca (13 billion parameters) and, several months later, Orca 2 (7 billion and 13 billion parameters) demonstrates how improved training methods, such as synthetic data creation, can elevate small model reasoning to a level on par with larger models.
  • For AI experiences that more closely reflect how people create across mediums, Composable Diffusion (CoDi) takes as input a mix of modalities, such as text, audio, and image, and produces multimodal output, such as video with synchronized audio.
  • To better model human reasoning and speed up response time, the new approach Skeleton-of-Thought has LLMs break a task into two stages: first creating an outline of a response, then filling in the details of each point in parallel (a short sketch of this pattern follows this list).
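
The two-stage pattern behind Skeleton-of-Thought can be sketched in a few lines. The snippet below is illustrative only: the `complete` function stands in for whatever LLM completion API is used, and the prompt wording and thread-based parallelism are assumptions rather than the method's published implementation.

```python
# Illustrative sketch of the Skeleton-of-Thought pattern (not the paper's exact prompts).
from concurrent.futures import ThreadPoolExecutor


def complete(prompt: str) -> str:
    """Placeholder for a call to an LLM completion API; replace with your own client."""
    return "1. Point one\n2. Point two\n3. Point three"  # dummy output so the sketch runs


def skeleton_of_thought(question: str, max_workers: int = 8) -> str:
    # Stage 1: ask the model for a short numbered outline (the "skeleton") of its answer.
    skeleton = complete(
        f"Give a concise numbered outline (3-10 short points) answering:\n{question}"
    )
    points = [line.strip() for line in skeleton.splitlines() if line.strip()]

    # Stage 2: expand each outline point independently, issuing the requests in parallel.
    def expand(point: str) -> str:
        return complete(
            f"Question: {question}\nOutline point: {point}\n"
            "Write one or two sentences elaborating only on this point."
        )

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        expansions = list(pool.map(expand, points))

    # Stage 3: stitch the expansions back together in outline order.
    return "\n".join(expansions)


print(skeleton_of_thought("Why are small language models useful?"))
```

Because the per-point expansions are independent requests, they can be issued concurrently, which is where the latency savings over fully sequential decoding come from.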

Advancing methods for model usage

  • AutoGen is an open-source framework that simplifies the orchestration, optimization, and automation of LLM workflows, streamlining the creation of LLM-based applications (a minimal usage sketch follows this list).
  • Medprompt, a composition of prompting strategies, demonstrates that with thoughtful and advanced prompting alone, general foundation models can outperform specialized models, offering a more efficient and accessible alternative to fine-tuning on expert-curated data.
  • The resource collection promptbase offers prompting techniques and tools designed to help optimize foundation model performance, including Medprompt, which has been extended for application outside of medicine.
  • Aimed at addressing issues associated with lengthy inputs, such as increased response latency, LLMLingua is a prompt-compression method that leverages small language models to remove unnecessary tokens.
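
To make the AutoGen item above more concrete, here is a minimal two-agent sketch modeled on AutoGen's quickstart pattern; the model name, placeholder API key, and working directory are illustrative assumptions rather than a definitive configuration.

```python
# pip install pyautogen  -- a minimal two-agent sketch based on AutoGen's quickstart pattern.
# The model name, placeholder API key, and working directory are illustrative assumptions.
import autogen

llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": "YOUR_API_KEY"}],  # placeholder credentials
}

# The assistant agent drafts replies, including code, using the configured LLM.
assistant = autogen.AssistantAgent(name="assistant", llm_config=llm_config)

# The user proxy relays the task and can execute code the assistant writes.
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# Kick off the automated back-and-forth between the two agents.
user_proxy.initiate_chat(
    assistant,
    message="Plot a sine wave with matplotlib and save it to sine.png.",
)
```

In this pattern, the user proxy forwards the task, runs any code the assistant proposes, and feeds the results back, letting the two agents iterate toward a solution with little or no manual intervention.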

Developing and sharing best practices 

Accelerating scientific exploration and discovery

Microsoft uses AI and other advanced technologies to accelerate and transform scientific discovery, empowering researchers worldwide with leading-edge tools. Across global Microsoft research labs, experts in machine learning, quantum physics, molecular biology, and many other disciplines are tackling pressing challenges in the natural and life sciences.

  • Because of the complexities arising from multiple variables and the inherently chaotic nature of weather, Microsoft is using machine learning to enhance the accuracy of subseasonal forecasts.
  • Distributional Graphormer (DiG) is a deep learning framework that tackles a fundamental problem in molecular science: predicting protein structures with greater accuracy. This advance could help deliver breakthroughs in critical research areas like materials science and drug discovery.
  • Leveraging evolutionary-scale protein data, the general-purpose diffusion framework EvoDiff helps design novel proteins more efficiently, which can aid in the development of industrial enzymes and therapeutics.
  • MOFDiff, a coarse-grained diffusion model, helps scientists refine the design of new metal-organic frameworks (MOFs) for the low-cost removal of carbon dioxide from air and other dilute gas streams. This innovation could play a vital role in slowing climate change.
  • An episode of the Microsoft Research Podcast series Collaborators explores research into renewable energy storage systems, specifically flow batteries, and discusses how machine learning can help identify compounds well suited to energy storage and to advancing carbon capture.
  • MatterGen is a diffusion model designed to address a central challenge in materials science: efficiently generating novel, stable materials with desired properties, such as high conductivity for lithium-ion batteries.
  • Deep learning is poised to revolutionize the natural sciences, enhancing modeling and prediction of natural occurrences, ushering in a new era of scientific exploration, and leading to significant advances in sectors ranging from drug development to renewable energy. DeepSpeed4Science, a new Microsoft initiative, aims to build unique capabilities through AI system technology innovations to help domain experts unlock today’s biggest science mysteries. 
  • Christopher Bishop, Microsoft technical fellow and director of the AI4Science team, recently published Deep Learning: Foundations and Concepts, a book that “offers a comprehensive introduction to the ideas that underpin deep learning.” Bishop discussed the motivation and process behind the book, as well as deep learning’s impact on the natural sciences, in the AI Frontiers podcast series.

Maximizing the individual and societal benefits of AI

As AI models grow in capability, so, too, do opportunities to empower people to achieve more, as demonstrated by Microsoft’s work this year in domains such as health and education. The company’s commitment to positive human impact requires that AI technology be equitable and accessible.

Beyond AI: Leading technology innovation

While AI rightly garners much attention in the current research landscape, researchers at Microsoft are still making plenty of progress across a spectrum of technical focus areas.

  • Project Silica, a cloud-based storage system underpinned by quartz glass, is designed to provide sustainable and durable archival storage that’s theoretically capable of lasting thousands of years.
  • Project Analog Iterative Machine (AIM) aims to solve difficult optimization problems—crucial across industries such as finance, logistics, transportation, energy, healthcare, and manufacturing—in a timely, energy-efficient, and cost-effective manner. Its designers believe Project AIM could outperform even the most powerful digital computers.
  • Microsoft researchers proved that 3D telemedicine (3DTM), using Holoportation™ communication technology, could help improve healthcare delivery, even across continents, in a unique collaboration with doctors and governments in Scotland and Ghana.
  • In another collaboration that aims to help improve precision medicine, Microsoft worked with industry and academic colleagues to release Terra, a secure, centralized, cloud-based platform for biomedical research on Microsoft Azure.
  • On the hardware front, Microsoft researchers are exploring sensor-enhanced headphones, outfitting them with controls that use head orientation and hand gestures to enable context-aware privacy, gestural audio-visual control, and animated avatars derived from natural body language.

Collaborating across academia, industries, and disciplines

Cross-company and cross-disciplinary collaboration has always played an important role in research, and it matters even more as AI continues to advance rapidly. The large models driving this progress are components of larger systems that will deliver the value of AI to people. Developing these systems, and the frameworks for determining their roles in people’s lives and society, requires the knowledge and experience of those who understand the context in which they’ll operate: domain experts, academics, the individuals using these systems, and others.

Engaging and supporting the larger research community

Throughout the year, Microsoft continued to engage with the broader research community on AI and beyond. The company’s sponsorship of and participation in key conferences showcased its dedication to applying AI across diverse technological domains and underscored its support for cutting-edge advances and collaborative community involvement.

Functional programming

  • Microsoft was a proud sponsor of ICFP 2023, with research contributions covering a range of functional programming topics, including memory optimization, language design, and software-development techniques.

Human-computer interaction

  • At CHI 2023, Microsoft researchers and their collaborators demonstrated the myriad and diverse ways people use computing today and will in the future. 

Large language models and ML

  • Microsoft was a sponsor of ACL 2023, showcasing papers ranging from fairness in language models to natural language generation and beyond.
  • Microsoft also sponsored NeurIPS 2023, publishing over 100 papers and conducting workshops on language models, deep learning techniques, and additional concepts, methods, and applications addressing pressing issues in the field.
  • With its sponsorship of and contribution to ICML 2023, Microsoft showcased its investment in advancing the field of machine learning.
  • Microsoft sponsored ML4H and participated in AfriCHI and EMNLP, a leading conference in natural language processing and AI, highlighting its commitment to exploring how LLMs can be applied to healthcare and other vital domains.

Systems and advanced networking

Listeners’ choice: Notable podcasts for 2023

Thank you for reading

Microsoft achieved extraordinary milestones in 2023 and will continue pushing the boundaries of innovation to help shape a future where technology serves humanity in remarkable ways. To stay abreast of the latest updates, subscribe to the Microsoft Research Newsletter and the Microsoft Research Podcast. You can also follow us on Facebook, Instagram, LinkedIn, X, and YouTube.

Writers, Editors, and Producers
Kristina Dodge
Kate Forster
Jessica Gartner
Alyssa Hughes
Gretchen Huizinga
Brenda Potts
Chris Stetkiewicz
Larry West

Managing Editor
Amber Tingle

Project Manager
Amanda Melfi

Microsoft Research Global Design Lead
Neeltje Berger

Graphic Designers
Adam Blythe
Harley Weber

Microsoft Research Creative Studio Lead
Matt Corwine
