
Edge Computing

Established: October 29, 2008

Applications

From the very beginning, we have maintained that the most compelling applications for edge computing are ones that require low-latency responses or ones where the network to the cloud is expensive or inadequate. In this context, we asserted that the “killer app” for edge computing is live video analytics. Along the way, other Microsoft researchers discovered precision agriculture to be a beautiful edge computing application as well. We are exploring both:

Live Video Analytics


Large-scale video processing is a grand challenge and an important frontier for analytics, with video pouring in from factory floors, traffic intersections, police vehicles, and retail shops. Read more.

Data-Driven Agriculture

Microsoft’s FarmBeats technology uses AI and IoT to help increase farm productivity.

We believe that data, coupled with the farmer’s knowledge and intuition, can help increase farm productivity and reduce costs. However, getting data from the farm is difficult since there is often no power in the field… Read more.

Highly Adaptive and Resilient Edges

Continuity of service is a must-have attribute in mission- and safety-critical edge computing applications. For example, in telecommunication networks, interruption of communication services is unacceptable; on oil rigs, continuous monitoring of the safety of on-site workers and the health of multi-million-dollar equipment is essential; in manufacturing, constant vigilance for production errors that may lead to defective items is likewise a must-have. Downtime of edge computing servers, beyond an acceptable threshold, can result in accidents and significant financial loss. The challenge then is to keep everything operational, even when local technicians are absent. This requires edge servers to be up and running 24x7x365, which is particularly hard when edge-cloud connectivity is unreliable and expensive. We are developing adaptive and resilient software solutions that enable edge clusters to run continuously.

Paya


Paya is an edge-tailored state-migration solution that ensures each edge-cloud application meets its specific high-availability requirements. Read more.

LOKI


LOKI is a suite of services and programming abstractions that simplifies the development of adaptive edge-cloud and multi-cloud applications. Read more.

Our goal is to develop a software-based system and architecture that keeps operations alive by healing itself automatically in the face of machine failures, network disconnections, dynamic application loads, or changes in capacity. Towards this north star, we have several ongoing projects that take on these problems.
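To make the idea of self-healing concrete, here is a minimal sketch, in Python, of a reconciliation loop that moves services off failed edge nodes. The Node and Service types and the placement policy are illustrative assumptions and do not describe the actual designs of Paya, LOKI, or HybridKube.

```python
# Minimal self-healing sketch: move services off edge nodes that stop heartbeating,
# using only local information so healing does not depend on cloud connectivity.
# The Node/Service shapes and the placement policy are illustrative assumptions.
import time
from dataclasses import dataclass
from typing import Optional

HEARTBEAT_TIMEOUT_S = 10.0

@dataclass
class Node:
    name: str
    free_cpu: float            # spare CPU cores
    last_heartbeat: float      # time.time() of the last heartbeat received

@dataclass
class Service:
    name: str
    cpu_demand: float
    node: Optional[Node] = None

def reconcile(nodes, services, now=None):
    """One pass of a reconciliation loop that runs periodically on the edge cluster."""
    now = time.time() if now is None else now
    healthy = [n for n in nodes if now - n.last_heartbeat < HEARTBEAT_TIMEOUT_S]
    for svc in services:
        if svc.node not in healthy:
            # Place the service on the healthy node with the most spare capacity.
            candidates = [n for n in healthy if n.free_cpu >= svc.cpu_demand]
            if candidates:
                target = max(candidates, key=lambda n: n.free_cpu)
                target.free_cpu -= svc.cpu_demand
                svc.node = target  # a real system would also restart the service here
            # If no node has capacity, the service stays pending until one does.
```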

HybridKube


HybridKube is a Kubernetes extension for optimal placement of applications across an edge-cloud environment. Read more.

Offloading Computations

We have been exploring the fundamental trade-off between computation and communications to enable a new class of CPU-, GPU-, and data-intensive applications that seamlessly augment the cognitive abilities of users by exploiting speech recognition, NLP, vision, machine learning, and augmented reality (Project MAUI, MobiSys 2010). We have made significant progress in overcoming the energy and computation limitations of sensors, handhelds, and wearables. In subsequent research we demonstrated how important special-purpose workloads can also leverage cloud offload: GPU-intensive rendering applications (Project Kahawai, MobiSys 2015) and deep neural network video stream processing (MCDNN, MobiSys 2016).
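At its core, the offload decision is a cost comparison between computing locally and shipping state to a nearby server. The sketch below is a simplified, latency-only illustration of that trade-off; the constants and cost models are assumptions for illustration, not MAUI’s actual profiler, which also accounts for energy.

```python
# Minimal sketch of the computation-vs-communication trade-off behind code offload.
# All constants and both cost models are illustrative assumptions; this is a
# latency-only view, whereas a real offload system also optimizes for energy.

def local_cost_ms(cycles, device_speed_hz=1.0e9):
    """Estimated time to run a method on the device itself."""
    return cycles / device_speed_hz * 1000

def remote_cost_ms(state_bytes, cycles,
                   uplink_bps=5.0e6, rtt_ms=20.0, server_speed_hz=1.0e10):
    """Estimated time to ship the method's state to an edge server and run it there."""
    transfer_ms = state_bytes * 8 / uplink_bps * 1000
    compute_ms = cycles / server_speed_hz * 1000
    return rtt_ms + transfer_ms + compute_ms

def should_offload(state_bytes, cycles):
    # Offload only when the compute savings outweigh the network cost.
    return remote_cost_ms(state_bytes, cycles) < local_cost_ms(cycles)

# A compute-heavy vision method (2e9 cycles, 200 KB of state) is worth offloading;
# a cheap method with the same amount of state is not.
print(should_offload(200_000, 2e9))   # True under these assumptions
print(should_offload(200_000, 1e7))   # False
```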

Project MAUI


Mobile Assistance Using Infrastructure (MAUI) was the first system to demonstrate fine-grained code offload to nearby edge server(s) with minimal programmer effort. Watch the video.

Project Kahawai


Kahawai enables high-quality gaming on mobile devices, such as tablets and smartphones, by offloading a portion of the GPU computation to server-side infrastructure. Watch the video.

Geo-distributed Edge Analytics

Edge servers located in thousands of locations and managed by the same administrative entity offer powerful computing resources for cloud providers. Our research on low-latency edge analytics explores how best to use these resources. For example, the old approach of aggregating all sensor data at a single data center hurts the timeliness of the analytics. However, running queries over geo-distributed inputs using current intra-DC analytics frameworks also results in high query response times, because these frameworks cannot cope with the relatively low and variable capacity of WAN links. Our Iridium system (SIGCOMM 2015) provides low-latency geo-distributed analytics by optimizing the placement of both the data and the tasks of each query. Follow-on work (CLARINET, OSDI 2016) accounts for the heterogeneous and modest bandwidths of WAN links, unlike intra-datacenter networks, when deriving query execution plans across cloud and edge servers.
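To make the bandwidth argument concrete, the following sketch captures the Iridium-style intuition of running a query’s aggregation at the site that minimizes the bottleneck WAN transfer time. The sites, data sizes, and link capacities are made-up values for illustration, not numbers from the papers.

```python
# Minimal sketch of the intuition behind geo-distributed analytics placement:
# run a query's aggregation at the site that minimizes the bottleneck WAN
# transfer time. Sites, data sizes, and link capacities are made-up assumptions.

sites = {
    # site: (intermediate data at that site in GB, uplink Gbps, downlink Gbps)
    "edge-seattle": (40.0, 1.0, 2.0),
    "edge-london":  (10.0, 0.5, 1.0),
    "dc-virginia":  (5.0, 10.0, 10.0),
}

def response_time_s(dest):
    """Bottleneck transfer time if all aggregation tasks run at `dest`:
    every other site uploads its data, and `dest` downloads the total."""
    upload = max(data * 8 / up_bw
                 for site, (data, up_bw, _) in sites.items() if site != dest)
    incoming_gb = sum(data for site, (data, _, _) in sites.items() if site != dest)
    download = incoming_gb * 8 / sites[dest][2]
    return max(upload, download)

best = min(sites, key=response_time_s)
print(best, round(response_time_s(best), 1), "seconds")
# Placing tasks near the bulk of the data (edge-seattle) beats hauling
# everything back to the data center over the slow uplinks.
```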

ML for Edge

Our colleagues in Microsoft Research India are developing a library of efficient machine learning (ML) algorithms that can run on resource-constrained edge and IoT devices ranging from the Arduino to the Raspberry Pi. The thesis is that IoT devices and sensors don’t have to be “dumb”; that is, they can do more than just sense their environment and transmit their readings to the cloud, where the traditional decision-making intelligence resides. In an alternative paradigm, even tiny, resource-constrained IoT devices run ML algorithms locally. This enables important scenarios while overcoming concerns around connectivity to the cloud, latency, energy, privacy, and security. Read more.
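As a minimal illustration of this paradigm, the sketch below scores each sensor reading locally with a tiny model and uses the network only for the rare, interesting events. The model, threshold, and send_to_cloud() call are assumptions for illustration, not the library’s actual API.

```python
# Minimal sketch of on-device intelligence: score each sensor reading locally with
# a tiny model and use the network only for the rare, interesting events. The model,
# threshold, and send_to_cloud() are illustrative assumptions, not a real API.
import math
import random

WEIGHTS = [0.8, -0.3, 1.2, 0.05]   # a tiny linear model that fits in a few bytes
THRESHOLD = 0.8

def predict(features):
    """Logistic score computed entirely on the device."""
    z = sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

def send_to_cloud(features, score):
    # Hypothetical uplink call; on a real device this would be a radio transmission.
    print(f"alert: score={score:.2f} features={features}")

for _ in range(100):                               # stand-in for the sensing loop
    reading = [random.random() for _ in WEIGHTS]   # stand-in for real sensor data
    score = predict(reading)
    if score > THRESHOLD:                          # transmit only anomalous readings
        send_to_cloud(reading, score)
```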

Networking

Cloud providers, such as Microsoft, have two types of edges: on-net edges and off-net edges. On-net edges are generally easier to operate, manage, and maintain because they sit on the cloud provider’s own network. In contrast, off-net edges are connected to the cloud via the Internet, possibly across several ISPs, so managing and operating them can be challenging due to the vagaries of the Internet. We are investigating how to improve the network connectivity to our off-net edges and the networking between the edges and sensors. Furthermore, edges give us an opportunity to revisit old ideas around low-latency, secure overlay networking.

Security

Cloud companies spend large amounts of money to physically secure the millions of servers located in their many data centers. In contrast, edge computing servers may or may not be physically secured, which opens the possibility of malicious attacks on the edge and cloud infrastructure. While a lot has been done to physically secure assets in the cloud, we are investigating techniques to do the same for our edge assets. Security and trust require authenticity and integrity, so we are investigating the use of sensors and specialized hardware, in combination with new programming abstractions and system support, for building secure and trusted edges. This research builds on our prior work on trusted sensors (MobiSys 2012, ASPLOS 2014) and a recent product offering (Azure Sphere).

Cloud Services

Before the dawn of edge computing, which has brought about a major cloud computing paradigm shift in the industry, we developed, deployed, and operated a cloud service store under the banner of Project Hawaii. With it, we empowered developers to build sophisticated, cloud-enhanced applications for their resource-constrained devices. Our cloud service store offered a variety of services, including optical character recognition, speech-to-text, path prediction, social computing, language translation, relay, and rendezvous, for Windows, Android, and iOS devices. Over 60 universities used our services as a teaching aid for senior- and graduate-level mobile + cloud computing courses. From 2015 onward, similar cloud services were commercialized by all major cloud providers under the banner of cognitive services. Check out Azure Cognitive Services. Historically speaking, Project Hawaii was the first to show how the cloud/edge can be used in conjunction with a resource-constrained mobile device to augment human abilities.

Project Hawaii


The Project Hawaii team – BACK ROW (left to right): Gleb Krivosheev, Philip Fawcett, Ronnie Chaiken; FRONT ROW (left to right): Arjmand Samuel, Jitu Padhye, Alec Wolman, Victor Bahl. Read more.

Gallery


A utility tool developed by a student for on-the-go translations, built with Project Hawaii’s OCR and speech-to-text services and Bing Translator. Check out our Gallery for dozens of featured student-created projects. Read more.