The confluence of Cloud computing, 5G, and IoT in the Fog

Author: William Tärneberg; Bredbandskommunikation


Abstract: In the wake of the arrival of cloud computing, future applications are poised to become more resilient and adaptive by embracing elasticity in an osmotic manner. Although cloud computing is a strong attractor for application developers, there are still unconquered performance frontiers. Latency-sensitive and mission-critical applications make up a significant portion of all software systems, and their owners are eager to reap the benefits of cloud computing. However, they are hindered by significant delay, jitter in the delay, and relatively low resilience when operating on traditional, distant cloud data centres.

Fog computing is emerging as a remedy. Fog computing is a heterogeneous, hyper-distributed cloud infrastructure paradigm, ranging from small compute nodes close to the end-users to traditional distant data centres. With greater proximity to the end-users, delay and jitter in the delay can be reduced, and intermediate network reliability improved. Additionally, with increased heterogeneity of resources, applications have a richer tapestry of resources to take advantage of for their objectives. However, managing and taking advantage of this heterogeneity in resources and objectives is a challenge for infrastructure providers and application owners alike. Merely deciding where to place and scale application components, and how to manage system resources to meet the objectives of both parties, is non-trivial. Application placement implies elaborate optimisation objectives, hard-to-find solutions, and operational conflicts.

The objective of this thesis is to investigate the performance-related properties of fog computing, how such an infrastructure can be managed while applications osmotically take advantage of it, and what fog computing's potential practical performance gains are. These are fundamental questions that need to be answered for providers and application owners alike to be able to invest in fog computing. In general terms, the work in this thesis seeks the trade-offs between infrastructure, applications, and software platform, in contrast to the traditional cloud offering.

The thesis provides modelling and simulation tools for evaluating the performance and feasibility of fog computing. Based on these, the thesis goes on to propose holistic infrastructure management algorithms. The requirements of latency-sensitive and mission-critical applications and use cases are discussed for a fog computing paradigm. These requirements are then translated to Fifth Generation (5G) Massive Multiple Input Multiple Output (MIMO) specifications. An original 5G-based fog computing test-bed for time-sensitive and mission-critical applications is implemented. The test-bed is used to evaluate the potential application performance gains of fog computing and to what extent applications can practically take advantage of a fog infrastructure. The thesis also investigates the architecture of the applications that are proposed to benefit from fog computing and how they perform in traditional cloud offerings.

The included works show that fog computing indeed has a performance advantage over the traditional distant cloud, not only in latency but also in robustness. The benefits of 5G for a time-sensitive application deployed in a fog computing infrastructure are shown to be significant. It is also shown that a fog computing infrastructure with a high degree of heterogeneity and multiple objectives can be successfully managed at scale. Additionally, the thesis sheds some light on the challenges of implementing latency-sensitive and mission-critical applications with traditional cloud service offerings.
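To make the placement challenge described in the abstract concrete, the minimal Python sketch below frames fog application placement as a small combinatorial optimisation: assign components to nodes so that node capacities are respected and total end-user latency is minimised. This is an illustration only, not the thesis's algorithm; the node names, capacities, latencies, and component demands are hypothetical. Even this toy instance is solved by exhaustive enumeration, which stops scaling as soon as the number of components and nodes grows, which is why realistic fog placement relies on heuristics and online management.

from itertools import product

# Hypothetical fog/cloud nodes: compute capacity and round-trip latency to end-users.
nodes = {
    "edge-a": {"capacity": 4,  "latency_ms": 5},
    "edge-b": {"capacity": 4,  "latency_ms": 8},
    "cloud":  {"capacity": 64, "latency_ms": 45},
}

# Hypothetical application components and their resource demands.
components = {"frontend": 2, "analytics": 3}

def placement_cost(assignment):
    """Total end-user latency of a placement, or None if any node is overloaded."""
    load = {name: 0 for name in nodes}
    for component, node in assignment.items():
        load[node] += components[component]
    if any(load[name] > nodes[name]["capacity"] for name in nodes):
        return None  # infeasible: a node's capacity is exceeded
    return sum(nodes[node]["latency_ms"] for node in assignment.values())

def sort_key(assignment):
    """Rank feasible placements by latency; push infeasible ones to the end."""
    cost = placement_cost(assignment)
    return float("inf") if cost is None else cost

# Exhaustive search over every component-to-node assignment.
candidates = [dict(zip(components, choice))
              for choice in product(nodes, repeat=len(components))]
best = min(candidates, key=sort_key)

print("best placement:", best, "total latency:", placement_cost(best), "ms")

Running the sketch places both components on nearby edge nodes rather than the distant cloud, because the capacity constraint forbids co-locating them on a single small node and the latency objective penalises the cloud; adding more components, objectives, or competing stakeholders quickly turns this into the kind of hard, conflict-laden optimisation problem the thesis addresses.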
