All of the tutorials will be available on demand through the conference virtual platform for attendees with a Tutorial registration.
TUT-01 : Emergency Wireless Communications for Next Era: Challenges, Applications, and Future Trends
Wenchi Cheng (Xidian University, China)
Wei Zhang (The University of New South Wales, Australia)
With the frequent occurrence of global emergencies, such as natural disasters like the 2011 Tōhoku earthquake off the Pacific coast of Japan and pandemics like COVID-19, emergency wireless communications (EWC) has attracted increasing attention from researchers worldwide. Various wireless emergency services and applications will be enabled by beyond fifth-generation (5G) and emerging sixth-generation (6G) wireless communication systems, offering many advantages for disaster prediction, assessment, and response, as well as disease prevention. This tutorial will discuss the main research contributions from academia and industry, and the development trends, revolving around 5G/6G technologies for emergency communications. It will start with a background introduction to EWC, including emergency classification and the challenges emergencies pose. Then, several advanced EWC-enabling technologies will be elaborated in detail, including EWC-oriented complex channel models, unmanned aerial vehicle (UAV) enhanced communications, reconfigurable intelligent surface (RIS) empowered communications, device-to-device (D2D)/end-to-end (E2E) performance optimization, mobile edge infrastructure, through-the-earth communications, diverse quality-of-service (QoS) guarantee technologies, cross-technology communications, and network function virtualization (NFV)/software-defined networking (SDN) for EWC networks. Moreover, we will also share future trends regarding intelligent EWC networks.
TUT-02 : Enhancing Wi-Fi Sensing: Definitions, Features, and Applications of IEEE 802.11bf
Claudio da Silva (Facebook, USA)
Chunyu Hu (Facebook, USA)
Wi-Fi sensing is the use of Wi-Fi to enable everyday electronic devices to learn and become aware of their surroundings. Interest and research in the area have grown steadily over the past two decades, and Wi-Fi sensing is now used in a wide range of applications, including new forms of user interface (gesture recognition), space occupancy characterization and analytics, and home security systems. Due to the significant and growing market interest in Wi-Fi sensing, IEEE 802.11 Task Group bf (IEEE 802.11bf) was formed in September 2020 to develop an amendment to the 802.11 standard that will enhance its ability to support sensing applications by defining modifications to its physical (PHY) and medium access control (MAC) layers. The objective of this tutorial is to introduce ICC attendees to the main PHY and MAC definitions and features found in the IEEE 802.11bf draft (still under development), including the complete sensing protocol, sensing-specific 802.11 PHY definitions, and 60 GHz sensing. For each topic, we explain and discuss the design and implementation, present definitions and mechanisms specified in the draft standard, and analyze performance as appropriate. A brief overview of basic sensing principles, 802.11 concepts, and millimeter-wave communications will also be offered.
TUT-03 : Wireless for Machine Learning
Carlo Fischione (KTH, Sweden)
Viktoria Fodor (KTH NSE, Sweden)
José Mairton Barros da Silva Jr. (KTH Royal Institute of Technology, Sweden)
Henrik Hellström (KTH Royal Institute of Technology, Sweden)
A large part of machine learning (ML) services in the future will take place over wireless networks, and conversely, a large part of wirelessly transmitted information will be related to ML. As data generation increasingly takes place on devices without a wired connection, ML over wireless networks becomes critical. Many studies have shown that traditional wireless protocols are highly inefficient or unsustainable for supporting distributed ML services. This creates the need for new wireless communication methods, specifically on the medium access control and physical layers, that will arguably be included in 6G. In this tutorial, we give a comprehensive review of the state-of-the-art wireless methods that are specifically designed to support ML services, namely over-the-air computation and physical and medium access control layers optimized for ML. In the over-the-air approach, multiple devices communicate simultaneously over the same time slot and frequency band to exploit the superposition property of wireless channels for gradient averaging over the air. In physical and medium access control layers optimized for ML, active learning metrics guide the allocation of spectrum and energy resources for faster or more accurate ML. This tutorial introduces these methods, reviews the most important works, and highlights crucial open problems.
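The superposition idea behind over-the-air computation can be sketched in a few lines. In this toy simulation (device count, channel model, and noise level are illustrative assumptions, not from the tutorial), each device pre-equalizes its gradient by its known channel coefficient, so that simultaneous transmission over the multiple-access channel delivers the sum and the server recovers the average in a single channel use:

```python
import numpy as np

rng = np.random.default_rng(0)
num_devices, dim = 10, 5

# Local gradients held by each device (one row per device).
grads = rng.normal(size=(num_devices, dim))

# Flat-fading channel coefficient per device (assumed known at the device).
h = rng.uniform(0.5, 2.0, size=num_devices)

# Each device pre-equalizes its gradient by 1/h so the channel's
# superposition of all transmit signals yields the plain sum.
tx = grads / h[:, None]

# The multiple-access channel adds the channel-scaled signals plus noise.
noise = rng.normal(scale=1e-3, size=dim)
rx = (h[:, None] * tx).sum(axis=0) + noise

# The server recovers the average gradient in one channel use.
avg_estimate = rx / num_devices
print(np.allclose(avg_estimate, grads.mean(axis=0), atol=1e-3))
```

The key point is that all ten gradients arrive in one slot instead of ten, which is where the spectral savings over orthogonal access come from.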
TUT-04 : Deep Learning Empowered Large-Scale Antenna Systems
Feifei Gao (Tsinghua University, China)
Shun Zhang (Xidian University, China)
Zhen Gao (Beijing Institute of Technology, China)
Wireless communication systems increasingly exploit large antenna arrays to achieve degrees of freedom in the spatial domain, as in millimeter-wave massive multiple-input multiple-output (MIMO) and reconfigurable intelligent surface (RIS) assisted communications. Meanwhile, it has recently been recognized that applying deep learning (DL) to large-scale antenna communications can substantially benefit system capacity and enhance robustness to complicated transmission environments. Different from traditional model-driven approaches, DL can address existing communications and signal processing problems from a data-driven perspective by extracting inherent characteristics from real data. This tutorial aims to provide the audience a general picture of the recent developments in this exciting area. Specifically, in this interactive presentation we will introduce the merging of DL and large-scale antenna systems across various topics, including channel acquisition, signal detection, and beamforming design. We will also discuss the challenges of DL-empowered large-scale antenna systems and present some interesting future directions.
TUT-05 : Wireless Information and Energy Transfer in the Era of 6G Communications
Ioannis Krikidis (University of Cyprus)
Constantinos Psomas (University of Cyprus)
Conventional energy-constrained wireless systems such as sensor networks are powered by batteries and have a limited lifetime. Wireless power transfer (WPT) is a promising technology for energy sustainable networks, where terminals can harvest energy from dedicated electromagnetic radiation through appropriate electronic circuits. The integration of WPT technology into communication networks introduces a fundamental co-existence of information and energy flows; radio-frequency signals are used to convey information and/or energy. The efficient management of these two flows through sophisticated networking protocols, signal processing/communication techniques, and network architectures gives rise to a new communication paradigm called wireless powered communications (WPC). In this tutorial, we discuss the principles of WPC and highlight its main network architectures as well as the fundamental trade-off between information and energy transfer. Several examples, which deal with the integration of WPC in modern communication systems, are presented. Specifically, we study some fundamental network structures such as the MIMO broadcast channel, the interference channel, the relay channel, the multiple-access channel, and ad-hoc networks. The integration of WPC in 6G and beyond is analyzed and discussed through the use of tools from stochastic geometry. Future research directions and challenges are also pointed out.
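The information-energy trade-off mentioned above can be illustrated with a simple power-splitting receiver, where a fraction ρ of the received power feeds the information decoder and the remainder feeds the energy harvester. All link parameters below are illustrative assumptions:

```python
import numpy as np

# Illustrative link parameters (assumed values, not from the tutorial).
P, h2, sigma2, eta = 1.0, 0.8, 1e-3, 0.5  # tx power, |h|^2, noise, rectifier eff.

rhos = np.linspace(0.0, 1.0, 11)              # power fraction to info decoding
rate = np.log2(1.0 + rhos * P * h2 / sigma2)  # bit/s/Hz at the decoder
energy = eta * (1.0 - rhos) * P * h2          # harvested power (W)

for rho, r, e in zip(rhos, rate, energy):
    print(f"rho={rho:.1f}  rate={r:5.2f} bit/s/Hz  harvested={e:.3f} W")
```

Sweeping ρ traces out the rate-energy region: rate grows monotonically with ρ while harvested power shrinks, which is exactly the trade-off the tutorial formalizes.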
TUT-06 : Federated Learning for 6G: Models, Algorithms and Systems
Khaled B. Letaief (The Hong Kong University of Science and Technology, Hong Kong)
Yuanming Shi (ShanghaiTech University, China)
Yong Zhou (ShanghaiTech University, China)
The thriving of artificial intelligence (AI) applications is driving the evolution of 6G to revolutionize wireless from "connected things" to "connected intelligence". Federated learning (FL) is a collaborative machine learning framework for enabling scalable and trustworthy "connected intelligence" in 6G. This is achieved by training a global statistical model without accessing edge devices' private raw data, wherein a dedicated edge server is responsible for aggregating local model updates and disseminating global model updates. However, FL causes task-oriented data traffic flows over networks under statistical and system heterogeneity. This encourages multidisciplinary collaborations spanning wireless communications, machine learning, and operations research. The aim of this tutorial is to present recent advances in decentralized optimization and wireless networking technologies for designing FL systems for 6G. Specifically, personalized, fair, robust, private, and trustworthy FL models will be presented to achieve statistical efficiency. Decentralized zeroth-order, first-order, and second-order optimization algorithms will then be introduced to achieve communication efficiency. Task-oriented communication principles, disruptive wireless network architectures, and service-driven resource allocation optimization will be described to achieve system efficiency. Various application scenarios, as well as software and hardware platforms for FL, will also be discussed.
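The server-side aggregation described above (local training plus weighted averaging of model updates) can be sketched as a minimal federated-averaging loop; the least-squares task, device count, and step sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def local_sgd(w, X, y, lr=0.1, steps=5):
    """A few local gradient steps on the least-squares loss |Xw - y|^2."""
    for _ in range(steps):
        w = w - lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

# Heterogeneous local datasets held by 4 edge devices (never sent to server).
w_true = np.array([1.0, -2.0])
devices = []
for _ in range(4):
    X = rng.normal(size=(20, 2))
    y = X @ w_true + 0.01 * rng.normal(size=20)
    devices.append((X, y))

w_global = np.zeros(2)
for _ in range(20):  # communication rounds
    locals_ = [local_sgd(w_global.copy(), X, y) for X, y in devices]
    # Server aggregates: sample-size-weighted average of local models.
    sizes = np.array([len(y) for _, y in devices])
    w_global = np.average(locals_, axis=0, weights=sizes)

print(np.round(w_global, 2))
```

Only model vectors cross the network; the raw data stays on the devices, which is the privacy property the abstract refers to.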
TUT-07 : Semantic Communications: Transmission beyond Shannon Paradigm
Geoffrey Ye Li (Imperial College London, United Kingdom (Great Britain))
Zhijin Qin (Queen Mary University of London, United Kingdom (Great Britain))
Shannon and Weaver categorized communications into three levels:
• Level A. How accurately can the symbols of communication be transmitted?
• Level B. How precisely do the transmitted symbols convey the desired meaning?
• Level C. How effectively does the received meaning affect conduct in the desired way?
In the past decades, researchers have primarily focused on level A communications. With the development of cellular communication systems, the achieved transmission rate has improved by tens of thousands of times, and the system capacity is gradually approaching the Shannon limit. Semantic communications have been regarded as a promising direction to improve system efficiency and reduce data traffic so as to realize level B or even level C communications. Semantic communications aim to successfully transmit the semantic information that is relevant to the task at the receiver. In this tutorial, we first introduce the concept of semantic communications and a general model for them. We then detail the principles and performance metrics of semantic communications. Afterwards, we present initial work on deep learning enabled semantic communications for different sources, multi-user semantic communication systems, and green semantic communications. Finally, we identify the research challenges in semantic communications.
TUT-08 : The Multifunctional Network of 6G and Beyond: Fundamentals of Integrating Communications and Sensing
Fan Liu (Southern University of Science and Technology, China)
Christos Masouros (University College London, United Kingdom (Great Britain))
J. Andrew Zhang (University of Technology Sydney, Australia)
As the standardization of 5G gradually solidifies, researchers are speculating what 6G will be. A common theme in many perspectives is that the 6G Radio Access Network (RAN) should serve as edge infrastructure providing site-specific services for surrounding users, rather than communication-only functionality. As recent advances from the communications and signal processing communities jointly suggest, radio sensing functionality can be integrated into the 6G RAN in a low-cost and fast manner. The future cellular network could therefore image and measure the surrounding environment to enable advanced location-aware services, ranging from the physical to the application layer. This line of research is typically referred to as Integrated Sensing and Communications (ISAC), which has found applications in numerous emerging areas, including vehicular networks, environmental monitoring, and indoor services such as human activity recognition. In this tutorial, we will first overview the background and application scenarios of ISAC. Going a step further, we will introduce the state-of-the-art research progress on this topic, organized into four technical parts: 1) Fundamental Tradeoff, 2) Waveform Design, 3) ISAC for Vehicular Networks, and 4) Perceptive Mobile Networks. Finally, we will conclude the tutorial by summarizing future directions and open problems in ISAC.
TUT-09 : Evolution of NOMA Toward Next Generation Multiple Access
Yuanwei Liu (Queen Mary University of London, United Kingdom (Great Britain))
Zhiguo Ding (University of Manchester, United Kingdom (Great Britain))
User data traffic, especially the large volume of video traffic and small-size Internet-of-Things (IoT) packets, has dramatically increased in recent years with the emergence of smart devices, smart sensors, and various new applications. It is hence crucial to increase network capacity and user access to accommodate these bandwidth-consuming applications and enable massive connectivity. As a prominent member of the next generation multiple access (NGMA) family, non-orthogonal multiple access (NOMA) has been recognized as a promising multiple access candidate for sixth-generation (6G) networks. The main content of this tutorial is the so-called "One Basic Principle plus Four New" concept. Starting with the basic NOMA principle of exploring possible multiple access techniques in a non-orthogonal manner, the advantages and drawbacks of both channel state information based successive interference cancellation (SIC) and quality-of-service based SIC are discussed. Then, the application of NOMA to meet new 6G performance requirements, especially massive connectivity, is explored. Furthermore, the integration of NOMA with new physical layer techniques is considered, followed by new application scenarios for NOMA towards 6G. Finally, the application of machine learning in NOMA networks is investigated, ushering in the machine learning empowered NGMA era.
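The power-domain NOMA principle with SIC can be illustrated with a two-user toy simulation: the near user first decodes the far user's higher-power signal, subtracts it, and then decodes its own. Power split, modulation, and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
# BPSK symbols for a far (weak-channel) and a near (strong-channel) user.
s_far = rng.choice([-1.0, 1.0], size=n)
s_near = rng.choice([-1.0, 1.0], size=n)

# Power-domain superposition: more power is allocated to the far user.
p_far, p_near = 0.8, 0.2
x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near

# Near user's received signal (good channel, mild noise).
y = x + 0.05 * rng.normal(size=n)

# SIC at the near user: decode the far user's (stronger) signal first...
s_far_hat = np.sign(y)
# ...subtract its contribution, then decode the near user's own signal.
residual = y - np.sqrt(p_far) * s_far_hat
s_near_hat = np.sign(residual)

print("far-user BER at near receiver:", np.mean(s_far_hat != s_far))
print("near-user BER after SIC:", np.mean(s_near_hat != s_near))
```

Both users share the same time-frequency resource, which is the non-orthogonal access gain; the cost is the SIC ordering and error-propagation issues the tutorial discusses.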
TUT-10 : Deep Learning Techniques for Hybrid Beamforming in mmWave and THz-Band Communications and Radar
Kumar Vijay Mishra (United States Army Research Laboratory, USA)
Ahmet M Elbir (Duzce University & University of Luxembourg, Turkey)
Millimeter-wave (mmWave) massive MIMO communications employ hybrid analog-digital beamforming architectures to reduce cost, power, size, and hardware overheads. Lately, there is also a gradual push from mmWave to Terahertz (THz) frequencies for short-range communications and radar applications, to exploit the very wide THz bandwidths. At THz, ultramassive MIMO is an enabling technology to exploit even wider bandwidth while employing thousands of antennas. The design of hybrid beamforming techniques requires solving difficult nonconvex optimization problems that involve a common performance metric as a cost function and several constraints related to the employed communication regime and the adopted architecture of the hybrid system. There is no standard methodology for solving such problems, and deriving an efficient solution is usually very challenging. Since optimization-based approaches suffer from high computational complexity and their performance strongly relies on perfect channel conditions, we introduce deep learning (DL) techniques that provide robust performance when designing a hybrid beamformer. In this tutorial, the audience will learn about applying DL to various aspects of hybrid beamforming, including channel estimation, antenna selection, wideband beamforming, and spatial modulation. In addition, we will examine these concepts in the context of joint radar-communications architectures.
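One constraint that makes hybrid beamforming nonconvex is the unit-modulus restriction on analog phase shifters. A minimal sketch (the Rayleigh channel model and array size are assumptions) compares a phase-only analog beamformer against the unconstrained fully digital matched filter:

```python
import numpy as np

rng = np.random.default_rng(3)
num_antennas = 64
# A random single-user channel vector (Rayleigh fading assumption).
h = (rng.normal(size=num_antennas) + 1j * rng.normal(size=num_antennas)) / np.sqrt(2)

# Fully digital matched filter (MRT): unconstrained amplitudes and phases.
f_digital = h.conj() / np.linalg.norm(h)

# Analog beamformer: unit-modulus phase shifters only, phases matched to h.
f_analog = np.exp(-1j * np.angle(h)) / np.sqrt(num_antennas)

gain_digital = np.abs(h @ f_digital) ** 2
gain_analog = np.abs(h @ f_analog) ** 2
print(f"digital gain {gain_digital:.1f}, analog (phase-only) gain {gain_analog:.1f}")
```

The phase-only beamformer loses a modest fraction of the array gain; hybrid architectures trade exactly this loss against the cost of RF chains, and DL methods aim to find good feasible points under such constraints without iterative solvers.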
TUT-11 : Softwarization and Virtualization in 5G and Beyond: from Theory to Practice
Fabrizio Granelli (University of Trento, Italy)
Frank Fitzek (Technische Universität Dresden & ComNets - Communication Networks Group, Germany)
The aim of this tutorial is to illustrate how the emerging paradigms of Software Defined Networking, Network Function Virtualization, and Information Centric Networking will impact the development of future systems and networks, from both the theoretical/formal and the practical perspective. The main focus will be on mobile networks, i.e., 5G and beyond. The tutorial will provide a comprehensive overview of the individual building blocks (software defined networking, network function virtualization, information centric networking) enabling the concept of computing in future networks, moving from use cases and concepts through technological enablers (Mininet, Docker) and future innovations (machine learning, network coding, compressed sensing) to implementing all of them on personal computers. Practical hands-on activities will be proposed, with realistic use cases that bridge theory and implementation through several examples, using a pre-built ad-hoc Virtual Machine (ComNetsEmu) that can easily be extended for new experiments. Instructions to download the Virtual Machine will be provided in advance of the event. The main objective of the tutorial is to expose attendees to the most recent technologies in the field of networking and teach them how to use these technologies in a real setup in the "hands-on" session.
TUT-12 : Integrated Access and Backhaul for 5G and Beyond
Behrooz Makki (Chalmers University of Technology, Sweden)
Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)
Erik Dahlman (Ericsson Research, Sweden)
Filip Barać (Ericsson AB, Sweden)
To cope with the exponential growth of wireless communications, 5G and beyond will densify the network with many base stations (BSs) of different types. The BSs, however, require backhauling. On a global scale, fiber and wireless microwave technology are the dominant backhaul media. Traditionally, wireless backhaul has been mainly based on proprietary technology operating in millimeter-wave (mmW) spectrum and constrained to line-of-sight propagation conditions. However, with 5G, cellular technology extends into the mmW spectrum historically used for backhauling. Also, with small cells deployed at street level, backhaul links need to operate under non-line-of-sight conditions as well. These are the main motivations for the integrated access and backhaul (IAB) concept. The aim of IAB is to provide flexible wireless backhaul using 3GPP NR technology, offering not only backhaul but also the existing cellular services in the same node. The objective of the tutorial is to bring new insights to the analysis, design, and standardization of IAB networks. The tutorial will 1) go through the recently finished Release 16 and ongoing Release 17 IAB work items of 3GPP, 2) present proof-of-concept results on the usefulness of IAB, and 3) provide a vision for IAB-related research towards 6G.
TUT-13 : Reinforcement Learning in Wireless Networks
Andrea Ortiz (TU Darmstadt, Germany)
Sabrina Klos (TU Darmstadt, Germany)
Bringing intelligence into wireless networks is a key enabler to meet the requirements of beyond-5G networks. One approach to equip networks with self-management capabilities is Reinforcement Learning (RL). With RL, networks can adaptively and autonomously exploit available resources in an online manner. RL and its combination with deep learning, known as deep RL, have yielded impressive results in network optimization. Nevertheless, while advanced deep RL methods can readily be applied to communication problems in a black-box-like manner, this may lead to poorly understood or over-engineered solutions. Hence, before delving into deep RL, researchers need to understand the basics of RL. The goal of this tutorial is therefore to give a clear and condensed view of RL. First, we introduce RL and motivate its use in wireless networks. Then, we give a taxonomy of the multi-armed bandit problem and its algorithms. Thereafter, we focus on the full RL problem, giving a taxonomy of RL approaches and their main algorithms. Additionally, we provide practical guidelines by thoroughly studying an application of RL in wireless networks, and give an overview of extensions and future research directions. Offering easy-to-grasp material, this tutorial is a gentle introduction to the field of RL in wireless networks.
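As a taste of the multi-armed bandit material, the classic UCB1 index balances exploration and exploitation; in a wireless setting each arm might be a candidate channel. The Bernoulli reward rates below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Each arm could be a channel; its mean reward is an unknown success rate.
true_means = np.array([0.2, 0.5, 0.8])
num_arms, horizon = len(true_means), 5_000

counts = np.zeros(num_arms)   # times each arm was played
values = np.zeros(num_arms)   # empirical mean reward per arm

for t in range(1, horizon + 1):
    if t <= num_arms:
        arm = t - 1                                       # play each arm once
    else:
        ucb = values + np.sqrt(2 * np.log(t) / counts)    # UCB1 index
        arm = int(np.argmax(ucb))
    reward = float(rng.random() < true_means[arm])        # Bernoulli reward
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]   # running mean

print("pulls per arm:", counts.astype(int))
```

The exploration bonus shrinks as an arm is sampled, so pulls concentrate on the best arm while suboptimal arms are played only logarithmically often.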
TUT-14 : Breaking the energy consumption growth in future mobile networks: 5G enhancements and machine learning
Nicola Piovesan (Huawei Technologies, France)
Antonio De Domenico (Huawei Technologies Co. Ltd., France Research Center, France)
David López-Pérez (Huawei Technologies France, France)
The fifth generation (5G) of radio technology is revolutionizing our everyday lives by enabling a high degree of automation through its larger capacity, massive connectivity, and ultra-reliable low-latency communications. Despite these capabilities, however, 5G networks must improve in certain key technology areas, such as energy efficiency. While current 3GPP NR deployments provide an energy efficiency improved by around 4x w.r.t. 3GPP LTE ones, they still consume up to 3x more energy. Even though the 3GPP NR specification provides several tools to meet IMT-2020 energy efficiency requirements, one of the main energy consumption challenges of 5G networks is the complexity of their optimization in wide-area deployments: a large-scale, stochastic, non-convex, and non-linear optimization problem. In light of the increasing interest in this field, this one-of-a-kind tutorial shares the authors' industrial view on the 5G energy efficiency problem. In detail, the tutorial provides a fresh look at energy efficiency enabling technologies in 3GPP NR. Moreover, leveraging the concepts of big data and machine learning, the tutorial presents practical scenarios in which data collected from thousands of base stations can be used to derive accurate machine learning models for the main building blocks of the energy efficiency optimization problem.
TUT-15 : Post-Shannon Communications for 6G
Rafael F. Schaefer (University of Siegen, Germany)
Holger Boche (Technical University Munich, Germany)
Christian Deppe (Technical University of Munich, Germany)
Frank H.P. Fitzek (Technische Universität Dresden & ComNets - Communication Networks Group, Germany)
Since the breakthrough of Shannon's seminal paper, researchers have worked on codes and techniques that approach the fundamental limits of message transmission, where the maximum number of messages that can be transmitted scales exponentially with the blocklength of the codewords. We advocate a paradigm change towards Post-Shannon communication that allows the encoding of messages whose maximum number scales double-exponentially with the blocklength. In addition, secrecy comes "for free" in the sense that it can be incorporated without penalizing the transmission rate. This paradigm shift is the study of semantic communication instead of message-only transmission: it moves from the traditional design of message transmission to a new Post-Shannon design that takes the semantics of the communication into account, going beyond the transmission of pure message bits. Entire careers have been built on designing methods and codes on top of previous work, bringing only marginal gains in approaching the fundamental limit of Shannon's message transmission. This paradigm change can bring not merely marginal but exponential gains in the efficiency of communication. Within the Post-Shannon framework, this tutorial explores identification codes, embedded security, resilience by design, and the exploitation of resources that have been considered useless in the traditional Shannon framework.
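Why can identification scale double-exponentially? Because the receiver only verifies a single hypothesis instead of decoding the message. The toy below is a loose, noiseless stand-in for Ahlswede-Dueck identification codes, with a hash replacing the random coloring; all names and parameters are illustrative. With n bits of shared randomness and k-bit tags there are (2^k)^(2^n) possible tag functions, i.e., double-exponentially many addressable messages:

```python
import hashlib

def tag(message: str, randomness: int, k_bits: int = 16) -> int:
    """A message-specific k-bit tag of the shared randomness (toy stand-in
    for the random coloring used in identification codes)."""
    digest = hashlib.sha256(f"{message}|{randomness}".encode()).digest()
    return int.from_bytes(digest[:4], "big") % (1 << k_bits)

# Sender: picks fresh randomness and transmits (randomness, tag of its message).
randomness = 12345
transmission = (randomness, tag("alert-7", randomness))

# Receiver: only verifies whether the one message IT cares about was sent.
def identify(transmission, candidate: str) -> bool:
    r, t = transmission
    return tag(candidate, r) == t

print(identify(transmission, "alert-7"))   # the interested receiver accepts
print(identify(transmission, "alert-9"))   # false alarm probability ~ 2^-16
```

Only the randomness and a short tag are transmitted, regardless of how many messages exist; the price is a small, controllable false-identification probability rather than zero-error decoding.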
TUT-16 : Localization-of-Things: from Foundation to B5G Ecosystem
Moe Z. Win (Massachusetts Institute of Technology, USA)
Andrea Conti (DE and CNIT, University of Ferrara, Italy)
The availability of real-time high-accuracy location awareness is essential for current and future wireless applications, particularly those involving the Internet-of-Things and the beyond-5G ecosystem. Reliable localization and navigation of people, objects, and vehicles - Localization-of-Things - is a critical component for a diverse set of applications including connected communities, smart environments, vehicle autonomy, asset tracking, medical services, military systems, and crowd sensing. The coming years will see the emergence of network localization and navigation in challenging environments with sub-meter accuracy and minimal infrastructure requirements. We will discuss the limitations of traditional positioning and move on to the key enablers of high-accuracy location awareness: wideband transmission and cooperative processing. Topics covered will include fundamental bounds, cooperative algorithms for 5G and B5G standardized scenarios, and network experimentation. Fundamental bounds serve as performance benchmarks and as a tool for network design. Cooperative algorithms achieve dramatic performance improvements compared to traditional non-cooperative positioning. To harness these benefits, system designers must consider realistic operational settings; thus, we present the performance of cooperative localization based on measurement campaigns.
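A basic non-cooperative positioning step, of the kind the tutorial builds on, is least-squares localization from noisy range measurements to known anchors; the geometry and noise level below are illustrative assumptions:

```python
import numpy as np

# Known anchor positions (e.g., base stations) and the unknown node.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
node = np.array([3.0, 7.0])

rng = np.random.default_rng(5)
# Noisy range measurements from each anchor to the node.
ranges = np.linalg.norm(anchors - node, axis=1) + 0.05 * rng.normal(size=4)

# Linearize d_i^2 = |p|^2 - 2 a_i.p + |a_i|^2 against the first anchor
# to obtain a linear system A p = b, solvable by least squares.
a0, d0 = anchors[0], ranges[0]
A = 2 * (anchors[1:] - a0)
b = (d0**2 - ranges[1:]**2
     + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(est, 1))
```

Cooperative localization generalizes this: nodes also measure ranges to each other and jointly solve a larger system, which is where the dramatic accuracy gains the abstract mentions come from.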
TUT-17 : Age of Information in Wireless Networks: Fundamentals and Applications
Howard Yang (ZJU-UIUC Institute & University of Illinois at Urbana Champaign (UIUC), China)
Nikolaos Pappas (Linköping University, Sweden)
Tony Q.S. Quek (Singapore University of Technology and Design, Singapore)
Xijun Wang (Sun Yat-sen University, China)
Chao Xu (Northwest A&F University, China)
This tutorial aims to present current research efforts on the analysis, optimization, and applications of the age of information (AoI) metric. AoI has recently been introduced to quantify and evaluate information freshness in wireless networks. In this tutorial, we will provide comprehensive coverage including the definition and applications of AoI, queueing-theory-based AoI analysis, age-oriented multiuser scheduling policies, spatiotemporal models for assessing AoI statistics in large-scale wireless networks, and reinforcement learning based approaches that optimize AoI in wireless systems. Representative works in these areas will be discussed during the tutorial.
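The AoI metric itself is simple: at time t the age is t - g(t), where g(t) is the generation time of the freshest update received so far, yielding the familiar sawtooth over time. A small numerical sketch (the update pattern is an illustrative assumption):

```python
import numpy as np

def average_aoi(gen_times, recv_times, horizon, dt=0.001):
    """Time-average age of information: at time t the age is t - g(t),
    where g(t) is the generation time of the freshest received update."""
    t = np.arange(0.0, horizon, dt)
    age = np.empty_like(t)
    freshest = -np.inf
    idx = 0
    for k, now in enumerate(t):
        # Fold in every update delivered by time `now`.
        while idx < len(recv_times) and recv_times[idx] <= now:
            freshest = max(freshest, gen_times[idx])
            idx += 1
        age[k] = now - freshest if freshest > -np.inf else np.nan
    return np.nanmean(age)

# Updates generated every 2 s, each delivered after a 0.5 s network delay.
gen = np.arange(0.0, 20.0, 2.0)
recv = gen + 0.5
print(round(average_aoi(gen, recv, horizon=20.0), 2))
```

Note that AoI depends on both the sampling interval and the delivery delay, which is why freshness-aware scheduling differs from pure throughput or latency optimization.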
TUT-18 : Holographic Networking: A New Frontier in Communication
Ian F. Akyildiz (Georgia Institute of Technology, USA)
Rui Dai (University of Cincinnati, USA)
Chen Chen (University of Central Florida, USA)
Pu Wang (University of North Carolina at Charlotte, USA)
Holographic telepresence is an emerging technology that projects full-motion, real-time, free-viewpoint, and high-resolution 3D volumetric humans and objects into remote locations. By making objects and people at a different location appear right in front of you, holographic telepresence breaks physical boundaries and revolutionizes how people communicate with each other and interact with the physical world. Despite its great potential, the realization of holographic telepresence faces the challenge of delivering extremely high volumes of 3D point cloud data in real time under inherent bandwidth constraints. To address this challenge, we envision that holographic networking will become a new frontier in communication, which demands joint exploration of emerging 3D point cloud compression, 3D human reconstruction, and real-time 3D video streaming. This tutorial has three objectives: i) give a systematic overview of holographic networking, its applications, and its research challenges; ii) explain the emerging technologies that enable efficient holographic networking, including new point cloud compression solutions, 3D video reconstruction via deep learning, and AI approaches for networking; iii) demonstrate the first holographic networking prototype that enables the joint design of compression, AI-based processing, and networking of holographic content. The audience of this tutorial will be researchers in communications, computer vision, multimedia, and AR/VR/MR.
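A first step in taming point-cloud bit rates is voxel-grid downsampling, which keeps one centroid per occupied voxel. This sketch (synthetic cloud and voxel size are assumptions) is only a stand-in for the compression solutions the tutorial covers:

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """Quantize points to a voxel grid and keep one centroid per occupied
    voxel -- a basic lossy point-cloud compression step."""
    idx = np.floor(points / voxel).astype(np.int64)
    # Flatten the 3-D voxel index into a single key for grouping.
    idx -= idx.min(axis=0)
    span = idx.max(axis=0) + 1
    keys = (idx[:, 0] * span[1] + idx[:, 1]) * span[2] + idx[:, 2]
    uniq, inverse, counts = np.unique(keys, return_inverse=True,
                                      return_counts=True)
    # Average all points that fall into the same voxel.
    centroids = np.zeros((len(uniq), 3))
    np.add.at(centroids, inverse, points)
    return centroids / counts[:, None]

rng = np.random.default_rng(6)
cloud = rng.random((50_000, 3))              # dense synthetic scan in a unit cube
compact = voxel_downsample(cloud, voxel=0.05)
print(len(cloud), "->", len(compact), "points")
```

Real codecs such as MPEG's point-cloud compression go much further (octrees, attribute coding, inter-frame prediction), but the rate-distortion trade-off starts from exactly this kind of spatial quantization.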
TUT-19 : Wireless Time Sensitive Networking: next generation wireless for time-critical performance
Dave A Cavalcanti (Intel Corporation, USA)
Vivek Jain (Robert Bosch LLC, USA)
Emerging applications demand more than high throughput and low latency from wireless networks. There is significant interest in time-sensitive applications such as mobile robots, autonomous vehicles, aerospace, Industry 4.0, industrial IoT, mobile gaming, and extended reality (XR). These applications need precise time, determinism, worst-case low-latency guarantees, and extremely high reliability. Despite the advances in Ultra-Reliable Low Latency Communications (URLLC) defined in 5G and the efficiency and latency enhancements of Wi-Fi 6/6E, wireless technologies remain far from what is achievable with Time-Sensitive Networking (TSN) over wired media (Ethernet) in terms of timeliness, ultra-low latency guarantees (microsecond level), and reliability. This tutorial will address time-critical application requirements and related innovations in next generation wireless systems, including cellular (beyond 5G and 6G) and Wi-Fi (802.11be/Wi-Fi 7 and future Wi-Fi 8) standards. The tutorial will highlight the practical implementation challenges, beyond standards development, that the industry must solve to meet time-critical requirements, such as interference management, network configuration, cross-device coordination, latency-reliability optimized scheduling, and orchestration of time-critical computing and networking resources. The presentation will include an outlook on evaluation methodologies, tools, testbeds, and industry ecosystem activities required for scalable market adoption of wireless in time-critical systems.
TUT-20 : Near-Field Wideband XL-MIMO for 6G: Challenges, Solutions, and Opportunities
Linglong Dai (Tsinghua University, China)
Yonina C. Eldar (Weizmann Institute of Science, Israel)
Compared with massive MIMO in current 5G systems, extremely large-scale MIMO (XL-MIMO), where the antenna number is further increased, is a promising technology to achieve the very high spectral efficiency of Kbps/Hz targeted by future 6G communications. However, the change from massive MIMO for 5G to XL-MIMO for 6G not only means an increase in antenna number, but also leads to a fundamental change in the electromagnetic field properties. In this tutorial, the near-field wideband effect, a new electromagnetic phenomenon distinguishing XL-MIMO from massive MIMO, will be discussed. Specifically, this tutorial will first introduce the background of XL-MIMO for 6G. Then, the near-field wideband effect for XL-MIMO will be explained, based on which we will show that this effect dramatically decreases the actual transmission rates of XL-MIMO systems. To address this new challenge, this tutorial will discuss how to develop advanced transmission techniques, including near-field wideband beamforming, near-field wideband channel estimation, and near-field wideband beam training, to make practical XL-MIMO work. Finally, future research opportunities for near-field wideband XL-MIMO will be discussed, such as channel modeling and system performance limits, intelligent signal processing algorithms, waveform design, hybrid-field communications, continuous-aperture MIMO (CAP-MIMO), novel multiple access schemes, and hardware prototyping.
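A common rule of thumb for where the near field ends is the Rayleigh distance 2D²/λ for an aperture of size D. As the sketch below shows (array size and carrier frequencies are assumed for illustration), large apertures at high frequencies push this boundary to tens or hundreds of meters, which is why near-field effects matter for XL-MIMO:

```python
def rayleigh_distance(aperture_m: float, freq_hz: float) -> float:
    """Classical near/far-field boundary 2*D^2/lambda for an aperture D."""
    wavelength = 3e8 / freq_hz
    return 2 * aperture_m**2 / wavelength

# A 0.5 m array: the near-field region grows with the carrier frequency.
for f in [3.5e9, 30e9, 100e9]:
    d = rayleigh_distance(0.5, f)
    print(f"{f / 1e9:5.1f} GHz -> near field extends to {d:7.1f} m")
```

Inside this boundary the wavefront curvature is non-negligible, so the usual planar-wave (far-field) array response no longer holds and beams focus on points rather than directions.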
TUT-21 : MetaEverything: Ubiquitous Sensing and Communications Aided by Intelligent Meta-Surfaces
Boya Di (Peking University, China)
Hongliang Zhang (Princeton University, USA)
Lingyang Song (Peking University, China)
Zhu Han (University of Houston, USA)
Future wireless networks are trending towards intelligent communication and sensing systems that support a variety of applications requiring high data rates, low hardware cost, and fine-resolution sensing. Recently developed meta-surfaces provide an efficient means to reshape and control the electromagnetic characteristics of the environment, which can be exploited to enhance both communication and sensing performance. In this tutorial, we will first provide a general introduction to intelligent meta-surfaces along with state-of-the-art research in different areas. We then introduce two different meta-surfaces, i.e., intelligent reflective and omni-directional meta-surfaces, together with their system design, optimization, and prototypes. Three types of meta-surface based applications are introduced and implemented: cellular communications, RF sensing (including localization), and meta-surface IoT sensors for simultaneous sensing and communications. Related design, analysis, optimization, and signal processing techniques will be presented. Implementation issues, together with our developed prototypes and experiments, will also be discussed. Formalized analysis of several up-to-date challenges and technical details of system design will be provided for the different applications.
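A minimal sketch of how a reflective meta-surface "reshapes" the environment: each element applies a phase shift, and choosing the phases to co-align all cascaded base-station-surface-user paths yields a coherent power gain over a random configuration. This is the textbook phase-alignment rule, not code from the tutorial; channels and element count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64  # number of meta-surface elements (illustrative)

# Rayleigh-fading BS->surface (h) and surface->user (g) channel coefficients.
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
g = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Optimal continuous phase shift per element: cancel the phase of the
# cascaded coefficient h_n * g_n so all N paths add coherently.
phi = -np.angle(h * g)
aligned_power = np.abs(np.sum(h * np.exp(1j * phi) * g)) ** 2

# Baseline: an unconfigured surface with random phase shifts.
random_power = np.abs(np.sum(h * np.exp(1j * rng.uniform(0, 2 * np.pi, N)) * g)) ** 2

print(f"aligned vs. random received power: {aligned_power / random_power:.1f}x")
```

The aligned power grows on the order of N², which is the scaling argument usually given for why even passive surfaces can meaningfully boost both communication and sensing links.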
TUT-22 : Wireless Communications for Federated Learning
Kaibin Huang (The University of Hong Kong, Hong Kong)
Osvaldo Simeone (King's College London, United Kingdom (Great Britain))
Mingzhe Chen (Princeton University, USA)
Zhaohui Yang (University College London, United Kingdom (Great Britain))
Traditional machine learning is centralized in the cloud (data centers). Recently, security concerns and the availability of abundant data and computation resources in wireless networks have been pushing the deployment of learning algorithms towards the network edge. This has led to the emergence of a fast-growing area called edge learning, which integrates two originally decoupled areas: wireless communication and machine learning. It is widely expected that advancements in edge learning will provide a platform for implementing edge artificial intelligence (AI) in 5G-and-beyond systems and for solving large-scale problems in our society, ranging from autonomous driving to personalized healthcare. One of the most promising edge learning algorithms is the emerging federated learning framework, which features distributed learning over many wireless devices, coordinated by edge servers, to cooperatively train a large-scale AI model using local data and CPUs/GPUs. The iterative learning process involves repeated downloading and uploading of high-dimensional (millions to billions of) model parameters or their updates by tens to hundreds of devices. This generates enormous data traffic, placing a heavy burden on already congested radio access networks. The training problem cannot be efficiently solved using traditional wireless techniques that target rate maximization and are decoupled from learning.
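The iterative download/train/upload loop described above can be sketched as federated averaging on a toy problem. The task, dimensions, and hyperparameters below are illustrative assumptions, not from the tutorial; the point is that every round moves full model vectors between K devices and the server, which is exactly the traffic the abstract flags:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy linear-regression task split across K devices (illustrative setup).
K, n_local, dim = 10, 50, 5
w_true = rng.standard_normal(dim)
data = []
for _ in range(K):
    X = rng.standard_normal((n_local, dim))
    y = X @ w_true + 0.01 * rng.standard_normal(n_local)
    data.append((X, y))

def local_update(w, X, y, lr=0.05, epochs=5):
    """Each device refines the broadcast model on its own data and CPU/GPU."""
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Federated averaging: broadcast (download), local training, then the
# server averages the uploaded models -- the communication-heavy step.
w = np.zeros(dim)
for rnd in range(30):
    local_models = [local_update(w, X, y) for X, y in data]
    w = np.mean(local_models, axis=0)

print("model error:", np.linalg.norm(w - w_true))
```

Each of the 30 rounds ships K + 1 copies of the model over the air; with real models of millions of parameters, that per-round cost is what motivates the learning-aware wireless techniques the tutorial covers.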
TUT-23 : Openness in Radio Access Network Design in 6G - the ORAN Concept
Adrian Kliks (Poznan University of Technology, Poland)
Marin Dryjanski (RIMEDO Labs, Poland)
Łukasz Kułacz (Poznan University of Technology, Poland)
Currently, one of the hot topics in the telecom world is Open RAN. In the legacy way of providing a Radio Access Network (RAN), there is a single black box whose internal interfaces are closed and in the hands of one vendor. Open RAN, known as "O-RAN", is defined by the O-RAN Alliance, an entity whose mission is "to re-shape the RAN industry towards more intelligent, open, virtualized and fully interoperable mobile networks". Control of the radio network is delegated to an external entity known as the RAN Intelligent Controller (RIC), which enables optimization of the network and its radio resources. This is where the intelligence sits, by means of artificial intelligence (AI) models for radio network automation. The radio network control algorithms incorporated within the RIC framework are known as "xApps"; they are independent of the RIC and may be provided by a third party. The proposed tutorial addresses various aspects of O-RAN design, both theoretical and practical (including a virtual lab), starting with an introduction to the topic, moving through the prospective architecture and interfaces and the associations involved, and finishing with practical implementations of selected xApps.
TUT-24 : Contactless Health Monitoring Using Wireless Signals and Cameras
Shiwen Mao (Auburn University, USA)
Xuyu Wang (California State University, Sacramento, USA)
Wenjin Wang (Eindhoven University of Technology, The Netherlands)
Contactless health monitoring based on wireless sensors and cameras is an emerging research topic in the Internet of Things (IoT) field, especially against the background of COVID-19. Various contactless sensors, such as optical cameras and radio-frequency, acoustic, capacitive, and magnetic sensors, can be exploited to measure physiological and activity signals from the human face and body to assess health condition. Wireless communications, signal processing, and AI techniques are the essential steps driving these measurements. The novel combination of wireless signals and cameras enables multi-modal sensing, which further improves monitoring performance and the understanding of human health informatics (e.g., public health informatics, big data analysis, smart clinical alarms). In this tutorial, we will provide an overview of the latest developments and applications in contactless health monitoring, with an in-depth introduction to the core techniques in three technical parts: 1) camera-based vital sign and activity monitoring, 2) wireless-signal-based vital sign and activity monitoring, and 3) multimodal learning of vision and RFID for health monitoring and pose estimation. Finally, we will conclude the tutorial by summarizing current techniques and future directions in the fusion of wireless signals and cameras for health monitoring systems.
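A common building block behind both the camera-based and wireless-signal-based parts is spectral estimation of a quasi-periodic vital sign. The sketch below recovers a breathing rate from a noisy synthetic trace (standing in for a camera pixel-intensity or Wi-Fi CSI amplitude time series); the 0.25 Hz rate, sampling rate, and noise level are assumptions for illustration:

```python
import numpy as np

# Synthetic contactless trace: 0.25 Hz breathing motion (15 breaths/min)
# buried in noise, mimicking a recovered camera or RF amplitude signal.
fs = 20.0                        # sampling rate (Hz), illustrative
t = np.arange(0, 60, 1 / fs)     # 60 s observation window
breath_hz = 0.25
noise = 0.5 * np.random.default_rng(1).standard_normal(t.size)
signal = np.sin(2 * np.pi * breath_hz * t) + noise

# Locate the dominant spectral peak inside the plausible breathing
# band (0.1-0.5 Hz), after removing the DC component.
spec = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
band = (freqs >= 0.1) & (freqs <= 0.5)
est_hz = freqs[band][np.argmax(spec[band])]

print(f"estimated rate: {60 * est_hz:.1f} breaths/min")
```

The 60 s window gives roughly 1/60 Hz resolution, i.e., about 1 breath/min, which is why longer observation windows (or parametric estimators) are used when finer resolution is needed.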
TUT-25 : Machine Learning for AI-Native Wireless Networks: Challenges and Opportunities
Walid Saad (Virginia Tech, USA)
Mehdi Bennis (Centre of Wireless Communications, University of Oulu, Finland)
This tutorial will provide a holistic treatment of machine learning for wireless network design. In particular, we first provide a comprehensive treatment of the fundamentals of machine learning and artificial neural networks, one of the most important pillars of machine learning. After this substantial introduction to the basics, we present a classification of the various types of neural networks, including feed-forward neural networks, recurrent neural networks, spiking neural networks, and deep neural networks. For each type, we introduce their basic components, training processes, and use cases, with specific example neural networks. Then, we overview a broad range of wireless applications that leverage neural network designs, including spectrum management, multi-radio-access-technology cellular networks, wireless virtual reality, mobile edge computing and caching, drone-based communications, and the Internet of Things. For each application, we first outline the rationale for applying machine learning. Then, we overview the challenges and opportunities brought forward by the use of neural networks in that specific wireless application. We complement this overview with a detailed example drawn from the state of the art. We conclude with an overview of future work in this area.
TUT-26 : Model-based deep learning for communications
Nir Shlezinger (Ben-Gurion University of the Negev, Israel)
Yonina C. Eldar (Weizmann Institute of Science, Israel)
Recent years have witnessed dramatically growing interest in machine learning (ML). These data-driven trainable structures have demonstrated unprecedented success in various applications, including computer vision and speech processing. The benefits of ML-driven techniques over traditional model-based approaches are twofold. First, ML methods are independent of the underlying stochastic model and can thus operate efficiently in scenarios where this model is unknown or its parameters cannot be accurately estimated. Second, when the underlying model is extremely complex, ML is able to extract meaningful information from the observed data. Nonetheless, not every problem should be solved using deep neural networks (DNNs). In scenarios for which model-based algorithms exist and are feasible, these analytical methods are typically preferable over ML schemes due to their performance guarantees and possibly proven optimality. A notable area where model-based schemes are typically preferable, and whose characteristics differ fundamentally from conventional deep learning applications, is communications. In this tutorial, we present methods for combining DNNs with model-based algorithms. We will show hybrid model-based/data-driven implementations that arise from classical methods in communications, and demonstrate how fundamental classic techniques can be implemented without knowledge of the underlying statistical model, while achieving improved robustness to uncertainty.
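One widely cited way to combine model-based algorithms with trainable structures is deep unfolding: the iterations of a classical solver become the layers of a network, and quantities such as step sizes become learnable parameters. The sketch below unfolds plain gradient descent for a toy linear inverse problem (dimensions, step sizes, and the problem itself are our illustrative assumptions; here the "learnable" step sizes are simply hand-picked rather than trained):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem y = A x, an abstraction of tasks such as
# symbol detection or channel estimation. Dimensions are illustrative.
m, n = 80, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = rng.standard_normal(n)
y = A @ x_true

def unfolded_gd(y, A, step_sizes):
    """Model-based iterations 'unfolded' into layers: each layer applies
    one gradient step x <- x - mu_k * A^T (A x - y). In a deep-unfolding
    design the per-layer step sizes mu_k would be learned from data."""
    x = np.zeros(A.shape[1])
    for mu in step_sizes:
        x = x - mu * A.T @ (A @ x - y)
    return x

# Hand-picked (untrained) step sizes over 80 layers, for illustration only.
x_hat = unfolded_gd(y, A, step_sizes=[0.7] * 80)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```

The appeal of this hybrid design, as the abstract notes, is that the layer structure inherits the interpretability and convergence intuition of the classical algorithm, while training the per-layer parameters compensates for model mismatch.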
TUT-27 : Recent Advances and Future Challenges on 6G Wireless Channel Measurements and Models
Cheng-Xiang Wang (Southeast University & Purple Mountain Laboratories, China)
Haiming Wang (Southeast University & Purple Mountain Laboratories, China)
Jie Huang (Southeast University, China)
Harald Haas (The University of Strathclyde, United Kingdom (Great Britain))
For the design, performance evaluation, and optimization of wireless communication systems, channel measurements and realistic channel models with a good accuracy-complexity-generality trade-off are indispensable. The proposed tutorial is intended to offer a comprehensive and in-depth course to communication professionals and academics, addressing recent advances and future challenges in channel measurements and models for sixth-generation (6G) wireless systems. Network architecture and key technologies for 6G that will enable global coverage, all spectra, and full applications will first be discussed. Channel measurements and non-predictive channel models are then reviewed for challenging 6G scenarios and frequency bands, focusing on shortwave, millimeter wave, terahertz, and optical wireless communication channels under all-spectra scenarios; satellite, unmanned aerial vehicle, maritime, and underwater acoustic communication channels under global-coverage scenarios; and high-speed train, vehicle-to-vehicle, ultra-massive multiple-input multiple-output (MIMO), Industrial Internet of Things (IoT), reconfigurable intelligent surface (RIS), and orbital angular momentum (OAM) communication channels under full-application scenarios. New machine-learning-based predictive channel models will also be investigated. A general non-predictive 6G pervasive channel model will then be proposed, which is expected to serve as a baseline for future standardized 6G channel models. Future research challenges and trends for 6G channel measurements and models will be discussed.
TUT-28 : Reconfigurable Intelligent Surfaces: Electromagnetic models, design, and future directions
Alessio Zappone (University of Cassino and Southern Lazio, Italy)
Marco Di Renzo (CentraleSupelec-University, France)
As 5G networks take their final form, connectivity demands continue to increase exponentially and new services pose more constraints on the performance that end-users expect. A recent technological breakthrough that holds the potential to meet these demands is that of reconfigurable intelligent surfaces. We believe that a tutorial on the principles and latest approaches of reconfigurable intelligent surfaces for beyond 5G wireless communications will be of great value for both academics and industry practitioners.
TUT-29 : Public Blockchain: Theoretical Foundation and Applications
Wenbing Zhao (Cleveland State University, USA)
This tutorial is structured into three parts. In the first part, decentralized consensus is discussed. Decentralized consensus is the most important innovation brought by Bitcoin, the first public blockchain: it forms the theoretical foundation of, and truly defines, the public blockchain technology, and it is drastically different from traditional distributed consensus. More specifically, this part will introduce a general model for decentralized consensus. The Proof of Work and various Proof of Stake consensus approaches will be examined with respect to this model, and a critique of other Proof-of-X approaches will be provided. The second part of this tutorial elaborates on the characteristics of the blockchain technology, i.e., exactly what benefits the blockchain technology can bring to a system and what limitations the technology currently has. For example, when we say the blockchain is immutable, what exactly does that mean? The third part of this tutorial covers various blockchain-enabled applications, examined specifically with respect to which blockchain benefits they exploit.
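The Proof of Work approach mentioned in the first part can be illustrated with a minimal sketch: miners search for a nonce whose hash falls below a difficulty target, while anyone can verify a claimed solution with a single hash. The block text and difficulty below are illustrative; real systems add Merkle roots, previous-block hashes, and adaptive difficulty:

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Search for a nonce such that SHA-256(data + nonce) falls below a
    target; this brute-force search is the 'work' that makes rewriting
    history computationally expensive."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty_bits: int) -> bool:
    """Verification costs a single hash -- cheap, unlike mining."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < 2 ** (256 - difficulty_bits)

block = "block #1: Alice pays Bob"  # illustrative block payload
nonce = mine(block, difficulty_bits=16)
print("found nonce:", nonce, "valid:", verify(block, nonce, 16))
```

The asymmetry on display here (expensive to produce, trivial to check) is what lets mutually distrusting nodes agree on a chain, and it also frames the critiques of alternative Proof-of-X schemes the tutorial discusses.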
TUT-30 : On the Road to Quantum Communications
Lajos Hanzo (University of Southampton, United Kingdom (Great Britain))
Kwang-Cheng Chen (University of South Florida, USA)
Moore's law has indeed prevailed since Gordon Moore outlined his empirical rule of thumb in 1965, but this trend means the scale of integration is set to depart from classical physics, entering nano-scale integration, where the postulates of quantum physics must be obeyed. The quest for quantum-domain communication solutions was inspired by Feynman's revolutionary idea in 1985: particles such as photons or electrons might be relied upon for encoding, processing, and delivering information. In light of these trends, it is extremely timely to build interdisciplinary momentum in the area of quantum communications, where there is an abundance of open problems for a broad community to solve collaboratively. In this workshop-style interactive presentation we will address the following issues: 1) We commence by highlighting the nature of the quantum channel, followed by techniques for mitigating the effects of quantum decoherence using quantum codes. 2) We then bridge the subject areas of large-scale search problems in wireless communications and exploit the benefits of quantum search algorithms in, for example, multi-user detection, joint channel estimation and data detection, localization, and network routing problems. 3) We survey advances in quantum key distribution networks.
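The quantum search primitive referenced in point 2 is Grover's algorithm, which finds a marked item among N with O(√N) oracle calls instead of O(N). A small classical state-vector simulation makes the mechanism concrete; the search space size and marked index below are illustrative choices:

```python
import numpy as np

# Classical state-vector simulation of Grover's search over N = 2^8 items.
n_qubits = 8
N = 2 ** n_qubits
marked = 37                          # index of the 'solution' (illustrative)

state = np.full(N, 1 / np.sqrt(N))   # uniform superposition |s>

# Optimal number of Grover iterations is about (pi/4) * sqrt(N).
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[marked] *= -1              # oracle: flip the sign of the solution
    state = 2 * state.mean() - state # diffusion: inversion about the mean

prob = state[marked] ** 2
print(f"{iterations} iterations, success probability {prob:.4f}")
```

After only ~12 iterations the measurement probability concentrates on the marked index, which is the quadratic speedup that motivates applying quantum search to the multi-user detection and routing problems listed above.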
TUT-31 : A Primer on Integrating Terrestrial and Non-terrestrial Networks
Giovanni Geraci (Universitat Pompeu Fabra, Spain)
Adrian Garcia-Rodriguez (Huawei Technologies, France)
Mustafa A Kishk (King Abdullah University of Science and Technology, Saudi Arabia)
Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)
Next-generation wireless networks are envisioned to break the boundaries of the current ground-focused paradigm and fully embrace aerial and spaceborne communications. In a quest for anything, anytime, anywhere connectivity, this vertical revolution entails defining a new intelligently integrated network architecture. The technological and societal implications would be of the greatest long-term significance, including connecting the unconnected and hyper-connecting the already-connected. The former will stimulate the economy in poor communities, providing digital inclusion, access to remote learning and healthcare, and improved farming productivity. The latter will enable new services in the cities of the next decade, including support for 3D aerial highways and advanced urban air mobility. However, important hurdles must be overcome to optimally orchestrate the integrated ground-air-space mobile network of tomorrow. This tutorial takes a holistic approach to integrating terrestrial and non-terrestrial mobile communications. We will discuss: - Fresh 3GPP updates on non-terrestrial networks, including deployments, architecture, and channel modeling. - Tangible performance results and novel research questions on next-generation aerial and spaceborne communications. - Design challenges in jointly orchestrating future 6G ground-air-space mobile networks for direct access, relayed fixed broadband, and IoT support. - Drone-based solutions to ensure stable coverage in rural areas, including tethered and laser-powered UAVs.
TUT-32 : Security and Privacy in 6G: The Vision Towards Reality
Madhusanka Liyanage (University College Dublin, Ireland & University of Oulu, Finland)
5G is a promising technology that supports different verticals and novel applications such as the Industrial Internet of Things (IIoT), smart cities, autonomous vehicles, remote surgery, and virtual and augmented reality. However, these verticals have a diverse set of network connectivity requirements, and it is sometimes challenging to deliver customized services for each vertical over a typical wide-area 5G network. Thus, the operation of Private 5G Operator (P5GO) / Local 5G Operator (L5GO) networks, or private 5G networks, is considered a viable option for tackling this challenge. A private 5G network is a localized small-cell network that can offer tailored service delivery for a localized environment. The adoption of network softwarization in 5G allows vertical owners to deploy and operate such private 5G networks. However, deploying private 5G networks raises various issues and challenges related to the management of subscribers, roaming users, spectrum, security, and infrastructure. In this tutorial, we will discuss these issues and challenges and propose possible solutions using blockchain-based platforms.