
it - Information Technology 62(5-6) - August 2020

  • Journal article
    Streamlining IoT system development with open standards
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Korkan, Ege; Kaebisch, Sebastian; Steinhorst, Sebastian
    The Internet of Things (IoT) is bringing Internet connectivity to a wide range of devices, resulting in a growing number of products for smart homes, Industry 4.0 and smart cities. Even though IoT aims to reach an increasing number of devices and to scale across different domains, a lack of interoperability prevents this goal from being attained. Recent standardization efforts by the World Wide Web Consortium (W3C) address the interoperability problem by means of the Thing Description (TD), which allows humans and machines to understand the capabilities and communication interfaces of IoT devices. In this paper, we show a more systematic and streamlined development of IoT devices and systems that relies on the TD standard. We introduce three complementary methods that can be applied independently at different stages of development, or together as a framework to streamline the development of IoT devices and systems. As a result of using the TD standard, interoperability between IoT devices of various stakeholders is ensured from the early stages and the time to market is reduced.
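A Thing Description is a JSON-based document describing a device's capabilities and communication interfaces. The following is a minimal sketch of a TD-shaped document in Python; the top-level member names follow the W3C TD 1.0 recommendation, while the lamp device, its URL, and the security scheme are illustrative assumptions, not taken from the paper.

```python
import json

# Minimal TD-shaped document for a hypothetical lamp. The member names
# ("@context", "title", "security", "securityDefinitions", "properties")
# follow the W3C TD 1.0 recommendation; the device itself is made up.
thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "title": "ExampleLamp",
    "securityDefinitions": {"nosec_sc": {"scheme": "nosec"}},
    "security": ["nosec_sc"],
    "properties": {
        "status": {
            "type": "string",
            "forms": [{"href": "http://lamp.example.com/status"}],
        }
    },
}

def has_required_td_fields(td):
    """Check the mandatory top-level TD members are present."""
    return all(k in td for k in ("@context", "title", "security", "securityDefinitions"))

print(json.dumps(thing_description, indent=2))
```

Because both humans and machines consume the same document, a consumer can discover the `status` property and its HTTP form without device-specific code.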
  • Journal article
    Exploring hardware accelerator offload for the Internet of Things
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Cooke, Ryan A.; Fahmy, Suhaib A.
    The Internet of Things is manifested through a large number of low-capability connected devices. This means that for many applications, computation must be offloaded to more capable platforms. While these have typically been cloud datacenters accessed over the Internet, this is not feasible for latency-sensitive applications. In this paper we investigate the interplay between three factors that contribute to overall application latency when offloading computations in IoT applications. First, different platforms can reduce computation latency by differing amounts. Second, these platforms can be traditional server-based or emerging network-attached accelerators, which exhibit differing data ingestion latencies. Finally, where these platforms are deployed in the network has a significant impact on the network traversal latency. All these factors contribute to overall application latency, and hence to the efficacy of computational offload. We show that network-attached acceleration scales better to more distant network locations and smaller base computation times than traditional server-based approaches.
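The three latency factors above can be sketched as a toy additive model. All numbers below are illustrative assumptions chosen to show the shape of the trade-off, not measurements from the paper.

```python
# Toy model of the three components of offload latency discussed above:
# accelerated computation, data ingestion, and network traversal.

def total_latency(base_compute_ms, speedup, ingestion_ms, hops, per_hop_ms):
    """Overall offload latency: reduced compute + ingestion + network path."""
    return base_compute_ms / speedup + ingestion_ms + hops * per_hop_ms

# Server-based accelerator: higher ingestion overhead (host stack, PCIe)
# and typically deployed deeper in the network (more hops).
server = total_latency(base_compute_ms=10, speedup=5,
                       ingestion_ms=2.0, hops=6, per_hop_ms=0.5)

# Network-attached accelerator: data ingested directly off the wire,
# deployable closer to the edge (fewer hops).
attached = total_latency(base_compute_ms=10, speedup=5,
                         ingestion_ms=0.2, hops=2, per_hop_ms=0.5)

print(f"server-based: {server:.1f} ms, network-attached: {attached:.1f} ms")
```

With these assumed numbers, shrinking the base computation makes ingestion and traversal dominate, which is exactly the regime where the network-attached option pulls ahead.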
  • Journal article
    The power of locality: Exploring the limits of randomness in distributed computing
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Maus, Yannic
    Many modern systems are built on top of large-scale networks like the Internet. This article provides an overview of a dissertation [29] that addresses the complexity of classic graph problems like the vertex coloring problem in such networks. It has been known for a long time that randomization helps significantly in solving many of these problems, whereas the best known deterministic algorithms have been exponentially slower. In the first part of the dissertation we use a complexity theoretic approach to show that several problems are complete in the following sense: An efficient deterministic algorithm for any complete problem would imply an efficient algorithm for all problems that can be solved efficiently with a randomized algorithm. Among the complete problems is a rudimentary looking graph coloring problem that can be solved by a randomized algorithm without any communication. In further parts of the dissertation we develop efficient distributed algorithms for several problems where the most important problems are distributed versions of integer linear programs, the vertex coloring problem and the edge coloring problem. We also prove a lower bound on the runtime of any deterministic algorithm that solves the vertex coloring problem in a weak variant of the standard model of the area.
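The "coloring without communication" idea mentioned above can be illustrated with a one-round sketch: every node independently draws a color from a shared palette that is polynomially larger than the graph, so no messages are needed and the result is proper with high probability. This is an illustrative simplification; the dissertation's actual problem statement is more refined.

```python
import random

# Zero-communication randomized coloring sketch: each node draws a color
# independently. With a palette much larger than the number of nodes,
# any fixed edge is monochromatic only with tiny probability.

def random_coloring(nodes, palette_size, rng):
    return {v: rng.randrange(palette_size) for v in nodes}

def is_proper(edges, coloring):
    """A coloring is proper if no edge has both endpoints the same color."""
    return all(coloring[u] != coloring[v] for u, v in edges)

nodes = range(6)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]  # a 6-cycle
coloring = random_coloring(nodes, palette_size=len(nodes) ** 3,
                           rng=random.Random(1))
print("proper:", is_proper(edges, coloring))
```

The contrast drawn in the dissertation is that such problems are trivial for randomized algorithms yet, for deterministic ones, sit at the heart of the derandomization question.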
  • Journal article
    Temporal-based intrusion detection for IoV
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Hamad, Mohammad; Hammadeh, Zain A. H.; Saidi, Selma; Prevelakis, Vassilis
    The Internet of Vehicles (IoV) is an extension of Vehicle-to-Vehicle (V2V) communication that can improve vehicles’ fully autonomous driving capabilities. However, these communications are vulnerable to many attacks. Therefore, it is critical to provide run-time mechanisms to detect malware and stop attackers before they manage to gain a foothold in the system. Anomaly-based detection techniques are convenient and capable of detecting off-nominal component behavior caused by zero-day attacks. One critical aspect when using anomaly-based techniques is ensuring the correct definition of the observed component’s normal behavior. In this paper, we propose using a task’s temporal specification as a baseline to define its normal behavior and to identify temporal thresholds that give the system the ability to predict malicious tasks. By applying our solution to one use case, we obtained temporal thresholds 20–40 % lower than those usually used to alert the system to security violations. Using our bounds ensures the early detection of off-nominal temporal behavior and provides the system with a sufficient amount of time to initiate recovery actions.
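The core mechanism can be sketched as a threshold check on observed task execution times against a bound tightened below the temporal specification. The 30 % tightening factor and the example timings below are illustrative assumptions within the 20–40 % range the abstract reports, not values from the paper.

```python
# Sketch of the temporal-baseline idea: alarm when a task's observed
# execution time exceeds a threshold tightened below its specified
# worst-case execution time (WCET), so detection happens early.

def temporal_threshold(wcet_ms, tightening=0.30):
    """Alarm threshold tightened below the worst-case execution time."""
    return wcet_ms * (1.0 - tightening)

def is_anomalous(observed_ms, wcet_ms, tightening=0.30):
    return observed_ms > temporal_threshold(wcet_ms, tightening)

# A task specified with a 10 ms WCET alarms already at 7 ms,
# leaving headroom to initiate recovery before the worst case.
print(is_anomalous(observed_ms=7.5, wcet_ms=10.0))  # above the tightened bound
print(is_anomalous(observed_ms=5.0, wcet_ms=10.0))  # within normal behavior
```

The design choice is the trade-off the abstract describes: a tighter threshold buys reaction time at the cost of a smaller margin before false alarms.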
  • Journal article
    A survey on time-sensitive resource allocation in the cloud continuum
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Ramanathan, Saravanan; Shivaraman, Nitin; Suryasekaran, Seima; Easwaran, Arvind; Borde, Etienne; Steinhorst, Sebastian
    Artificial Intelligence (AI) and Internet of Things (IoT) applications are rapidly growing in today’s world, where they are continuously connected to the Internet and process, store and exchange information among devices and the environment. Cloud and edge platforms are crucial to these applications due to their inherently compute-intensive workloads and the resource-constrained nature of the devices. One of the foremost challenges in cloud and edge resource allocation is the efficient management of computation and communication resources to meet the performance and latency guarantees of the applications. Numerous research studies have been carried out to address this intricate problem. In this paper, we review the current state-of-the-art resource allocation techniques for the cloud continuum, in particular those that consider time-sensitive applications. Furthermore, we present the key challenges in the resource allocation problem for the cloud continuum, a taxonomy to classify the existing literature, and the potential research gaps.
  • Journal article
    Internet of Things
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Steinhorst, Sebastian
    Article Internet of Things was published on December 1, 2020 in the journal it - Information Technology (volume 62, issue 5-6).
  • Journal article
    Efficient machine learning for attack detection
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Wressnegger, Christian
    Detecting and fending off attacks on computer systems is an enduring problem in computer security. In light of a plethora of different threats and the growing automation used by attackers, we are in urgent need of more advanced methods for attack detection. Manually crafting detection rules is by no means feasible at scale, and automatically generated signatures often lack context, such that they fall short in detecting slight variations of known threats. In the thesis “Efficient Machine Learning for Attack Detection” [35], we address the necessity of advanced attack detection. For the effective application of machine learning in this domain, periodic retraining over time is crucial. We show that with the right data representation, efficient algorithms for mining substring statistics, and implementations based on probabilistic data structures, the underlying model can be trained in linear time, establishing a higher degree of automation for defenses.
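The data-representation idea can be sketched as hashed n-gram statistics: slide a window over the byte stream and hash each substring into a fixed number of buckets, giving a memory-bounded, linear-time feature map. This is a generic sketch of that family of techniques, not the thesis's specific algorithms; the payloads and bucket count are illustrative assumptions.

```python
import hashlib
from collections import Counter

# Linear-time substring statistics: one pass over the data, one hash
# per n-gram, counts kept in a fixed-size bucket space so memory does
# not grow with the vocabulary of observed substrings.

def hashed_ngram_counts(data: bytes, n: int = 3, buckets: int = 1 << 16) -> Counter:
    counts = Counter()
    for i in range(len(data) - n + 1):
        gram = data[i:i + n]
        bucket = int.from_bytes(
            hashlib.blake2b(gram, digest_size=4).digest(), "big") % buckets
        counts[bucket] += 1
    return counts

benign = hashed_ngram_counts(b"GET /index.html HTTP/1.1")
suspicious = hashed_ngram_counts(b"GET /index.html?id=' OR '1'='1")

# Buckets present only in the suspicious payload point at substrings
# never seen in benign traffic -- the kind of signal a learned model
# can weight without hand-crafted rules.
novel = set(suspicious) - set(benign)
print(len(novel), "previously unseen n-gram buckets")
```

Bucketing trades exactness for bounded memory, which is what makes periodic retraining over large traffic volumes affordable.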
  • Journal article
    From transistor level to cyber physical/hybrid systems: Formal verification using automatic compositional abstraction
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Tarraf, Ahmad; Hedrich, Lars
    In this paper we present a methodology to automatically generate an accurate behavioral model from an analog circuit description. The current machine learning method is limited to circuits with up to 80 transistors, restricting our approach to small and mid-size circuit blocks due to a state explosion problem. However, if complex building blocks such as IoT systems are to be modeled, the approach must keep simulation and modeling times feasible. To solve this problem, we extend the current method with a compositional approach. The approach is illustrated with an example from the area of autonomous driving. Our method decomposes this large example into smaller building blocks and models each of them automatically. All models are then combined into a compositional hybrid automaton of the whole system. Compared to the original state space, the building blocks operate on smaller, reduced state spaces and hence drastically reduce the complexity. Using a back-transformation on the compositional automaton, all values of the original state space can be reconstructed. Moreover, we perform formal verification on the generated compositional automaton. Results from a meaningful example are presented and discussed.
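The complexity argument behind composition can be sketched numerically: a monolithic model lives on the product of the blocks' state spaces, while the compositional representation keeps one small model per block. The block names and state counts below are illustrative assumptions, not figures from the paper.

```python
# Sketch of why composition tames state explosion: the monolithic
# product state space grows multiplicatively in the number of blocks,
# the compositional representation only additively.

class BlockModel:
    def __init__(self, name, num_states):
        self.name = name
        self.num_states = num_states  # size of the abstracted local state space

def monolithic_size(blocks):
    """Product automaton over all blocks at once."""
    size = 1
    for b in blocks:
        size *= b.num_states
    return size

def compositional_size(blocks):
    """One reduced automaton per block, composed afterwards."""
    return sum(b.num_states for b in blocks)

blocks = [BlockModel("sensor", 50), BlockModel("adc", 80), BlockModel("controller", 120)]
print(monolithic_size(blocks), "vs", compositional_size(blocks))
```

The back-transformation mentioned in the abstract is what makes this reduction usable: values of the original state space are recovered from the composed, reduced one.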
  • Journal article
    Modeling advanced security aspects of key exchange and secure channel protocols
    (it - Information Technology: Vol. 62, No. 5-6, 2020) Günther, Felix
    Secure connections are at the heart of today’s Internet infrastructure, protecting the confidentiality, authenticity, and integrity of communication. Achieving these security goals is the responsibility of cryptographic schemes, more specifically two main building blocks of secure connections. First, a key exchange protocol is run to establish a shared secret key between two parties over a potentially insecure connection. Then, a secure channel protocol uses that shared key to securely transport the actual data to be exchanged. While security notions for classical designs of these components are well-established, recently developed and standardized major Internet security protocols like Google’s QUIC protocol and the Transport Layer Security (TLS) protocol version 1.3 introduce novel features for which supporting security theory is lacking. In my dissertation [20], which this article summarizes, I studied these novel and advanced design aspects, introducing enhanced security models and analyzing the security of deployed protocols. For key exchange protocols, my thesis introduces a new model for multi-stage key exchange to capture that recent designs for secure connections establish several cryptographic keys for various purposes and with differing levels of security. It further introduces a formalism for key confirmation, reflecting a long-established practical design criterion which, however, was lacking a comprehensive formal treatment so far. For secure channels, my thesis captures the cryptographic subtleties of streaming data transmission through a revised security model and addresses novel concepts for frequently updating key material for enhanced security through a multi-key channel notion. These models are then applied to study (and confirm) the security of the QUIC and TLS 1.3 protocol designs.
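The multi-stage idea can be sketched as a key schedule: one exchanged secret feeds a derivation step that produces several stage keys, each bound to a distinct label and purpose. The labels and the single-block HKDF-Expand-style construction below are illustrative; real protocols such as TLS 1.3 define their own, more elaborate key schedules.

```python
import hashlib
import hmac

# Sketch of multi-stage key derivation: distinct labels over the same
# shared secret yield independent-looking keys for different stages
# (e.g. early data vs. handshake vs. application data).

def expand(secret: bytes, label: bytes, length: int = 32) -> bytes:
    """Single-block HKDF-Expand-style step (assumes length <= 32)."""
    return hmac.new(secret, label + b"\x01", hashlib.sha256).digest()[:length]

shared_secret = b"\x42" * 32  # stand-in for a key-exchange output

stage_keys = {stage: expand(shared_secret, stage.encode())
              for stage in ("early", "handshake", "application")}

print(all(len(k) == 32 for k in stage_keys.values()))
```

Binding each key to its stage label is what lets a security model assign the stages differing guarantees, e.g. weaker ones for keys usable before the handshake completes.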
  • Journal article
    Frontmatter
    (it - Information Technology: Vol. 62, No. 5-6, 2020)
    Article Frontmatter was published on December 1, 2020 in the journal it - Information Technology (volume 62, issue 5-6).