Listing by keyword "Precision"
1 - 6 of 6
- Journal article: Approximating stochastic numbers to reduce latency (it - Information Technology: Vol. 64, No. 3, 2022). Kawaminami, Syoki; Watanabe, Yukino; Yamashita, Shigeru. Approximate Computing (AC) and Stochastic Computing (SC) have been studied as new computing paradigms to achieve energy-efficient designs for error-tolerant applications. The hardware cost of SC is generally small compared to that of AC, but SC has not been applied to as wide a range of applications as AC, because SC needs very long cycles to use the long random bit strings, called Stochastic Numbers (SNs), that are required to maintain the desired precision. To mitigate this disadvantage of SC, we propose a new idea for approximating numbers represented by SNs: using multiple SNs to represent one number. Our method can shorten the length of SNs drastically while keeping the precision level of conventional SNs. We study two specific cases in which we use two and three shorter bit strings to represent a single conventional SN, which we call dual-rail and triple-rail SNs, respectively. We also discuss the general case of using many SNs in place of a single conventional SN, and we compare triple-rail, dual-rail, and conventional SNs in terms of hardware overhead and calculation error. From this comparison, we conclude that our idea can be used to shorten the cycles necessary for SC.
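The length-versus-precision trade-off this abstract addresses is easiest to see in the basic unipolar SC encoding, where a value in [0, 1] is a bit stream whose fraction of 1s equals the value and multiplication reduces to a bitwise AND. The sketch below illustrates only this conventional single-stream encoding, not the authors' dual- or triple-rail scheme; the function names are illustrative.

```python
import random

rng = random.Random(0)  # fixed seed for a reproducible sketch

def encode_sn(p, n):
    """Encode probability p as a stochastic number: a length-n bit string
    where each bit is independently 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def decode_sn(bits):
    """Decode an SN: its value is the fraction of 1s in the stream."""
    return sum(bits) / len(bits)

def sn_multiply(a, b):
    """Unipolar SC multiplication is a bitwise AND of two independent streams."""
    return [x & y for x, y in zip(a, b)]

n = 4096  # long streams are needed to keep the estimate precise
a = encode_sn(0.5, n)
b = encode_sn(0.25, n)
prod = decode_sn(sn_multiply(a, b))  # ≈ 0.5 * 0.25 = 0.125
```

Shorter streams make `prod` noisier, which is exactly why conventional SC needs long cycles and why shortening SNs while preserving precision is valuable.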
- Journal article: Automatically Detecting and Mitigating Issues in Program Analyzers (Softwaretechnik-Trends Band 44, Heft 2, 2024). Mansur, Muhammad Numair. This dissertation tackles two major challenges that impede the incorporation of static analysis tools into software development workflows, despite their potential to detect bugs and vulnerabilities in software before deployment. The first challenge is unintentional unsoundness in program analyzers, such as SMT solvers and Datalog engines, which are susceptible to undetected soundness issues that can lead to severe consequences, particularly in safety-critical software. The dissertation presents novel, publicly available techniques that detected over 55 critical soundness bugs in these tools. The second challenge is balancing soundness, precision, and performance in static analyzers, which struggle with integration into diverse development scenarios because they cannot scale and adapt to different program sizes and resource constraints. To address this, the dissertation introduces an approach that automatically tailors abstract interpreters to specific code and resource conditions, and presents a method for horizontally scaling analysis tools on cloud-based platforms.
- Conference paper: Cooperative Android App Analysis with CoDiDroid (Software Engineering 2021, 2021). Pauck, Felix; Wehrheim, Heike. Novel Android app analysis tools, as well as improved versions of existing tools, are frequently proposed. These tools often tackle a single specific issue that cannot be handled by existing tools. Consequently, the best possible analysis should exploit the advantages of each and every tool. With CoDiDroid we present an analysis framework for combining analysis tools such that the best of each tool contributes to a more comprehensive and more precise cooperative analysis. Our experimental results show that CoDiDroid allows setting up cooperative analyses that are beneficial with respect to effectiveness, accuracy, and scalability.
- Workshop contribution: Noise over Fear of Missing Out (Mensch und Computer 2021 - Workshopband, 2021). Schleith, Johannes; Hristozova, Nina; Chechmanek, Brian; Bussey, Carolyn; Michalak, Leszek. Natural language processing (NLP) techniques for information extraction commonly face the challenge of extracting either ‘too much’ or ‘too little’ information from text. Extracting ‘too much’ means that a lot of the relevant information is captured, but a lot of irrelevant information, or ‘noise’, is extracted as well; this usually results in high recall but lower precision. Extracting ‘too little’ means that everything extracted is relevant, but not everything relevant is extracted, i.e., information is ‘missing’; this usually results in high precision but lower recall. In this paper we present an approach that combines quantitative and qualitative measures to evaluate end users’ experience with information extraction systems, in addition to standard statistical metrics, and to interpret users’ preference between these two failure modes. The method is applied in a case study of legal document review. Results from the case study suggest that legal professionals prefer seeing ‘too much’ over ‘too little’ when working on AI-assisted legal document review tasks. Discussion of these results positions User Experience (UX) involvement as a fundamental ingredient of NLP system design and evaluation.
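The recall/precision trade-off this abstract describes follows directly from the standard set-based definitions. The snippet below is a generic illustration of those definitions, not the paper's evaluation method:

```python
def precision_recall(retrieved, relevant):
    """Set-based precision and recall for an extraction result.

    precision = |retrieved ∩ relevant| / |retrieved|
    recall    = |retrieved ∩ relevant| / |relevant|
    """
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    return hits / len(retrieved), hits / len(relevant)

# Extracting 'too much': every relevant item is found, but half the
# extractions are noise -> recall 1.0, precision 0.5.
p, r = precision_recall({1, 2, 3, 4, 5, 6, 7, 8}, {1, 2, 3, 4})

# Extracting 'too little': everything extracted is relevant, but half of
# the relevant items are missed -> precision 1.0, recall 0.5.
p2, r2 = precision_recall({1, 2}, {1, 2, 3, 4})
```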
- Journal article: Precision Digital Health (Business & Information Systems Engineering: Vol. 66, No. 3, 2024). Baird, Aaron; Xia, Yusen. Accounting for individual and situational heterogeneity (i.e., precision) is now an important area of research and treatment in the field of medicine. This essay argues that precision should also be embraced within digital health artifacts, for example by designing digital health apps that tailor recommendations to individual user characteristics, needs, and situations rather than providing only generic advice. The challenge, however, is that little guidance is available for embracing precision when designing or researching digital health artifacts. The paper suggests that a shift toward precision in digital health will require embracing heterogeneous treatment effects (HTEs): variations in the effectiveness of a treatment, such as variations in effects for individuals of different ages. Embracing precision via HTEs is not trivial, however, and will require new approaches to the research and design of digital health artifacts. Thus, this essay seeks not only to define precision digital health, but also to offer suggestions as to where and how machine learning, deep learning, and artificial intelligence can be used to enhance the precision of interventions provisioned via digital health artifacts (e.g., personalized advice from mental-health wellbeing apps). The study emphasizes the value of applying emerging causal ML methods and generative AI features within digital health artifacts toward the goal of increasing the effectiveness of digitally provisioned interventions.
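The notion of a heterogeneous treatment effect can be made concrete with a toy simulation in which the same intervention helps one age group more than another. The data-generating model, effect sizes, and names below are invented for illustration and are not taken from the essay:

```python
import random

rng = random.Random(1)  # fixed seed for reproducibility

def outcome(age, treated):
    """Toy data-generating model: the intervention helps younger users
    more (effect 2.0) than older users (effect 0.5), plus small noise.
    This heterogeneity by age is the HTE."""
    effect = 2.0 if age < 40 else 0.5
    return (effect if treated else 0.0) + rng.gauss(0, 0.1)

def subgroup_ate(samples):
    """Average treatment effect within a subgroup:
    mean(treated outcomes) - mean(control outcomes)."""
    treated = [y for y, t in samples if t]
    control = [y for y, t in samples if not t]
    return sum(treated) / len(treated) - sum(control) / len(control)

# One treated and one control observation per age in each subgroup.
young = [(outcome(age, t), t) for age in range(20, 40) for t in (True, False)]
old = [(outcome(age, t), t) for age in range(40, 60) for t in (True, False)]

# subgroup_ate(young) is ~2.0 while subgroup_ate(old) is ~0.5: a pooled
# average would misstate the effect for both groups, which is why a
# precision intervention would tailor by age.
```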
- Journal article: Towards Confirmatory Process Discovery: Making Assertions About the Underlying System (Business & Information Systems Engineering: Vol. 61, No. 6, 2019). Janssenswillen, Gert; Depaire, Benoît. The focus in the field of process mining, and process discovery in particular, has thus far been on exploring and describing event data by means of models. Since the obtained models are often based directly on a sample of event data, the question of whether they also apply to the real process typically remains unanswered. As the underlying process is unknown in real life, there is a need for unbiased estimators to assess the system-quality of a discovered model and subsequently make assertions about the process. In this paper, an experiment is described and discussed to analyze whether existing fitness, precision, and generalization metrics can be used as unbiased estimators of system fitness and system precision. The results show that important biases exist, which currently makes it nearly impossible to objectively measure the ability of a model to represent the system.
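As rough intuition for the fitness and precision metrics discussed in this abstract, a trace-level simplification treats both the event log and the model as sets of traces. Real process-mining metrics operate on process models such as Petri nets and are considerably more involved; this sketch only conveys the two directions of the comparison:

```python
def trace_level_fitness(log, model):
    """Fraction of log traces the model can replay (a coarse, trace-level
    simplification of log fitness)."""
    return sum(1 for t in log if t in model) / len(log)

def trace_level_precision(log, model):
    """Fraction of model traces actually observed in the log (a coarse,
    trace-level simplification of precision)."""
    observed = set(log)
    return sum(1 for t in model if t in observed) / len(model)

log = [("a", "b", "c"), ("a", "b", "c"), ("a", "c", "b")]
# The model allows one trace never seen in the log, i.e. extra behavior.
model = {("a", "b", "c"), ("a", "c", "b"), ("a", "b", "b", "c")}

fitness = trace_level_fitness(log, model)      # 3/3 = 1.0
precision = trace_level_precision(log, model)  # 2/3, penalizing the extra trace
```

Both values here are computed against the observed log; the paper's point is that such sample-based measurements are biased estimators of the corresponding qualities with respect to the unknown underlying system.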