IBM Journal of Research and Development
Impact Factor: 0.79
Papers: 3,672
Sorted by: Newest
The United States is one of the most natural disaster-prone countries in the world. Since 1980, there have been 246 weather and climate disasters exceeding $1.6 trillion in remediation. Within the last decade, the frequency of disaster events and their costs are on the rise. Complicating the impact of natural disasters is the population shift to cities and coastal areas, which concentrate their effects. The need for governments and communities to prepare for, respond to, and recover from disaste...
Around 68.5 million people are currently forcibly displaced. The implementation and monitoring of international agreements, which are linked to the 2030 agenda (e.g., the Sendai Framework), require a standard set of metrics for internal displacement. Since nationally owned, validated, and credible data are difficult to obtain, new approaches are needed. This article aims to support the monitoring of displacement via satellite-derived observations of nighttime lights (NTL) from NASA's Black Marbl...
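The Black Marble processing chain itself is not described in this excerpt, but the basic monitoring operation it supports is a change comparison of gridded nighttime radiance over an area of interest. A minimal sketch of that idea, with synthetic arrays and a hypothetical ntl_change helper standing in for real satellite products:

```python
import numpy as np

def ntl_change(radiance_before, radiance_after, settlement_mask):
    """Percent change in mean nighttime radiance over a masked settlement area.

    radiance_before/after: 2-D arrays of gridded radiance (e.g., nW/cm^2/sr)
    settlement_mask: boolean array marking the pixels of interest
    """
    before = radiance_before[settlement_mask].mean()
    after = radiance_after[settlement_mask].mean()
    return 100.0 * (after - before) / before

# Toy example: a 10% drop in lit-pixel radiance after a displacement event.
rng = np.random.default_rng(0)
before = rng.uniform(5.0, 50.0, size=(100, 100))
after = before * 0.9
mask = before > 20.0  # "settled" pixels, purely illustrative
print(f"radiance change: {ntl_change(before, after, mask):+.1f}%")
```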
Authors: Saugata Ghose, Amirali Boroumand, …, Onur Mutlu (5 authors)
Many modern and emerging applications must process increasingly large volumes of data. Unfortunately, prevalent computing paradigms are not designed to efficiently handle such large-scale data: The energy and performance costs to move this data between the memory subsystem and the CPU now dominate the total costs of computation. This forces system architects and designers to fundamentally rethink how to design computers. Processing-in-memory (PIM) is a computing paradigm that avoids most data mo...
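The data-movement claim can be made concrete with a back-of-envelope energy budget. The per-operation figures below are commonly cited approximate estimates for an older CMOS node, used here purely as assumptions, not numbers taken from this article:

```python
# Rough per-operation energy estimates (pJ); treat as illustrative assumptions.
E_DRAM_32B_READ = 640.0   # fetch one 32-bit word from off-chip DRAM
E_FP32_ADD      = 0.9     # one 32-bit floating-point add

# Streaming workload: every operand is read once from DRAM and used once.
ops = 1_000_000
operands_per_op = 2
compute_energy = ops * E_FP32_ADD
movement_energy = ops * operands_per_op * E_DRAM_32B_READ

total = compute_energy + movement_energy
print(f"data movement share of energy: {movement_energy / total:.1%}")
# -> roughly 99.9%, which is the motivation for processing-in-memory (PIM)
```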
Authors: …, Pritish Narayanan (IBM), …, Geoffrey W. Burr (IBM) (10 authors)
This paper presents innovative micro-architectural designs for multi-layer Deep Neural Networks (DNNs) implemented in crossbar arrays of analog memories. Data are transferred in a fully-parallel manner between arrays without explicit analog-to-digital converters. Design ideas including source follower-based readout, array segmentation, and transmit-by-duration are adopted to improve circuit efficiency. The execution energy and throughput, for both DNN training and inference, are analyzed quantit...
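The readout and segmentation circuits are specific to the paper, but the operation a crossbar performs is an analog matrix-vector multiply: weights stored as conductances, inputs applied as voltages or pulse durations, and column currents summing the products by Kirchhoff's law. A minimal idealized sketch of that principle, not the authors' design:

```python
import numpy as np

rng = np.random.default_rng(1)

# A layer's weight matrix mapped onto a crossbar: each weight becomes a
# conductance G[i, j] (differential pairs, quantization, etc. are ignored).
W = rng.normal(size=(4, 8))            # 4 outputs, 8 inputs
G = W                                  # idealized mapping, purely illustrative

x = rng.uniform(0.0, 1.0, size=8)      # input activations as voltages/durations

# Ohm's law on each device plus Kirchhoff's current law on each column wire
# performs the multiply-accumulate in place: I = G @ V.
I = G @ x

# Analog non-idealities (device noise, drift) perturb the result slightly.
I_noisy = I + rng.normal(scale=0.01 * np.abs(I).max(), size=I.shape)

print("column currents (pre-activation outputs):", np.round(I_noisy, 3))
```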
Authors: Alon Amid, Kiseok Kwon, …, Kurt Keutzer (6 authors)
Deep Learning is arguably the most rapidly evolving research area in recent years. As a result, it is not surprising that the design of state-of-the-art deep neural net models often proceeds without much consideration of the latest hardware targets, and the design of neural net accelerators proceeds without much consideration of the characteristics of the latest deep neural net models. Nevertheless, in this article, we show that there are significant improvements available if deep neural net mod...
Efficiency bottlenecks inherent to conventional computing in executing neural algorithms have spurred the development of novel devices capable of “in-memory” computing. Commonly known as “memristors,” a variety of device concepts including conducting bridge, vacancy filament, phase change, and other types have been proposed as promising elements in artificial neural networks for executing inference and learning algorithms. In this article, we review the recent advances in memristor technology fo...
Deep neural networks (DNNs) achieve best-known accuracies in many machine learning tasks involved in image, voice, and natural language processing and are being used in an ever-increasing range of applications. However, their algorithmic benefits are accompanied by extremely high computation and storage costs, sparking intense efforts in optimizing the design of computing platforms for DNNs. Today, graphics processing units (GPUs) and specialized digital CMOS accelerators represent the state-of-...
2 Citations
Performing computations on conventional von Neumann computing systems results in a significant amount of data being moved back and forth between the physically separated memory and processing units. This costs time and energy, and constitutes an inherent performance bottleneck. In-memory computing is a novel non-von Neumann approach, where certain computational tasks are performed in the memory itself. This is enabled by the physical attributes and state dynamics of memory devices, in particular...
1 Citation
Fairness is an increasingly important concern as machine learning models are used to support decision making in high-stakes applications such as mortgage lending, hiring, and prison sentencing. This article introduces a new open-source Python toolkit for algorithmic fairness, AI Fairness 360 (AIF360), released under an Apache v2.0 license ( https://github.com/ibm/aif360 ). The main objectives of this toolkit are to help facilitate the transition of fairness research algorithms for use in an indu...
5 Citations
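AIF360 is a real Python package, and the sketch below follows its documented pattern of wrapping tabular data in a BinaryLabelDataset, computing group metrics, and applying a pre-processing mitigation. Exact class and method signatures may vary across releases, so treat this as an illustrative usage sketch rather than a verified recipe:

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

# Tiny synthetic decision dataset: `sex` is the protected attribute,
# `label` is the decision (1 = favorable outcome).
df = pd.DataFrame({
    "sex":   [0, 0, 0, 0, 1, 1, 1, 1],
    "score": [3, 5, 2, 4, 6, 7, 5, 8],
    "label": [0, 0, 0, 1, 1, 1, 0, 1],
})
dataset = BinaryLabelDataset(df=df, label_names=["label"],
                             protected_attribute_names=["sex"])

priv, unpriv = [{"sex": 1}], [{"sex": 0}]

# Group-fairness metrics on the raw data.
metric = BinaryLabelDatasetMetric(dataset, unprivileged_groups=unpriv,
                                  privileged_groups=priv)
print("statistical parity difference:", metric.statistical_parity_difference())
print("disparate impact:", metric.disparate_impact())

# One of the toolkit's pre-processing mitigations: reweigh examples so the
# favorable outcome becomes independent of the protected attribute.
rw = Reweighing(unprivileged_groups=unpriv, privileged_groups=priv)
dataset_transf = rw.fit_transform(dataset)
```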
Major trends in healthcare and life sciences (HCLS) include huge amounts of and longitudinal patient data, policies on a patient's rights to access and control their data, a move from fee-for-service to outcome-based contracts, and regulatory and privacy requirements. Blockchain, as a distributed transactional system of record, can provide underpinnings to enable these trends and enable transformative opportunities in HCLS by providing immutable data on a shared ledger, secure and authenticated ...
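The HCLS platform itself is not detailed in this excerpt, but the immutability property the abstract relies on comes from hash chaining: each block commits to the hash of its predecessor, so altering any past record is detectable. A toy sketch of that mechanism only, not the article's system:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, record):
    """Append a record, committing to the previous block's hash."""
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev_hash, "record": record})

def verify(chain):
    """The chain is valid only if every stored prev_hash still matches."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"patient": "p-001", "event": "consent granted"})
append_block(ledger, {"patient": "p-001", "event": "lab result recorded"})
print(verify(ledger))                               # True

ledger[0]["record"]["event"] = "consent revoked"    # tamper with history
print(verify(ledger))                               # False: tampering detected
```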
Top fields of study
Operating system
Electrical engineering
Electronic engineering
Mathematics
Computer science
IBM
Real-time computing