Chinese Journal of Computers
Papers (5,203 results)
With the rapid development of semiconductor technology, the performance gap between processors and main memory keeps widening; the density of conventional memory devices is approaching its limit, and energy consumption is an increasingly serious problem, so traditional main-memory technology faces severe challenges. Phase-change random access memory (PCRAM), with its high density, low power consumption, non-volatility, and byte addressability, is one of the most promising non-volatile memories and a leading candidate to fully replace DRAM as main memory. This paper first surveys the development and application status of PCRAM, noting that the T-shaped cell is the device structure widely adopted in both academia and industry, and that PCRAM products have gradually entered volume production and commercial use. It then describes the challenges PCRAM currently faces, pointing out that limited write endurance is one of the main obstacles to its development and adoption: unevenly distributed writes can wear out a PCRAM device quickly. Next, it reviews representative PCRAM wear-leveling techniques proposed from both hardware-assisted and software-assisted perspectives, and, based on an analysis of the state of the art, summarizes the strengths of existing schemes and the aspects still in need of improvement. Finally, it discusses future research directions for PCRAM wear-leveling techniques, providing a reference for further work in this area.
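The wear-leveling idea surveyed above can be made concrete with a toy simulation of a Start-Gap-style rotating remapper, one representative hardware-assisted scheme from the literature. The class name, parameters, and workload below are illustrative sketches, not taken from the paper:

```python
class StartGapSim:
    """Toy Start-Gap-style wear leveler: N logical lines are stored in
    N+1 physical lines; every `interval` writes, the spare (gap) line
    swaps with its cyclic neighbour, slowly rotating the whole
    logical-to-physical mapping so writes spread over all lines."""

    def __init__(self, n_lines, interval=4):
        self.n = n_lines
        self.interval = interval
        # map_[logical] -> physical; physical line self.gap is the spare
        self.map_ = list(range(n_lines))
        self.gap = n_lines
        self.wear = [0] * (n_lines + 1)   # per-physical-line write count
        self.writes = 0

    def write(self, logical):
        self.wear[self.map_[logical]] += 1
        self.writes += 1
        if self.writes % self.interval == 0:
            self._move_gap()

    def _move_gap(self):
        # the physical line cyclically below the gap migrates into it
        neighbour = (self.gap - 1) % (self.n + 1)
        logical = self.map_.index(neighbour)
        self.wear[self.gap] += 1          # the migration itself costs one write
        self.map_[logical] = self.gap
        self.gap = neighbour

# Pathological workload: hammer a single logical line.
sim = StartGapSim(n_lines=8, interval=4)
for _ in range(10_000):
    sim.write(0)
```

Without leveling, all 10,000 writes would land on one physical line; with the rotating gap, the maximum per-line wear stays within a small multiple of the average.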
#1 Kang Yin (CAS: Chinese Academy of Sciences) H-Index: 1
1 Citation
#1 Yang Dong (HIT: Harbin Institute of Technology) H-Index: 1
Data quality issues can have severe effects on big data applications, so big data with quality problems must be cleaned. The MapReduce programming framework can exploit parallelism to achieve high scalability for cleaning large data sets. However, due to the lack of effective design, redundant computation arises in MapReduce-based cleaning, degrading performance. The purpose of this paper is therefore to optimize parallel data cleaning...
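The cleaning pattern the abstract refers to can be illustrated with a minimal single-machine sketch of the map/shuffle/reduce phases applied to deduplication; the record fields and keying rule here are hypothetical, not the paper's actual design or optimization:

```python
from collections import defaultdict

def map_phase(records):
    # emit (normalized key, record) pairs
    for rec in records:
        yield rec["name"].strip().lower(), rec

def shuffle(pairs):
    # group values by key, as the MapReduce runtime would
    groups = defaultdict(list)
    for key, rec in pairs:
        groups[key].append(rec)
    return groups

def reduce_phase(groups):
    # cleaning rule: keep one representative per key (lowest id)
    return [min(recs, key=lambda r: r["id"]) for recs in groups.values()]

records = [
    {"id": 1, "name": "Alice "},
    {"id": 2, "name": "alice"},   # duplicate of id 1 after normalization
    {"id": 3, "name": "Bob"},
]
clean = reduce_phase(shuffle(map_phase(records)))
```

In a real MapReduce job the shuffle is performed by the framework across machines; the redundancy the paper targets arises when the same keying and grouping work is repeated across cleaning passes.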
2 Citations
#1 Wang Chun (RUC: Renmin University of China)
Applications that require online processing of continuous data streams are increasing, and data stream management systems have emerged to handle massive, variable data in real time. With the development of open processing platforms in the era of big data, a number of distributed data stream processing systems have appeared for dealing with large-scale and diverse data streams, such as S4, Storm, and Spark Streaming. However, we should construct relational query systems which have ab...
#1 Wu Chun (National University of Defense Technology) H-Index: 1
With the development of information technology, massive data resources with heterogeneous structure have appeared in cyberspace; this is known as network big data and has attracted extensive attention. To mine useful information from network big data, the data resources in cyberspace must be organized efficiently and semantic-based similarity search must be supported. For efficient data organization and search, we first need to extract the features/attributes of the big data ...
3 Citations
In recent years, IP spoofing has been frequently used in network attacks, which immensely threatens Internet security. Inter-domain source address validation methods defend against these attacks by enforcing domain-level source address verification on IP packets. The academic community has proposed evaluation criteria for these methods and designed many methods according to those criteria. However, although these methods meet the criteria, none of them is widely deployed by Internet service providers (ISP...
#1 Chen Shan (Chongqing University) H-Index: 1
Compressed sensing (CS) is a new theory for sampling and recovering signals based on sparse transformations. It allows a complete signal to be acquired at low cost, and thus meets the need for low-cost sampling when bandwidth and sampling capability are insufficient. However, a wireless sensor network is an open scene, and signals are easily affected by noise in the open environment. In particular, CS theory offers a method of sub-Nyquist sampling which is effective in reducing cost in th...
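The sub-Nyquist sampling and recovery described above can be sketched with a small numerical example, assuming a random Gaussian measurement matrix and greedy Orthogonal Matching Pursuit (OMP) for recovery; the dimensions and the choice of solver are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3            # signal length, measurements (m << n), sparsity

# A k-sparse signal and its sub-Nyquist measurements y = A @ x.
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x

# Orthogonal Matching Pursuit: greedily pick the column most correlated
# with the residual, then re-fit by least squares on the chosen support.
support = []
residual = y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    sub = A[:, support]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    residual = y - sub @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
```

With m well above the sparsity level, OMP typically recovers the sparse signal exactly from far fewer measurements than the signal length, which is the cost saving CS exploits; in a noisy sensor-network setting, which is the paper's concern, the residual no longer vanishes and robust recovery becomes the issue.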
4 Citations
#1 Ding Ya (National University of Defense Technology) H-Index: 1
Cloud service is an emerging network service mode built on the platform of cloud computing. Its outsourcing feature and the security risks of the platform both introduce a trust problem, which is users' greatest misgiving when deciding whether to move their business onto the cloud platform. Achieving trusted cloud service has therefore become one of the key focuses of the research field. In this paper, the definition of trusted cloud service is proposed on the basis of analys...
13 Citations