NANO SCIENTIFIC RESEARCH CENTRE PVT. LTD., AMEERPET, HYD
DOT NET PROJECTS LIST - 2013
2013 IEEE PAPERS
Cashing in on the Cache in the Cloud
Abstract:
Over the past decades, caching has become the key technology for bridging the performance gap across memory hierarchies via temporal or spatial locality; the effect is particularly prominent in disk storage systems. Applications that involve heavy I/O activity, which are common in the cloud, probably benefit the most from caching. The use of local volatile memory as cache might be a natural alternative, but many well-known restrictions, such as capacity and the utilization of host machines, hinder its effective use. In addition to the technical challenges, providing cache services in clouds faces a major practical issue of pricing (a quality-of-service or service-level-agreement issue). Currently, (public) cloud users are limited to a small set of uniform and coarse-grained service offerings, such as High-Memory and High-CPU instances in Amazon EC2. In this paper, we present the cache as a service (CaaS) model as an optional addition to typical infrastructure service offerings. Specifically, the cloud provider sets aside a large pool of memory that can be dynamically partitioned and allocated to standard infrastructure services as disk cache.
We first investigate the feasibility of providing CaaS with a proof-of-concept elastic cache system (using dedicated remote memory servers) built and validated on a real system; the practical benefits of CaaS for both users and providers (i.e., performance and profit, respectively) are then studied thoroughly with a novel pricing scheme. Our CaaS model helps leverage the cloud economy in that 1) the extra user cost for the I/O performance gain is minimal, if any, and 2) the provider's profit increases due to improvements in server consolidation resulting from that performance gain. Through extensive experiments with eight resource allocation strategies, we demonstrate that our CaaS model can be a promising, cost-efficient solution for both users and providers.
Existing System:
Due essentially to the shared nature of some resources, such as disks (which are not performance isolatable), the virtualization overhead for these resources is not negligible, and it further worsens disk I/O performance. Thus, low disk I/O performance is one of the major challenges encountered by most infrastructure services, as in Amazon's Relational Database Service, which provisions virtual servers as database servers. At present, the performance issue of I/O-intensive applications is mainly dealt with by using high-performance (HP) servers with large amounts of memory, leaving it as the user's responsibility.
Proposed System:
In this paper, we address the issue of disk I/O performance in the context of caching in the cloud and present a cache as a service (CaaS) model as an additional service to IaaS. For example, a user can simply specify more cache memory as an additional requirement to an IaaS instance with minimal computational capacity (e.g., a micro/small instance in Amazon EC2) instead of an instance with a large amount of memory (a high-memory instance in Amazon EC2). The key contribution of this work is that our cache service model greatly augments the cost efficiency and elasticity of the cloud from the perspective of both users and providers. CaaS as an additional service (provided mostly on separate cache servers) gives the provider an opportunity to reduce both capital and operating costs by using fewer active physical machines for IaaS, and this can justify the cost of cache servers in our model. The user also benefits from CaaS in terms of application performance at minimal extra cost; moreover, caching is enabled in a user-transparent manner, and cache capacity is not limited to local memory.
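To make the CaaS idea concrete, the sketch below (in C#, the project's language) models a hypothetical cache-augmented instance request: the user asks for a minimal compute instance plus a separate cache allocation instead of a high-memory instance. The type and member names (CaaSRequest, InstanceType, CacheMemoryMb) are assumptions made for illustration only, not an API from the paper or from Amazon EC2.

using System;

// Hypothetical request types; all names are illustrative assumptions.
public enum InstanceType { Micro, Small, HighMemory, HighCpu }

public class CaaSRequest
{
    public InstanceType Instance { get; set; }   // base IaaS instance
    public int CacheMemoryMb { get; set; }       // extra remote cache memory from the provider's pool

    public override string ToString()
    {
        return Instance + " instance + " + CacheMemoryMb + " MB remote cache";
    }
}

public static class CaaSDemo
{
    public static void Main()
    {
        // Instead of paying for a high-memory instance, request a micro
        // instance and attach 4 GB of cache as a separate (CaaS) service.
        CaaSRequest request = new CaaSRequest { Instance = InstanceType.Micro, CacheMemoryMb = 4096 };
        Console.WriteLine(request);
    }
}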
Software and Hardware Requirements
Hardware Required:
System    : Pentium IV
Hard Disk : 80 GB
RAM       : 512 MB
Software Required:
Operating System : Windows XP
Language         : ASP.NET, C#
Database         : SQL Server 2005
Modules:
· Login
· Registration
· Uploading
· Downloading
· Caching
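As a rough sketch of what the Caching module could look like in C#, the example below implements a simple read-through file cache: repeated reads of the same file are served from memory instead of hitting the disk again. The class and method names (FileCache, Read) and the FIFO eviction policy are illustrative assumptions, not the project's actual implementation.

using System;
using System.Collections.Generic;
using System.IO;

// Minimal read-through file cache: repeated reads of the same file are
// served from memory instead of re-reading the disk.
public class FileCache
{
    private readonly Dictionary<string, byte[]> _entries = new Dictionary<string, byte[]>();
    private readonly Queue<string> _order = new Queue<string>();   // FIFO eviction order
    private readonly int _maxEntries;

    public FileCache(int maxEntries)
    {
        _maxEntries = maxEntries;
    }

    public byte[] Read(string path)
    {
        byte[] cached;
        if (_entries.TryGetValue(path, out cached))
            return cached;                        // cache hit: no disk I/O

        byte[] data = File.ReadAllBytes(path);    // cache miss: read from disk
        if (_entries.Count >= _maxEntries)
            _entries.Remove(_order.Dequeue());    // evict the oldest entry
        _entries[path] = data;
        _order.Enqueue(path);
        return data;
    }
}

// Example usage: the second Read returns the cached copy without touching disk.
// FileCache cache = new FileCache(100);
// byte[] first  = cache.Read("report.pdf");
// byte[] second = cache.Read("report.pdf");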