ABSTRACT
A cloud is essentially a cluster of dedicated servers connected within a network. Cloud computing is a
network-based environment focused on sharing computation and resources. In the cloud, customers pay
only for what they use and do not have to purchase the local resources they would otherwise need, such as
storage or infrastructure; this is the main advantage of cloud computing and the main reason for its
popularity today. The main problem in the cloud, however, is security, and security and privacy are now
primary concerns that must be addressed. To mitigate this problem we introduce a new technique called
Fog Computing. Fog computing is not a replacement for the cloud; it extends cloud computing by providing
security in the cloud environment. With fog services we can enhance the cloud experience by isolating user
data that needs to live on the edge. The main aim of fog computing is to place the data close to the end user.
1. INTRODUCTION
In today's world, small and large organizations alike use cloud computing to protect their data and to use
cloud resources as and when they need them. The cloud is a subscription-based service; cloud computing is
a shared pool of resources. The way we use computers and store our personal and business information
raises new data-security challenges, and encryption mechanisms alone do not protect data in the cloud from
unauthorized access. Traditional database systems are usually deployed in closed environments where users
can access the system only through a restricted network or intranet. With the fast growth of the World Wide
Web, users can access virtually any database for which they have proper access rights from anywhere in the
world. By registering with a cloud, users are ready to obtain resources from cloud providers, and an
organization can access its data from anywhere, at any time. But this convenience comes with risks to
security and privacy. To overcome this problem we use a new technique called fog computing, which
provides security in the cloud environment to a greater extent. To get the benefit of this technique a user
needs to register with the fog; once the user completes the sign-up form, he receives a message or email
confirming that he is ready to take services from fog computing.
1.1 Existing System
Existing data-protection mechanisms such as encryption have failed to secure data from attackers:
encryption does not verify whether a user is authorized, and cloud computing security has not focused on
ways to secure data from unauthorized access, so encryption alone does not provide much security for our
data. Our confidential documents stored in the cloud therefore have little protection, and a hacker who gains
access can read them. The 2009 Twitter incident is one example of a data-theft attack in the cloud, and in
such cases it is difficult to find the attacker. In 2010 and 2011, cloud computing security mechanisms were
developed against attackers, including work on detecting hackers in the cloud; recent research results of this
kind may be useful for protecting data in the cloud.
1.2 Proposed System
We propose a new technique, called fog computing, to secure users' data in the cloud using user-behavior
profiling and decoy information technology. It is a different approach to securing data in the cloud, based on
offensive decoy technology: we monitor data access in the cloud and detect abnormal data access patterns.
When an unauthorized person tries to access a real user's data, the system generates fake documents in such
a way that the unauthorized person cannot tell whether the data is fake or real. The user is identified through
a security question entered by the real user when filling in the sign-up form. If the answer to the question is
wrong, the user is not the real user and the system provides fake documents; otherwise the system provides
the original documents to the real user.
2. SYSTEM OVERVIEW
2.1 Cloud Architecture
In cloud architecture, the systems architecture (the conceptual model that defines the structure, behavior,
and other views of a system) of the software systems (where "software system" is often used as a synonym
of computer program) involved in the delivery of cloud computing typically involves multiple cloud
components communicating with each other over application programming interfaces, usually web services.
This resembles the Unix philosophy of having multiple programs, each doing one thing well, working
together over universal interfaces. Complexity is controlled, and the resulting systems are more manageable
than their monolithic counterparts.
Fig 2.1 :Cloud Computing Sample Architecture
2.2 Cloud computing Services:
Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of
configurable computing resources (for example, networks, servers, storage, applications, and services) that
can be rapidly provisioned and released with minimal management effort or service-provider interaction. It
is divided into three service models:
1. Software as a service.
2. Infrastructure as a service.
3. Platform as a service.
Fig 2.2: Cloud computing Services
Cloud computing exhibits the following key characteristics:
1. Agility:
Improves with users' ability to re-provision technological infrastructure resources.
2. Cost:
Cost is claimed to be reduced and in a public cloud delivery model capital expenditure is converted
to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically
provided by a third-party and does not need to be purchased for one-time or infrequent intensive
computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and
fewer IT skills are required for implementation. The e-FISCAL project's state of the art
repository contains several articles looking into cost aspects in more detail, most of them concluding
that costs savings depend on the type of activities supported and the type of infrastructure available in-
house.
3. Virtualization:
Technology allows servers and storage devices to be shared and utilization be increased. Applications
can be easily migrated from one physical server to another.
4. Multi tenancy:
Enables sharing of resources and costs across a large pool of users.
5. Centralization:
Centralization of infrastructure in locations with lower costs (such as real estate and electricity).
6. Utilization and efficiency:
Improvements for systems that are often only 10–20% utilized.
7. Reliability:
Reliability is improved if multiple redundant sites are used, which makes well-designed cloud
computing suitable for business continuity and disaster recovery.
8. Performance:
Performance is monitored and consistent and loosely coupled architectures are constructed using web
services as the system interface.
9. Security:
Could improve due to centralization of data, increased security-focused resources, etc., but concerns
can persist about loss of control over certain sensitive data, and the lack of security for stored kernels.
Security is often as good as or better than other traditional systems, in part because providers are able
to devote resources to solving security issues that many customers cannot afford. However, the
complexity of security is greatly increased when data is distributed over a wider area or greater number
of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access
to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by
users' desire to retain control over the infrastructure and avoid losing control of information security.
10. Maintenance:
Maintenance of cloud computing applications is easier, because they do not need to be installed on
each user's computer and can be accessed from different places.
Fig 2.3: Benefits of cloud computing
2.3 Security Issues in Service Model
Cloud computing has three delivery models through which services are delivered to end users. These
models are SaaS, PaaS and IaaS, which provide software, platform and infrastructure assets to the users,
respectively. They have different levels of security requirements.
Fig 2.4 : Security Issues in Service Model
Security issues in SaaS:
Software as a service is a model where software applications are hosted remotely by the service provider
and made available to users on request, over the internet. In SaaS, client data is available on the internet and
may be visible to other users, so it is the provider's responsibility to set proper security checks for data
protection. This is the major security risk, creating problems for secure data migration and storage. The
following security measures should be considered in the SaaS application-development process: data
security, data locality, data integrity, data segregation, data access, data confidentiality, data breaches,
network security, authentication and authorization, web application security, and identity management. The
basic flaws through which a malicious user gains access and violates data are SQL injection, cross-site
request forgery, insecure storage, and insecure configuration.
Security issues in PaaS:
PaaS is the layer above IaaS. It deals with the operating system, middleware, etc., and provides a set of
services through which a developer can carry a development process from testing through maintenance. It is
a complete platform where the user can carry out development tasks without hesitation. In PaaS, the service
provider gives the customer some control over certain applications on the platform, but there can still be
security problems such as intrusion, so it must be ensured that data is not accessible between applications.
Security issues in IaaS:
IaaS replaces the traditional approach to development of spending a huge amount on data centers or
managed hosting and hiring staff for operations. Instead, IaaS lets an organization use a provider's
infrastructure, obtain services, and pay only for the resources it uses. IaaS and related services enable quick
set-up and let organizations focus on business improvement without worrying about their own
infrastructure. IaaS provides basic security such as firewalls and load balancing, and offers better control
over security provided there is no security gap in the virtualization manager. The main security problem in
IaaS is the trustworthiness of data stored within the provider's hardware.
2.4 Cloud Computing Security Threats and Solutions
Top seven security threats to cloud computing discovered by “Cloud Security Alliance” (CSA) are:
i. Abuse and Nefarious Use of Cloud Computing:
Abuse and nefarious use of cloud computing is the top threat identified by the CSA. A simple example of
this is the use of botnets to spread spam and malware. Attackers can infiltrate a public cloud, for example,
and find a way to upload malware to thousands of computers and use the power of the cloud infrastructure to
attack other machines. Suggested remedies by the CSA to lessen this threat:
 Stricter initial registration and validation processes.
 Enhanced credit card fraud monitoring and coordination.
 Comprehensive introspection of customer network traffic.
 Monitoring public blacklists for one’s own network blocks.
ii. Insecure Application Programming Interfaces:
As software interfaces or APIs are what customers use to interact with cloud services, those must have
extremely secure authentication, access control, encryption and activity monitoring mechanisms - especially
when third parties start to build on them. Suggested remedies by CSA to lessen this threat:
 Analyze the security model of cloud provider interfaces.
 Ensure strong authentication and access controls are implemented in concert with encrypted
transmission.
 Understand the dependency chain associated with the API.
iii. Malicious Insiders:
The malicious insider threat is one that gains in importance as many providers still don't reveal how they
hire people, how they grant them access to assets or how they monitor them. Transparency is, in this case,
vital to a secure cloud offering, along with compliance reporting and breach notification. Suggested
remedies by CSA to lessen this threat:
 Enforce strict supply chain management and conduct a comprehensive supplier assessment.
 Specify human resource requirements as part of legal contracts.
 Require transparency into overall information security and management practices, as well as
compliance reporting.
 Determine security breach notification processes.
iv. Shared Technology Vulnerabilities:
Sharing infrastructure is a way of life for IaaS providers. Unfortunately, the components on which this
infrastructure is based were not designed for that. To ensure that customers don't tread on each other's
"territory", monitoring and strong compartmentalization are required. Suggested remedies by CSA to lessen
this threat:
 Implement security best practices for installation/configuration.
 Monitor environment for unauthorized changes/activity.
 Promote strong authentication and access control for administrative access and operations.
 Enforce service level agreements for patching and vulnerability remediation.
 Conduct vulnerability scanning and configuration audits.
v. Data Loss/Leakage:
Be it by deletion without a backup, by loss of the encryption key or by unauthorized access, data is always in
danger of being lost or stolen. This is one of the top concerns for businesses, because they not only stand to
lose their reputation, but are also obligated by law to keep it safe. Suggested remedies by CSA to lessen this
threat:
 Implement strong API access control.
 Encrypt and protect integrity of data in transit.
 Analyze data protection at both design and run time.
 Implement strong key generation, storage and management, and destruction practices.
 Contractually demand providers to wipe persistent media before it is released into the pool.
 Contractually specify provider backup and retention strategies.
vi. Account, Service & Traffic Hijacking:
Account, service and traffic hijacking is another issue that cloud users need to be aware of. These threats
range from man-in-the-middle attacks, to phishing and spam campaigns, to denial-of-service attacks.
Suggested remedies by CSA to lessen this threat:
 Prohibit the sharing of account credentials between users and services.
 Leverage strong two-factor authentication techniques where possible.
 Employ proactive monitoring to detect unauthorized activity.
 Understand cloud provider security policies and SLAs.
vii. Unknown Risk Profile:
Security should always be at the top of the priority list. Code updates, security practices, vulnerability
profiles, intrusion attempts: all of these should always be kept in mind. Suggested remedies by CSA to
lessen this threat:
 Disclosure of applicable logs and data.
 Partial/full disclosure of infrastructure details (e.g., patch levels, firewalls, etc.).
 Monitoring and alerting on necessary information.
3. SECURING CLOUDS USING FOG
3.1 Fog Computing:
Below is the reference architecture of a Fog computing environment in an enterprise. You can see that the
Fog network is close to the smart devices, data processing is happening closer to the devices and the
processed information is passed to the cloud computing environment.
Fig 3.1: Reference Architecture
Just getting comfortable with the concept of cloud computing? Cloud computing is now being
complemented by a newer concept called fog computing, which extends the cloud closer to where data is
produced and used.
Fog computing is quite similar to cloud and just like cloud computing it also provides its users with
data, storage, compute and application services. The thing that distinguishes fog from cloud is its support
for mobility, its proximity to its end-users and its dense geographical distribution. Its services are hosted at
the network edge or even on devices such as set-top boxes or access points. By doing this, fog computing
helps reduce service latency and improves QoS, resulting in a superior user experience.
Fog computing even supports emerging Internet of Things (IoT) applications that require real time or
predictable latency. A thing in Internet of Things is referred to as any natural or manmade object that can
be assigned an Internet Protocol (IP) address and provided with an ability to transfer data over a network.
Some of these can end up creating a lot of data. Cisco provides the example of a jet engine, which can create
10 terabytes of data about its condition and performance in just half an hour. Transmitting all this data to the
cloud, and then transmitting response data back, creates a huge demand on bandwidth, requires a
considerable amount of time, and can suffer from latency.
In fog computing, much of the processing takes place in a router. This type of computing creates a
virtual platform that provides networking, compute and storage services between traditional cloud
computing data centers and end devices. These services are central to both fog and cloud computing. They
are also important for supporting emerging Internet deployments. Fog computing also has the capability of
enabling a new breed of aggregated services and applications, such as smart energy distribution, in which
energy load-balancing apps run on network edge
devices that will automatically switch to alternative energies like wind and solar etc., based on availability,
demand and lowest price.
The usage of fog computing can accelerate innovation in ways never seen before, including self-healing,
self-organizing and self-learning apps for industrial networks.
Fig 3.2: Without Fog Computing and With Fog Computing in Grid
3.2 Real-Time Large Scale Distributed Fog Computing
"Fog Computing" is a highly distributed broadly decentralized "cloud" that operates close to the
operational level, where data is created and most often used. Fog computing at the ground-level is an
excellent choice for applications that need computing near use that is fit for purpose, where there is high
volume real-time and/or time-critical local data, where data has the greatest meaning within its context,
where fast localized turn around of results is important, where sending an over abundance of raw data to an
enterprise "cloud" is unnecessary, undesireable or bandwidth is expensive or limited.
Example applications of fog computing in an industrial context are analytics, optimization and advanced
control at a manufacturing work center or unit operation, and across and between unit operations, where
sensors, controllers, historians and analytical engines all share data interactively in real time. At the upper
edges of the "fog" is local site-wide computing, such as manufacturing plant systems that span work centers
and unit operations; higher yet are regional clouds, and finally the cloud at the enterprise level. Fog
computing is not independent of enterprise cloud computing but connected to it, sending cleansed,
summarized information upward and in return receiving enterprise information needed locally.
Fog computing places data management, compute power, performance, reliability and recovery in the hands
of the people who understand the needs: the operators, engineers and IT staff for a unit operation, an
oil-and-gas platform, or another localized operation, so that it can be tailored to be fit for purpose in a
high-speed real-time environment.
Fog computing reduces bandwidth needs, as roughly 80% of all data is needed only within the local context,
such as pressures, temperatures, material charges and flow rates. Sending such real-time information into
the enterprise cloud would be burdensome in bandwidth and centralized storage, and enterprise databases
would bloat with information rarely used at that level. Instead, a limited amount of summarized information
can be transmitted up to the cloud, and information can flow down from the cloud to the local operation,
such as customer product-performance feedback to the source of those products.
Fig 3.2: Real-Time Large Scale Distributed Fog Computing
We place computing where it is needed and performant, suited for the purpose, sitting where it needs to be:
at a work center, inside a control panel, at a desk, in a lab, in a rack in a data center, anywhere and
everywhere, all sharing related data to understand and improve performance. While located throughout the
organization, a fog computing system operates as a single unified resource, a distributed low-level cloud
that integrates with centralized clouds to obtain market and customer feedback, desires and behaviors that
reflect product performance in the eyes of the customer.
The characteristics of a fog computing system are:
 A Highly Distributed Concurrent Computing (HDCC) System.
 A peer-to-peer mesh of computational nodes in a virtual hierarchical structure that matches your
organization
 Communicates with smart sensors, controllers, historians, quality and materials control systems and
others as peers
 Runs on affordable, off-the-shelf computing technologies
 Supports multiple operating platforms: Unix, Windows, Mac
 Employs simple, fast and standardized IoT internet protocols (TCP/IP, sockets, etc.)
 A browser-based user experience; after all, it is a key aspect of an "Industrial Internet of Things"
 Built on field-proven high performance distributed computing technologies.
Capturing, historizing, validating, cleaning and filtering, integrating, analyzing, predicting, adapting and
optimizing performance at lower levels across the enterprise in real time requires High Performance
Computing (HPC) power. This does not necessarily mean high expense: commercial off-the-shelf standard
PCs with the power of a typical laptop will suffice, and the software running the system need not be
expensive.
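The characteristics listed above (peer nodes exchanging sensor data over plain TCP/IP sockets) can be
sketched in a few lines of Python. The node roles, the JSON payload, and the use of an ephemeral localhost
port are illustrative assumptions, not part of any specific fog product:

```python
import json
import socket
import threading

def fog_node(server_sock):
    """Accept one sensor connection and return the decoded reading."""
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
    return json.loads(data.decode("utf-8"))

def sensor_send(port, reading):
    """Send one JSON-encoded reading to the local fog node over TCP."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(json.dumps(reading).encode("utf-8"))

# Fog node listens on an ephemeral localhost port
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

result = {}
t = threading.Thread(target=lambda: result.update(fog_node(server)))
t.start()
sensor_send(port, {"sensor": "temp-01", "celsius": 21.5})
t.join()
server.close()
print(result)  # {'sensor': 'temp-01', 'celsius': 21.5}
```

The fog node would process such readings locally and forward only cleansed summaries to the enterprise
cloud, as the surrounding text describes.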
To architect such a system, we draw upon the experiences, architectures, tools and successes of such
computing giants as Google, Amazon, YouTube, Facebook, Twitter and others. They have created robust
high performance computing architectures that span global data centers. They have provided development
tools and languages such as Google's GO (golang) that are well suited for high speed concurrent distributed
processing and robust networking and web services. Having a similar need, but more finely distributed, we
can adopt similar high performance computing architectures to deliver and share results where they are
needed in real-time.
There are various ways to use cloud services to save or store files, documents and media in remote services
that can be accessed whenever the user connects to the internet. The main problem in the cloud is
maintaining security for users' data in a way that guarantees that only authenticated users, and no one else,
gain access to it. Providing security for confidential information is the core security problem, and current
practice does not provide the level of assurance most people desire. There are various methods for securing
remote data in the cloud using standard access control and encryption, yet all of these standard approaches
have been demonstrated to fail from time to time for a variety of reasons, including faulty implementations,
buggy code, insider attacks, misconfigured services, and the creative construction of effective and
sophisticated attacks not envisioned by the implementers of security procedures. Building a secure and
trustworthy cloud computing environment is not enough, because attacks on data continue to happen, and
when they do and information gets lost, there is no way to get it back; solutions for such incidents are
needed. The basic idea is that we can limit the damage of stolen data if we decrease the value of that stolen
information to the attacker. We can achieve this through a 'preventive' disinformation (decoy) attack. We
posit that secure cloud services can be implemented given two additional security features:
3.3 User Behavior Profiling
It is expected that access to a user’s information in the Cloud will exhibit a normal means of access.
User profiling is a well known technique that can be applied here to model how, when, and how much a user
accesses their information in the Cloud. Such ‘normal user’ behavior can be continuously checked to
determine whether abnormal access to a user’s information is occurring. This method of behavior-based
security is commonly used in fraud detection applications. Such profiles would naturally include volumetric
information, how many documents are typically read and how often. These simple userspecific features can
serve to detect abnormal Cloud access based partially upon the scale and scope of data transferred.
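A minimal sketch of such volumetric profiling, assuming daily document-access counts are available from
audit logs (the history values here are made up for illustration):

```python
import statistics

def build_profile(daily_access_counts):
    """Model a user's 'normal' access volume as the mean and the
    population standard deviation of past daily document counts."""
    mean = statistics.mean(daily_access_counts)
    stdev = statistics.pstdev(daily_access_counts)
    return mean, stdev

# Hypothetical audit-log history: documents opened per day over one week
history = [12, 9, 14, 11, 10, 13, 12]
mean, stdev = build_profile(history)
print(round(mean, 2), round(stdev, 2))
```

The profile is deliberately simple; richer profiles could add the time of day, the directories touched, or the
scope of data transferred, as the text suggests.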
3.4 Decoy System
Decoy data, such as decoy documents, honeypots and other bogus information, can be generated on demand
and used for detecting unauthorized access to information and to 'poison' the thief's exfiltrated information.
Serving decoys confuses an attacker into believing they have exfiltrated useful information when they have
not. This technology may be integrated with user-behavior profiling to secure a user's data in the cloud.
Whenever abnormal, unauthorized access to a cloud service is noticed, decoy information may be returned
by the cloud, delivered in such a way that it appears completely normal and legitimate. The legitimate user,
who is the owner of the information, would readily identify when decoy information is being returned, and
hence could alter the cloud's responses through a variety of means, such as challenge questions, to inform
the cloud security system that it has incorrectly detected an unauthorized access. In the case where the
access is correctly identified as unauthorized, the cloud security system would deliver unbounded amounts
of bogus information to the attacker, thus securing the user's true data. This can be implemented with two
additional security features: (1) validating whether data access is authorized when abnormal information
access is detected, and (2) confusing the attacker with bogus information by providing decoy documents.
We have applied these concepts to detect unauthorized access to data stored on a local file system by
masqueraders, i.e. attackers who impersonate legitimate users after stealing their credentials. Our
experimental results in a local file-system setting show that combining both techniques yields better
detection results, suggesting that this approach may work in a cloud environment and make the cloud
system as transparent to the user as a local file system.
Fig 3.3: Decoy system
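One way for the system and the legitimate owner to recognize decoys, sketched here as an assumption
rather than the paper's exact mechanism, is to embed a keyed HMAC tag in every generated decoy
document; anyone holding the per-user key can then verify whether a returned document is bogus:

```python
import hashlib
import hmac

SECRET_KEY = b"owner-registration-secret"   # hypothetical per-user key

def make_decoy(text):
    """Attach a keyed HMAC tag so decoys can later be recognized."""
    tag = hmac.new(SECRET_KEY, text.encode("utf-8"), hashlib.sha256).hexdigest()
    return {"body": text, "tag": tag}

def is_decoy(doc):
    """A document is a decoy iff its tag verifies under the user's key."""
    expected = hmac.new(SECRET_KEY, doc["body"].encode("utf-8"),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(doc["tag"], expected)

decoy = make_decoy("Q3 forecast: revenue up 40% (bogus)")
real = {"body": "Q3 forecast: flat", "tag": "0" * 64}
print(is_decoy(decoy), is_decoy(real))  # True False
```

An attacker without the key cannot distinguish tagged decoys from real files, while the owner (or the cloud
security system acting on the owner's behalf) can.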
Anomaly Detection:
The currently logged-in user's access behavior is compared with the user's past behavior. If the current
behavior exceeds a threshold value or limit, the remote user is suspected to be an anomaly. If the current
behavior matches past behavior, the user is allowed to operate on the original data.
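The threshold test described above can be sketched as follows; the three-standard-deviation limit and the
sample history are illustrative assumptions, not values from the paper:

```python
import statistics

def is_anomalous(history, current_count, k=3.0):
    """Flag the session if today's access count exceeds the user's
    historical mean by more than k standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return current_count > mean + k * stdev

# Hypothetical history: documents opened per day over one week
history = [12, 9, 14, 11, 10, 13, 12]
print(is_anomalous(history, 13))   # False: within the normal range
print(is_anomalous(history, 60))   # True: a bulk download looks like a masquerader
```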
Challenge Request :
If the current user‘s behavior seems anomalous, then the user is asked for randomly selected secret
questions. If the user fails to provide correct answers for a certain limits or
threshold, the user is provided with decoy files. If the user provided correct answers for a limit, the user is
treated as normal user. Sub subsection .
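The challenge step can be sketched as a small decision routine; the attempt limit, the stored answer, and the
return values are illustrative assumptions:

```python
def serve_documents(check_answer, next_answer, max_attempts=3):
    """Ask the user's secret question up to max_attempts times.
    Serve the real documents only if a correct answer arrives within
    the limit; otherwise fall back to decoy files."""
    for _ in range(max_attempts):
        if check_answer(next_answer()):
            return "real documents"
    return "decoy documents"

# Hypothetical answer captured on the sign-up form
check = lambda answer: answer == "rex"

owner = iter(["rover", "rex"]).__next__              # slips once, then recalls
attacker = iter(["admin", "password", "123456"]).__next__

print(serve_documents(check, owner))     # real documents
print(serve_documents(check, attacker))  # decoy documents
```

Note that the failure branch returns decoys rather than an error, so an attacker receives no signal that
detection has occurred.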
Algorithm Details :
AES ( Advanced Encryption Standards)
The Advanced Encryption Standard (AES) is a symmetric-key encryption standard approved by NSA for top
secret information and is adopted by the U.S. government. AES is based on a design principle known as a
substitution permutation network. The standard comprises three block ciphers: AES-128, AES-192 and
AES-256. Each of these ciphers has a 128-bit block size, with key sizes of 128, 192 and 256 bits,
respectively. The AES
ciphers have been analyzed extensively and are now used worldwide; AES was selected due to the level of
security it offers and its well documented implementation and optimization techniques. Furthermore, AES is
very efficient in terms of both time and memory requirements. The block ciphers have high computation
intensity and independent workloads (apply the same steps to different blocks of plain text).
Explanations:
AES is based on a design principle known as a substitution-permutation network. It is fast in both software
and hardware. Unlike its predecessor, DES, AES does not use a Feistel network. AES has a fixed block size
of 128 bits and a key size of 128, 192, or 256 bits, whereas Rijndael can be specified with block and key
sizes in any multiple of 32 bits, with a minimum of 128 bits; the block size has a maximum of 256 bits, but
the key size has no theoretical maximum. AES operates on a 4×4 column-major order matrix of bytes,
termed the state (versions of Rijndael with a larger block size have additional columns in the state). Most
AES calculations are done in a special finite field. The AES cipher is specified as a number of repetitions of
transformation rounds that convert the input plaintext into the final ciphertext output. Each round consists
of several processing steps, including one that depends on the encryption key. A set of reverse rounds is
applied to transform ciphertext back into the original plaintext using the same encryption key.
High-level description of the algorithm
1. KeyExpansion: round keys are derived from the cipher key using Rijndael's key schedule.
2. Initial round:
AddRoundKey: each byte of the state is combined with the round key using bitwise XOR.
3. Rounds:
1. SubBytes: a non-linear substitution step where each byte is replaced with another according to
a lookup table.
2. ShiftRows: a transposition step where each row of the state is shifted cyclically a certain number
of steps.
3. MixColumns: a mixing operation which operates on the columns of the state, combining the
four bytes in each column.
4. AddRoundKey.
4. Final round (no MixColumns):
1. SubBytes
2. ShiftRows
3. AddRoundKey
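The substitution-permutation structure of these rounds can be illustrated with a toy 16-bit SPN. This is a
teaching sketch of the SubBytes / ShiftRows / AddRoundKey pattern using a classroom S-box; it is not real
AES and offers no real security:

```python
# Toy 16-bit substitution-permutation network (pedagogical, NOT AES).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
INV_SBOX = [SBOX.index(i) for i in range(16)]

def substitute(block, box):
    """SubBytes analogue: pass each 4-bit nibble through the S-box."""
    return sum(box[(block >> s) & 0xF] << s for s in (0, 4, 8, 12))

def rotate(block, n):
    """ShiftRows analogue: rotate the 16-bit block left by n bits."""
    return ((block << n) | (block >> (16 - n))) & 0xFFFF

def encrypt(block, round_keys):
    for k in round_keys:
        block = substitute(block, SBOX)   # non-linear substitution
        block = rotate(block, 3)          # bit permutation for diffusion
        block ^= k                        # AddRoundKey: XOR the round key
    return block

def decrypt(block, round_keys):
    for k in reversed(round_keys):
        block ^= k
        block = rotate(block, 13)         # undo the left rotation by 3
        block = substitute(block, INV_SBOX)
    return block

keys = [0x1A2B, 0x3C4D, 0x5E6F]           # made-up round keys
ct = encrypt(0xBEEF, keys)
assert decrypt(ct, keys) == 0xBEEF        # round-trip recovers the plaintext
```

Real AES follows the same pattern but works on a 4×4 byte state with MixColumns providing diffusion and
a proper key schedule deriving the round keys.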
4. APPLICATIONS OF FOG COMPUTING
We elaborate on the role of Fog computing in the following six motivating scenarios. The advantages of Fog
computing satisfy the requirements of applications in these scenarios.
Fog computing in Smart Grid:
Energy load-balancing applications may run on network edge devices such as smart meters and micro-grids.
Based on energy demand, availability and the lowest price, these devices automatically switch to alternative
energies like solar and wind.
Fog computing in smart traffic lights and connected vehicles:
A video camera that senses an ambulance's flashing lights can automatically change street lights to open
lanes for the vehicle to pass through traffic. Smart street lights interact locally with sensors to detect the
presence of pedestrians and bikers, and measure the distance and speed of approaching vehicles.
Wireless Sensor and Actuator Networks:
Traditional wireless sensor networks fall short in applications that go beyond sensing and tracking and
require actuators to exert physical actions such as opening, closing or even carrying sensors. In this scenario,
actuators serving as Fog devices can control the measurement process itself, as well as stability and
oscillatory behaviour, by creating a closed-loop system. For example, in the scenario of self-maintaining
trains, a sensor monitoring a train's ball bearings can detect rising heat levels, allowing applications to send
an automatic alert to the train operator to stop the train at the next station for emergency maintenance and
avoid a potential derailment. In the lifesaving air vents scenario, sensors on vents monitor the air flowing in
and out of mines and automatically change the air flow if conditions become dangerous to miners.
Decentralized Smart Building Control:
The applications in this scenario are facilitated by wireless sensors deployed to measure temperature,
humidity, or the levels of various gases in the building atmosphere. Information can be exchanged
among all sensors on a floor, and their readings can be combined to form reliable measurements. The system
components may then work together to lower the temperature, inject fresh air or open windows. Air
conditioners can remove moisture from the air or increase the humidity. Sensors can also trace and react to
movements (e.g., by turning lights on or off). Fog devices could be assigned to each floor and could
collaborate on higher-level actuation. With Fog computing applied in this scenario, smart buildings can
maintain their fabric and their external and internal environments to conserve energy, water and other resources.
IoT and Cyber-physical systems (CPSs):
Fog computing based systems are becoming an important class of IoT and CPSs. Built on traditional
information carriers, including the Internet and telecommunication networks, the IoT is a network that can
interconnect ordinary physical objects with identifiable addresses. CPSs feature a tight combination of a
system's computational and physical elements, and coordinate the integration of computer-based and
information-centric physical and engineered systems. IoT and CPSs promise to transform our world with
new relationships between computer-based control and communication systems, engineered systems and
physical reality. Fog computing in this scenario is built on the concepts of embedded systems in which
software programs and computers are embedded in devices for reasons other than computation alone.
Examples of the devices include toys, cars, medical devices and machinery. The goal is to integrate the
abstractions and precision of software and networking with the dynamics, uncertainty and noise in the
physical environment. Using the emerging knowledge, principles and methods of CPSs, we will be able to
develop new generations of intelligent medical devices and systems, ‘smart’ highways, buildings, factories,
agricultural and robotic systems.
Software Defined Networks (SDN):
SDN is an emergent computing and networking paradigm and has become one of the most popular topics in
the IT industry. It separates the control and data communication layers: control is done at a centralized
server, and nodes follow the communication paths decided by the server. The centralized server may need a
distributed implementation. The SDN concept has been studied in WLANs, wireless sensor networks and
mesh networks, but those settings do not involve multi-hop wireless communication or multi-hop routing,
and there is no communication between peers. The SDN concept together with Fog computing can resolve
the main issues in vehicular networks, namely intermittent connectivity, collisions and high packet loss
rates, by augmenting vehicle-to-vehicle with vehicle-to-infrastructure communications and centralized
control. The SDN concept for vehicular networks was first proposed in .
5. SECURITY AND PRIVACY IN FOG COMPUTING
Security and privacy issues have not been studied specifically in the context of fog computing; they have
been studied in the context of smart grids and machine-to-machine communications. There are security
solutions for Cloud computing, but they may not suit Fog computing, because Fog devices work at the edge
of networks. The working surroundings of Fog devices face many threats that do not exist in a well-managed
Cloud. In this section, we discuss the security and privacy issues in Fog computing.
Security Issues
The main security issues are authentication at different levels of gateways as well as (in case of smart grids)
at the smart meters installed in the consumer’s home. Each smart meter and smart appliance has an IP
address. A malicious user can tamper with their own smart meter, report false readings, or spoof IP
addresses. There are some solutions to the authentication problem. One line of work elaborated public key
infrastructure (PKI) based solutions involving multicast authentication; authentication techniques using
Diffie-Hellman key exchange have also been discussed in . Smart meters encrypt their data and send it to a
Fog device, such as a home-area network (HAN) gateway. The HAN gateway then decrypts the data,
aggregates the results and passes them forward. Intrusion detection techniques can also be applied in Fog
computing [28]. Intrusion in smart grids can be detected using either a signature-based method, in which
observed patterns of behaviour are checked against an existing database of possible misbehaviours, or an
anomaly-based method, in which an observed behaviour is compared with the expected behaviour to check
for deviations. One such work develops an algorithm that monitors power flow results and detects anomalies
in input values that could have been modified by attacks. The algorithm detects intrusion by using principal
component analysis to separate power flow variability into regular and irregular subspaces.
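The Diffie-Hellman exchange mentioned above can be sketched as follows. The prime, generator and secret values here are toy placeholders for illustration; real smart-meter deployments would use standardized groups of 2048 bits or more, plus authentication of the exchanged values.

```python
# Toy Diffie-Hellman key exchange between a smart meter and a HAN
# gateway. All numbers here are illustrative placeholders; real
# deployments use standardized large groups and authenticate the
# exchanged values to prevent man-in-the-middle attacks.
p = 4294967291          # a small prime (largest prime below 2**32)
g = 5                   # public generator

meter_secret = 123456       # smart meter's private value
gateway_secret = 654321     # HAN gateway's private value

# Each side publishes g^secret mod p.
meter_public = pow(g, meter_secret, p)
gateway_public = pow(g, gateway_secret, p)

# Each side combines its own secret with the other's public value;
# both arrive at the same shared key for encrypting meter readings.
meter_key = pow(gateway_public, meter_secret, p)
gateway_key = pow(meter_public, gateway_secret, p)
assert meter_key == gateway_key
```

The shared key can then be used by the smart meter to encrypt readings before sending them to the HAN gateway, which decrypts and aggregates them as described above.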
6. Combining User Behavior Profiling and Decoy Technology
We posit that the combination of these two security features will provide unprecedented levels of
security for the Cloud. No current Cloud security mechanism is available that provides this level of
security. We have applied these concepts to detect illegitimate data access to data stored on a local file
system by masqueraders, i.e. attackers who impersonate legitimate users after stealing their credentials.
One may consider illegitimate access to Cloud data by a rogue insider as the malicious act of a
masquerader. Our experimental results in a local file system setting show that combining both techniques
can yield better detection results, and our results suggest that this approach may work in a Cloud
environment, as the Cloud is intended to be as transparent to the user as a local file system. In the
following we review briefly some of the experimental results achieved by using this approach to detect
masquerade activity in a local file setting.
6.1 User Behavior Profiling
Legitimate users of a computer system are familiar with the files on that system and where they
are located. Any search for specific files is likely to be targeted and limited. A masquerader, however,
who gets access to the victim’s system illegitimately, is unlikely to be familiar with the structure and
contents of the file system. Their search is likely to be widespread and untargeted. Based on this key
assumption, we profiled user search behavior and developed user models trained with a one-class
modeling technique, namely one-class support vector machines. The importance of using one-class
modeling stems from the ability to build a classifier without having to share data from different users.
The privacy of the user and their data is therefore preserved. We monitor for abnormal search behaviors
that exhibit deviations from the user's baseline. According to our assumption, such deviations signal a
potential masquerade attack. Our previous experiments validated this assumption and demonstrated that
we could reliably detect all simulated masquerade attacks using this approach, with a very low false
positive rate of 1.12%.
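The idea of flagging deviations from a user's own baseline can be sketched in a few lines. The paper's models use one-class support vector machines; the much simpler per-feature deviation check below stands in for the same idea, and all feature names and values are hypothetical.

```python
# Simplified stand-in for one-class behaviour profiling: learn a
# per-user baseline and flag search behaviour that deviates from it.
# The real system uses one-class SVMs; features/values are hypothetical.
import statistics

def fit_baseline(samples):
    """Learn mean and stdev of each feature from the user's own data."""
    cols = list(zip(*samples))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def is_anomalous(profile, sample, z_threshold=3.0):
    """Flag the sample if any feature is far outside the baseline."""
    return any(abs(x - mu) / sigma > z_threshold
               for (mu, sigma), x in zip(profile, sample))

# Hypothetical baseline: [searches per minute, distinct dirs touched]
baseline = [[2.1, 3], [1.8, 2], [2.4, 4], [2.0, 3], [1.9, 3], [2.2, 4]]
profile = fit_baseline(baseline)

print(is_anomalous(profile, [2.0, 3]))    # targeted search -> False
print(is_anomalous(profile, [15.0, 40]))  # widespread search -> True
```

Note that the baseline is built only from this user's own data, which mirrors the privacy property of one-class modeling: no other user's data is needed to train the classifier.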
6.2 Decoy Technology
We placed traps within the file system. The traps are decoy files downloaded from a Fog
computing site, an automated service that offers several types of decoy documents such as tax return
forms, medical records, credit card statements, e-bay receipts, etc. [10]. The decoy files are downloaded
by the legitimate user and placed in highly-conspicuous locations that are not likely to cause any
interference with the normal user activities on the system. A masquerader, who is not familiar with the
file system and its contents, is likely to access these decoy files, if he or she is in search for sensitive
information, such as the bait information embedded in these decoy files. Therefore, monitoring
access to the decoy files should signal masquerade activity on the system. The decoy documents carry a
keyed-Hash Message Authentication Code (HMAC), which is hidden in the header section of the
document. The HMAC is computed over the file's contents using a key unique to each user. When a
document is loaded into memory, we verify whether it is a decoy by computing an HMAC over all of its
contents and comparing it with the HMAC embedded within the document. If the two HMACs match, the
document is deemed a decoy and an alert is issued.
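The HMAC check described above can be sketched with Python's standard `hmac` and `hashlib` modules. The `X-Decoy-HMAC` header line and the embedding format are hypothetical simplifications; the real decoy documents hide the tag inside the document's own header section.

```python
# Sketch of the keyed-HMAC decoy check. The header format here is a
# hypothetical simplification of hiding the tag in a document header.
import hashlib
import hmac

HEADER = b"X-Decoy-HMAC: "

def make_decoy(contents: bytes, user_key: bytes) -> bytes:
    """Prepend a keyed HMAC of the contents as a mock header line."""
    tag = hmac.new(user_key, contents, hashlib.sha256).hexdigest()
    return HEADER + tag.encode() + b"\n" + contents

def is_decoy(document: bytes, user_key: bytes) -> bool:
    """Recompute the HMAC over the body and compare with the header."""
    header, _, body = document.partition(b"\n")
    if not header.startswith(HEADER):
        return False
    embedded = header[len(HEADER):].decode()
    expected = hmac.new(user_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(embedded, expected)

key = b"per-user secret key"
doc = make_decoy(b"Fake 2023 tax return ...", key)
print(is_decoy(doc, key))              # True -> raise an alert
print(is_decoy(b"a normal file", key)) # False -> legitimate document
```

Because the HMAC is keyed per user, an attacker who reads a decoy cannot forge or strip valid tags without the key, and any file whose recomputed tag matches the embedded one can be treated as a decoy access worth alerting on.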
6.3 Combining the Two Techniques
The correlation of search behavior anomaly detection with trap-based decoy files should provide
stronger evidence of malfeasance, and therefore improve a detector’s accuracy. We hypothesize that
detecting abnormal search operations performed prior to an unsuspecting user opening a decoy file will
corroborate the suspicion that the user is indeed impersonating another victim user. This scenario covers
the threat model of illegitimate access to Cloud data. Furthermore, an accidental opening of a decoy file
by a legitimate user might be recognized as an accident if the search behavior is not deemed abnormal. In
other words, detecting abnormal search and decoy traps together may make a very effective masquerade
detection system. Combining the two techniques improves detection accuracy. We use decoys as an
oracle for validating the alerts issued by the sensor monitoring the user’s file search and access behavior.
In our experiments, we did not generate the decoys on demand at the time of detection when the alert
was issued. Instead, we made sure that the decoys were conspicuous enough for the attacker to access
them if they were indeed trying to steal information by placing them in highly conspicuous directories
and by giving them enticing names. With this approach, we were able to improve the accuracy of our
detector. Crafting the decoys on demand improves the accuracy of the detector even further. Combining
the two techniques, and having the decoy documents act as an oracle for our detector when abnormal
user behavior is detected, may lower the overall false positive rate of the detector. We trained eighteen
classifiers with computer usage data from 18 computer science students collected over a period of 4 days
on average. The classifiers were trained using the search behavior anomaly detection described in a prior
paper. We also trained another 18 classifiers using a detection approach that combines user behavior
profiling with monitoring access to decoy files placed in the local file system, as described above. We
tested these classifiers using simulated masquerader data. Figure 1 displays the AUC scores achieved by
both detection approaches for each user model. The results show that the models using the combined
detection approach achieve equal or better results than the search-profiling approach alone.
7. FOG COMPUTING ARCHITECTURE
The Fog Computing system is designed to work against attackers, especially malicious insiders. Insider
attacks can be performed by malicious employees at the provider's or the user's site. A malicious insider can
access the confidential data of cloud users and can easily obtain passwords, cryptographic keys and files.
The threat of malicious insider attacks has increased due to the lack of transparency in cloud providers'
processes and procedures: a provider may not disclose how employees are granted access, how this access is
monitored, or how reports and policy compliance are analyzed.
Fig 7.1: Fog Computing Architecture
The figure above shows the actual working of fog computing. The system supports two kinds of login:
admin login and user login. When the admin logs in to the system, there are two steps to follow: step 1,
enter the username; step 2, enter the password. After a successful login the admin can perform all
admin-related tasks, but when downloading any file from the fog they must answer a security question;
only if they answer it correctly can the original file be downloaded. If the admin or a user answers the
security question incorrectly, a decoy (fake) document is provided to the fake user.
The decoy technology works in the following manner. Suppose the document contains a word, say
"MADAM", and some letters are replaced, for example M -> A; the word then becomes "AADAA", which
has no meaning. Even if the attacker comes to know that M was replaced by A and applies reverse
engineering, the result is "MMDMM", because the substitution also destroys the original A's. Either way,
the attacker cannot judge the content of the document. When a user logs in to the system, they follow the
same procedure as the admin. Operations such as uploading files/documents, downloading files/documents,
viewing alerts, and sending, reading and broadcasting messages can all be performed by the user. The
ALERT stream provides detailed knowledge of attacks on the user's personal files/documents, with details
such as the date, the time, and the number of times the attacker tried to hack a file/document. A further
strength of fog computing is that after each successful login the user gets an SMS on their mobile saying
'login successful'; from this the user is alerted whenever someone else tries to gain access to their personal
fog account. When an attacker tries to download some files/documents, the user also gets an SMS
containing the attacker's IP address, the attacker's server name, and the date and time, so it becomes easy to
catch the attacker by tracing these details. In this way fog computing is more secure than traditional cloud
computing.
8. ADVANTAGES AND DISADVANTAGES
ADVANTAGES
The advantages of placing decoys in a file system are threefold:
 The detection of masquerade activity.
 The confusion of the attacker and the additional costs incurred to distinguish real from bogus
information.
 The deterrence effect which, although hard to measure, plays a significant role in preventing
masquerade activity by risk-averse attackers.
DISADVANTAGES
 The attacker is not identified when an attack happens.
 It is difficult to determine which user performed the attack.
 We cannot always detect which file was compromised.
10. CONCLUSION
With the increase of data theft attacks, the security of user data is becoming a serious issue for cloud
service providers. Fog Computing is a paradigm that helps by monitoring the behavior of the user and
providing security for the user's data. The system was initially developed only with an email provision, but
we have also implemented the SMS technique. In Fog Computing we present a new approach to the problem
of insider data theft attacks in a cloud, using dynamically generated decoy files, which also saves the storage
required for maintaining decoy files in the cloud. By using the decoy technique, Fog can minimize insider
attacks in the cloud and could provide unprecedented levels of security in the Cloud and in social networks.
11. SCOPE FOR FUTURE ENHANCEMENTS
The security system we have described applies only to a single-cloud ownership system. If the cloud
owner has more than one cloud to operate, our security system will not be applicable; therefore, as a future
enhancement, we can extend the existing application to manage an environment with more than one cloud
architecture. Cloud computing is the future for organizations. The considerable benefits it provides will
eventually make organizations move their processes and data entirely to the Cloud, and a lot of effort will in
return be put into provisioning the appropriate security for doing business in cloud environments. Although
virtualization is already established, virtualization in the Cloud is still an immature area. Future work should
aim to harden the security of virtualization in multi-tenant environments. Possible lines of research are the
development of reliable and efficient virtual network security mechanisms to monitor the communications
between virtual machines on the same physical host. To achieve secure virtualized environments, isolation
between the different tenants is needed, and future research should aim to provide new architectures and
techniques to harden the resources shared between tenants. The hypervisor is the most critical component of
virtualized environments: if it is compromised, the host and guest OSs could potentially be compromised
too. Hypervisor architectures that aim to minimize code size while maintaining functionality are an
interesting line of future research for securing virtualized environments and the Cloud, especially to prevent
future hypervisor rootkits.
12. REFERENCES
 Cloud Security Alliance, “Top Threat to Cloud Computing V1.0,” March 2010. [Online].Available:
https://cloudsecurityalliance.org/topthreats/csathreats.v1.0.pdf
 S. Muqtyar Ahmed, P. Namratha and C. Nagesh, "Prevention of Malicious Insider in the Cloud Using
Decoy Documents."
 G. Booth, A. Soknacki and A. Somayaji, "Cloud Security: Attacks and Current Defenses."
 A. Singh and M. Shrivastava, "Overview of Attacks on Cloud Computing."
 D.Jamil and H. Zaki, “Security Issues in Cloud Computing and Countermeasures,” International
Journal of Engineering Science and Technology, Vol. 3 No. 4, pp. 2672-2676, April 2011.
 K. Zunnurhain and S. Vrbsky, “Security Attacks and Solutions in Clouds,” 2nd IEEE
InternationalConference on Cloud Computing Technology and Science, Indianapolis, December
2010.
 W. A. Jansen, “Cloud Hooks: Security and Privacy Issues in Cloud Computing,” 44th Hawaii
International Conference on System Sciences, pp. 1–10, Koloa, Hawaii, January 2011.
 F. Bonomi, “Connected vehicles, the internet of things, and fog computing,”in The Eighth ACM
International Workshop on Vehicular Inter-Networking (VANET), Las Vegas, USA, 2011.
 http://cnc.ucr.edu/security/glossary
 http://technet.microsoft.com/enus/library/cc959354.aspx
 Cisco Cloud Computing -Data Center Strategy, Architecture,and Solutions
http://www.cisco.com/web/strategy/docs/gov/CiscoCloudComputing_WP.pdf.
 Fog Computing: Mitigating Insider Data Theft Attacks in The
Cloud.[Online].Available:http://ids.cs.columbia.edu/sites/default/files/Fog_Comuting_Position_Pape
r_WRIT_2012.pdf
 M. Van Dijk and A. Juels, “On the impossibility of cryptography alone for privacy-preserving cloud
computing,” in Proceedings of the 5th USENIX conference on Hot topics in security, ser. HotSec’10.
Berkeley, CA, USA: USENIX Association, 2010, pp. 1–8.
 J. A. Iglesias, P. Angelov, A. Ledezma, and A. Sanchis, “Creating evolving user behavior profiles
automatically,” IEEE Trans. on Knowl. and Data Eng., vol. 24, no. 5, pp. 854–867, May 2012.
 F. Rocha and M. Correia, “Lucy in the sky without diamonds: Stealing confidential data in the
cloud,” in Proceedings of the 2011 IEEE/IFIP 41st International Conference on Dependable Systems
and Networks Workshops, ser. DSNW ’11. Washington, DC, USA: IEEE Computer Society, 2011,
pp. 129–134.
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdfRising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor Presentation
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
 
Cyberprint. Dark Pink Apt Group [EN].pdf
Cyberprint. Dark Pink Apt Group [EN].pdfCyberprint. Dark Pink Apt Group [EN].pdf
Cyberprint. Dark Pink Apt Group [EN].pdf
 
CNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In PakistanCNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In Pakistan
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 

Fog computing document

• 1. ABSTRACT A cloud is essentially a cluster of multiple dedicated servers attached to a network. Cloud computing is a network-based environment that focuses on sharing computation and resources. In the cloud, customers pay only for what they use and do not have to pay for the local resources they would otherwise need, such as storage or infrastructure; this is the main advantage of cloud computing and the main reason for its popularity today. The main problem in the cloud, however, is security, and today both security and privacy are primary concerns. To address the security problem we introduce a technique called Fog Computing. Fog computing is not a replacement for the cloud; it extends cloud computing by providing security in the cloud environment. With fog services we can enhance the cloud experience by isolating the user's data that needs to live on the edge. The main aim of fog computing is to place the data close to the end user.
• 2. 1. INTRODUCTION Today, small as well as large organizations use cloud computing technology to protect their data and to use cloud resources as and when they need them. The cloud is a subscription-based service; cloud computing is a shared pool of resources. The way we use computers and store our personal and business information raises new data security challenges. Encryption mechanisms alone do not protect data in the cloud from unauthorized access. Traditional database systems are usually deployed in closed environments, where a user can access the system only through a restricted network. With the fast growth of the World Wide Web, users can access virtually any database for which they have proper access rights from anywhere in the world. By registering with a cloud, users are ready to obtain resources from cloud providers, and an organization can access its data from anywhere, at any time. But this convenience comes with certain risks to security and privacy. To overcome this problem we use a new technique called fog computing, which provides security in the cloud environment to a greater extent. To benefit from this technique a user needs to register with the fog; once the user has completed the sign-up form, he receives a message or email confirming that he can take services from fog computing. 1.1 Existing System Existing data protection mechanisms such as encryption have failed to secure data from attackers. Encryption does not verify whether a user is authorized, and cloud computing security has not focused on ways of securing data from unauthorized access. In 2009, confidential documents stored in the cloud had little protection, so hackers could gain access to them; the Twitter incident is one example of a data theft attack in the cloud, and it proved difficult to find the attacker. In 2010 and 2011, cloud computing security was developed against attackers, including methods for finding hackers in the cloud. Recent research results may also be useful for protecting data in the cloud.
• 3. 1.2 Proposed System We propose a new technique, called fog computing, that secures the user's data in the cloud using user behavior profiling and decoy information technology. This is a different approach to securing data in the cloud, using offensive decoy technology: we monitor data access in the cloud and detect abnormal data access patterns. When an unauthorized person tries to access the real user's data, the system generates fake documents in such a way that the unauthorized person cannot tell whether the data is fake or real. Authorization is checked through a security question that the real user entered when filling in the sign-up form. If the answer to the question is wrong, the user is not the real user and the system serves the fake documents; otherwise the system serves the original documents to the real user.
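The serving logic described above can be sketched as follows. This is an illustrative sketch only; the record layout, the hashing of the stored answer, and the function name are assumptions for the example, not the system's actual implementation:

```python
import hashlib

# Hypothetical user record captured at sign-up time (all values illustrative).
# In a real deployment the decoy would be styled to look identical to the
# real document; here the two are kept visibly different for clarity.
USER_RECORD = {
    "question": "Name of your first school?",
    "answer_hash": hashlib.sha256(b"st. mary").hexdigest(),
    "real_doc": "REAL: Q3 revenue projections",
    "decoy_doc": "DECOY: plausible-looking fake projections",
}

def serve_document(record: dict, given_answer: str) -> str:
    """Return the real document only if the challenge answer matches;
    otherwise silently return the decoy, so the intruder gets no hint
    that detection has occurred."""
    given_hash = hashlib.sha256(given_answer.strip().lower().encode()).hexdigest()
    if given_hash == record["answer_hash"]:
        return record["real_doc"]
    return record["decoy_doc"]
```

Storing only a hash of the answer means that even a full database compromise does not reveal the challenge response itself.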
• 4. 2. SYSTEM OVERVIEW 2.1 Cloud Architecture A system architecture is the conceptual model that defines the structure, behavior, and other views of a system; an architecture description is a formal description and representation of a system. Cloud architecture, the architecture of the software systems involved in delivering cloud computing, typically involves multiple cloud components communicating with each other over application programming interfaces, usually web services. This resembles the Unix philosophy of having multiple programs, each doing one thing well, working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts. Fig 2.1: Cloud Computing Sample Architecture
• 5. 2.2 Cloud Computing Services Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service-provider interaction. It is divided into three types: 1. Software as a Service. 2. Infrastructure as a Service. 3. Platform as a Service. Fig 2.2: Cloud Computing Services Cloud computing exhibits the following key characteristics: 1. Agility: improves with users' ability to re-provision technological infrastructure resources.
• 6. 2. Cost: cost is claimed to be reduced, and in a public cloud delivery model capital expenditure is converted to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for implementation. The e-FISCAL project's state-of-the-art repository contains several articles looking into cost aspects in more detail, most of them concluding that cost savings depend on the type of activities supported and the type of infrastructure available in-house. 3. Virtualization: virtualization technology allows servers and storage devices to be shared and utilization to be increased. Applications can be easily migrated from one physical server to another. 4. Multi-tenancy: enables sharing of resources and costs across a large pool of users. 5. Centralization: centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.). 6. Utilization and efficiency: improvements for systems that are often only 10–20% utilized. 7. Reliability: reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery. 8. Performance: performance is monitored and consistent, and loosely coupled architectures are constructed using web services as the system interface.
• 7. 9. Security: security could improve due to centralization of data and increased security-focused resources, but concerns can persist about loss of control over certain sensitive data and the lack of security for stored kernels. Security is often as good as or better than in traditional systems, in part because providers can devote resources to solving security issues that many customers cannot afford to address. However, the complexity of security is greatly increased when data is distributed over a wider area or a greater number of devices, and in multi-tenant systems shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are partly motivated by users' desire to retain control over the infrastructure and avoid losing control of information security. 10. Maintenance: maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer and can be accessed from different places. Fig 2.3: Represents the Benefit 2.3 Security Issues in the Service Models Cloud computing has three delivery models through which services are delivered to end users: SaaS, IaaS and PaaS, which provide software, infrastructure and platform assets to users. They have different levels of security requirements.
• 8. Fig 2.4: Security Issues in the Service Models Security issues in SaaS: Software as a Service is a model where software applications are hosted remotely by the service provider and made available to users on request over the internet. In SaaS, client data is available on the internet and may be visible to other users, so it is the provider's responsibility to set proper security checks for data protection. This is the major security risk and creates problems for secure data migration and storage. The following security measures should be considered in the SaaS application development process: data security, data locality, data integrity, data separation, data access, data confidentiality, data breaches, network security, authentication and authorization, web application security, and identity management. The basic flaws through which a malicious user can gain access and violate the data are SQL injection, cross-site request forgery, insecure storage, and insecure configuration. Security issues in PaaS: PaaS is the layer above IaaS. It deals with the operating system, middleware, etc., and provides a set of services through which a developer can complete the development process from testing to maintenance. It is a complete platform where the user can carry out development tasks without hesitation. In PaaS, the service provider gives the customer some control over applications on the platform, but there can still be security problems such as intrusion, so it must be ensured that data is not accessible between applications. Security issues in IaaS: IaaS replaces the traditional approach of spending a huge amount on data centers or managed hosting and hiring staff for operations. With IaaS, an organization uses the infrastructure of a provider, gets services, and pays only for the resources it uses. IaaS and related services enable quick set-up and let organizations focus on business improvement without worrying about their infrastructure. The IaaS
• 9. provides basic security: firewalls, load balancing, etc. In IaaS there is better control over security, provided there is no security gap in the virtualization manager. The main security problem in IaaS is the trustworthiness of data stored within the provider's hardware. 2.4 Cloud Computing Security Threats and Solutions The top seven security threats to cloud computing identified by the Cloud Security Alliance (CSA) are: i. Abuse and Nefarious Use of Cloud Computing: abuse and nefarious use of cloud computing is the top threat identified by the CSA. A simple example is the use of botnets to spread spam and malware. Attackers can infiltrate a public cloud, for example, find a way to upload malware to thousands of computers, and use the power of the cloud infrastructure to attack other machines. Remedies suggested by the CSA to lessen this threat:  Stricter initial registration and validation processes.  Enhanced credit card fraud monitoring and coordination.  Comprehensive introspection of customer network traffic.  Monitoring public blacklists for one's own network blocks. ii. Insecure Application Programming Interfaces: software interfaces or APIs are what customers use to interact with cloud services, so they must have extremely secure authentication, access control, encryption and activity monitoring mechanisms, especially when third parties start to build on them. Remedies suggested by the CSA to lessen this threat:  Analyze the security model of cloud provider interfaces.  Ensure strong authentication and access controls are implemented in concert with encrypted transmission.  Understand the dependency chain associated with the API. iii. Malicious Insiders: the malicious insider threat gains in importance as many providers still do not reveal how they hire people, how they grant them access to assets, or how they monitor them. Transparency is, in this case,
• 10. vital to a secure cloud offering, along with compliance reporting and breach notification. Remedies suggested by the CSA to lessen this threat:  Enforce strict supply chain management and conduct a comprehensive supplier assessment.  Specify human resource requirements as part of legal contracts.  Require transparency into overall information security and management practices, as well as compliance reporting.  Determine security breach notification processes. iv. Shared Technology Vulnerabilities: sharing infrastructure is a way of life for IaaS providers. Unfortunately, the components on which this infrastructure is based were not designed for that. To ensure that customers do not tread on each other's "territory", monitoring and strong compartmentalization are required. Remedies suggested by the CSA to lessen this threat:  Implement security best practices for installation/configuration.  Monitor the environment for unauthorized changes/activity.  Promote strong authentication and access control for administrative access and operations.  Enforce service level agreements for patching and vulnerability remediation.  Conduct vulnerability scanning and configuration audits. v. Data Loss/Leakage: whether by deletion without a backup, by loss of the encoding key, or by unauthorized access, data is always in danger of being lost or stolen. This is one of the top concerns for businesses, because they not only stand to lose their reputation but are also obligated by law to keep data safe [Aruna et al., International Journal of Advanced Research in Computer Science and Software Engineering 3(9), September 2013, pp. 292-299]. Remedies suggested by the CSA to lessen this threat:  Implement strong API access control.  Encrypt and protect the integrity of data in transit.  Analyze data protection at both design and run time.  Implement strong key generation, storage, management, and destruction practices.  Contractually demand that providers wipe persistent media before it is released into the pool.  Contractually specify provider backup and retention strategies.
• 11. vi. Account, Service and Traffic Hijacking: account, service and traffic hijacking is another issue that cloud users need to be aware of. These threats range from man-in-the-middle attacks to phishing and spam campaigns to denial-of-service attacks. Remedies suggested by the CSA to lessen this threat:  Prohibit the sharing of account credentials between users and services.  Leverage strong two-factor authentication techniques where possible.  Employ proactive monitoring to detect unauthorized activity.  Understand cloud provider security policies and SLAs. vii. Unknown Risk Profile: security should always be in the upper portion of the priority list. Code updates, security practices, vulnerability profiles, intrusion attempts: all of these should always be kept in mind. Remedies suggested by the CSA to lessen this threat:  Disclosure of applicable logs and data.  Partial/full disclosure of infrastructure details (e.g., patch levels, firewalls, etc.).  Monitoring and alerting on necessary information.
• 12. 3. SECURING CLOUDS USING FOG 3.1 Fog Computing Below is the reference architecture of a fog computing environment in an enterprise. The fog network sits close to the smart devices, data processing happens closer to the devices, and the processed information is passed to the cloud computing environment. Fig 3.1: Reference Architecture Cloud computing has now been extended by a new concept called fog computing. Fog computing is quite similar to the cloud: like cloud computing, it provides its users with data, storage, compute and application services. What distinguishes fog from cloud is its support for mobility, its proximity to its end users, and its dense geographical distribution. Its services are hosted at the network edge or even on devices such as set-top boxes or access points. By doing this, fog computing helps reduce service latency and improves QoS, which results in a superior user experience. Fog computing also supports emerging Internet of Things (IoT) applications that require real-time or predictable latency. A thing in the Internet of Things is any natural or man-made object that can
• 13. be assigned an Internet Protocol (IP) address and given the ability to transfer data over a network. Some of these things can end up creating a lot of data. Cisco provides the example of a jet engine, which can create 10 terabytes of data about its condition and performance in half an hour. Transmitting all this data to the cloud and then transmitting response data back creates a huge demand on bandwidth, requires a considerable amount of time, and can suffer from latency. In fog computing, much of the processing takes place in a router. This type of computing creates a virtual platform that provides networking, compute and storage services between traditional cloud computing data centers and end devices. These services are central to both fog and cloud computing, and they are important for supporting emerging Internet deployments. Fog computing also enables a new breed of aggregated services and applications, such as smart energy distribution, in which energy load-balancing apps run on network edge devices that automatically switch to alternative energy sources such as wind and solar based on availability, demand and lowest price. The use of fog computing can accelerate innovation in ways not seen before, including self-healing, self-organizing and self-learning apps for industrial networks.
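The jet-engine figure quoted above can be checked with a quick back-of-the-envelope calculation of the sustained link rate it implies (decimal terabytes assumed):

```python
# 10 terabytes produced in half an hour: what link rate would be needed
# to ship it all to a distant cloud as it is produced?
TB = 10**12                      # bytes in a terabyte (decimal convention)
data_bytes = 10 * TB             # 10 TB per half hour
seconds = 30 * 60                # half an hour in seconds

bytes_per_second = data_bytes / seconds
gigabits_per_second = bytes_per_second * 8 / 10**9

print(f"{gigabits_per_second:.1f} Gbit/s sustained")  # → 44.4 Gbit/s sustained
```

A sustained link of roughly 44 Gbit/s per engine is exactly the kind of demand that motivates processing the data in the fog instead of hauling it to the cloud.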
• 14. Fig 3.2: Without Fog Computing and With Fog Computing in the Grid
• 15. 3.2 Real-Time Large-Scale Distributed Fog Computing Fog computing is a highly distributed, broadly decentralized "cloud" that operates close to the operational level, where data is created and most often used. Fog computing at the ground level is an excellent choice for applications that need fit-for-purpose computing near the point of use: where there is high-volume real-time and/or time-critical local data, where data has the greatest meaning within its context, where fast localized turnaround of results is important, and where sending an overabundance of raw data to an enterprise cloud is unnecessary or undesirable, or where bandwidth is expensive or limited. Example applications of fog computing in an industrial context are analytics, optimization and advanced control at a manufacturing work center or unit operation, and across and between unit operations, where sensors, controllers, historians and analytical engines all share data interactively in real time. At the upper edge of the fog is local site-wide computing, such as manufacturing plant systems that span work centers and unit operations; higher yet are regional clouds, and finally the cloud at the enterprise level. Fog computing is not independent of enterprise cloud computing but is connected to it, sending up cleansed, summarized information and in return receiving enterprise information needed locally. Fog computing places data management, compute power, performance, reliability and recovery in the hands of the people who understand the needs (the operators, engineers and IT staff for a unit operation, an oil and gas platform, or another localized operation) so that it can be tailored to fit the purpose in a high-speed real-time environment. Fog computing reduces bandwidth needs, as roughly 80% of all data, such as pressures, temperatures, material charges and flow rates, is needed only within the local context. Sending such real-time information into the enterprise cloud would be burdensome in bandwidth and centralized storage, and enterprise database bloat would occur for information rarely used at that level. Instead, a limited amount of summarized information can be transmitted up to the cloud, and information can also flow down from the cloud to the local operation, such as customer product performance feedback to the source of those products.
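The "summarize locally, uplink only the digest" pattern described above can be sketched as follows; the sensor values, field names, and function name are illustrative assumptions, not part of any specific fog platform:

```python
import statistics

def summarize_readings(readings: list) -> dict:
    """Reduce a window of raw local sensor readings to the small digest
    that is worth forwarding to the enterprise cloud; the raw samples
    themselves stay at the fog node."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.fmean(readings),
    }

# e.g. a window of local pressure samples kept at the work center
raw = [101.2, 101.4, 100.9, 101.1, 101.3]
digest = summarize_readings(raw)  # only this compact record is uplinked
```

Five raw samples collapse to one four-field record; at industrial sampling rates the same reduction, applied per window, is where the claimed bandwidth savings come from.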
• 16. Fig 3.2: Real-Time Large-Scale Distributed Fog Computing We place computing where it is needed and performant, suited to its purpose, sitting where it needs to be: at a work center, inside a control panel, at a desk, in a lab, in a rack in a data center, anywhere and everywhere, all sharing related data to understand and improve performance. While located throughout the organization, a fog computing system operates as a single unified resource, a distributed low-level cloud that integrates with centralized clouds to obtain market and customer feedback, desires and behaviors that reflect product performance in the eyes of the customer. The characteristics of a fog computing system are:  A Highly Distributed Concurrent Computing (HDCC) system.  A peer-to-peer mesh of computational nodes in a virtual hierarchical structure that matches the organization.  Communicates with smart sensors, controllers, historians, quality and materials control systems and others as peers.  Runs on affordable, off-the-shelf computing technologies.  Supports multiple operating platforms: Unix, Windows, Mac.  Employs simple, fast and standardized IoT internet protocols (TCP/IP, sockets, etc.).  A browser-based user experience; after all, it is a key aspect of an "Industrial Internet of Things".
• 17.  Built on field-proven, high-performance distributed computing technologies. Capturing, historizing, validating, cleaning and filtering, integrating, analyzing, predicting, adapting and optimizing performance at lower levels across the enterprise in real time requires High Performance Computing (HPC) power. This does not necessarily mean high expense: commercial off-the-shelf PCs with the power of a typical laptop will suffice, and the software running the system need not be expensive. To architect such a system, we draw upon the experiences, architectures, tools and successes of computing giants such as Google, Amazon, YouTube, Facebook and Twitter. They have created robust high-performance computing architectures that span global data centers, and they have provided development tools and languages, such as Google's Go (golang), that are well suited for high-speed concurrent distributed processing and robust networking and web services. Having a similar but more finely distributed need, we can adopt similar high-performance computing architectures to deliver and share results where they are needed in real time. There are various ways to use cloud services to save or store files, documents and media in remote services that can be accessed whenever the user connects to the Internet. The main problem in the cloud is maintaining security for the user's data in a way that guarantees that only authenticated users, and no one else, gain access to that data. Providing security for confidential information is the core security problem, and current mechanisms do not provide the level of assurance most people desire. There are various methods to secure remote data in the cloud using standard access control and encryption. It is fair to say that all the standard approaches to security have been demonstrated to fail from time to time, for a variety of reasons including faulty implementations, buggy code, insider attacks, misconfigured services, and the creative construction of effective and sophisticated attacks not envisioned by the implementers of security procedures. Building a secure and trustworthy cloud computing environment is not enough, because attacks on data continue to happen, and when they do and information gets lost, there is no way to get it back; we need solutions for such incidents. The basic idea is that we can limit the damage of stolen data if we decrease the value of that stolen data to the attacker. We can achieve this through a preventive decoy (disinformation) attack. We posit that secure cloud services can be implemented given two additional security features:
• 18. 3.3 User Behavior Profiling It is expected that access to a user's information in the cloud will exhibit a normal pattern of access. User profiling is a well-known technique that can be applied here to model how, when, and how much a user accesses their information in the cloud. Such "normal user" behavior can be continuously checked to determine whether abnormal access to the user's information is occurring. This method of behavior-based security is commonly used in fraud detection applications. Such profiles naturally include volumetric information: how many documents are typically read and how often. These simple user-specific features can serve to detect abnormal cloud access based partly on the scale and scope of data transferred. 3.4 Decoy System Decoy data, such as decoy documents, honeypots and other bogus information, can be generated on demand and used to detect unauthorized access to information and to "poison" the thief's exfiltrated information. Serving decoys confuses an attacker into believing they have exfiltrated useful information when they have not. This technology may be integrated with user behavior profiling to secure a user's data in the cloud. Whenever abnormal, unauthorized access to a cloud service is noticed, decoy information may be returned by the cloud, delivered in such a way that it appears completely normal and legitimate. The legitimate user, who owns the information, would readily identify when decoy information is being returned by the cloud, and could then correct the cloud's response through a variety of means, such as challenge questions, informing the cloud security system that it has incorrectly detected an unauthorized access.
In the case where the access is correctly identified as unauthorized, the Cloud security system delivers unbounded amounts of bogus information to the attacker, thus securing the user's true data. This can be implemented with two additional security features: (1) validating whether data access is authorized when abnormal information access is detected, and (2) confusing the attacker with bogus information, i.e. by providing decoy documents. We have applied these concepts to detect unauthorized access to data stored on a local file system by masqueraders, i.e. attackers who impersonate legitimate users after stealing their credentials. Our experimental results in a local file system setting show that combining both techniques yields better detection results, and they suggest that this approach may work in a Cloud environment, since the cloud system is intended to be as transparent to the user as a local file system.
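The profiling-plus-decoy decision described above can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the volumetric profile, the 3-sigma threshold, and the challenge-question step are all hypothetical simplifications of the behavior described in the text.

```python
# Illustrative sketch (not the authors' implementation): combine a simple
# volumetric user-behavior profile with a decoy-serving decision.
from statistics import mean, stdev

class UserProfile:
    """Tracks how many documents a user typically reads per session."""
    def __init__(self, history):
        self.history = list(history)  # past per-session access counts

    def is_abnormal(self, count, k=3.0):
        # Flag sessions that deviate more than k standard deviations
        # from the user's baseline access volume.
        mu, sigma = mean(self.history), stdev(self.history)
        return abs(count - mu) > k * max(sigma, 1e-9)

def respond(profile, session_count, passed_challenge):
    """Decide whether to serve real data or decoy documents."""
    if not profile.is_abnormal(session_count):
        return "real"
    # Abnormal access: validate the user, e.g. via challenge questions.
    return "real" if passed_challenge else "decoy"

baseline = [10, 12, 9, 11, 10, 13, 8, 12]
p = UserProfile(baseline)
print(respond(p, 11, passed_challenge=False))   # normal volume -> real data
print(respond(p, 400, passed_challenge=False))  # bulk access, unverified -> decoy
print(respond(p, 400, passed_challenge=True))   # verified owner -> real data
```

A real system would profile many more features (timing, search patterns, document types), but the decision structure (profile, detect deviation, validate, then serve decoys) is the same.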
Fig 3.3: Decoy system

Anomaly Detection: The current logged-in user's access behavior is compared with that user's past behavior. If the behavior exceeds a threshold, the remote user is suspected to be an anomaly. If the current behavior matches the past behavior, the user is allowed to operate on the original data.

Challenge Request: If the current user's behavior seems anomalous, the user is asked randomly selected secret questions. If the user fails to provide correct answers within a certain number of attempts, the user is served decoy files. If the user answers correctly within that limit, the user is treated as a normal user.

Algorithm Details: AES (Advanced Encryption Standard)
The Advanced Encryption Standard (AES) is a symmetric-key encryption standard adopted by the U.S. government and approved by the NSA for top secret information. AES is based on a design principle known as a substitution-permutation network. The standard comprises three block ciphers: AES-128, AES-192 and
AES-256. Each of these ciphers has a 128-bit block size, with key sizes of 128, 192 and 256 bits, respectively. The AES ciphers have been analyzed extensively and are now used worldwide; AES was selected for the level of security it offers and its well-documented implementation and optimization techniques. Furthermore, AES is very efficient in terms of both time and memory requirements. The block cipher is computation-intensive with independent workloads (the same steps are applied to different blocks of plaintext).

Explanation: AES is based on a design principle known as a substitution-permutation network. It is fast in both software and hardware. Unlike its predecessor, DES, AES does not use a Feistel network. AES has a fixed block size of 128 bits and a key size of 128, 192, or 256 bits, whereas Rijndael can be specified with block and key sizes in any multiple of 32 bits, with a minimum of 128 bits; the block size has a maximum of 256 bits, but the key size has no theoretical maximum. AES operates on a 4×4 column-major matrix of bytes, termed the state (versions of Rijndael with a larger block size have additional columns in the state). Most AES calculations are done in a special finite field. The AES cipher is specified as a number of repetitions of transformation rounds that convert the input plaintext into the final output ciphertext. Each round consists of several processing steps, including one that depends on the encryption key. A set of reverse rounds is applied to transform ciphertext back into the original plaintext using the same encryption key.
High-level description of the algorithm
1. KeyExpansion: round keys are derived from the cipher key using Rijndael's key schedule.
2. Initial round:
   1. AddRoundKey: each byte of the state is combined with the round key using bitwise XOR.
3. Rounds:
   1. SubBytes: a non-linear substitution step where each byte is replaced with another according to a lookup table.
   2. ShiftRows: a transposition step where each row of the state is shifted cyclically a certain number of steps.
   3. MixColumns: a mixing operation which operates on the columns of the state, combining the four bytes in each column.
   4. AddRoundKey
4. Final round (no MixColumns):
   1. SubBytes
   2. ShiftRows
   3. AddRoundKey
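Two of the round steps above can be shown concretely. The sketch below implements ShiftRows on a row-major view of the state and MixColumns via multiplication in GF(2^8) with the AES reduction polynomial; it is an illustration of the individual steps, not a complete or hardened AES implementation.

```python
# Sketch of two AES round steps on the 4x4 state, for illustration only.

def shift_rows(state):
    # state: 4 rows of 4 bytes; row i is rotated left by i positions.
    return [row[i:] + row[:i] for i, row in enumerate(state)]

def gf_mul(a, b):
    # Multiplication in GF(2^8) with the AES reduction polynomial x^8+x^4+x^3+x+1.
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def mix_column(col):
    # MixColumns applied to a single 4-byte column of the state.
    a0, a1, a2, a3 = col
    return [
        gf_mul(a0, 2) ^ gf_mul(a1, 3) ^ a2 ^ a3,
        a0 ^ gf_mul(a1, 2) ^ gf_mul(a2, 3) ^ a3,
        a0 ^ a1 ^ gf_mul(a2, 2) ^ gf_mul(a3, 3),
        gf_mul(a0, 3) ^ a1 ^ a2 ^ gf_mul(a3, 2),
    ]

# FIPS-197 MixColumns test vector: db 13 53 45 -> 8e 4d a1 bc
print([hex(x) for x in mix_column([0xDB, 0x13, 0x53, 0x45])])
```

SubBytes (a 256-entry S-box lookup) and the key schedule are omitted here for brevity; a full implementation would normally come from a vetted cryptographic library rather than hand-rolled code.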
4. APPLICATIONS OF FOG COMPUTING
We elaborate on the role of Fog computing in the following six motivating scenarios; the advantages of Fog computing satisfy the requirements of applications in each of them.

Fog computing in the smart grid: Energy load-balancing applications may run on network edge devices such as smart meters and micro-grids. Based on energy demand, availability and the lowest price, these devices automatically switch to alternative energy sources like solar and wind.

Fog computing in smart traffic lights and connected vehicles: A video camera that senses the flashing lights of an ambulance can automatically change the street lights to open lanes for the vehicle to pass through traffic. Smart street lights interact locally with sensors to detect the presence of pedestrians and cyclists, and measure the distance and speed of approaching vehicles.

Wireless sensor and actuator networks: Traditional wireless sensor networks fall short in applications that go beyond sensing and tracking and require actuators to exert physical actions such as opening, closing or even carrying sensors. In this scenario, actuators serving as Fog devices can control the measurement process itself, as well as stability and oscillatory behaviours, by creating a closed-loop system. For example, in the scenario of self-maintaining trains, a sensor monitoring a train's ball bearings can detect heat levels, allowing applications to send an automatic alert to the train operator to stop at the next station for emergency maintenance and avoid a potential derailment. In the life-saving air vents scenario, sensors on vents monitor air conditions flowing in and out of mines and automatically change the airflow if conditions become dangerous to miners.

Decentralized smart building control: The applications in this scenario are facilitated by wireless sensors deployed to measure temperature, humidity, or levels of various gases in the building atmosphere.
In this case, information can be exchanged among all sensors on a floor, and their readings can be combined to form reliable measurements. The system components may then work together to lower the temperature, inject fresh air or open windows. Air conditioners can remove moisture from the air or increase the humidity. Sensors can also trace and react to
movements (e.g. by turning lights on or off). Fog devices could be assigned to each floor and could collaborate on a higher level of actuation. With Fog computing applied in this scenario, smart buildings can maintain their fabric and their external and internal environments to conserve energy, water and other resources.

IoT and cyber-physical systems (CPSs): Fog-computing-based systems are becoming an important class of IoT and CPSs. Built on traditional information carriers such as the Internet and telecommunication networks, the IoT is a network that can interconnect ordinary physical objects with identifiable addresses. CPSs feature a tight combination of a system's computational and physical elements, and coordinate the integration of computer- and information-centric physical and engineered systems. IoT and CPSs promise to transform our world with new relationships between computer-based control and communication systems, engineered systems and physical reality. Fog computing in this scenario builds on the concepts of embedded systems, in which software programs and computers are embedded in devices for reasons other than computation alone; examples include toys, cars, medical devices and machinery. The goal is to integrate the abstractions and precision of software and networking with the dynamics, uncertainty and noise of the physical environment. Using the emerging knowledge, principles and methods of CPSs, we will be able to develop new generations of intelligent medical devices and systems, 'smart' highways, buildings, factories, and agricultural and robotic systems.

Software-defined networking (SDN): SDN is an emergent computing and networking paradigm and has become one of the most popular topics in the IT industry. It separates the control and data communication layers: control is done at a centralized server, and nodes follow the communication paths decided by the server. The centralized server may need a distributed implementation.
The SDN concept has been studied in WLANs, wireless sensor networks and mesh networks, but those settings do not involve multi-hop wireless communication or multi-hop routing, and there is no communication between peers in this scenario. The SDN concept together with Fog computing can resolve the main issues in vehicular networks (intermittent connectivity, collisions and high packet loss rates) by augmenting vehicle-to-vehicle with vehicle-to-infrastructure communications and centralized control. The SDN concept for vehicular networks was first proposed in the literature.
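The decentralized smart-building scenario above can be illustrated with a minimal closed-loop sketch: redundant sensor readings on a floor are fused into one reliable measurement, and a local (fog-level) controller issues actuator commands without involving the cloud. The fusion rule, setpoints and command names are all hypothetical.

```python
# Hypothetical sketch of a fog-style closed-loop controller for a building
# floor: fuse redundant sensor readings locally, then actuate.

def fuse(readings):
    """Combine redundant sensor readings into one reliable measurement
    (here: a simple trimmed average that drops the two extremes)."""
    s = sorted(readings)
    trimmed = s[1:-1] if len(s) > 2 else s
    return sum(trimmed) / len(trimmed)

def control(temp_c, humidity_pct, setpoint_c=24.0, max_humidity=60.0):
    """Return the actuator commands for one control cycle."""
    actions = []
    if temp_c > setpoint_c:
        actions.append("cool")        # lower the temperature
    if humidity_pct > max_humidity:
        actions.append("dehumidify")  # remove moisture from the air
    return actions or ["idle"]

temps = [26.1, 25.9, 26.3, 41.0]   # one faulty sensor reads 41.0
print(control(fuse(temps), 72.0))
```

The trimmed average shows why combining readings matters: the faulty 41.0 reading is discarded rather than triggering unnecessary cooling of an already comfortable floor.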
5. SECURITY AND PRIVACY IN FOG COMPUTING
Security and privacy issues have not yet been studied in the context of fog computing; they have been studied in the context of smart grids and machine-to-machine communications. Security solutions exist for Cloud computing, but they may not suit Fog computing, because Fog devices work at the edge of networks, where their working surroundings expose them to many threats that do not exist in a well-managed Cloud. In this section, we discuss the security and privacy issues in Fog computing.

Security Issues: The main security issue is authentication at the different levels of gateways and, in the case of smart grids, at the smart meters installed in consumers' homes. Each smart meter and smart appliance has an IP address. A malicious user can tamper with its own smart meter, report false readings, or spoof IP addresses. There are some solutions for the authentication problem: public key infrastructure (PKI)-based solutions involving multicast authentication have been elaborated, and authentication techniques using Diffie-Hellman key exchange have also been discussed. Smart meters encrypt data and send it to a Fog device, such as a home-area network (HAN) gateway; the HAN decrypts the data, aggregates the results and passes them on. Intrusion detection techniques can also be applied in Fog computing [28]. Intrusion in smart grids can be detected using a signature-based method, in which observed patterns of behaviour are checked against an existing database of possible misbehaviours, or using an anomaly-based method, in which an observed behaviour is compared with the expected behaviour to check for deviations. One such approach develops an algorithm that monitors power flow results and detects anomalies in input values that could have been modified by attacks.
The algorithm detects intrusion by using principal component analysis to separate power flow variability into regular and irregular subspaces.
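A much-simplified stand-in for this detector can be sketched as follows. The cited algorithm separates regular and irregular subspaces with principal component analysis; here a plain z-score against a recent baseline plays that role purely for illustration, and the readings and threshold are invented.

```python
# Simplified stand-in for the PCA-based anomaly detector described above:
# flag power-flow readings whose deviation from a recent baseline exceeds
# k standard deviations (the real algorithm uses principal component
# analysis to separate regular from irregular variability).
from statistics import mean, stdev

def detect_anomalies(baseline, readings, k=3.0):
    mu, sigma = mean(baseline), stdev(baseline)
    return [r for r in readings if abs(r - mu) > k * sigma]

baseline = [100.2, 99.8, 100.5, 99.6, 100.1, 100.4, 99.9, 100.0]
print(detect_anomalies(baseline, [100.3, 137.0, 99.7]))  # only 137.0 is flagged
```

A single z-score cannot capture correlated, multivariate power-flow structure; that is precisely what the PCA-based subspace separation adds in the cited work.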
6. COMBINING USER BEHAVIOR PROFILING AND DECOY TECHNOLOGY
We posit that the combination of these two security features will provide unprecedented levels of security for the Cloud; no current Cloud security mechanism provides this level of security. We have applied these concepts to detect illegitimate access to data stored on a local file system by masqueraders, i.e. attackers who impersonate legitimate users after stealing their credentials. One may consider illegitimate access to Cloud data by a rogue insider as the malicious act of a masquerader. Our experimental results in a local file system setting show that combining both techniques can yield better detection results, and they suggest that this approach may work in a Cloud environment, as the Cloud is intended to be as transparent to the user as a local file system. In the following we briefly review some of the experimental results achieved by using this approach to detect masquerade activity in a local file setting.

6.1 User Behavior Profiling
Legitimate users of a computer system are familiar with the files on that system and where they are located, so any search for specific files is likely to be targeted and limited. A masquerader, however, who gains access to the victim's system illegitimately, is unlikely to be familiar with the structure and contents of the file system, so their search is likely to be widespread and untargeted. Based on this key assumption, we profiled user search behavior and developed user models trained with a one-class modeling technique, namely one-class support vector machines. The importance of one-class modeling stems from the ability to build a classifier without having to share data from different users; the privacy of the users and their data is therefore preserved. We monitor for abnormal search behaviors that exhibit deviations from the user's baseline.
According to our assumption, such deviations signal a potential masquerade attack. Our previous experiments validated this assumption and demonstrated that we could reliably detect all simulated masquerade attacks using this approach, with a very low false positive rate of 1.12%.

6.2 Decoy Technology
We placed traps within the file system. The traps are decoy files downloaded from a Fog computing site, an automated service that offers several types of decoy documents such as tax return forms, medical records, credit card statements, eBay receipts, etc. [10]. The decoy files are downloaded
by the legitimate user and placed in highly conspicuous locations that are not likely to interfere with normal user activity on the system. A masquerader, who is not familiar with the file system and its contents, is likely to access these decoy files when searching for sensitive information, such as the bait information embedded in them. Therefore, monitoring access to the decoy files should signal masquerade activity on the system. The decoy documents carry a keyed-Hash Message Authentication Code (HMAC) hidden in the header section of the document. The HMAC is computed over the file's contents using a key unique to each user. When a decoy document is loaded into memory, we verify whether it is a decoy by computing an HMAC over all of its contents and comparing it with the HMAC embedded within the document. If the two HMACs match, the document is deemed a decoy and an alert is issued.

6.3 Combining the Two Techniques
The correlation of search behavior anomaly detection with trap-based decoy files should provide stronger evidence of malfeasance, and therefore improve a detector's accuracy. We hypothesize that detecting abnormal search operations performed prior to an unsuspecting user opening a decoy file will corroborate the suspicion that the user is indeed impersonating another victim user. This scenario covers the threat model of illegitimate access to Cloud data. Furthermore, an accidental opening of a decoy file by a legitimate user might be recognized as an accident if the search behavior is not deemed abnormal. In other words, detecting abnormal search together with decoy traps may make a very effective masquerade detection system, and combining the two techniques improves detection accuracy. We use decoys as an oracle for validating the alerts issued by the sensor monitoring the user's file search and access behavior.
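The HMAC-based decoy check can be sketched with Python's standard library. The header format (`X-DECOY:` on the first line) is a hypothetical stand-in for the hidden header described above; real decoy documents embed the tag invisibly inside the document format.

```python
# Sketch of the HMAC-based decoy check described above (header layout
# is a hypothetical simplification of the hidden document header).
import hashlib
import hmac

def make_decoy(contents: bytes, user_key: bytes) -> bytes:
    """Embed an HMAC over the contents in a (hypothetical) header line."""
    tag = hmac.new(user_key, contents, hashlib.sha256).hexdigest().encode()
    return b"X-DECOY: " + tag + b"\n" + contents

def is_decoy(document: bytes, user_key: bytes) -> bool:
    """Recompute the HMAC over the body and compare with the header tag."""
    header, _, body = document.partition(b"\n")
    if not header.startswith(b"X-DECOY: "):
        return False
    expected = hmac.new(user_key, body, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(header[len(b"X-DECOY: "):], expected)

key = b"per-user-secret"
doc = make_decoy(b"Fake 2023 tax return ...", key)
print(is_decoy(doc, key))                    # True  -> raise an alert
print(is_decoy(b"ordinary user file", key))  # False -> ordinary document
```

Keying the HMAC per user means an attacker who steals files cannot tell decoys from real documents, while the monitoring system, which holds the key, can verify a loaded document in constant time.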
In our experiments, we did not generate the decoys on demand at the time an alert was issued. Instead, we ensured the decoys were conspicuous enough for an attacker to access them if they were indeed trying to steal information, by placing them in highly conspicuous directories and giving them enticing names. With this approach we were able to improve the accuracy of our detector; crafting the decoys on demand should improve it even further. Combining the two techniques, with the decoy documents acting as an oracle for our detector when abnormal user behavior is detected, may lower the detector's overall false positive rate. We trained eighteen classifiers with computer usage data from 18 computer science students, collected over a period of 4 days on average. The classifiers were trained using the search behavior anomaly detection described in a prior paper. We also trained another 18 classifiers using a detection approach that combines user behavior profiling with monitoring access to decoy files placed in the local file system, as described above. We tested these classifiers using simulated masquerader data. Figure 1 displays the AUC scores achieved by both detection approaches for each user model. The results show that the models using the combined detection approach achieve equal or better results than the search profiling approach alone.
7. FOG COMPUTING ARCHITECTURE
The Fog computing system works against attackers, especially malicious insiders. Insider attacks can be performed by malicious employees at the provider's or user's site: a malicious insider can access the confidential data of cloud users and can easily obtain passwords, cryptographic keys and files. The threat of malicious insider attacks has increased due to the lack of transparency in cloud providers' processes and procedures; a provider may not reveal how employees are granted access, how this access is monitored, or how reports and policy compliance are analyzed.

Fig 7.1: Fog Computing Architecture

The figure above shows the working of fog computing. Login to the system is done in two ways: admin login and user login. When the admin logs in, there are two steps to follow: step 1, enter the username; step 2, enter the password. After a successful login the admin can perform all admin-related tasks, but when downloading any file from the fog he has to answer a security question; if he answers it correctly,
then the original file can be downloaded. When the admin or a user answers the security question incorrectly, a decoy (fake) document is provided instead. The decoy technology works in the following manner: if a document contains a word, say "MADAM", and the letter M is replaced by A, the word becomes "AADAA", which has no meaning. Even if the attacker learns that M was replaced by A and applies reverse engineering, he gets "MMDMM"; in either case he cannot judge the content of the document. When a user logs in, he follows the same procedure as the admin. Operations such as uploading files, downloading files, viewing alerts, and sending, reading and broadcasting messages can all be performed by the user. The ALERT stream provides detailed knowledge of attacks on the user's personal files, with details such as the date, time and number of times the attacker tried to hack each file. A useful feature of this fog system is that after each successful login the user receives an SMS reading 'login successful', which alerts the user when someone else tries to gain access to his or her personal fog account. When an attacker tries to download files, the user also receives an SMS containing the attacker's IP address, server name, date and time, which makes it easier to catch the attacker by tracing these details. In this way fog computing is more secure than traditional cloud computing.
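The character-substitution decoy described above is easy to demonstrate. This toy sketch replaces M with A as in the text; it is only an illustration of why the original text cannot be recovered, since a real decoy system would use richer transformations.

```python
# Toy illustration of the substitution decoy from the text: replacing
# 'M' with 'A' turns "MADAM" into "AADAA", and an attacker's naive
# reversal ('A' back to 'M') yields "MMDMM" -- neither reveals the original.

def make_decoy_text(text, src="M", dst="A"):
    return text.replace(src, dst)

def naive_reverse(text, src="M", dst="A"):
    # An attacker who guesses the rule cannot tell which letters were
    # originally dst, so every dst flips back to src.
    return text.replace(dst, src)

print(make_decoy_text("MADAM"))                 # AADAA
print(naive_reverse(make_decoy_text("MADAM")))  # MMDMM
```

The substitution is lossy: once two distinct letters are merged into one, no deterministic reversal can separate them again, which is exactly what keeps the decoy's content unjudgeable.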
8. ADVANTAGES AND DISADVANTAGES
ADVANTAGES
The advantages of placing decoys in a file system are threefold:
  • The detection of masquerade activity.
  • The confusion of the attacker, and the additional costs incurred in distinguishing real from bogus information.
  • The deterrence effect which, although hard to measure, plays a significant role in preventing masquerade activity by risk-averse attackers.
DISADVANTAGES
  • The attacker is not identified when an attack happens.
  • It is complex to detect which user performed the attack.
  • It cannot be determined which file was hacked.
10. CONCLUSION
With the increase in data theft attacks, the security of user data is becoming a serious issue for cloud service providers. Fog Computing is a paradigm that helps by monitoring the behavior of the user and providing security for the user's data. Earlier systems were developed with only an email provision, but we have also implemented an SMS alert technique. In Fog Computing we present a new approach to the problem of insider data theft attacks in the cloud, using dynamically generated decoy files, which also saves the storage required to maintain decoy files in the cloud. Using the decoy technique in the Fog can thus minimize insider attacks in the cloud and could provide unprecedented levels of security in the Cloud and in social networks.
11. FUTURE SCOPE AND ENHANCEMENTS
The security system as we have explained it is applicable only to a single-cloud ownership system; if the cloud owner operates more than one cloud, our security system will not be applicable. In future enhancements we can therefore extend the existing application to manage a cloud environment with a multi-cloud architecture. Cloud computing is the future for organizations: its considerable benefits will eventually lead organizations to move their processes and data to the Cloud entirely, and much effort will be put into provisioning the appropriate security to do business in cloud environments. Although virtualization is already established, virtualization in the Cloud is still an immature area. Future work should aim to harden the security of virtualization in multi-tenant environments. Possible lines of research are the development of reliable and efficient virtual network security mechanisms to monitor the communications between virtual machines on the same physical host. To achieve secure virtualized environments, isolation between the different tenants is needed; future research should aim to provide new architectures and techniques to harden the resources shared between tenants. The hypervisor is the most critical component of virtualized environments: if it is compromised, the host and guest OSs could potentially be compromised too. Hypervisor architectures that aim to minimize code while maintaining functionality are an interesting line of future research for securing virtualized environments and the Cloud, especially to prevent future hypervisor rootkits.
12. REFERENCES
  • Cloud Security Alliance, "Top Threats to Cloud Computing V1.0," March 2010. [Online]. Available: https://cloudsecurityalliance.org/topthreats/csathreats.v1.0.pdf
  • S. Muqtyar Ahmed, P. Namratha, and C. Nagesh, "Prevention of Malicious Insider in the Cloud Using Decoy Documents."
  • G. Booth, A. Soknacki, and A. Somayaji, "Cloud Security: Attacks and Current Defenses."
  • A. Singh and M. Shrivastava, "Overview of Attacks on Cloud Computing."
  • D. Jamil and H. Zaki, "Security Issues in Cloud Computing and Countermeasures," International Journal of Engineering Science and Technology, vol. 3, no. 4, pp. 2672-2676, April 2011.
  • K. Zunnurhain and S. Vrbsky, "Security Attacks and Solutions in Clouds," 2nd IEEE International Conference on Cloud Computing Technology and Science, Indianapolis, December 2010.
  • W. A. Jansen, "Cloud Hooks: Security and Privacy Issues in Cloud Computing," 44th Hawaii International Conference on System Sciences, pp. 1-10, Koloa, Hawaii, January 2011.
  • F. Bonomi, "Connected vehicles, the internet of things, and fog computing," in The Eighth ACM International Workshop on Vehicular Inter-Networking (VANET), Las Vegas, USA, 2011.
  • http://cnc.ucr.edu/security/glossary
  • http://technet.microsoft.com/enus/library/cc959354.aspx
  • Cisco Cloud Computing: Data Center Strategy, Architecture, and Solutions. http://www.cisco.com/web/strategy/docs/gov/CiscoCloudComputing_WP.pdf
  • Fog Computing: Mitigating Insider Data Theft Attacks in the Cloud. [Online]. Available: http://ids.cs.columbia.edu/sites/default/files/Fog_Comuting_Position_Paper_WRIT_2012.pdf
  • M. Van Dijk and A. Juels, "On the impossibility of cryptography alone for privacy-preserving cloud computing," in Proceedings of the 5th USENIX Conference on Hot Topics in Security (HotSec'10), Berkeley, CA, USA: USENIX Association, 2010, pp. 1-8.
  • J. A. Iglesias, P. Angelov, A. Ledezma, and A. Sanchis, "Creating evolving user behavior profiles automatically," IEEE Trans. on Knowl. and Data Eng., vol. 24, no. 5, pp. 854-867, May 2012.
  • F. Rocha and M. Correia, "Lucy in the sky without diamonds: Stealing confidential data in the cloud," in Proceedings of the 2011 IEEE/IFIP 41st International Conference on Dependable Systems and Networks Workshops (DSNW '11), Washington, DC, USA: IEEE Computer Society, 2011, pp. 129-134.