A Log-based Approach to Make Digital Forensics Easier on Cloud Computing

Cloud computing has recently attracted considerable attention from the information and communication technologies industry. Almost all the leading companies in the field have shown interest in cloud computing and released cloud services in succession. But for cloud computing to go further, more effort must be devoted to security issues, especially as the Internet environment has become increasingly insecure. With the popularization of computers and intelligent devices, the number of crimes involving them has increased rapidly in recent decades, and this growth will be even faster in the cloud computing environment. No wall is impregnable. We should strengthen cloud computing not only in terms of precaution, but also in terms of handling security events, so as to defend it against criminal activities. In this paper, I propose an approach that uses a log model to build a forensic-friendly system. Using this model, we can quickly gather information from the cloud for certain kinds of forensic purposes, which decreases the complexity of those forensic tasks.
The identification of evidence in the cloud computing environment can be very complex. The deployment model (public cloud, private cloud, or hybrid cloud) deeply affects the forensic procedure. If the evidence resides within a public cloud, it is much more difficult to identify. There are also distinct computer forensic challenges related to the different service models, IaaS, PaaS, and SaaS; each presents subtly different challenges to the forensic investigator. When trying to carry out the forensic procedure in the cloud, we face great obstruction at the very beginning: we cannot seize the hardware containing or processing the target applications, as it can be anywhere in the world, or may not even be real hardware at all, as in the case of a virtual machine. In existing systems, the dynamic scaling of resources up and down further raises the possibility of losing information.
Therefore we should keep another log locally and synchronously, so that we can use it to check the activities in the cloud without the help of the CSP. The content recorded in the log files (which can be plain files or a database) should be decided by the CSP, not by the agent itself; that is, the log files should be written by a module created by the CSP. This ensures that the log files stored locally and in the cloud are comparable. The local log module records that information in the local log. Then, by comparing the local log with the log files maintained in the cloud, we can easily identify fake users.
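The comparison step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the record fields (`user`, `action`, `ts`) and the function names are hypothetical, and it assumes both logs store records in a comparable structured form, as the text requires. Each record is reduced to a canonical digest, and any record that appears in only one of the two logs is flagged as a discrepancy.

```python
import hashlib
import json


def entry_digest(entry: dict) -> str:
    """Canonical digest of one log record, so records can be compared exactly."""
    canonical = json.dumps(entry, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def find_discrepancies(local_log: list, cloud_log: list) -> dict:
    """Compare the locally kept log with the CSP-maintained cloud log.

    Returns records present only locally (possibly dropped in the cloud)
    and records present only in the cloud (possibly injected by a fake user).
    """
    local_digests = {entry_digest(e) for e in local_log}
    cloud_digests = {entry_digest(e) for e in cloud_log}
    return {
        "missing_in_cloud": [e for e in local_log
                             if entry_digest(e) not in cloud_digests],
        "unknown_in_local": [e for e in cloud_log
                             if entry_digest(e) not in local_digests],
    }


# Example: a record appearing only in the cloud log points to a fake user.
local = [{"user": "alice", "action": "login", "ts": 1}]
cloud = local + [{"user": "mallory", "action": "login", "ts": 2}]
report = find_discrepancies(local, cloud)
print(report["unknown_in_local"])  # the suspicious cloud-only record
```

Hashing a canonical JSON serialization is one simple way to make records order-independent and field-order-independent; in practice the CSP-defined log module would fix the exact record schema so both sides serialize identically.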