IJARCCE adheres to the suggestive parameters outlined by the University Grants Commission (UGC) for peer-reviewed journals, upholding high standards of research quality, ethical publishing, and academic excellence.
Ensuring Security of Cloud Backups After Data Modification
N. P. Ponnuviji, M. Vigilson Prem
DOI: 10.17148/IJARCCE.2018.71202
Abstract:
Cloud storage is a cloud computing model in which data is stored on remote servers that can be accessed over the internet. Cloud storage is operated and maintained entirely by Cloud Storage Service Providers (CSSPs), who store huge volumes of data on storage servers built with several virtualization techniques. The terms 'Cloud Storage' and 'Cloud Backup' are often confused: a cloud storage server only stores files in the cloud and maintains the data until it is retrieved again for future use. As the size of the data grows, so does its complexity, along with the expectation that the data remain constantly available without interruption. In such cases, more sophisticated backup strategies are needed that use cloud services to deliver consistent backup solutions to users and organizations, as well as quick disaster-recovery solutions. In this paper, we discuss the secure storage of data and analyse possible solutions for the secure storage of backup data in the cloud.
Keywords:
Cloud Storage Service Providers (CSSPs), Backup as a Service (BaaS), Third-Party Auditor, Cloud storage, Cloud backup, key update, verifiability, Remote server
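The verifiability and key-update ideas in the keywords can be illustrated with a minimal Python sketch: the data owner keeps an HMAC tag per backup block and refreshes the tag whenever the block is modified, so an auditor can later check integrity. This is a toy illustration, not the authors' actual scheme; block names and payloads are invented.

```python
import hashlib
import hmac
import os

KEY = os.urandom(32)   # owner's secret key (illustrative)

def tag(block: bytes) -> bytes:
    """HMAC-SHA256 integrity tag over one backup block."""
    return hmac.new(KEY, block, hashlib.sha256).digest()

def verify(block: bytes, t: bytes) -> bool:
    """Constant-time check that a block still matches its stored tag."""
    return hmac.compare_digest(tag(block), t)

blocks = {"b1": b"payroll v1"}
tags = {k: tag(v) for k, v in blocks.items()}

blocks["b1"] = b"payroll v2"          # data modification
stale = verify(blocks["b1"], tags["b1"])   # old tag no longer matches
tags["b1"] = tag(blocks["b1"])        # tag update after modification
fresh = verify(blocks["b1"], tags["b1"])
```

A real scheme would use homomorphic authenticators so a third-party auditor can verify without the secret key; the HMAC here only sketches the tag-update workflow after modification.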
IoT – An Exquisitely Adequate Technology to Reinforce the “City of Future - Smart City”
Monika Kohli, Rohit Tiwari
DOI: 10.17148/IJARCCE.2018.71203
Abstract:
IoT (Internet of Things) is considered the backbone of the divergent electronic devices with which Smart Cities are furnished, and such cities are becoming smarter than ever thanks to the flourishing growth of state-of-the-art automation and metering technologies. This paper provides a comprehensive review of the Smart City concept, its impetus, and its applications. Furthermore, it characterizes the technologies involved in building the IoT infrastructure for a Smart City, together with its predominant characteristics and elements. Finally, it discusses real-life deployments around the world, as well as the major challenges that IoT must confront now and in the future.
Keywords:
IoT (Internet of Things), Smart City, Smart Home, Sensor Network, Security, Reliability
Internet of Things Based Low-Cost Real-Time Home Automation
Akanksha, Priti Mishra
DOI: 10.17148/IJARCCE.2018.71204
Abstract:
Internet of Things (IoT) conceptualizes the idea of remotely connecting and monitoring real-world objects (things) through the Internet [1]. When it comes to our house, this concept can be aptly applied to make it smarter, safer and automated. This IoT project focuses on building a smart wireless home security system which sends alerts to the owner over Bluetooth in case of any trespass and optionally raises an alarm. The same set of sensors can also be used for home automation. This project presents an Internet of Things based real-time home automation and security system using an Arduino UNO and an HC-05 Bluetooth module, which makes the system cost-effective and portable.
Mobile Application for Detecting Vehicle Accident and Tracking System Using GPS/GSM
Mr. Shubham Ingle, Miss. Ankita Shendkar, Mr. Sanjay Chavan, Prof. Avinash Palave
DOI: 10.17148/IJARCCE.2018.71205
Abstract: Accidents are a major threat to people's lives, causing severe injury or death. Automotive companies have made a lot of progress in reducing this threat, but the probability of serious harm from an accident has still not been eliminated. Speeding is one of the fundamental causes of accidents; external pressure on the vehicle body and a sudden change of angle with the road surface are also responsible. The sooner the emergency services learn about an accident, the more its effects can be reduced. For this purpose, we developed an Android application that detects an accident and sends an emergency alert message to the nearest police station and health care centre. The application is integrated with an external pressure sensor to measure the force on the vehicle body, and it calculates speed and change of tilt angle using the Android phone's GPS and accelerometer sensors respectively. By checking these conditions together, the application is also capable of reducing the rate of false alarms.
Keywords:
Detecting Vehicle Accident, Sensors, GPS module, GSM module, Intelligent Transport System (ITS)
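The detection conditions described in the abstract (tilt change from the accelerometer, speed drop from GPS, impact from the pressure sensor, combined to suppress false alarms) can be sketched in Python. The thresholds and sensor field names below are hypothetical, not the authors' values.

```python
import math

# Illustrative thresholds (assumed values, not from the paper).
TILT_CHANGE_DEG = 45.0      # sudden change in tilt angle
SPEED_DROP_KMH = 40.0       # abrupt speed drop between GPS samples
PRESSURE_THRESHOLD = 300.0  # reading from the external pressure sensor

def tilt_deg(ax, ay, az):
    """Tilt angle of the phone from accelerometer axes, in degrees."""
    return math.degrees(math.atan2(math.sqrt(ax**2 + ay**2), az))

def is_accident(prev, curr):
    """prev/curr: dicts with accelerometer axes, GPS speed (km/h), pressure."""
    tilt_jump = abs(tilt_deg(*curr["accel"]) - tilt_deg(*prev["accel"]))
    speed_drop = prev["speed"] - curr["speed"]
    impact = curr["pressure"] > PRESSURE_THRESHOLD
    # Require at least two of the three conditions to reduce false alarms.
    votes = [tilt_jump > TILT_CHANGE_DEG, speed_drop > SPEED_DROP_KMH, impact]
    return sum(votes) >= 2

prev = {"accel": (0.1, 0.2, 9.8), "speed": 60.0, "pressure": 20.0}
curr = {"accel": (9.5, 0.3, 1.0), "speed": 5.0, "pressure": 450.0}
crashed = is_accident(prev, curr)
```

Requiring two of three conditions is one simple way to implement the paper's false-alarm reduction idea; the actual application may combine the signals differently.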
An Evaluation Approach for Image Copy Detection Based on Verification of Scale invariant Feature Transform (SIFT) Matches
Sushma.B, Nagaveni.B.Biradar
DOI: 10.17148/IJARCCE.2018.71206
Abstract:
Copying a region of an image and pasting it into another location of the same image is a common way to manipulate image content, and it is difficult to detect illegal copies of copyrighted images. Existing systems match descriptor pairs, mainly by using visual words for image matching, which makes it hard to distinguish between images; this Bag-of-Words (BOW) quantization technique cannot solve the problem well. To address this, we verify Scale-Invariant Feature Transform (SIFT) matches between images, building on BOW quantization and using global context regions to filter out false matches. This gives comparatively richer matching performance between the two images.
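A standard way to verify SIFT matches, which the kind of filtering discussed above builds on, is Lowe's ratio test: keep a correspondence only when its nearest descriptor is clearly closer than the second nearest. A minimal NumPy sketch (the descriptor values below are toy data, not real SIFT output):

```python
import numpy as np

def ratio_test_matches(desc_a, desc_b, ratio=0.7):
    """Match descriptors from image A to image B with Lowe's ratio test.
    desc_a: (n, d) array, desc_b: (m, d) array with m >= 2."""
    # Pairwise Euclidean distances between all descriptor pairs.
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    nearest, second = order[:, 0], order[:, 1]
    rows = np.arange(len(desc_a))
    keep = d[rows, nearest] < ratio * d[rows, second]
    return [(i, nearest[i]) for i in np.where(keep)[0]]

desc_a = np.array([[0.0, 0.0], [5.0, 5.0]])
desc_b = np.array([[0.0, 0.1], [3.0, 3.0], [5.0, 5.1]])
matches = ratio_test_matches(desc_a, desc_b)
```

Real SIFT descriptors are 128-dimensional; the 2-D vectors here only keep the example readable. Global-context verification, as in the paper, would further check spatial consistency around each surviving match.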
Collaborative Filtering Using a Regression - Based Approach and Classification-Based Approach
Rutansh Trivedi
DOI: 10.17148/IJARCCE.2018.71207
Abstract:
The world is connected through the power of social media platforms. People want to know what others are doing and, accordingly, adapt themselves to the challenges of a new world. People want to acquire new skills that match their career path. Skill seeking has been a tricky, tedious and time-consuming task, because people looking for a new opportunity have had to collect information from many different sources, so a system of this type is required. In this report, a skill recommendation system based on career path is proposed in order to automate and simplify this task while increasing its effectiveness. However, current approaches rely on scarce manually collected data that often do not completely reveal people's skills. Our work aims to find relationships between jobs and people's skills.
Keywords:
Deep Neural Network, Collaborative Filtering, Linear Regression, Mean Normalization, Gradient Descent, Regularization
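The regression-based collaborative filtering named in the keywords (mean normalization, gradient descent, regularization) can be sketched as low-rank matrix factorization in NumPy. The rating matrix below is toy data standing in for a user-skill matrix; hyperparameters are illustrative.

```python
import numpy as np

def factorize(R, mask, k=2, lr=0.01, reg=0.1, steps=2000, seed=0):
    """Fit R ≈ U @ V.T + mu by gradient descent on observed entries only
    (mask == 1), with L2 regularization and mean normalization."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    mu = R[mask == 1].mean()              # mean normalization
    U = rng.normal(scale=0.1, size=(n, k))
    V = rng.normal(scale=0.1, size=(m, k))
    for _ in range(steps):
        E = mask * (U @ V.T + mu - R)     # error on observed entries only
        # Simultaneous regularized gradient step on both factor matrices.
        U, V = U - lr * (E @ V + reg * U), V - lr * (E.T @ U + reg * V)
    return U @ V.T + mu

# Toy user-skill rating matrix; mask == 0 marks unknown entries to predict.
R = np.array([[5.0, 4.0, 0.0], [4.0, 5.0, 1.0], [1.0, 1.0, 5.0]])
mask = np.array([[1, 1, 0], [1, 1, 1], [1, 1, 1]])
pred = factorize(R, mask)
```

The entry `pred[0, 2]` is the filled-in recommendation score for the unobserved cell; in a classification-based variant the same factors would feed a classifier instead of a regression target.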
Voice Controlled Sensor Network Measuring Human Physiological Parameters
Amulya KN, Shridhar J, Ankitha S
DOI: 10.17148/IJARCCE.2018.71208
Abstract:
A human body is an aggregate of various systems such as circulatory system, digestive system, nervous system and so on. Each system contains several physiological parameters that can be measured using 21st century sensors. With time these sensors have become more sophisticated, robust and compatible with existing measuring technologies and controllers. In this paper we interconnect various physiological bio sensors forming a sensor network. The parameters can be wirelessly monitored on a smartphone. In this paper we have put in efforts to control these networks using VOICE commands processed as strings of data.
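The idea of controlling the sensor network with voice commands processed as strings of data can be sketched as a simple command dispatcher. The command vocabulary and sensor readings below are illustrative assumptions, not the paper's actual set.

```python
# Hypothetical latest readings from the physiological sensor network.
READINGS = {"temperature": 36.6, "pulse": 72, "spo2": 98}

def handle(command: str) -> str:
    """Map a recognized voice command (already converted to a string)
    to the matching sensor reading, or report it as unknown."""
    words = command.lower().split()
    for name in READINGS:
        if name in words:
            return f"{name}: {READINGS[name]}"
    return "unknown command"

reply = handle("read pulse")
```

In the described system, the string would arrive from a speech-recognition front end and the reply would be shown on the smartphone monitoring the network.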
In the 21st century people are busy with their rushed lives, so they do not have time for their own safety and security; that is why we address this home-security issue. In the kitchen, using an LPG cylinder raises several problems, such as the time required to book a new cylinder and the risk of gas leakage. Our project helps monitor and detect gas leakage using a sensor: as soon as a leak is detected, an alert message is sent to the remote user, the gas supply is cut off by automatically switching off the regulator, and the exhaust fan is immediately turned on. This guards against leakage-related accidents such as dyspnoea and explosion. The system has an additional purpose: it continuously monitors the weight of the LPG cylinder using a weighing sensor and informs the user how much gas is left, which also helps the purchaser know whether the gas distributor has cheated them by providing less LPG. The main advantage is that a new LPG cylinder is booked automatically by sending an SMS to the distributor company, and the user is alerted, with the help of GSM technology. The system also helps physically challenged people by making them independent and keeping them safe from kitchen hazards.
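The monitoring logic described above (leak alert plus shutoff, gas-left estimate from weight, automatic refill booking) can be sketched in Python. All thresholds, the empty-cylinder tare weight, and the action names are hypothetical values for illustration.

```python
# Hypothetical thresholds; real values depend on the gas sensor and
# load cell used in the actual system.
GAS_PPM_LIMIT = 400        # leak alert level
EMPTY_WEIGHT_KG = 15.0     # tare weight of an empty cylinder
FULL_GAS_KG = 14.2         # net LPG weight of a full cylinder
BOOKING_LEVEL = 0.10       # auto-book when under 10% of gas remains

def check(gas_ppm, total_weight_kg):
    """Return the actions to trigger and the fraction of gas left."""
    actions = []
    if gas_ppm > GAS_PPM_LIMIT:
        # On a leak: SMS alert, close the regulator, start the exhaust fan.
        actions += ["send_sms_alert", "close_regulator", "exhaust_fan_on"]
    gas_left = max(0.0, total_weight_kg - EMPTY_WEIGHT_KG) / FULL_GAS_KG
    if gas_left < BOOKING_LEVEL:
        actions.append("book_refill_sms")   # SMS to the distributor via GSM
    return actions, round(gas_left, 2)

alerts, gas_left = check(gas_ppm=500, total_weight_kg=16.0)
```

In the deployed system each action name would map to a GSM message or a relay output; here they are returned as strings so the decision logic stays testable.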
Detecting and Analyzing the Impact of Weather Condition for Urban Region Traffic Management
Rommel Emmanuel
DOI: 10.17148/IJARCCE.2018.71210
Abstract:
With the evolution of technology in today's world, traffic in cities has become more complex and complicated than ever before. Conventional traffic regulation methods are not adequate considering future expansion and other needs, so it is of utmost importance to address urban challenges such as traffic regulation and the avoidance of accidents and other chaos. In this work, our aim is to analyze and solve the problems associated with present urban traffic management to make it much simpler, more efficient, and at the same time future-ready. With the advent of big data, the places heavily affected by high traffic, frequent accidents, chaos, and environmental weather events such as floods, tsunamis and hurricanes can be predicted in real time. Hence the proposed approach can provide an efficient and convenient result for daily commuters, delivering updated traffic information while reducing the regular problems associated with urban traffic.
Abstract: In this paper we present a big data infrastructure for handling the large amounts of data processed in the investment industry. Various big data tools can be used for this purpose, but in this case we use Hadoop, a framework developed in the Java programming language. Hadoop applies concepts from parallel and distributed computing to speed up computation, allowing programs to execute much faster on a few ordinary computers; this improves affordability and efficiency. It uses its own file system, the Hadoop Distributed File System (HDFS), which is known for its security and risk control. Since Hadoop runs on a cluster, there is no need for a supercomputer to process data quickly; it is the most widely used big data processing engine, with a simple master-slave setup. One of the most common places where big data is used is the share market industry, for various reasons, the most common being to increase profits by understanding previous data. Such analytics and understanding are only possible if the large amounts of data are handled properly. Suggesting shares to users is one of the main concepts of this paper; however, rather than focusing on the analytical part of the framework, our main aim is to make it easier for the admin to use big data so that large data sets become easier to process. This application can have many advantages in algorithmic trading, a type of trading where different algorithms are used for buying and selling shares, and it also has use cases in stock brokering firms that process large amounts of data quickly. The main contribution of this paper is to integrate cloud computing with the Hadoop framework.
The cloud computing component of this project is a web-based application directly connected to the Hadoop system. A parallel pipeline is developed so that the admin can handle data easily, and we have also developed a communication protocol on top of TCP/IP for the pipeline. The share market industry involves huge amounts of data; exabytes are processed every minute for various purposes. Investors usually go through all the data involved in their research and try to analyse it, taking various factors and attributes into account. Analysing data at this scale requires a very well built platform to support it, so we have also built a web-based platform that helps users analyse the data in a much simpler, graphical notation. The attributes include the investor's financial transaction patterns, market conditions and sentiment, macroeconomic variables, scheme-level features, and demographic factors. Predicting redemption behavior requires a sophisticated platform that can capture the multiple factors that affect it. Big data infrastructure provides various use cases, and with tools like Hadoop and Spark it becomes even easier to find use cases at macro levels. This platform can investigate these factors on near real-time data and provide highly accurate investor-level predictions of future redemptions. Our results show that by combining cloud computing, big data analytics and sophisticated algorithmic trading, data-driven results can be used to generate reasonable profits while the data is processed in a much simpler and faster way.
Diabetes Mellitus Prediction in Big Data-Using Hadoop / Map Reduce Frame work (Survey)
K.S Praveenkumar, Dr. R Gunasundari
DOI: 10.17148/IJARCCE.2018.71212
Abstract:
Diabetes Mellitus prediction is a growing research area in healthcare. A number of data mining methods have been applied to evaluate the main causes of diabetes, but only small sets of clinical risk factors are usually considered, so the results generated by such methods may not represent diabetes accurately. Many factors have to be analysed, such as heredity and genetic factors, stress, body mass index, increased cholesterol level, high-carbohydrate diet, nutritional deficiency, nature of exercise, tension and worries, high blood pressure, insulin deficiency, and insulin resistance. We then evaluate and compare the system using suitable rules and a MapReduce algorithm. The performance of the system is assessed in terms of parameters such as the rules used, classification accuracy, and classification error; by considering all these parameters, the system can predict diabetes with great accuracy. This paper also surveys the different techniques and tools available in Big Data for predicting Diabetes Mellitus. Big Data can significantly advance diabetes research and ultimately improve the quality of health care for diabetic patients.
Keywords:
Diabetes Mellitus, Big Data, Hadoop/MapReduce, C4.5 algorithm
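The C4.5 algorithm named in the keywords chooses the risk factor to split on by gain ratio (information gain divided by split information). A minimal Python sketch with a toy risk-factor table (the records below are invented, not clinical data):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, labels, attr):
    """C4.5 gain ratio of splitting on attribute index `attr`."""
    n = len(rows)
    groups = {}
    for r, y in zip(rows, labels):
        groups.setdefault(r[attr], []).append(y)
    split_entropy = sum(len(g) / n * entropy(g) for g in groups.values())
    gain = entropy(labels) - split_entropy          # information gain
    split_info = entropy([r[attr] for r in rows])   # penalizes many-valued splits
    return gain / split_info if split_info else 0.0

# Toy records: one attribute (BMI level) and a diabetic yes/no label.
rows = [("high",), ("high",), ("low",), ("low",)]
labels = ["yes", "yes", "no", "no"]
gr = gain_ratio(rows, labels, 0)
```

In a MapReduce setting, the per-attribute counts that feed these entropy terms are what the map and reduce phases would aggregate across data partitions.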
Mr.Shubham Shinde, Mr.Sanket Kathwate, Mr.Rohit Chamle, Prof. Nikhita Nerkar
DOI: 10.17148/IJARCCE.2018.71213
Abstract: Managing attendance and generating question papers digitally is an important research topic in the computer industry. There are automated systems which record the attendance of students and generate question papers for exams. In this paper, we propose a Digital Attendance System which records the attendance of students and generates question papers. The proposed system helps avoid proxy attendance, reduces the human effort of paperwork and of maintaining attendance registers, and generates reports as per the needs of academics. The question papers generated by the system help give students a good learning experience and achieve good results.
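Question paper generation as described above can be sketched as randomly drawing questions from a bank until a marks target is met. The question bank, its fields, and the retry strategy below are assumptions for illustration, not the proposed system's actual design.

```python
import random

# Illustrative question bank; the schema is a hypothetical example.
BANK = [
    {"q": "Define OS", "marks": 2}, {"q": "Explain paging", "marks": 5},
    {"q": "What is a deadlock?", "marks": 2}, {"q": "Compare TCP/UDP", "marks": 5},
    {"q": "Describe RAID levels", "marks": 10}, {"q": "Explain DNS", "marks": 10},
]

def generate_paper(total_marks, seed=None):
    """Greedily pick questions from a shuffled bank; retry with a new
    shuffle until the marks total is hit exactly."""
    rng = random.Random(seed)
    for _ in range(1000):
        pool, picked, left = rng.sample(BANK, len(BANK)), [], total_marks
        for q in pool:
            if q["marks"] <= left:
                picked.append(q)
                left -= q["marks"]
            if left == 0:
                return picked
    return None   # no combination found within the retry budget

paper = generate_paper(12, seed=1)
```

A production generator would also balance topics and difficulty levels; this sketch only shows the marks-constraint part of the selection.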
Abstract: Cloud computing offers IT services to users, providing a cheaper, faster, more flexible and more efficient environment with multitudinous benefits for both service provider and customer. Security is the major issue hindering the growth of the cloud computing service model: handing confidential data to a third party is risky, so consumers need to be attentive to the risks of data breaches in this new environment. Thanks to various advancements many companies are migrating to the cloud, yet securing cloud computing remains a challenge. For increased security, the recommended approach is to combine two or more methods; here, DNA sequences are used together with Morse code and a zigzag pattern as the encoding scheme. The use of Morse code and the zigzag pattern makes it much harder for an attacker to steal the original data. Furthermore, the proposed scheme is implemented and the accuracy of encryption and decryption of data is verified.
Keywords: Morse code, DNA sequences, Cloud Computing, Morse Pattern, Zigzag Pattern, Data Block Security, Encryption, Decryption, Key Rotation.
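The layered encoding the abstract describes can be illustrated with a toy Python pipeline: plaintext to Morse code, Morse symbols to DNA bases, then a two-rail zigzag (rail fence) reordering. The symbol tables below are illustrative assumptions, not the authors' actual mappings, and the Morse table is deliberately partial.

```python
# Partial Morse table and an assumed dot/dash/separator -> base mapping.
MORSE = {"A": ".-", "B": "-...", "C": "-.-.", "S": "...", "O": "---"}
DNA = {".": "A", "-": "T", " ": "C"}

def encode(text):
    morse = " ".join(MORSE[ch] for ch in text.upper())
    strand = "".join(DNA[sym] for sym in morse)
    # Zigzag (rail fence, 2 rails): even positions first, then odd.
    return strand[::2] + strand[1::2]

def decode(cipher):
    half = (len(cipher) + 1) // 2
    # Re-interleave the two rails, padding the short rail for zip.
    strand = "".join(sum(zip(cipher[:half], cipher[half:] + " "), ())).rstrip()
    inv_dna = {v: k for k, v in DNA.items()}
    morse = "".join(inv_dna[b] for b in strand)
    inv_morse = {v: k for k, v in MORSE.items()}
    return "".join(inv_morse[code] for code in morse.split(" "))

cipher = encode("SOS")
```

Note that this layering is an encoding, not cryptographically strong encryption; the paper's scheme additionally involves key rotation, which is omitted here.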
Classification of Cyber-bullying using Convolution Neural Network
Koushik Sai Venkataramanan
DOI: 10.17148/IJARCCE.2018.71215
Abstract:
More than 1.96 billion people are bound to have an online social life. However, the current decade poses serious challenges, and the online behaviour of users has been put to question: increasing cases of harassment and bullying, along with cases of fatality, have become a serious issue. Though many old-school models are available to control the mishap, the ability to effectively classify bullying is still feeble. Our goal is to effectively monitor bullying in the virtual space and stop its deadly aftermath using machine learning and language processing. In this paper, we propose a methodology for binary classification of cyberbullying. Our method uses a CNN for text analysis, whereas existing methods use a naive approach that provides a solution with less accuracy. An existing Twitter dataset is used for experimentation, and our framework, compared with other existing procedures, is found to provide better accuracy and classification.
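The shape of a text CNN for binary classification can be sketched as a NumPy forward pass: embed tokens, slide 1-D convolution filters over the sequence, max-pool over time, and apply a sigmoid output. Weights here are random and training is omitted; all sizes are illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, EMB, FILTERS, WIDTH = 50, 8, 4, 3
E = rng.normal(size=(VOCAB, EMB))            # embedding table
W = rng.normal(size=(FILTERS, WIDTH, EMB))   # 1-D conv filters
w_out = rng.normal(size=FILTERS)             # output weights

def predict(token_ids):
    """Forward pass: returns P(cyberbullying) for one tokenized text."""
    x = E[token_ids]                         # (seq_len, EMB)
    n = len(token_ids) - WIDTH + 1
    # ReLU convolution: each filter slides over windows of WIDTH tokens.
    conv = np.array([[np.maximum(0.0, (x[i:i + WIDTH] * W[f]).sum())
                      for i in range(n)] for f in range(FILTERS)])
    pooled = conv.max(axis=1)                # max-pooling over time
    return 1.0 / (1.0 + np.exp(-pooled @ w_out))   # sigmoid output

p = predict([3, 17, 5, 42, 9])
```

Max-pooling over time is what lets the classifier fire on an abusive phrase regardless of where it appears in the tweet, which is the main advantage of the CNN over bag-of-words baselines.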
Big Data refers to the huge volume of data present everywhere: in the human body as human proteins, and also in our environment. Previous generations amassed vast collections of rocks, papers, photographs, punch cards, microfilm, etc., but now it is becoming very difficult for industries to store, retrieve and process big data in real time, and high-performance, reliable systems are required to support real-time work. MapReduce is a programming model for writing applications that can process Big Data in parallel on multiple nodes, providing analytical capabilities for analyzing huge volumes of complex data.
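The MapReduce model just described is easiest to see on the canonical word-count example: a map phase emits (word, 1) pairs, a shuffle groups them by key, and a reduce phase sums each group. A minimal single-process sketch (a real cluster would run each phase on many nodes):

```python
from collections import defaultdict
from functools import reduce

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    return [(w.lower(), 1) for line in lines for w in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by their key."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    """Reduce: sum the grouped values for each key."""
    return {k: reduce(lambda a, b: a + b, vs) for k, vs in groups.items()}

counts = reduce_phase(shuffle(map_phase(["Big Data big data", "map reduce"])))
```

The same map/shuffle/reduce skeleton scales to the analytical jobs mentioned above because each phase is embarrassingly parallel across input splits and keys.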
“Customer Relationship Management” an Approach to Manage Companies Interaction with Customers
Mr. Shinde H.S, Mr. Kushte S.J, Mr. Ghodake S.S
DOI: 10.17148/IJARCCE.2018.71217
Abstract:
In the CRM space, the broadest applications are related to managing the content of customers' messages. The aim is to reroute a specific request automatically to the appropriate service, or to supply immediate responses to the most frequently asked questions. Different technologies are used in CRM to accomplish this objective. Accordingly, we present a review of CRM.
Keywords:
Data Mining, Text Mining, CRM, Competitive Intelligence
Big Data Analytics in the Marketing of Automotives
Vishnu T B, Dr. Hemalatha
DOI: 10.17148/IJARCCE.2018.71218
Abstract:
Big Data has already drawn huge attention from researchers in information sciences and from policy and decision makers in governments and enterprises, yet much potential and many highly useful values remain hidden in the huge volume of data. A new scientific paradigm has been born: Data-Intensive Scientific Discovery (DISD), also known as Big Data problems. Whether in Industry 4.0 or the Internet industry, today's industrial manufacturing enterprises should make full use of information and communication technology to handle the arrival of large smart data, combining products, machinery and human resources, so that the unexpected pace at which products sell can drive manufacturing enterprises toward process innovation and reform. The objective is to take the automobile manufacturing industry as an example, based on big data analysis of car sales, using big data technology. Recent technology innovations, many of which are based on the capture and analysis of big data, are transforming the automotive industry at a pace deemed inconceivable just a short time ago. At the heart of this transformation is the new role of the car itself, and the increasingly sophisticated abilities that "intelligent cars" possess to communicate with the individuals, enterprises, and devices around them. Company leaders in the automotive industry clearly recognize that by embracing the concept of big data, they can access a mass of opportunities for differentiation, growth, and innovation that revolutionize the very core of existing business models. To unlock this potential, the key challenge is to develop and implement a big data strategy tailored to the capture, analysis, and interpretation of the ever-increasing quantities of structured and unstructured data received from drivers, vehicles, and other devices.
Only those companies which incorporate a big data strategy in their transformation agendas will be able to reap the rewards offered by the zettabyte revolution. As the objective suggests, the purpose of this research is to enhance the automotive industry by applying a new strategy for analyzing Big Data. Tools such as Hadoop and the MapReduce algorithm can be applied to achieve this objective.
Fake news is counterfeit information which is mostly untrue or distorted. It has been around for a long time, and with social media and modern reporting at their peak, the detection of fake news has become a popular topic in the research community. There are various difficulties associated with detecting deceptive news. This paper describes and compares various fake news detection systems, techniques and approaches, and gives a comparative analysis of them.
User Interface Based High Payload Double Image Steganography
Priya R Sankpal, Anuradha MR
DOI: 10.17148/IJARCCE.2018.71220
Abstract:
Digital data, since its advent, has become an integral part of everyday life. It can be transmitted quickly and inexpensively through data communication networks without losing quality, stored efficiently at very high quality, and manipulated very easily using computers. Watermarking techniques have been developed to protect digital data from such manipulations; in watermarking, the most challenging task is the embedding capacity in an image. The strength of the watermarking can be adjusted to improve the resolution of the images, making the watermarked image indistinguishable from the original. The proposed method offers an effective and easily implementable way to secure confidential information with a high payload capacity. A User Interface (UI) was also developed, which makes the watermarking process user friendly: it gives the sender the freedom to choose the cover image and the watermark image in which the actual data is stored, and confidential data can also be entered dynamically by the user. The data is then secured by a symmetric-key encryption technique and transmitted over the media. The transformation employed is a 3-level DWT process, and the PSNR and MSE are calculated to verify the robustness of the algorithm.
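The DWT-based embedding step can be illustrated with one level of the Haar wavelet in NumPy: transform the cover image, add a scaled secret into the high-frequency (HH) subband, and invert. This is a one-level sketch under assumed parameters; the paper uses a 3-level DWT plus encryption, both omitted here.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar DWT: returns LL, LH, HL, HH subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2
    d = (img[0::2, :] - img[1::2, :]) / 2
    LL = (a[:, 0::2] + a[:, 1::2]) / 2
    LH = (a[:, 0::2] - a[:, 1::2]) / 2
    HL = (d[:, 0::2] + d[:, 1::2]) / 2
    HH = (d[:, 0::2] - d[:, 1::2]) / 2
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d (the transform is perfectly invertible)."""
    h, w = LL.shape
    a = np.zeros((h, 2 * w)); d = np.zeros((h, 2 * w))
    a[:, 0::2] = LL + LH; a[:, 1::2] = LL - LH
    d[:, 0::2] = HL + HH; d[:, 1::2] = HL - HH
    out = np.zeros((2 * h, 2 * w))
    out[0::2, :] = a + d; out[1::2, :] = a - d
    return out

def embed(cover, secret, alpha=0.05):
    """Hide `secret` (same shape as the HH subband) in the HH coefficients."""
    LL, LH, HL, HH = haar2d(cover)
    return ihaar2d(LL, LH, HL, HH + alpha * secret)

cover = np.arange(64.0).reshape(8, 8)
secret = np.ones((4, 4))
stego = embed(cover, secret)
```

Because the transform is linear and invertible, the receiver can recover `alpha * secret` by differencing the HH subbands of the stego and cover images; the small `alpha` keeps the PSNR of the stego image high.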
A Convolutional Neural Network with K-Nearest Neighbor for Image Classification
Fatima Ahmed Abubakar, Souley Boukari
DOI: 10.17148/IJARCCE.2018.71201
Abstract:
Image classification forms the basis of computer vision, a trending sub-field of Machine Learning. The Convolutional Neural Network (ConvNet) has recently achieved great success in many computer vision tasks. The common ConvNet architecture uses many layers to recurrently extract suitable image features and feeds them to a softmax function for classification, which often yields low prediction performance. In this paper, we propose the use of K-Nearest Neighbor as the classifier for the ConvNet and also introduce Principal Component Analysis (PCA) for dimensionality reduction. When successfully implemented, the proposed system should be able to classify images accurately.
Keywords:
Image classification, Convolutional Neural Networks, Principal Component Analysis, K-Nearest Neighbor
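The proposed PCA-then-KNN classification stage can be sketched in NumPy. The random "features" below stand in for ConvNet activations (two well-separated clusters); the cluster parameters and number of components are illustrative assumptions.

```python
import numpy as np

def pca_fit(X, k):
    """Top-k principal components of X via SVD; returns (mean, components)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def knn_predict(train_X, train_y, query, k=3):
    """Majority vote among the k nearest training points (Euclidean)."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[counts.argmax()]

# Toy stand-in for ConvNet features: two Gaussian clusters, 10-D.
rng = np.random.default_rng(1)
feats = np.vstack([rng.normal(0, 1, (20, 10)), rng.normal(5, 1, (20, 10))])
labels = np.array([0] * 20 + [1] * 20)

mu, comps = pca_fit(feats, k=3)          # reduce 10-D features to 3-D
proj = (feats - mu) @ comps.T
query = (rng.normal(5, 1, 10) - mu) @ comps.T   # sample from class 1
pred = knn_predict(proj, labels, query)
```

Replacing the softmax head with this PCA + KNN stage means the network is used purely as a feature extractor, which is the substitution the abstract proposes.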
Abstract: The report focuses on computers' roles in digital forensics. As the world moves toward digitalization, every industry is incorporating it into its operations, and working procedures are becoming more efficient and effective as a result; all that is needed is a computer or a system. Forensics software tools are available that must be installed on computers in order to perform the task, giving the advantage of quick operations. The use of the Internet and information communication technology is rapidly increasing in the modern world: almost all valuable and confidential information is stored in computers or computer-based systems, and the majority of people share their personal information on social networks like Facebook. Because of this advancement of information and communication technology, there has been a significant increase in the number of computer-based or online crimes all over the world. Criminals who commit murder, kidnapping, sexual assault, extortion, drug dealing, economic espionage and cyber terrorism, weapon dealing, robbery, gambling, economic crimes, and criminal hacking, such as web defacements and computer file theft, keep files containing incriminating evidence on their computers.
Keywords: Computer forensics, computer crime, digital evidence, Digital Forensic.