VOLUME 4, ISSUE 11, NOVEMBER 2015
Nigerian Vehicle License Plate Recognition System using Artificial Neural Network
Amusan D.G, Arulogun O.T and Falohun A.S
Design and Implementation of a Digital Anti-Aliasing Filter using FPGA for Communication Systems
Dr. Kamal Aboutabikh, Dr. Ibrahim Haidar, Dr. Nader Aboukerdah, Dr. Amer Garib
Study and Analysis for Development of an Efficient OCR for Printed and Handwritten ODIA Documents: A Survey
Anupama Sahu, Sarojananda Mishra
Hiding Virtualization from Attacks
Navaneetha M, Dr. M. Shiva Kumar
A Compact Monopole Antenna for Wireless Personal Area Network
D. Das, R. Sharma, S. Roy and K. Mandal
Data Transmission or Mailing on Security Basis Using AES Algorithm
Akshay Gaikwad, Prashant P Buktare, Chanda Chouhan
Plagiarism Detection Tool for Programs
Yash Sanzgiri, Kevin Garda, Akshay Pujare
A Review on Hybrid Approach for Game Tree Search on GPU and CPU
Dipali V. Patil, Kishor N. Shedge
A Secure Payment Scheme with Low Communication and Processing Overhead For Multihop Wireless Networks
Muhammad Puzhakkalaveettil, Ms. B. Mathumathi
Entropy based Spectrum Sensing in Cognitive Radio Networks
G. Vaidehi, N. Swetha, Panyam Narahari Sastry
Towards Effective Troubleshooting With Data Truncation
Karishma Musale, Gorakshanath Gagare
Watermarking of Encrypted Image using RC5 for DRM System
Dhatri Verma, Yogesh Rathore
Reliable Biometric Data Encryption Using Chaotic Map
Jincey John, Ashji S.Raj
Filtering Unwanted Contents from User’s Wall in Online Social Networks
Miss. Ujwala S.Tambe, Prof. Archana. S. Vaidya
Feature Extraction Approach for Content Based Image Retrieval
Komal Ramteke, Ashwini Vinayak Bhad
Interactive Segmentation for Change Detection using Fuzzy Local Information C-Means Clustering and SWT in Multispectral Remote-Sensing Images
V. Hima Bindu, G. Sreenivasulu
Image Compression by using Morphological Operations and Edge-Based Segmentation Technique
Sanjana Mathew, Shinto Sebastian
Multi-scale Block Compressed Sensing image Reconstruction using Smoothed Projected Landweber
C. Manohar, S. Swarnalatha
A Review on the Security Issues in Cloud Computing Models
Er. Ubeeka Jain, Ritika Trivedi
Android Blood Bank
Prof. Snigdha, Varsha Anabhavane, Pratiksha lokhande, Siddhi Kasar, Pranita More
Emotion Detection from Punjabi Text using Hybrid Support Vector Machine and Maximum Entropy Algorithm
Er. Ubeeka Jain, Amandeep Sandu
Opinion Mining: An Overview
Ananta Arora, Chinmay Patil, Stevina Correia
Blur Parametric Estimation on Natural Images for Linear Motion, Out-Of-Focus and Gaussian Blur for Blind Restoration
Roshini Romeo, Lisha Varghese
Types of SQL Injection attacks
Vineet Nayak, Nupur Kalra, Ankit Gera
Optimizing Query Performance with the Help of Query Optimization Tool
Dr.A.N. Banubakode, Ajay Jaiswal, Swapnil Magar, Jason Vijoy, Harshal Kasle
Two Wheeled Balancing Autonomous Robot
Mr. Stafford Michahial, Mr. Basavanna M, Dr. M. Shivakumar
Survey on right-protected data publishing with provable Distance-based mining
Mrs.P.Menaka, Ms.P.Samundeeswari
Design & Development of Optical Frequency Division Multiplexing System for Digital Broadcasting Standard
Sapna Sadyan, Er. Paras Chawla
Implementation of SDR-based high frequency range OFDM transceiver for Dedicated Short Range Communication
Dr. L. Nirmala Devi
An Efficient Approach to Content Based Image Retrieval
Mukunda D. Waghmare, Kailash Patidar
E-Medical Diagnosis using Semantic Web
Anish Nair, Shantanu Kawlekar, Sharvil Kadam, Neepa Shah
Analysing Success Possibility of a Mobile Application Using Data Mining Technique
Sharvil Kadam, Shantanu Kawlekar, Anish Nair
A Survey on Data Mining and Digital Forensics Techniques for Intrusion Detection and Protection System
M. Jayamagarajothi, P. Murugeswari
Ciphertext-Policy Attribute based Data-Sharing with Enhanced Productivity and Security
Kavita Patil, Vidya Chitre
Triphasic CT Liver Characterization and Color Data Fusion
Silvana G. Dellepiane, Mahdieh Khalilinezhad, Roberta Ferretti
Prevention against Hacking using Trusted Graphs
Yash Sanzgiri, Kevin Garda, Arush Vichare
A Survey on fuzzy expert system for improving microarray data classification accuracy
Deepakkumar.S, Mohankumar.M
A Review on Functional Encryption Schemes and their usage in VANET
Sandhya Kohli, Kanwalvir Singh Dhindsa, Ravinder Khanna
Natural Language Database Interface
Asst. Prof.Rakhee Kundu, Asst. Prof. Poonam Gholap, Asst. Prof. Snehal Mane
Implementation and Analysis of QoS Aware Routing Protocol for the MANET
Surabhi Gupta, Manish Saxena
Compression of Encrypted Image using Wavelet Transform
Ravi Prakash Dewangan, Chandrashekhar Kamargaonkar
A Review of Feature Selection Algorithms to Identify Risk Factors for Liver Disease
Suri Yaddanapudi, Madhanan Balaram
Antenna Simulation Tools and Designing of Log Periodic Dipole Antenna with CST Studio
Vijay Kale, Dnyandev Patil
A Survey in Scheduling For Real-Time Tasks on Virtualized Clouds
Gowthami .R, Boopal .N, S. Gunasekaran
Weighted Archetypal Analysis used for Text Summarization
Miss. Vaishali Siddharam Shakhapure, Prof. A. R. Kulkarni
Survey on Anomaly Detection in Web Usage Mining
Navareena .A, Kathiresan .V, D.Gunasekaran
Development of a Content Based Recommender Using Dynamic Artificial Neural Network
Md Zahidul Islam, Feroza Naznin, Asaduz Zaman
A Case Study on Markov Model for Double Fault Tolerance, Comparing Cloud based Storage System
Swaroop Tewari, Mohana Kumar. S, Dr. S N Jagadeesh
Advanced Computerized Scheme for Detection of Lung Nodules by Incorporating VDE Image
Bhagyashree Nemade
Different Approaches for the Removal of Different Valued Salt and Pepper Noise in Images using Spartan 3
Madhuri Derle, Gorakshanath Gagare
Review on Biometric Authentication Methods
Mr. Mule Sandip S., Mr.H.B.Mali
Review of Soft Computing Techniques: Exploring Scope
Dr. Ranjana Rajnish, Dr. Parul Verma
A survey on Security based data dissemination for VANETs
C.Kiruthika, Ms.N.Gugha Priya
A Novel Approach for Detecting Image Forgery
Nilu Treesa Thomas, Anju Joseph, Shany Jophin
A Survey on Privacy and Security of Data Classification
Brindha.M, Prof. S.V.Hemalatha
Detection of Periodic Limb Movement with the Help of Short Time Frequency Analysis of PSD Applied on EEG Signals
Mohd Maroof Siddiqui, Geetika Srivastava, Syed Hasan Saeed
A New Technique for Face Matching after Plastic Surgery in Forensics
Anju Joseph, Nilu Tressa Thomas, Neethu C. Sekhar
An Efficient Semantic Dynamic Query processing based on user interest for web Database using FCM Algorithm
Prabha .P, Vijayakumar .P
Students’ Academic Failure Prediction Using Data Mining
Lumbini P. Khobragade, Prof. Pravin Mahadik
SLM Transmitter and Receiver for PAPR Reduction of OFDM System
Khyati Keyur Desai
Accurate Object Detection and Semantic Segmentation using Gaussian Mixture Model and CNN
Sakshi Jain, Satish Dehriya, Yogendra Kumar Jain
A New Secure Image Transmission Technique via Mosaic Images Using Genetic Algorithm
Surya .T.S, Deepthy Mathews
Approach of Jordan Elman Neural Network to Diagnose Breast Cancer on Three Different Data Sets
S.Swathi, P. Santhosh Kumar, P.V.G.K. Sarma
Formation of RADAR Images and Their Fusion Using Wavelet Transform
Yogesh V.Chandratre, Rajesh R.Karhe
Indian Share Market Forecasting with ARIMA Model
Swapnil Jadhav, Saurabh Kakade, Kaivalya Utpat, Harshal Deshpande
A Novel Knowledge Expert System for Water Type Analysis
D. Anusha, Ch.V. Sarma
Visual Cryptography with Color Error Diffusion and Digital Watermarking
Prof Akhil Anjikar, Prof. Rahul Bambodkar
Realization of Synchronized Computation and Communication Using Penta MTJ Elements
A. Lakshminarayanan, V. Krishnakumar, N. Jayapal, R. Shankar, K. Shajudeen
Controlling Attacks and Intrusions on Internet Banking using Intrusion Detection System in Banks
Pritika Mehra
K-means Clustering with MapReduce Technique
Yaminee S. Patil, M. B. Vaidya
Service Populating and QoS Aware Mechanism for Cloud Based Environment
Smita Patil, Arpit Solanki
A Novel Fingerprint Compression Method Based On Sparse Representation
Mahesh N. Karanjkar, Trishala K. Balsaraf
Smart Wheel Chair using Neuro – Sky Sensor
Mr. M. Selva Ganapathy, Mrs. N. Nishavithri
Implementation of Area-Efficient and Low Power OFDM Architecture
Rajidi Sahithi, Dr. T. Venkata Ramana
Unauthorized Access Point Detection in Wireless LAN
Mr. Amol S. Papade, Mr. Vikas E. Pansare, Mr. Rohit D. Patil, Prof. S. S. Gore
Problem Analysis of Multiple Object Tracking System: A Critical Review
Md Zahidul Islam, Md Shariful Islam, Md Sohel Rana
VM Selection using Index Approach for Deploying Cloud Computing Application and Approach to Obtain Equal Utilization of Virtual Machine
Garima Dubey, Yogendra Kumar Jain
A Polynomial Time Algorithm to Determine Singly Connectivity in Directed Graph
Ishwar Baidari, Rashmi Gangadhar
Design and Development of a Vehicle Monitoring System Using CAN Protocol
Mohammed Ismail. B, K. Sasidhar, Syed Aquib Ahmed
Detection of False Sub Aggregated Data in Wireless Sensor Networks
B. Gowtham, A.L. Sreenivasulu
Web Mining – Data Mining Concepts, Applications, and Research Directions
Mrs. S. R. Kalaiselvi, S. Maheshwari, V. Shobana
Deduplication Techniques in Storage System
Deepali Choudhari, R. W. Deshpande
A Survey of Context-Aware Framework for Pervasive Environment
K. H. Walse, Dr. R.V. Dharaskar, Dr. V. M. Thakare
A Review on Various Techniques of Secure Signature Verification: SIFT, SURF and G-SURF
Ritika Sachdeva, Ekta Gupta
Comparison of Contemporary Real Time Operating Systems
Mr. Sagar Jape, Mr. Mihir Kulkarni, Prof. Dipti Pawade
A Review On Clustering Of Streaming Data
Madhuri Vilas Gohad, Prashant Yawalkar
A Review On Credit Card Fraud Detection Using BLAST-SSAHA Method
Mr Yogesh M Narekar, Mr Sushil Kumar Chavan
Comparative Analysis Of Microstrip Rectangular Patch Antenna Using Different Height Substrates
Sudarshan Kumar Jain
A Review on Categorization of Text Data Using Side Information
Sandeep Jadhav, Dr. K. V. Metre
Virtual Clustering Based Routing For Power Heterogeneous Manets
M. Vevagananthan, Mr.P.Vijayakumar
Recommendation System for Answering Missing Tuples
Kanchan Pekhale, Dr. K. V. Metre
Review Paper on Security Intelligence with Big Data Analytics
Chiquita Prabhu, Omkar Neogi, Kriti Shrivastava, Neha Katre
A Review on Nearest Neighbour Techniques for Large Data
Deoyani Sonawane, Prof. P. M. Yawalkar
Co and Adjacent Channel Interference Evaluation in GSM and UMTS Cellular Networks
Selma Sbit, Mohamed Bechir Dadi, Belgacem Chibani
Shadow Removal from Images Using the Concept of Chromaticity
Harpreet Kaur, Navdeep Kaur
Fingerprint Compression based on Sparse Representation using Pixel Level Patch Decomposition
J. Saikrishna, Dr. T. Sreenivasulu Reddy
Analysis of EEG Rhythms in Epilepsy Patients Using MPSO Method
M.J.S Joshi, R.S.K Vaibhav, R.V.S Satyanarayana
Comparative Analysis of Techniques for Detecting Copy-Move Image Forgery
Pameli Mukherjee, Saurabh Mitra
Intelligent Techniques with GUI by Challenge Keypad for Secure Password
Krishna S. Gaikwad, Prof. Amruta Amune
Internet of Things (IOT) Standards, Protocols and Security Issues
Ahmed Mohammed Ibrahim Alkuhlani, Dr S.B. Thorat
Ant Colony Optimization Algorithm for Composition of Web Service using Mobile agents based Semantic, WSDL and QOS analysis
Mr. Sunil R Dhore, Prof. Dr M U Kharat
A Review Paper on 2D to 3D Face Recognition Technology
Vinita Bhandiwad
Survey of Efficient Algorithms in Data Mining For High Utility
V. Keerthy, Mrs. B. Buvaneswari
A Multimodal Biometric Identification System Using Finger Knuckle Print and Iris
Sukhdev Singh, Dr. Chander Kant
Novel Approach for Policy Network Extraction from Web Documents
Miss. Archana Loni, Mrs. Aarti Waghmare
Enhancing Text Mining Using Side Information
Prof. P. S. Toke, Rajat Mutha, Omkar Naidu, Juilee Kulkarni
Test Validation of Phase Switched Interferometer Module for Calibration of MST Radar Array
T. Rajendra Prasad, N. Vismayee, T. Venkateswarlu, P. Satyanarayana
Grammatical Error Detection Model for Assamese Sentences
Hirakjyoti Sarma, Debasish Das, Kishore Kashyap
Implementation of Surveillance Monitoring System
Ms. Moharil R. S, Dr. Mrs Patil S. B
Detect Pedestrian Orientation by Integrating Multiclass SVM Utilizing Binary Decision Tree
G. Santoshi, G Gowri Pushpa
Test Validation of MST Radar 3-channel Digital Receiver
T Rajendra Prasad, K Gayathri, T Venkateswarlu, P Satyanarayana
Design and Analysis of Ultra Wide Band Giuseppe Peano Fractal Antenna at Different Height Level of Substrate
Shikha Verma, Sumit Kaushik, Mandeep Singh Saini
Efficient Method For Selecting Cluster Head In TRCA Clustering for MANET
Jitendra Singh Yadav, A P Singh
Abstract
Nigerian Vehicle License Plate Recognition System using Artificial Neural Network
Amusan D.G, Arulogun O.T and Falohun A.S
DOI: 10.17148/IJARCCE.2015.41101
Abstract: This paper presents the development of a Nigerian Vehicle License Plate Recognition (NVLPR) system using an artificial neural network. Vehicle License Plate Recognition (VLPR) is a special form of optical character recognition (OCR) that enables computer systems to read vehicle registration numbers automatically from digital pictures, for purposes such as traffic control, security, access control to restricted areas, tracking of cars, tracing of stolen cars, and identification of dangerous and reckless drivers on the road. The system is divided into three major parts: license plate detection, license plate character segmentation and license plate character recognition. Plate detection faces many challenges, such as complex plate backgrounds, illumination inconsistencies, vehicle motion and distance changes, for which edge detection analysis was explored. In this work, 200 vehicle license plates were captured, some with clear characters and others blurred or stained. The character feature extraction and plate recognition accuracies were determined. Results showed that plates without blur or stain were extracted and recognized most accurately, while satisfactory results were also obtained for the others.
Keywords: Software, License plate detection and recognition, Optical Character Recognition (OCR), Nigerian vehicle license plate, artificial neural network.
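The character-segmentation stage described in the abstract can be illustrated with a common projection-profile technique. This is a minimal sketch, not the authors' implementation: the toy plate image and the zero-column threshold are assumptions for the example.

```python
import numpy as np

def segment_characters(plate_bin):
    """Split a binarized plate image (characters = 1, background = 0)
    into per-character column ranges using a vertical projection profile."""
    profile = plate_bin.sum(axis=0)          # ink count per column
    in_char, start, spans = False, 0, []
    for col, count in enumerate(profile):
        if count > 0 and not in_char:        # a character run begins
            in_char, start = True, col
        elif count == 0 and in_char:         # a character run ends
            in_char = False
            spans.append((start, col))
    if in_char:                              # character touches right edge
        spans.append((start, len(profile)))
    return spans

# Toy 5x12 "plate": two ink blobs separated by blank columns (invented data)
plate = np.zeros((5, 12), dtype=int)
plate[:, 1:4] = 1
plate[:, 6:10] = 1
print(segment_characters(plate))             # → [(1, 4), (6, 10)]
```

Each returned span can then be cropped and passed to the recognition network as a candidate character.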
Abstract
Design and Implementation of a Digital Anti-Aliasing Filter using FPGA for Communication Systems
Dr. Kamal Aboutabikh, Dr. Ibrahim Haidar, Dr. Nader Aboukerdah, Dr. Amer Garib
DOI: 10.17148/IJARCCE.2015.41102
Abstract: In this paper, we discuss a practical way to synthesize and filter a baseband signal in the presence of different interference signals: white noise placed within the signal band and a sinusoidal interferer placed outside it. This is done with a single digital FIR low-pass filter (LPF) with a Hamming window, implemented on a digital programmable device (Cyclone II EP2C70F896C6 FPGA from ALTERA) on the DE2-70 education and development board. The filter is designed in VHDL within the Quartus II 9.1 design environment. The proposed method designs a set of DDFS units to synthesize the baseband signal and the various interference signals, so that the specifications of the synthesized baseband signal become very close to the real one, and designs a FIR LPF that removes all the interference signals and overcomes aliasing. The FIR LPF is evaluated with a digital oscilloscope on the input and output signals, for different sinusoidal interferers and white noise, in the time and frequency domains.
Keywords: Anti-AliasingFilter, FIR LPF, DDFS, FPGA.
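A Hamming-window FIR LPF of the kind described can be prototyped in software before committing to VHDL. The sketch below uses SciPy rather than the paper's FPGA flow, and the sampling rate, tap count and cutoff are assumed values, not the paper's parameters:

```python
import numpy as np
from scipy.signal import firwin, freqz

fs = 100e3        # assumed sampling rate (Hz), not the board's actual clock
cutoff = 10e3     # assumed passband edge (Hz)
taps = firwin(numtaps=101, cutoff=cutoff, window="hamming", fs=fs)

# Frequency response: check the passband and an out-of-band interferer
w, h = freqz(taps, worN=8000, fs=fs)
gain_db = 20 * np.log10(np.abs(h) + 1e-12)
passband = gain_db[np.argmin(np.abs(w - 2e3))]    # in-band tone
stopband = gain_db[np.argmin(np.abs(w - 30e3))]   # out-of-band interferer
print(f"2 kHz: {passband:.2f} dB, 30 kHz: {stopband:.1f} dB")
```

A Hamming window gives roughly 53 dB of stopband attenuation, which is why an out-of-band sinusoidal interferer is strongly suppressed while the in-band signal passes essentially unchanged.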
Abstract
Study and Analysis for Development of an Efficient OCR for Printed and Handwritten ODIA Documents: A Survey
Anupama Sahu, Sarojananda Mishra
DOI: 10.17148/IJARCCE.2015.41103
Abstract: OCR (optical character recognition) is the process of translating handwritten or printed text into a format that a machine can understand, for the purpose of editing, searching and indexing. Preprocessing, segmentation, feature extraction, classification and post-processing are the main phases of any OCR system, and all of these fields are in use today. Among these tasks, segmentation plays a crucial role in the overall performance of the OCR system; it can be further divided into line, word and character segmentation. In this paper, we discuss different segmentation methods used in various domains, some for handwritten documents and some for printed ones. The major focus of this survey is to identify approaches that can segment compound and fused character symbols. After analysing the existing segmentation methods, we identify those best suited to compound and fused character symbols as candidates for further research. Segmentation remains a frontier area of research in image processing and pattern recognition, and there is a large demand for OCR of Odia handwritten documents. The objective of this paper is to survey the different existing segmentation methods developed during the last decade and to suggest future directions in this research area.
Keywords: OCR, Text line Segmentation, Word Segmentation, Character Segmentation, Odia Handwritten and Printed documents.
Abstract
Hiding Virtualization from Attacks
Navaneetha M, Dr. M. Shiva Kumar
DOI: 10.17148/IJARCCE.2015.41104
Abstract: This paper presents a new security assessment methodology for hiding virtualization from attacks and several areas of its application. Virtual machine environments (VMEs) let a user or administrator run one or more guest operating systems on top of a host operating system; for example, three or four instances of the Microsoft Windows operating system could run as guest systems on a Linux host operating system on a single PC or server. Such environments are widely used as clients or servers in a variety of commercial, government, and military organizations. Beyond normal business operations, security researchers and honeypot technologies often leverage VMEs to analyse malicious code discovered in the wild to determine its functionality, business model, origin, and author. Because VMEs offer useful monitoring and isolation capabilities, malware researchers are increasingly reliant on these products to conduct their work.
Abstract
A Compact Monopole Antenna for Wireless Personal Area Network
D. Das, R. Sharma, S. Roy and K. Mandal
DOI: 10.17148/IJARCCE.2015.41105
Abstract: A compact planar monopole antenna suitable for wireless personal area network (WPAN) applications is proposed. By simply modifying the periphery of a rectangular patch, an effective operational band of 2.95 to 16.8 GHz is achieved. This band fully covers the 3.1 to 10.6 GHz operating band of the new wireless personal area network (WPAN) using the ultra wide band (UWB) technique. A prototype of the proposed antenna is fabricated and measured to verify the simulated performance. The antenna provides a fractional bandwidth of 138% and a peak gain of 2.4 dBi. The proposed antenna is compact and small (20 X 30 X 1.6 mm3) with a 50 Ω feed line, and its very simple geometry suits low-cost fabrication using printed circuit board technology.
Keywords: Monopole antenna, wireless personal area network, ultra-wideband, rectangular patch.
Abstract
Data Transmission or Mailing on Security Basis Using AES Algorithm
Akshay Gaikwad, Prashant P Buktare, Chanda Chouhan
DOI: 10.17148/IJARCCE.2015.41106
Abstract: This paper highlights the security needed for transmission of data on a wireless network. As technology moves forward, wireless communication has grown enormously, and it is necessary to secure wireless data transfer because it is more vulnerable to attack. About 70% of online data transfer consists of images, so it is important to secure images as well as text. Images have a large data size and real-time constraints, so the same cipher cannot be applied unchanged to encrypt both images and text; with some manipulation, however, AES can be used for this purpose. Here we modify AES by adding a key stream generator (CEK) for enhanced security.
Keywords: AES, Cryptography, Image encryption, Wireless transmission.
Abstract
Plagiarism Detection Tool for Programs
Yash Sanzgiri, Kevin Garda, Akshay Pujare
DOI: 10.17148/IJARCCE.2015.41107
Abstract: Plagiarism is an act of fraud that involves copying someone else's work and presenting it as one's own without giving proper credit. Plagiarism was seen earlier in fields such as literature and science; nowadays it can also be seen in programs, especially in colleges where programming assignments are performed. Plagiarism is easy to commit but very difficult to detect without proper tool support. This report presents an overview of a tool to be developed for detecting plagiarism, which would detect plagiarism efficiently and help faculty grade students effectively.
Keywords: Plagiarism, Program similarity, Karp Rabin, Tokenization.
Abstract
A Review on Hybrid Approach for Game Tree Search on GPU and CPU
Dipali V. Patil, Kishor N. Shedge
DOI: 10.17148/IJARCCE.2015.41108
Abstract: Game tree search (GTS) is a classical problem in game theory and artificial intelligence. GTS algorithms are generally used in real-time applications of high complexity, such as video games, chess, Connect6 and Go, to find a player's next best move on the game tree in minimum time. The main focus of the system is to exploit the massive parallelism of GPUs to accelerate game tree algorithms and to propose a general parallel game tree algorithm on GPUs. In game tree search, a GPU surpasses a single CPU only if a high level of parallelism is achieved, because the GPU searches in a BFS manner while the CPU searches in a DFS manner; a GPU-only approach therefore did not produce improvement. The focus here is a combination of the two, with depth-first search on the CPU and breadth-first search on the GPU: the CPU is responsible for generating the players' candidate moves as a tree structure, and these moves are evaluated in parallel on the GPU. The work investigates such hybrid CPU-GPU solutions.
Keywords: SIMD, GPU, GTS, SUDOKU, Parallel Computing.
Abstract
A Secure Payment Scheme with Low Communication and Processing Overhead For Multihop Wireless Networks
Muhammad Puzhakkalaveettil, Ms. B. Mathumathi
DOI: 10.17148/IJARCCE.2015.41109
Abstract: We propose RACE, a report-based payment scheme for multihop wireless networks to stimulate node cooperation, regulate packet transmission, and enforce fairness. The nodes submit lightweight payment reports (instead of receipts) to the accounting center (AC) and temporarily store undeniable security tokens called Evidences. The reports contain the alleged charges and rewards without security proofs, e.g., signatures. The AC can verify the payment by investigating the consistency of the reports, and clear the payment of the fair reports with almost no processing overhead or cryptographic operations. For cheating reports, the Evidences are requested to identify and evict the cheating nodes that submit incorrect reports. Instead of requesting the Evidences from all the nodes participating in the cheating reports, RACE can identify the cheating nodes by requesting only a few Evidences. Moreover, an Evidence aggregation technique is used to reduce the Evidences' storage area. Our analytical and simulation results demonstrate that RACE requires much less communication and processing overhead than the existing receipt-based schemes, with acceptable payment clearance delay and storage area. This is essential for the effective implementation of a payment scheme, because it uses micropayments and the overhead cost should be much less than the payment value. Moreover, RACE can secure the payment and precisely identify the cheating nodes without false accusations.
Keywords: Multihop wireless networks (MWN), Report-based pAyment sChemE (RACE), Mobile ad-hoc network (MANET), Wireless sensor network (WSN), Dynamic Source Routing(DSR), Rivest Shamir Adleman (RSA), Common Language Runtime(CLR), Common Type System(CTS).
Abstract
Entropy based Spectrum Sensing in Cognitive Radio Networks
G. Vaidehi, N. Swetha, Panyam Narahari Sastry
DOI: 10.17148/IJARCCE.2015.41110
Abstract: Spectrum sensing is one of the important tasks of Cognitive Radio Networks (CRN). Though many spectrum sensing techniques are available, sensitivity to noise uncertainty is the basic limitation for these techniques. In this paper, an improved entropy based detection technique in frequency domain is proposed. This work investigates the detection performance using Renyi and Tsallis entropy methods in both single node as well as multi node scenario. Simulations were carried out using QPSK and OFDM signals. The performance is evaluated by considering fading channels like Rician, Rayleigh and Nakagami-m fading. The proposed method could achieve 3 dB improvement compared to the Shannon entropy technique, with and without fading channels. The results have shown that Renyi entropy outperforms Tsallis entropy with significant improvement in SNR wall.
Keywords: Cognitive radio, entropy detection, Shannon entropy, Renyi entropy, fading channels.
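The core idea of entropy-based detection, that a structured primary signal makes the spectrum's magnitude distribution more peaked (lower entropy) than noise alone, can be sketched numerically. This is a minimal illustration, not the paper's detector: the bin count, Renyi order and test signals are assumptions.

```python
import numpy as np

def spectrum_entropies(x, bins=32, alpha=2.0):
    """Shannon and Renyi entropies (bits) of the magnitude-spectrum histogram.
    Low entropy suggests a structured primary signal; high entropy, noise."""
    mag = np.abs(np.fft.fft(x))
    p, _ = np.histogram(mag, bins=bins)
    p = p / p.sum()
    p = p[p > 0]                                   # drop empty bins
    shannon = -np.sum(p * np.log2(p))
    renyi = np.log2(np.sum(p ** alpha)) / (1 - alpha)
    return shannon, renyi

rng = np.random.default_rng(0)
n = 1024
noise = rng.standard_normal(n)                      # channel vacant
tone = np.cos(2 * np.pi * 0.1 * np.arange(n)) + 0.1 * rng.standard_normal(n)
print(spectrum_entropies(noise))                    # higher entropies
print(spectrum_entropies(tone))                     # lower entropies
```

A detector would compare such an entropy value against a threshold; the paper's contribution lies in how Renyi and Tsallis variants behave under fading and noise uncertainty, which this sketch does not model.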
Abstract
Towards Effective Troubleshooting With Data Truncation
Karishma Musale, Gorakshanath Gagare
DOI: 10.17148/IJARCCE.2015.41111
Abstract: Bug triage, or bug assortment, is the process of assigning a new bug to the right developer. Triaging incoming reports manually is error-prone and time-consuming, and software companies spend much of their cost dealing with these bugs. Traditional software analysis is not fully suited to the large-scale, complex data held in software repositories. To reduce the time and cost of bug triaging, we present an automatic approach that predicts a developer with relevant experience for each newly arriving report. The proposed approach applies data reduction to the bug data set, which reduces the scale of the data while increasing its quality, and also matches domain-specific bugs with their solutions by developers. Instance selection and feature selection are used to reduce the bug data, and Top-K pruning algorithms to tackle the domain-specific task.
Keywords: Bug, bug triage, repositories, instance selection.
Abstract
Watermarking of Encrypted Image using RC5 for DRM System
Dhatri Verma, Yogesh Rathore
DOI: 10.17148/IJARCCE.2015.41112
Abstract: Digital image security is an important issue in internet and network applications. Digital images are distributed in encrypted form, and watermarking of these images provides authentication and copyright protection. Media authentication in the encrypted domain enhances security, so it is sometimes necessary to watermark encrypted media for copyright management and authentication. In this paper we use the RC5 block cipher to encrypt the image; among block ciphers it gives the minimum correlation coefficient and maximum throughput. We use the LSB method to watermark the image: the watermark is embedded in the encrypted image and extracted in the decrypted domain.
Keywords: Encryption, watermarking, RC5 encryption algorithm, block cipher, LSB method.
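The LSB embedding step mentioned in the abstract can be sketched in a few lines of NumPy. This is a generic illustration of LSB watermarking, not the paper's exact pipeline (which embeds into an RC5-encrypted image); the cover data and watermark bits are invented:

```python
import numpy as np

def embed_lsb(cover, bits):
    """Replace the least significant bit of the first len(bits) pixels."""
    out = cover.copy().ravel()
    out[:len(bits)] = (out[:len(bits)] & 0xFE) | bits   # clear LSB, set bit
    return out.reshape(cover.shape)

def extract_lsb(marked, n):
    """Read back the first n embedded bits."""
    return marked.ravel()[:n] & 1

cover = np.arange(16, dtype=np.uint8).reshape(4, 4)     # toy 4x4 "image"
mark = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
marked = embed_lsb(cover, mark)
print(extract_lsb(marked, 8))    # → [1 0 1 1 0 0 1 0]
```

Because only the lowest bit of each pixel changes, the marked image differs from the cover by at most 1 gray level per pixel, which is why LSB watermarks are visually imperceptible.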
Abstract
Reliable Biometric Data Encryption Using Chaotic Map
Jincey John, Ashji S.Raj
DOI: 10.17148/IJARCCE.2015.41113
Abstract: Recently, biometric identification has become one of the significant methods of user identification. It has gained popularity because of the non-repudiation of the data, though it suffers from problems such as biometric template storage, biometric spoofing and the resulting security issues. To alleviate these problems, biometric encryption has become the focus of present studies. A user's biometric data includes the face, fingerprint, iris, retinal scan, palmprint, hand vein, etc. Traditional encryption algorithms such as AES and DES are not suitable for biometric image encryption; recent research on image encryption centres on chaos-based encryption. In this paper, we propose a novel, reliable algorithm for enhancing the security of biometric data using a chaotic map. A key intrinsic to the user, derived from the biometric data, is used to generate the chaotic sequence for encryption and to increase reliability.
Keywords: Biometric security; chaos; logistic map; diffusion.
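Chaos-based encryption with a logistic map typically turns a key-dependent seed into a keystream that is XORed with the image bytes. The sketch below is a minimal, generic version, not the paper's scheme: the seed `x0` (which the paper derives from biometric data), the map parameter `r`, and the burn-in length are assumptions.

```python
import numpy as np

def logistic_keystream(x0, r, n, burn=100):
    """Generate n keystream bytes from the logistic map x <- r*x*(1-x)."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = int(x * 256) & 0xFF      # quantize the orbit to a byte
    return out

def chaotic_xor(data, x0=0.613, r=3.99):  # x0, r: assumed illustrative values
    ks = logistic_keystream(x0, r, data.size)
    return data ^ ks

img = np.arange(64, dtype=np.uint8)       # toy "biometric image" bytes
enc = chaotic_xor(img)
dec = chaotic_xor(enc)                    # XOR with the same keystream inverts
print(np.array_equal(dec, img))           # → True
```

Decryption succeeds only when the same `x0` and `r` are regenerated, which is how tying the seed to the user's biometric data binds the ciphertext to that user.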
Abstract
Filtering Unwanted Contents from User’s Wall in Online Social Networks
Miss. Ujwala S.Tambe, Prof. Archana. S. Vaidya
DOI: 10.17148/IJARCCE.2015.41114
Abstract: Nowadays, people communicate with each other by exchanging multimedia information, which may include text, images, audio or video. Online Social Networks (OSNs) provide different services to users. On social networking sites such as Facebook and Twitter, any kind of data can be posted on a user's wall; such data may contain unwanted messages or images, other users can view and comment on the post, and such posts may damage the user's social image. Security of a user's personal wall is therefore an important issue. To address this, we propose a system that lets users control the unwanted messages and images posted on their walls, restrict particular users on the basis of their social reputation, and extract labels from posted images in order to filter them and decide whether to allow the content. The system also flags malicious users based on their behaviour and blocking frequency. Hence, this system avoids the nuisance of unwanted messages and images on users' walls.
Keywords: online social networks, short text classifier, message filtering, labelled image, machine learning.
Abstract
Feature Extraction Approach for Content Based Image Retrieval
Komal Ramteke, Ashwini Vinayak Bhad
DOI: 10.17148/IJARCCE.2015.41115
Abstract: Content Based Image Retrieval (CBIR) is a significant and increasingly popular approach to retrieving image data from a huge collection. Representing images by certain features aids the retrieval process; three important visual features of an image are colour, texture and the histogram, and the retrieval techniques used here are based on dominant colour, texture and histogram features. As a first step, an image is uniformly divided into coarse partitions. GLCM (Gray Level Co-occurrence Matrix) is used for the texture representation. Although a precise definition of texture is elusive, the notion generally refers to the presence of a spatial pattern with some properties of homogeneity. The colour histogram is the most important colour representation in image processing and yields good retrieval accuracy; the histogram counts the number of pixels at each gray level. We then apply Euclidean distance, a neural network, target-search methods and the K-means clustering algorithm to retrieve images from the database, and compare these approaches to see which retrieves images fastest in terms of distance and time.
Keywords: Color feature extraction, Texture feature extraction, Histogram based extraction, image database, Euclidean distance, neural network, Neighbouring Divide-and-Conquer Method and Global Divide-and-Conquer Method, K-means clustering, Threshold=15000.
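The histogram-plus-Euclidean-distance branch of the retrieval pipeline can be sketched compactly. This is a simplified gray-level illustration, not the paper's full colour/texture system; the toy images and bin count are assumptions.

```python
import numpy as np

def gray_histogram(img, bins=16):
    """Normalized gray-level histogram used as a feature vector."""
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

def retrieve(query, database, k=2):
    """Rank database images by Euclidean distance between histogram features."""
    qf = gray_histogram(query)
    dists = [np.linalg.norm(qf - gray_histogram(im)) for im in database]
    return np.argsort(dists)[:k]            # indices of the k best matches

rng = np.random.default_rng(1)
light = rng.integers(192, 256, (32, 32))    # bright toy image
dark = rng.integers(0, 64, (32, 32))        # dark toy image
query = rng.integers(0, 64, (32, 32))       # query resembles the dark image
print(retrieve(query, [light, dark], k=1))  # → [1]
```

The same ranking loop generalizes to the other feature types: swap `gray_histogram` for a GLCM texture vector or a colour-dominance descriptor and the distance comparison is unchanged.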
Abstract
Interactive Segmentation for Change Detection using Fuzzy Local Information C-Means Clustering and SWT in Multispectral Remote-Sensing Images
V. Hima Bindu, G. Sreenivasulu
DOI: 10.17148/IJARCCE.2015.41116
Abstract: Change detection is a process that analyzes images of the same scene taken at different times in order to identify changes that may have occurred between the acquisition dates. In recent decades it has attracted widespread interest due to a large number of applications in diverse disciplines such as remote sensing, medical diagnosis and video surveillance. The proposed method is unique in two aspects: 1) producing difference images by fusing a mean-ratio image and a log-ratio image, and 2) improving the fuzzy local-information c-means (FLICM) clustering algorithm, which is insensitive to noise, to identify the changed areas in the difference image without any distribution assumption. With the development of remote sensing technology, change detection in remote sensing images becomes more and more important. Change detection in synthetic aperture radar (SAR) images is more difficult than in optical ones because SAR images suffer from speckle noise; however, SAR sensors are independent of atmospheric and sunlight conditions, which keeps change detection in SAR images attractive. SAR-image change detection relies mainly on the quality of the difference image and the accuracy of the classification method.
Keywords: Change detection (CD), mean-ratio image, Fuzzy local-information c-means (FLICM) clustering algorithm.
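The first of the two aspects, fusing a mean-ratio and a log-ratio difference image, can be sketched per pixel; the equal fusion weight and the 2x2 test images are illustrative assumptions:

```python
import math

def mean_ratio(x1, x2):
    """Mean-ratio operator: 1 - min/max per pixel (computed here
    directly on pixel values for simplicity)."""
    return [[1.0 - min(a, b) / max(a, b) for a, b in zip(r1, r2)]
            for r1, r2 in zip(x1, x2)]

def log_ratio(x1, x2):
    """Log-ratio operator |log(x2/x1)|, which compresses speckle."""
    return [[abs(math.log(b / a)) for a, b in zip(r1, r2)]
            for r1, r2 in zip(x1, x2)]

def fuse(d1, d2, w=0.5):
    """Pixel-wise weighted fusion of the two difference images
    (the weight w is an illustrative choice)."""
    return [[w * a + (1 - w) * b for a, b in zip(r1, r2)]
            for r1, r2 in zip(d1, d2)]

t1 = [[100.0, 100.0], [50.0, 80.0]]   # acquisition at time 1
t2 = [[100.0, 300.0], [50.0, 80.0]]   # one pixel changed at time 2
diff = fuse(mean_ratio(t1, t2), log_ratio(t1, t2))
# Only the changed pixel receives a large difference value.
```

Clustering (FLICM in the paper) would then label each pixel of `diff` as changed or unchanged.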
Image Compression by using Morphological Operations and Edge-Based Segmentation Technique
Sanjana Mathew, Shinto Sebastian
DOI: 10.17148/IJARCCE.2015.41117
Abstract: Image compression is the application of data compression to digital images. In DCT-based compression standards such as the conventional JPEG algorithm, an image is divided into eight-by-eight blocks and the 2-D Discrete Cosine Transform (DCT) is applied to encode each block. Another DCT-based algorithm, the modified JPEG compression algorithm [1], overcomes the limitations of conventional JPEG by dividing an image into trapezoidal and triangular blocks according to the shape of the objects, achieving a higher compression ratio and making the JPEG algorithm much more flexible. Many compression standards are available nowadays that compress images with a high compression ratio, but here a new compression technique is proposed that is much like the modified JPEG compression with certain modifications: edge-detection algorithms followed by morphological operations are used to segment the original image into an interior part and an exterior part. The proposed method is compared with existing compression standards such as JPEG and found to be better; the simulation results show that it outperforms JPEG on the bpp-versus-PSNR curve.
Keywords: JPEG, SA-DCT, Morphological operations.
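The eight-by-eight block DCT step that conventional JPEG applies can be sketched directly from the DCT-II definition (orthonormal scaling assumed; quantization and entropy coding are omitted):

```python
import math

N = 8  # JPEG block size

def dct_1d(v):
    """Type-II DCT of one length-8 vector, with orthonormal scaling."""
    out = []
    for k in range(N):
        s = sum(v[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                for n in range(N))
        c = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(c * s)
    return out

def dct_2d(block):
    """Separable 2-D DCT: transform every row, then every column."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d([rows[i][j] for i in range(N)]) for j in range(N)]
    return [[cols[j][i] for j in range(N)] for i in range(N)]

# A flat block concentrates all its energy in the DC coefficient,
# which is why smooth regions compress so well.
flat = [[128] * N for _ in range(N)]
coeffs = dct_2d(flat)
```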
Multi-scale Block Compressed Sensing image Reconstruction using Smoothed Projected Landweber
C. Manohar, S. Swarnalatha
DOI: 10.17148/IJARCCE.2015.41118
Abstract: Compressed sensing is a new technique for efficient data acquisition. In this paper, we propose a multi-scale variant of block compressed sensing of images coupled with smoothed projected Landweber reconstruction. In essence, block-based compressed sampling is deployed independently within each subband of each decomposition level of a wavelet transform of an image. The corresponding multi-scale reconstruction interleaves Landweber steps on the individual blocks with a smoothing filter in the spatial domain of the image and thresholding within a sparsity transform. Experimental results show that the proposed multi-scale reconstruction outperforms the original block compressed sensing with smoothed projected Landweber.
Keywords: Compressed sensing, bivariate shrinkage, smoothing filter.
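The reconstruction described above alternates a Landweber (gradient) step with a sparsity-enforcing threshold. A deliberately tiny sketch of that alternation; the matrix, step size and threshold are illustrative assumptions, and the smoothing filter and wavelet transform are omitted:

```python
def landweber_threshold(A, y, iters=100, step=0.5, lam=0.05):
    """Landweber step x += step * A^T (y - A x), followed by a soft
    threshold that shrinks small coefficients toward zero."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual r = y - A x
        r = [y[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        # Gradient (Landweber) step
        for j in range(n):
            x[j] += step * sum(A[i][j] * r[i] for i in range(m))
        # Soft threshold enforces sparsity
        x = [0.0 if abs(v) < lam else v - lam * (1 if v > 0 else -1)
             for v in x]
    return x

# Trivially invertible toy matrix for illustration only;
# real block compressed sensing uses random block projections.
A = [[1.0, 0.0],
     [0.0, 1.0]]
y = [2.0, 0.01]  # one strong coefficient, one noise-level coefficient
x_hat = landweber_threshold(A, y)
```

The strong coefficient is recovered (slightly shrunk by the threshold) while the noise-level one is suppressed to zero.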
A Review on the Security Issues in Cloud Computing Models
Er. Ubeeka Jain, Ritika Trivedi
DOI: 10.17148/IJARCCE.2015.41119
Abstract: As predicted, computing has become one of the fastest growing services in recent years and has acquired a new dimension in the form of cloud computing. It is greatly influenced by the merging of the internet with computing, the sharing of resources, and advances in technology that have paved the way to various cloud offerings. The fundamental building blocks of the cloud computing model are computing, storage, security and programming delivered as a service. But the size of computation and the demand for higher computation are growing very rapidly, causing an uneven and heavy workload on cloud resources. Owing to this rapid increase of data, security must keep pace with the growing demand, and the security of cloud data has become a problem for the future. Here, we discuss the security issues in cloud data, service and deployment models.
Keywords: Cloud Computing, Cloud Security Issues, Cloud Service model, Cloud Deployment model.
Android Blood Bank
Prof. Snigdha, Varsha Anabhavane, Pratiksha Lokhande, Siddhi Kasar, Pranita More
DOI: 10.17148/IJARCCE.2015.41120
Abstract: Blood is a saver of all existing lives in cases of emergency need. The task of a blood bank is to receive blood from various donors, to maintain the database of blood groups, and to send the required blood to hospitals in emergencies. The problem is not an insufficient number of donors, but finding a willing donor at the right time. We want to build a network of people who can help each other during an emergency. This application keeps donor information up to date, while the administrator has access to the whole blood bank management system. Donors are prompted to enter individual details such as name, phone number and blood group. In the urgent case of a blood requirement, users can quickly check for blood banks or hospitals matching a particular or related blood group and reach out to them through the app, which provides a list of blood banks in the area. An Android application attracts a large number of blood donors: since almost everyone carries a mobile phone, it ensures instant location tracking and communication. Only a registered person willing to donate blood is able to access the service. The application uses GPS technology to trace the route to the blood bank, so the user gets directions to the desired location without having to ask, saving time.
Keywords: Blood bank, Android, Blood transfusion, Database, Donors, Acceptors, Administrator, Geographic information System.
Emotion Detection from Punjabi Text using Hybrid Support Vector Machine and Maximum Entropy Algorithm
Er. Ubeeka Jain, Amandeep Sandu
DOI: 10.17148/IJARCCE.2015.41121
Abstract: Emotion detection is the task of detecting human emotion from sources such as images, text, video and audio. We focus on emotion detection from Punjabi text. A vast amount of work has been done for English and, despite the lack of resources for Indian languages, work exists for Telugu, Bengali and Hindi. Punjab is an Indian state and Punjabi, its official language, is under-resourced. In this paper, we propose a hybrid approach for emotion detection in Punjabi text that combines a Support Vector Machine with the Maximum Entropy algorithm. The dataset is drawn from Punjabi websites, newspapers and various Punjabi blogs, and the combined dataset is used for emotion detection. We consider six classes of emotion: joy, sadness, fear, surprise, disgust and anger. The goal of this paper is to classify the dataset into these six emotions.
Keywords: Emotion Detection, Punjabi dataset, Support Vector Machine, Maximum Entropy Algorithm, Emotions.
Opinion Mining: An Overview
Ananta Arora, Chinmay Patil, Stevina Correia
DOI: 10.17148/IJARCCE.2015.41122
Abstract: Nowadays, due to the rapid growth of the internet, people are expressing their views and opinions regarding products, services and policies on the web in large numbers. This huge amount of feedback is very crucial for organizations as well as individuals. The task of analysing these reviews is performed by opinion mining (also known as sentiment analysis). It aims at distinguishing the emotions expressed within the reviews, classifying them into positive or negative opinions, and summarizing them into a form that is easily understood by users. Opinion mining can be used by organizations to help improve their products and services, and by individuals in the process of decision making. This paper presents a review covering the different techniques and approaches used in opinion mining systems, and highlights various application areas and challenges related to opinion mining.
Keywords: Opinion Mining, Sentiment Analysis, Supervised Learning, Unsupervised Learning, Semi-Supervised Learning, Challenges, Application Areas.
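A minimal unsupervised baseline for the positive/negative classification step surveyed above can be sketched with a polarity lexicon; the word lists are illustrative assumptions, and real systems use learned classifiers or full sentiment lexicons:

```python
POSITIVE = {"good", "great", "excellent", "love", "amazing"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "awful"}

def classify(review):
    """Count lexicon hits and label the review by the majority polarity."""
    words = review.lower().split()
    score = (sum(w in POSITIVE for w in words)
             - sum(w in NEGATIVE for w in words))
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

label = classify("The battery life is great and the camera is excellent")
```

Supervised approaches replace the fixed lexicon with weights learned from labelled reviews, but the scoring-and-thresholding shape stays the same.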
Security in the Cloud
Akarsha B.M, Yogesh M J
DOI: 10.17148/IJARCCE.2015.41124
Abstract: Data security remains a top concern for the adoption of cloud-based delivery models, especially in the case of Software as a Service (SaaS). This concern is primarily caused by the lack of transparency in how customer data is managed. Clients depend on the security measures implemented by the service providers to keep their information protected. However, not many practical solutions exist to protect data from malicious insiders working for the cloud providers, a factor that represents a high potential for data breaches. This paper presents the High-Performance Anonymization Engine (HPAE), an approach that allows companies to protect their sensitive information from SaaS providers in a public cloud. This approach uses data anonymization to prevent the exposure of sensitive data in its original form, thus reducing the risk of misuse of customer information. This work involved the implementation of a prototype and an experimental validation phase, which assessed the performance of the HPAE in the context of a cloud-based log management service. The results showed that the architecture of the HPAE is a practical solution and can efficiently handle large volumes of data.
Keywords: Cloud Computing, SaaS, Data Confidentiality, Data Anonymization, Performance.
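The anonymization idea, replacing sensitive fields with stable pseudonyms before log lines reach the provider, can be sketched as follows; the salted-hash scheme and the IP-address field are illustrative assumptions, not the HPAE design:

```python
import hashlib
import re

SALT = b"example-salt"  # illustrative; a real deployment keeps the salt secret

def pseudonymize_ip(match):
    """Replace an IP address with a stable salted-hash token, so equal
    addresses still correlate across log lines without being readable."""
    token = hashlib.sha256(SALT + match.group(0).encode()).hexdigest()[:12]
    return "ip-" + token

def anonymize_log(line):
    """Rewrite every IPv4 address in a log line with its pseudonym."""
    return re.sub(r"\b\d{1,3}(?:\.\d{1,3}){3}\b", pseudonymize_ip, line)

raw = "login failed from 192.168.1.7; retry from 192.168.1.7"
clean = anonymize_log(raw)
```

Because the mapping is deterministic, the provider can still aggregate and search the logs, while the original addresses stay hidden.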
Blur Parametric Estimation on Natural Images for Linear Motion, Out-Of-Focus and Gaussian Blur for Blind Restoration
Roshini Romeo, Lisha Varghese
DOI: 10.17148/IJARCCE.2015.41125
Abstract: Image restoration and recognition in blurred and poorly illuminated images is difficult; the recognition and restoration factors are of vital importance in this endeavour. Image blur is difficult to avoid in many situations and can often ruin a photograph. Image deblurring is a process used to make pictures sharp and useful by means of a mathematical model, and blind image restoration removes blur from an image. Here, a modified Radon transform is used to handle three types of blur in the input image. The proposed method estimates the parameters of linear motion, out-of-focus and Gaussian blurs by analyzing the blurred image's spectrum. The modifications of the Radon transform, i.e. the Radon-d and Radon-c transforms, are used to identify the blur parameters, which are estimated by fitting a third-order polynomial function that accounts separately for the image spectrum and the blur frequency.
Keywords: Gaussian blur, linear motion blur, Image restoration, Out-of-focus, Radon transform, Radon-d, Radon-c, Spectrum of blurred images.
Types of SQL Injection attacks
Vineet Nayak, Nupur Kalra, Ankit Gera
DOI: 10.17148/IJARCCE.2015.41126
Abstract: SQL injection is a software vulnerability that occurs when data entered by users is sent to the SQL interpreter as part of an SQL query. Attackers exploit this vulnerability by providing specially crafted input in such a manner that the interpreter cannot distinguish between the intended commands and the attacker's data, and is tricked into executing unintended commands. A SQL injection attack exploits security vulnerabilities at the database layer; by exploiting the flaw, attackers can read, modify or delete sensitive data.
Keywords: Attacker, database, query, injection.
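The tautology-style attack the abstract describes, and the parameterized-query defence, can be demonstrated with Python's built-in sqlite3 module (the table and payload are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    """Vulnerable: user input is spliced directly into the SQL string."""
    return conn.execute(
        "SELECT secret FROM users WHERE name = '%s'" % name).fetchall()

def lookup_safe(name):
    """Safe: the ? placeholder keeps data out of the query grammar."""
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"        # classic tautology injection
leaked = lookup_unsafe(payload)  # the WHERE clause is always true
safe = lookup_safe(payload)      # the payload is just an unmatched name
```

The unsafe query becomes `... WHERE name = 'x' OR '1'='1'` and returns every row; the parameterized version returns nothing.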
Optimizing Query Performance with the Help of Query Optimization Tool
Dr.A.N. Banubakode, Ajay Jaiswal, Swapnil Magar, Jason Vijoy, Harshal Kasle
DOI: 10.17148/IJARCCE.2015.41127
Abstract: In this paper, we describe the procedure of query optimization based on a heuristic approach. Retrieving and storing information are the basic tasks of a relational database, and it is common in the database industry for a great deal of time to be consumed executing inefficient queries. Query optimization means accessing the database in an efficient manner by analyzing and choosing an optimized plan. The primary issue every data system faces is that, even though the DBA knows the incoming queries are inefficient, most of the people who issue these queries are unable to write efficient ones. Consequently, the performance of the whole system degrades because of a sharp fall in throughput, i.e. the number of transactions performed per unit time decreases. Query optimization primarily means selecting, and then sequencing in a specific order, the different clauses so as to formulate an efficient query from the multiple query plans, comparing the plans on the cost of the resources involved and the response time. The objective of query optimization is minimum response time and maximum throughput (i.e., efficient use of resources). A query optimizer accepts the user's query and automatically produces an equivalent but highly optimized one. This saves considerable time and effort, and thereby improves system throughput and overall performance.
Keywords: Query Optimization, Information Retrieval, Data Base Analyst (DBA), Heuristic Approach.
Two Wheeled Balancing Autonomous Robot
Mr. Stafford Michahial, Mr. Basavanna M, Dr. M. Shivakumar
DOI: 10.17148/IJARCCE.2015.41128
Abstract: A two-wheeled balancing robot is based on the inverted-pendulum configuration and relies on a dynamic balancing system for balancing and maneuvering. The controller board is equipped with PWM channels and motion sensors such as accelerometers. The processes developed involve balancing a two-wheeled autonomous robot based on the inverted pendulum model. The robot uses a Proportional-Integral-Derivative (PID) controlled differential steering method for trajectory control. The balancing platform proved to be an excellent test bed for sensor fusion using the Kalman filter, a set of mathematical equations that provides an efficient computational solution of the least-squares method. An indirect Kalman filter configuration combining the Freescale board's accelerometers is implemented to obtain an accurate estimate of the tilt angle and its derivative. An accelerometer measures the acceleration of the component it is mounted on; accelerometers are very important in the sensor world because they can sense a wide range of motion, and they also detect the angle with respect to gravity. Two separate H-bridges drive the left and right motors; an H-bridge is an electronic circuit that enables a voltage to be applied across a load in either direction. The accelerometer and PID readings are used to control the direction of rotation of the DC motors.
Keywords: Accelerometer, Kalman filter, Inverted pendulum, PID Controller, H-bridge, Freedom freescale.
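The PID control loop at the heart of the balancing described above can be sketched as a discrete controller; the gains, time step and crude first-order "plant" below are illustrative assumptions, not the robot's actual dynamics:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Drive a toy tilt angle toward 0 degrees (upright).
pid = PID(kp=2.0, ki=0.1, kd=0.05, dt=0.01)
angle = 10.0  # initial tilt in degrees
for _ in range(500):
    u = pid.update(0.0, angle)
    angle += u * 0.01  # toy plant: control effort changes the tilt
```

On the real robot, `measured` would be the Kalman-filtered tilt estimate and `u` would set the motor PWM duty cycle through the H-bridges.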
Survey on right-protected data publishing with provable Distance-based mining
Mrs. P. Menaka, Ms. P. Samundeeswari
DOI: 10.17148/IJARCCE.2015.41129
Abstract: Data exchange and data publishing are becoming an essential part of business and academic practice, and data owners need to maintain principal rights over the datasets they share. This survey presents a right-protection mechanism that can provide detectable evidence of the legal ownership of a shared dataset without compromising its usability under a wide range of machine learning, mining and search operations. This is accomplished by guaranteeing that order relations between object distances remain unaltered. The survey provides mechanisms for establishing the ownership of a dataset consisting of multiple objects; the algorithms also preserve important properties of the dataset that matter for mining operations, and so guarantee both right protection and utility preservation. The paper considers a right-protection scheme based on watermarking; since watermarking may distort the original distance graph, the proposed watermarking methodology preserves important distance relationships, such as the nearest neighbors (NN) of each object of the original dataset. It proves fundamental lower and upper bounds on the distance between objects; in particular, it establishes a restricted isometric property, i.e., tight bounds on the expansion of the original distances. This analysis is used to design fast algorithms for NN-preserving watermarking that drastically prune the vast search space.
Keywords: Right-protection, Watermarking methodology, k-NN classification, k-NN preservation.
Approaches of Resolving the Ambiguities of Word in Sentences
Roshan Karwa
DOI: 10.17148/IJARCCE.2015.41130
Abstract: A single word can have multiple meanings; for example, one meaning of the word "cold" is cold weather and another is a viral infectious disease. Identifying the correct meaning of an ambiguous word with respect to a particular context is Word Sense Disambiguation (WSD), which is required in every field of Natural Language Processing: in Machine Translation for the lexical choice of words that have dissimilar translations for different senses, in Information Retrieval for resolving ambiguity in queries, and in Information Extraction for discriminating among precise occurrences of concepts. WSD is one of the demanding problems in Natural Language Processing (NLP), the ability of a computer program to process human language such as Hindi, English or French. This document presents an analysis of methods for WSD and proposes one method based on an existing one.
Keywords: Natural language processing, Word Sense Disambiguation, Dictionary, Corpus.
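A classic baseline for this task, a simplified Lesk overlap between the sentence context and each sense's gloss, can be sketched as follows; the two-sense inventory for "cold" mirrors the abstract's example, but the gloss word sets are illustrative assumptions:

```python
GLOSSES = {
    # Illustrative mini sense inventory for the ambiguous word "cold".
    ("cold", "weather"): {"low", "temperature", "winter", "freezing", "weather"},
    ("cold", "illness"): {"viral", "infection", "sneeze", "fever", "disease"},
}

def lesk(word, context):
    """Pick the sense whose gloss overlaps most with the context words."""
    tokens = set(context.lower().split())
    best, best_overlap = None, -1
    for (w, sense), gloss in GLOSSES.items():
        if w != word:
            continue
        overlap = len(tokens & gloss)
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

sense = lesk("cold", "he caught a cold and has a fever from the infection")
```

Dictionary-based methods like this contrast with corpus-based ones, which learn sense indicators from labelled or unlabelled text.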
Design & Development of Optical Frequency Division Multiplexing System for Digital Broadcasting Standard
Sapna Sadyan, Er. Paras Chawla
DOI: 10.17148/IJARCCE.2015.41131
Abstract: This work is concerned with how well OFDM performs in digital broadcasting when transmitted over an Additive White Gaussian Noise (AWGN) channel only. The main objective is to design an OFDM system for a digital broadcasting standard, using the 2k-mode and 8k-mode systems. The main problem in OFDM is the PAPR value, so this work also improves the PAPR under different modulation formats. The DVB-T system for terrestrial broadcasting is probably the most complex DVB delivery system; it has proven a worldwide success and has become the de facto world standard for transmitting digital terrestrial television. Originally, the DVB-T standard was created with fixed and portable reception as the main application areas. OFDM can provide large data rates with sufficient robustness to radio-channel impairments. Different modulation formats such as QAM and QPSK are used, and the performance with these formats is evaluated.
Keywords: OFDM System, Digital Broadcasting, QAM modulation, DVB etc.
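The PAPR metric this work targets is the ratio of peak to average power of the time-domain signal the IDFT produces. A sketch using the worst case, identical symbols on every subcarrier, which add coherently; the 8-carrier QPSK setup is an illustrative assumption:

```python
import cmath
import math

def papr_db(samples):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def ofdm_symbol(symbols):
    """IDFT maps N subcarrier symbols to N time-domain samples."""
    n = len(symbols)
    return [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

# Worst case: the same QPSK symbol on all 8 subcarriers gives
# PAPR = N, i.e. 10*log10(8) ~ 9.03 dB.
qpsk = [(1 + 1j) / abs(1 + 1j)] * 8
time_signal = ofdm_symbol(qpsk)
peak_db = papr_db(time_signal)
```

PAPR-reduction schemes work by avoiding such coherent additions, e.g. by scrambling or selecting among candidate phase mappings.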
Implementation of SDR-based high frequency range OFDM transceiver for Dedicated Short Range Communication
Dr. L. Nirmala Devi
DOI: 10.17148/IJARCCE.2015.41132
No abstract available.
An Efficient Approach to Content Based Image Retrieval
Mukunda D. Waghmare, Kailash Patidar
DOI: 10.17148/IJARCCE.2015.41133
Abstract: Content-based image retrieval (CBIR) is an image retrieval approach that allows the user to extract an image from a large database using a query. Efficient and effective retrieval performance is achieved by choosing the best transform and classification techniques. Currently available transforms such as the Fourier, Cosine and Wavelet transforms suffer at discontinuities such as edges in images. To overcome this problem, the Ripplet Transform (RT) has been implemented along with a neural-network-based classifier, the Multilayer Perceptron (MLP), for effective image retrieval. Classification using the MLP with the Manhattan distance measure showed varying experimental results across image dimensions. The performance of the various transforms is compared to assess the suitability of a particular wavelet function for image retrieval.
Keywords: Content-based image retrieval (CBIR), Ripplet transforms (RT), Multilayered Perceptron (MLP), Edge Histogram Descriptor, Feature Vector, Similarity Check.
Malware Catcher using Crypto Identifier
Thomsy William
DOI: 10.17148/IJARCCE.2015.41134
Abstract: Today the world is heading towards internet-related activities in every zone of life, and the threat of malware is increasing proportionally with usage. Although most threat-detection strategies are highly active, malware builders keep strengthening their shields to evade detection, and cryptography's dark side is being exploited for this purpose: the appearance of the malicious code is scrambled, helping it bypass the anti-virus software employed for detection. Identifying the underlying cryptography is therefore the main goal in stopping such malicious activities. To detect the presence of cryptography, the execution of such programs was monitored using a DBA tool named Valgrind. The results show the memory locations of well-known cryptographic routines; signature-based matching then confirmed the malicious presence.
Keywords: AES; Cryptography; Malware; Malware Signature; RSA; SHA-1; Valgrind.
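The signature-based matching step used to confirm a malicious presence can be sketched as hash matching over a sliding window of the binary; the blacklist, window scheme and byte pattern are illustrative assumptions, not Valgrind's mechanism:

```python
import hashlib

PATTERN_LEN = len(b"evil-crypto-routine")

# Illustrative blacklist: SHA-1 digests of known-malicious byte patterns.
KNOWN_BAD = {
    hashlib.sha1(b"evil-crypto-routine").hexdigest(),
}

def scan(blob, signatures=KNOWN_BAD):
    """Slide over the binary and flag any window whose hash is blacklisted."""
    for i in range(len(blob) - PATTERN_LEN + 1):
        if hashlib.sha1(blob[i:i + PATTERN_LEN]).hexdigest() in signatures:
            return True
    return False

clean = scan(b"harmless data only")
infected = scan(b"header evil-crypto-routine footer")
```

Real scanners use many signature lengths and faster multi-pattern matching, but the hash-lookup principle is the same.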
E-Medical Diagnosis using Semantic Web
Anish Nair, Shantanu Kawlekar, Sharvil Kadam, Neepa Shah
DOI: 10.17148/IJARCCE.2015.41135
Abstract: Generally, for the diagnosis of a patient, we reach out to the nearest doctor's clinic or hospital; in emergencies, we tend to panic and lose precious time because of factors such as traffic or calling problems. In this paper, we propose a system by which a layman can assist in the medical diagnosis of a patient and thereby buy some time. The system uses the Semantic Web: for data integration, whereby data from various hospitals and in various formats can be integrated into one; for resource discovery and classification, to provide better symptom-specific search-engine capabilities; and for cataloguing, to describe the available content and the relationships between content. We aim to bring the data from different sources into a single format using the Resource Description Framework (RDF) via common ontologies, make the available data interoperable, and use mining techniques to obtain significant results. According to the proposed architecture, SPARQL (SPARQL Protocol and RDF Query Language) will be used for querying within the main application server.
Keywords: Semantic Web, RDF, Ontology, OWL, SPARQL, E-Medical Diagnosis.
Analysing Success Possibility of a Mobile Application Using Data Mining Technique
Sharvil Kadam, Shantanu Kawlekar, Anish Nair
DOI: 10.17148/IJARCCE.2015.41136
Abstract: In this world of smartphones, everything is being ported to the mobile in the form of mobile applications. A huge number of developers and entrepreneurs are starting their ventures directly via a mobile application, and many existing businesses and desktop applications are shifting to the mobile world. This rapid rise in the use of smartphone applications leaves developers in doubt as to whether their app will be a success on the Play Store. In this paper we propose the use of data mining techniques to analyse whether a newly hosted application will be a success. The results from the techniques cited here can be used by venture capitalists and potential investors to make an informed decision about funding the application. The paper uses analogy for this purpose: the attributes of many successful applications hosted on the Play Store are analysed, the most important attributes are identified using data mining, and these attributes are then compared with those of new apps to estimate the success possibility.
Keywords: Data Mining, Ideal Attributes, Raw Data, Application, Classification, Algorithms.
A Survey on Data Mining and Digital Forensics Techniques for Intrusion Detection and Protection System
M. Jayamagarajothi, P. Murugeswari
DOI: 10.17148/IJARCCE.2015.41137
Abstract: With the dramatic increase in internet applications, security is becoming a major issue for networks, and intrusive attacks are increasing day by day. An Intrusion Detection System (IDS) is used to ascertain intrusions and preserve the security goals of information against attacks. Data mining techniques such as classification and clustering are used to monitor and analyze large amounts of network data, to separate anomalous from normal data, and to identify intrusive attacks. An effective IDS requires high accuracy, a high detection rate and a low false-alarm rate. This paper presents a survey of the different data mining and digital forensics techniques for an Intrusion Detection and Protection System (IDPS), enabling effective detection of both malicious and normal activities in the network in order to develop a secure information system.
Keywords: Data Mining Techniques, Digital Forensics Techniques, Intrusion Detection System (IDS), Intrusion Detection and Protection System (IDPS), Security.
Ciphertext-Policy Attribute based Data-Sharing with Enhanced Productivity and Security
Kavita Patil, Vidya Chitre
DOI: 10.17148/IJARCCE.2015.41138
Abstract: Online data-sharing systems and social networks provide security through cryptographic solutions. Ciphertext-Policy Attribute-Based Encryption (CP-ABE) is well suited to distributed data-sharing systems since the data owner has full control over the access policies and their updates. Despite its various advantages, CP-ABE has a major drawback known as the key-escrow problem: the Key Generation Center (KGC) could decrypt any message addressed to specific users by generating their private keys. This is not appropriate for data-sharing scenarios in which data owners want their private data accessible only to selected users. The proposed system fixes the key-escrow problem using a modified escrow-free key-generation protocol, which ensures that neither the KGC nor the data-storing center can generate secret keys individually; instead, each generates part of the secret key, and the parts are integrated by the user through a two-party computation (2PC) protocol, completely eliminating key escrow. Fine-grained data access control further allows access policies to be defined over different attributes of the users or the data object, and data-integrity checking is added to enhance security for the private keys issued by the data owner or the users. As a result, the proposed system achieves better security and performance.
Keywords: Attribute Based Encryption (ABE), Cipher text Policy Attribute Based Encryption (CP-ABE), Data Integrity, Key Generation Center (KGC), Third Party Auditor (TPA).
Triphasic CT Liver Characterization and Color Data Fusion
Silvana G. Dellepiane, Mahdieh Khalilinezhad, Roberta Ferretti
DOI: 10.17148/IJARCCE.2015.41139
Abstract: The aim of the present work is the analysis and mining of the informative content of pathological liver tissues acquired by triphasic CT, with the proposal of a data-fusion approach able to visualize and bring out this content in the best way as a support to medical diagnosis. Since the huge number of CT volumes to be analyzed in a limited time is the major cause of sensitivity loss during the diagnosis process, the method proposed here offers a better chance of detecting and localizing the pathology. It can be a valid support to current medical practice, even in cases where the pathology is at a very early stage and has a large probability of being missed by visual inspection. As expected when analyzing the three phase volumes, the injection of a contrast agent causes significant changes in the radiological findings for both pathological and healthy parts of the liver. Thanks to a specific statistical analysis performed on a training dataset of real cases, the study focused on the characterization of hepatocellular carcinoma (HCC) tumor tissues and liver tissues. In order to detect and discriminate tumor from liver parenchyma, we propose using both steady-state and dynamic features. Some common patterns have been observed, suggesting rules that have been confirmed by radiology specialists. Based on these rules and on the discriminant power of the best characterizing features, a new color data fusion approach is proposed and discussed that improves mass visibility while increasing contrast with respect to the surrounding parenchyma.
Keywords: Triphasic CT, tissue characterization, feature analysis, data fusion, color distance.
Prevention against Hacking using Trusted Graphs
Yash Sanzgiri, Kevin Garda, Arush Vichare
DOI: 10.17148/IJARCCE.2015.41140
Abstract: Nowadays, accessing information and exchanging data in business and industry is increasing, but so is the security risk. The state of security on the internet is bad and becoming worse. The explosive growth of the internet has brought many good things, but there is also a dark side: the criminal hacker. The initial design of common communication protocols indicates that the technology was meant to satisfy requirements such as speed, performance, efficiency and reliability; security was not a concern at that stage. Hacking is the practice of modifying the features of a system in order to accomplish a goal outside of the creator's original purpose. A number of solutions against hacking exist, but they fail to address these issues. This project secures the entire infrastructure against hacking by thwarting the three pre-hacking steps: it generates a trusted graph and creates confusion for the hacker, who cannot understand the current communication infrastructure and so finds it difficult to break into the system.
Keywords: Hacking, Trusted Graph, honey pot, Dijkstra's algorithm.
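The abstract's keywords point to Dijkstra's algorithm as the path-computation step over the trusted graph. As a point of reference, here is a minimal Dijkstra implementation over a toy weighted graph; the node names and weights are illustrative, not taken from the paper:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path costs from source over a weighted graph
    given as {node: {neighbour: cost}}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, already improved
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Toy "trusted graph": weights could encode a trust cost between nodes.
trusted = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4}, "C": {"D": 1}, "D": {}}
```

Here `dijkstra(trusted, "A")` gives the cheapest trusted route to every reachable node.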
A Survey on fuzzy expert system for improving microarray data classification accuracy
Deepakkumar.S, Mohankumar.M
DOI: 10.17148/IJARCCE.2015.41141
Abstract: Building an accurate fuzzy expert system improves the classification of microarray data and reduces complexity. Current practice in the classification of microarray data has two main limitations: (1) the dependability of the training data sets used to build classifiers, and (2) samples to be classified that do not fit into any of the existing classes. Medical thermography is useful in a variety of medical applications, including the detection of breast cancer, by identifying local temperature and the elevated metabolic activity of cancer cells. Unlike conventional expert systems, which are mainly symbolic reasoning engines, fuzzy expert systems are oriented toward numerical processing. To address the interpretability-accuracy trade-off, the system proposes a hybrid Ant Bee Algorithm (ABA), which is evaluated using six gene expression data sets.
Keywords: Microarray data, fuzzy expert system, ant colony optimization, artificial bee colony, mutual information.
A Review on Functional Encryption Schemes and their usage in VANET
Sandhya Kohli, Kanwalvir Singh Dhindsa, Ravinder Khanna
DOI: 10.17148/IJARCCE.2015.41142
Abstract: A vehicular ad-hoc network (VANET) is an important component of the intelligent transport system (ITS) and provides an eminent way to communicate with other nodes while driving. For vehicular communications (VC), a secure method must be employed for message and data dissemination. Various encryption and decryption schemes have been devised so far for message communication in VANETs. Earlier, symmetric and asymmetric encryption techniques were employed in VANETs for secure communication, but they have many inherent shortcomings, so a new standard known as functional encryption has been used in VANETs. In this paper, a comparison of various encryption schemes, i.e. symmetric/asymmetric and various functional encryption schemes, is made to reveal the utility of functional encryption in VANETs. Although functional encryption has many subgroups, in this paper two major subgroups, predicate encryption (PE) and attribute-based encryption (ABE), are compared to reveal the benefits of each scheme.
Keywords: VANET, Encryption, Security, Vehicular Communication.
Natural Language Database Interface
Asst. Prof. Rakhee Kundu, Asst. Prof. Poonam Gholap, Asst. Prof. Snehal Mane
DOI: 10.17148/IJARCCE.2015.41143
Abstract: Databases have become ubiquitous. Almost all IT applications store and retrieve information from databases. Retrieving information from a database requires knowledge of technical languages such as Structured Query Language (SQL). However, the majority of users who interact with databases do not have a technical background and are intimidated by the idea of using languages such as SQL. This has led to the development of a few Natural Language Database Interfaces (NLDBI). An NLDBI allows the user to query the database in a natural language (NL). This work highlights the architecture of a new NLDBI system, which includes designing a grammar that converts an NL statement into a machine-understandable language, such as a query fired on a database, constructing parse trees, and analyzing them. In most typical NLDBI systems, the NL statement is converted into an internal representation based on the syntactic and semantic knowledge of the NL; this representation is then converted into queries using a representation converter. Before an NL query is translated to an equivalent query in a technical language like SQL, it has to go through various steps. This paper highlights those steps: speech tagging of each word of the query, parsing the tagged sentence with a grammar to generate a parse tree, applying semantic analysis to that parse tree, and finally having the SQL translator process the parse tree to obtain the SQL query.
Keywords: Natural Language, Database, SQL Query, Speech tagging, Parse tree.
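The tagging-then-translation pipeline described above can be sketched end to end in miniature. This is only an illustrative toy, not the paper's system: the lexicon, the `students`/`marks` schema, and the operator map below are all invented for the example, and a hand-written word list stands in for a real part-of-speech tagger and grammar:

```python
import re

# Tiny hand-written lexicon standing in for a POS tagger (illustrative only).
LEXICON = {"show": "VERB", "all": "DET", "students": "NOUN",
           "with": "PREP", "marks": "NOUN", "above": "OP", "50": "NUM"}

TABLE_MAP = {"students": "students"}   # noun -> table  (assumed toy schema)
COLUMN_MAP = {"marks": "marks"}        # noun -> column (assumed toy schema)
OP_MAP = {"above": ">", "below": "<"}

def nl_to_sql(sentence):
    """Tag each word, then map the tagged pattern onto an SQL skeleton."""
    tokens = re.findall(r"\w+", sentence.lower())
    tagged = [(t, LEXICON.get(t, "UNK")) for t in tokens]
    table = next(TABLE_MAP[t] for t, _ in tagged if t in TABLE_MAP)
    sql = f"SELECT * FROM {table}"
    for i, (t, tag) in enumerate(tagged):
        if tag == "OP" and i + 1 < len(tagged):
            col = next(COLUMN_MAP[w] for w, _ in tagged[:i] if w in COLUMN_MAP)
            sql += f" WHERE {col} {OP_MAP[t]} {tagged[i + 1][0]}"
    return sql
```

For example, `nl_to_sql("Show all students with marks above 50")` produces a simple `SELECT ... WHERE` query; a real NLDBI would build a full parse tree rather than scan for one pattern.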
Implementation and Analysis of QoS Aware Routing Protocol for the MANET
Surabhi Gupta, Manish Saxena
DOI: 10.17148/IJARCCE.2015.41144
Abstract: Nowadays, the growth in the field of information processing and wireless data transmission for any wireless system is at its peak. For designing a quality-oriented system, an evaluation of two routing protocols under IEEE 802.11 standards has been done. The performance is evaluated in terms of end-to-end delay and throughput. The AODV and MAODV routing protocols are implemented for both standards. It is concluded that MAODV is best suited for designing an enhanced quality-oriented protocol, giving better throughput and a larger coverage area with lower delay.
Keywords: MANET, QoS, AODV and MAODV.
Compression of Encrypted Image using Wavelet Transform
Ravi Prakash Dewangan, Chandrashekhar Kamargaonkar
DOI: 10.17148/IJARCCE.2015.41145
Abstract: In recent years, compression of encrypted images has attracted significant research interest. This paper proposes an approach to compressing encrypted images based on wavelets and random permutation. The original image is first encrypted using a random permutation and then compressed using a wavelet transform. The primary focus of this work is the practical design of a paired image encryption and compression scheme. The process of compressing the encrypted image is nearly the same as compressing an unencrypted image. The receiver performs decryption and enhancement; decryption mirrors the encryption process, and for enhancement a median filter is used. Thus this paper focuses on achieving better security and an improved transmission rate.
Keywords: Wavelet, Image compression, random permutation, Image encryption.
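The random-permutation stage of the scheme can be sketched on a flat pixel list; this is only the encryption/decryption step, with the wavelet compression and median-filter enhancement stages omitted, and the key value is illustrative:

```python
import random

def permute_encrypt(pixels, key):
    """Encrypt a flat pixel list by a key-seeded random permutation."""
    rng = random.Random(key)
    order = list(range(len(pixels)))
    rng.shuffle(order)                      # permutation derived from the key
    return [pixels[i] for i in order], order

def permute_decrypt(cipher, order):
    """Invert the permutation: put each ciphertext pixel back in place."""
    plain = [0] * len(cipher)
    for pos, src in enumerate(order):
        plain[src] = cipher[pos]
    return plain

pixels = [10, 20, 30, 40, 50]
cipher, order = permute_encrypt(pixels, key=1234)
```

In practice the receiver would regenerate `order` from the shared key rather than receive it; note that permutation preserves the pixel histogram, which is what lets the permuted image be compressed almost like a plain one.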
An Advanced Cloud Computing and Research Issues
Dr. Atul Khurana
DOI: 10.17148/IJARCCE.2015.41146
Abstract: Cloud computing is considered one of the emerging arenas of computer science in recent times. It provides excellent facilities to business entrepreneurs through flexible infrastructure. Although cloud computing is facilitating the Information Technology industry, research and development in this arena are not yet satisfactory. Our contribution in this paper is an advanced survey focusing on the cloud computing concept and the most advanced research issues. This paper provides a better understanding of cloud computing and identifies important research issues in this burgeoning area of computer science.
Keywords: Cloud Computing; Virtualization; Data Center; Server Consolidation.
A Review of Feature Selection Algorithms to Identify Risk Factors for Liver Disease
Suri Yaddanapudi, Madhanan Balaram
DOI: 10.17148/IJARCCE.2015.41147
Abstract: Huge volumes of datasets with relatively high numbers of dimensions are being collected by medical practitioners to identify the relevant features that cause a disease, which gives rise to an important technique, called feature selection, as the pre-processing strategy for obtaining knowledge and information from datasets. Feature selection is important when machine learning algorithms are applied to medical datasets, to make the model easy to understand. Feature selection techniques in the medical domain should be model independent and at the same time should yield a small number of features. Filter feature selection is independent of any model and helps in overcoming the curse of dimensionality. In this paper, different types of filter feature selection algorithms are applied to the A.P. Liver dataset and performance is evaluated using sensitivity and specificity analysis.
Keywords: Feature Selection, Liver Diagnosis, Data Mining, A.P. Liver Dataset, Wrapper, Filter.
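A filter method scores each feature independently of any classifier. As a minimal sketch of the idea (not the paper's specific algorithms), the example below ranks feature columns by the absolute Pearson correlation with the class label, one common filter criterion; the tiny dataset is invented:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def rank_features(rows, labels):
    """Filter-style ranking: score each feature column by |correlation|
    with the class label, independent of any learning model."""
    cols = list(zip(*rows))
    scores = [(abs(pearson(col, labels)), idx) for idx, col in enumerate(cols)]
    return [idx for _, idx in sorted(scores, reverse=True)]

# Toy data: feature 1 tracks the label perfectly, feature 0 is uninformative.
rows = [[5, 0], [3, 1], [4, 0], [6, 1]]
labels = [0, 1, 0, 1]
```

`rank_features(rows, labels)` places feature 1 first; a model would then be trained on only the top-ranked columns.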
An Age of Cloud in Mobile Computing (Mobile Cloud Computing)
Anureet Kaur
DOI: 10.17148/IJARCCE.2015.41148
Abstract: The mobile phone market has been expanding swiftly in the past few years, and demand has soared to new heights. In order to support such high demand, cloud computing appears to be the right choice for mobile technology. Mobile cloud computing incorporates the cloud computing concept to overcome many restraints of mobile devices, such as battery life, bandwidth, scalability, platform diversity, and storage, to name a few. The mobile cloud computing market is predicted to rise further in the coming years. The focus of this paper is to provide a baseline for understanding the cloud computing concept. First, an introduction to cloud computing with its application areas and services is given; then, how mobile computing is combined with cloud computing to devise the new concept of mobile cloud computing is described. The advantages, challenges, applications and architecture of mobile cloud computing are discussed in detail.
Keywords: Mobile Applications, Cloud computing, Mobile Cloud Computing, Mobile Computing.
Antenna Simulation Tools and Designing of Log Periodic Dipole Antenna with CST Studio
Vijay Kale, Dnyandev Patil
DOI: 10.17148/IJARCCE.2015.41149
Abstract: The transmitting and receiving antenna is one of the most important elements of a wireless communication system. To design any type of antenna, different complicated procedures are used. A single physical test takes a large amount of time and comes at high cost, and also limits the designer's ability to make changes over time to improve antenna performance. So implementing a whole design in practice is somewhat difficult and consumes much time and cost. Simulators are available and able to provide practical feedback when designing real-world systems. This paper presents different simulation software packages with their features and availability, and also focuses on designing a log-periodic dipole antenna (LPDA) using the CST Studio tool. Computer Aided Design (CAD) tools help antenna designers design an antenna virtually before implementing it in practice. Thus both the time and cost of design and testing are reduced, and practical implementation becomes easier. Simulators play a crucial role in both industry and academia, and can be used as an effective means for teaching or demonstrating antenna concepts to students.
Keywords: Antenna design, Simulators, CAD Tools, Log-periodic dipole antenna, CST studio.
A Survey in Scheduling For Real-Time Tasks on Virtualized Clouds
Gowthami .R, Boopal .N, S. Gunasekaran
DOI: 10.17148/IJARCCE.2015.41150
Abstract: Cloud computing is one of the current emerging trends. Its success is reflected in the rising number of real-time applications it hosts, such as signal processing and weather forecasting. Meanwhile, scheduling of real-time tasks plays a vital role for a cloud provider in retaining its QoS and enriching system performance. A novel agent-based scheduling mechanism for the cloud computing environment is formulated to allocate real-time tasks and dynamically provision resources. On the basis of a bidirectional announcement-bidding mechanism, an agent-based dynamic scheduling algorithm named ANGEL has been proposed for real-time, independent and periodic tasks in clouds. In this paper, the scheduling process is discussed and past work is surveyed.
Keywords: Real Time Task Scheduling, Novel Agent-Based Scheduling.
Weighted Archetypal Analysis used for Text Summarization
Miss. Vaishali Siddharam Shakhapure, Prof. A. R. Kulkarni
DOI: 10.17148/IJARCCE.2015.41151
Abstract: This paper demonstrates a summarization system that generates a summary for given documents based on sentence similarity measures using weighted archetypal analysis. Most former approaches to multi-document summarization produce the summary by applying only query-focused methods to the given documents, and many rely on matrix factorization, low-rank approximation, or hard/soft clustering to obtain a better document summary. In this paper we propose a different method, Weighted Archetypal Analysis, an efficient approach to extractive multi-document summarization which uses term frequency as the sentence importance measure: the frequency of the terms, i.e. archetype values, in the sentence. The sentences are ranked according to their respective weights (scores), and the top-ranked (highest-weight) sentences are selected for the summary. The summary is generated by using weighted archetypal analysis to compute archetypes, term frequencies and significant sentences for the target documents.
Keywords: Generic document summarization, weighted archetypal analysis, Text summarization, Matrix factorization approach, Term Frequency.
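The rank-by-weight-and-keep-top-k step can be illustrated with the simplest possible weighting, raw term frequency; the full system would derive the weights from archetypal analysis instead, so treat this purely as a sketch of the extraction stage:

```python
import re
from collections import Counter

def summarize(text, k=2):
    """Score each sentence by the summed corpus frequency of its terms
    and keep the top-k sentences, preserving their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    tf = Counter(re.findall(r"\w+", text.lower()))          # term frequencies
    scored = sorted(range(len(sentences)),
                    key=lambda i: -sum(tf[w] for w in
                                       re.findall(r"\w+", sentences[i].lower())))
    keep = sorted(scored[:k])                               # restore document order
    return " ".join(sentences[i] for i in keep)

text = "Cats sleep. Cats eat fish. Dogs bark loudly sometimes."
```

With `k=1` the sentence containing the most frequent terms survives; swapping in archetype-based weights would only change the scoring line.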
Survey on Anomaly Detection in Web Usage Mining
Navareena .A, Kathiresan .V, D.Gunasekaran
DOI: 10.17148/IJARCCE.2015.41152
Abstract: In machine learning, anomaly detection is widely applied in many real-world applications, mainly intrusion or fraud detection, which require an effective and efficient framework to classify differing data instances. In this paper, a study of anomaly detection in web usage mining is carried out, covering the various methods and functions used across the literature. This survey attempts to provide a broad overview of the research on anomaly detection. Existing techniques are also grouped into different categories based on the method each one adopts. For every paper surveyed, the techniques used and the authors' approach are explained.
Keywords: Anomaly detection, outlier detection.
Development of a Content Based Recommender Using Dynamic Artificial Neural Network
Md Zahidul Islam, Feroza Naznin, Asaduz Zaman
DOI: 10.17148/IJARCCE.2015.41153
Abstract: In recent years, we've been witnessing a boost in the advancement of Artificial Intelligence (AI). It is continuously raising the bar for what we think is possible with AI. Although using an Artificial Neural Network (ANN) to achieve AI is a somewhat less explored area, primarily because of the complexity of the training process, AI has recently been moving towards the ANN approach with many sophisticated learning algorithms. In this paper, we develop a content-based recommender that can be used as a personal assistant for categorizing and making recommendations on news articles. Our primary goal was to evaluate the feasibility of such a recommender built on a multi-layer network with a dynamic number of input nodes. We also evaluated the performance of the system under various configurations. We call the system a dynamic artificial neural network (DANN).
Keywords: recommender; artificial intelligence; artificial neural network; content based recommender; dynamic artificial neural network.
A Case Study on Markov Model for Double Fault Tolerance, Comparing Cloud based Storage System
Swaroop Tewari, Mohana Kumar. S, Dr. S N Jagadeesh
DOI: 10.17148/IJARCCE.2015.41154
Abstract: Cloud storage has gained massive popularity in the IT industry and has proven to be cost effective and reliable. Research has shown that striping data across multiple cloud vendors is a remedy for providing fault tolerance. If a cloud suffers a permanent failure, the data stored in it is lost; in this scenario, the lost data can be repaired or recovered using the other surviving clouds. Such a multiple-cloud storage system is called a network-coding-based cloud storage system. NC Cloud is a proxy-based system which provides fault tolerance and cost-effective repairs for systems that suffer a permanent single-cloud failure. NC Cloud is built on top of Functional Minimum Storage Regenerating (FMSR) codes, which provide the same fault tolerance as traditional erasure codes like RAID 5 and RAID 6 but use less data repair traffic. Therefore, FMSR codes provide significant cost savings in repair over RAID 6 codes while having similar performance during upload and download of data. The concept of network-coding-based cloud storage is gaining popularity due to its desirable properties of data recovery, cost effectiveness and fault tolerance.
Keywords: FMSR codes; NC Cloud; Fault tolerance; data recovery; regenerating codes; Traditional erasure codes; Mean-time-to-data-loss (MTTDL); MDS property.
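The MTTDL of a double-fault-tolerant system (the RAID-6/FMSR setting in the title) is commonly computed from a birth-death Markov chain over the number of failed devices. A minimal sketch under the usual assumptions (exponential per-device failure rate `lam`, one repair at a time at rate `mu`, data loss on the third concurrent failure); the rates used in any call are illustrative:

```python
def mttdl_two_fault(n, lam, mu):
    """Mean time to data loss for n devices tolerating two failures.
    States k = 0,1,2 failed devices; state 3 absorbs (data loss).
    Expected absorption times E[k] satisfy:
      E0 = 1/(n*lam) + E1
      E1 = 1/r1 + (mu/r1)*E0 + ((n-1)*lam/r1)*E2,  r1 = mu + (n-1)*lam
      E2 = 1/r2 + (mu/r2)*E1,                      r2 = mu + (n-2)*lam
    Substituting E0 and E2 into E1 gives a single linear equation."""
    r1 = mu + (n - 1) * lam
    r2 = mu + (n - 2) * lam
    a = 1 / r1 + (mu / r1) * (1 / (n * lam)) + ((n - 1) * lam / r1) * (1 / r2)
    b = (mu / r1) + ((n - 1) * lam / r1) * (mu / r2)
    e1 = a / (1 - b)
    return 1 / (n * lam) + e1  # E0, starting from the all-healthy state
```

A sanity check: with `mu = 0` the result collapses to the harmonic sum 1/(n·λ) + 1/((n-1)·λ) + 1/((n-2)·λ), and any positive repair rate increases the MTTDL.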
Advanced Computerized Scheme for Detection of Lung Nodules by Incorporating VDE Image
Bhagyashree Nemade
DOI: 10.17148/IJARCCE.2015.41155
Abstract: Lung nodules can be detected by introducing a Computer-Aided Diagnosis (CAD) scheme. Most lung nodules that are missed by radiologists as well as by computer-aided detection (CADe) schemes overlap with ribs or clavicles in chest radiographs (CXRs). Computed tomography can detect lung nodules but is costlier; the proposed method instead uses X-rays, which are preferred as a cost-effective, low-radiation-dose and effective diagnostic tool. The computerized detection scheme detects nodule candidates on VDE images by use of lung segmentation and morphological filtering techniques. Lung regions are segmented based on our M-ASM, and nodules at the lung borders are handled using coarse-to-fine segmentation techniques and the watershed segmentation algorithm. Classification and feature analysis of the nodule candidates into nodules or non-nodules is done by a nonlinear Support Vector Machine (SVM) with a Gaussian kernel. Experimental results compare the rib contrast, smoothness and entropy parameters with the conventional method.
Keywords: Chest radiography (CXR), computer-aided diagnosis (CAD), lung nodule detection, virtual dual energy (VDE).
Different Approaches for the Removal of Different Valued Salt and Pepper Noise in Images using Spartan 3
Madhuri Derle, Gorakshanath Gagare
DOI: 10.17148/IJARCCE.2015.41156
Abstract: An efficient algorithm is proposed to remove high-density salt-and-pepper noise from images. Images are corrupted by salt-and-pepper noise when they are transmitted over channels with faulty communication. Salt-and-pepper noise is a form of impulse noise. The main objective is to recover fully noise-free images by removing the impulse noise with minimum signal distortion. A number of existing noise-removal techniques are available with which impulse noise can be removed from images. We work on images corrupted by salt-and-pepper noise, using a Field Programmable Gate Array to remove the noise. The noisy pixels present in corrupted images can take only the maximum or minimum values, i.e. a minimum of 0 and a maximum of 255 for 8-bit grayscale images.
Keywords: Noise, Filtering techniques, FPGA.
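The observation that noisy pixels sit exactly at 0 or 255 leads naturally to a switching median filter: replace only the extreme-valued pixels with the median of their non-noisy neighbours. A software reference model of that idea is sketched below (the paper targets a Spartan-3 FPGA, where this window logic would be implemented in hardware):

```python
def remove_salt_pepper(img):
    """Replace only pixels at the extremes (0 or 255) with the median of
    their non-noisy 3x3 neighbours; clean pixels pass through untouched."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] not in (0, 255):
                continue                    # not impulse noise, keep as-is
            nbrs = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))
                    if img[j][i] not in (0, 255)]
            if nbrs:                        # at least one clean neighbour
                nbrs.sort()
                out[y][x] = nbrs[len(nbrs) // 2]
    return out
```

Because clean pixels are never touched, the filter preserves edges far better than a plain median applied everywhere, which matters at high noise densities.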
Review on Biometric Authentication Methods
Mr. Mule Sandip S., Mr. H. B. Mali
DOI: 10.17148/IJARCCE.2015.41157
Abstract: There have been several studies on the different kinds of biometric authentication systems, which use a person's physiological and behavioral characteristics. Authentication serves as the first step towards addressing security concerns. Biometrics has been widely adopted over existing methods due to its reliable performance for secure identification, and is the preventive solution for the unauthorized access that traditional password- or smartcard-based authentication allows. Biometrics uses fingerprints, eye patterns (iris recognition), hand geometry, facial expression, voice recognition, signature analysis, etc. Using the unique characteristics of a person, various biometric authentication devices have been developed and are in use. Various software/hardware companies are developing such authentication for better security of sensitive, confidential information. This paper presents a brief review of different biometric person identification methods.
Keywords: Authentication, Biometrics, password, smartcards, IRIS.
Review of Soft Computing Techniques: Exploring Scope
Dr. Ranjana Rajnish, Dr. Parul Verma
DOI: 10.17148/IJARCCE.2015.41158
Abstract: Soft computing is the science of reasoning, thinking and deduction. The idea behind soft computing is to model the cognitive behavior of the human mind. Soft computing is the foundation of conceptual intelligence in machines. Unlike hard computing, soft computing is tolerant of imprecision, uncertainty, partial truth and approximation. The role model for soft computing is the human mind. The guiding principle of soft computing is: exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and low solution cost. The main techniques in soft computing are evolutionary computing, artificial neural networks, fuzzy logic and Bayesian statistics. Each technique can be used separately, but a powerful advantage of soft computing is the complementary nature of the techniques: used together, they can produce solutions to problems that are too complex or inherently noisy to tackle with conventional mathematical methods. The applications of soft computing have shown two main advantages: first, it made it possible to solve nonlinear problems for which mathematical models are not available; second, it introduced human knowledge such as cognition, recognition, understanding and learning into the field of computing. This has made it possible to construct intelligent systems such as autonomous self-tuning systems and automatically designed systems. This paper highlights the various techniques of the soft computing paradigm. The aim of this paper is to explore the possibilities of applying soft computing techniques to problems in various domains.
Keywords: Soft Computing, Artificial Neural Network, Fuzzy Logic, Evolutionary Computing, Machine Learning.
A survey on Security based data dissemination for VANETs
C.Kiruthika, Ms.N.Gugha Priya
DOI: 10.17148/IJARCCE.2015.41159
Abstract: Vehicular ad hoc networks (VANETs) are a promising approach to offering safety and other applications to drivers and customers. A lot of work has been done on them, but security in VANETs is still challenging. Data dissemination is the basic process in VANETs, used to improve the quality of driving in terms of time, distance and safety. That is, after an accident or traffic jam is identified by the corresponding sensors built into the vehicles, an alert message should be quickly disseminated to the vehicles moving towards the affected areas. The defining characteristics of VANETs are the high mobility of vehicles, intermittent connectivity and dynamic topology, which make data propagation tricky. In this paper, a survey on trust-based data dissemination for VANETs is presented, giving a detailed idea of how to securely disseminate messages between vehicles within a bounded time.
Keywords: Vehicular ad hoc networks, Data dissemination, Epidemic routing, VADD.
A Novel Approach for Detecting Image Forgery
Nilu Treesa Thomas, Anju Joseph, Shany Jophin
DOI: 10.17148/IJARCCE.2015.41160
Abstract: Identifying the traces of various forensic operations is an important issue in digital image forensics. The method proposes a novel approach for detecting the traces of JPEG compression and image tampering using statistical feature extraction; Discrete Cosine Transform residual features are used for the extraction process. The method covers JPEG compression and provides a quantization-noise-based solution: multiple-cycle JPEG compression is performed for noise analysis, and a quantity called forward quantization noise is defined. We analytically derive that a decompressed image has a lower variance of forward quantization noise. Using the 64 DCT kernels, the quantized feature sets are generated in what is called the undecimated DCT. The proposed method reveals the traces of JPEG compression history and identifies tampered images using a simple yet very effective detection algorithm, and is robust to chroma subsampling and small image sizes. The proposed algorithm can be applied in many practical applications, such as Internet image classification and forgery detection.
Keywords: Image Forgery, JPEG, DCT Features, Forward Quantization Noise, Forgery Detection.
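The key statistical claim, that previously quantized data exhibits much lower forward quantization noise variance than never-compressed data, can be demonstrated on a 1-D toy (plain scalar values rather than the paper's DCT coefficients, and an arbitrary step of 10):

```python
import random

def fq_noise_var(values, q):
    """Variance of the forward quantization noise round(v/q)*q - v."""
    noise = [round(v / q) * q - v for v in values]
    mean = sum(noise) / len(noise)
    return sum((e - mean) ** 2 for e in noise) / len(noise)

random.seed(1)
q = 10
fresh = [random.uniform(0, 255) for _ in range(5000)]   # never-compressed values
requant = [round(v / q) * q for v in fresh]             # values quantized once before
```

For `fresh` data the noise is roughly uniform on [-q/2, q/2] (variance near q²/12), while `requant` values already sit on the quantization lattice, so re-quantizing them produces almost no noise; that gap is what the detector exploits.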
A Survey on Privacy and Security of Data Classification
Brindha.M, Prof. S.V.Hemalatha
DOI: 10.17148/IJARCCE.2015.41161
Abstract: Data mining, the mining of hidden predictive information from huge databases, is a powerful new technology with great potential to help companies focus on the most vital information in their data warehouses. Data mining tools anticipate future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Text classification is the method of assigning text documents to certain categories. Due to the rising trends in the field of the internet and computers, billions of text data are processed at any given time, so there is a need to systematize these data to offer easy storage and access. Many text classification approaches have been developed for efficiently identifying and classifying such data. During retrieval of the classified data, privacy and security are the challenging tasks. In this paper, a survey on privacy and security of classified data, using classification after encryption, is presented. A classifier is used to determine the suitable class for each text document based on the input algorithm used for classification. Encryption is the procedure of encoding messages or information in such a way that only approved parties can read them, which provides high security and privacy.
Keywords: privacy of data, classifier, encryption.
Detection of Periodic Limb Movement with the Help of Short Time Frequency Analysis of PSD Applied on EEG Signals
Mohd Maroof Siddiqui, Geetika Srivastava, Syed Hasan Saeed
DOI: 10.17148/IJARCCE.2015.41162
Abstract: Periodic limb movement disorder (PLMD) is recurring cramping or jerking of the legs during sleep. "Periodic" refers to the fact that the movements are recurring and rhythmic, arising about every 20-40 seconds. The specific disorder discussed below is periodic limb movement (PLM). As the name suggests, PLM is the periodic movement of the lower part of the leg during the sleep hours of a human being, taking place at certain definite periods of time. In this paper we diagnose PLM through EEG signals. In this research article, the quality and waveform of human EEG signals are analyzed. The aim of this study is to draw conclusions in the form of signal spectrum analysis of the changes across the different stages of sleep.
Keywords: PLM, Analysis of EEG Signal, Estimation of PSD.
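A power spectral density estimate over short windows is the core tool named in the keywords. As a minimal sketch, the periodogram below uses a naive DFT on a synthetic tone (real EEG work would use FFT-based Welch averaging, and the 64 Hz sampling rate and 4 Hz tone are illustrative):

```python
import cmath
import math

def periodogram(x):
    """Naive-DFT power spectrum |X[k]|^2 / N over bins 0..N/2 — a
    simple PSD estimate for a short analysis window."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]

fs = 64                                                      # sampling rate, Hz
x = [math.sin(2 * math.pi * 4 * t / fs) for t in range(fs)]  # 1 s of a 4 Hz tone
psd = periodogram(x)
peak_hz = max(range(len(psd)), key=psd.__getitem__) * fs / len(x)
```

Sliding such a window along the recording gives the short-time PSD; shifts of the dominant frequency between sleep stages are exactly what the analysis tracks.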
A New Technique for Face Matching after Plastic Surgery in Forensics
Anju Joseph, Nilu Tressa Thomas, Neethu C. Sekhar
DOI: 10.17148/IJARCCE.2015.41163
Abstract: A new problem in face matching is plastic surgery. The popularity of plastic surgery has increased over the past few years and keeps growing. This work proposes a facial-feature-based system to determine the original image from a post-surgery image. The system generates facial components from three different feature extraction techniques. Uniform LBP serves as the classifier, and dimensionality reduction techniques such as LDA and PCA are also used. The normalized values are compared and a score fusion rule is applied to retrieve the results. On the plastic surgery image portraits, the proposed system yields high recognition accuracy.
Keywords: Face matching, Plastic surgery, ULBP, Forensics.
An Efficient Semantic Dynamic Query processing based on user interest for web Database using FCM Algorithm
Prabha .P, Vijayakumar .P
DOI: 10.17148/IJARCCE.2015.41164
Abstract: In this paper, we aim at discovering the number of diverse user search goals for a query and depicting each goal with some keywords automatically. We propose a semantic ontology method to map feedback sessions to pseudo-documents, which can efficiently reflect user information needs. We then cluster these pseudo-documents to infer user search goals and depict them with keywords. Since the evaluation of clustering is also an important problem, we propose a novel evaluation criterion, the fuzzy score, to evaluate the performance of the restructured web search results. We further propose an efficient approach to improving the inference of user search goals by analyzing search engine query logs automatically, along with a framework to discover dissimilar user search goals for a query by clustering via the proposed automatic feedback process.
Keywords: Clusters, C-Means, Metadata, Classification.
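The FCM clustering named in the title alternates soft-membership updates with centre updates. A minimal 1-D sketch of the standard algorithm (the toy data and quantile initialization are illustrative; the paper clusters pseudo-documents, not scalars):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=40):
    """Fuzzy C-Means on 1-D points: each point gets a soft membership in
    every cluster, controlled by the fuzzifier m; returns sorted centres."""
    s = sorted(data)
    centers = [s[(2 * j + 1) * len(s) // (2 * c)] for j in range(c)]  # quantile init
    for _ in range(iters):
        # Membership update: u_ij = 1 / sum_l (d_ij / d_il)^(2/(m-1)).
        u = []
        for x in data:
            d = [max(abs(x - v), 1e-12) for v in centers]
            u.append([1.0 / sum((d[j] / d[l]) ** (2.0 / (m - 1.0)) for l in range(c))
                      for j in range(c)])
        # Centre update: weighted mean with fuzzified memberships.
        centers = [sum((u[i][j] ** m) * x for i, x in enumerate(data)) /
                   sum(u[i][j] ** m for i in range(len(data)))
                   for j in range(c)]
    return sorted(centers)

data = [1, 2, 3, 10, 11, 12]
centers = fuzzy_c_means(data)
```

Unlike hard k-means, every point keeps a graded membership in both clusters, which is what makes a fuzzy evaluation score like the one proposed above meaningful.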
Students’ Academic Failure Prediction Using Data Mining
Lumbini P. Khobragade, Prof. Pravin Mahadik
DOI: 10.17148/IJARCCE.2015.41165
Abstract: This paper proposes to apply data mining techniques to predict students' failure on real data from school or graduating students. The experiment attempts the detection of students' failure in order to improve their academic performance and prevent them from dropping out. Research has been done on assessing students' failure based on various attributes; in this experiment, the 11 best attributes have been selected. Different approaches have been applied to resolve the problem of high dimensionality, and a classification algorithm is used on engineering students' previous and present education information to generate a model which can be used to detect students' academic failure. The results are compared and presented.
Keywords: Educational Data Mining (EDM), Academic failure, Classification, Prediction, Decision tree, Induction Rule.
SLM Transmitter and Receiver for PAPR Reduction of OFDM System
Khyati Keyur Desai
DOI: 10.17148/IJARCCE.2015.41166
Abstract: Orthogonal frequency division multiplexing (OFDM), also referred to as multi-carrier communication, has become a key technology in current and future wireless communication systems. Due to OFDM's immunity to many channel imperfections, it is the ideal modulation scheme for many applications which transmit signals in hostile environments. However, OFDM introduces major disadvantages such as a high Peak-to-Average Power Ratio (PAPR) and sensitivity to frequency offset. The PAPR of the transmitted signal is large, necessitating power back-off unless PAPR reduction techniques are incorporated to control the resulting nonlinear distortion at the power amplification stage. In this paper, the Selected Mapping (SLM) technique is simulated for OFDM, and it is found that with 16 alternative subcarrier vectors the PAPR is reduced to approximately 4.6 dB. However, this scheme requires the transmission of side information to indicate the masking pattern used, so the reliability of this side information must be considered, which imposes a high signaling overhead. Therefore, an SLM receiver based on maximum-likelihood decoding that operates without side information has been implemented.
Keywords: Orthogonal frequency division multiplexing (OFDM), Peak-to-Average Power Ratio (PAPR), Selected mapping (SLM), Partial transmit sequences (PTS), Power amplifier (PA).
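The SLM transmitter described above can be sketched numerically: rotate the subcarrier symbols by several candidate phase patterns, take the IFFT of each, and transmit the candidate with the lowest PAPR. A minimal sketch with a naive inverse DFT and 16 QPSK subcarriers (the seeds and pattern count are illustrative, and no oversampling is applied):

```python
import cmath
import math
import random

def idft(X):
    """Inverse DFT: frequency-domain symbols -> time-domain OFDM samples."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def papr_db(x):
    """Peak-to-average power ratio of a complex signal, in dB."""
    p = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(p) / (sum(p) / len(p)))

def slm(X, num_candidates=16, seed=0):
    """Try phase-rotated copies of the symbol vector and keep the lowest
    PAPR; the winning pattern index is the side information the receiver
    would normally need (ML decoding can avoid transmitting it)."""
    rng = random.Random(seed)
    best_db, best_u = None, None
    for u in range(num_candidates):
        phases = [1] * len(X) if u == 0 else [rng.choice([1, -1, 1j, -1j]) for _ in X]
        db = papr_db(idft([s * p for s, p in zip(X, phases)]))
        if best_db is None or db < best_db:
            best_db, best_u = db, u
    return best_db, best_u

random.seed(2)
X = [random.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(16)]  # QPSK
```

Since the unrotated vector is always among the candidates, the selected PAPR can never exceed the original one.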
Accurate Object Detection and Semantic Segmentation using Gaussian Mixture Model and CNN
Sakshi Jain, Satish Dehriya, Yogendra Kumar Jain
DOI: 10.17148/IJARCCE.2015.41167
Abstract: Semantic segmentation and object detection are two of the most common tasks in the fields of image processing, pattern recognition and classification. This paper presents a two-stage procedure to perform these tasks. The proposed work uses the Gaussian mixture model for image segmentation and identifies the segments by optimally searching for the possible Gaussian distributions inside the image histogram; the optimal partition search uses a genetic algorithm. For object detection, we apply a Convolutional Neural Network (CNN) to extract the features of each segment and then pass them to a pre-trained Support Vector Machine (SVM) to identify the object the segment belongs to. Finally, the proposed system is developed using Matlab computational software and tested with different types of image datasets. The experimental results demonstrate encouraging performance of the proposed technique for both object detection and semantic segmentation tasks.
Keywords: Semantic segmentation, object detection, Gaussian mixture model, Genetic Algorithm, Support Vector Machine.
A New Secure Image Transmission Technique via Mosaic Images Using Genetic Algorithm
Surya .T.S, Deepthy Mathews
DOI: 10.17148/IJARCCE.2015.41168
Abstract: A new secure image transmission method is proposed that transforms the secret image into a secret-fragment-visible mosaic image of the same size. The mosaic image is created by dividing the secret image and a target image into fragments of equal size and fitting the secret tile blocks into the target blocks. For tile image hiding, a mapping sequence is generated using a Genetic Algorithm (GA), which provides better clarity in the retrieved secret image and also reduces the computational complexity. The quality of the original target image is preserved while embedding the secret image, so better security and robustness are assured. The tile fragments are embedded into the target image according to the GA mapping sequence, and the sequence is permuted again by Key Based Random Permutation (KBRP) with a key. Color transformations are performed to make the mosaic image similar to the target image, after which each tile block is rotated to the optimal angle, the one with minimum root-mean-square error with respect to its corresponding target block. Information needed to recover the secret image is embedded into the created mosaic image, and overflows/underflows in the transformed color values are also handled. Using the same key and the mapping sequence, the secret image can be recovered.
Keywords: Image hiding, Mosaic image, Genetic Algorithm, Key Based Random Permutation, Color Transformation.
Approach of Jordan Elman Neural Network to Diagnose Breast Cancer on Three Different Data Sets
S.Swathi, P. Santhosh Kumar, P.V.G.K. Sarma
DOI: 10.17148/IJARCCE.2015.41169
Abstract: Breast cancer, the second leading cause of cancer deaths worldwide, occurs in one out of eight women, and there is still no known way of preventing this pathology. Early detection of the disease can greatly enhance the chances of long-term survival of breast cancer victims. The Artificial Neural Network, a branch of Artificial Intelligence, has been accepted as a new technology in computer science. Neural networks are currently a 'hot' research area in medicine, particularly in radiology, urology, cardiology and oncology, and have wide application in areas such as education, business, medicine, engineering and manufacturing. They have been widely used for cancer prediction and prognosis. This paper applies the Jordan Elman neural network, a recurrent neural network, to breast cancer diagnosis using three different breast cancer databases: Wisconsin, WDBC and WPBC. The major contribution of this paper is showing that the Jordan Elman neural network diagnoses successfully on all three data sets.
Keywords: recurrent network, benign, malignant, WDBC, WPBC, mammography, FNA, mean square error, correlation, sensitivity, specificity, ROC.
Genetic Algorithm by using MATLAB Program
Mashal Alenazi
DOI: 10.17148/IJARCCE.2015.41170
Abstract: In this paper, an attractive approach for teaching genetic algorithms (GA) is presented. This approach is based primarily on using MATLAB to implement the genetic operators: initialization, crossover, mutation, evaluation and selection. Detailed illustrative examples are presented to demonstrate how to solve the Traveling Salesman Problem (TSP) and how to draw the largest possible circle in a space of stars without enclosing any of them.
Keywords: Genetic, Matlab, Algorithm, Mutation.
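The genetic operators listed in the abstract can be sketched for the TSP. This is a generic Python sketch, not the paper's MATLAB code; the population size, rates, elitism and order-crossover choices are illustrative assumptions:

```python
import random

def tour_length(tour, dist):
    # Total length of a closed tour over a distance matrix.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    # OX crossover: copy a slice from parent 1, fill the rest in parent-2 order.
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_tsp(dist, pop_size=30, generations=200, seed=1):
    rng = random.Random(seed)
    n = len(dist)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]  # initialization
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))          # evaluation
        next_pop = pop[:2]                                    # elitism
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(pop[:10], 2)                  # selection
            child = order_crossover(p1, p2, rng)
            if rng.random() < 0.2:                            # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=lambda t: tour_length(t, dist))
```

On four cities at the corners of a unit square, the GA recovers the perimeter tour of length 4.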
Formation of RADAR Images and Their Fusion Using Wavelet Transform
Yogesh V.Chandratre, Rajesh R.Karhe
DOI: 10.17148/IJARCCE.2015.41171
Abstract: Nowadays image analysis is applicable not only in the medical field but also, most effectively, in fields such as military monitoring, land examination, geostationary position tracking and weather analysis. These applications require images, which are captured using RADAR technology. We focus here on a detailed discussion of RADAR image formation, the different technologies used to obtain RADAR images, and their fusion using the wavelet transform.
Keywords: RADAR, SAR, RAR, DWT, Active radar, Passive radar, Swath, Nadir Point.
Indian Share Market Forecasting with ARIMA Model
Swapnil Jadhav, Saurabh Kakade, Kaivalya Utpat, Harshal Deshpande
DOI: 10.17148/IJARCCE.2015.41172
Abstract: Artificial neural networks (ANNs) are flexible computing frameworks and universal approximators that can be applied to a wide range of time series forecasting problems with a high degree of accuracy, making them convenient for predicting future values in the share market and offering a better outlook for investment. Yet an ANN alone is not fully satisfactory: both theoretical and empirical findings have concluded that combining different models can be an effective way of improving predictive performance, provided the models in the ensemble are sufficiently different. In this paper, a novel hybrid model is proposed that combines an artificial neural network with an auto-regressive integrated moving average (ARIMA) model to produce more accurate forecasts than an ANN alone. In this context, we collected data on monthly closing stock indices of the Sensex and developed an appropriate ARIMA model to forecast the future unknown values of Indian stock market indices. It can therefore be used as an appropriate alternative model for forecasting tasks, especially when higher forecasting accuracy is needed.
Keywords: Sensex, Time Series, ARIMA model, validation.
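A minimal illustration of the ARIMA idea: an ARIMA(1,1,0) fitted by least squares, i.e. difference once, fit an AR(1) to the differences, forecast, and integrate back. The model order and estimation method here are a simplifying assumption, not necessarily those used in the paper:

```python
def fit_ar1(series):
    # Least-squares AR(1) coefficient for y_t = phi * y_{t-1} + e_t
    # (regression through the origin).
    x, y = series[:-1], series[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def arima_110_forecast(prices, steps=1):
    # ARIMA(1,1,0) sketch: first-difference the series, fit AR(1) to the
    # differences, then roll the forecast forward and cumulate it back
    # onto the last observed level.
    diffs = [b - a for a, b in zip(prices, prices[1:])]
    phi = fit_ar1(diffs)
    forecasts, last, d = [], prices[-1], diffs[-1]
    for _ in range(steps):
        d = phi * d          # forecast the next difference
        last = last + d      # integrate back to the price level
        forecasts.append(last)
    return forecasts
```

For a perfectly linear series the fitted phi is 1 and the forecast simply extrapolates the trend, which is a quick sanity check on the implementation.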
A Novel Knowledge Expert System for Water Type Analysis
D. Anusha, Ch.V. Sarma
DOI: 10.17148/IJARCCE.2015.41173
Abstract: The fuzzy expert system is one of the interesting research areas in the field of artificial intelligence. Nowadays environmental pollution is one of the major issues worldwide, and water is one of the essential elements of daily life, so we need to be aware of the water that we are using. Classifying water resources against water quality standards is an important issue. In this paper we propose a fuzzy expert system that helps differentiate the type of water based on the parameters taken. Based on expert opinions and national experience, five water quality parameters, DO, pH value, BOD, sulphate and chlorine, were considered as the significant indicators to assess the type of water. A fuzzy expert system makes it possible to combine the certainty levels for the acceptability of water based on the approved parameters, thanks to the wide variety of rules inserted in the inference system. Here we explain how the Mamdani inference system works to give an appropriate result for water type analysis.
Keywords: knowledge base, fuzzy logic, inference system, fuzzy expert system.
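As a toy illustration of Mamdani-style inference over water parameters: the membership functions, the two rules, and the output peaks below are hypothetical, not the paper's rule base, and the defuzzification is a simplified weighted centroid:

```python
def tri(x, a, b, c):
    # Triangular membership function rising on [a, b], falling on [b, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani_water_quality(ph, do):
    # Hypothetical two-rule Mamdani system:
    #   R1: IF pH is neutral AND DO is high THEN quality is good (peak 80)
    #   R2: IF pH is extreme OR  DO is low  THEN quality is poor (peak 20)
    neutral = tri(ph, 6.0, 7.0, 8.5)
    do_high = tri(do, 4.0, 8.0, 12.0)
    r1 = min(neutral, do_high)                # fuzzy AND -> min
    r2 = max(1.0 - neutral, 1.0 - do_high)    # fuzzy OR  -> max of complements
    if r1 + r2 == 0:
        return 0.0
    # Weighted centroid over the two consequent peaks (simplified defuzzification).
    return (r1 * 80.0 + r2 * 20.0) / (r1 + r2)
```

A full Mamdani system would clip or scale whole output fuzzy sets and take the centroid of their union; the weighted-peak shortcut above keeps the sketch short.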
Visual Cryptography with Color Error Diffusion and Digital Watermarking
Prof Akhil Anjikar, Prof. Rahul Bambodkar
DOI: 10.17148/IJARCCE.2015.41174
Abstract: Visual cryptography is a special encryption technique that encrypts a secret image into n shares, hiding information in images in such a way that it can be decrypted by the human visual system, without any cryptographic knowledge or computation devices. To reveal the secret information, at least a certain number of shares (k) must be superimposed. The objective of visual cryptography with color error diffusion is to apply the VCS to color images and obtain a better-quality decrypted image of the same size as the original. In visual cryptography with a random number generator and digital watermarking, the shares are generated using random numbers and then enveloped in other images using invisible digital watermarking.
Keywords: Visual cryptography, color error diffusion, random number, digital watermarking.
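A minimal (2,2) visual cryptography sketch for a binary image shows the share-and-stack mechanics the abstract relies on (color error diffusion and watermarking are omitted here):

```python
import random

# Complementary subpixel pairs for a (2,2) visual cryptography scheme.
PATTERNS = [(0, 1), (1, 0)]

def make_shares(secret_bits, seed=0):
    # secret_bits: list of 0 (white) / 1 (black) pixels. Each pixel expands
    # into two subpixels per share: identical patterns for white pixels,
    # complementary patterns for black ones.
    rng = random.Random(seed)
    share1, share2 = [], []
    for bit in secret_bits:
        p = rng.choice(PATTERNS)
        share1.extend(p)
        share2.extend(p if bit == 0 else (p[1], p[0]))
    return share1, share2

def stack(s1, s2):
    # Superimposing printed shares behaves like a bitwise OR: a white pixel
    # stacks to one black subpixel out of two, a black pixel to two.
    return [a | b for a, b in zip(s1, s2)]
```

Each share on its own is a uniformly random pattern and reveals nothing; only stacking makes the contrast difference (1 vs 2 black subpixels) visible to the eye.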
Realization of Synchronized Computation and Communication Using Penta MTJ Elements
A. Lakshminarayanan, V. Krishnakumar, N. Jayapal, R. Shankar, K. Shajudeen
DOI: 10.17148/IJARCCE.2015.41175
Abstract: Advanced computing systems embed spintronic devices to boost the leakage performance of standard CMOS systems. High speed, low power and practically infinite endurance are vital properties of the magnetic tunnel junction (MTJ), a spintronic device, that assure its use in memories and logic circuits. This paper presents a Penta-MTJ-based gate that provides simple cascading, self-referencing, a smaller voltage-headroom problem in the precharge sense amplifier, and low area overhead compared with existing MTJ-based gates. The Penta MTJ is employed here because it provides guaranteed disturbance-free reading and increased tolerance to process variations, along with compatibility with the CMOS process. The gate is validated by simulation at the 180-nm technology node in Cadence Virtuoso.
Keywords: Counter, magnetic logic gate, magnetic tunnel junction.
Controlling Attacks and Intrusions on Internet Banking using Intrusion Detection System in Banks
Pritika Mehra
DOI: 10.17148/IJARCCE.2015.41176
Abstract: Internet usage has increased exponentially and has managed to cover all geographical areas across the world. Business-to-business (B2B), business-to-consumer (B2C), Internet-based e-commerce transactions, and Internet banking have come of age and are facing serious threats of attack, penetration or intrusion, despite the best available firewalls. Efficient and modern security tools are required to prevent information loss during Internet banking and e-commerce transactions. One such tool that provides a protective mechanism during Internet-based e-commerce transactions and banking is the Intrusion Detection System. This paper discusses the types of attacks and how Intrusion Detection Systems can detect and prevent those attacks and intrusions during Internet banking and other e-commerce transactions.
Keywords: Intrusion, Intrusion detection system, Phishing, Pharming, MitM, MitB, Trojan horse virus, NIDS, HIDS.
K-means Clustering with MapReduce Technique
Yaminee S. Patil, M. B. Vaidya
DOI: 10.17148/IJARCCE.2015.41177
Abstract: Clustering analysis is one of the most significant tools for studying the distribution of data. The aim of clustering is to find intrinsic structures in data and categorize them into meaningful subgroups for further study and analysis. Clustering makes certain assumptions about the relationships among the data objects it is applied to, and cluster formation is initiated based on a similarity measure; different clustering algorithms, using different notations, can form distinct clusters from the same data set. K-Means clustering is one such technique used to give structure to unstructured data so that valuable information can be extracted. In this paper we study the implementation of the K-Means clustering algorithm over a distributed environment using Apache Hadoop. The main focus of the paper is the design of the Mapper and Reducer routines for implementing K-Means, which is discussed in detail. The steps involved in executing the K-Means algorithm are also described, to serve as a guide for practical implementations.
Keywords: Data mining, clustering analysis, K-means algorithm, Hadoop, MapReduce.
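The Mapper/Reducer split discussed in the abstract can be sketched as follows. This is an in-process simulation of one MapReduce round in Python, not actual Hadoop code; the shuffle phase is stood in for by a dictionary:

```python
from collections import defaultdict

def mapper(point, centroids):
    # Map step: emit (index of the nearest centroid, the point itself).
    dists = [sum((p - c) ** 2 for p, c in zip(point, cen)) for cen in centroids]
    return dists.index(min(dists)), point

def reducer(key, points):
    # Reduce step: the new centroid is the mean of its assigned points.
    n = len(points)
    return key, tuple(sum(coords) / n for coords in zip(*points))

def kmeans_round(points, centroids):
    # One K-Means iteration as a MapReduce round: map every point,
    # shuffle by centroid index, then reduce each group.
    groups = defaultdict(list)
    for pt in points:
        k, v = mapper(pt, centroids)
        groups[k].append(v)
    new = list(centroids)
    for k, pts in groups.items():
        _, new[k] = reducer(k, pts)
    return new
```

On Hadoop, the driver would iterate such rounds until the centroids stop moving, broadcasting the current centroids to the mappers via the distributed cache.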
Service Populating and QoS Aware Mechanism for Cloud Based Environment
Smita Patil, Arpit Solanki
DOI: 10.17148/IJARCCE.2015.41178
Abstract: In today's world, Cloud Computing and Mobile Computing are among the most popular trends. Cloud Computing provides huge processing and computing capability to its users, and Mobile Computing provides all-time connectivity, mobility and software functionality to mobile devices such as smartphones. In future these mobile devices are expected to move around the world and switch from one network to another, so a mechanism is required to provide all-time connectivity with the network. The movement of users will add congestion to the network, degrading Quality of Service and hence Quality of Experience, so a mechanism is needed to manage resources effectively while improving and maintaining Quality of Service. This paper introduces a framework that enables a service running on a local public Cloud to populate onto another public Cloud as required. It also addresses the situation where frequent migration of services would add network congestion. This prevents the network from experiencing huge traffic load and offers automated service and resource management.
Keywords: Quality of Service, Cloud Computing, Service migration, Mobile computing, network congestion.
A Novel Fingerprint Compression Method Based On Sparse Representation
Mahesh N. Karanjkar, Trishala K. Balsaraf
DOI: 10.17148/IJARCCE.2015.41179
Abstract: Recognition of people by means of their biometric features is very popular in society. There are a variety of biometric techniques, including fingerprint recognition, face recognition and eye detection, used for privacy and safety purposes in different applications. In recent years there has been increasing interest in the study of sparse representation of signals: using an overcomplete dictionary that contains prototype signal atoms, signals are described by sparse linear combinations of these atoms. Among biometric recognition technologies, fingerprint recognition is very popular for personal identification, and a fingerprint compression algorithm based on sparse representation is introduced here. In the algorithm, we first construct a dictionary of predefined fingerprint image patches. For a new fingerprint image, we represent its patches according to the dictionary by computing an l^0-minimization with the matching pursuit (MP) method, and then quantize and encode the representation. This paper compares different compression standards such as JPEG, JPEG 2000, WSQ and K-SVD. The experiments demonstrate that the approach is cost-effective compared with many competitive compression techniques, particularly at high compression ratios.
Keywords: Fingerprint Compression, JPEG, JPEG 2000, WSQ, SPIHT, K-SVD, PSNR.
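The l^0-style sparse coding step can be illustrated with a bare-bones matching pursuit over a toy dictionary. Note the paper's dictionary is built from fingerprint patches; the unit-basis dictionary in the test below is only for illustration, and atoms are assumed unit-norm:

```python
def matching_pursuit(signal, dictionary, max_atoms=3, tol=1e-6):
    # Greedy sparse coding: repeatedly pick the (unit-norm) atom most
    # correlated with the residual and subtract its contribution, until
    # the atom budget is spent or the residual correlation is negligible.
    residual = list(signal)
    coeffs = {}
    for _ in range(max_atoms):
        best_k, best_c = None, 0.0
        for k, atom in enumerate(dictionary):
            c = sum(r * a for r, a in zip(residual, atom))
            if abs(c) > abs(best_c):
                best_k, best_c = k, c
        if best_k is None or abs(best_c) < tol:
            break
        coeffs[best_k] = coeffs.get(best_k, 0.0) + best_c
        residual = [r - best_c * a for r, a in zip(residual, dictionary[best_k])]
    return coeffs, residual
```

The compressor would then quantize and entropy-code the few (index, coefficient) pairs per patch instead of the raw pixels, which is where the compression gain comes from.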
Smart Wheel Chair using Neuro – Sky Sensor
Mr. M. Selva Ganapathy, Mrs. N. Nishavithri
DOI: 10.17148/IJARCCE.2015.41180
Abstract: The goal of this project is to measure electrical activity in the brain due to the firing of neurons, parse the waveform to obtain the attention and meditation levels of the brain, and use them to move a wheelchair. The interactions between neurons create an electrical discharge that cannot be measured directly; there are, however, different techniques available to detect electrical activity in the brain. One such technique is Electroencephalography (EEG). EEG measures voltage fluctuations along the scalp that result from the interaction between neurons in the brain. These voltage fluctuations are processed and output to a microcontroller by the EEG sensor, and the data packets obtained from the sensor are stored in the microcontroller. The attention and meditation levels are obtained from the processed data and used to control the direction and motion of the wheelchair.
Keywords: Electro Encephalo Gram, Brain Computer Interface, Canonical Variate Analysis.
Implementation of Area-Efficient and Low Power OFDM Architecture
Rajidi Sahithi, Dr. T. Venkata Ramana
DOI: 10.17148/IJARCCE.2015.41181
Abstract: The Fast Fourier Transform (FFT) algorithm is extensively used in numerous signal processing and communication systems. Due to its rigorous computational requirements, it occupies a large area and consumes high power if implemented in hardware. Using FFT concepts, we develop efficient architectures for the wireless networks that are now commonplace everywhere. The SDC processing engine (PE) is proposed to achieve 100% hardware resource utilization by sharing the common arithmetic resources, including both adders and multipliers, in a time-multiplexed approach.
Keywords: FFT, Pipelined Architecture, SDF-SDC.
Unauthorized Access Point Detection in Wireless LAN
Mr. Amol S. Papade, Mr. Vikas E. Pansare, Mr. Rohit D. Patil, Prof. S. S. Gore
DOI: 10.17148/IJARCCE.2015.41182
Abstract: An illegal access point, called a Rogue Access Point (RAP), is an access point that has been installed on a secure network without explicit authorization from a system administrator. Rogue access points are a big security threat to wireless networks: if they are not detected and mitigated on time, the result is serious network damage, data loss, and leakage of confidential information to outside networks, so every organization should give priority to finding and avoiding them. Undetected rogue access points can be an open door to sensitive information; many data raiders have taken advantage of them in enterprises not only to get free Internet access but also to view confidential information. Most current solutions for detecting rogue access points are not automated and depend on a specific wireless technology. Our approach is an automated solution that can be installed on any router at the edge of a network. Rogue access point detection is a two-step process: first discovering the presence of an access point in the network, then identifying whether it is rogue or not. In our implementation, we use the clock skew of a wireless LAN access point as its fingerprint to detect fake APs; fingerprinting acts as a unique identifier, much as a human fingerprint does. The major objective of using the clock skew for detecting fake APs is to overcome the limitation of existing approaches, which fail under MAC address spoofing.
Keywords: Wireless LAN, Rogue Access Point (RAP), Media Access Control (MAC), Internet Protocol (IP).
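The clock-skew fingerprinting idea can be sketched as a least-squares slope estimate over beacon timestamps; the decision threshold below is an illustrative assumption, not a value from the paper:

```python
def clock_skew(local_times, beacon_timestamps):
    # Least-squares slope of (AP timestamp - local time) against local time.
    # A genuine AP shows a stable slope (its clock skew); a spoofed AP
    # forging the same MAC typically shows a different slope.
    n = len(local_times)
    offsets = [b - t for t, b in zip(local_times, beacon_timestamps)]
    mt = sum(local_times) / n
    mo = sum(offsets) / n
    num = sum((t - mt) * (o - mo) for t, o in zip(local_times, offsets))
    den = sum((t - mt) ** 2 for t in local_times)
    return num / den

def same_ap(skew_a, skew_b, threshold=5e-6):
    # Flag a rogue when the measured skew differs from the stored
    # fingerprint by more than the (assumed) threshold.
    return abs(skew_a - skew_b) < threshold
```

In practice the local times come from frame-capture timestamps and the AP timestamps from the TSF field of beacon frames, so the estimate must also cope with queueing jitter, which the least-squares fit averages out.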
Problem Analysis of Multiple Object Tracking System: A Critical Review
Md Zahidul Islam, Md Shariful Islam, Md Sohel Rana
DOI: 10.17148/IJARCCE.2015.41183
Abstract: Simultaneous tracking of multiple objects is a state-of-the-art problem in the field of computer vision. Tracking a single object and tracking multiple objects are not the same problem: with multiple objects, many difficulties can arise from abrupt object motion, interactions between objects, drifting of objects, and so on. Some recent work has proposed solutions to minimize these problems. The main goal of this article is to analyze recent work in multiple-object tracking that handles these challenges. Different proposed penalty functions for the multiple-object tracking problem are discussed, and we comprehensively review the contributions and limitations of recent advances in the field.
Keywords: Tracking problems, Tracklet, Hijacking problem, Drifting problem.
VM Selection using Index Approach for Deploying Cloud Computing Application and Approach to Obtain Equal Utilization of Virtual Machine
Garima Dubey, Yogendra Kumar Jain
DOI: 10.17148/IJARCCE.2015.41184
Abstract: Cloud computing is an upcoming field with plenty of scope and research areas. In this age of technology, everything from our home to the office is done in the cloud, and it is the future of technology. Much research has been done in this field to improve the quality of service. In this paper, we propose a model for deploying cloud computing applications on an indexed cluster. The proposed model is a hybrid cluster approach that controls the cloud operations: the cloud manager allocates a VM within the cluster in such a way that it saves power to a great extent while maintaining latency and efficient utilization of the cloud resources.
Keywords: Cloud Computing, Clustering, VM Allocation, Indexing, and Topology aware.
A Polynomial Time Algorithm to Determine Singly Connectivity in Directed Graph
Ishwar Baidari, Rashmi Gangadhar
DOI: 10.17148/IJARCCE.2015.41185
Abstract: In this paper, we consider the problem of determining whether or not a directed graph is singly connected, i.e., whether for any pair of vertices there exists at most one simple path connecting them. (An undirected graph is singly connected if and only if it is a tree.) We give a straightforward implementation of this check based on DFS that runs in polynomial time.
Keywords: DFS; source vertex; cyclic; tree; spanning forest.
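One common DFS-based check in the spirit of this abstract rejects the graph whenever a DFS from some source reaches an already-finished vertex, since such a forward or cross edge yields a second simple path from that source. This is a sketch of that standard approach, not necessarily the paper's exact procedure:

```python
def is_singly_connected(adj):
    # adj: dict mapping each vertex to the list of its successors.
    # Run a DFS from every source. Meeting a finished (black) vertex means
    # a forward or cross edge, i.e. a second simple path from the source;
    # back edges to gray ancestors are allowed (a simple cycle is fine).
    def dfs(u, color):
        color[u] = 1                      # gray: on the current DFS path
        for v in adj[u]:
            if color[v] == 0:             # white: tree edge, recurse
                if not dfs(v, color):
                    return False
            elif color[v] == 2:           # black: second simple path found
                return False
        color[u] = 2                      # black: finished
        return True

    return all(dfs(s, {v: 0 for v in adj}) for s in adj)
```

Running a DFS from each of the V vertices gives the O(V(V+E)) polynomial bound claimed in the abstract.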
Design and Development of a Vehicle Monitoring System Using CAN Protocol
Mohammed Ismail. B, K. Sasidhar, Syed Aquib Ahmed
DOI: 10.17148/IJARCCE.2015.41186
Abstract: Nowadays economical automobiles are built with more electromechanical parts and analog interfaces for efficient and cost-effective operation. Generally a vehicle has an analog driver-vehicle interface for indicating various vehicle statuses such as speed, fuel level and engine temperature. This paper presents the design and development of a cost-effective digital driving interface for a semi-autonomous vehicle, improving driver-vehicle interaction and increasing safety. Our system uses a PIC-microcontroller-based data acquisition unit whose built-in ADC converts data from analog sensors to digital form and presents it to the driver on an LCD display. The communication module used here is the embedded network bus CAN, which offers efficient data transfer. Experimental data from a prototype is obtained for various vehicle parameters, such as vehicle speed, engine temperature and fuel level in the tank, which are compatible with a real-time system.
Keywords: Controller area network (CAN), Vehicle Sensors, PIC Microcontroller, Communication Module.
Detection of False Sub Aggregated Data in Wireless Sensor Networks
B. Gowtham, A.L. Sreenivasulu
DOI: 10.17148/IJARCCE.2015.41187
Abstract: In wireless sensor networks, data aggregation is one of the major issues. To aggregate data robustly, a framework called synopsis diffusion combines multipath routing with in-network aggregation so that data is aggregated accurately in the presence of failures. However, nodes can be compromised due to the lack of physical protection, and compromised nodes make sensor networks very vulnerable: taking advantage of this, an adversary can inject false data into the network. Here, a novel lightweight verification algorithm performed by the sink node or base station can detect any false sub-aggregate within the aggregated data. It makes the synopsis diffusion approach secure against attacks in which compromised nodes contribute false sub-aggregates. Theoretical analysis and extensive simulations have been conducted to verify the approach.
Keywords: Base station, data aggregation, synopsis diffusion.
Web Mining – Data Mining Concepts, Applications, and Research Directions
Mrs. S. R. Kalaiselvi, S. Maheshwari, V. Shobana
DOI: 10.17148/IJARCCE.2015.41188
Abstract: The prolific growth of web-based applications and the enormous amount of data involved therein led to the development of techniques for identifying patterns in web data. Web mining refers to the application of data mining techniques to the World Wide Web. Web usage mining is the process of extracting useful information from web server logs based on the browsing and access patterns of users. The information is especially valuable for business sites seeking improved customer satisfaction. Based on users' needs, web usage mining discovers interesting usage patterns from web data in order to understand and better serve the needs of web-based applications, uncovering hidden patterns from web logs. It consists of three phases: preprocessing, pattern discovery and pattern analysis. In this paper, we present each phase in detail, the process of extracting useful information from server log files, and some application areas of web usage mining such as education, health, human-computer interaction, and social media.
Keywords: Web Mining, Data Mining, World Wide Web, server log files.
Deduplication Techniques in Storage System
Deepali Choudhari, R. W. Deshpande
DOI: 10.17148/IJARCCE.2015.41189
Abstract: The ever-increasing volume of backup data in cloud storage is a vital challenge. Data management is needed because backup windows are shrinking as information grows. To make data management scalable, the concept of deduplication is used: a technique that keeps only one unique instance of each data copy by detecting identical copies and eliminating them, thereby improving storage utilization and the performance of the storage system. Different deduplication schemes have been introduced, and this paper surveys these approaches.
Keywords: deduplication, convergent encryption, cloud storage, cryptographic.
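The core idea, storing one instance per unique data copy, can be sketched with fixed-size chunking and content hashes. This is one of the simpler schemes such a survey covers; production systems often use variable-size (content-defined) chunking and reference counting:

```python
import hashlib

class DedupStore:
    # Fixed-size chunking with SHA-256 fingerprints: each unique chunk is
    # stored once; a file is kept only as a list of chunk hashes.
    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}   # hash -> chunk bytes (one copy per unique chunk)
        self.files = {}    # file name -> ordered list of chunk hashes

    def put(self, name, data):
        refs = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            h = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(h, chunk)   # store only the first copy
            refs.append(h)
        self.files[name] = refs

    def get(self, name):
        # Reassemble a file from its chunk references.
        return b"".join(self.chunks[h] for h in self.files[name])
```

Storing the same backup twice costs only the second list of hashes, which is where the storage-utilization gain discussed above comes from; convergent encryption layers a deterministic cipher on top so deduplication still works on encrypted chunks.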
A Survey of Context-Aware Framework for Pervasive Environment
K. H. Walse, Dr. R.V. Dharaskar, Dr. V. M. Thakare
DOI: 10.17148/IJARCCE.2015.41190
Abstract: A context-aware framework is the backbone of a context-aware application, because context is continuously evolving and the framework is implemented depending on the requirements of the situation. Many architectures and frameworks have been proposed to support the development and ease the implementation of context-aware systems, with special thrust on characteristics related to the application domain and the techniques used. A survey that compares and evaluates such architectures is therefore strongly recommended. Earlier surveys are either restricted to a limited number of architectures, do not offer a good comparison, or base their evaluation on inappropriate criteria, which keeps them mere descriptions. The present survey aims to identify the relevant architectures that mark the evolution of context-aware systems, based on criteria related to pervasive computing. It should serve as a guide to developers of context-aware systems and help them make architectural choices.
Keywords: Context-aware, Pervasive Computing, Context-Aware Computing, Framework.
A Review on Various Techniques of Secure Signature Verification: SIFT, SURF and G-SURF
Ritika Sachdeva, Ekta Gupta
DOI: 10.17148/IJARCCE.2015.41191
Abstract: Biometric authentication is used in computer science as a form of identification and access control. It is also used to identify individuals in groups that are under surveillance. Biometric identifiers are the distinctive, measurable characteristics used to label and describe individuals. In this paper we discuss signature verification, the process of verifying signatures for the authentication of users. Two scenarios are used in signature verification: online and offline. For feature extraction, the SIFT, SURF and G-SURF approaches have been utilized; these approaches extract features based on keypoints and their parameters. G-SURF computes a global feature for the image and adds it to the SURF features to select the optimum features. In the proposed work we compare the performance of these features for signature verification, evaluating them on the same database.
Keywords: Biometric, SIFT, G-SURF & SURF.
Comparison of Contemporary Real Time Operating Systems
Mr. Sagar Jape, Mr. Mihir Kulkarni, Prof. Dipti Pawade
DOI: 10.17148/IJARCCE.2015.41192
Abstract: With the advancement of the embedded area, the importance of real-time operating systems (RTOS) has increased to a great extent. Nowadays, low latency, efficient memory utilization and effective scheduling techniques are basic requirements for every embedded application. Thus in this paper we have attempted to compare some real-time operating systems. The systems (viz. VxWorks, QNX, eCos, RTLinux, Windows CE and FreeRTOS) have been selected according to the highest-user-base criterion. We enlist the peculiar features of the systems with respect to parameters such as scheduling policies, licensing and memory management techniques, and further compare the selected systems over these parameters. Our effort to organize the often confused, complex and contradictory pieces of information on contemporary RTOSs into a simple, analytical structure will provide decisive insights to the reader on selecting an RTOS as per his requirements.
Keywords: RTOS, VxWorks, QNX, eCOS, RTLinux,Windows CE, FreeRTOS
A Review On Clustering Of Streaming Data
Madhuri Vilas Gohad, Prashant Yawalkar
DOI: 10.17148/IJARCCE.2015.41193
Abstract: Data stream clustering is an active research area that has recently been used to discover knowledge from continuously generated large amounts of data. Various data stream clustering algorithms have been developed and proposed to perform clustering on data streams. Clustering is the task of arranging a set of objects so that objects in the same group are more related to each other than to those in other groups (clusters). Data stream clustering imposes several challenges that need to be addressed, among them dealing with dynamic data, processing fast-arriving objects, performing incremental processing of data objects, and respecting time, memory and cost limitations.
Keywords: Affinity Propagation, Autonomic Computing, Clustering, Data stream, Grid Monitoring.
A Review On Credit Card Fraud Detection Using BLAST-SSAHA Method
Mr Yogesh M Narekar, Mr Sushil Kumar Chavan
DOI: 10.17148/IJARCCE.2015.41194
Abstract: In recent times, the usage of credit cards has increased radically due to their varied benefits. Payment by credit card has made people's lives easy for both online and ordinary purchases and has thus become widespread; this enormous usage also leads to different frauds. Due to the rise and rapid growth of e-commerce, the use of credit cards for online purchases has dramatically increased, causing an explosion in credit card fraud, and as the credit card becomes the most popular mode of payment for both online and regular purchases, cases of associated fraud are also rising. In real life, fraudulent transactions are scattered among genuine transactions, and simple pattern matching techniques are often insufficient to detect those frauds accurately. This system investigates the current debate regarding credit fraud in the banking sector and vulnerabilities in online banking, and studies possible remedial actions to detect and prevent credit fraud. It reveals many channels of fraud in online banking, which are increasing day by day and are the main barriers for e-business in the banking sector. The system also surveys various techniques used in credit card fraud detection mechanisms and evaluates each methodology against certain design criteria.
Keywords: Fraud detection, credit card, BLAST-SSAHA method, E-Commerce.
Comparative Analysis Of Microstrip Rectangular Patch Antenna Using Different Height Substrates
Sudarshan Kumar Jain
DOI: 10.17148/IJARCCE.2015.41195
Abstract: A compact rectangular microstrip patch antenna with minimum return loss and enhanced gain and bandwidth is proposed in this paper. The antenna is analysed for different heights of the dielectric substrate. The resonant frequency of the designed antenna is 2.4 GHz. The dielectric substrate used for the antenna design is FR4-epoxy, with a dielectric constant of 4.4. The designed antenna has a bandwidth that covers the frequency band of WLAN applications. It is observed that the gain of the antenna with a greater substrate height exceeds that obtained with the normal substrate height; the return loss is also greater than that of the conventional patch antenna. Microstrip line feeding is used to energize the antenna, and the VSWR of the designed antenna is less than 2. A comparative analysis of antenna characteristics such as gain, bandwidth, return loss, directivity and VSWR is carried out for the different substrate heights. The High Frequency Structure Simulator (HFSS) software is used for simulation of the designed antenna.
Keywords: VSWR, Microstrip antenna, Bandwidth, High Frequency structure simulator.
A Review on Categorization of Text Data Using Side Information
Sandeep Jadhav, Dr. K. V. Metre
DOI: 10.17148/IJARCCE.2015.41196
Abstract: In today's digital environment, text databases are growing rapidly due to the use of the internet and communication media. Different text mining techniques are used for knowledge discovery and information retrieval. Text documents often carry side information along with the text itself. Side information may be metadata associated with the text, such as author, co-author or citation networks, document provenance information, web links, or other kinds of data that provide more insight into the text documents. Such side information carries a tremendous amount of information useful for clustering. Using it in the categorization process yields more refined clusters; but side information may also be noisy and cause wrong categorization, which decreases the quality of the clustering process. Therefore, a new approach for mining text data using side information is suggested, which combines a partitioning approach with a probabilistic estimation model for mining the text data along with its side information.
Keywords: Text data mining, categorization, side information, clustering.
Virtual Clustering Based Routing For Power Heterogeneous MANETs
M. Vevagananthan, Mr. P. Vijayakumar
DOI: 10.17148/IJARCCE.2015.41197
Abstract: In MANETs, the network may consist of devices with heterogeneous characteristics in terms of transmission power, energy, capacity, etc. In particular, nodes are likely to transmit at different power levels, causing the communication links to vary; this gives rise to the link asymmetry problem. The link asymmetry problem can be solved by a cross-layer approach without exploiting the benefits of high power nodes, and collisions may be avoided using carrier sensing. The presence of high power nodes in a MANET has advantages, such as wider coverage area and reduced transmission delay, but also disadvantages, such as interference and noise; the existing cross-layer approach does not address this trade-off. In this context, we consider the problem of improving the routing performance of power heterogeneous MANETs by efficiently exploiting the advantages and avoiding the disadvantages of high power nodes. In a power heterogeneous network there are both high power and low power nodes, and the interference raised by high power nodes can severely affect throughput. To address this issue, a loose-virtual-clustering (LVC) based routing protocol is proposed.
Keywords: Mobile ad hoc networks, LRPH, power heterogeneous, routing protocol, geographical routing.
Recommendation System for Answering Missing Tuples
Kanchan Pekhale, Dr. K. V. Metre
DOI: 10.17148/IJARCCE.2015.41198
Abstract: In recent years, the quality and usability of database systems have received more attention. While the performance of database systems has improved greatly over the past decades, they have become more and more difficult to use. Why-not questions express a user's need to know why her expected tuples do not show up in the query results, i.e., they explain missing tuples in database queries. A database system with this capability enables users to seek clarifications on expected query results, including the absence of expected tuples (missing tuples). It would be very helpful to users to pose why-not questions to seek clarifications on expected tuples missing from query results. Two algorithms are presented that answer why-not questions efficiently and are able to return high-quality explanations. Many users pose such queries when making multi-criteria decisions and need approximate information from a huge database.
Keywords: Top-k Question, Dominating Question, Skyline Refined Queries, ConQueR Method.
Hybrid Approach for Image Restoration
K.S. Sathya, C. Nithya
DOI: 10.17148/IJARCCE.2015.41199
Abstract: Local and nonlocal image representations have shown great potential in low-level vision tasks, leading to several state-of-the-art image restoration techniques. Both representations have their own advantages, yet work towards combining them has been minimal. This paper contributes to the unification of these two representations: a hybrid approach for image restoration is proposed that combines them both. The main idea behind this approach is singular value decomposition (SVD), viewed from a bilateral variance estimation perspective. The SVD of similar patches has the property of pooling both local and non-local information for estimating signal variances, which in turn leads to a new class of image restoration algorithms. For noisy data, the algorithm uses the concept of iterative regularization; for incomplete data, it uses a deterministic annealing-based solution along with dictionary learning. The performance of this hybrid approach compares favourably with other leading image restoration algorithms.
Keywords: Deterministic annealing, iterative regularization, singular value thresholding, singular value decomposition, image denoising, image completion, patch clustering.
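The singular value thresholding idea behind the bilateral variance estimation above can be sketched in a few lines. The pure-Python illustration below (not the paper's actual estimator, which pools many non-local patches and iterates) uses power iteration to extract the leading singular component of a small patch stack and keeps it only when its singular value exceeds a made-up threshold `tau`:

```python
# Singular value thresholding (SVT) sketch: keep only the singular
# components whose singular value exceeds a noise-dependent threshold.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def dominant_singular_triplet(A, iters=200):
    """Power iteration on A^T A gives the top right singular vector v,
    then sigma = |A v| and u = A v / sigma."""
    n = len(A[0])
    v = [1.0] * n
    AtA = matmul(transpose(A), A)
    for _ in range(iters):
        w = [sum(AtA[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(len(A))]
    sigma = sum(x * x for x in Av) ** 0.5
    u = [x / sigma for x in Av]
    return sigma, u, v

def svt_rank1(A, tau):
    """Keep the leading component only if its singular value exceeds tau."""
    sigma, u, v = dominant_singular_triplet(A)
    if sigma <= tau:
        return [[0.0] * len(A[0]) for _ in A]
    return [[sigma * ui * vj for vj in v] for ui in u]

# A stack of two perfectly similar "patches" forms a rank-1 matrix, so
# thresholding recovers it exactly, while pure noise would be zeroed out.
A = [[3.0, 4.0], [6.0, 8.0]]
print(svt_rank1(A, tau=1.0))
```

In the full method this shrinkage is applied per stack of similar patches, with the threshold tied to the estimated noise variance.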
Review Paper on Security Intelligence with Big Data Analytics
Chiquita Prabhu, Omkar Neogi, Kriti Shrivastava, Neha Katre
DOI: 10.17148/IJARCCE.2015.411100
Abstract: Unfortunately, no one is immune to security threats, which are increasing exponentially in cost, impact, extent and complexity. Traditional security approaches are not competent enough, due to the abundance of information and the lack of tools that would allow analysts to gain insight into that information and learn about unknown threats. To resolve this problem, and to detect the weak signals of threats that hide behind the noise of an organisation's huge data, security intelligence platforms built as Big Data solutions have emerged. The aim of this paper is to summarise why Big Data analytics is used for security intelligence, the ideal requirements of a platform designed for security purposes, how Big Data analytics helps compared to traditional approaches, and a study of various platforms developed for security intelligence using Big Data analytics. Lastly, we present a comparative study of the two most popular security intelligence platforms and discuss how Big Data could become a dominant name in the security field if certain challenges are overcome.
Keywords: Anomaly, Beehive, Big Data, Big Data analytics, Intelligence driven security, IBM QRadar.
A Review on Nearest Neighbour Techniques for Large Data
Deoyani Sonawane, Prof. P. M. Yawalkar
DOI: 10.17148/IJARCCE.2015.411101
Abstract: For many computer vision and machine learning problems, the key to good performance is a large data set. However, in many computer vision and machine learning algorithms, finding nearest neighbour matches in a large data set is the computationally expensive part. New algorithms have been developed for approximate nearest neighbour matching, evaluated, and compared with preceding algorithms. For matching high dimensional features, the most efficient algorithms are essential. Locality sensitive hashing (LSH) is perhaps the best known hashing-based nearest neighbour technique; it requires multiple hash functions with the property that the hashes of elements that are close to each other are also likely to be close. Variants of LSH exist: multi-probe LSH improves on the high storage costs by reducing the number of hash tables, and LSH Forest adapts better to the data without requiring hand tuning of parameters. For finding the best algorithm to search a particular data set, the optimal nearest neighbour algorithm and its parameters depend on the data set characteristics, and an automated configuration procedure is described. In order to scale to very large data sets that would otherwise not fit in memory, possible solutions include performing some dimensionality reduction on the data, keeping the data on disk and loading only parts of it into main memory, or distributing the data over several computers and using a distributed nearest neighbour search algorithm.
Keywords: Nearest Neighbour Search, Big Data, Approximate Search.
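The LSH property described above — nearby points hashing to nearby codes — can be illustrated with random hyperplanes. The sketch below is a generic sign-of-projection LSH; the dimension, bit count and sample vectors are illustrative values, not anything tuned as the survey discusses:

```python
# Random-hyperplane LSH sketch: each hash bit is the sign of a dot product
# with a random Gaussian direction, so nearby vectors agree on most bits
# and tend to fall in the same bucket.
import random

def make_hash(dim, bits, rng):
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(bits)]
    def h(v):
        return tuple(1 if sum(p * x for p, x in zip(plane, v)) >= 0 else 0
                     for plane in planes)
    return h

rng = random.Random(42)
h = make_hash(dim=3, bits=8, rng=rng)

a = [1.0, 2.0, 3.0]
b = [1.01, 2.0, 2.99]   # a near-duplicate of a
c = [-5.0, 0.5, -4.0]   # a far-away point

same_ab = sum(x == y for x, y in zip(h(a), h(b)))
same_ac = sum(x == y for x, y in zip(h(a), h(c)))
print(same_ab, same_ac)  # the near pair typically agrees on more bits
```

Multi-probe LSH and LSH Forest build on exactly this bucket structure, trading hash-table count against probing effort.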
Co and Adjacent Channel Interference Evaluation in GSM and UMTS Cellular Networks
Selma Sbit, Mohamed Bechir Dadi, Belgacem Chibani
DOI: 10.17148/IJARCCE.2015.411102
Abstract: Interference is the major limiting factor when evaluating the performance of cellular radio systems. Sources of interference may be another mobile in the same cell, a call in progress in a neighbouring cell, or other base stations operating in the same frequency band. Interference on voice channels causes cross talk due to an undesired transmission, while on control channels it leads to missed and blocked calls due to errors in the digital signalling. Interference is more severe in urban areas, due to the large number of base stations and mobiles; it has been recognised as a major bottleneck when trying to increase capacity and is often responsible for dropped calls. The two major types of system-generated cellular interference are co-channel interference and adjacent channel interference. Even though interfering signals are often generated within the cellular system, they are difficult to control in practice (due to random propagation effects). Even more difficult to control is interference caused by out-of-band users, which arises without warning due to front-end overload of subscriber equipment or intermittent intermodulation products. In practice, the transmitters of competing cellular carriers are often a significant source of out-of-band interference, since competitors often locate their base stations in close proximity to one another in order to provide comparable coverage to customers. This paper presents a study of co- and adjacent channel interference in the second and third generations of cellular radio systems.
Keywords: Frequency Reuse; Co-Channel Interference; Adjacent Channel Interference; Evaluation.
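For the co-channel case, the classical frequency-reuse model gives a closed-form worst-case estimate of the signal-to-interference ratio. The sketch below computes it under standard textbook assumptions (hexagonal cells, six equidistant first-tier interferers); the actual evaluation in the paper may refine these assumptions:

```python
# Worst-case co-channel SIR from the classical frequency-reuse model:
# S/I = (sqrt(3*N))**n / i0, where N is the reuse cluster size, n the
# path-loss exponent, and i0 the number of first-tier co-channel cells
# (6 for a hexagonal layout).
import math

def co_channel_sir_db(cluster_size, path_loss_exp=4, first_tier_cells=6):
    q = math.sqrt(3 * cluster_size)          # co-channel reuse ratio D/R
    sir = (q ** path_loss_exp) / first_tier_cells
    return 10 * math.log10(sir)

print(round(co_channel_sir_db(7), 2))    # N = 7 gives about 18.7 dB
print(round(co_channel_sir_db(12), 2))   # larger clusters trade capacity for SIR
```

The example shows the core design tension: increasing the cluster size N raises the SIR but reduces the number of channels available per cell.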
Shadow Removal from Images Using the Concept of Chromaticity
Harpreet Kaur, Navdeep Kaur
DOI: 10.17148/IJARCCE.2015.411103
Abstract: A shadow is essentially the image region produced when light is blocked; it generally occupies the space behind an opaque item lit from the front. Shadows often introduce errors in the performance of computer vision algorithms such as object detection and tracking; for better performance they need to be removed to obtain a shadow-free image. This paper proposes a simple method to remove shadows from images. In this work, in order to objectively evaluate the performance of the shadow removal system, statistical parameters were compared using the mean and the average gradient. The average gradient expresses the ability to render small details and can be used to evaluate the clarity of the image: the greater its value, the clearer the image. In addition, to some extent, the mean can be used to evaluate the image contrast.
Keywords: Shadow removal, Chromaticity, Mean, Average gradient, Statistical Parameters.
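The two evaluation metrics named above can be computed directly. The sketch below, for a grayscale image stored as a list of rows, follows the common definition of the average gradient as the mean of the root-mean-square of the horizontal and vertical differences (the paper's exact formula may differ slightly):

```python
# Mean (brightness level) and average gradient (clarity of fine detail)
# for a grayscale image given as a list of rows.

def mean_and_avg_gradient(img):
    h, w = len(img), len(img[0])
    mean = sum(sum(row) for row in img) / (h * w)
    total = 0.0
    for i in range(h - 1):
        for j in range(w - 1):
            dx = img[i][j + 1] - img[i][j]   # horizontal difference
            dy = img[i + 1][j] - img[i][j]   # vertical difference
            total += ((dx * dx + dy * dy) / 2) ** 0.5
    avg_grad = total / ((h - 1) * (w - 1))
    return mean, avg_grad

flat  = [[100, 100], [100, 100]]
sharp = [[0, 255], [255, 0]]
print(mean_and_avg_gradient(flat))   # uniform image: zero gradient
print(mean_and_avg_gradient(sharp))  # sharp detail: large average gradient
```

Comparing these values before and after shadow removal is exactly the objective evaluation the abstract describes.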
Optimized Edge Detection of Colored and Grayscale Images using Matrix Laboratory
Sheenam
DOI: 10.17148/IJARCCE.2015.411104
Abstract: Edge detection is an image processing technique for finding the boundaries of objects within images. It works by detecting discontinuities in brightness, and is used for image segmentation and data extraction in areas such as image processing, computer vision, and machine vision. In an image, an edge is a curve that follows a path of rapid change in image intensity; edges are often associated with the boundaries of objects in a scene. In MATLAB, the edge function can be used to find edges. This function looks for places in the image where the intensity changes rapidly, using one of two criteria: places where the first derivative of the intensity is larger in magnitude than some threshold, or places where the second derivative of the intensity has a zero crossing. The edge function provides a number of derivative estimators, each of which implements one of these definitions; for some of these estimators, you can specify whether the operation should be sensitive to horizontal edges, vertical edges, or both. The function returns a binary image containing 1's where edges are found and 0's elsewhere. The most powerful edge-detection method that edge provides is the Canny method. The Canny method differs from the other edge-detection methods in that it uses two different thresholds (to detect strong and weak edges) and includes the weak edges in the output only if they are connected to strong edges. This method is therefore less likely than the others to be fooled by noise, and more likely to detect true weak edges.
Keywords: Edge Detection, Image Processing Technique, Image Segmentation, Data Extraction, Canny method.
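The double-threshold hysteresis that distinguishes the Canny method can be shown in isolation. The sketch below applies it to a 1-D row of gradient magnitudes (real Canny also smooths, computes 2-D gradients and thins edges; only the strong/weak linking step is shown, with made-up threshold values):

```python
# Canny-style hysteresis on 1-D gradient magnitudes: samples above the high
# threshold are strong edges; samples between the thresholds are weak edges
# kept only when connected to a strong one.

def hysteresis_1d(grad, low, high):
    strong = [g >= high for g in grad]
    keep = list(strong)
    changed = True
    while changed:                      # grow strong edges into weak ones
        changed = False
        for i, g in enumerate(grad):
            if not keep[i] and g >= low:
                if (i > 0 and keep[i - 1]) or (i + 1 < len(grad) and keep[i + 1]):
                    keep[i] = True
                    changed = True
    return [1 if k else 0 for k in keep]

grad = [0, 3, 9, 4, 0, 4, 0]
print(hysteresis_1d(grad, low=2, high=8))
# -> [0, 1, 1, 1, 0, 0, 0]: the weak 3 and 4 adjacent to the strong 9
#    survive, while the isolated weak 4 is discarded as likely noise
```

This is why Canny is less easily fooled by noise yet still recovers genuine weak edges.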
Fingerprint Compression based on Sparse Representation using Pixel Level Patch Decomposition
J. Saikrishna, Dr. T. Sreenivasulu Reddy
DOI: 10.17148/IJARCCE.2015.411105
Abstract: Reducing the memory size of an image is known as compression. If the compression process degrades the image, it is known as lossy compression; its counterpart is lossless compression, where the image is not degraded and quality is retained. The classical methods to compress an image are JPEG using the DCT and JPEG 2000 using the DWT, both frequency-domain lossy compression standards. To improve the compression ratio while retaining the quality of the compressed image compared with these classical methods, a new method in the spatial domain is proposed, known as sparse representation using pixel level patch decomposition. The image is decomposed, based on pixel values, into 20x20 patches, and a dictionary is constructed to remove redundancy in the image: a threshold value is fixed, patches whose values are greater than the threshold are retained in the dictionary, and the remaining patches are discarded. Metrics such as PSNR, MSE and compression ratio are calculated, and significant improvement is observed with the proposed method.
Keywords: JPEG, JPEG2000, Sparse Representation, Fingerprint Dictionary.
Analysis of EEG Rhythms in Epilepsy Patients Using MPSO Method
M.J.S Joshi, R.S.K Vaibhav, R.V.S Satyanarayana
DOI: 10.17148/IJARCCE.2015.411106
Abstract: One of the most important tasks in the diagnosis of epilepsy is finding the exact location of the epileptogenic point, and EEG is a tool commonly used at epilepsy diagnosis centres for this purpose. In this paper, a modified particle swarm optimization (MPSO) method is used to solve the EEG source localization problem. An attempt is made to estimate brain activity on the basis of spectrum analysis. EEG classification can be very useful in predicting an action, or the intention of an action, from the EEG, which leads to further developments in brain-computer interfaces. The brain waves α, β, θ, δ and γ were extracted using frequency filtering, and the level of disease is estimated by clinical experts based on the amplitudes of these brain waves. The EEG data were obtained from the MIT-BIH database, and EEGLAB, a mathematical tool for EEG analysis, was used to perform the work in this paper.
Keywords: Electroencephalogram (EEG), particle swarm optimization (PSO).
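MPSO builds on the basic particle swarm loop, which is compact enough to sketch. The objective below is a stand-in (squared distance to a hidden "source" position), not the paper's actual EEG lead-field cost function, and all the swarm parameters are generic textbook values:

```python
# Basic particle swarm optimization: each particle tracks its personal best
# and the swarm's global best, and the velocity update mixes inertia with
# attraction toward both.
import random

def pso(objective, dim, n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)[:]
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

source = [1.0, -2.0, 0.5]
found = pso(lambda p: sum((a - b) ** 2 for a, b in zip(p, source)), dim=3)
print([round(x, 2) for x in found])   # converges near the hidden source
```

Source localization replaces this toy objective with the mismatch between measured scalp potentials and those predicted for a candidate dipole position.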
Comparative Analysis of Techniques for Detecting Copy-Move Image Forgery
Pameli Mukherjee, Saurabh Mitra
DOI: 10.17148/IJARCCE.2015.411107
Abstract: The various image editing tools available today allow image content to be altered without any expert guidance; hence, image-based crime is increasing at a faster rate than before. Various algorithms exist to detect copy-move forgery in images. In this paper, we compare two popular methods, based on the DWT and the DCT, and analyse their performance in terms of execution time and visual results. In both methods, phase correlation is applied after the respective transformation as a similarity criterion to estimate the translation offset between two similar regions of an image.
Keywords: DCT (Discrete Cosine Transform), DWT (Discrete wavelet Transform), lexicographical sorting, phase correlation.
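Both methods share the same skeleton: slide a block over the image, reduce each block to a small transform-domain feature, sort the features lexicographically, and flag near-identical neighbours as copy-move candidates. The 1-D sketch below uses a crude two-coefficient cosine feature as a stand-in for the DCT/DWT features the paper compares:

```python
# Block-matching skeleton for copy-move detection on a 1-D "image":
# feature extraction, lexicographic sorting, and adjacent comparison.
import math

def block_feature(block):
    n = len(block)
    mean = sum(block) / n
    # first cosine coefficient, a coarse low-frequency descriptor
    c1 = sum(x * math.cos(math.pi * (2 * i + 1) / (2 * n))
             for i, x in enumerate(block))
    return (round(mean, 3), round(c1, 3))

def find_duplicates(signal, bsize):
    feats = []
    for start in range(len(signal) - bsize + 1):
        feats.append((block_feature(signal[start:start + bsize]), start))
    feats.sort()                       # lexicographic sort groups look-alikes
    pairs = []
    for (f1, s1), (f2, s2) in zip(feats, feats[1:]):
        if f1 == f2 and abs(s1 - s2) >= bsize:   # ignore trivial overlaps
            pairs.append((min(s1, s2), max(s1, s2)))
    return pairs

# a "copied" run of values appears at offsets 1 and 9
sig = [5, 7, 2, 9, 4, 3, 8, 6, 1, 7, 2, 9, 4, 0]
print(find_duplicates(sig, bsize=4))   # -> [(1, 9)]
```

The papers' phase-correlation step then refines the offset between the matched regions; here exact feature equality plays that role.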
Intelligent Techniques with GUI by Challenge Keypad for Secure Password
Krishna S. Gaikwad, Prof. Amruta Amune
DOI: 10.17148/IJARCCE.2015.411108
Abstract: In general, all keypad-based authentication systems are vulnerable to password guessing by means of observing shoulder movements. Shoulder surfing is an attack on password authentication that has traditionally been hard to defeat, and this problem calls for a new solution. Devising a user authentication scheme based on personal identification numbers (PINs) that is both secure and practically usable is a challenging problem; the greatest difficulty lies in the susceptibility of the PIN entry process to direct observational attacks, such as human shoulder surfing and camera-based recording. PIN entry is widely used for authenticating a user, and is a popular scheme because it nicely balances the usability and security aspects of a system. However, if this scheme is used in a public system, it may suffer from shoulder surfing attacks, in which an unauthorized user can fully or partially observe the login session; the activities of the login session can even be recorded and used later by the attacker to recover the actual PIN. In this paper, we propose an intelligent user interface, known as Color Pass, to resist shoulder surfing, so that any genuine user can enter the session PIN without disclosing the actual PIN. Color Pass is based on a partially observable attacker model. Experimental analysis shows that the Color Pass interface is safe and easy to use, even for novice users.
Keywords: PIN, Shoulder Surfing Attack, User Interface, Partially Observable.
Internet of Things (IOT) Standards, Protocols and Security Issues
Ahmed Mohammed Ibrahim Alkuhlani, Dr S.B. Thorat
DOI: 10.17148/IJARCCE.2015.411109
Abstract: The Internet of Things (IoT) is the new revolution of the internet: after PCs and server-client communication, now sensors, smart objects, wearable devices, and smart phones are able to communicate. Everything surrounding us can talk to everything else, and life will be easier and smarter with smart environments, smart homes, smart cities, intelligent transport and healthcare. But billions of devices communicating wirelessly pose a real and huge challenge to our security and privacy. The IoT requires efficient and effective security solutions that satisfy IoT constraints: low power, small memory and limited computational capabilities. This paper addresses the various standards, protocols and technologies of the IoT, and the different security attacks which may compromise IoT security and privacy.
Keywords: Internet of Things, Security, IEEE 802.15.4, 6LoWPAN, ROLL, CoAP.
Ant Colony Optimization Algorithm for Composition of Web Service using Mobile agents based Semantic, WSDL and QOS analysis
Mr. Sunil R Dhore, Prof. Dr M U Kharat
DOI: 10.17148/IJARCCE.2015.411110
Abstract: Building a composite web service that meets a user's complex need requires using the multiple online web services available. Selecting the best web services for the composition is a combinatorial, NP-complete problem. To solve this problem, we propose a framework for composite web service construction using an ant colony optimization (ACO) algorithm. For choosing the proper path in the ACO, we negotiate with each individual web service to select it for participation in the composite web service. For negotiation, we use intelligent agents that analyse web services based on their semantic, WSDL and QoS descriptions. The negotiation agents build composite web services according to the profiles of the users.
Keywords: Mobile agent, web service composition, QOS, SOA, ACO, WSDL.
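The ACO selection step can be sketched independently of the agent framework: at each workflow step an ant picks one candidate service with probability proportional to pheromone times a QoS heuristic, and the best composition found reinforces its pheromone trail. The QoS values below are made-up scores, not real WSDL/QoS metadata:

```python
# Stripped-down ant colony loop for selecting one service per workflow step.
import random

def aco_compose(candidates, n_ants=30, iters=40, rho=0.1, seed=7):
    rng = random.Random(seed)
    tau = [[1.0] * len(step) for step in candidates]     # pheromone per option
    best_path, best_score = None, -1.0
    for _ in range(iters):
        for _ in range(n_ants):
            path, score = [], 1.0
            for s, step in enumerate(candidates):
                weights = [tau[s][k] * step[k] for k in range(len(step))]
                r = rng.random() * sum(weights)          # roulette-wheel pick
                k = 0
                while k < len(weights) - 1 and r > weights[k]:
                    r -= weights[k]
                    k += 1
                path.append(k)
                score *= step[k]              # multiplicative QoS aggregation
            if score > best_score:
                best_path, best_score = path, score
        for s, k in enumerate(best_path):     # evaporate, then reinforce best
            tau[s] = [(1 - rho) * t for t in tau[s]]
            tau[s][k] += 1.0
    return best_path, best_score

# three workflow steps, each with candidate services scored in (0, 1]
qos = [[0.9, 0.6], [0.5, 0.8, 0.7], [0.95, 0.4]]
print(aco_compose(qos))
```

In the full framework the per-candidate scores would come from the negotiation agents' semantic, WSDL and QoS analysis rather than a fixed table.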
A Review Paper on 2D to 3D Face Recognition Technology
Vinita Bhandiwad
DOI: 10.17148/IJARCCE.2015.411111
Abstract: As one of the most successful applications of image analysis and understanding, face recognition has recently gained significant attention. There are two main reasons for this trend: (a) the wide range of commercial and law enforcement applications, and (b) the availability of feasible technologies. Among the diverse contents of multimedia, face objects are especially important; for example, a security system may automatically track human subjects and report their IDs. However, the 2D images of a 3D face can change dramatically due to lighting and viewing variations. Face recognition in uncontrolled environments is hindered by variations in illumination, pose, expression and occlusion of faces, and many practical face recognition systems are affected by these variations. One way to increase robustness to illumination, pose, expression and occlusion is to use 3D facial images. This paper gives a complete overview of face recognition technology and shows how the field is developing from 2D to 3D face recognition.
Keywords: Face recognition, 2D images, 3D images.
Survey of Efficient Algorithms in Data Mining For High Utility
V. Keerthy, Mrs. B. Buvaneswari
DOI: 10.17148/IJARCCE.2015.411112
Abstract: Data mining is a wide-spreading research topic with frequent applications in online e-business and web click stream analysis. Mining high utility itemsets from a transactional database means discovering itemsets with high utility, such as profits or gains. Efficient discovery of frequent itemsets in large datasets is a crucial task of data mining. Over the past few years many methods have been proposed for generating high utility patterns, but some problems remain, such as producing a large number of candidate itemsets. In high utility itemset mining, a profit value for every item is considered, and generating high utility itemsets from a set of transactions in horizontal data format is common practice. We present a study of the issues related to the different structures and algorithms used for mining high utility itemsets.
Keywords: Data mining; frequent itemset; high utility itemset; transactional database.
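The core quantity these algorithms compute — the utility of an itemset as the sum, over transactions containing it, of quantity times unit profit — is easy to state in code. The toy database and profit table below are illustrative, not from any benchmark:

```python
# Utility of an itemset over a transactional database in horizontal format:
# each transaction maps item -> purchased quantity, and each item has an
# external unit profit.

profit = {'A': 5, 'B': 2, 'C': 1}
db = [
    {'A': 1, 'B': 4},          # transaction 1
    {'A': 2, 'C': 3},          # transaction 2
    {'B': 1, 'C': 6},          # transaction 3
]

def utility(itemset, db, profit):
    total = 0
    for t in db:
        if all(item in t for item in itemset):   # transaction contains itemset
            total += sum(t[item] * profit[item] for item in itemset)
    return total

print(utility({'A'}, db, profit))        # 1*5 + 2*5 = 15
print(utility({'A', 'B'}, db, profit))   # only transaction 1: 5 + 8 = 13
```

Note the key difficulty the survey discusses: unlike frequency, utility is not anti-monotone (a superset may have higher utility than its subsets), which is why naive candidate generation explodes.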
Smart Amplify and Forward Relaying in Time and Frequency Diversity Systems
Nelly M. Hussein
DOI: 10.17148/IJARCCE.2015.411113
Abstract: Long-distance transmission in mobile communication is a serious problem, especially in the 3rd and 4th generation mobile systems. In order to keep service efficiency over long distances similar to the short-distance case, two possible solutions are proposed: first, applying efficient amplification and equalization stages at the receiving end; second, deploying a relaying network between source and destination. In this paper two relaying techniques are introduced: amplify and forward (AF) relaying and smart AF relaying. Both relaying techniques have been simulated in MATLAB for MIMO CDMA and OFDM systems.
Keywords: MATLAB, MIMO, CDMA, OFDM, amplify and forward (AF), Wireless Sensor Networks (WSN).
A Multimodal Biometric Identification System Using Finger Knuckle Print and Iris
Sukhdev Singh, Dr. Chander Kant
DOI: 10.17148/IJARCCE.2015.411114
Abstract: Multimodal biometric systems have become an emerging trend in the biometric world due to their favourable False Acceptance Rate (FAR) and False Rejection Rate (FRR). Their aim is to fuse two or more biometric traits, e.g., face, palm print, fingerprint, ear, iris, retina, voice, etc., to provide a higher security level. This paper describes a new multimodal biometric system combining the finger knuckle print and iris traits. The identification performed by the proposed system is considerably more reliable than that of unimodal biometric systems. The performance has been tested using the PolyU Finger Knuckle Print and CASIA Iris databases, and the effectiveness of the proposed system in terms of False Accept Rate (FAR), False Rejection Rate (FRR) and Genuine Accept Rate (GAR) is demonstrated with the help of the Multimodal Biometrics Integration (MUBI) software.
Keywords: Biometric Fusion, Finger Knuckle Print, Iris, Matching Score, Multimodal biometrics.
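A common way to combine two matchers at the score level — min-max normalise each matcher's score, then take a weighted sum against a decision threshold — can be sketched directly. The raw score ranges, weights and threshold below are hypothetical illustrative values, not those tuned on the PolyU/CASIA data:

```python
# Score-level fusion of two matchers (e.g. knuckle print and iris):
# normalise each score to [0, 1], then combine with a weighted sum.

def min_max(score, lo, hi):
    return (score - lo) / (hi - lo)

def fuse(knuckle_score, iris_score, w_knuckle=0.4, w_iris=0.6):
    # assumed raw score ranges of the two matchers (hypothetical)
    nk = min_max(knuckle_score, 0, 100)
    ni = min_max(iris_score, 0, 1.0)
    return w_knuckle * nk + w_iris * ni

THRESHOLD = 0.5
genuine  = fuse(85, 0.9)   # both matchers report a strong match
impostor = fuse(20, 0.2)
print(round(genuine, 2), round(impostor, 2))
print(genuine >= THRESHOLD, impostor >= THRESHOLD)
```

Sweeping THRESHOLD over the fused scores of genuine and impostor attempts is what produces the FAR/FRR/GAR trade-off curves the paper reports.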
Novel Approach for Policy Network Extraction from Web Documents
Miss. Archana Loni, Mrs. Aarti Waghmare
DOI: 10.17148/IJARCCE.2015.411115
Abstract: The growth of information on the web is very large, so search engines play an important role in finding relations between input keywords. Policy networks are one well-defined domain for such research; today, policy networks are used by economists and political scientists. The analysis of policy networks demands a series of arduous and time-consuming manual steps, including interviews and questionnaires. We calculate the strength of the relation between actors in policy networks using features extracted from data harvested from the web, such as out-links, webpage counts, and lexical information derived from web snippets and documents. The features are evaluated both jointly and in isolation, for both positive and negative actor relations. Performance is measured in terms of the correlation between the human-rated and the automatically extracted relations.
Keywords: Policy Network, relatedness metrics, web link, web Documents.
Enhancing Text Mining Using Side Information
Prof. P. S. Toke, Rajat Mutha, Omkar Naidu, Juilee Kulkarni
DOI: 10.17148/IJARCCE.2015.411116
Abstract: Clustering is a widely studied data mining problem in the text domain, with numerous applications in classification, visualization, document organization, collaborative filtering and indexing. A large quantity of a document's information is present in the form of text, but the data are not purely textual: they also contain a lot of side information, such as various kinds of links in the document, user-access behaviour, document provenance information from web logs, or other non-textual attributes. These attributes may contain a large amount of information useful for clustering. However, it is difficult to estimate their relative importance, especially when some of the information is noisy. In such situations it is risky to integrate side information into the mining process, because it can either add noise to the process or improve the quality of the representation for mining. A principled way is needed to perform the mining process so as to maximize the advantage of the available side information. In this paper, we propose the use of the K-means algorithm for better and more efficient clustering of the information.
Keywords: Clustering, Information-Retrieval, K-means, Side Information, Text mining.
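The K-means step the paper builds on is compact enough to sketch. In the toy run below, each point stands for a document represented by one text-derived feature plus one numeric side-information feature; the data and initial centres are illustrative values:

```python
# Bare-bones K-means on 2-D points: assign each point to its nearest
# centre, recompute centres as cluster means, repeat.

def kmeans(points, centres, iters=10):
    for _ in range(iters):
        clusters = [[] for _ in centres]
        for p in points:
            k = min(range(len(centres)),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[k].append(p)
        centres = [
            [sum(xs) / len(c) for xs in zip(*c)] if c else centres[k]
            for k, c in enumerate(clusters)
        ]
    return centres, clusters

pts = [(1, 1), (1.5, 2), (8, 8), (9, 9), (8.5, 7.5)]
centres, clusters = kmeans(pts, centres=[(0, 0), (10, 10)])
print(centres)
print([len(c) for c in clusters])   # cluster sizes after convergence
```

In the paper's setting the distance function would weight the side-information coordinates by their estimated reliability, which is exactly where noisy attributes can be down-weighted.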
Test Validation of Phase Switched Interferometer Module for Calibration of MST Radar Array
T. Rajendra Prasad, N. Vismayee, T. Venkateswarlu, P. Satyanarayana
DOI: 10.17148/IJARCCE.2015.411117
Abstract: Studies of the earth's atmosphere in the Mesosphere, Stratosphere and Troposphere (MST) regions are performed by a radar operating at a frequency of 53 MHz. This paper outlines the MST radar signal and data processing, along with the test validation of a phase-switching interferometer for phased array antenna calibration. An advanced digital-receiver-based radio interferometer has been developed, which has advantages over the prior analog module. The system consists of an RF module, a phase switch controller module and a digital receiver module. A MATLAB program separates the interlaced SUM and DIFF data collected on the single-channel digital receiver. Successful testing of the system is demonstrated by signal injection with a signal generator, together with software for separating the data sets of the two channels of the phase switching system, which is used to increase the discrimination and sensitivity of an interferometer in detecting weak radio sources.
Keywords: MST radar, Coherent Integration End (CIE) pulse, phase switch control, RF interferometer.
Grammatical Error Detection Model for Assamese Sentences
Hirakjyoti Sarma, Debasish Das, Kishore Kashyap
DOI: 10.17148/IJARCCE.2015.411118
Abstract: Assamese is an Eastern Indo-Aryan language used mainly in the state of Assam. This paper presents an introduction to a grammatical analysis model of the structure of Assamese sentences which has evolved from extensive computational, linguistic, and psycholinguistic research, and provides a simple set of rules for describing the common properties of all natural languages and the particular properties of individual languages. Parsing is important in linguistics and natural language processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems like free word order, ambiguity and inefficiency. We propose a model based on the top-down parsing method and a context free grammar (CFG) for the Assamese language over a limited domain.
Keywords: Assamese, Grammar Checker, CFG, Parsing.
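A top-down CFG recogniser of the kind the model describes fits in a few lines. The toy grammar below uses the SOV (subject-object-verb) clause shape typical of Assamese, with romanised placeholder lexicon entries that stand in for the paper's actual Assamese lexicon:

```python
# Toy top-down (recursive descent) recogniser for an SOV-style CFG:
#   S -> NP VP,  VP -> NP V,  NP -> N

GRAMMAR = {
    'S':  [['NP', 'VP']],
    'VP': [['NP', 'V']],
    'NP': [['N']],
}
LEXICON = {'N': {'ram', 'bhat'}, 'V': {'khay'}}   # placeholder romanised words

def parse(symbol, tokens, pos):
    """Return every token position reachable after deriving `symbol` from `pos`."""
    if symbol in LEXICON:
        if pos < len(tokens) and tokens[pos] in LEXICON[symbol]:
            return [pos + 1]
        return []
    ends = []
    for production in GRAMMAR[symbol]:
        positions = [pos]
        for sub in production:          # thread positions through the RHS
            positions = [e for p in positions for e in parse(sub, tokens, p)]
        ends.extend(positions)
    return ends

def grammatical(sentence):
    tokens = sentence.split()
    return len(tokens) in parse('S', tokens, 0)   # must consume all tokens

print(grammatical('ram bhat khay'))   # SOV order parses
print(grammatical('ram khay bhat'))   # verb before object is rejected
```

A grammar checker flags exactly the sentences for which no derivation consumes all tokens; the free word order the abstract mentions is handled by adding alternative productions.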
Implementation of Surveillance Monitoring System
Ms. Moharil R. S, Dr. Mrs Patil S. B
DOI: 10.17148/IJARCCE.2015.411119
Abstract: This paper reviews the implementation of a surveillance monitoring system based on an embedded system and PIR sensor modules. The aim of this work is to implement a low-cost surveillance system using a serial camera and desktop programming. MATLAB is used for detecting the number of persons, and a GSM module is used to provide the necessary information to the owner via SMS. When intruders are detected by the ultrasonic sensors, the camera starts, captures images and saves them into memory storage. The saved images are then sent to the processing unit, which determines the number of persons present. The PIR module makes the system more reliable.
Keywords: Embedded System, Matlab, GSM module, PIR sensor module.
Abstract
Detect Pedestrian Orientation by Integrating Multiclass SVM Utilizing Binary Decision Tree
G. Santoshi, G Gowri Pushpa
DOI: 10.17148/IJARCCE.2015.411120
Abstract: In driver assistance systems, pedestrian protection is an essential component that should be able to predict the probability of collision after detecting a pedestrian, and it is important to consider all available cues when making this prediction. One such cue is the direction in which the pedestrian is facing, which can be used to predict where the pedestrian may move next. The proposed method is a novel approach to pedestrian orientation classification using an SVM-based Binary Decision Tree (SVMBDT) architecture for the multiclass problem: the hierarchy of binary subtasks in the decision tree uses an SVM for each decision. The algorithm thus combines the efficient computation of a decision tree with the high classification accuracy of SVMs. Experiments evaluate the performance of orientation estimation and show the promise of the approach.
Keywords: Orientation, Multiclass problem, SVMBDT, Binary decision tree.
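The SVMBDT decomposition can be sketched as a binary tree in which each internal node holds one binary classifier. In the paper each node would be a trained SVM; here the node classifiers are stand-in callables, and the four orientation labels and toy features are illustrative assumptions.

```python
# Minimal sketch of an SVM-based Binary Decision Tree (SVMBDT):
# a K-class problem becomes a tree of binary decisions. Each Node's
# classifier would be a trained SVM in the paper; the lambdas below
# are illustrative stand-ins, as are the orientation labels/features.

class Node:
    def __init__(self, classifier, left, right):
        self.classifier = classifier   # returns False (left) / True (right)
        self.left, self.right = left, right

class Leaf:
    def __init__(self, label):
        self.label = label

def classify(node, x):
    """Walk the tree, asking one binary classifier per level."""
    while isinstance(node, Node):
        node = node.right if node.classifier(x) else node.left
    return node.label

# Toy feature vector: (lateralness, facing_sign).
root = Node(
    classifier=lambda x: x[0] > 0.5,   # lateral pose vs frontal pose
    left=Node(lambda x: x[1] > 0, Leaf("back"), Leaf("front")),
    right=Node(lambda x: x[1] > 0, Leaf("left"), Leaf("right")),
)
```

The efficiency claim in the abstract corresponds to the tree walk: only about log2(K) of the binary classifiers are evaluated per sample, rather than all K(K-1)/2 pairwise SVMs.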
Abstract
Test Validation of MST Radar 3-channel Digital Receiver
T Rajendra Prasad, K Gayathri, T Venkateswarlu, P Satyanarayana
DOI: 10.17148/IJARCCE.2015.411121
Abstract: A 3-channel digital receiver has been developed for atmospheric science research with the MST radar, implementing spaced-antenna and interferometer techniques using spatially distributed antenna modules. The single-channel IF digital receiver operational with the VHF Mesosphere-Stratosphere-Troposphere (MST) radar has been upgraded with an additional 3-channel backend IF down/up converter hardware module and a digital receiver software module, test-validated at NARL. The 3-channel digital receiver was attached to the MST radar and satisfactory results were obtained. This paper presents the specifications and hardware details of the digital receiver and its test results with the MST radar, including performance tests with simulated signals. It also outlines Indian MST radar signal and data processing, the method of offline data processing, and the test validation of the 3-channel digital receiver.
Keywords: Digital receiver, up/down converter, MST radar.
Abstract
Design and Analysis of Ultra Wide Band Giuseppe Peano Fractal Antenna at Different Height Level of Substrate
Shikha Verma, Sumit Kaushik, Mandeep Singh Saini
DOI: 10.17148/IJARCCE.2015.411122
Abstract: In this paper, a microstrip patch antenna is designed for ultra-wideband applications. The Giuseppe Peano algorithm is applied to a microstrip patch antenna with a rectangular patch of size 30 × 24 mm². A coaxial feed line is used with the patch antenna, and results are obtained using FR-4 as the dielectric substrate. The proposed antenna achieves an ultra-wide band of 4.94 GHz bandwidth with a minimum return loss of −32.73 dB. The maximum gain produced by the proposed antenna in this ultra-wideband range is 2.37 dBi. Since the proposed antenna covers the C and X bands, it can be used for RADAR applications. The effect of changing the antenna height is also analysed. Design and simulation are carried out using the IE3D simulation software.
Keywords: Microstrip patch antenna, FR-4 substrate, Giuseppe Peano fractal geometry, coaxial feed line, IE3D software.
Abstract
Efficient Method for Selecting Cluster Head in TRCA Clustering for MANET
Jitendra Singh Yadav, A P Singh
DOI: 10.17148/IJARCCE.2015.411123
Abstract: Communication in a MANET without any fixed infrastructure has drawn much research attention. Infrastructure-based cellular architectures set up base stations to support node mobility, so mapping the concept of base stations onto a MANET must meet challenges such as limited battery power, available bandwidth, and scalability. An exhaustive simulation-based survey has been conducted to study the strengths and weaknesses of existing algorithms, which motivated the design of energy-efficient clustering in MANETs. The proposed algorithm is designed to help nodes probe their immediate neighbours: each node broadcasts its own information so that it is received by the nodes lying within its transmission range. The algorithm is validated through simulation using Coloured Petri Nets (CPN) before implementation. We propose an efficient method for cluster-head selection in the Topology Robust Clustering Algorithm (TRCA) that uses node mobility and available battery power to calculate node weights.
Keywords: Mobile Ad Hoc Network (MANET), Cluster head, Non Volunteer head, clustering in MANET.
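The weight-based election the abstract describes can be sketched as follows. The linear weighting of battery power and mobility, and the coefficient values, are assumptions for illustration; the paper's exact weight formula may differ.

```python
# Minimal sketch of weight-based cluster-head selection as described
# for TRCA: each node's weight combines remaining battery power and
# mobility, and the neighbour with the best weight is elected head.
# The linear form and the coefficients 0.6/0.4 are illustrative
# assumptions, not the paper's exact formula.

def node_weight(battery, mobility, w_batt=0.6, w_mob=0.4):
    """Higher remaining battery and lower mobility give a higher weight."""
    return w_batt * battery - w_mob * mobility

def elect_cluster_head(neighbours):
    """neighbours: node_id -> (battery_fraction, mobility). Returns head id."""
    return max(neighbours, key=lambda n: node_weight(*neighbours[n]))

head = elect_cluster_head({"A": (0.9, 0.2),
                           "B": (0.5, 0.1),
                           "C": (0.8, 0.9)})   # → "A"
```

Here node A wins because its high residual battery outweighs its modest mobility; a nearly stationary but battery-poor node (B) or a well-charged but highly mobile node (C) scores lower, matching the stability and energy goals stated in the abstract.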
