Manuscripts

Recent Papers

Novel Approach for Image Forgery Detection Technique based on Colour Illumination using Machine Learning Approach

With the advancement of high-resolution digital cameras and photo-editing software featuring new and advanced capabilities, the chances of image forgery have increased. Images can now be altered and manipulated easily, so image trustworthiness is more in demand than ever. Images used as courtroom evidence, images in newspapers and magazines, and digital images used by doctors are a few cases that demand images free of manipulation. Forged images in which portions are copied and moved within the same image to “cover up” something are called copy-move forgeries. In previous years, authors have used methods such as Principal Component Analysis (PCA), Discrete Wavelet Transform (DWT), and Singular Value Decomposition (SVD), all of which are time-consuming. Many of these algorithms have also failed to detect forged images, because a single feature-extraction algorithm is not capable of capturing the distinctive features of an image. To overcome the limitations of the existing algorithms, we use a meta-fusion of HOG and SASI features and, to overcome the limitations of the SVM classifier, we use logistic regression, which is able to classify forged images more precisely.
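As a rough illustration of the classification stage only, the sketch below extracts HOG descriptors and trains a logistic-regression classifier. The SASI descriptor and the paper's exact meta-fusion scheme are not reproduced, and `load_dataset` is a hypothetical helper assumed to return equally sized grayscale images with 0/1 forged labels.

```python
# Sketch, not the paper's pipeline: HOG features + logistic regression
# for copy-move forgery classification. SASI and meta-fusion omitted.
import numpy as np
from skimage.feature import hog
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def extract_hog(image):
    # 9-bin HOG over 8x8 cells; images must share one size so every
    # descriptor has the same length.
    return hog(image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

images, labels = load_dataset()          # hypothetical helper (see above)
X = np.array([extract_hog(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2)

clf = LogisticRegression(max_iter=1000)  # stands in for the SVM it replaces
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```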

Published by: K. Sharath Chandra Reddy, Tarun Dalal

Author: K. Sharath Chandra Reddy

Paper ID: V2I4-1157

Paper Status: published

Published: July 15, 2016

Full Details

Novel Approach for Heart Disease using Data Mining Techniques

Data mining is the process of analyzing large sets of data and extracting meaning from them. It helps in predicting future trends and patterns, supporting business decision-making. Various algorithms are available for clustering data; the existing work on heart-disease prediction used K-means clustering, the C4.5 algorithm, and MAFIA (the Maximal Frequent Itemset Algorithm), achieving an accuracy of 89%. Since there is considerable scope for improvement, in this paper we implement other algorithms for clustering and classifying the data, with the aim of achieving accuracy higher than the present algorithms. Several parameters have been proposed for heart-disease prediction systems, but there is always a need for better parameters or algorithms to improve their performance.
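For orientation, a minimal classification baseline in this vein might look like the sketch below. It uses scikit-learn's decision tree (CART, a close relative of the C4.5 algorithm mentioned above, not C4.5 itself), and a `heart.csv` file with a binary `target` column is an assumed input, not a dataset from the paper.

```python
# Minimal sketch, assuming a tabular heart-disease dataset `heart.csv`
# with feature columns plus a binary `target` column.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("heart.csv")                  # assumed input file
X, y = df.drop(columns="target"), df["target"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

tree = DecisionTreeClassifier(max_depth=5)     # depth cap to curb overfitting
tree.fit(X_train, y_train)
print("accuracy:", tree.score(X_test, y_test)) # compare to the 89% baseline
```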

Published by: Era Singh Kajal, Ms. Nishika

Author: Era Singh Kajal

Paper ID: V2I4-1156

Paper Status: published

Published: July 15, 2016

Full Details

A Review on ACO based Scheduling Algorithm in Cloud Computing

Task scheduling plays a key role in cloud computing systems. Tasks cannot be scheduled on the basis of a single criterion, but under many rules and regulations that can be termed an agreement between the users and the providers of the cloud. This agreement is nothing but the quality of service that the user wants from the providers. Providing good quality of service to the users according to the agreement is a decisive task for the providers, as a large number of tasks are running at the provider's side at the same time. In this paper we perform a comparative study of the different algorithms, assessing their suitability, feasibility, and adaptability in the context of the cloud scenario.
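To make the ACO idea behind the surveyed algorithms concrete, here is a toy sketch of ACO-style task-to-VM scheduling that minimizes makespan. The task lengths, VM speeds, and pheromone constants are illustrative assumptions, not values from any surveyed paper, and the heuristic-visibility term of full ACO is omitted for brevity.

```python
# Toy ACO-style scheduler: ants assign tasks to VMs guided by pheromone,
# better (lower-makespan) assignments deposit more pheromone.
import random

tasks = [4, 7, 2, 9, 5, 3]        # assumed task lengths
vm_speed = [1.0, 2.0, 1.5]        # assumed VM speeds
n_ants, n_iters, rho, Q = 10, 50, 0.1, 1.0

# pher[t][v]: desirability of placing task t on VM v
pher = [[1.0] * len(vm_speed) for _ in tasks]

def makespan(assign):
    load = [0.0] * len(vm_speed)
    for t, v in enumerate(assign):
        load[v] += tasks[t] / vm_speed[v]
    return max(load)

best, best_cost = None, float("inf")
for _ in range(n_iters):
    solutions = []
    for _ in range(n_ants):
        # each ant picks a VM per task in proportion to pheromone
        assign = [random.choices(range(len(vm_speed)), weights=pher[t])[0]
                  for t in range(len(tasks))]
        cost = makespan(assign)
        solutions.append((assign, cost))
        if cost < best_cost:
            best, best_cost = assign, cost
    # evaporation, then deposit inversely proportional to cost
    for t in range(len(tasks)):
        for v in range(len(vm_speed)):
            pher[t][v] *= (1 - rho)
    for assign, cost in solutions:
        for t, v in enumerate(assign):
            pher[t][v] += Q / cost

print("best assignment:", best, "makespan:", round(best_cost, 2))
```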

Published by: Meena Patel, Rahul Kadiyan

Author: Meena Patel

Paper ID: V2I4-1155

Paper Status: published

Published: July 15, 2016

Full Details

Robust data compression model for linear signal data in the Wireless Sensor Networks

Data compression is one of the popular power-efficiency methods for improving the lifetime of sensor networks. Methods such as wavelet-based signal decomposition and entropy encoding or arithmetic encoding are used for compression in sensor networks to prolong the lifetime of wireless sensor networks. The proposed method combines wavelet decomposition of the signal with the Huffman entropy-encoding method to compress the sensed data on the sensor nodes. The compressed (reduced-size) data consumes less energy, as the resulting small packets cost less to transmit than non-compressed packets, which directly improves the lifetime. The proposed model recorded a compression ratio of more than 70%, which is far higher than the existing models. It was also evaluated for signal quality after compression and for elapsed time, and was found efficient on both of the latter parameters. Hence, the effectiveness of the proposed model is borne out by the experimental results.
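A minimal sketch of the wavelet-plus-Huffman idea follows: decompose the signal, quantize the coefficients, and Huffman-code the quantized symbols. The synthetic sine signal, the `db4` wavelet, and the quantization step are illustrative assumptions, and PyWavelets (`pywt`) is assumed to be installed; this is not the paper's implementation.

```python
# Sketch: wavelet decomposition + coefficient quantization + Huffman coding.
import heapq
from collections import Counter
import numpy as np
import pywt

signal = np.sin(np.linspace(0, 8 * np.pi, 256))       # stand-in sensor signal
coeffs = pywt.wavedec(signal, "db4", level=3)         # wavelet decomposition
symbols = np.round(np.concatenate(coeffs) / 0.05).astype(int)  # quantization

def huffman_code(syms):
    # Build a code book from symbol frequencies (classic heap construction).
    heap = [[freq, [sym, ""]] for sym, freq in Counter(syms).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

book = huffman_code(symbols.tolist())
bits = sum(len(book[s]) for s in symbols.tolist())    # coded size in bits
raw_bits = symbols.size * 64                          # original 64-bit floats
print(f"compression ratio: {1 - bits / raw_bits:.0%}")
```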

Published by: Sukhcharn Sandhu

Author: Sukhcharn Sandhu

Paper ID: V2I4-1154

Paper Status: published

Published: July 15, 2016

Full Details

Consumer Trend Prediction using Efficient Item-Set Mining of Big Data

Consumer trends are the habits and behaviors currently prevalent among customers of goods or services. They track more than simply what people buy and how much they spend; data collected on trends can also include how customers use a product and how they talk about a brand within their social network. Understanding consumer trends and the drivers of behavior provides an overview of the market, analyzing market data, demographic consumption patterns within the group, and the key consumer trends driving consumption. The report highlights innovative new product development that effectively targets the most relevant consumer demand states, and offers key recommendations for capitalizing on evolving consumer landscapes.
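Since the title refers to efficient item-set mining, a tiny Apriori-style frequent-itemset sketch over purchase baskets is shown below; the transactions and the support threshold are invented for illustration and are not data from the paper.

```python
# Apriori-style frequent-itemset mining over toy purchase baskets.
from itertools import combinations

transactions = [{"milk", "bread"}, {"milk", "diapers", "beer"},
                {"bread", "diapers"}, {"milk", "bread", "diapers"}]
min_support = 2                      # assumed absolute support threshold

items = {i for t in transactions for i in t}
frequent, k = [], 1
candidates = [frozenset([i]) for i in items]
while candidates:
    counts = {c: sum(c <= t for t in transactions) for c in candidates}
    level = [c for c, n in counts.items() if n >= min_support]
    frequent += level
    k += 1
    # join step: merge frequent (k-1)-itemsets into k-item candidates
    candidates = list({a | b for a, b in combinations(level, 2)
                       if len(a | b) == k})

for s in frequent:
    print(set(s))
```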

Published by: Yukti Chawla, Parikshit Singla

Author: Yukti Chawla

Paper ID: V2I4-1153

Paper Status: published

Published: July 14, 2016

Full Details

A Novel Approach for Detection of Traffic Congestion in NS2

Traffic congestion is caused by many factors; some, such as road construction, rush hour, or bottlenecks, are predictable. Drivers unaware of congestion ahead eventually join it and increase its severity, and the more severe the congestion is, the more time it takes to clear. In order to provide drivers with useful information about the traffic ahead, a system must identify the congestion, its location, severity, and boundaries, and relay this information to drivers within the congestion and those heading towards it. To form a picture of the congestion, vehicles need to pool their information using vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2I) communication. Once a clear picture of the congestion has formed, this information is relayed to vehicles away from the congestion, so that vehicles heading towards it can take evasive action and avoid further escalating its severity. Initially, a source vehicle initiates a number of queries, which are routed by the VANET along different paths toward its destination. During query forwarding, the real-time road-traffic information in each road segment is aggregated from multiple participating vehicles and returned to the source after the query reaches the destination. This information enables the source to calculate the shortest-time path. By exchanging data between vehicles about route choices, congestion, and traffic alerts, a vehicle can make a decision on the best course of action.
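Once the per-segment travel times have been aggregated from the VANET queries, the source can compute the shortest-time path with a standard algorithm such as Dijkstra's, as in the sketch below; the road graph and travel times are made up for illustration and the aggregation step itself is not shown.

```python
# Dijkstra over road segments weighted by aggregated travel times (seconds).
import heapq

# graph[u] = [(v, travel_time_seconds), ...]; invented example network
graph = {
    "A": [("B", 60), ("C", 30)],
    "B": [("D", 40)],
    "C": [("B", 10), ("D", 120)],    # C->D reported as congested
    "D": [],
}

def shortest_time_path(src, dst):
    dist, prev = {src: 0}, {}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                 # stale heap entry
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    path, node = [], dst
    while node != src:               # walk predecessors back to the source
        path.append(node)
        node = prev[node]
    return [src] + path[::-1], dist[dst]

print(shortest_time_path("A", "D"))  # -> (['A', 'C', 'B', 'D'], 80)
```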

Published by: Arun Sharma, Kapil Kapoor, Bodh Raj, Divya Jyoti

Author: Arun Sharma

Paper ID: V2I4-1152

Paper Status: published

Published: July 13, 2016

Full Details