Manuscripts

Recent Papers

Automated Checking of PCB Circuits using LabVIEW Vision Toolkit

LabVIEW provides a powerful machine-vision toolkit, and this paper applies it to a genuinely useful industrial problem. Every PCB fabrication and electronics assembly organization must, after completing assembly, check each PCB to confirm that all components are present; if any component is missing, the board is sent back for correction. Today this inspection is performed manually across the PCB industry. Since production volumes are huge (on the order of hundreds of thousands of boards per month), inspection consumes enormous labour and time: checking a single PCB typically takes 5 to 20 minutes depending on its complexity, so manually checking 100,000 boards requires roughly 0.5 to 2 million minutes. This is a serious problem for the electrical and electronics industries and one of the biggest challenges in PCB manufacturing today.
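
To illustrate the idea, here is a minimal sketch of a vision-based component-presence check using OpenCV in Python; the paper itself uses the LabVIEW Vision Toolkit, and the file names and component list below are hypothetical.

    # Sketch only: template matching to verify component presence on a PCB.
    # The paper uses the LabVIEW Vision Toolkit; OpenCV stands in here.
    import cv2

    def component_present(board_gray, template_gray, threshold=0.8):
        """Return True if the component template is found on the board image."""
        result = cv2.matchTemplate(board_gray, template_gray, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(result)
        return max_val >= threshold

    # Hypothetical golden templates for each component footprint.
    board = cv2.imread("board_under_test.png", cv2.IMREAD_GRAYSCALE)
    for name in ["resistor_r1", "capacitor_c3", "ic_u2"]:
        template = cv2.imread(f"templates/{name}.png", cv2.IMREAD_GRAYSCALE)
        status = "OK" if component_present(board, template) else "MISSING"
        print(name, status)

An automated check of this kind runs in seconds per board rather than minutes, which is what makes machine vision attractive at these volumes.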

Published by: Manoj Kumar, Mrs. Shimi S.L

Author: Manoj Kumar

Paper ID: V2I4-1160

Paper Status: published

Published: July 18, 2016


Indian Coin Detection by ANN and SVM

Most available systems recognize coins from physical properties such as radius and thickness, so they can be fooled easily. To remove this weakness, the drawings and numerals printed on the coin can be used as the patterns on which a support vector machine is trained, yielding more accurate recognition results. Previous techniques placed little emphasis on the classifier function, which is why classification accuracy did not improve; this work addresses the problem through better classifier techniques.
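
As a rough illustration of training an SVM on printed patterns rather than physical size, here is a minimal sketch using HOG features and scikit-learn; the random images and denomination labels are placeholders for a real scanned-coin dataset, and the paper's actual features and ANN stage may differ.

    # Sketch only: SVM trained on HOG descriptors of the coin face, which
    # capture engraved drawings and numerals rather than radius or
    # thickness. Data here is a random stand-in, not real coins.
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    images = rng.random((40, 64, 64))          # placeholder coin images
    labels = rng.integers(0, 4, size=40)       # e.g. 1/2/5/10 rupee classes

    def coin_features(img):
        return hog(img, orientations=9, pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2))

    X = np.array([coin_features(img) for img in images])
    clf = SVC(kernel="rbf")
    clf.fit(X[:30], labels[:30])
    print("held-out accuracy:", clf.score(X[30:], labels[30:]))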

Published by: Sneha Kalra, Kapil Dewan

Author: Sneha Kalra

Paper ID: V2I4-1158

Paper Status: published

Published: July 16, 2016


Novel Approach for Image Forgery Detection Technique based on Colour Illumination using Machine Learning Approach

With the advancement of high-resolution digital cameras and photo-editing software offering ever more powerful features, the risk of image forgery has increased: images can now be altered and manipulated easily, so image trustworthiness is in greater demand than ever. Images submitted as courtroom evidence, images in newspapers and magazines, and digital images used by doctors are a few cases that demand unmanipulated images. Forgeries in which a portion of an image is copied and moved within the same image to "cover up" something are called copy-move forgeries. Earlier methods such as Principal Component Analysis (PCA), Discrete Wavelet Transform (DWT), and Singular Value Decomposition (SVD) are time-consuming, and many past algorithms failed to detect forged images because a single feature-extraction algorithm cannot capture all the distinguishing features of an image. To overcome these limitations, we use a meta-fusion of HOG and SASI features, and to overcome the limitations of the SVM classifier we use logistic regression, which should classify forged images more precisely.
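
A minimal sketch of the meta-fusion idea follows: two complementary descriptors are concatenated and classified with logistic regression. HOG is available in scikit-image; SASI has no common off-the-shelf implementation, so a local-binary-pattern histogram stands in for the second texture descriptor, and the random patches and labels are placeholders for a real authentic/forged image set.

    # Sketch only: feature-level fusion of two descriptors, classified by
    # logistic regression. LBP substitutes for SASI, which has no common
    # library implementation; data is a random stand-in.
    import numpy as np
    from skimage.feature import hog, local_binary_pattern
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    images = (rng.random((40, 64, 64)) * 255).astype(np.uint8)
    labels = rng.integers(0, 2, size=40)        # 0 = authentic, 1 = forged

    def fused_features(img):
        h = hog(img, orientations=9, pixels_per_cell=(8, 8),
                cells_per_block=(2, 2))
        lbp = local_binary_pattern(img, P=8, R=1, method="uniform")
        t, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
        return np.concatenate([h, t])           # meta-fusion by concatenation

    X = np.array([fused_features(img) for img in images])
    clf = LogisticRegression(max_iter=1000).fit(X[:30], labels[:30])
    print("held-out accuracy:", clf.score(X[30:], labels[30:]))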

Published by: K. Sharath Chandra Reddy, Tarun Dalal

Author: K. Sharath Chandra Reddy

Paper ID: V2I4-1157

Paper Status: published

Published: July 15, 2016


Novel Approach for Heart Disease using Data Mining Techniques

Data mining is the process of analyzing large data sets and extracting meaning from them; it helps predict future trends and patterns, supporting decision making in business. Various clustering algorithms are available for the data at hand. Existing work used K-means clustering, the C4.5 algorithm, and MAFIA (the Maximal Frequent Itemset Algorithm) for a heart-disease prediction system and achieved an accuracy of 89%. Since there is considerable scope for improvement, in this paper we implement other algorithms for clustering and classifying the data and achieve higher accuracy than the existing approach. Several parameters have been proposed for heart-disease prediction systems, but there is always a need for better parameters or algorithms to improve their performance.
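
To make the comparison concrete, here is a minimal sketch of evaluating several candidate classifiers on a heart-disease table with scikit-learn; the file name heart.csv and its target column are assumptions (in the style of the UCI Cleveland data), and DecisionTreeClassifier stands in for C4.5.

    # Sketch only: cross-validated accuracy of a few candidate algorithms
    # on an assumed heart-disease CSV with a binary "target" column.
    import pandas as pd
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier   # C4.5-like stand-in
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.naive_bayes import GaussianNB

    df = pd.read_csv("heart.csv")                     # assumed dataset
    X, y = df.drop(columns="target"), df["target"]

    for name, clf in [("decision tree", DecisionTreeClassifier()),
                      ("random forest", RandomForestClassifier()),
                      ("naive Bayes", GaussianNB())]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: {acc:.2%} mean accuracy")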

Published by: Era Singh Kajal, Ms. Nishika

Author: Era Singh Kajal

Paper ID: V2I4-1156

Paper Status: published

Published: July 15, 2016


A Review on ACO based Scheduling Algorithm in Cloud Computing

Task scheduling plays a key role in cloud computing systems. Tasks cannot be scheduled on the basis of a single criterion; scheduling is governed by many rules and constraints that together form an agreement between cloud users and providers. This agreement is simply the quality of service the user expects from the provider. Delivering good quality of service according to the agreement is a demanding task for providers, because a large number of tasks run on the provider's side at the same time. In this paper we present a comparative study of different scheduling algorithms with respect to their suitability, feasibility, and adaptability in the cloud scenario.
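
To give a feel for how ACO applies to this problem, below is a minimal sketch of ant-colony task-to-VM assignment that minimises makespan; the task lengths, VM speeds, and ACO constants are illustrative assumptions, not values from any surveyed paper.

    # Sketch only: ants assign tasks to VMs; pheromone is reinforced on
    # the assignment with the shortest makespan found so far.
    import random

    tasks = [40, 25, 60, 15, 50]      # task lengths (e.g. million instructions)
    vm_speed = [10, 20]               # VM processing capacities
    ALPHA, RHO, ANTS, ITERS = 1.0, 0.1, 10, 50

    pher = [[1.0] * len(vm_speed) for _ in tasks]    # pheromone[task][vm]
    best, best_makespan = None, float("inf")

    for _ in range(ITERS):
        for _ in range(ANTS):
            # Each ant builds a complete task-to-VM assignment, choosing
            # VMs with probability proportional to pheromone strength.
            assign = [random.choices(range(len(vm_speed)),
                                     weights=[p ** ALPHA for p in pher[t]])[0]
                      for t in range(len(tasks))]
            load = [0.0] * len(vm_speed)
            for t, v in enumerate(assign):
                load[v] += tasks[t] / vm_speed[v]
            if max(load) < best_makespan:
                best, best_makespan = assign, max(load)
        # Evaporate all trails, then deposit on the best-so-far solution.
        pher = [[p * (1 - RHO) for p in row] for row in pher]
        for t, v in enumerate(best):
            pher[t][v] += 1.0 / best_makespan

    print("best assignment:", best, "makespan:", best_makespan)

Real cloud schedulers would add a heuristic desirability term (such as expected completion time) alongside pheromone and fold QoS criteria such as cost and deadline into the fitness.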

Published by: Meena Patel, Rahul Kadiyan

Author: Meena Patel

Paper ID: V2I4-1155

Paper Status: published

Published: July 15, 2016


Robust data compression model for linear signal data in the Wireless Sensor Networks

Data compression is a popular power-efficiency method for extending the lifetime of sensor networks. Wavelet-based signal decomposition combined with entropy encoding or arithmetic encoding is widely used to compress sensed data and thereby prolong the lifetime of wireless sensor networks. The proposed method combines wavelet decomposition of the signal with Huffman entropy encoding to compress the sensed data on the sensor nodes. Transmitting the compressed (smaller) packets consumes less energy than transmitting uncompressed packets, which directly improves lifetime. The proposed model achieves a compression ratio of more than 70%, well above existing models, and has also been evaluated for post-compression signal quality and elapsed time, on both of which it was found efficient. The experimental results thus demonstrate the effectiveness of the proposed model.
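
A minimal sketch of such a pipeline appears below, using PyWavelets for the decomposition and a standard heap-based Huffman code; the wavelet choice, quantisation step, and synthetic signal are assumptions, and the paper's exact encoder may differ.

    # Sketch only: wavelet decomposition -> uniform quantisation ->
    # Huffman code lengths, compared against raw 32-bit samples.
    import heapq
    from collections import Counter
    import numpy as np
    import pywt

    signal = np.sin(np.linspace(0, 8 * np.pi, 256))   # stand-in sensor data
    coeffs = pywt.wavedec(signal, "db4", level=3)     # wavelet decomposition
    symbols = np.round(np.concatenate(coeffs) / 0.05).astype(int)

    def huffman_lengths(values):
        # Standard Huffman construction; returns bits per symbol value.
        heap = [[w, [v, 0]] for v, w in Counter(values).items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            lo, hi = heapq.heappop(heap), heapq.heappop(heap)
            for pair in lo[1:] + hi[1:]:
                pair[1] += 1                          # one bit deeper in the tree
            heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
        return dict(heap[0][1:])

    lengths = huffman_lengths(symbols.tolist())
    bits = sum(lengths[s] for s in symbols.tolist())
    print(f"compression ratio: {1 - bits / (symbols.size * 32):.1%}")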

Published by: Sukhcharn Sandhu

Author: Sukhcharn Sandhu

Paper ID: V2I4-1154

Paper Status: published

Published: July 15, 2016
