Manuscripts

Recent Papers

Research Paper

E-learning: Distributed processing of large datasets with parallel algorithm

In today’s lifestyle, almost every task is carried out with the help of the internet. Online systems and internet facilities are becoming more widespread and an increasingly common part of everyday life. Learners now expect education to be available at any place and at any time, and this expectation is addressed by e-learning systems; several e-learning portals, such as javaTpoint, already exist. The aims of the proposed e-learning platform were:
• Course materials must be kept secure.
• Learners can register for and enter courses.
• Learning should be easy, fluent, and learner-friendly.
• Communication between the learner and the e-learning platform should be effective.
The learner interacts with the e-learning application through a web browser. After registering for a particular course, the learner is recommended both textual and video study material, so that notes can be consulted or the material studied according to preference, easily and flexibly, at any time and from anywhere. We also present a technical analysis of seven studies on the application of data mining in e-learning. The results of our analysis support the use of data-mining techniques for building a new generation of intelligent e-learning systems for different tasks and domains. E-learning course offerings are now plentiful, and many new e-learning platforms and systems have been developed and implemented with varying degrees of success. These systems generate an ever-increasing amount of data, much of which has the potential to become new knowledge that improves every aspect of e-learning; data-mining processes should enable the extraction of this knowledge. An e-learning web interface built on this knowledge can help to design courses more effectively, detect anomalies, inspire and guide further research, and help learners use resources more efficiently.
The long-term objective is to create a fully featured learning system for the learning environment.

Published by: Rabia Ashrafi, Sharmila Sankar

Author: Rabia Ashrafi

Paper ID: V4I3-1271

Paper Status: published

Published: May 14, 2018

Full Details
Research Paper

Construction of private methodical query services in the cloud with RASP data perturbation

As digital technology evolves rapidly and becomes an essential tool for businesses, the concept of the cloud has emerged. Clouds are commonly described as private or public. The proposed approach targets the public cloud domain, which consists of numerous nodes with distributed computing resources spread across many geographic locations; the approach divides the public cloud domain into several cloud partitions. Distributed computing in the cloud simplifies load balancing and allows database indexes to be built over an encrypted table. Data in the cloud is commonly stored under the CPEL criteria: confidentiality, query privacy, efficient query processing, and low cost. However, data owners want to submit their queries only once the privacy assurances of the cloud are realized. To this end, researchers have introduced techniques such as RASP (Random Space Perturbation) and the k-NN (k-Nearest Neighbor) algorithm. The main problem with the RASP technique is that the generated encryption key is very large, so its implementation incurs time and space overhead. In this work, the existing RASP data perturbation technique, combined with the k-NN algorithm, is exploited to provide privacy in the cloud; issues such as categorical data and leaked queries in the model are identified and addressed, with no change to the design of the k-NN-R algorithm.
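The core idea behind random space perturbation can be illustrated with a toy sketch (this is a simplification for intuition, not the published RASP scheme, and all names and dimensions here are hypothetical): each record is lifted with fresh noise and a constant component, then multiplied by a secret invertible matrix before upload. The cloud stores and indexes only the perturbed vectors; the data owner, who holds the matrix, can invert the transform on returned candidates.

```python
import numpy as np

# Toy sketch of the random-space-perturbation idea (not the published RASP
# scheme): record x is lifted to [x; r; 1] with fresh noise r, then
# multiplied by a secret invertible matrix A before being sent to the cloud.

rng = np.random.default_rng(42)
d = 2                                    # dimensionality of a record
A = rng.standard_normal((d + 2, d + 2))  # secret key; invertible w.h.p.

def perturb(x, rng):
    v = np.concatenate([x, [rng.random(), 1.0]])  # append noise + constant
    return A @ v                                  # secret linear transform

def recover(y):
    return np.linalg.solve(A, y)[:d]     # owner-side inversion, drops padding

x = np.array([0.3, 0.7])
y = perturb(x, rng)
print(np.allclose(recover(y), x))        # True: only the key holder recovers x
```

The per-record noise means two uploads of the same plaintext produce different ciphertext vectors, which is what frustrates simple frequency analysis; answering k-NN queries over such data is the harder part that the k-NN-R algorithm addresses.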

Published by: Shubhashree Sahoo, Gogu Swathi

Author: Shubhashree Sahoo

Paper ID: V4I3-1386

Paper Status: published

Published: May 14, 2018

Full Details
Research Paper

Self-curing concrete

Since water scarcity is mounting day by day, vital research is needed on carrying out construction with less water. Traditionally, water has been mandatory for curing in construction; curing plays a chief role in developing the pore structure and microstructure that improve durability and performance. The aim of this thesis is to study the strength and durability of concrete using water-soluble polyethylene glycol as a self-curing agent, with lightweight aggregate in the form of granite. The function of a self-curing agent is to reduce water evaporation from the concrete, thereby increasing its water-retention capacity compared with conventionally cured concrete. The use of self-curing admixtures is very important because saving water is a daily necessity (each cubic metre of concrete requires about 3 m³ of water during construction, most of which is used for curing). In this study, the compressive strength and split tensile strength of concrete containing the self-curing agent are investigated and compared with those of conventionally cured concrete. The experimental work shows that concrete cast with polyethylene glycol as a self-curing agent is stronger than concrete cured by sprinkling as well as by immersion.

Published by: Rahul Dev, R. Navaneethan

Author: Rahul Dev

Paper ID: V4I3-1392

Paper Status: published

Published: May 14, 2018

Full Details
Research Paper

Selective feature processing with k-Nearest Neighbor classification to predict credit card frauds

Predictive analytics is used in many applications across the globe, ranging from financial risk to avalanche studies. In this paper, a new approach is designed to predict credit card fraud. The approach utilizes an imbalanced-feature correction methodology, which reduces the bias of the features towards one class. The proposed model filters the credit card data by analyzing multiple factors to predict fraudulent transactions. Missing values are first replaced with the column mean, and then maximum–minimum scaling is used to map each quantitative variable onto a 0–1 scale. SVM- and k-NN-based classification methods are then used to predict fraud patterns. The experimental results establish the SVM-based model as the most efficient algorithm for fraud-pattern prediction, although its mean accuracy of 99.94% is slightly lower than k-NN’s 99.95%. k-NN also outperformed the SVM on recall (approx. 91% vs approx. 84.50%) and F1 measure (approx. 84% vs approx. 82.50%).
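The preprocessing pipeline described above can be sketched in a few lines (a minimal illustration with hypothetical toy data, not the paper’s implementation): impute missing values with the column mean, min-max scale each feature to [0, 1], then classify with a plain Euclidean k-NN vote.

```python
import numpy as np

# Toy feature matrix with missing values (NaN); columns are quantitative
# features such as transaction amount and frequency (hypothetical data).
X = np.array([[100.0, 2.0],
              [np.nan, 4.0],
              [300.0, np.nan],
              [200.0, 8.0]])
y_labels = np.array([0, 0, 1, 1])        # 0 = legitimate, 1 = fraudulent

# Step 1: mean imputation per column, ignoring NaNs.
col_mean = np.nanmean(X, axis=0)
X_imp = np.where(np.isnan(X), col_mean, X)

# Step 2: maximum-minimum scaling onto the 0-1 range.
X_min, X_max = X_imp.min(axis=0), X_imp.max(axis=0)
X_scaled = (X_imp - X_min) / (X_max - X_min)

# Step 3: simple Euclidean k-NN majority vote on the scaled features.
def knn_predict(X_train, y_train, x, k=3):
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    return np.bincount(nearest).argmax()

pred = knn_predict(X_scaled, y_labels, np.array([0.1, 0.1]), k=3)
print(pred)  # 0: the query point sits near the legitimate transactions
```

Scaling matters here because k-NN distances would otherwise be dominated by the feature with the largest raw range (the amounts column in this toy example).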

Published by: Simranjeet Kaur, Sikander Singh Cheema

Author: Simranjeet Kaur

Paper ID: V4I3-1399

Paper Status: published

Published: May 14, 2018

Full Details
Research Paper

Fusing multimodal images by simple average, simple minimum, simple maximum, PCA, and DWT methods of image fusion – A review

Image fusion is a technique that integrates complementary details from multiple input images, such that the fused image carries more information and is more suitable for human visual perception. This paper presents a review of several image fusion techniques (simple average, simple minimum, simple maximum, PCA, and DWT). A comparison of these techniques identifies the more suitable approach for future research.
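The three simplest rules in the review operate pixel-wise and can be sketched directly (a toy illustration on hypothetical 2×2 arrays; PCA- and DWT-based fusion instead combine projected or wavelet coefficients rather than raw pixels):

```python
import numpy as np

# Two co-registered input "images" (hypothetical grayscale values).
img_a = np.array([[10.0, 200.0], [ 50.0, 120.0]])
img_b = np.array([[30.0, 100.0], [250.0,  80.0]])

fused_avg = (img_a + img_b) / 2.0     # simple average rule
fused_min = np.minimum(img_a, img_b)  # simple minimum rule (darker pixel wins)
fused_max = np.maximum(img_a, img_b)  # simple maximum rule (brighter pixel wins)

print(fused_avg[0, 0], fused_min[1, 0], fused_max[1, 0])  # 20.0 50.0 250.0
```

The choice of rule encodes an assumption about where the useful detail lives: averaging suppresses noise but blurs contrast, while the minimum and maximum rules preserve the darkest or brightest structures from either source.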

Published by: R. S. Arun Kumar, Dr. K. Suganya Devi

Author: R. S. Arun Kumar

Paper ID: V4I3-1341

Paper Status: published

Published: May 12, 2018

Full Details
Case Study

Creating a value stream map to identify areas of improvement and improve the USC mailing process

USC undergraduate housing receives personal mail, official mail, perishable packages, etc., on a regular basis, so it is a very important task for the housing department to deliver mail to the right person without any damage. Mail and packages received are numbered and fed into the system by scanning the tracking number and assigning the receiver’s name to it. The process appears fairly simple and effective, ensuring that no package is lost and that none stays in the mailroom for long. However, when the receiving and sorting process is analyzed, there is considerable scope for improvement and better inventory management. A value stream map created to identify areas of improvement revealed many non-value-adding steps and a great deal of waiting time, leading to an increased cycle time. This paper discusses improvements that can be implemented in the process steps to increase the efficiency of the delivered service, by analyzing the inputs, outputs, inventory, and value proposition involved.

Published by: Mahalakshmi Ramasubbu

Author: Mahalakshmi Ramasubbu

Paper ID: V4I3-1375

Paper Status: published

Published: May 12, 2018

Full Details