3D CAD Models and Their Feature Similarity
Knowing the geometric pose of product objects on a manufacturing line before robot manipulation is required, and is less time-consuming than overall shape measurement. To perform this, shape representation and object matching information is required. Objects are compared via descriptors that are conceptually subtracted from each other to form a scalar metric; the smaller the metric value, the closer the objects are considered to be. Rotating an object from its static pose in some direction changes the scalar metric value of the boundary information after feature extraction of the related object. This paper proposes an indexing technique for the retrieval of 3D geometrical models based on the similarity between boundary shapes, in order to measure 3D CAD object pose using object shape feature matching for a Computer Aided Testing (CAT) system on a production line. Experimental results show the effectiveness of the proposed method.
Performance Evaluation of an Online Text-Based Strategy Game
Text-based games are supposed to be low-resource-consumption applications that deliver good performance compared to graphics-intensive games. However, some online text-based games today do not offer performance acceptable to users. Therefore, an online text-based game called Star_Quest has been developed in order to analyze its behavior under different performance measurements. Performance metrics such as throughput, scalability, response time and page loading time are captured to quantify the performance of the game. The techniques used in performing the load testing are also disclosed to exhibit the viability of our work. A comparative assessment between the results obtained and the accepted performance levels is conducted to determine the performance level of the game. The study reveals that the developed game managed to meet all the performance objectives.
A Context-Aware Supplier Selection Model
Selection of the best possible set of suppliers has a
significant impact on the overall profitability and success of any
business. For this reason, it is usually necessary to optimize all
business processes and to make use of cost-effective alternatives for
additional savings. This paper proposes a new efficient context-aware
supplier selection model that takes into account possible changes of
the environment while significantly reducing selection costs. The
proposed model is based on data clustering techniques while drawing on certain principles of online algorithms for an optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch over the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously determined best set of suppliers to obtain a new best set. Any re-computation of unchanged elements of the environment is thus avoided, and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it outperforms common static selection models in this field.
Sway Reduction on Gantry Crane System using Delayed Feedback Signal and PD-type Fuzzy Logic Controller: A Comparative Assessment
This paper presents the use of anti-sway angle control
approaches for a two-dimensional gantry crane with disturbance effects in the dynamic system. Delayed feedback signal (DFS) and
proportional-derivative (PD)-type fuzzy logic controller are the
techniques used in this investigation to actively control the sway
angle of the rope of the gantry crane system. A nonlinear overhead
gantry crane system is considered and the dynamic model of the
system is derived using the Euler-Lagrange formulation. A complete
analysis of simulation results for each technique is presented in time
domain and frequency domain respectively. Performances of both
controllers are examined in terms of sway angle suppression and
disturbances cancellation. Finally, a comparative assessment of the impact of each controller on the system performance is presented and discussed.
A Survey: Clustering Ensembles Techniques
The clustering ensembles combine multiple partitions
generated by different clustering algorithms into a single clustering
solution. Clustering ensembles have emerged as a prominent method
for improving robustness, stability and accuracy of unsupervised
classification solutions. So far, many contributions have been made to finding a consensus clustering. One of the major problems in clustering ensembles is the consensus function. In this paper, we first introduce clustering ensembles, the representation of multiple partitions, their challenges, and a taxonomy of combination algorithms.
Second, we describe consensus functions in clustering ensembles, including hypergraph partitioning, the voting approach, mutual information, co-association-based functions and the finite mixture model, and then explain their advantages, disadvantages and computational complexity. Finally, we compare the characteristics of clustering ensemble algorithms, such as computational complexity, robustness, simplicity and accuracy, on different datasets from previous studies.
The New AIMD Congestion Control Algorithm
Congestion control is one of the fundamental issues in computer networks. Without proper congestion control mechanisms there is the possibility of inefficient utilization of resources, ultimately leading to network collapse. Hence congestion control is an effort to adapt the performance of a network to changes in the traffic load without adversely affecting users' perceived utilities. AIMD (Additive Increase Multiplicative Decrease) is the best algorithm among the set of linear algorithms because it achieves both good efficiency and good fairness. Our control model is based on the assumptions of the original AIMD algorithm; we show that both the efficiency and the fairness of AIMD can be improved. We call our approach New AIMD. We present experimental results with TCP that match the expectations of our theoretical analysis.
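The additive-increase/multiplicative-decrease rule that the abstract builds on can be sketched in a few lines; the constants (increase by one segment per round, halve on loss) are the textbook AIMD values, not parameters taken from the New AIMD proposal.

```python
def aimd_update(cwnd, loss, alpha=1.0, beta=0.5):
    """One AIMD step: additive increase on success, multiplicative decrease on loss."""
    if loss:
        return max(1.0, cwnd * beta)   # halve the window, never below 1 segment
    return cwnd + alpha                # probe for bandwidth linearly

def simulate(w1, w2, capacity, steps):
    """Two flows sharing one link: both back off whenever demand exceeds capacity."""
    for _ in range(steps):
        loss = (w1 + w2) > capacity
        w1, w2 = aimd_update(w1, loss), aimd_update(w2, loss)
    return w1, w2
```

Running two flows against a shared capacity shows the classic behavior: additive increases leave the difference between the windows unchanged, while each multiplicative decrease halves it, which is the fairness argument the abstract refers to.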
3D Spatial Interaction with the Wii Remote for Head-Mounted Display Virtual Reality
This research investigates the design of a low-cost 3D
spatial interaction approach using the Wii Remote for immersive
Head-Mounted Display (HMD) virtual reality. Current virtual reality
applications that incorporate the Wii Remote are either desktop
virtual reality applications or systems that use large screen displays.
However, the requirements for an HMD virtual reality system differ
from such systems. This is mainly because in HMD virtual reality,
the display screen does not remain at a fixed location. The user views
the virtual environment through display screens that are in front of
the user's eyes, and when the user moves his/her head, these screens move as well. This means that the display has to be updated in real-time based on where the user is currently looking. Normal usage of
the Wii Remote requires the controller to be pointed in a certain
direction, typically towards the display. This is too restrictive for
HMD virtual reality systems that ideally require the user to be able to
turn around in the virtual environment. Previous work proposed a
design to achieve this, however it suffered from a number of
drawbacks. The aim of this study is to look into a suitable method of
using the Wii Remote for 3D interaction in a space around the user
for HMD virtual reality. This paper presents an overview of the issues that had to be considered, the system design, as well as experimental results.
Support Vector Machines For Understanding Lane Color and Sidewalks
Understanding road features such as lanes, the color
of lanes, and sidewalks in a live video captured from a moving
vehicle is essential to build video-based navigation systems. In this
paper, we present a novel idea to understand the road features using
support vector machines. Various feature vectors, including the color components of road markings and the differences between two chosen areas of interest (AOIs), are fed into an SVM, which robustly decides the colors of lanes and sidewalks. Experimental results are
provided to show the robustness of the proposed idea.
Inefficiency of Data Storing in Physical Memory
Memory forensics is important in digital investigation. The forensics is based on the data stored in physical memory, which involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management and processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment was done using the Borland C compiler on Windows XP with 512 MB of physical memory.
Semi-Automatic Approach for Semantic Annotation
The third phase of the web, the semantic web, requires many web pages annotated with metadata. A crucial question is therefore where to acquire this metadata. In this paper we propose a semi-automatic approach to annotate the text of documents and web pages, employing a quite comprehensive knowledge base to categorize instances with respect to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools that works in the same way as ours. The approach, an annotation tool for the Semantic Web, is implemented in the .NET framework and uses WordNet as its knowledge base.
Combining Variable Ordering Heuristics for Improving Search Algorithms Performance
Variable ordering heuristics are used in constraint satisfaction algorithms. The characteristics of different variable ordering heuristics are complementary, so we have tried to combine the advantages of all heuristics to improve the performance of search algorithms for solving constraint satisfaction problems. This paper considers combinations based on products and quotients, and then a newer form of combination based on weighted sums of ratings from a set of base heuristics, some of which result in definite improvements in performance.
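As an illustration of the weighted-sum form of combination, the sketch below rates variables with two classic base heuristics (smallest remaining domain, highest degree), min-max normalizes the ratings, and picks the variable with the best weighted sum. The base heuristics, the weights, and the normalization shown are our illustrative choices, not the paper's exact formulation.

```python
def combine_ratings(ratings, weights):
    """Select the variable maximizing a weighted sum of normalized ratings.

    ratings: dict heuristic_name -> {variable: score}, higher = more urgent.
    weights: dict heuristic_name -> weight of that base heuristic.
    """
    variables = next(iter(ratings.values())).keys()

    def norm(scores, v):
        # min-max normalize a heuristic's scores to [0, 1]
        lo, hi = min(scores.values()), max(scores.values())
        return 0.0 if hi == lo else (scores[v] - lo) / (hi - lo)

    return max(variables,
               key=lambda v: sum(w * norm(ratings[h], v)
                                 for h, w in weights.items()))

# Example: combine "small domain" and "high degree" ratings.
domains = {"x": 2, "y": 5, "z": 3}   # remaining values per variable
degrees = {"x": 1, "y": 4, "z": 4}   # constraints touching each variable
ratings = {
    "dom": {v: -d for v, d in domains.items()},  # fewer values -> higher rating
    "deg": degrees,                              # more constraints -> higher rating
}
choice = combine_ratings(ratings, {"dom": 0.5, "deg": 0.5})
```

Here the combination prefers `z`, which is only second-best under each base heuristic alone but best under the weighted sum.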
Navigation Patterns Mining Approach based on Expectation Maximization Algorithm
Web usage mining algorithms have been widely
utilized for modeling users' web navigation behavior. In this study we advance a model for mining users' navigation patterns. The model builds a user model based on the expectation-maximization (EM) algorithm; an EM algorithm is used in statistics to find maximum-likelihood estimates of parameters in probabilistic models where the model depends on unobserved latent variables. The experimental results show that as the number of clusters decreases, the log-likelihood converges toward lower values, and the probability of the largest cluster decreases as the number of clusters increases in each treatment.
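As a minimal illustration of the EM machinery described above, the sketch below performs one EM iteration for a one-dimensional mixture of two Gaussians and returns the log-likelihood of the current parameters. The navigation-pattern model itself would run EM over user-session data; the one-dimensional layout and two components here are simplifying assumptions.

```python
import math

def em_step(data, pi, mu, var):
    """One EM iteration for a 1-D mixture of two Gaussians.

    pi, mu, var are length-2 lists of mixing weights, means, variances.
    Returns updated parameters and the log-likelihood of the *current* ones.
    """
    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    loglik, resp = 0.0, []
    for x in data:                       # E-step: posterior responsibilities
        p = [pi[k] * pdf(x, mu[k], var[k]) for k in (0, 1)]
        s = sum(p)
        loglik += math.log(s)
        resp.append([pk / s for pk in p])

    new_pi, new_mu, new_var = [], [], []
    for k in (0, 1):                     # M-step: responsibility-weighted ML estimates
        nk = sum(r[k] for r in resp)
        mk = sum(r[k] * x for r, x in zip(resp, data)) / nk
        vk = sum(r[k] * (x - mk) ** 2 for r, x in zip(resp, data)) / nk
        new_pi.append(nk / len(data)); new_mu.append(mk); new_var.append(max(vk, 1e-6))
    return new_pi, new_mu, new_var, loglik
```

Iterating `em_step` until the log-likelihood stops increasing yields a maximum-likelihood fit; EM guarantees the log-likelihood is non-decreasing across iterations, which is the convergence behavior the abstract reports on.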
Analytical Study of Component Based Software Engineering
This paper is a survey of current component-based software technologies and a description of the promotion and inhibition factors in CBSE. The features that software components inherit are also discussed, and quality assurance issues in component-based software are catered to. Research on the quality model of component-based systems starts with a study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared, keeping in view the study of various existing models for general systems and CBS. When illustrating the quality of a software component, an apt set of quality attributes for the description of the system (or components) should be selected. Finally, the research issues that can be extended are tabularized.
Software Maintenance Severity Prediction with Soft Computing Approach
As the majority of faults are found in a few modules, there is a need to investigate the modules that are severely affected compared to other modules, and proper maintenance needs to be done on time, especially for critical applications. In this paper, we have applied different predictor models to NASA's public-domain defect dataset coded in the Perl programming language. Different machine learning algorithms belonging to different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy based system, have been evaluated for the modeling of maintenance severity, or the impact of fault severity. The results are recorded in terms
of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared
Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and can hence be used for software maintenance severity prediction.
How Does Prior Knowledge Affect Users' Understanding of System Requirements?
Requirements are critical to system validation, as they guide all subsequent stages of systems development. Inadequately specified requirements generate systems that require major revisions or cause system failure entirely. Use cases have become the main vehicle for requirements capture in many current Object-Oriented (OO) development methodologies, and a means for developers to communicate with different stakeholders. In this paper we present the results of a laboratory experiment that explored whether different types of use case format are equally effective in facilitating high-knowledge users' understanding. Results showed that the provision of diagrams along with the textual use case descriptions significantly improved user comprehension of system requirements in both familiar and unfamiliar application domains. However, when comparing groups that received textual descriptions accompanied by diagrams with different levels of detail (simple and detailed), we found no significant difference in performance.
A Novel Digital Watermarking Technique Based on ISB (Intermediate Significant Bit)
The Least Significant Bit (LSB) technique is the earliest technique developed in watermarking, and it is also the simplest, most direct and most common. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study the intermediate significant bit (ISB)
has been used in order to improve the robustness of the watermarking
system. The aim of this model is to replace the watermarked image pixels with new pixels that can protect the watermark data against attacks, while keeping the new pixels very close to the original pixels in order to preserve the quality of the watermarked image.
The technique is based on testing the value of the watermark pixel
according to the range of each bit-plane.
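For context, the LSB baseline that the abstract starts from can be sketched in a few lines, together with an illustrative ISB-style variant. The ISB rule shown (write the bit into a middle bit-plane and pick the candidate value closest to the original pixel) is a hypothetical simplification of the range test described in the paper, not the paper's exact scheme.

```python
def lsb_embed(pixels, bits):
    """Embed watermark bits in the least significant bit of each pixel."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def lsb_extract(pixels, n):
    """Recover the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

def isb_embed(pixels, bits, plane=3):
    """Illustrative ISB-style variant: force the bit in bit-plane `plane`
    by choosing, among all 8-bit values carrying the desired bit, the one
    closest to the original pixel (hypothetical rule, simplified)."""
    out = []
    for p, b in zip(pixels, bits):
        candidates = [q for q in range(256) if (q >> plane) & 1 == b]
        out.append(min(candidates, key=lambda q: abs(q - p)))
    return out
```

With LSB, each pixel changes by at most 1 but the watermark sits in the most fragile bit; moving to a middle bit-plane trades a slightly larger pixel change for robustness, which is the trade-off the abstract describes.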
RFU Based Computational Unit Design For Reconfigurable Processors
Fully customized hardware-based technology provides high performance and low power consumption by specializing tasks in hardware, but lacks design flexibility, since any kind of change requires re-design and re-fabrication. Software-based solutions operate with software instructions, through which great flexibility is achieved from the easy development and maintenance of the software code; but this execution of instructions introduces a high overhead in performance and area consumption. In the past few decades the reconfigurable computing domain has been introduced, which overcomes the traditional trade-off between flexibility and performance and is able to achieve high performance while maintaining good flexibility. The dramatic gains in chip performance and design flexibility achieved through reconfigurable computing systems are greatly dependent on the design of their computational units and the integration of those units with reconfigurable logic resources. The computational unit of any reconfigurable system plays a vital role in defining its strength. In this research paper an RFU-based computational unit design is presented using tightly coupled, multi-threaded reconfigurable cores. The proposed design has been simulated for VLIW-based architectures, and a high gain in performance has been observed as compared to conventional computing systems.
Seamless Flow of Voluminous Data in High Speed Network without Congestion Using Feedback Mechanism
Continuously growing needs for Internet applications
that transmit massive amount of data have led to the emergence of
high speed network. Data transfer must take place without any
congestion and hence feedback parameters must be transferred from
the receiver end to the sender end so as to restrict the sending rate in
order to avoid congestion. Even though TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender of the capacity of data that can be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low utilization of the bandwidth and maximum delay. In this paper, the XCP protocol is used, and feedback parameters are calculated based on arrival rate, service rate, traffic rate and queue size; the receiver then informs the sender about the throughput, the capacity of data to be sent and the window size adjustment. This results in no drastic decrease in window size and a better increase in sending rate, giving a continuous flow of data without congestion. As a result, there is a maximum increase in throughput, high utilization of the bandwidth and minimum delay. The results of the proposed work are presented as graphs of throughput, delay and window size. Thus, in this paper, the XCP protocol is well illustrated and the various parameters are thoroughly analyzed and adequately presented.
Design and Simulation of a New Self-Learning Expert System for Mobile Robot
In this paper, we present a novel technique called the Self-Learning Expert System (SLES). Unlike an expert system, where an expert is needed to impart experiences and knowledge to create the knowledge base, this technique tries to acquire the experience and knowledge automatically. To display this technique at work, a simulation of a mobile robot navigating through an environment with obstacles is implemented in Visual Basic. The mobile robot moves through this area without colliding with any obstacle and saves the path that it took. If the mobile robot has to go through a similar environment again, it will apply this experience to move through more quickly, without having to check for collisions.
Bi-Criteria Latency Optimization of Intra- and Inter-Autonomous System Traffic Engineering
Traffic Engineering (TE) is the process of controlling
how traffic flows through a network in order to facilitate efficient and
reliable network operations while simultaneously optimizing network
resource utilization and traffic performance. TE improves the
management of data traffic within a network and provides the better
utilization of network resources. Many research works consider intra- and inter-AS Traffic Engineering separately, but in reality each influences the other. Hence the effective network performance of both inter- and intra-Autonomous System (AS) traffic is not optimized properly. To achieve a better joint optimization of both intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work considers an important criterion, namely latency, within an AS and between ASes, and proposes a bi-criteria latency optimization model. Hence overall network performance can be improved, in terms of latency, by applying this joint optimization technique.
Fast Codevector Search Algorithm for 3-D Vector Quantized Codebook
This paper presents a very simple and efficient algorithm for codebook search, which reduces a great deal of computation compared to a full codebook search. The algorithm is based on sorting and a centroid technique for the search. The results table shows the effectiveness of the proposed algorithm in terms of computational complexity. In this paper we also introduce a new performance parameter, called the average fractional change in pixel value, as we feel it gives a better understanding of the closeness of the images, since it is related to perception. This new performance parameter takes into consideration the average fractional change in each pixel value.
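The new performance parameter can be written down directly; the formula below (mean of |original − reconstructed| / original, skipping zero pixels) is our reading of "average fractional change in each pixel value", so the zero-pixel convention is an assumption. The full search shown alongside is only the reference point that the proposed sorting/centroid search is compared against, not the proposed algorithm itself.

```python
def average_fractional_change(original, reconstructed):
    """Mean fractional change per pixel between an image and its
    vector-quantized reconstruction; pixels that are 0 in the original
    are skipped to avoid division by zero (an assumed convention)."""
    diffs = [abs(o - r) / o for o, r in zip(original, reconstructed) if o != 0]
    return sum(diffs) / len(diffs)

def nearest_codevector(x, codebook):
    """Plain full search for reference: index of the codevector
    minimizing squared Euclidean distance to x."""
    def d2(c):
        return sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return min(range(len(codebook)), key=lambda i: d2(codebook[i]))
```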
Real-Time Hand Tracking and Gesture Recognition System Using Neural Networks
This paper introduces a hand gesture recognition system that recognizes gestures in real time in unconstrained environments. Efforts should be made to adapt computers to our natural means of communication: speech and body language. A simple and fast algorithm using orientation histograms is developed to recognize a subset of MAL static hand gestures. A pattern recognition system uses a transform that converts an image into a feature vector, which is then compared with the feature vectors of a training set of gestures. The final system is a perceptron implementation in MATLAB. This paper includes experiments with 33 hand postures and discusses the results. Experiments show that the system can achieve an average recognition rate of 90% and is suitable for real-time applications.
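The orientation-histogram feature that this kind of system relies on can be sketched as follows; the bin count and the tiny test image are illustrative, and the paper's full pipeline adds preprocessing and a perceptron classifier on top of the histogram.

```python
import math

def orientation_histogram(img, bins=8):
    """Histogram of gradient orientations for a grayscale image given as a
    2-D list of intensities; forward differences, each orientation weighted
    by its gradient magnitude, histogram normalized to sum to 1."""
    hist = [0.0] * bins
    h, w = len(img), len(img[0])
    for y in range(h - 1):
        for x in range(w - 1):
            dx = img[y][x + 1] - img[y][x]   # horizontal gradient
            dy = img[y + 1][x] - img[y][x]   # vertical gradient
            mag = math.hypot(dx, dy)
            if mag == 0:
                continue
            angle = math.atan2(dy, dx) % (2 * math.pi)
            hist[int(angle / (2 * math.pi) * bins) % bins] += mag
    total = sum(hist)
    return [v / total for v in hist] if total else hist
```

The resulting fixed-length vector is what gets compared against the feature vectors of the training gestures, since it is largely invariant to translation of the hand within the frame.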
WPRiMA Tool: Managing Risks in Web Projects
Risk management is an essential part of project management, which plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and the lack of process models that can serve as guidelines for the development of Web-based applications; contemporary process models have instead been devised for the development of conventional software. To circumvent this problem, this paper introduces WPRiMA (Web Project Risk Management Assessment), a tool which implements RIAP, the risk identification architecture pattern model, focusing on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations that facilitate risk avoidance in a project, and to improve the prospects of early risk management.
All-Pairs Shortest-Paths Problem for Unweighted Graphs in O(n² log n) Time
Given a simple connected unweighted undirected graph G = (V(G), E(G)) with |V(G)| = n and |E(G)| = m, we present a new algorithm for the all-pairs shortest-path (APSP) problem. The running time of our algorithm is O(n² log n). This bound improves on the previous best known O(n^2.376) time bound of Raimund Seidel (1995) for general graphs. The algorithm presented does not rely on fast matrix multiplication. With slight modifications, our algorithm computes the APSP problem for unweighted directed graphs in O(n² log n) time, improving on the previous best known O(n^2.575) time bound of Uri Zwick (2002).
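As a point of reference (and emphatically not the paper's O(n² log n) method), all-pairs shortest paths on an unweighted graph can always be computed by running one breadth-first search per source, in O(nm) total time:

```python
from collections import deque

def apsp_bfs(adj):
    """All-pairs shortest paths for an unweighted graph given as an
    adjacency list {v: [neighbors]}; one BFS per source, O(n*m) total.
    Baseline for comparison, not the paper's algorithm."""
    dist = {}
    for s in adj:
        d = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in d:
                    d[v] = d[u] + 1   # first visit is along a shortest path
                    q.append(v)
        dist[s] = d
    return dist
```

On dense graphs m can be Θ(n²), making this baseline Θ(n³); the point of the paper's bound is to stay at O(n² log n) regardless of m, without resorting to fast matrix multiplication.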
Performance Optimization of Data Mining Application Using Radial Basis Function Classifier
Text data mining is a process of exploratory data
analysis. Classification maps data into predefined groups or classes.
It is often referred to as supervised learning because the classes are
determined before examining the data. This paper describes a proposed radial basis function classifier that performs comparative cross-validation against an existing radial basis function classifier. The feasibility and the benefits of the proposed approach are demonstrated by means of a data mining problem: direct marketing, which has become an important application field of data mining. Comparative cross-validation involves estimating accuracy by either stratified k-fold cross-validation or equivalent repeated random subsampling. While the proposed method may have high bias, its performance (accuracy estimation in our case) may be poor due to high variance. Thus the accuracy with the proposed radial basis function classifier was less than with the existing classifier; however, there is a small improvement in runtime and a larger improvement in precision and recall. In the proposed method, classification accuracy and prediction accuracy are determined, where the prediction accuracy is comparatively high.
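The stratified k-fold split named above can be sketched generically; the round-robin assignment, shuffling, and seed below are illustrative details, not taken from the paper.

```python
import random

def stratified_kfold(labels, k, seed=0):
    """Split sample indices into k folds that preserve class proportions.
    Returns a list of k index lists; each fold serves once as the test set
    while the remaining folds form the training set."""
    rng = random.Random(seed)
    by_class = {}
    for i, y in enumerate(labels):
        by_class.setdefault(y, []).append(i)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for j, i in enumerate(idxs):   # deal each class's members round-robin
            folds[j % k].append(i)
    return folds
```

Averaging the per-fold accuracies gives the cross-validated accuracy estimate; repeated random subsampling instead draws the train/test split independently on each repetition.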
An Artificial Emotion Model For Visualizing Emotion of Characters
It is hard to express emotion through speech alone when we watch a character in a movie or a play, because we cannot estimate the size, kind, and quantity of the emotion. This paper therefore proposes an
artificial emotion model for visualizing current emotion with color and
location in emotion model. The artificial emotion model is designed
considering causality of generated emotion, difference of personality,
difference of continual emotional stimulus, and co-relation of various
emotions. This paper proposes the Emotion Field for visualizing the current emotion by its location, and the current emotion is expressed by location and color in the Emotion Field. To visualize changes in the current emotion, the artificial emotion model is applied to characters in Hamlet.
Fault Detection of Drinking Water Treatment Process Using PCA and Hotelling's T² Chart
This paper deals with the application of Principal Component Analysis (PCA) and Hotelling's T² chart, using data collected from a drinking water treatment process. PCA is applied primarily for the dimensional reduction of the collected data, while the Hotelling's T² control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works, downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T², can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T² chart from the collected data.
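For a two-variable process, the T² statistic behind the chart can be computed directly; the numbers below and the hand-inverted 2×2 covariance are purely illustrative (the paper itself uses SIMCA-P on plant data, typically after PCA has reduced the dimensionality).

```python
def hotelling_t2(samples):
    """Per-observation Hotelling T² for 2-D samples:
    T²_i = (x_i - mean)' S⁻¹ (x_i - mean), with S the sample covariance."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in samples) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in samples) / (n - 1)
    det = sxx * syy - sxy ** 2            # 2x2 inverse written out by hand
    t2 = []
    for x, y in samples:
        dx, dy = x - mx, y - my
        t2.append((syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det)
    return t2
```

An observation whose T² exceeds the chart's control limit (a quantile of an F-distribution in the standard chart construction) is flagged as a potential fault.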
Consistency Model and Synchronization Primitives in SDSMS
This paper gives a general discussion of memory consistency models such as Strict Consistency, Sequential Consistency, Processor Consistency and Weak Consistency. The techniques for implementing distributed shared memory systems and synchronization primitives in Software Distributed Shared Memory Systems are then discussed. The analysis involves performance measurement of the protocol concerned, namely the Multiple Writer Protocol. Each protocol has pros and cons, so the problems associated with each protocol are discussed and other related matters are explored.
A Step-wise Zoom Technique for Exploring Image-based Virtual Reality Applications
Existing image-based virtual reality applications allow users to view an image-based 3D virtual environment in a more interactive manner. Users can "walk through", look left, right, up and down, and even zoom into objects in these virtual worlds of images. However, what the user sees during a "zoom in" is just a close-up view of the same image, which was taken from a distance. Thus, this does not give the user an accurate view of the object from the actual distance. In this paper, a simple technique for zooming in on an object in a virtual scene is presented. The technique is based on the 'hotspot' concept in existing applications. Instead of navigating between two different locations, the hotspots are used to focus on an object in the scene. For each object, several hotspots are created, and a different picture is taken for each hotspot. Each consecutive hotspot takes the user closer to the object. This provides the user with a correct view of the object based on his proximity to it. Implementation issues and the relevance of this technique in potential application areas are highlighted.
Experimental Parallel Architecture for Rendering 3D Model into MPEG-4 Format
This paper presents the initial findings of research into distributed computer rendering. The goal of the research is to create a distributed computer system capable of rendering a 3D model into an MPEG-4 stream. This paper outlines the initial design, software architecture and hardware setup for the system. Distributed computing means designing and implementing programs that run on two or more interconnected computing systems. Distributed computing is often used to speed up the rendering of graphical imaging; distributed computing systems are used to generate images for movies, games and simulations. A topic of interest is the application of distributed computing to the MPEG-4 standard. During the course of the research, a distributed system will be created that can render a 3D model into an MPEG-4 stream. It is expected that applying distributed computing principles will speed up rendering, thus improving the usefulness and efficiency of the MPEG-4 standard.
Non-destructive Watermelon Ripeness Determination Using Image Processing and Artificial Neural Network (ANN)
Agricultural products are in increasing demand in today's market. To increase productivity, automation in producing these products will be very helpful. The purpose of this work is to measure and determine the ripeness and quality of watermelons. The texture of the watermelon skin is captured using a digital camera, and these images are filtered using image processing techniques. All the information gathered is then used to train an ANN to determine watermelon ripeness. Initial results showed that the best model produced an accuracy of 86.51%, measured at 32 hidden units with a balanced percentage rate of training dataset.
Authentication and Data Hiding Using a Reversible ROI-based Watermarking Scheme for DICOM Images
In recent years image watermarking has become an
important research area in data security, confidentiality and image
integrity. Many watermarking techniques were proposed for medical
images. However, medical images, unlike most images, require extreme care when embedding additional data within them, because the additional information must not affect image quality and readability. Also, medical records, electronic or not, are bound by medical secrecy, and for that reason the records must be confidential. To fulfill those requirements, this paper presents a lossless watermarking scheme for DICOM images. The proposed fragile scheme combines two reversible techniques based on difference expansion for hiding the patient's data and for protecting the region of interest (ROI), with tamper detection and recovery capability.
Patient's data are embedded into ROI, while recovery data are
embedded into region of non-interest (RONI). The experimental
results show that the original image can be exactly extracted from the
watermarked one in case of no tampering. In case of tampered ROI,
tampered area can be localized and recovered with a high quality
version of the original area.
Reliability Evaluation using Triangular Intuitionistic Fuzzy Numbers Arithmetic Operations
Fuzzy sets are generally used to analyze fuzzy system reliability;
here, intuitionistic fuzzy set theory is used instead. To analyze the
fuzzy system reliability, the reliability of each component of the
system is considered as a triangular intuitionistic fuzzy number.
Triangular intuitionistic fuzzy numbers and their arithmetic operations
are introduced, and expressions for computing the fuzzy reliability of
a series system and of a parallel system following triangular
intuitionistic fuzzy numbers are described. As an illustration, an
imprecise reliability model of the electric network of a dark room is
taken; to compute the imprecise reliability of this system, the
reliability of each component is represented by a triangular
intuitionistic fuzzy number. A respective numerical example is
presented.
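The series and parallel expressions can be sketched with componentwise triangle arithmetic. This is an illustrative approximation: each triangular intuitionistic fuzzy number is represented as a membership triangle and a non-membership triangle, products are taken componentwise, and the complement 1 - A is applied componentwise to both triangles, which may differ in detail from the paper's exact operations.

```python
def tifn_mul(A, B):
    """Approximate product of two triangular intuitionistic fuzzy numbers.
    Each TIFN is ((a, b, c), (a2, b, c2)): membership and non-membership triangles."""
    (m1, n1), (m2, n2) = A, B
    mul = lambda t, u: tuple(x * y for x, y in zip(t, u))
    return (mul(m1, m2), mul(n1, n2))

def tifn_complement(A):
    """1 - A, applied componentwise to both triangles (reversing each triple)."""
    m, n = A
    comp = lambda t: tuple(1 - x for x in reversed(t))
    return (comp(m), comp(n))

def series_reliability(components):
    """Series system: product of the component reliabilities."""
    R = components[0]
    for c in components[1:]:
        R = tifn_mul(R, c)
    return R

def parallel_reliability(components):
    """Parallel system: 1 - product of (1 - R_i)."""
    Q = tifn_complement(components[0])
    for c in components[1:]:
        Q = tifn_mul(Q, tifn_complement(c))
    return tifn_complement(Q)
```

With two identical components of membership triangle (0.8, 0.9, 0.95), the series reliability peaks at 0.81 and the parallel reliability at 0.99, matching the crisp formulas at the triangle modes.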
Agreement Options on Multi Criteria Group Decision and Negotiation
This paper presents a conceptual model of agreement
options in negotiation support for civil engineering decisions. The
negotiation support facilitates the solving of group-choice decision
making problems in civil engineering, aimed at reducing the impact
of the mud volcano disaster in Sidoarjo, Indonesia. The approach is
based on the analytical hierarchy process (AHP) method for
multi-criteria decisions over a three-level decision hierarchy.
Decisions for reducing the impact are very complicated, since many
parties are involved within a critical time. Where a number of
stakeholders are involved in choosing a single alternative from a set
of solution alternatives, concerns differ because of differing
stakeholder preferences, experiences, and backgrounds. Therefore,
group-choice decision support is required to enable each stakeholder
to evaluate and rank the solution alternatives before engaging in
negotiation with the other stakeholders. Such civil engineering
solutions are referred to as agreement options; they are determined
by identifying the possible stakeholder choices, followed by
determining the optimal solution for each group of stakeholders. The
determination of the optimal solution is based on a game-theoretic
model of an n-person general-sum game with complete information that
involves forming coalitions among stakeholders.
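The AHP ranking step each stakeholder performs can be illustrated with the row geometric-mean approximation of the priority vector; the 3x3 comparison matrix below is a hypothetical example, not data from the Sidoarjo case.

```python
import math

def ahp_priorities(M):
    """Priority vector of a pairwise-comparison matrix via the
    row geometric mean (logarithmic least squares) approximation."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparison of three mitigation alternatives
M = [[1,   2,   4],
     [1/2, 1,   2],
     [1/4, 1/2, 1]]
```

For a perfectly consistent matrix such as this one, the geometric-mean weights coincide with the principal eigenvector, here (4/7, 2/7, 1/7).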
Proffering a Brand New Methodology to Resource Discovery in Grid based on Economic Criteria Using Learning Automata
Resource discovery is one of the chief services of a grid. This article proposes a new approach to resource discovery in grids based on learning automata. The objective of the proposed resource-discovery service is to select resources according to the user's applications and economic criteria, that is, to choose a resource that can accomplish the user's tasks in the most economical manner. The service operates in two phases. First, an application-based categorization is performed by means of a neural network: the user's application forms the input vector of the network, and the output vector describes the suitability of each resource for the given job. Second, the most economical of the candidate resources capable of fulfilling the task is selected. This resource selection is carried out by the presented algorithm based on learning automata.
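The automaton-based selection step can be sketched with a linear reward-inaction (L_RI) scheme; the reward probabilities below are hypothetical stand-ins for how economically each resource completes a task, not the article's actual model.

```python
import random

def lri_select(reward_prob, steps=2000, a=0.05, seed=0):
    """Linear reward-inaction automaton: reinforce the chosen action on a
    favourable response, leave the probabilities unchanged otherwise."""
    rng = random.Random(seed)
    n = len(reward_prob)
    p = [1.0 / n] * n                        # start with no preference
    for _ in range(steps):
        i = rng.choices(range(n), weights=p)[0]
        if rng.random() < reward_prob[i]:    # favourable: task done economically
            p = [pj + a * (1 - pj) if j == i else pj * (1 - a)
                 for j, pj in enumerate(p)]
    return p
```

Over repeated interactions, the probability mass concentrates on the resource that most often yields a favourable (economical) response, which is the convergence property L_RI schemes are used for.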
Continuity of Defuzzification and Its Application to Fuzzy Control
A mathematical framework for studying fuzzy approximate reasoning is presented in this paper. Two important defuzzification methods, area defuzzification and height defuzzification, are described alongside the center of gravity method, which is the best-known defuzzification method. The continuity of these defuzzification methods and its application to fuzzy feedback control are discussed.
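On a discretized universe, two of the methods discussed can be sketched as weighted means; this is a minimal numerical illustration, not the paper's continuity analysis.

```python
def cog_defuzzify(xs, mu):
    """Center of gravity: weighted mean of the support points xs,
    with the membership values mu as the weights."""
    den = sum(mu)
    return sum(x * m for x, m in zip(xs, mu)) / den if den else 0.0

def height_defuzzify(peaks, heights):
    """Height method: weighted average of rule-consequent peak positions."""
    return sum(p * h for p, h in zip(peaks, heights)) / sum(heights)
```

For a symmetric triangular output set, both methods return the triangle's center, as expected.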
Enhancement Approaches for Supporting Default Hierarchies Formation for Robot Behaviors
Robotics is an important area of artificial intelligence that aims at improving the performance of robots and making them more efficient and more effective in choosing the correct behavior. In this paper, a distributed learning classifier system is used to design a simulated control system that allows a robot to perform complex behaviors. A set of enhanced approaches that support the formation of default hierarchies is suggested, and the approaches are compared with one another in order to make the simulated robot more effective in mapping inputs to the correct output behavior.
Effective Digital Music Retrieval System through Content-based Features
In this paper, we propose an effective system for digital music retrieval. The proposed system is divided into a client and a server. The client part consists of pre-processing and content-based feature extraction stages. In the pre-processing stage, we minimize the time-code gap that occurs between copies of the same music content. As content-based features, first-order differentiated MFCCs are used; these approximate the envelope of the music feature sequences. The server part includes the music server and the music matching stage. Features extracted from 1,000 digital music files are stored in the music server. In the music matching stage, the retrieval result is found through a similarity measure based on DTW. In the experiments, we used 450 queries, made by mixing different compression standards and sound qualities from 50 digital music files. Retrieval accuracy was 97%, and the average retrieval time was 15 ms per query. Our experiments prove that the proposed system is effective for digital music retrieval and robust across various web user environments.
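The DTW similarity measure used in the matching stage can be sketched as follows on 1-D feature sequences; this is the classic dynamic-programming formulation, without the windowing or path normalization a production matcher would add.

```python
def dtw_distance(a, b):
    """O(len(a) * len(b)) dynamic time warping distance between two
    1-D feature sequences (e.g. frame-wise delta-MFCC summaries)."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping moves
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because the warping path may repeat frames, two renditions of the same content at slightly different tempos still match with near-zero cost, which is why DTW tolerates the compression and quality variations in the queries.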
Human Facial Expression Recognition using MANFIS Model
Facial expression analysis plays a significant role in
human-computer interaction. Automatic analysis of human facial
expressions is still a challenging problem with many applications. In
this paper, we propose a neuro-fuzzy based automatic facial expression
recognition system to recognize human facial expressions such as
happiness, fear, sadness, anger, disgust and surprise. Initially, the
facial image is segmented into three regions, from which the uniform
Local Binary Pattern (LBP) texture feature distributions are extracted
and represented as a histogram descriptor. The facial expressions are
recognized using a Multiple Adaptive Neuro-Fuzzy Inference System
(MANFIS). The proposed system was designed and tested with the JAFFE
face database, and the proposed model reports 94.29% classification
accuracy.
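The LBP histogram descriptor can be sketched as follows; this is basic 8-neighbour LBP over one region, without the uniform-pattern bin reduction the paper uses.

```python
def lbp_code(img, r, c):
    """8-neighbour LBP code of pixel (r, c) on a 2-D grayscale grid:
    each neighbour >= center contributes one bit."""
    center = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for k, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:
            code |= 1 << k
    return code

def lbp_histogram(img):
    """256-bin LBP histogram over the interior pixels of a region;
    one such histogram per face region forms the descriptor."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    return hist
```

Concatenating the per-region histograms yields the feature vector fed to the MANFIS classifier.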
A Neural Network Approach in Predicting the Blood Glucose Level for Diabetic Patients
Diabetes mellitus is a chronic metabolic disorder in
which improper management of blood glucose levels exposes diabetic
patients to the risk of heart attack, kidney disease and renal
failure. This paper attempts to enhance the accuracy of predicting
the advancing blood glucose levels of diabetic patients by combining
principal component analysis and a wavelet neural network. The
proposed system makes separate blood glucose predictions for the
morning, afternoon, evening and night intervals, using a dataset from
one patient covering a period of 77 days. Comparisons of the
predictive accuracy with other neural network models that use the
same dataset are made. The comparison results show overall improved
accuracy, which indicates the effectiveness of the proposed system.
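The PCA preprocessing step can be sketched with an SVD projection; this is a generic sketch, since the paper's component counts and the wavelet network itself are not specified here.

```python
import numpy as np

def pca_reduce(X, k):
    """Project samples (rows of X) onto the top-k principal components,
    as a dimensionality-reduction step before the neural network."""
    Xc = X - X.mean(axis=0)
    # SVD of the centred data: rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T
```

The reduced coordinates replace the raw measurements as inputs, so the downstream network trains on the directions of largest variance.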
Impact of Faults in Different Software Systems: A Survey
Software maintenance is an extremely important activity in the software development life cycle; it involves a great deal of human effort, cost and time. Software maintenance may be subdivided into different activities such as fault prediction, fault detection, fault prevention and fault correction. The topic has gained substantial attention due to sophisticated and complex applications, commercial hardware, clustered architectures and artificial intelligence. In this paper we survey the work done in the field of software maintenance. Software fault prediction has been studied in the context of fault-prone modules, self-healing systems, developer information, maintenance models, etc. Still, much remains to be explored in the field of fault severity, such as modeling and weighting the impact of different kinds of faults in the various types of software systems.
Information Filtering using Index Word Selection based on the Topics
We propose an information filtering system that selects
index words from a document set based on the topics included in the
set. The method narrows the vocabulary down to the particularly
characteristic words of the document set, where the topics are
obtained by Sparse Non-negative Matrix Factorization. In information
filtering, a document is often represented by a vector whose elements
are the weights of the index words, and the dimension of this vector
grows as the number of documents increases; as a result, words that
are useless as index words for filtering may be included. To address
this problem, the dimension needs to be reduced. Our proposal reduces
the dimension by selecting index words based on the topics included
in the document set, which are obtained by applying Sparse
Non-negative Matrix Factorization to the set. The filtering is
carried out based on the centroid of the learning document set, which
is regarded as the user's interest; the centroid is represented by a
document vector whose elements consist of the weights of the selected
index words. Using the English test collection MEDLINE, we confirm
the effectiveness of our proposal: when the appropriate number of
index words is selected, our selection improves the recommendation
accuracy over previous methods. In addition, examining the selected
index words, we found that our proposal was able to select index
words covering some minor topics included in the document set.
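The topic-based index-word selection can be sketched with a plain multiplicative-update NMF; sparseness constraints are omitted, so this approximates rather than reproduces the Sparse NMF the paper uses, and the block-structured matrix below is a toy stand-in for a real document-term matrix.

```python
import numpy as np

def nmf(V, k, iters=200, seed=0):
    """Plain multiplicative-update NMF: V (docs x words) ~ W @ H,
    where the rows of H are topics over the vocabulary."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def select_index_words(V, k, top):
    """Keep the `top` highest-weight words of each topic as index words."""
    _, H = nmf(V, k)
    keep = set()
    for t in range(k):
        keep.update(int(w) for w in np.argsort(H[t])[::-1][:top])
    return sorted(keep)
```

Only the selected columns of the document vectors are retained, which is the dimension reduction the abstract describes.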
ORPP with MAIEP Based Technique for Loadability Enhancement
One of the factors in maintaining system survivability is
adequate reactive power support. Lack of reactive power support may
cause undesirable voltage decay, leading to total system instability;
thus, an appropriate reactive power support scheme should be arranged
in order to maintain system stability. The strength of a system's
capacity is normally denoted as system loadability. This paper
presents the enhancement of system loadability through an optimal
reactive power planning technique using a newly developed
optimization technique termed Multiagent Immune Evolutionary
Programming (MAIEP). The concept of MAIEP is based on the combination
of the Multiagent System (MAS), the Artificial Immune System (AIS)
and Evolutionary Programming (EP). To demonstrate the effectiveness
of the proposed technique, validation is conducted on the IEEE 26-bus
Reliability Test System. The results obtained from the
pre-optimization and post-optimization processes are compared, which
eventually reveals the merit of MAIEP.
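The EP core that MAIEP extends can be sketched as Gaussian mutation plus truncation selection on a toy objective; the multiagent neighbourhood competition and immune clonal operators of MAIEP, and the actual reactive power planning objective, are beyond this sketch.

```python
import random

def evolutionary_programming(fitness, dim, pop_size=20, gens=100, seed=1):
    """Classical EP: every individual produces one Gaussian-mutated
    offspring, then (mu + mu) truncation selection keeps the best."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        offspring = [[x + rng.gauss(0, 0.3) for x in ind] for ind in pop]
        pool = pop + offspring
        pool.sort(key=fitness)           # minimise the objective
        pop = pool[:pop_size]
    return pop[0]

# Toy objective standing in for a loss-minimisation planning objective
sphere = lambda v: sum(x * x for x in v)
```

In the planning context, the decision vector would hold reactive power injections and the fitness would evaluate loadability, with MAIEP's agent lattice and immune operators layered on top of this loop.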
Application of LSB Based Steganographic Technique for 8-bit Color Images
Steganography is the process of hiding one file inside another such that others can neither identify the meaning of the embedded object nor even recognize its existence. Current trends favor using digital image files as the cover file to hide another digital file that contains the secret message or information. One of the most common methods of implementation is least significant bit (LSB) insertion, in which the least significant bit of every byte is altered to form the bit string representing the embedded file. Altering the LSB causes only minor changes in color, and thus is usually not noticeable to the human eye. While this technique works well for 24-bit color image files, steganography has not been as successful with 8-bit color image files, due to limitations in color variations and the use of a colormap. This paper presents the results of research investigating the combination of image compression and steganography. The technique developed starts with a 24-bit color bitmap file, then compresses the file by organizing and optimizing an 8-bit colormap. After compression, a text message is hidden in the final, compressed image. Results indicate that the final technique has the potential to be useful in the steganographic world.
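Plain LSB insertion, the starting point of the technique, can be sketched on a flat array of pixel values; the paper's colormap reorganization for 8-bit images, which is what makes LSB viable on palette indices, is not shown.

```python
def embed_lsb(pixels, message):
    """Hide message bytes in the least significant bit of successive
    pixel values (a flat list of 0-255 ints)."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for the message")
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b   # overwrite only the LSB
    return out

def extract_lsb(pixels, n_bytes):
    """Read n_bytes back out of the pixel LSBs, MSB-first per byte."""
    data = bytearray()
    for k in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[8 * k + i] & 1)
        data.append(byte)
    return bytes(data)
```

Each pixel value changes by at most one level, which is the visual-imperceptibility argument the abstract makes for 24-bit covers.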