A Dynamic Programming Model for Maintenance of Electric Distribution System
The paper presents a dynamic programming-based model as a planning tool for the maintenance of electric power distribution systems. Each distribution component has an exponential age-dependent reliability function to model its fault risk. At the point in time when the fault costs exceed the investment cost of a new component, the component should be replaced. However, in some cases overhauling the old component may be more economical than reinvestment. The comparison between overhauling and reinvestment is made through an optimisation process, whose goal is to find the cost-minimising maintenance programme for the electric power distribution system.
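The keep/overhaul/replace trade-off described above can be sketched as a small finite-horizon dynamic program. This is an illustrative toy, not the paper's model: all cost figures, the hazard scale, and the overhaul rejuvenation effect are invented for the example.

```python
# Illustrative DP sketch: choose, for each year and component age, whether to
# keep, overhaul, or replace. Failure risk follows an exponential
# age-dependent reliability function; all figures below are hypothetical.
import math

HORIZON = 20          # planning years
C_FAULT = 50.0        # cost of one fault (hypothetical)
C_REPLACE = 120.0     # reinvestment cost (hypothetical)
C_OVERHAUL = 40.0     # overhauling cost (hypothetical)
OVERHAUL_REJUVENATION = 10  # years of effective age removed by an overhaul

def fault_risk(age):
    """Annual fault probability rising with age: 1 - exp(-age/scale)."""
    return 1.0 - math.exp(-age / 25.0)

def plan(age0):
    """Return (minimal expected cost, action list) over the horizon."""
    memo = {}
    def best(year, age):
        if year == HORIZON:
            return 0.0, []
        if (year, age) in memo:
            return memo[(year, age)]
        options = []
        # keep: pay the expected fault cost this year, age grows by one
        cost, acts = best(year + 1, age + 1)
        options.append((fault_risk(age) * C_FAULT + cost, ["keep"] + acts))
        # overhaul: pay the overhaul cost, effective age is reduced
        new_age = max(0, age - OVERHAUL_REJUVENATION)
        cost, acts = best(year + 1, new_age + 1)
        options.append((C_OVERHAUL + fault_risk(new_age) * C_FAULT + cost,
                        ["overhaul"] + acts))
        # replace: pay the reinvestment cost, age resets to zero
        cost, acts = best(year + 1, 1)
        options.append((C_REPLACE + fault_risk(0) * C_FAULT + cost,
                        ["replace"] + acts))
        memo[(year, age)] = min(options, key=lambda o: o[0])
        return memo[(year, age)]
    return best(0, age0)

total, actions = plan(age0=30)
print(round(total, 1), actions[:5])
```

Once the same memoised comparison is run over every component, the per-component action lists form the maintenance programme.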
The Use of KREISIG Computer Simulation Program to Optimize Signalized Roundabout
KREISIG is a computer simulation program, first developed by Munawar (1994) in Germany, for optimizing signalized roundabouts. Traffic movement is modelled on car-following theory, and the turbine method is used for signal setting. The program has since been further developed in Indonesia to match Indonesian traffic characteristics by adjusting driver sensitivity, with a trial-and-error method used to calibrate the saturation flow. The saturation flow output has also been compared with the calculation method of the 1997 Indonesian Highway Capacity Manual. The program was then applied to optimize the signalized Kleringan roundabout in the Malioboro area of Yogyakarta, Indonesia. It is found that this method can optimize the signal setting of this roundabout, and the program is therefore recommended for optimizing signalized roundabouts.
The Study of the Intelligent Fuzzy Weighted Input Estimation Method Combined with the Experiment Verification for the Multilayer Materials
The innovative fuzzy weighted input estimation method (FWIEM) can be applied to the inverse heat conduction problem (IHCP) to estimate the unknown time-varying heat flux of multilayer materials, as presented in this paper. The feasibility of the method is verified by a temperature measurement experiment. The experimental module is built from a copper sample stacked with four aluminum samples of different thicknesses. The bottoms of the copper samples are heated by a standard heat source, and the temperatures on the tops of the aluminum samples are measured with thermocouples. The temperature measurements are then used as inputs to the presented method to estimate the heat flux at the bottoms of the copper samples. The influence on the estimation of sample thickness, the process noise covariance Q, the weighting factor γ, the sampling time interval Δt, and the spatial discretization interval Δx is investigated through the experimental verification. The results show that the method is efficient and robust in estimating the unknown time-varying heat input of multilayer materials.
Clinical Benefits of an Embedded Decision Support System in Anticoagulant Control
Computer-based decision support systems (CDSS) can deliver real patient care and increase the chances of long-term survival in areas of chronic disease management prone to poor control. One such CDSS, for the management of warfarin, is described in this paper and its outcomes shown. Data derived from the running system show a performance consistently around 20% better than the
Introducing Sequence-Order Constraint into Prediction of Protein Binding Sites with Automatically Extracted Templates
Searching for a tertiary substructure that geometrically matches the 3D pattern of the binding site of a well-studied protein offers a way to predict protein function. In our previous work, a web server was built to predict protein-ligand binding sites based on automatically extracted templates. However, a drawback of such templates is that they make the web server prone to producing many false positive matches. In this study, we present a sequence-order constraint to reduce the false positive matches that arise when automatically extracted templates are used to predict protein-ligand binding sites. The binding site predictor comprises i) an automatically constructed template library and ii) a local structure alignment algorithm for querying the library. The sequence-order constraint is employed to identify inconsistency between the local regions of the query protein and the templates. Experimental results reveal that the sequence-order constraint can largely reduce the false positive matches and is effective for template-based binding site prediction.
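The constraint itself can be sketched very compactly: a structural match pairs template residues with query residues, and the match is rejected when the paired query residues do not appear in the same sequence order as the template residues. The function and data below are illustrative only, not the server's actual implementation.

```python
# Minimal sketch of a sequence-order consistency check on residue matches.
def satisfies_sequence_order(matches):
    """matches: list of (template_residue_index, query_residue_index) pairs.

    Returns True when sorting by template index leaves the query indices
    strictly increasing, i.e. the local regions are order-consistent.
    """
    ordered = sorted(matches)
    query_indices = [q for _, q in ordered]
    return all(a < b for a, b in zip(query_indices, query_indices[1:]))

print(satisfies_sequence_order([(1, 10), (2, 15), (3, 22)]))  # True
print(satisfies_sequence_order([(1, 10), (2, 30), (3, 22)]))  # False
```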
File Format of Flow Chart Simulation Software - CFlow
CFlow is flow chart software that provides facilities to draw and evaluate flow charts. Flow chart evaluation applies a simulation method to enable the presentation of work flow in a flow chart solution. Flow chart simulation in CFlow is executed by manipulating the CFlow data file, which is saved in a graphical vector format. These text-based data are organised using a data classification technique based on a library classification scheme. This paper describes the file format of the flow chart simulation software CFlow.
Multi-models Approach for Describing and Verifying Constraints Based Interactive Systems
Requirements analysis, modeling, and simulation have consistently been among the main challenges in the development of complex systems. Scenarios and state machines are two successful models for describing the behavior of an interactive system. Scenarios represent examples of system execution in the form of sequences of messages exchanged between objects and give a partial view of the system. In contrast, state machines can represent the overall system behavior. Automating the translation of scenarios into state machines provides answers to various problems such as system behavior validation and scenario consistency checking. In this paper, we propose a method for translating scenarios into state machines represented by the Discrete EVent System Specification (DEVS), together with a procedure to detect implied scenarios. Each induced DEVS model represents the behavior of one object of the system. The global system behavior is described by coupling the atomic DEVS models and is validated through simulation. We improve the validation process by integrating formal methods to eliminate logical inconsistencies in the global model. To that end, we use the Z notation.
Computer-aided Lenke Classification of Scoliotic Spines
The identification and classification of spine deformity play an important role when considering surgical planning for adolescent patients with idiopathic scoliosis. The subject of this article is the Lenke classification of scoliotic spines using Cobb angle measurements. The purpose is two-fold: (1) design a rule-based diagram to assist clinicians in the classification process and (2) investigate a computer classifier which improves the classification time and accuracy. The efficiency of the rule-based diagram was evaluated in a series of scoliotic classifications by 10 clinicians. The computer classifier was tested on a radiographic measurement database of 603 patients. Classification accuracy was 93% using the rule-based diagram and 99% for the computer classifier. Both the computer classifier and the rule-based diagram can efficiently assist clinicians in their Lenke classification of spine scoliosis.
Bio-inspired Audio Content-Based Retrieval Framework (B-ACRF)
Content-based music retrieval generally involves analyzing, searching and retrieving music based on low- or high-level features of a song, which are normally used to represent artists, songs or music genres. Identifying them normally involves feature extraction and classification tasks. In theory, the more features analyzed, the better the classification accuracy that can be achieved, but at the cost of longer execution time. A technique for selecting significant features is therefore important, as it reduces the dimensionality of the features used in classification and contributes to the accuracy. An Artificial Immune System (AIS) approach will be investigated and applied in the classification task. A bio-inspired audio content-based retrieval framework (B-ACRF) is proposed at the end of this paper, embracing issues that need further consideration in music retrieval performance.
Faults Forecasting System
This paper presents a Faults Forecasting System (FFS) that applies statistical forecasting techniques to process variable data in order to forecast fault occurrences. FFS proposes a new approach to fault detection. Current fault detection techniques analyze the present status of the system variables to check whether that status is faulty or not. FFS instead uses forecasting techniques to predict the timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach to reduce the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment was designed and carried out at Okayama University, Japan, and the comparison shows that the proposed model attains high forecasting accuracy and
Text Summarization for Oil and Gas News Article
Information is increasing in volume; companies are so overloaded with information that they may lose track of how to get the information they need. It is a time-consuming task to scan through every lengthy document, and a shorter version containing only the gist is more favourable for most information seekers. Therefore, in this paper we implement a text summarization system that produces summaries containing the gist of oil and gas news articles. The summarization is intended to provide important information that helps oil and gas companies monitor their competitors' behaviour and formulate business strategies. The system integrates a statistical approach with three underlying concepts: keyword occurrences, the title of the news article and the location of the sentence. The generated summaries were compared with human-generated summaries from an oil and gas company, with precision and recall ratios used to evaluate the accuracy of the generated summaries. Based on the experimental results, the system is able to produce an effective summary, with an average recall of 83% at a compression rate of 25%.
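The three scoring concepts can be sketched as follows. This is a simplified illustration of keyword-, title- and location-based sentence scoring; the weights, tokenisation and sample article are invented, not the paper's actual values.

```python
# Simplified statistical summarizer: score each sentence by keyword
# occurrences, overlap with the title, and sentence position, then keep the
# top fraction of sentences in their original order.
import re
from collections import Counter

def summarize(title, text, ratio=0.25):
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))
    title_words = set(re.findall(r"[a-z]+", title.lower()))

    def score(sentence, position):
        tokens = re.findall(r"[a-z]+", sentence.lower())
        if not tokens:
            return 0.0
        keyword = sum(freq[t] for t in tokens) / len(tokens)  # keyword occurrences
        title_overlap = len(title_words & set(tokens)) / max(len(title_words), 1)
        location = 1.0 / (position + 1)                       # earlier sentences favoured
        return keyword + 2.0 * title_overlap + location       # illustrative weights

    ranked = sorted(range(len(sentences)), key=lambda i: -score(sentences[i], i))
    keep = sorted(ranked[:max(1, int(len(sentences) * ratio))])
    return " ".join(sentences[i] for i in keep)

article = ("Oil prices rose sharply today. Analysts linked the rise to supply cuts. "
           "A local festival also took place. Gas producers expect higher oil demand.")
print(summarize("Oil prices rise", article))
```

At a 25% compression rate the four-sentence sample reduces to its single highest-scoring sentence.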
FCA-based Conceptual Knowledge Discovery in Folksonomy
Tagging data (users, tags and resources) constitute a folksonomy, the user-driven, bottom-up approach to organizing and classifying information on the Web. The tagging data stored in a folksonomy contain a great deal of useful information and knowledge. However, an appropriate approach for analyzing tagging data and discovering the hidden knowledge in them remains one of the main open problems in folksonomy mining research. In this paper, we propose a folksonomy data mining approach based on Formal Concept Analysis (FCA) for easily discovering hidden knowledge from a folksonomy. We also demonstrate, through an experiment, how the proposed approach can be applied in a collaborative tagging system. The proposed approach can be applied to interesting areas such as social network analysis and semantic web mining.
A hybrid Tabu Search Algorithm to Cell Formation Problem and its Variants
Cell formation is the first step in the design of cellular manufacturing systems. In this study, a general-purpose computational scheme employing a hybrid tabu search algorithm as its core is proposed to solve the cell formation problem and its variants. The proposed scheme leaves considerable flexibility to the user: the core search algorithm embedded in the scheme can easily be exchanged for any other meta-heuristic, such as simulated annealing or a genetic algorithm, based on the characteristics of the problems to be solved or the preferences the user might have. In addition, several counters are designed to control the timing with which intensified and diversified solution searching strategies are conducted in alternation.
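A tabu-search core in this spirit can be sketched on a toy cell formation instance: machines are assigned to cells, the objective penalises parts processed outside their majority cell plus cell imbalance, and a short tabu list blocks recently reversed moves. The data, cost function, tenure and penalty weight are all invented for the sketch, not the paper's scheme.

```python
# Illustrative tabu search assigning machines to manufacturing cells.
import random
from collections import deque

random.seed(0)
PARTS_BY_MACHINE = {            # machine -> parts it processes (hypothetical)
    0: {0, 1}, 1: {0, 1, 2}, 2: {2, 3}, 3: {3, 4}, 4: {2, 4},
}
N_CELLS = 2

def cost(assign):
    # balance penalty plus count of parts processed outside their majority cell
    total = 3 * abs(assign.count(0) - assign.count(1))
    for part in range(5):
        cells = [assign[m] for m, ps in PARTS_BY_MACHINE.items() if part in ps]
        majority = max(set(cells), key=cells.count)
        total += sum(1 for c in cells if c != majority)
    return total

def tabu_search(iterations=200, tenure=3):
    assign = [random.randrange(N_CELLS) for _ in PARTS_BY_MACHINE]
    best, best_cost = list(assign), cost(assign)
    tabu = deque(maxlen=tenure)
    for _ in range(iterations):
        candidates = []
        for m in PARTS_BY_MACHINE:
            for c in range(N_CELLS):
                if c == assign[m]:
                    continue
                trial = list(assign)
                trial[m] = c
                tc = cost(trial)
                # aspiration: a tabu move is allowed when it beats the best
                if (m, c) not in tabu or tc < best_cost:
                    candidates.append((tc, m, c))
        tc, m, c = min(candidates)
        tabu.append((m, assign[m]))   # forbid undoing this reassignment
        assign[m] = c
        if tc < best_cost:
            best, best_cost = list(assign), tc
    return best, best_cost

solution, value = tabu_search()
print(solution, value)
```

Swapping `tabu_search` for simulated annealing or a GA, while keeping `cost` fixed, mirrors the plug-in flexibility the scheme advertises.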
Region-Based Image Fusion with Artificial Neural Network
Most image fusion algorithms treat the pixels in an image more or less independently and ignore the relationships between them. In addition, their parameters must be re-adjusted for different times of day or weather conditions. In this paper, we propose a region-based image fusion method that combines aspects of feature-level and pixel-level fusion to replace purely pixel-based fusion. The basic idea is to segment only the far-infrared image and to add the information of each region of the segmented image to the visual image. Different fusion parameters are then determined for each region. Finally, an artificial neural network is adopted to deal with varying time and weather, because the relationship between the fusion parameters and the image features is nonlinear; this allows the fusion parameters to be produced automatically for different conditions. The experimental results show that the proposed method indeed has good adaptive capability, with automatically determined fusion parameters, and that the architecture can be used for many applications.
An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference
The general idea behind the filter is to average a pixel using the values of the other pixels in its neighborhood, while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations in the captured digital image caused by noise and those caused by image structure. Edges give an image its appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks. Since most noise filtering behaves like a low-pass filter, the blurring of edges and loss of detail seems a natural consequence, and techniques to remedy this inherent conflict often generate new noise through the enhancement. In this work a new fuzzy filter is presented for the reduction of additive noise in images. The filter consists of three stages: (1) define fuzzy sets in the input space to compute a fuzzy derivative for eight different directions, (2) construct a set of IF-THEN rules to perform fuzzy smoothing according to the contributions of neighboring pixel values, and (3) define fuzzy sets in the output space to obtain the filtered and edge-preserved image. Experimental results show the feasibility of the proposed approach on two-dimensional objects.
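A compact sketch of the idea, on a tiny grayscale image stored as a list of lists, looks as follows. The triangular membership function and its threshold K are illustrative choices, not the paper's rule base; the sketch collapses the three stages into one weighted correction per pixel.

```python
# Fuzzy smoothing sketch: derivatives in eight directions, with large
# derivatives (edges) receiving low membership in the fuzzy set 'small'.
DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def small(value, K=50.0):
    """Fuzzy membership of a derivative in the set 'small' (triangular)."""
    return max(0.0, 1.0 - abs(value) / K)

def fuzzy_smooth(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            num = den = 0.0
            for dy, dx in DIRS:
                d = img[y + dy][x + dx] - img[y][x]  # directional derivative
                mu = small(d)        # edges get low weight, flat areas high
                num += mu * d
                den += mu
            # IF the neighbourhood change is 'small' THEN smooth towards it
            out[y][x] = img[y][x] + (num / den if den else 0.0)
    return out

noisy = [[10, 10, 10, 200, 200],
         [10, 40, 10, 200, 200],   # 40 is an isolated noise spike
         [10, 10, 10, 200, 200]]
smoothed = fuzzy_smooth(noisy)
print([[round(v) for v in row] for row in smoothed])
```

On this sample the isolated spike is averaged away while the 10/200 edge is left untouched, which is exactly the smoothing-versus-edge-preservation trade-off the abstract describes.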
A DCT-Based Secure JPEG Image Authentication Scheme
The challenge in image authentication is that images often need to be subjected to non-malicious operations such as compression, so the authentication techniques need to be compression tolerant. In this paper we propose an image authentication system that is tolerant to JPEG lossy compression. A scheme for JPEG grey-scale images is proposed based on a data embedding method that uses a secret key and a secret mapping vector in the frequency domain. An encrypted feature vector extracted from the image DCT coefficients is embedded redundantly and invisibly in the marked image. On the receiver side, the feature vector is derived again from the received image and compared against the extracted watermark to verify the image's authenticity. The proposed scheme is robust against JPEG compression up to a maximum compression of approximately 80%, but sensitive to malicious attacks such as cutting and pasting.
Toward Community-Based Personal Cloud Computing
This paper proposes a new form of cloud computing for individual computer users to share applications in distributed communities, called community-based personal cloud computing (CPCC), and presents a prototype design and implementation of CPCC. Users of CPCC are able to share their computing applications with other users of the community, and any member of the community can execute remote applications shared by other members. The remote applications behave in the same way as their local counterparts, allowing the user to enter input and receive output, as well as providing access to the user's local data. CPCC provides a peer-to-peer (P2P) environment in which each peer provides applications that can be used by the other peers connected to CPCC.
Evaluation of Chiller Power Consumption Using Grey Prediction
98% of the energy needed in Taiwan is imported, and the prices of petroleum and electricity have been increasing. In addition, facility capacity, the amount of electricity generated, the amount of electricity consumed and the number of Taiwan Power Company customers have continued to increase. For these reasons energy conservation has become an important topic. In the past, linear regression was used to establish power consumption models for chillers. In this study, grey prediction is used to evaluate the power consumption of a chiller so as to lower the total power consumption at peak load (so that the relevant power providers do not need to keep increasing their power generation capacity and facilities). In grey prediction, only a few numerical values (at least four) are needed to establish the power consumption models for chillers. If the PLR, the temperatures of the supply and return chilled water, and the temperatures of the supply and return cooling water are taken into consideration, quite accurate results (with accuracy close to 99% for short-term predictions) may be obtained. Through such methods, we can predict whether the power consumption at peak load will exceed the contract power capacity signed between the corresponding entity and Taiwan Power Company. If the power consumption at peak load exceeds the power demand, the temperature of the supply chilled water may be adjusted so as to reduce the PLR and hence lower the power consumption.
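The "at least four values" remark refers to the classical GM(1,1) grey model, which can be sketched as below: the series is accumulated, a first-order grey differential equation is fitted by least squares, and the next value is forecast. The sample series is invented, not chiller data from the study.

```python
# Minimal GM(1,1) grey-prediction sketch.
import math

def gm11_forecast(x0):
    n = len(x0)
    assert n >= 4, "grey prediction needs at least four values"
    # accumulated generating operation (AGO)
    x1 = [sum(x0[:k + 1]) for k in range(n)]
    # background values z and least-squares fit of dx1/dt + a*x1 = b
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # normal equations for [a, b] with design matrix rows [-z_k, 1]
    szz = sum(v * v for v in z); sz = sum(z); sy = sum(y)
    szy = sum(v * w for v, w in zip(z, y)); m = n - 1
    det = szz * m - sz * sz
    a = (-(szy * m) + sz * sy) / det
    b = (szz * sy - sz * szy) / det
    # time response function, then inverse AGO for the next point
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return x1_hat(n) - x1_hat(n - 1)

series = [520.0, 541.0, 563.2, 586.4]   # e.g. monthly chiller kWh (invented)
print(round(gm11_forecast(series), 1))
```

For this roughly exponential sample the model continues the ~4% growth trend, forecasting about 610 kWh for the next period.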
Fuzzy Control of a Quarter-Car Suspension System
An active suspension system is proposed to improve ride comfort. A quarter-car two-degree-of-freedom (DOF) system is designed and constructed, on the basis of the concept of four-wheel independent suspension, to simulate the actions of an active vehicle suspension system. The purpose of a suspension system is to support the vehicle body and increase ride comfort. The aim of the work described in the paper is to illustrate the application of the fuzzy logic technique to the control of a continuously damping automotive suspension system. Ride comfort is improved by reducing the body acceleration caused by road disturbances from smooth and real road profiles. The paper also describes the model and controller used in the study and discusses the vehicle response results obtained from a range of road input simulations. In the conclusion, a comparison of active suspension fuzzy control and Proportional-Integral-Derivative (PID) control is shown using MATLAB simulations.
Application of Neural Network in User Authentication for Smart Home System
Security has been an important issue and concern in smart home systems. Smart home networks consist of a wide range of wired and wireless devices, so there is a possibility of illegal access to restricted data or devices. Password-based authentication is widely used to identify authorized users because it is cheap, easy and quite accurate. In this paper, a neural network is trained to store the passwords instead of using a verification table; this method is useful in solving security problems that occur in some authentication systems. The conventional way to train the network, backpropagation (BPN), requires a long training time. Hence a faster training algorithm, Resilient Backpropagation (RPROP), is applied to the MLP neural network to accelerate the training process. For the data, 200 sets of user IDs and passwords were created and encoded into binary as the input. Simulations were carried out to evaluate the performance for different numbers of hidden neurons and combinations of transfer functions, with mean square error (MSE), training time and number of epochs used to measure network performance. From the results obtained, using Tansig in the hidden layer, Purelin in the output layer and 250 hidden neurons gave the best performance. As a result, a password-based user authentication system for smart homes using a neural network was developed successfully.
Intelligent Network-Based Stepping Stone Detection Approach
This research introduces a new use of Artificial Intelligence (AI) approaches in the field of Stepping Stone Detection (SSD). Since counting the number of connection chains is one of the important steps of stepping stone detection, and is a current research focus, this research chose the Self-Organizing Map (SOM) as the AI technique because of its capabilities. Using SOM as the engine, the experiment shows that SOM is able to detect the number of connection chains involved in a stepping-stone attack, i.e. to perform Network-based Stepping Stone Detection (NSSD).
Development of a Project Selection Method on Information System Using ANP and Fuzzy Logic
Project selection problems in management information systems (MIS) are often treated as multi-criteria decision-making (MCDM) problems. These problems have two aspects: interdependencies among criteria and candidate projects, and the qualitative and quantitative factors of projects. However, most existing methods reported in the literature consider these aspects separately, even though the two should be incorporated simultaneously. For this reason, we propose a hybrid method using the analytic network process (ANP) and fuzzy logic in order to represent both aspects. We then propose a goal programming model to optimize the project selection problems interpreted under the hybrid concept. Finally, a numerical example is presented as an illustration.
Analysis of Social Network Using Clever Ant Colony Metaphor
A social network is a set of people, organizations or other social entities connected by some form of relationship. Social network analysis broadly covers the visual and mathematical representation of these relationships, and the Web can also be considered a social network. This paper presents an innovative approach to analyzing a social network using a variant of an existing ant colony optimization algorithm, called the Clever Ant Colony Metaphor. Experiments were performed, and interesting findings and observations are reported based on the proposed model.
Secure peerTalk Using PEERT System
Multiparty voice over IP (MVoIP) systems allow a group of people to communicate freely with each other via the Internet, and have many applications such as online gaming, teleconferencing and online stock trading. peerTalk is a peer-to-peer MVoIP system that is more feasible than existing approaches such as P2P overlay multicast and coupled distributed processing. Since the stream mixing and distribution are done by the peers, it is vulnerable to major security threats such as node misbehavior, eavesdropping, Sybil attacks, denial of service (DoS), call tampering and man-in-the-middle attacks. To thwart these security threats, a security framework called PEERTS (PEEred Reputed Trustworthy System for peerTalk) is implemented so that efficient and secure communication can be carried out between peers.
Reducing SAGE Data Using Genetic Algorithms
Serial Analysis of Gene Expression (SAGE) is a powerful quantification technique for generating cell or tissue gene expression data. The gene expression profile of a cell or tissue in several different states is difficult for biologists to analyze because of the large number of genes typically involved. However, feature selection from machine learning can successfully reduce this problem: it reduces the number of features (genes) in SAGE data and retains only the relevant genes. In this study, we used a genetic algorithm to implement feature selection, and evaluated the classification accuracy of the selected features with the K-nearest neighbor method. To validate the proposed method, we used two SAGE data sets for testing. The results of this study show that the number of features of the original SAGE data set can be significantly reduced and higher classification accuracy can be achieved.
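The GA-plus-KNN pipeline can be sketched on toy data: a genetic algorithm evolves a bit-mask over features (genes), and fitness is the leave-one-out accuracy of a 1-nearest-neighbour classifier restricted to the selected features, minus a small penalty per feature. The data, rates and sizes below are invented for illustration, not the SAGE experiment.

```python
# GA feature selection with 1-NN leave-one-out fitness on synthetic data.
import random
random.seed(1)

# 8 samples x 6 features; only features 0 and 1 separate the two classes
DATA = [([1, 1] + [random.random() for _ in range(4)], 0) for _ in range(4)] + \
       [([9, 9] + [random.random() for _ in range(4)], 1) for _ in range(4)]

def fitness(mask):
    idx = [i for i, bit in enumerate(mask) if bit]
    if not idx:
        return 0.0
    correct = 0
    for i, (xi, yi) in enumerate(DATA):       # leave-one-out 1-NN
        dist = lambda other: sum((xi[j] - other[j]) ** 2 for j in idx)
        j = min((k for k in range(len(DATA)) if k != i),
                key=lambda k: dist(DATA[k][0]))
        correct += DATA[j][1] == yi
    return correct / len(DATA) - 0.01 * len(idx)   # prefer fewer genes

def evolve(pop_size=20, generations=30, n_feat=6):
    pop = [[random.randint(0, 1) for _ in range(n_feat)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_feat)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:          # bit-flip mutation
                child[random.randrange(n_feat)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 3))
```

The per-feature penalty is what drives the mask toward a small set of informative genes rather than the full feature set.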
A Novel Prediction Method for Tag SNP Selection using Genetic Algorithm based on KNN
Single nucleotide polymorphisms (SNPs) hold much promise as a basis for disease-gene association. However, research is limited by the cost of genotyping the tremendous number of SNPs. Therefore, it is important to identify a small subset of informative SNPs, the so-called tag SNPs. This subset consists of selected SNPs of the genotypes, and accurately represents the rest of the SNPs. Furthermore, an effective evaluation method is needed to evaluate prediction accuracy of a set of tag SNPs. In this paper, a genetic algorithm (GA) is applied to tag SNP problems, and the K-nearest neighbor (K-NN) serves as a prediction method of tag SNP selection. The experimental data used was taken from the HapMap project; it consists of genotype data rather than haplotype data. The proposed method consistently identified tag SNPs with considerably better prediction accuracy than methods from the literature. At the same time, the number of tag SNPs identified was smaller than the number of tag SNPs in the other methods. The run time of the proposed method was much shorter than the run time of the SVM/STSA method when the same accuracy was reached.
An Images Monitoring System based on Multi-Format Streaming Grid Architecture
This paper proposes a novel multi-format streaming grid architecture for a real-time image monitoring system. The system, based on a three-tier architecture, includes a stream receiving unit, a stream processor unit and a presentation unit. It is a distributed computing and loosely coupled architecture, whose benefit is that the number of required servers can be adjusted according to the load on the image monitoring system. The stream receiving unit supports multiple capture source devices and multi-format stream compression encoders. The stream processor unit includes three modules: a stream clipping module, an image processing module and an image management module. The presentation unit can display image data on several different platforms. We verified the proposed grid architecture with an actual image monitoring test, using a fast image matching method with parameters adjustable for different monitoring situations; a background subtraction method is also implemented in the system. Experimental results showed that the proposed architecture is robust, adaptive and powerful for image monitoring.
Diagnosis of the Abdominal Aorta Aneurysm in Magnetic Resonance Imaging Images
This paper presents a technique for diagnosing abdominal aortic aneurysm in magnetic resonance imaging (MRI) images. First, our technique segments the aorta in MRI images, a required step in determining the volume of the aorta, which is in turn the key step in diagnosing an abdominal aortic aneurysm. Our proposed technique detects the volume of the aorta in MRI images using a new external energy for the snake model, calculated from Laws' texture; the new external energy increases the capture range of the snake model much more efficiently than the old external energies of snake models. Second, our technique diagnoses the abdominal aortic aneurysm with a Bayesian classifier, a classification model based on statistical theory. The features for classifying abdominal aortic aneurysm, namely area, perimeter and compactness, were derived from the contour of the aorta obtained by our snake model segmentation. We also compare the proposed technique with the traditional snake model. In our experiments, 30 images were used for training and 20 images for testing, with the results compared against expert opinion. The experimental results show that our technique achieves an accuracy of more than 95%.
Multi-objective Optimization of Graph Partitioning using Genetic Algorithm
Graph partitioning is an NP-hard problem with multiple conflicting objectives: the partitioning should minimize the inter-partition relationships while maximizing the intra-partition relationships, and furthermore the load should be evenly distributed over the respective partitions. It is therefore a multi-objective optimization (MOO) problem. One approach to MOO is Pareto optimization, which is used in this paper. The methods proposed in this paper to improve performance are injecting the best solutions of previous runs into the first generation of subsequent runs, and storing the non-dominated set of previous generations to combine with later generations' non-dominated sets. These improvements prevent the GA from getting stuck in local optima and increase the probability of finding more optimal solutions. Finally, a simulation study is carried out to investigate the effectiveness of the proposed algorithm, and the simulation results confirm the effectiveness of the proposed method.
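The Pareto machinery the approach relies on, extracting the non-dominated set, can be sketched in a few lines. Each candidate partition is represented here only by its two minimisation objective values (inter-partition edges, load imbalance); the numbers are invented.

```python
# Non-dominated (Pareto) filtering over two minimisation objectives.
def dominates(a, b):
    """a dominates b when a is no worse in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# (inter-partition edges, load imbalance) for candidate partitions
candidates = [(12, 0.4), (9, 0.9), (12, 0.3), (15, 0.1), (9, 1.5)]
front = non_dominated(candidates)
print(sorted(front))
```

Carrying `non_dominated(previous_front + current_generation)` across generations is the archiving step the abstract describes.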
RDFGraph: New Data Modeling Tool for Semantic Web
The emerging Semantic Web has attracted many researchers and developers. New applications have been developed on top of the Semantic Web, and many supporting tools have been introduced to improve its software development process. Metadata modeling is one part of the development process for which supporting tools exist, but the existing tools lack the readability and ease of use that a domain knowledge expert needs to graphically model a problem as a semantic model. In this paper, a metadata modeling tool called RDFGraph is proposed to solve these problems. RDFGraph is also designed to work with modern database management systems that support RDF and to improve the performance of query execution. The testing results show that the rules used in RDFGraph follow the W3C standard and that the graphical model produced by the tool is properly translated and correct.
Equivalence Class Subset Algorithm
The equivalence class subset algorithm is a powerful tool for solving a wide variety of constraint satisfaction problems. It is based on the use of a decision function that has a very high but not perfect accuracy; perfect accuracy is not required, as even a suboptimal solution contains valuable information that can be used to help find an optimal one. In the hardest problems the decision function can break down, leading to a suboptimal solution with more equivalence classes than necessary, which can be viewed as a mixture of good and bad decisions. By choosing a subset of the decisions made in reaching a suboptimal solution, an iterative technique can lead to an optimal solution through a series of steadily improving suboptimal solutions. The goal is to reach an optimal solution as quickly as possible. Various techniques for choosing the decision subset are
A Simulation Method to Find the Optimal Design of Photovoltaic Home System in Malaysia, Case Study: A Building Integrated Photovoltaic in Putra Jaya
Over recent years, the number of building integrated photovoltaic (BIPV) home system installations in Malaysia has been increasing. This paper concerns an analysis, as part of current Research and Development (R&D) efforts, to integrate photovoltaics as an architectural feature of a detached house in the new satellite township of Putrajaya, Malaysia. The analysis was undertaken using calculation and simulation tools to optimize the performance of a BIPV home system. In this study, a simulation analysis was undertaken for selected bungalow units based on long-term recorded weather data for the city of Kuala Lumpur, taking into consideration the PV panels' tilt and direction, shading effects and economic factors. A simulation of the performance of a grid-connected BIPV house in Kuala Lumpur was undertaken; this case study uses 60 PV modules with a power output of 2.7 kW, giving an average PV electricity output of 255 kWh/month.
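A back-of-envelope check of the quoted figures: a 2.7 kW array producing about 255 kWh per month corresponds to roughly 3.7 equivalent peak sun hours per day at an assumed ~85% system performance ratio, which is plausible for Kuala Lumpur. Everything here besides the abstract's 2.7 kW and 255 kWh/month is an assumption.

```python
# Sanity-check arithmetic relating array size, monthly yield and sun hours.
P_KW = 2.7                 # installed array power (from the abstract)
MONTHLY_KWH = 255.0        # reported average output (from the abstract)
PERFORMANCE_RATIO = 0.85   # assumed losses: inverter, temperature, soiling

sun_hours_per_day = MONTHLY_KWH / (P_KW * PERFORMANCE_RATIO * 30)
print(round(sun_hours_per_day, 2))
```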
Evolutionary Eigenspace Learning using CCIPCA and IPCA for Face Recognition
Traditional principal components analysis (PCA)
techniques for face recognition are based on batch-mode training
using a pre-available image set. Real world applications require that
the training set be dynamic of evolving nature where within the
framework of continuous learning, new training images are
continuously added to the original set; this would trigger a costly
continuous re-computation of the eigen space representation via
repeating an entire batch-based training that includes the old and new
images. Incremental PCA methods allow adding new images and
updating the PCA representation. In this paper, two incremental
PCA approaches, CCIPCA and IPCA, are examined and compared.
In addition, different learning and testing strategies are proposed and
applied to the two algorithms. The results suggest that batch PCA is
inferior to both incremental approaches, and that all CCIPCAs are
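For orientation, the core of CCIPCA is a per-sample update of the leading eigenvector estimate. The minimal single-component sketch below (without the amnesic parameter) follows the standard candid covariance-free update; it is illustrative only, not the paper's full multi-component comparison setup.

```python
import numpy as np

def ccipca_update(v, x, n):
    """One CCIPCA step for the leading eigenvector estimate v given sample x.
    The direction of v converges to the first principal component and its
    norm to the corresponding eigenvalue."""
    if n == 1:
        return x.astype(float).copy()  # initialize with the first sample
    u = v / np.linalg.norm(v)
    # running average of x * (x . u): covariance-free power-style iteration
    return (n - 1) / n * v + (1 / n) * x * (x @ u)

# toy usage: zero-mean samples with most variance along the first axis
rng = np.random.default_rng(0)
v = None
for n, x in enumerate(rng.normal(0.0, [3.0, 0.3], size=(500, 2)), start=1):
    v = ccipca_update(v, x, n)
direction = v / np.linalg.norm(v)  # should align with [1, 0] up to sign
```

Because each new image costs one rank-one update instead of a full batch decomposition, this is what makes the continuous-learning setting tractable.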
Analyzing Artificial Emotion in Game Characters Using Soft Computing
This paper describes a simulation model for analyzing artificial emotion injected into the design of game characters. Most game storyboards are interactive in nature, and the virtual characters of the game are equipped with an individual personality and a dynamic emotion value similar to real-life emotion and behavior. The uncertainty in real expression, mood and behavior is also exhibited in the game paradigm, and this is the focus of the present paper, addressed through a fuzzy-logic-based agent and storyboard. Subsequently, a pheromone distribution, or labeling, is presented, mimicking the behavior of social insects.
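To illustrate the kind of fuzzy-logic mapping such an agent performs, the sketch below fuzzifies a stimulus into emotion memberships and defuzzifies them to a single intensity. The membership shapes, labels and values are invented for illustration, not taken from the paper.

```python
# Hedged sketch: a minimal fuzzy mapping from a stimulus score to an emotion
# intensity for a game character. All membership functions are assumptions.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def emotion_response(stimulus):
    """Fuzzify a stimulus in [0, 1] into calm/tense/angry memberships,
    then defuzzify to one aggression value by weighted average."""
    calm  = tri(stimulus, -0.5, 0.0, 0.5)
    tense = tri(stimulus,  0.0, 0.5, 1.0)
    angry = tri(stimulus,  0.5, 1.0, 1.5)
    # centroid-style defuzzification over representative output values
    weights = {0.0: calm, 0.5: tense, 1.0: angry}
    total = sum(weights.values())
    return sum(v * mu for v, mu in weights.items()) / total

level = emotion_response(0.8)  # a mostly tense/angry mix
```

The smooth overlap between membership functions is what produces the gradual, uncertain transitions in mood that the paper attributes to real behavior.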
A Markov Chain Approximation for ATS Modeling for the Variable Sampling Interval CCC Control Charts
The cumulative conformance count (CCC) charts are
widespread in process monitoring of high-yield manufacturing.
Recently, it has been found that the use of a variable sampling
interval (VSI) scheme could further enhance the efficiency of the
standard CCC charts. The average time to signal (ATS) a shift in the
defect rate has become the traditional measure of the efficiency of a
chart with the VSI scheme. Determining the ATS is frequently a
difficult and tedious task. A simple method based on a finite Markov
chain approach for modeling the ATS is developed. In addition,
numerical results are
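The finite Markov chain computation behind an ATS is standard: collect the transient (non-signal) chart states, form their transition sub-matrix Q, and solve a linear system whose right-hand side holds the per-state sampling intervals. The sketch below uses an invented two-state example, not the paper's actual chain.

```python
import numpy as np

def average_time_to_signal(Q, intervals):
    """ATS for an absorbing Markov chain.

    Q: transition probabilities among transient (non-signal) states.
    intervals[i]: sampling interval used while in state i (the VSI part).
    Solving (I - Q) t = h gives the expected time to absorption (signal)
    starting from each transient state.
    """
    Q = np.asarray(Q, dtype=float)
    h = np.asarray(intervals, dtype=float)
    return np.linalg.solve(np.eye(len(h)) - Q, h)

# toy two-state VSI example (assumed numbers):
# state 0 uses a long interval, state 1 a short one
Q = [[0.6, 0.3],   # 10% chance of signalling from state 0
     [0.4, 0.4]]   # 20% chance of signalling from state 1
ats = average_time_to_signal(Q, intervals=[2.0, 0.5])
```

One linear solve replaces the tedious enumeration of sample paths, which is exactly the simplification a Markov chain approximation buys.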
Segmentation of Breast Lesions in Ultrasound Images Using Spatial Fuzzy Clustering and Structure Tensors
Segmentation in ultrasound images is challenging due to interference from speckle noise and the fuzziness of boundaries. In this paper, a segmentation scheme using fuzzy c-means (FCM) clustering that incorporates both the intensity and texture information of images is proposed to extract breast lesions in ultrasound images. Firstly, the nonlinear structure tensor, which can help refine the edges detected by intensity, is used to extract speckle texture. Then, a spatial FCM clustering is applied to the image feature space for segmentation. In experiments with simulated and clinical ultrasound images, the spatial FCM clustering with both intensity and texture information yields more accurate results than the conventional FCM or spatial FCM without texture information.
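As background, plain fuzzy c-means alternates between membership and centre updates; the paper's spatial variant adds texture features from the structure tensor and spatial regularization on top of this core loop. Below is a minimal intensity-only sketch of the core loop, illustrative rather than the proposed scheme.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means on feature vectors X of shape (n_samples, n_features).
    Returns the membership matrix U (n_samples, c) and the cluster centres."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # rows are fuzzy memberships
    for _ in range(iters):
        W = U ** m                              # fuzzified membership weights
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))        # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centres

# toy "image": bright and dark intensity samples (texture channel omitted)
X = np.array([[0.1], [0.15], [0.2], [0.8], [0.85], [0.9]])
U, centres = fcm(X)
labels = U.argmax(axis=1)
```

In the paper's setting each pixel's feature vector would also carry the structure-tensor texture channels, so the same update equations cluster in a higher-dimensional space.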
A New Heuristic Approach for the Stock-Cutting Problems
This paper addresses a stock-cutting problem with rotation of items and without the guillotine cutting constraint. In order to solve the large-scale problem effectively and efficiently, we propose a simple but fast heuristic algorithm. It is shown that this heuristic outperforms the latest published algorithms for large-scale problem instances.
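For context, a simple baseline for non-guillotine strip packing with 90-degree rotation is a shelf-style first-fit heuristic. The sketch below is such a generic baseline, not the heuristic proposed in the paper.

```python
# Hedged sketch: shelf (level-oriented) first-fit-decreasing packing of
# rectangles into a strip of fixed width, allowing 90-degree rotation.
# Assumes every item fits in the strip in at least one orientation.

def shelf_pack(items, strip_width):
    """items: list of (w, h) rectangles; returns the used strip height."""
    # sort by the larger side, descending, so big pieces open shelves first
    items = sorted(items, key=lambda wh: max(wh), reverse=True)
    shelves = []  # each shelf: [remaining_width, shelf_height]
    for w, h in items:
        placed = False
        for shelf in shelves:
            for pw, ph in ((w, h), (h, w)):  # try both orientations
                if pw <= shelf[0] and ph <= shelf[1]:
                    shelf[0] -= pw
                    placed = True
                    break
            if placed:
                break
        if not placed:
            # open a new shelf, laying the item on its wider side
            pw, ph = max(w, h), min(w, h)
            shelves.append([strip_width - pw, ph])
    return sum(ph for _, ph in shelves)

height = shelf_pack([(4, 2), (3, 2), (2, 2), (5, 1)], strip_width=6)
```

Heuristics of this family run in near-linear time, which is why they serve as natural baselines for the large-scale instances the paper targets.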
A Proxy Multi-Signature Scheme with Anonymous Vetoable Delegation
Frequently a group of people jointly decide and authorize
a specific person as a representative in some business/political
occasions, e.g., the board of a company authorizes the chief executive
officer to close a multi-billion acquisition deal. In this paper, an
integrated proxy multi-signature scheme that allows anonymously
vetoable delegation is proposed. This protocol integrates mechanisms
of private veto, distributed proxy key generation, secure transmission
of proxy key, and existentially unforgeable proxy multi-signature
scheme. First, a provably secure Guillou-Quisquater proxy signature
scheme is presented, then the "zero-sharing" protocol is extended
over a composite modulus multiplicative group, and finally the above
two are combined to realize the GQ proxy multi-signature with
anonymously vetoable delegation. As a proxy signature scheme, this
protocol protects both the original signers and the proxy signer.
The modular design allows simplified implementation with less
communication overheads and better computation performance than
a general secure multi-party protocol.
Finding Fuzzy Association Rules Using FWFP-Growth with Linguistic Supports and Confidences
In data mining, association rules are used to search
for relations among the items of a transaction database. Once the
data is collected and stored, rules of value can be found through
association rules, assisting managers in developing marketing
strategies and planning the market framework. In this paper, we apply
fuzzy partition methods and decide the membership functions of the
quantitative values of each transaction item. In addition, managers
can reflect the importance of items as linguistic terms, which are
transformed into fuzzy sets of weights. Next, fuzzy weighted frequent
pattern growth (FWFP-Growth) is used to complete the data mining
process. The method above is expected to improve on the Apriori
algorithm with better efficiency over the whole set of association
rules. An example is given to clearly illustrate the proposed
approach.
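The quantity such a method mines is fuzzy weighted support: each transaction contributes the membership of its purchased quantity in a linguistic term, scaled by the item's importance weight. The sketch below uses invented membership functions, items and weights purely for illustration.

```python
# Hedged sketch: fuzzy weighted support of a (item, linguistic term) pair.
# Membership shapes and importance weights are assumptions; in the paper they
# come from fuzzy partitions and manager-supplied linguistic terms.

def low_mid_high(q):
    """Toy fuzzy partition of a purchased quantity into three memberships."""
    low = max(0.0, min(1.0, (5 - q) / 5))
    high = max(0.0, min(1.0, (q - 5) / 5))
    mid = 1.0 - low - high
    return {"low": low, "mid": mid, "high": high}

def fuzzy_weighted_support(transactions, item, term, weights,
                           memberships=low_mid_high):
    """Average membership of (item, term) over transactions, scaled by
    the item's linguistic-importance weight."""
    total = sum(memberships(t.get(item, 0))[term] for t in transactions)
    return weights[item] * total / len(transactions)

transactions = [{"bread": 6, "milk": 2}, {"bread": 3}, {"milk": 8}]
weights = {"bread": 0.9, "milk": 0.6}   # assumed importance weights
s = fuzzy_weighted_support(transactions, "milk", "high", weights)
```

FWFP-Growth then keeps only the fuzzy items whose weighted support clears a threshold and grows frequent patterns from them, avoiding the repeated candidate generation that makes Apriori costly.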