Detecting Community Structure in Amino Acid Interaction Networks
In this paper we introduce the notion of a protein interaction network: a graph whose vertices are the protein's amino acids and whose edges are the interactions between them. Using a graph-theoretic approach, we observe that nodes interact differently according to their structural roles. By performing community structure detection, we confirm this specific behavior and describe the composition of the communities, and finally propose a new approach to folding a protein interaction network.
Using Latin Routers for Wavelength Routing with a Configuration Algorithm
Optical networks use a routing device called a Latin router, and these routers run particular routing algorithms. In this paper, we present an algorithm for configuring an optical network that is optimized with respect to a previous algorithm. We show that decreasing the number of hops between source and destination in a lightpath reduces the number of satisfied requests, and that multi-hop lightpaths perform better than single-hop lightpaths.
A New Distribution Network Reconfiguration Approach using a Tree Model
Power loss reduction is one of the main targets in the power industry, so in this paper the problem of finding the optimal configuration of a radial distribution system for loss reduction is considered. Optimal reconfiguration involves selecting the best set of branches to be opened, one from each loop, to reduce resistive line losses and relieve overloads on feeders by shifting load to adjacent feeders. However, since there are many candidate switching combinations in the system, feeder reconfiguration is a complicated problem. In this paper a new approach is proposed, based on a simple optimum loss calculation obtained by determining optimal trees of the given network. From graph theory, a distribution network can be represented as a graph consisting of a set of nodes and branches; the problem can then be viewed as determining an optimal tree of the graph, which simultaneously ensures the radial structure of each candidate topology. In this method a refined genetic algorithm is also set up, with improvements made to the chromosome coding. An implementation of a previously published algorithm, modified in the load flow program, is also applied and compared with the proposed method. That algorithm chooses the switches to be opened using simple heuristic rules; it reduces the number of load flow runs and the number of switching combinations while still giving the optimum solution. To demonstrate the validity of these methods, computer simulations with PSAT and MATLAB are carried out on a 33-bus test system. The results show that the proposed method performs better than the reference method as well as other methods.
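The tree-based idea in this abstract can be sketched on a toy network: enumerate the spanning trees of a small meshed distribution graph, compute the resistive loss of each radial configuration, and keep the best. This is a minimal brute-force stand-in for the genetic algorithm; the 4-node network, resistances and load currents are illustrative, not taken from the 33-bus system.

```python
import itertools

# Toy distribution network: node 0 is the substation.
# Edges are (u, v, resistance_ohms); loads draw current (amps) at nodes 1-3.
edges = [(0, 1, 0.5), (0, 2, 0.8), (1, 2, 0.4), (1, 3, 0.6), (2, 3, 0.3)]
load = {1: 10.0, 2: 8.0, 3: 6.0}
nodes = {0, 1, 2, 3}

def is_spanning_tree(subset):
    # n-1 edges plus full connectivity => a radial (tree) topology
    if len(subset) != len(nodes) - 1:
        return False
    adj = {}
    for u, v, _ in subset:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, frontier = {0}, [0]
    while frontier:
        x = frontier.pop()
        for y in adj.get(x, []):
            if y not in seen:
                seen.add(y)
                frontier.append(y)
    return seen == nodes

def tree_loss(subset):
    # Branch current = sum of load currents downstream of the branch;
    # total loss = sum of I^2 * R over all branches.
    adj = {}
    for u, v, r in subset:
        adj.setdefault(u, []).append((v, r))
        adj.setdefault(v, []).append((u, r))
    loss = 0.0
    def downstream(node, parent):
        nonlocal loss
        total = load.get(node, 0.0)
        for nxt, r in adj.get(node, []):
            if nxt != parent:
                i = downstream(nxt, node)
                loss += i * i * r
                total += i
        return total
    downstream(0, None)
    return loss

best = min((s for s in itertools.combinations(edges, len(nodes) - 1)
            if is_spanning_tree(s)), key=tree_loss)
print(sorted((u, v) for u, v, _ in best), round(tree_loss(best), 2))
```

A genetic algorithm replaces the exhaustive enumeration once the number of loops makes brute force infeasible, but the fitness function is exactly this kind of loss calculation.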
Evolutionary Computation Technique for Solving Riccati Differential Equation of Arbitrary Order
In this article an evolutionary technique has been used for the solution of nonlinear Riccati differential equations of fractional order. In this method, a genetic algorithm is used as a competent global search tool, hybridized with an active-set algorithm for efficient local search. The proposed method has been successfully applied to solve different forms of Riccati differential equations. The strength of the proposed method lies in its equal applicability to the integer-order case as well as the fractional-order case. The method has been compared with standard numerical techniques as well as with analytic solutions. It is found that the designed method can solve the equation with better accuracy than its deterministic counterparts. Another advantage of the given approach is that it provides results on the entire finite continuous domain, unlike other numerical methods, which provide solutions only on a discrete grid of points.
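The evolutionary idea can be sketched on the integer-order test problem y'(t) = 1 - y(t)^2, y(0) = 0 (exact solution tanh t). A simple (1+1)-style evolutionary search over polynomial coefficients stands in here for the GA + active-set hybrid; the polynomial ansatz, collocation grid and all tuning parameters are illustrative.

```python
import random
random.seed(0)

# Candidate solutions are polynomials y(t) = c1*t + c2*t^2 + c3*t^3 + c4*t^4,
# so the initial condition y(0) = 0 holds by construction.
grid = [i / 10 for i in range(11)]          # collocation points on [0, 1]

def residual(c):
    # Sum of squared ODE residuals (y' - (1 - y^2)) over the grid:
    # this is the fitness function the evolutionary search minimizes.
    err = 0.0
    for t in grid:
        y = sum(ck * t ** (k + 1) for k, ck in enumerate(c))
        dy = sum((k + 1) * ck * t ** k for k, ck in enumerate(c))
        err += (dy - (1 - y * y)) ** 2
    return err

# (1+1) evolutionary search: mutate the incumbent, keep improvements.
# The shrinking mutation step mimics the global-to-local search handover.
best = [random.uniform(-1, 1) for _ in range(4)]
best_err = residual(best)
step = 0.5
for gen in range(2000):
    child = [c + random.gauss(0, step) for c in best]
    e = residual(child)
    if e < best_err:
        best, best_err = child, e
    step = max(step * 0.999, 0.01)
print(round(best_err, 4))
```

Note that the fitted polynomial gives a solution on the whole interval [0, 1], not just on grid points, which is the continuous-domain advantage claimed in the abstract.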
Defining Programming Problems as Learning Objects
Standards for learning objects focus primarily on content presentation. They have already been extended to support automatic evaluation, but only for exercises with a predefined set of answers. The existing standards lack the metadata required by specialized evaluators to handle types of exercises with an indefinite set of solutions. To address this issue, existing learning object standards were extended to the particular requirements of a specialized domain. A definition of programming problems as learning objects, compatible both with Learning Management Systems and with systems performing automatic evaluation of programs, is presented in this paper. The proposed definition includes metadata that cannot be conveniently represented using existing standards, such as the type of automatic evaluation, the requirements of the evaluation engine, and the roles of different assets (test cases, program solutions, etc.). The EduJudge project and its main services are also presented as a case study on the use of the proposed definition of programming problems as learning objects.
A New History Based Method to Handle the Recurring Concept Shifts in Data Streams
Recent developments in storage technology and networking architectures have made it possible for broad areas of applications to rely on data streams for quick response and accurate decision making. Data streams are generated from real-world events, so it is natural for the associations that exist among the occurrences of these events to appear among the concepts of a data stream. Extracting these hidden associations can be useful for predicting subsequent concepts in concept-shifting data streams. In this paper we present a new method for learning associations among the concepts of a data stream and predicting what the next concept will be. Knowing the next concept, an informed update of the data model becomes possible. The results of the conducted experiments show that the proposed method is suitable for the classification of concept-shifting data streams.
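The history-based idea of learning associations between recurring concepts can be sketched as a first-order transition table over the observed concept sequence; this is a minimal stand-in for the paper's method, and the concept labels and history are illustrative.

```python
from collections import defaultdict

# Observed history of active concepts in a stream (e.g. recurring
# seasonal contexts). We count which concept follows which, then
# predict the most frequent successor of the current concept.
history = ["A", "B", "A", "B", "A", "B", "C", "A", "B", "A", "B", "C"]

transitions = defaultdict(lambda: defaultdict(int))
for cur, nxt in zip(history, history[1:]):
    transitions[cur][nxt] += 1

def predict_next(concept):
    # Most frequently observed successor, or None if unseen.
    followers = transitions[concept]
    if not followers:
        return None
    return max(followers, key=followers.get)

print(predict_next("A"), predict_next("B"), predict_next("C"))
```

Knowing the likely next concept, a classifier could be updated in advance, e.g. by reloading the model previously trained for that concept.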
Order Penetration Point Location using Fuzzy Quadratic Programming
This paper addresses one of the most important issues considered in hybrid MTS/MTO production environments. To cope with the problem, a mathematical programming model is applied from a tactical point of view. The model is converted to a fuzzy goal programming model, because a degree of uncertainty is involved in the hybrid MTS/MTO context. Finally, the application of the proposed model in an industrial center is reported and the results confirm the validity of the model.
A Model Driven Based Method for Scheduling Analysis and HW/SW Partitioning
Unified Modeling Language (UML) extensions for real-time embedded systems (RTES) co-design are attracting growing interest from many industrial and research communities. The extension mechanism is provided by UML profiles for RTES and aims at offering an easily understood system design method for non-experts. On the other hand, one of the key items of co-design methods is hardware/software partitioning and task scheduling: it is mandatory to define where and when tasks are implemented and run. Unfortunately, the main goals of co-design are not included in the usual practice of UML profiles, so there is a need for mapping the used models to an execution platform for both schedulability testing and HW/SW partitioning. In the present work, schedulability testing and design space exploration are performed at an early stage. The proposed approach adopts Model Driven Engineering (MDE): it starts from a UML specification annotated with the recent profile for the Modeling and Analysis of Real Time Embedded systems (MARTE). Following a refinement strategy, transformation rules make it possible to find a feasible schedule that satisfies the timing constraints and to define where tasks will be implemented. The overall approach is demonstrated on the design of a football-playing robot application.
GeNS: a Biological Data Integration Platform
The scientific achievements coming from molecular biology depend greatly on the capability of computational applications to analyze laboratory results. A comprehensive analysis of an experiment typically requires studying the obtained dataset together with data available in several distinct public databases. Nevertheless, developing centralized access to these distributed databases raises a set of challenges: what is the best integration strategy, how to resolve nomenclature clashes, how to handle data overlapping between databases, and how to deal with huge datasets. In this paper we present GeNS, a system that uses a simple yet innovative approach to address several biological data integration issues. Compared with existing systems, the main advantages of GeNS are its maintenance simplicity and its coverage and scalability in terms of the number of supported databases and data types. To support our claims we present the current use of GeNS in two concrete applications. GeNS currently contains more than 140 million biological relations and can be publicly downloaded or remotely accessed through SOAP web services.
Defect Prevention and Detection of DSP-software
Users now expect a higher level of DSP (Digital Signal Processing) software quality than ever before. Prevention and detection of defects are critical elements of software quality assurance. In this paper, principles and rules for the prevention and detection of defects are suggested; they are not universal guidelines, but are useful for both novice and experienced DSP software developers.
A Discrete-Event-Simulation Approach for Logistic Systems with Real Time Resource Routing and VR Integration
Today, transport and logistic systems are often tightly integrated into production. Lean production and just-in-time delivery create multiple constraints that have to be fulfilled. As transport networks have often evolved over time, they are very expensive to change. This paper describes a discrete-event simulation system which simulates transportation models using real-time resource routing and collision avoidance. It allows for the specification of custom control algorithms and the validation of new strategies. The simulation is integrated into a virtual reality (VR) environment and can be displayed in 3-D to show its progress. Simulation elements can be selected through VR metaphors. All data gathered during the simulation can be presented afterwards as a detailed summary. The included cost-benefit calculation can help optimize the financial outcome. The operation of this approach is shown using the example of a timber harvest simulation.
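The discrete-event core of such a simulator can be sketched with a time-ordered event queue: events are scheduled with a timestamp and processed in temporal order. The transporter names and travel times below are illustrative, not taken from the described system.

```python
import heapq

# Minimal discrete-event kernel: a priority queue ordered by event time.
# A sequence counter breaks ties so simultaneous events process stably.
events = []
seq = 0
log = []

def schedule(time, description):
    global seq
    heapq.heappush(events, (time, seq, description))
    seq += 1

# Two transporters on routes of different lengths (times in minutes).
schedule(0.0, "T1 departs depot")
schedule(0.0, "T2 departs depot")
schedule(12.5, "T1 arrives at mill")
schedule(8.0, "T2 arrives at loading site")

# Event loop: pop the earliest event; a real simulator would run the
# event's handler here, possibly scheduling further events (routing
# decisions, collision checks, load/unload completions).
while events:
    t, _, what = heapq.heappop(events)
    log.append((t, what))

print(log)
```

Real-time resource routing fits this pattern naturally: each routing decision is just another handler that, when executed, schedules the resulting arrival events.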
A New Model of English-Vietnamese Bilingual Information Retrieval System
In this paper, we propose a new model of an English-Vietnamese bilingual Information Retrieval system. Although many CLIR systems have been researched and built, the accuracy of search results in the different languages that a CLIR system supports still needs to improve, especially in finding bilingual documents. The problems identified in this paper are the limitations of machine translation results and the extremely large collections of documents to be searched. We therefore propose a different model to overcome these problems.
Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project
Mammographic image and data analysis to facilitate modelling or computer-aided diagnostic (CAD) software development is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction for research could be performed from a single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the possible future application of the data in CAD processing. Envisaged future developments include an advanced search function to select image files based on descriptor combinations, whose results can then be used for specific CAD processing and other research. A user-friendly configuration utility for importing the required fields from the DICOM files remains to be designed.
Cloud Computing: Changing Cogitation about Computing
Cloud Computing is a new technology that lets us use the Cloud to meet our computation needs. The Cloud refers to a scalable network of computers that work together like the Internet. An important element of Cloud Computing is that we shift the processing, management, storage and implementation of our data from local machines into the Cloud, which helps improve efficiency. Because it is a new technology, it has both advantages and disadvantages, which are scrutinized in this article. Some vanguards of this technology are then studied. We conclude that Cloud Computing will play an important role in our future lives.
Optimal Multilayer Perceptron Structure For Classification of HIV Sub-Type Viruses
The features of the HIV genome vary over a wide range because it is highly heterogeneous; hence, the infection ability of the virus changes with different chemokine receptors. R5 and X4 HIV viruses use the CCR5 and CXCR4 coreceptors respectively, while R5X4 viruses can utilize both coreceptors. Recently, bioinformatics studies have attempted to classify R5X4 viruses by the coreceptor usage of the HIV genome. The aim of this study is to develop the optimal Multilayer Perceptron (MLP) for high classification accuracy of HIV sub-type viruses. To accomplish this, the number of units in the hidden layer was incremented one by one, from one up to a particular number. The statistical data of the R5X4, R5 and X4 viruses were preprocessed by signal processing methods: accessible residues of these virus sequences were extracted and modeled by an Auto-Regressive (AR) model, because the dimensionality of the residues is large and differs between sequences. Finally, the preprocessed dataset was used to train MLPs with various numbers of hidden units to identify R5X4 viruses. Furthermore, ROC analysis was used to determine the optimal MLP structure.
SeqWord Gene Island Sniffer: a Program to Study the Lateral Genetic Exchange among Bacteria
SeqWord Gene Island Sniffer, a new program for the identification of mobile genetic elements in sequences of bacterial chromosomes, is presented. The program is based on the analysis of oligonucleotide usage variations in DNA sequences. 3,518 mobile genetic elements were identified in 637 bacterial genomes and further analyzed by sequence similarity and the functionality of the encoded proteins. The results of this study are stored in an open database (http://anjie.bi.up.ac.za/geidb/geidbhome.php). The developed computer program and the database provide information valuable for further investigation of the distribution of mobile genetic elements and virulence factors among bacteria. The program is available for download at www.bi.up.ac.za/SeqWord/sniffer/index.html.
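The general oligonucleotide-usage idea behind such tools can be sketched as follows: slide a window along the genome, compare the window's k-mer composition with the genome-wide composition, and flag windows that deviate strongly as candidate horizontally acquired regions. This is an illustrative sketch of the principle, not SeqWord's actual algorithm; the sequence, window size and k are made up.

```python
from collections import Counter

# Synthetic "genome": an AT-balanced background with a GC-rich insert
# in the middle, mimicking a compositionally atypical genomic island.
genome = ("ATGCGATACGCTTAGC" * 20) + ("GGGGCCCCGGGGCCCC" * 5) + ("ATGCGATACGCTTAGC" * 20)
K, WIN = 2, 40

def kmer_freqs(seq):
    counts = Counter(seq[i:i + K] for i in range(len(seq) - K + 1))
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

global_f = kmer_freqs(genome)

def deviation(window):
    # L1 distance between local and genome-wide k-mer composition
    local = kmer_freqs(window)
    keys = set(global_f) | set(local)
    return sum(abs(local.get(k, 0.0) - global_f.get(k, 0.0)) for k in keys)

scores = [(i, deviation(genome[i:i + WIN]))
          for i in range(0, len(genome) - WIN + 1, WIN)]
peak = max(scores, key=lambda s: s[1])
print(peak)
```

The highest-scoring window falls inside the GC-rich insert; real tools refine this with longer oligonucleotides, overlapping windows and statistical significance testing.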
Flow Discharge Determination in Straight Compound Channels Using ANNs
Although many researchers have studied flow hydraulics in compound channels, determining their flow rating curves still poses many complicated problems. Many different methods have been presented for these channels, but extending them to all types of compound channels with different geometrical and hydraulic conditions is certainly difficult. In this study, with the aid of nearly 400 laboratory and field data sets of geometry and flow rating curves from 30 different straight compound sections, flow discharge in compound channels was estimated using artificial neural networks (ANNs). Thirteen dimensionless input variables, including relative depth, relative roughness, relative width, aspect ratio, bed slope, main channel side slopes, flood plain side slopes and berm inclination, and one output variable (flow discharge) were used in the ANNs. Comparison of the ANN model with the traditional divided channel method (DCM) shows the high accuracy of the ANN results. Sensitivity analysis showed that relative depth, with a 47.6 percent contribution, is the most effective input parameter for flow discharge prediction. Relative width and relative roughness contribute 19.3 and 12.2 percent, respectively, while the shape parameter, main channel side slopes and flood plain side slopes, with contributions of 2.1, 3.8 and 3.8 percent, are the least important.
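The traditional baseline mentioned above, the divided channel method, can be sketched directly: the compound section is split into main channel and flood plain subsections, Manning's equation is applied to each, and the subsection discharges are summed. The geometry, roughness coefficients and slope below are illustrative values, not from the paper's data sets.

```python
import math

def manning_discharge(area_m2, wetted_perimeter_m, n, slope):
    # Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2),
    # with hydraulic radius R = A / P (SI units).
    r = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * r ** (2.0 / 3.0) * math.sqrt(slope)

slope = 0.001
# (area, wetted perimeter, Manning n) per subsection: the flood plains
# are shallower and rougher than the main channel.
subsections = [
    (12.0, 10.0, 0.030),   # main channel
    (4.0, 8.0, 0.050),     # left flood plain
    (4.0, 8.0, 0.050),     # right flood plain
]
q_total = sum(manning_discharge(a, p, n, slope) for a, p, n in subsections)
print(round(q_total, 2), "m^3/s")
```

The DCM's weakness, which motivates the ANN approach, is that the vertical division planes ignore momentum transfer between the main channel and the flood plains.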
Community Detection-based Analysis of the Human Interactome Network
The study of proteomics has reached unexpected levels of interest as a direct consequence of its discovered influence over some complex biological phenomena, such as problematic diseases like cancer. This paper presents a new technique that allows for an accurate analysis of the human interactome network. It is basically a two-step analysis process: first, each protein's absolute importance is detected through betweenness centrality computation; second, the functionally related communities of proteins are determined, using a community detection technique based on edge betweenness calculation. The new technique was thoroughly tested on real biological data, and the results reveal some interesting properties of the proteins involved in the carcinogenesis process. Apart from its experimental usefulness, the novel technique is also computationally effective in terms of execution time. Based on the analysis results, some topological features of cancer-mutated proteins are presented and a possible optimization solution for cancer drug design is suggested.
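The first analysis step, ranking nodes by betweenness centrality, can be computed with Brandes' algorithm; the same dependency accumulation, applied to edges, drives Girvan-Newman-style community detection. Below is a self-contained implementation for unweighted graphs, with a toy star graph standing in for the interactome.

```python
from collections import deque

def betweenness(adj):
    # Brandes' algorithm for unweighted graphs: one BFS per source,
    # counting shortest paths, then back-propagating dependencies.
    bc = {v: 0.0 for v in adj}
    for s in adj:
        dist = {s: 0}
        sigma = {v: 0.0 for v in adj}   # number of shortest s->v paths
        sigma[s] = 1.0
        preds = {v: [] for v in adj}
        order = []
        q = deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):       # accumulate dependencies
            for v in preds[w]:
                delta[v] += (sigma[v] / sigma[w]) * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# Star graph: the hub lies on every shortest path between leaves,
# so it gets maximal betweenness -- the role a cancer-related hub
# protein plays in the interactome analysis above.
adj = {"hub": ["a", "b", "c", "d"],
       "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"]}
scores = betweenness(adj)
print(max(scores, key=scores.get))
```

For the community step, Girvan-Newman repeatedly removes the edge with the highest edge betweenness and takes the resulting connected components as communities.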
The Semantic Web: a New Approach for Future World Wide Web
The purpose of Semantic Web research is to transform the Web from a linked document repository into a distributed knowledge base and application platform, thus allowing the vast range of available information and services to be exploited more efficiently. As a first step in this transformation, languages such as OWL have been developed. Although fully realizing the Semantic Web still seems some way off, OWL has already been very successful and has rapidly become a de facto standard for ontology development in fields as diverse as geography, geology, astronomy, agriculture, defence and the life sciences. The aim of this paper is to classify the key concepts of the Semantic Web and to introduce a new practical approach which uses these concepts to outperform the World Wide Web.
Concurrency in Web Access Patterns Mining
Web usage mining is an interesting application of data
mining which provides insight into customer behaviour on the Internet. An important technique to discover user access and navigation trails is based on sequential patterns mining. One of the
key challenges for web access patterns mining is tackling the problem
of mining richly structured patterns. This paper proposes a novel
model called Web Access Patterns Graph (WAP-Graph) to represent all of the access patterns from web mining graphically. WAP-Graph
also motivates the search for new structural relation patterns, i.e. Concurrent Access Patterns (CAP), to identify and predict more
complex web page requests. Corresponding CAP mining and modelling methods are proposed and shown to be effective in the
search for and representation of concurrency between access patterns
on the web. From experiments conducted on large-scale synthetic
sequence data as well as real web access data, it is demonstrated that
CAP mining provides a powerful method for structural knowledge discovery, which can be visualised through the CAP-Graph model.
Description and Analysis of Embedded Firewall Techniques
Around the turn of this century, many researchers started showing interest in Embedded Firewall (EF) implementations. These are not the usual firewalls used as checkpoints at network gateways; rather, they are applied near the hosts that need protection. By using them, individual or grouped network components can be protected from the inside as well as from external attacks. This paper presents a study of EFs, looking at their architecture and problems. A comparative study assesses how practical each kind is, focusing in particular on the architecture, weak points and portability of each kind. A look at their use by different categories of users is also presented.
Impact of Fair Share and its Configurations on Parallel Job Scheduling Algorithms
To provide a better understanding of the fair share policies supported by current production schedulers and their impact on scheduling performance, a relative fair share policy supported by four well-known production job schedulers is evaluated in this study. The experimental results show that fair share indeed prevents heavy-demand users from dominating the system resources. However, the detailed per-user performance analysis shows that some types of users may suffer unfairness under fair share, possibly due to the priority mechanisms used by current production schedulers. These users are typically not heavy-demand users, but they submit a mixture of jobs that are not spread out over time.
VoIP Networks Performance Analysis with Encryption Systems
VoIP networks, as an alternative to the traditional PSTN system, have been implemented in a wide variety of structures with multiple protocols, codecs, and software- and hardware-based distributions. The use of cryptographic techniques lets users communicate securely, but the achievable throughput as well as the QoS parameters are affected by the algorithm used. This paper analyzes VoIP throughput and QoS parameters under different commercial encryption methods. The measurement-based approach uses lab scenarios to simulate LAN and WAN environments. Security mechanisms such as TLS, SIAX2, SRTP, IPSEC and ZRTP are analyzed with the μ-LAW and GSM codecs.
Parametric Primitives for Hand Gesture Recognition
Imitation learning is considered an effective way of teaching humanoid robots, and action recognition is the key step in imitation learning. In this paper an online algorithm to recognize parametric actions with object context is presented. Objects are key instruments in understanding an action when there is uncertainty: ambiguities arising in similar actions can be resolved with object context. We classify actions according to the changes they make to the object space; actions that produce the same state change in the object movement space are classified as belonging to the same class. This allows us to define several classes of actions where the members of each class share a semantic interpretation.
Improving Spatiotemporal Change Detection: A High Level Fusion Approach for Discovering Uncertain Knowledge from Satellite Image Database
This paper investigates the problem of tracking spatiotemporal changes in a satellite image through the use of Knowledge Discovery in Databases (KDD). The purpose of this study is to help a given user effectively discover interesting knowledge and then build prediction and decision models. Unfortunately, the KDD process for spatiotemporal data is always marked by several types of imperfection. In this paper, we take these imperfections into consideration in order to provide more accurate decisions. To achieve this objective, different KDD methods are used to discover knowledge in satellite image databases. Each method presents a different point of view of the spatiotemporal evolution of a query model (which represents an object extracted from a satellite image). To combine these methods, we use evidence fusion theory, which considerably improves the spatiotemporal knowledge discovery process and increases our belief in the spatiotemporal model change. Experimental results on satellite images of the region of Auckland, New Zealand, show improved overall change detection compared with classical methods.
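The evidence fusion step typically rests on Dempster's rule of combination: mass functions over subsets of the frame of discernment, one per KDD method, are multiplied pairwise and the conflicting mass is renormalized away. The sketch below illustrates the rule itself; the hypotheses and mass values are illustrative, not from the paper.

```python
def combine(m1, m2):
    # Dempster's rule: multiply masses of every pair of focal sets;
    # mass on empty intersections is conflict, renormalized away.
    fused = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    scale = 1.0 - conflict
    return {k: v / scale for k, v in fused.items()}

# Two KDD methods assign belief masses to hypotheses about a region:
CHANGE, STABLE = frozenset({"change"}), frozenset({"stable"})
BOTH = CHANGE | STABLE                    # total ignorance
m1 = {CHANGE: 0.6, BOTH: 0.4}             # method 1: leans "change"
m2 = {CHANGE: 0.5, STABLE: 0.2, BOTH: 0.3}
fused = combine(m1, m2)
print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})
```

Note how the fused mass on "change" exceeds what either method assigned alone, which is the sense in which fusion "increases our belief in the spatiotemporal model change".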
Calculus-based Runtime Verification
In this paper, a uniform calculus-based approach is provided for synthesizing monitors that check, at runtime, correctness properties specified in a large variety of logics, including future- and past-time logics, interval logics, state machines and parameterized temporal logics. We present a calculus mechanism to synthesize monitors from the logical specification for the incremental analysis of execution traces during testing and real runs. The monitor detects both good and bad prefixes of a particular kind, namely those that are informative for the property under investigation. We elaborate the procedure of the calculus.
Comparative Evaluation of Color-Based Video Signatures in the Presence of Various Distortion Types
The robustness of color-based signatures in the presence of a selection of representative distortions is investigated. Considered are five signatures that have been developed and evaluated within a new modular framework. Two signatures presented in this work are directly derived from histograms gathered from video frames. The other three signatures are based on temporal information by computing difference histograms between adjacent frames. In order to obtain objective and reproducible results, the evaluations are conducted based on several randomly assembled test sets. These test sets are extracted from a video repository that contains a wide range of broadcast content including documentaries, sports, news, movies, etc. Overall, the experimental results show the adequacy of color-histogram-based signatures for video fingerprinting applications and indicate which type of signature should be preferred in the presence of certain distortions.
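The two signature families compared above can be sketched concretely: a per-frame color histogram, and a temporal signature built from difference histograms of adjacent frames. In this illustrative sketch, frames are simplified to flat lists of quantized color bin indices; real frames would be full images with finer quantization.

```python
BINS = 4

def histogram(frame):
    # Normalized color histogram over BINS quantized color values.
    h = [0] * BINS
    for px in frame:
        h[px] += 1
    total = len(frame)
    return [c / total for c in h]

def difference_signature(frames):
    # Temporal signature: L1 distance between adjacent frame
    # histograms, one value per consecutive frame pair.
    hists = [histogram(f) for f in frames]
    return [sum(abs(a - b) for a, b in zip(h1, h2))
            for h1, h2 in zip(hists, hists[1:])]

frames = [
    [0, 0, 1, 1, 2, 2, 3, 3],   # balanced frame
    [0, 0, 1, 1, 2, 2, 3, 3],   # identical frame -> zero difference
    [0, 0, 0, 0, 0, 0, 3, 3],   # shot cut: composition shifts strongly
]
sig = difference_signature(frames)
print([round(v, 3) for v in sig])
```

Difference histograms are largely invariant to global color shifts (both frames shift together), which is one reason temporal signatures can be more robust than per-frame histograms under certain distortions.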