
International Science Index

Commenced in January 1999. Frequency: Monthly. Edition: International. Paper Count: 63

Computer, Electrical, Automation, Control and Information Engineering

  • 63
    456
    Investigate the Relation between the Correctness and the Number of Versions of Fault Tolerant Software System
    Abstract:
    In this paper, we generalize several techniques for developing fault-tolerant software. We introduce the property "correctness" for evaluating N-version systems and compare it to commonly used properties such as reliability and availability. We also derive the relation between this property and the number of versions in the system. Our experiments verifying the correctness and applicability of this relation are also presented.
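
    As a toy illustration of the N-version idea (our sketch, not the paper's model; the majority-vote rule and the test-set-based correctness estimate are assumptions), consider:

from collections import Counter

def n_version_run(versions, x):
    """Run N independently developed versions on the same input and vote."""
    results = [v(x) for v in versions]
    winner, votes = Counter(results).most_common(1)[0]
    # Accept the voted output only if a strict majority of versions agree.
    return winner if votes > len(versions) // 2 else None

def correctness(versions, test_cases, oracle):
    """Fraction of test cases on which the voted output matches the oracle."""
    ok = sum(n_version_run(versions, x) == oracle(x) for x in test_cases)
    return ok / len(test_cases)

# Three toy "versions" of an absolute-value routine; the third is faulty for x < 0.
versions = [abs, lambda x: -x if x < 0 else x, lambda x: x]
print(correctness(versions, range(-5, 6), abs))   # 1.0: the fault is masked by voting
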
    62
    770
    Image Search by Features of Sorted Gray level Histogram Polynomial Curve
    Abstract:

    Image searching has always been a problem, especially when the images are not properly managed or are distributed over different locations. Different techniques are currently used for image search. At one extreme, many features of the image are captured and stored to get better results, but storing and managing such features is itself a time-consuming job. At the other extreme, if fewer features are stored, the accuracy rate is not satisfactory. The same image stored with different visual properties can further reduce accuracy. In this paper we present a new concept: using polynomials fitted to the sorted histogram of the image. This approach needs less overhead and can cope with differences in the visual features of an image.
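
    A minimal sketch of the idea as we read it (the polynomial degree and the distance measure are our assumptions, not values from the paper): sort the gray-level histogram, fit a low-degree polynomial, and use its coefficients as the compact image signature.

import numpy as np

def histogram_poly_feature(gray_image, degree=5):
    """Polynomial coefficients fitted to the sorted gray-level histogram."""
    hist, _ = np.histogram(gray_image, bins=256, range=(0, 256), density=True)
    sorted_hist = np.sort(hist)          # sorting discards bin positions, so only
    x = np.linspace(0.0, 1.0, 256)       # the shape of the curve is kept
    return np.polyfit(x, sorted_hist, degree)

def distance(img_a, img_b):
    """Smaller distance between coefficient vectors = more similar images."""
    return np.linalg.norm(histogram_poly_feature(img_a) - histogram_poly_feature(img_b))
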

    61
    1147
    Automatic Rearrangement of Localized Graphical User Interface
    Abstract:
    The localization of software products is essential for reaching the users of the international market. An important task for this is the translation of the user interface into local national languages. As graphical interfaces are usually optimized for the size of the texts in the original language, after translation certain user controls (e.g. text labels and buttons in dialogs) may grow in such a manner that they overlap one another. This not only causes an unpleasant appearance but also makes the use of the program more difficult (or even impossible), which implies that the arrangement of the controls must be corrected afterwards. The correction should preserve the original structure of the interface (e.g. the relation of logically coherent controls); furthermore, it is important to keep the nicely proportioned design: the formation of large empty areas should be avoided. This paper describes an algorithm that automatically rearranges the controls of a graphical user interface based on the principles above. The algorithm has been implemented and integrated into a translation support system and produced results pleasing to the human eye in most test cases.
    60
    1196
    Analysis of Sonographic Images of Breast
    Abstract:
    Ultrasound images are a very useful diagnostic tool for distinguishing benign from malignant masses of the breast. However, there is considerable overlap between benign and malignant cases in ultrasonic images, which makes them difficult to interpret. In this paper, a new noise removal algorithm is used to improve the images and the classification process. The masses are classified using wavelet transform coefficients together with morphological and textural features as a novel feature set for this goal. Bayesian estimation theory is then used to classify the tissues into three classes according to their features.
    59
    1353
    Towards a Suitable and Systematic Approach for Component Based Software Development
    Abstract:
    The software crisis refers to the situation in which developers are not able to complete projects within time and budget constraints, and moreover these over-schedule and over-budget projects are of low quality as well. Several methodologies have been adopted from time to time to overcome this situation, and the current focus is component-based software engineering, in which the emphasis is on the reuse of already existing software artifacts. But the results cannot be achieved just by preaching the principles; they need to be practiced as well. This paper highlights some of the very basic elements of this approach which have to be in place to reach the desired goals of high-quality, low-cost software products with shorter time-to-market.
    58
    1513
    Selective Minterms Based Tabular Method for BDD Manipulations
    Abstract:

    The goal of this work is to describe a new algorithm for finding the optimal variable order, the number of nodes for any order, and other ROBDD parameters, based on a tabular method. The tabular method makes use of a pre-built backend database table that stores the ROBDD size for selected combinations of minterms. The user applies the backend table and the proposed algorithm to find the necessary ROBDD parameters, such as the best variable order, number of nodes, etc. Experimental results on benchmarks are given for this technique.

    57
    1620
    A Comparison of Software Analysis and Design Methods for Real Time Systems
    Abstract:
    This paper examines and compares several of the most common real-time analysis and design methods: CORE, YSM, MASCOT, JSD, DARTS, RTSAD, ADARTS, CODARTS, HOOD, HRT-HOOD, ROOM, UML, and UML-RT. The methods are compared using attributes such as (i) usability, (ii) compositionality, and (iii) availability of proper real-time notations. Finally, some comparison results are given and discussed.
    56
    2095
    A Proof for Bisection Width of Grids
    Abstract:
    The optimal bisection width of the r-dimensional $N \times \cdots \times N$ grid is known to be $N^{r-1}$ when $N$ is even, but when $N$ is odd, only approximate values are available. This paper shows that the exact bisection width of the grid is $\frac{N^r - 1}{N - 1}$ when $N$ is odd.
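
    As an illustrative check of the odd case (our example, not taken from the paper), take $r = 2$ and $N = 3$:

\[
    \frac{N^{r}-1}{N-1} \;=\; \frac{3^{2}-1}{3-1} \;=\; 4,
\]

    which matches the cut that separates a $2 \times 2$ corner block (4 vertices) from the remaining 5 vertices of the $3 \times 3$ grid: exactly 4 grid edges cross it.
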
    55
    2201
    Parallel Computation of Data Summation for Multiple Problem Spaces on Partitioned Optical Passive Stars Network
    Abstract:
    In a Partitioned Optical Passive Stars (POPS) network, nodes and couplers become free from slot to slot during some computations, and it is necessary to utilize these free couplers and nodes efficiently to be cost effective. Improving parallelism, we present a fast data summation algorithm for multiple problem spaces on POPS(g, g) with a smaller number of nodes for the case d = n = g. For the case d > n > g, we simulate the computation of a large number of data items, dedicated to a larger system with many nodes, on a smaller system with fewer nodes. The algorithm is faster than the best known algorithm, and using a smaller number of nodes and groups makes the system low cost and practical.
    54
    3484
    WAF: an Interface Web Agent Framework
    Abstract:
    A trend in the agent community and in enterprises is the shift from closed to open architectures composed of a large number of autonomous agents. One implication is that interface agent frameworks are becoming more important in multi-agent systems (MAS), so that systems constructed for different application domains can share a common understanding of human-computer interface (HCI) methods as well as human-agent and agent-agent interfaces. However, interface agent frameworks usually receive less attention than other aspects of MAS. In this paper, we propose an interface web agent framework which is based on our former project, called WAF, and a distributed HCI template. A group of new functionalities and their implications are discussed, such as web agent presentation, off-line agent reference, and a reconfigurable activation map of agents. Their enabling techniques and current standards (e.g. existing ontological frameworks) are also suggested and illustrated by examples from our own implementation in WAF.
    53
    3880
    The CEO Mission II, Rescue Robot with Multi-Joint Mechanical Arm
    Abstract:
    This paper presents the design features of a rescue robot named CEO Mission II. Its body is a tracked-wheel design with double front flippers for climbing over collapsed structures and rough terrain. With a 125 cm long, 5-joint mechanical arm installed on the robot body, it is deployed not only for surveillance from the top view but also for easier and faster access to victims to obtain their vital signs. Two cameras and sensors for detecting vital signs are set up at the tip of the multi-joint mechanical arm, and a third camera at the back of the robot is used for driving control. The hardware and software of the system, which controls and monitors the rescue robot, are explained. The control system is used for controlling the robot locomotion and the 5-joint mechanical arm, and for turning devices on and off. The monitoring system gathers all information from 7 distance sensors, IR temperature sensors, 3 CCD cameras, a voice sensor, robot wheel encoders, yaw/pitch/roll angle sensors, a laser range finder, and 8 spare A/D inputs. All sensor and control data are communicated with a remote control station via IEEE 802.11b Wi-Fi. The audio and video data are compressed and sent via another IEEE 802.11g Wi-Fi transmitter for real-time response. At the remote control station, the robot locomotion and the mechanical arm are controlled by joystick. Moreover, a user-friendly GUI control program, based on clicking and dragging, was developed to easily control the movement of the arm. The robot's traveling map is plotted by combining the wheel encoder information with the yaw/pitch data, and a 2D obstacle map is plotted from the laser range finder data. The concept and design of this robot can be adapted to suit many other applications. The robot received the Best Technique award at the Thailand Rescue Robot Championship 2006, and all testing results were satisfactory.
    52
    4036
    A Method of Protecting Relational Databases Copyright with Cloud Watermark
    Abstract:
    With the development of the Internet and database application techniques, it has become common for many databases on the Internet to permit remote query and access by authorized users, and the problem of how to protect the copyright of relational databases arises. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Then, combining the properties of the cloud with the idea of digital watermarking and the properties of relational databases, a method of protecting the copyright of relational databases with a cloud watermark is proposed. The corresponding watermark algorithms, namely the cloud watermark embedding algorithm and the detection algorithm, are also proposed. Experiments are then run and the results analyzed to validate the correctness and feasibility of the watermark scheme. Finally, the prospects and research directions of relational database watermarking are discussed.
    51
    4237
    Grouping and Indexing Color Features for Efficient Image Retrieval
    Abstract:

    Content-based Image Retrieval (CBIR) aims at searching image databases for specific images that are similar to a given query image, based on matching features derived from the image content. This paper focuses on a low-dimensional color-based indexing technique for achieving efficient and effective retrieval performance. In our approach, the color features are extracted using the mean shift algorithm, a robust clustering technique, and the cluster (region) mode is used as the representative of the image in 3-D color space. The feature descriptor consists of the representative color of a region and is indexed using a spatial indexing method based on the R*-tree, thus avoiding the high-dimensional indexing problems associated with the traditional color histogram. Alternatively, the images in the database are clustered based on region feature similarity using Euclidean distance, and only the representative (centroid) features of these clusters are indexed using the R*-tree, thus improving efficiency. For similarity retrieval, each representative color in the query image or region is used independently to find regions containing that color. The results of these methods are compared. A Java-based query engine supporting query-by-example is built to retrieve images by color.
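
    A hedged sketch of the feature-extraction step only, in Python (the bandwidth value is an assumption, and the paper's R*-tree indexing and query engine are not reproduced here):

import numpy as np
from sklearn.cluster import MeanShift

def representative_colors(image_rgb, bandwidth=16.0):
    """Cluster the pixels in 3-D color space with mean shift and return the
    cluster modes, used as the image's region descriptors.  In the paper these
    low-dimensional descriptors would then be indexed in an R*-tree."""
    pixels = image_rgb.reshape(-1, 3).astype(float)
    ms = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit(pixels)
    return ms.cluster_centers_           # one representative color per region
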

    50
    4305
    Classifier Based Text Mining for Neural Network
    Abstract:
    Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. In neural networks that address classification problems, the training set, the testing set, and the learning rate are key elements: the collections of input/output patterns used to train the network and to assess its performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural network classifier that performs cross-validation on the original network in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated on five data sets: contact-lenses, cpu, weather-symbolic, weather, and labor-neg-data. It is shown that, compared to the existing neural network, training is more than 10 times faster when the data set is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all data sets except contact-lenses, the only one with missing attributes. For contact-lenses, the accuracy of the proposed network was on average around 0.3% lower than that of the original network. The algorithm is independent of any specific data set, so many of its ideas and solutions can be transferred to other classifier paradigms.
    49
    4406
    The Control Vector Scheme for Design of Planar Primitive PH curves
    Abstract:
    A PH (Pythagorean hodograph) curve can be constructed from given parameters, but the shape of the curve is not easy to imagine from the values of those parameters. In contrast, a Bézier curve is constructed from its control polygon, and from the control polygon we can picture the shape of the curve. In this paper, we use the hodograph of a Bézier curve to construct a PH curve by selecting some of the control vectors and producing the remaining control vectors so that the PH property holds.
    48
    4409
    Heuristic Continuous-time Associative Memories
    Abstract:
    In this paper, a novel associative memory model is proposed and applied to memory retrieval, based on the conventional continuous-time model. In the conventional model, the memory capacity is very low and the retrieval process easily converges to an equilibrium state that is very different from the stored patterns. Genetic algorithms are well known for their capability of global search, escaping local optima on the way to a global optimum. Based on this idea, this work proposes a heuristic rule that applies a mutation when the state of the network is trapped in a spurious memory. The proposed heuristic associative memory shows that the storage capacity does not depend on the number of stored patterns and that the retrieval ability reaches approximately 1.
    47
    4424
    Learning Process Enhancement for Robot Behaviors
    Abstract:
    Designing a simulated system and training it to optimize its tasks in a simulated environment helps designers avoid problems that may appear when designing the system directly in the real world: long development time, high cost, a high error percentage, and low efficiency and accuracy of the system. The proposed system investigates and improves the efficiency and accuracy of a simulated robot in choosing the correct behavior to perform its task. In this paper, machine learning based on a genetic algorithm is adopted. This type of machine learning is called genetic-based machine learning, in which a distributed classifier system is used to improve the efficiency and accuracy of the robot. Consequently, it helps the robot to achieve the optimal action.
    46
    4459
    Enhanced Shell Sorting Algorithm
    Abstract:
    Many algorithms are available for sorting unordered elements; among the most important are bubble sort, heap sort, insertion sort, and Shell sort, each with its own pros and cons. Shell sort, an enhanced version of insertion sort, reduces the number of swaps of the elements being sorted to minimize complexity and time compared to insertion sort: it improves the efficiency of insertion sort by quickly shifting values toward their destination. The average sort time is O(n^1.25), while the worst-case time is O(n^1.5). The algorithm performs a series of passes; in each pass it swaps some elements of the array in such a way that by the last pass, when the value of h is one, the number of swaps is reduced. Donald L. Shell proposed a formula to calculate the value of 'h'. This work identifies an improvement to the conventional Shell sort: the 'Enhanced Shell Sort algorithm' improves the way the value of 'h' is calculated. It has been observed that applying this algorithm can reduce the number of swaps by up to 60 percent compared to the existing algorithm, and in some other cases the enhancement was found to be faster than the existing algorithms.
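
    For reference, a plain Python rendering of Shell sort with a pluggable gap sequence; the enhanced formula for 'h' is not given in the abstract, so Shell's original halving rule is used below as a stand-in:

def shell_gaps(n):
    """Shell's original gap sequence: n//2, n//4, ..., 1."""
    h = n // 2
    while h > 0:
        yield h
        h //= 2

def shell_sort(items, gaps=None):
    """Shell sort: a gapped insertion sort for each value of h in the sequence."""
    a = list(items)
    n = len(a)
    for h in (gaps if gaps is not None else shell_gaps(n)):
        for i in range(h, n):
            key, j = a[i], i
            while j >= h and a[j - h] > key:   # shift larger elements right by h
                a[j] = a[j - h]
                j -= h
            a[j] = key
    return a

print(shell_sort([5, 2, 9, 1, 7, 3]))          # [1, 2, 3, 5, 7, 9]
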
    45
    4766
    Object-Based Image Indexing and Retrieval in DCT Domain using Clustering Techniques
    Abstract:

    In this paper, we present a new and effective image indexing technique that extracts features directly in the DCT domain. Our proposed approach is object-based image indexing. For each 8x8 block in the DCT domain, a feature vector is extracted; the feature vectors of all blocks of the image are then clustered into groups using a k-means algorithm, so that each cluster represents a particular object in the image. We then select the clusters with the largest membership, and the centroids of the selected clusters are taken as the image feature vectors and indexed into the database. We also propose an approach for using the proposed image indexing method in automatic image classification. Experimental results on a database of 800 images from 8 semantic groups are reported for automatic image classification.
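
    A simplified sketch of the pipeline described above (the block size follows the abstract; the number of clusters and of retained centroids are assumptions):

import numpy as np
from scipy.fftpack import dct
from sklearn.cluster import KMeans

def block_dct_features(gray, block=8):
    """2-D DCT of each 8x8 block; the flattened coefficients form the block's feature vector."""
    h, w = gray.shape
    feats = []
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            b = gray[y:y + block, x:x + block].astype(float)
            c = dct(dct(b, axis=0, norm='ortho'), axis=1, norm='ortho')
            feats.append(c.ravel())
    return np.array(feats)

def index_vectors(gray, n_clusters=4, keep=2):
    """Cluster block features with k-means and keep the centroids of the largest clusters."""
    feats = block_dct_features(gray)
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(feats)
    sizes = np.bincount(km.labels_, minlength=n_clusters)
    largest = np.argsort(sizes)[::-1][:keep]
    return km.cluster_centers_[largest]
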

    44
    4769
    Game Skill Measure for Mixed Games
    Abstract:
    Games can be classified as games of skill, games of chance, or otherwise be classified as mixed. This paper deals with scientifically classifying mixed games as more reliant on elements of chance or elements of skill, and with ways to scientifically measure the amount of skill involved. This is predominantly useful for classifying games as legal or illegal in different jurisdictions based on the local gaming laws. We propose a novel measure of the skill-to-chance ratio called the Game Skill Measure (GSM) and utilize it to calculate the skill component of a popular variant of Poker.
    43
    4849
    Performance Prediction of Multi-Agent Based Simulation Applications on the Grid
    Abstract:
    A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands understanding its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload explains the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline for estimating the performance characteristics of real-world simulation applications.
    42
    5635
    Remote-Sensing Sunspot Images to Obtain the Sunspot Roads
    Abstract:
    A combination of image fusion and quad-tree decomposition is used to detect the sunspot trajectories in each month and to compute the latitudes of these trajectories in each solar hemisphere. Daily solar images taken with the SOHO satellite are fused for each month, and the fused image is decomposed with the quad-tree decomposition method in order to classify the sunspot trajectories and obtain precise information about their latitudes. From the fusion we also deduce some remarkable physical conclusions about the behavior of the Sun's magnetic fields. Using quad-tree decomposition, we obtain information about the regions on the Sun's surface and the angles in space through which tremendous flares and hot plasma gases permeate interplanetary space and strike satellites and human technical systems. Here, sunspot images from June, July, and August 2001 are used for the study, and a method is given to compute the latitude of the sunspot trajectories in each month from the sunspot images.
    41
    5728
    Image Similarity: A Genetic Algorithm Based Approach
    Abstract:
    The paper proposes an approach using a genetic algorithm for computing region-based image similarity. An image is represented by a set of segmented regions reflecting its color and texture properties and is associated with a family of image features corresponding to those regions. The resemblance of two images is then defined as the overall similarity between two families of image features and quantified by a similarity measure that integrates the properties of all the regions in the images. A genetic algorithm is applied to decide the most plausible matching. The performance of the proposed method is illustrated using examples from an image database of general-purpose images and is shown to produce good results.
    40
    5745
    A Novel Arabic Text Steganography Method Using Letter Points and Extensions
    Abstract:
    This paper presents a new steganography approach suitable for Arabic texts, which can be classified among steganography feature coding methods. The approach hides secret information bits within the letters, benefiting from their inherent points. To mark the specific letters holding secret bits, the scheme considers two features: the existence of points in the letters and the redundant Arabic extension character. We use pointed letters with an extension to hold the secret bit 'one' and un-pointed letters with an extension to hold 'zero'. This steganography technique is also attractive for other languages whose scripts are similar to Arabic, such as Persian and Urdu.
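
    A toy sketch of the embedding rule (the sets of pointed and un-pointed letters are left as caller-supplied parameters, and the cover text is assumed to contain no pre-existing kashidas; this illustrates the idea rather than the paper's implementation):

TATWEEL = "\u0640"   # the Arabic extension (kashida) character

def embed_bits(cover, bits, pointed, unpointed):
    """Hide bits by appending a kashida after selected letters:
    a '1' rides on a pointed letter, a '0' on an un-pointed one."""
    out, it = [], iter(bits)
    bit = next(it, None)
    for ch in cover:
        out.append(ch)
        if bit == "1" and ch in pointed:
            out.append(TATWEEL); bit = next(it, None)
        elif bit == "0" and ch in unpointed:
            out.append(TATWEEL); bit = next(it, None)
    return "".join(out)

def extract_bits(stego, pointed):
    """Each kashida reveals one bit, read from the letter just before it."""
    return "".join("1" if prev in pointed else "0"
                   for prev, ch in zip(stego, stego[1:]) if ch == TATWEEL)
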
    39
    6551
    Analysis of Message Authentication in Turbo Coded Halftoned Images using Exit Charts
    Abstract:
    Considering payload, reliability, security, and operational lifetime as major constraints in the transmission of images, we put forward in this paper a steganographic technique implemented at the physical layer. We suggest transmitting halftoned images (payload constraint) in wireless sensor networks to reduce the amount of transmitted data. For low-power and interference-limited applications, turbo codes provide suitable reliability. Ensuring security is one of the highest priorities in many sensor networks, and the turbo code structure, apart from providing forward error correction, can be utilized to provide encryption. We first consider the halftoned image and then present the method of embedding a block of data (called the secret) in this halftoned image during the turbo encoding process. The small modifications required at the turbo decoder end to extract the embedded data are presented next. The implementation complexity and the degradation of the BER (bit error rate) in the turbo-based stego system are analyzed. Using some entropy-based cryptanalytic techniques, we show that the strength of our turbo-based stego system approaches that of the one-time pad (OTP).
    38
    6762
    Performance Evaluation of a Neural Network based General Purpose Space Vector Modulator
    Abstract:
    Space Vector Modulation (SVM) is an optimal Pulse Width Modulation (PWM) technique for inverters used in variable frequency drive applications. It is computationally rigorous and hence limits the inverter switching frequency. An increase in switching frequency can be achieved using Neural Network (NN) based SVM implemented on application-specific chips. This paper proposes a neural network based SVM technique for a Voltage Source Inverter (VSI). The proposed network is independent of the switching frequency. Different architectures are investigated keeping the total number of neurons constant, and the performance of the inverter is compared for various switching frequencies and architectures of NN-based SVM. From the results obtained, the network with minimum resources and an appropriate word length is identified, along with the bit precision required for this application. The network with 8-bit precision is implemented in the IC XCV 400 and the results are presented. The performance of NN-based general purpose SVM with higher bit precision is also discussed.
    37
    6843
    A Reconfigurable Processing Element Implementation for Matrix Inversion Using Cholesky Decomposition
    Abstract:
    Fixed-point simulation results are used to measure the performance of matrix inversion with a reconfigurable processing element. Matrices are inverted using the Cholesky decomposition algorithm, and the reconfigurable processing element is capable of all the required mathematical operations. The fixed-point word length analysis is based on simulations with different condition numbers and different matrix sizes.
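
    The underlying linear algebra, sketched in floating-point NumPy/SciPy for clarity (the paper's fixed-point, reconfigurable hardware implementation is not reproduced here):

import numpy as np
from scipy.linalg import cholesky, solve_triangular

def cholesky_inverse(A):
    """Invert a symmetric positive-definite matrix via A = L L^T:
    forward-substitute L Y = I, then back-substitute L^T X = Y."""
    L = cholesky(np.asarray(A, dtype=float), lower=True)
    Y = solve_triangular(L, np.eye(L.shape[0]), lower=True)
    return solve_triangular(L.T, Y, lower=False)

A = np.array([[4.0, 2.0], [2.0, 3.0]])                    # SPD test matrix
print(np.allclose(cholesky_inverse(A) @ A, np.eye(2)))    # True
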
    36
    7357
    Deniable Authentication Protocol Resisting Man-in-the-Middle Attack
    Abstract:
    Deniable authentication is a new protocol which not only enables a receiver to identify the source of a received message but also prevents a third party from identifying the source of the message. The proposed protocol in this paper makes use of bilinear pairings over elliptic curves, as well as the Diffie-Hellman key exchange protocol. Besides the security properties shared with previous authentication protocols, the proposed protocol provides the same level of security with smaller public key sizes.
    35
    7580
    A Modified AES Based Algorithm for Image Encryption
    Abstract:
    With the fast evolution of digital data exchange, information security becomes increasingly important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. The implementation of both techniques has been realized for experimental purposes. Detailed results in terms of security analysis and implementation are given, and a comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
    34
    7838
    An Interactive Ontology Visualization Approach for the Networked Home Environment
    Abstract:
    Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments as they carry the meaning. However, ontologies and the OWL language are complex, and several ontology visualization approaches have been developed to enhance the understanding of ontologies. The domain of networked home environments sets some special requirements for the ontology visualization approach. The visualization tool presented here visualizes ontologies in a domain-specific way: it effectively represents the physical structures and spatial relationships of networked home environments, and it provides extensive interaction possibilities for editing and manipulating the visualization. The tool shortens the gap from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations and making OWL ontologies more interesting, concrete, and above all easier to comprehend.
    33
    7927
    Fingerprint Compression Using Contourlet Transform and Multistage Vector Quantization
    Abstract:
    This paper presents a new fingerprint coding technique based on the contourlet transform and multistage vector quantization. Wavelets have shown their ability to represent natural images that contain smooth areas separated by edges; however, wavelets cannot efficiently take advantage of the fact that the edges usually found in fingerprints are smooth curves. This issue is addressed by directional transforms, known as contourlets, which have the property of preserving edges. The contourlet transform is a new extension of the wavelet transform in two dimensions using nonseparable and directional filter banks. The computation and storage requirements are the major difficulty in implementing a vector quantizer: in the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used in quantizing each frame of spectral information, whereas the storage requirement of multistage vector quantization is lower than that of full-search vector quantization. The coefficients of the contourlet transform are quantized by multistage vector quantization, and the quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with existing wavelet-based ones.
    32
    8158
    Optimal Path Planner for Autonomous Vehicles
    Abstract:
    In this paper, a real-time trajectory generation algorithm for computing 2-D optimal paths for autonomous aerial vehicles is discussed. A dynamic programming approach is adopted to compute the k best paths by minimizing a cost function, and collision detection is implemented to detect intersections of the paths with obstacles. Our contribution is a novel approach to the problem of trajectory generation that is computationally efficient and offers considerable gains over existing techniques.
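
    A toy stand-in for such a planner on a 4-connected grid (uniform edge costs and this particular k-best bookkeeping are our assumptions, not the paper's algorithm):

import heapq

def k_best_paths(grid, start, goal, k=3):
    """Return up to k lowest-cost paths on a grid where 1 marks an obstacle.
    Uniform-cost search that allows each cell to be expanded up to k times."""
    rows, cols = len(grid), len(grid[0])
    found, expansions = [], {}
    queue = [(0, start, [start])]
    while queue and len(found) < k:
        cost, (r, c), path = heapq.heappop(queue)
        expansions[(r, c)] = expansions.get((r, c), 0) + 1
        if expansions[(r, c)] > k:
            continue
        if (r, c) == goal:
            found.append((cost, path))
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(queue, (cost + 1, (nr, nc), path + [(nr, nc)]))
    return found

grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
for cost, path in k_best_paths(grid, (0, 0), (2, 2), k=2):
    print(cost, path)
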
    31
    8250
    Tree Based Decomposition of Sunspot Images
    Abstract:
    Solar sunspot rotation and latitudinal bands are studied using intelligent computation methods. A combination of image fusion and quad-tree decomposition is used to obtain quantitative values for the latitudes of the trajectories on the Sun's surface around which sunspots rotate. Daily solar images taken with the Solar and Heliospheric Observatory (SOHO) satellite are fused for each month separately, and the fused image is decomposed with the quad-tree decomposition method in order to achieve precise information about the latitudes of the sunspot trajectories. Such analysis is useful for gathering information about the regions on the Sun's surface, and the coordinates in space, that are more exposed to solar geomagnetic storms, tremendous flares, and hot plasma gases permeating interplanetary space, and it helps humans protect their technical systems. Here, sunspot images from September, October, and November 2001 are used to study the magnetic behavior of the Sun.
    30
    8758
    A Set Theory Based Factoring Technique and Its Use for Low Power Logic Design
    Abstract:

    Factoring Boolean functions is one of the basic operations in algorithmic logic synthesis. A novel algebraic factorization heuristic for single-output combinational logic functions is presented in this paper, developed on the basis of the set theory paradigm. The impact of factoring is analyzed mainly from a low power design perspective for standard cell based digital designs. The physical implementations of a number of MCNC/IWLS combinational benchmark functions and sub-functions are compared before and after factoring, based on a simple technology mapping procedure utilizing only standard gate primitives (readily available as standard cells in a technology library) and not cells corresponding to optimized complex logic. The power results were obtained at the gate level by means of an industry-standard power analysis tool from Synopsys, targeting a 130nm (0.13μm) UMC CMOS library, for the typical case. The wire loads were inserted automatically and the simulations were performed with maximum input activity. The gate-level simulations demonstrate the advantage of the proposed factoring technique in comparison with other existing methods from a low power perspective, for arbitrary examples. Though the benchmark experiments report mixed results, the mean savings in total power and dynamic power for the factored solution over a non-factored solution were 6.11% and 5.85% respectively. In terms of leakage power, the average savings for the factored forms were significant, to the tune of 23.48%. The factored solution is also expected to better its non-factored counterpart in terms of the power-delay product, as it is well known that factoring, in general, yields a delay-efficient multi-level solution.

    29
    8810
    Architectural, Technological and Performance Issues in Enterprise Applications
    Abstract:

    Enterprise applications are complex systems that are hard to develop and deploy in organizations. Although software development tools, frameworks, methodologies, and patterns are evolving rapidly, many projects fail, incurring large costs. There are challenging issues that programmers and designers face while working on enterprise applications. In this paper, we present three of the most significant ones: architectural, technological, and performance issues. The important subjects in each area are pointed out and recommendations are given. Under architectural issues the lifecycle, meta-architecture, and guidelines are discussed; the .NET and Java EE platforms are presented under technological issues; and the importance of performance, performance measurement, and profilers are explained under performance issues.

    28
    9015
    A Blind Digital Watermark in Hadamard Domain
    Abstract:

    A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4x4 non-overlapping blocks. For each block, the two first AC coefficients of its Hadamard transform are estimated using the DC coefficients of its neighboring blocks, and a gray-level watermark is added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by estimating the AC coefficients and comparing them with their actual values. Several experiments were carried out, and the results suggest the robustness of the proposed algorithm.
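
    A sketch of the transform side of the scheme (the neighbour-based AC estimator below is a simple assumed predictor; the abstract does not specify the actual one used in the paper):

import numpy as np
from scipy.linalg import hadamard

H4 = hadamard(4)                       # 4x4 Hadamard matrix with +1/-1 entries

def block_hadamard(block):
    """2-D Hadamard transform of a 4x4 block; [0, 0] is the DC coefficient,
    [0, 1] and [1, 0] are the two first AC coefficients used by the scheme."""
    return H4 @ np.asarray(block, dtype=float) @ H4 / 4.0

def estimate_first_ac(dc_left, dc_right, dc_up, dc_down):
    """Assumed toy predictor: estimate a block's first AC coefficients from
    the DC coefficients of its four neighbouring blocks (finite differences)."""
    return (dc_left - dc_right) / 2.0, (dc_up - dc_down) / 2.0
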

    27
    9362
    Distributed 2-Vertex Connectivity Test of Graphs Using Local Knowledge
    Abstract:

    The vertex connectivity of a graph is the smallest number of vertices whose deletion separates the graph or makes it trivial. This work is devoted to the problem of testing the vertex connectivity of graphs in a distributed environment, based on a general and constructive approach. The contribution of this paper is threefold. First, using a pre-constructed spanning tree of the considered graph, we present a protocol to test whether a given graph is 2-connected using only local knowledge. Second, we present an encoding of this protocol using graph relabeling systems. The last contribution is the implementation of this protocol in the message passing model. For a given graph G, where M is the number of its edges, N the number of its nodes, and Δ its degree, our algorithms have the following requirements: the first one uses O(Δ×N^2) steps and O(Δ×logΔ) bits per node; the second one uses O(Δ×N^2) messages, O(N^2) time, and O(Δ×logΔ) bits per node. Furthermore, the studied network is semi-anonymous: only the root of the pre-constructed spanning tree needs to be identified.

    26
    9395
    Cross-Search Technique and its Visualization of Peer-to-Peer Distributed Clinical Documents
    Abstract:

    One of the ubiquitous routines in medical practice is searching through voluminous piles of clinical documents. In this paper we introduce a distributed system to search and exchange clinical documents. Clinical documents are distributed peer-to-peer. Relevant information is found in multiple iterations of cross-searches between the clinical text and its domain encyclopedia.

    25
    10312
    Decoupled Scheduling in Meta Environment
    Abstract:
    Grid scheduling is the process of mapping grid jobs to resources over multiple administrative domains. Traditionally, application-level schedulers have been tightly integrated with the application itself and could not easily be applied to other applications. The design presented here is generic: it decouples the scheduler core (the search procedure) from the application-specific components (e.g. application performance models) and the platform-specific components (e.g. collection of resource information) used by the search procedure. In this decoupled approach, the application details are not revealed completely to the broker; instead, the customer gives the application to the resource provider for execution. Apart from scheduling, resource selection can be performed independently in this decoupled approach in order to achieve scalability.
    24
    10319
    Towards an Effective Reputation Assessment Process in Peer-to-Peer Systems
    Abstract:

    The need for reputation assessment is particularly strong in peer-to-peer (P2P) systems because the peers' personal site autonomy is amplified by the inherent technological decentralization of the environment. However, decentralization makes the problem of designing a peer-to-peer based reputation assessment substantially harder in P2P networks than in centralized settings. Existing reputation systems tackle the reputation assessment process in an ad hoc manner, and there is no systematic and coherent way to derive measures and analyze the current reputation systems. In this paper, we propose a reputation assessment process and use it to classify the existing reputation systems. Simulation experiments are conducted, focusing on the different methods of selecting the recommendation sources and retrieving the recommendations; these two phases can contribute significantly to the overall performance due to communication cost and coverage.

    23
    10789
    BDD Package Based on Boolean NOR Operation
    Abstract:
    Binary Decision Diagrams (BDDs) are useful data structures for symbolic Boolean manipulations. BDDs are used in many VLSI/CAD tasks, such as equivalence checking, property checking, logic synthesis, and false path analysis. In this paper we describe a new approach to the realization of a BDD package. To perform manipulations of Boolean functions, the proposed approach does not depend on the recursive If-Then-Else (ITE) synthesis operation; instead, the basic synthesis algorithm is carried out using the Boolean NOR operation.
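
    A minimal sketch of the idea (hash-consed nodes, Shannon expansion, NOR as the single synthesis operation); it illustrates the principle rather than the package described in the paper:

class NorBDD:
    ZERO, ONE = ("leaf", 0), ("leaf", 1)

    def __init__(self, n_vars):
        self.n = n_vars
        self.unique = {}                        # hash-consing of nodes

    def node(self, var, lo, hi):
        if lo == hi:                            # redundant test -> reduce
            return lo
        return self.unique.setdefault((var, lo, hi), (var, lo, hi))

    def var(self, i):
        return self.node(i, self.ZERO, self.ONE)

    def _top(self, f):
        return self.n if f[0] == "leaf" else f[0]

    def _cofactor(self, f, v, value):
        if f[0] == "leaf" or f[0] != v:
            return f
        return f[2] if value else f[1]

    def nor(self, f, g, memo=None):
        """The single synthesis operation; NOT, OR, AND are all derived from it."""
        memo = {} if memo is None else memo
        if (f, g) in memo:
            return memo[(f, g)]
        if f == self.ONE or g == self.ONE:      # NOR(1, x) = 0
            r = self.ZERO
        elif f == self.ZERO and g == self.ZERO: # NOR(0, 0) = 1
            r = self.ONE
        else:                                   # Shannon expansion on the top variable
            v = min(self._top(f), self._top(g))
            lo = self.nor(self._cofactor(f, v, 0), self._cofactor(g, v, 0), memo)
            hi = self.nor(self._cofactor(f, v, 1), self._cofactor(g, v, 1), memo)
            r = self.node(v, lo, hi)
        memo[(f, g)] = r
        return r

bdd = NorBDD(2)
a, b = bdd.var(0), bdd.var(1)
not_a = bdd.nor(a, a)                           # NOT via NOR
a_or_b = bdd.nor(bdd.nor(a, b), bdd.nor(a, b))  # OR via double NOR
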
    22
    11126
    Fuzzy Trust for Peer-to-Peer Based Systems
    Abstract:

    Trust management is one of the open problems in Peer-to-Peer (P2P) systems: the lack of centralized control makes it difficult to control the behavior of the peers. A reputation system is one approach to providing trust assessment in a P2P system. In this paper, we use fuzzy logic to model trust in a P2P environment. Our trust model combines first-hand (direct experience) and second-hand (reputation) information to allow peers to represent and reason with uncertainty regarding other peers' trustworthiness. Fuzzy logic can help in handling the imprecise nature and uncertainty of trust, and linguistic labels are used to enable peers to assign a trust level intuitively. Our fuzzy trust model is flexible in that inference rules are used to weight first-hand and second-hand information accordingly.

    21
    11539
    Architecture Based on Dynamic Graphs for the Dynamic Reconfiguration of Farms of Computers
    Abstract:

    In recent years, computers have increased their computational capacity, as have the networks used to interconnect these machines, which have improved to the point of reaching today's high data transfer rates. Programs that try to take advantage of these new technologies cannot be written using traditional programming techniques, since most algorithms were designed to be executed on a single processor in a non-concurrent form, instead of being executed concurrently on a set of processors working and communicating through a network. This paper presents the ongoing development of a new system for the reconfiguration of farms of computers, taking these new technologies into account.

    20
    12187
    A New Face Detection Technique using 2D DCT and Self Organizing Feature Map
    Abstract:
    This paper presents a new technique for the detection of human faces within color images. The approach relies on image segmentation based on skin color, features extracted from the two-dimensional discrete cosine transform (DCT), and self-organizing maps (SOM). After candidate skin regions are extracted, feature vectors are constructed using DCT coefficients computed from those regions. A supervised SOM training session is used to cluster feature vectors into groups and to assign "face" or "non-face" labels to those clusters. Evaluation was performed using a new image database of 286 images containing 1027 faces. After training, our detection technique achieved a detection rate of 77.94% during subsequent tests, with a false positive rate of 5.14%. To our knowledge, the proposed technique is the first to combine DCT-based feature extraction with a SOM for detecting human faces within color images. It is also one of only a few attempts to combine a feature-invariant approach, such as color-based skin segmentation, with appearance-based face detection. The main advantage of the new technique is its low computational requirements, in terms of both processing speed and memory utilization.
    19
    12393
    A Multilanguage Source Code Retrieval System Using Structural-Semantic Fingerprints
    Abstract:
    Source code retrieval is of immense importance in the software engineering field. The complex tasks of retrieving and extracting information from source code documents are vital in the development cycle of large software systems, and the two main subtasks that result from these activities are code duplication prevention and plagiarism detection. In this paper, we propose a source code retrieval system based on a two-level fingerprint representation capturing, respectively, the structural and the semantic information within a source code. A sequence alignment technique is applied to these fingerprints in order to quantify the similarity between source code portions. The specific purpose of the system is to detect plagiarism and duplicated code between programs written in different programming languages belonging to the same class, such as C, C++, Java, and C#. These four languages are supported by the current version of the system, which is designed so that it may be easily adapted to any programming language.
    18
    12459
    A Keyword-Based Filtering Technique of Document-Centric XML using NFA Representation
    Abstract:
    XML is becoming a de facto standard for online data exchange. Existing XML filtering techniques based on a publish/subscribe model are focused on highly structured data marked up with XML tags; these techniques are efficient in filtering data-centric XML documents but are not effective in filtering the element contents of document-centric XML. In this paper, we propose an extended XPath specification which includes the special matching character '%' used in SQL's LIKE operation, in order to overcome the difficulty of writing queries that adequately filter element contents with the previous XPath specification. We also present a novel technique for filtering a collection of document-centric XML documents, called Pfilter, which is able to exploit the extended XPath specification. We present several performance studies of efficiency and scalability using the multi-query processing time (MQPT).
    17
    12579
    Versioning OWL Ontologies using Temporal Tags
    Abstract:
    Ontologies play an important role in semantic web applications; they are often developed by different groups and continue to evolve over time. The knowledge in ontologies changes so rapidly that applications become outdated if they continue to use old versions, or unstable if they jump to new versions. Temporal frames, using frame versioning and slot versioning, are used to take care of the dynamic nature of ontologies. The paper proposes new tags and a restructured OWL format enabling applications to work with either the old or the new version of an ontology. The Gene Ontology, a very dynamic ontology, is used as a case study to explain OWL ontologies with temporal tags.
    16
    13059
    Computer Proven Correctness of the Rabin Public-Key Scheme
    Abstract:
    We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs with a minimal error rate, augmenting the database used. This provides a formal basis for more computer proof constructions in this area.
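
    For orientation, the textbook Rabin scheme that the formalization targets, sketched in Python (the standard construction only, not the Isabelle/HOL development; requires Python 3.8+ for modular inverses):

def rabin_encrypt(m, n):
    """Encryption is squaring modulo n = p*q."""
    return pow(m, 2, n)

def rabin_decrypt(c, p, q):
    """Return the four candidate plaintexts; the right one is usually picked
    via redundancy added before encryption.  Requires p % 4 == q % 4 == 3."""
    n = p * q
    r_p = pow(c, (p + 1) // 4, p)               # square root of c mod p
    r_q = pow(c, (q + 1) // 4, q)               # square root of c mod q
    inv_p_mod_q = pow(p, -1, q)                 # modular inverses for the CRT step
    inv_q_mod_p = pow(q, -1, p)
    x = (r_p * q * inv_q_mod_p + r_q * p * inv_p_mod_q) % n
    y = (r_p * q * inv_q_mod_p - r_q * p * inv_p_mod_q) % n
    return sorted({x, n - x, y, n - y})

p, q = 7, 11                                    # toy primes, both congruent to 3 mod 4
c = rabin_encrypt(20, p * q)
print(c, rabin_decrypt(c, p, q))                # 20 appears among the four roots
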
    15
    13502
    An Exact Solution to Support Vector Mixture
    Abstract:
    This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. For both cases, a slight modification of the mixture model leads to a standard SVM training problem and to the existence of an exact solution, and allows the direct use of well-known decomposition and working set selection algorithms. Only the regression case is considered in this paper, but classification has been addressed in a very similar way. The method has been successfully applied to engine pollutant emission modeling.
    14
    13870
    Multi Language Text Editor for Burushaski and Urdu through Unicode
    Abstract:
    This paper introduces Burushaski, an isolated and unique ancient language spoken in Hunza, Nagar, Yasin, and parts of Gilgit in the Northern Areas of Pakistan, and explains the working mechanism of a multi-language text editor for Urdu and Burushaski. The editor is developed using the ISO/IEC 10646 (Unicode) standard with OpenType fonts for Urdu and Burushaski. It gives this ancient regional language an ample opportunity to benefit from modern information technology for its promotion and preservation. The main objective of this research is to help preserve the heritage of such rare languages and provide a smart way of automation. It also assists those who are interested in undertaking research on Burushaski or keen to trace the phonetic relationship between the national Urdu language and Burushaski. Since this editor covers both Burushaski and Urdu, it can play an important role in introducing Burusho linguistic culture to the world at large. As a result of this research, publication in Burushaski through IT means becomes possible.
    13
    13928
    Analysis and Comparison of Image Encryption Algorithms
    Abstract:

    With the fast progression of electronic data exchange, information security is becoming more important in data storage and transmission. Because images are widely used in industrial processes, it is important to protect confidential image data from unauthorized access. In this paper, we analyze current image encryption algorithms, and compression is added to two of them (mirror-like image encryption and visual cryptography). Implementations of these two algorithms have been realized for experimental purposes, and the results of the analysis are given in this paper.

    12
    14364
    Arabic Character Recognition using Artificial Neural Networks and Statistical Analysis
    Abstract:
    In this paper, an Arabic letter recognition system based on Artificial Neural Networks (ANNs) and statistical analysis for feature extraction is presented. The ANN is trained using the Least Mean Squares (LMS) algorithm. In the proposed system, each typed Arabic letter is represented by a matrix of binary numbers that is used as input to a simple feature extraction system, whose output, in addition to the input matrix, is fed to an ANN. Simulation results are provided and show that the proposed system always produces a lower Mean Squared Error (MSE) and higher success rates than current ANN solutions.
    11
    14421
    Danger Theory and Intelligent Data Processing
    Abstract:
    The Artificial Immune System (AIS) is a relatively young paradigm for intelligent computation, whose inspiration is derived from the natural Immune System (IS). Classically, it is believed that the IS strives to discriminate between self and non-self, and most of the existing AIS research is based on this approach. Danger Theory (DT) argues against this approach and proposes that the IS fights against danger-producing elements and tolerates others. As computational researchers, we are not concerned with the arguments among immunologists, but try to extract from them novel abstractions for intelligent computation. This paper aims to follow the DT inspiration for intelligent data processing; the approach may introduce a new avenue in intelligent processing. The data used are system call data, which are potentially significant in intrusion detection applications.
    10
    14492
    Economy-Based Computing with WebCom
    Abstract:
    Grid environments consist of the volatile integration of discrete heterogeneous resources. The notion of the Grid is to unite different users and organisations and pool their resources into one large computing platform where they can harness, inter-operate, collaborate, and interact. If the Grid community is to achieve this objective, then participants (users and organisations) need to be willing to donate or share their resources and permit other participants to use them. Resources do not have to be shared at all times, since constant sharing may result in users not having access to their own resources. The idea of reward-based computing was developed to address the sharing problem in a pragmatic manner: participants are offered a reward to donate their resources to the Grid. A reward may include monetary recompense or a pro rata share of available resources when constrained. This latter point may imply a quality of service, which in turn may require some globally agreed reservation mechanism. This paper presents a platform for economy-based computing using the WebCom Grid middleware. Using this middleware, participants can configure their resources at times and priority levels that suit their local usage policy. The WebCom system accounts for processing done on individual participants' resources and rewards them accordingly.
    9
    14864
    A New Pattern for Handwritten Persian/Arabic Digit Recognition
    Abstract:

    The main problem in recognizing handwritten Persian digits using a neural network is extracting an appropriate feature vector from the image matrix. In this research, an asymmetrical segmentation pattern is proposed to obtain the feature vector. This pattern can be adjusted as an optimum model thanks to its one degree of freedom, which acts as a control point. Since any chosen segmentation algorithm depends on the digit's identity, a neural network is used to overcome this dependence; the inputs of this network are the moment of inertia and the center of gravity, which do not depend on the digit's identity. Recognizing the digit is carried out using another neural network. Simulation results indicate a high recognition rate of 97.6% for the newly introduced pattern, in comparison to previous models for digit recognition.
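
    A simplified sketch of the two digit-independent inputs mentioned above, computed from a binary digit image:

import numpy as np

def digit_features(binary_image):
    """Centre of gravity and moment of inertia of the foreground pixels;
    like the inputs described in the abstract, neither depends on which digit is drawn."""
    ys, xs = np.nonzero(binary_image)                    # foreground pixel coordinates
    cy, cx = ys.mean(), xs.mean()                        # centre of gravity
    inertia = ((ys - cy) ** 2 + (xs - cx) ** 2).mean()   # second moment about it
    return cx, cy, inertia
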

    8
    14941
    The Variation of Software Development Productivity 1995-2005
    Abstract:
    Software development has experienced remarkable progress in the past decade. However, due to the rising complexity and magnitude of projects, development productivity has not improved consistently. By analyzing the latest ISBSG data repository, with 4106 projects, we discovered that software development productivity actually underwent irregular variations between the years 1995 and 2005. Considering the factors significant to productivity, we found that its variations are primarily caused by variations in average team size and by the unbalanced use of the less productive 3GL languages.
    7
    15040
    Alertness States Classification By SOM and LVQ Neural Networks
    Abstract:
    Several studies have been carried out, using various techniques including neural networks, to discriminate vigilance states in humans from electroencephalographic (EEG) signals, but we are still far from satisfactorily usable results. The work presented in this paper aims at improving this status with regard to two aspects. First, we introduce an original procedure based on the association of two neural networks, a self-organizing map (SOM) and a learning vector quantization (LVQ) network, that automatically detects artefacted states and separates the different levels of vigilance, which is a major breakthrough in the field. Second, and more importantly, our study is oriented toward real-world situations, and the resulting model can easily be implemented as a wearable device: it has restricted computational and memory requirements, and data access is very limited in time. Furthermore, ongoing work indicates that this study should shortly result in the design and conception of a non-invasive electronic wearable device.
    6
    15056
    Extended “2D-RIB“ for Impression-Based Satisfactory Retrieval and its Evaluation
    Abstract:
    Recently, many researchers have been attracted to retrieving from multimedia databases using impression words and their values. Ikezoe's research is one representative approach and uses eight pairs of opposite impression words. We modified its retrieval interface and proposed '2D-RIB' in previous work. The aim of the present paper is to improve the user's satisfaction with the retrieval results in 2D-RIB. Our method is to extend 2D-RIB: one extension is to define and introduce the following two measures, 'melody goodness' and 'general acceptance'; another extension is three types of customization menu. The evaluation using a pilot system shows that both measures, 'melody goodness' and 'general acceptance', can contribute to the improvement. Moreover, it is effective to introduce the customization menu that enables the user to reduce the strictness of the retrieval condition for an impression pair according to his or her needs.
    5
    15125
    Pakistan Sign Language Recognition Using Statistical Template Matching
    Abstract:
    Sign language recognition has been a topic of research since the first data glove was developed. Many researchers have attempted to recognize sign language through various techniques; however, none of them have ventured into the area of Pakistan Sign Language (PSL). The Boltay Haath project aims at recognizing PSL gestures using Statistical Template Matching. The primary input device is the DataGlove5 developed by 5DT. Alternative approaches use camera-based recognition, which, being sensitive to environmental changes, is not always a good choice. This paper explains the use of Statistical Template Matching for gesture recognition in Boltay Haath. The system recognizes one-handed alphabet signs from PSL.
    4
    15310
    An Approach to Image Extraction and Accurate Skin Detection from Web Pages
    Abstract:

    This paper proposes a system to extract images from web pages and then detect the skin color regions of these images. As part of the proposed system, we built a toolbar named the Filter Tool Bar (FTB), using the BandObject control and modifying Pavel Zolnikov's implementation. The Yahoo! team provides the Yahoo! SDK API, which also supports image search and is very useful. In the proposed system, we introduce three new methods for extracting images from web pages (after loading the web page by using the proposed FTB, before loading the web page physically from the localhost, and before loading the web page from any server). These methods overcome the drawback of the regular-expression method for extracting images suggested by Ilan Assayag. The second part of the proposed system is concerned with detecting the skin color regions of the extracted images. We studied two well-known skin color detection techniques: the first is based on the RGB color space and the second on the YUV and YIQ color spaces. We modified the second technique to overcome its failure on images with complex backgrounds by using the saturation parameter, obtaining accurate skin detection results. The performance of the proposed system in extracting images before and after loading the web page from the localhost or from any server is evaluated in terms of the number of extracted images. Finally, the results of comparing the two skin detection techniques in terms of the number of pixels detected are presented.
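
    For the RGB-based technique, one widely cited rule of thumb looks like the sketch below; the thresholds come from the general literature and are not necessarily those used in the paper, and the YUV/YIQ-plus-saturation refinement is omitted:

import numpy as np

def rgb_skin_mask(image_rgb):
    """Boolean mask of candidate skin pixels using a common RGB heuristic."""
    img = image_rgb.astype(int)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    spread = img.max(axis=-1) - img.min(axis=-1)         # per-pixel channel spread
    return ((r > 95) & (g > 40) & (b > 20) &
            (spread > 15) & (np.abs(r - g) > 15) & (r > g) & (r > b))
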

    3
    15348
    Analysis of Sonogram Images of Thyroid Gland Based on Wavelet Transform
    Abstract:
    Sonogram images of normal and lymphocytic thyroid tissues have considerable overlap, which makes them difficult to interpret and distinguish. Classification from sonogram images of the thyroid gland is tackled in a semi-automatic way. When making a manual diagnosis from images, some relevant information may not be recognized by the human visual system, so quantitative image analysis can assist the manual diagnostic process so far carried out by the physician. Two classes are considered: normal tissue and chronic lymphocytic thyroiditis (Hashimoto's thyroiditis). The data structure is analyzed using K-nearest-neighbors classification. The paper shows that, unlike the energy of the wavelet sub-bands, histograms and Haralick features are not appropriate for distinguishing between normal tissue and Hashimoto's thyroiditis.
    2
    15360
    Understanding and Measuring Trust Evolution Effectiveness in Peer-to-Peer Computing Systems
    Abstract:
    In any trust model, the two information sources that a peer relies on to predict the trustworthiness of another peer are direct experience and reputation. These two vital components evolve over time. Trust evolution is an important issue, where the objective is to observe a sequence of past values of a trust parameter and determine the future estimates. Unfortunately, trust evolution algorithms have received little attention, and the algorithms proposed in the literature do not comply with the conditions and the nature of trust. This paper contributes to this important problem in the following ways: (a) it presents an algorithm that manages and models trust evolution in a P2P environment, (b) it devises new mechanisms for effectively maintaining trust values based on the conditions that influence trust evolution, and (c) it introduces a new methodology for incorporating trust-nurture incentives into the trust evolution algorithm. Simulation experiments are carried out to evaluate our trust evolution algorithm.
    1
    15859
    Testing Loaded Programs Using Fault Injection Technique
    Abstract:
    Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing, concentrating on memory faults: how to access the editable part of a process memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jetAudio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample processes indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size will increase the scanned memory space. They also show that input/output processes have a larger scanned area than other processes. Increasing the virtual memory size will also affect the size of the scanned area, but only up to a certain limit.