Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
The rapid generation of high-volume and highly varied data by new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods in order to develop their business. The resulting decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business-intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
The Impact of Regulatory Changes on the Development of Mobile Medical Apps
Mobile applications are being used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling your home heating. Application developers have recognized the potential to transform a smart device, i.e. a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g. a BMI calculator or access to reference material; however, increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increase, so too does the potential risk associated with using such an application. Examples include applications that inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosage or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, it faces significant penalties, such as an FDA warning letter to cease the prohibited activity, fines, and possibly a criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, these regulations have the potential to stifle further improvements. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements whilst still fostering innovation.
A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6
Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its usefulness, NDP is vulnerable to a number of attacks. To mitigate them, Internet Protocol Security (IPsec) was introduced, but it proved inefficient due to its limitations. The Secure Neighbor Discovery (SEND) protocol was therefore proposed to automatically protect the auto-configuration process and to secure neighbor discovery and address resolution. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Despite its advantages, SEND has considerable drawbacks, notably the computational cost of the CGA algorithm and its sequential nature. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare CGA generation time in self-computing and distributed-computing settings, and we focus on the impact of malicious nodes on CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, CGA generation time is lower than when the address is computed on a single node. Using a Trust Management System, detecting and isolating malicious nodes becomes easier.
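The costly step SEND inherits is the CGA Hash2 brute force, which is what distributed generation targets. As a rough illustration (not the authors' implementation, and simplified relative to RFC 3972's exact input encoding), the sketch below searches for a modifier satisfying the Hash2 condition; the `start`/`step` parameters show how the search space could be partitioned across cooperating nodes:

```python
import hashlib

def hash2_ok(digest: bytes, sec: int) -> bool:
    """RFC 3972 Hash2 condition: the 16*sec leftmost bits must be zero."""
    return digest[: 2 * sec] == b"\x00" * (2 * sec)

def find_modifier(pubkey: bytes, sec: int, start: int, step: int) -> int:
    """Brute-force the 16-byte modifier.  `start`/`step` let several
    workers (or trusted network nodes) search disjoint residue classes
    of the modifier space in parallel."""
    m = start
    while True:
        # Hash2 input: modifier || 9 zero octets || public key
        digest = hashlib.sha1(m.to_bytes(16, "big") + b"\x00" * 9 + pubkey).digest()
        if hash2_ok(digest, sec):
            return m
        m += step
```

With `sec = 1` the search needs about 2^16 hashes on average; each increment of `sec` multiplies that cost by 2^16, which is why distributing the search pays off.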
Towards Developing a Self-Explanatory Scheduling System Based on a Hybrid Approach
In this study, we present a conceptual framework for developing a scheduling system that can generate self-explanatory and easy-to-understand schedules. To this end, a user interface is conceived to help planners record factors that are considered crucial in scheduling, as well as internal and external sources relating to such factors. A hybrid approach combining machine learning and constraint programming is developed to generate schedules and the corresponding factors, and to display them on the user interface accordingly. The effects of the proposed system on scheduling are discussed, and it is expected that scheduling efficiency and system understandability will improve compared with previous scheduling systems.
Data Collection with Bounded-Sized Messages in Wireless Sensor Networks
In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR). The main issue is to compute schedules with the minimum number of timeslots, that is, minimum-latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant-factor approximation algorithm. To the best of our knowledge, this is the first result for the data collection problem with the bounded-sized message model in both interference models.
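As a toy illustration of the kind of schedule involved (graph interference model, aggregating convergecast over a routing tree, not the paper's bounded-sized-message algorithm), the greedy sketch below assigns each node one collision-free transmission toward its parent:

```python
from collections import defaultdict

def collides(v, slot, parent, adj):
    """True if sender v's transmission to parent[v] conflicts with the
    senders already placed in this slot (graph interference model)."""
    p = parent[v]
    for u in slot:
        r = parent[u]
        if u == p or r == v:            # a node cannot send and receive at once
            return True
        if u in adj[p] or v in adj[r]:  # collision at one of the receivers
            return True
    return False

def collection_schedule(parent, adj):
    """Greedy aggregating convergecast: every node transmits once to its
    parent, after all of its children have transmitted in earlier slots."""
    children = defaultdict(set)
    for v, p in parent.items():
        children[p].add(v)
    done, pending = set(), set(parent)
    schedule = []                       # schedule[t] = set of senders in slot t
    while pending:
        slot = set()
        for v in sorted(pending):
            if children[v] <= done and not collides(v, slot, parent, adj):
                slot.add(v)
        pending -= slot
        done |= slot
        schedule.append(slot)
    return schedule
```

On a 4-node path with the sink at one end this produces three slots, one per hop; minimizing the slot count in general (and under SINR, with bounded message sizes) is the hard problem the paper addresses.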
A Practical and Efficient Evaluation Function for 3D Model Based Vehicle Matching
3D model-based vehicle matching provides a new way to perform vehicle recognition, localization and tracking. Its key is to construct an evaluation function, also called a fitness function, to measure the degree of vehicle matching. Existing fitness functions often perform poorly when clutter and occlusion are present in traffic scenarios. In this paper, we present a practical and efficient fitness function. Unlike existing evaluation functions, the proposed fitness function studies the vehicle matching problem from both local and global perspectives, exploiting pixel gradient information as well as silhouette information. In view of the discrepancy between a 3D vehicle model and the real vehicle, a weighting strategy is introduced to treat the fitting of the model's wireframes differently. Additionally, a normalization operation on the model's projection is performed to improve matching accuracy. Experimental results on real traffic videos reveal that the proposed fitness function is efficient and robust to cluttered backgrounds and partial occlusion.
Computing Maximum Uniquely Restricted Matchings in Restricted Interval Graphs
A uniquely restricted matching is defined to be a matching M whose matched vertices induce a subgraph that has exactly one perfect matching. In this paper, we make progress on the open question of the status of this problem on interval graphs (graphs obtained as the intersection graph of intervals on a line). We give an algorithm to compute maximum-cardinality uniquely restricted matchings on certain subclasses of interval graphs. We consider two subclasses of interval graphs, the former contained in the latter, and give O(|E|^2)-time algorithms for both of them. It is to be noted that both subclasses are incomparable to proper interval graphs (graphs obtained as the intersection graph of intervals in which no interval completely contains another), on which the problem can be solved in polynomial time.
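The definition above can be checked directly, if inefficiently, by counting perfect matchings of the induced subgraph. A brute-force sketch, for intuition only (the paper's algorithms are far more efficient):

```python
def count_perfect_matchings(vertices, edges):
    """Count perfect matchings of the graph induced on `vertices`
    (edges given as a set of vertex pairs)."""
    vs = sorted(vertices)
    if not vs:
        return 1
    v = vs[0]
    total = 0
    # v must be matched to some neighbour u; recurse on the rest.
    for u in vs[1:]:
        if (v, u) in edges or (u, v) in edges:
            total += count_perfect_matchings(set(vs) - {v, u}, edges)
    return total

def is_uniquely_restricted(matching, edges):
    """M is uniquely restricted iff the subgraph induced by its matched
    vertices has exactly one perfect matching (namely M itself)."""
    matched = {v for e in matching for v in e}
    return count_perfect_matchings(matched, edges) == 1
```

On the 4-cycle, the matching of two opposite edges is not uniquely restricted (the induced subgraph has two perfect matchings), while any single edge is.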
An Open Source Advertisement System
An online advertisement system and its implementation for the Yioop open source search engine are presented. This system supports both selling advertisements and displaying them within search results. Advertisements are sold by auctioning off daily impressions for keyword searches. This is an open, ascending-price auction in which all accepted bids receive a fraction of the auctioned day's impressions. New bids in our system are required to be at least one half of the sum of all previous bids, ensuring that the number of accepted bids is logarithmic in the total ad spend on a keyword for a day. The mechanics of creating an advertisement, attaching keywords to it, and adding it to an advertisement inventory are described. The algorithm used to go from accepted bids for a keyword to the ads displayed at search time is also presented. We discuss properties of our system and compare it to existing auction systems and systems for selling advertisements.
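The bid-acceptance rule quoted above is easy to state in code. A minimal sketch (interpreting "previous bids" as previously accepted bids, which is an assumption):

```python
def accept_bids(bids):
    """Accept a new bid only if it is at least half the sum of all
    previously accepted bids.  Each acceptance therefore grows the
    running sum by a factor of at least 1.5, so the number of accepted
    bids is logarithmic in the total spend on the keyword."""
    accepted = []
    for b in bids:
        if b >= 0.5 * sum(accepted):
            accepted.append(b)
    return accepted
```

For example, a stream of equal bids saturates after three acceptances, while a strictly ascending stream (each bid at least half the running total) is accepted in full.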
Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks
In this paper, we consider a CCL-N (Cooperative Cross Layer Network) topology based on a cross-layer (both centralized and distributed) environment to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in designing the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Stations [MBS]) and serving nodes (Relay Stations [RS]), and node communities are organized according to standard networking terminology. Based on the CCL-N topology, simulation analyses for both transparent and non-transparent relays are tabulated and throughput efficiency is calculated. The weighted load-balancing problem poses a challenge in IEEE 802.16 networks. A CoTS (Concurrent Transmission Scheduling) scheme is formulated in terms of three aspects: transmission mechanisms based on identical communities, different communities, and identical node communities. The CoTS scheme helps in identifying the weighted load-balancing problem. According to the analytical results, the modularity value is inversely proportional to the error value, and it plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for identical node communities has no impact, since the modularity value is the same for all network groups. In this paper, the three community aspects based on the modularity value, which help in solving the weighted load-balancing and CoTS problems, are discussed.
Improving Cryptographically Generated Address Algorithm in IPv6 Secure Neighbor Discovery Protocol through Trust Management
As the transition to widespread use of IPv6 addresses has gained momentum, IPv6 has been shown to be vulnerable to certain security attacks, such as those targeting the Neighbor Discovery Protocol (NDP), which provides address resolution in IPv6. To protect this protocol, Secure Neighbor Discovery (SEND) was introduced. SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography as a defense against threats to the integrity and identity of NDP. Although SEND protects NDP against attacks, it is computationally intensive due to the Hash2 condition in CGA. To improve CGA computation speed, we parallelized the CGA generation process and used the available resources in a trusted network. Furthermore, we focused on the influence of malicious nodes on the overall load of benign nodes in the network. According to the evaluation results, malicious nodes adversely impact the average CGA generation time and the average number of tries. We utilized a Trust Management System capable of detecting and isolating malicious nodes to remove possible incentives for malicious behavior, and we demonstrated its effectiveness in detecting malicious nodes and hence improving overall system performance.
Distributed Coordination of Connected and Automated Vehicles at Multiple Interconnected Intersections
In connected vehicle systems, where wireless communication is available among the involved vehicles and intersection controllers, it is possible to design an intersection coordination strategy that leads connected and automated vehicles (CAVs) through road intersections without conventional traffic light control. In this paper, we present a distributed coordination strategy for CAVs at multiple interconnected intersections that aims at improving system fuel efficiency and system mobility. In our distributed control solution, at the higher level, the intersection controllers calculate the desired average road velocity and optimally assign a reference velocity to each vehicle; at the lower level, every vehicle uses model predictive control (MPC) to track the reference velocity obtained from the higher-level controller. The proposed method has been implemented in a simulation-based case study with a network of two interconnected intersections. Additionally, the effects of mixed vehicle types on the coordination strategy have been explored. Simulation results indicate that the proposed method improves vehicle fuel efficiency and traffic mobility.
Network Coding with Buffer Scheme in Multicast for Broadband Wireless Network
Broadband Wireless Networks (BWNs) are a promising technology, given the growing number of smartphones. A buffering scheme using network coding addresses reliability and proper degree distribution in Worldwide Interoperability for Microwave Access (WiMAX) multi-hop networks. Network coding enables a secure form of transmission that helps improve throughput and reduce packet loss in the multicast network. First, an improved network coding scheme is proposed for multicast wireless mesh networks. To address the problem of performance overhead, the degree distribution guides buffering decisions during the encoding/decoding process. Consequently, BuS (Buffer Scheme), based on network coding, is proposed for the multi-hop network. Here, the encoding process introduces a buffer for temporary storage so that packets are transmitted with a proper degree distribution. The simulation results report the number of packets received in encoding/decoding with a proper degree distribution using the buffering scheme.
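The core network-coding idea, stripped of the degree-distribution machinery, is that a relay can buffer packets and forward their XOR combination, replacing two transmissions with one. A minimal sketch (the class name and buffering policy are illustrative, not the BuS protocol itself):

```python
def xor_encode(p1: bytes, p2: bytes) -> bytes:
    """Combine two equal-length packets into a single coded packet."""
    return bytes(a ^ b for a, b in zip(p1, p2))

class BufferedRelay:
    """Buffers incoming packets; once a pair is available, emits one
    coded packet for broadcast instead of forwarding two native ones."""
    def __init__(self):
        self.buffer = []

    def receive(self, packet: bytes):
        self.buffer.append(packet)
        if len(self.buffer) == 2:
            coded = xor_encode(*self.buffer)
            self.buffer.clear()
            return coded        # broadcast to the multicast receivers
        return None             # still waiting for a second packet
```

A receiver that already holds one native packet recovers the other by XOR-ing it against the coded packet, which is what makes the single broadcast sufficient.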
Using Genetic Algorithms to Outline Crop Rotations and a Cropping-System Model
Cropping systems are a method used by farmers. It is an environmentally friendly method that protects natural resources (soil, water, air, nutritive substances) and increases production at the same time, taking some crop particularities into account. Combining this powerful method with the concepts of genetic algorithms makes it possible to generate sequences of crops that form a rotation. This type of algorithm has proved effective in solving optimization problems, and its polynomial complexity allows it to be applied to more difficult and varied problems. In our case, the optimization consists of finding the most profitable rotation of crops. One of the expected results is to optimize the usage of resources in order to minimize costs and maximize profit. To achieve these goals, a genetic algorithm was designed. This algorithm finds several optimized cropping-system possibilities that have the highest profit and thus minimize costs. The algorithm uses genetic-based methods (mutation, crossover) and structures (genes, chromosomes): a cropping-system possibility is treated as a chromosome, and a crop within the rotation is a gene within that chromosome. Results on the efficiency of this method are presented in a dedicated section. Implementing this method would benefit farmers by giving them hints and helping them use resources efficiently.
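The chromosome/gene encoding described above can be sketched as follows. The crop list, profit table and repeat-penalty rule are hypothetical placeholders standing in for the paper's actual agronomic model:

```python
import random

CROPS = ["wheat", "maize", "soy", "barley"]          # hypothetical crop set
PROFIT = {"wheat": 3, "maize": 5, "soy": 4, "barley": 2}

def fitness(chromosome):
    """Toy profit model: sum of per-crop profits, with a penalty for
    planting the same crop in consecutive years (a basic rotation rule)."""
    score = sum(PROFIT[c] for c in chromosome)
    score -= 4 * sum(a == b for a, b in zip(chromosome, chromosome[1:]))
    return score

def evolve(length=6, pop_size=30, generations=50, seed=1):
    """One-point crossover + point mutation, with elitist survival."""
    rng = random.Random(seed)
    pop = [[rng.choice(CROPS) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]              # elitism
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)           # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                   # mutation
                child[rng.randrange(length)] = rng.choice(CROPS)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

Each chromosome is a candidate rotation; each gene is the crop planted in one year, mirroring the encoding described in the abstract.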
VANETs: Security Challenges and Future Directions
Connected vehicles are equipped with wireless sensors that aid in Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication. In the near future, these vehicles will provide road safety, improve transport efficiency, and reduce traffic congestion. One of the challenges for connected vehicles is how to ensure that information sent across the network is secure. If network security is not guaranteed, several attacks can occur, compromising the robustness, reliability, and efficiency of the network. This paper discusses existing security mechanisms and unique properties of connected vehicles. The methodology employed in this work is exploratory. The paper reviews existing security solutions for connected vehicles; more concretely, it discusses the various cryptographic mechanisms available and suggests areas of improvement. The study proposes a combination of symmetric key encryption and public key cryptography to improve security, and further proposes message aggregation as a technique to overcome message redundancy. This paper offers a comprehensive overview of connected vehicle technology, its applications, its security mechanisms, open challenges, and potential areas of future research.
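Message aggregation, one of the techniques the study proposes, can be illustrated with a small suppression filter: reports about the same event on the same road segment within a time window are treated as redundant and not re-forwarded. A toy sketch (the keying and window policy are assumptions, not the paper's scheme):

```python
class MessageAggregator:
    """Suppresses redundant V2V event reports: only the first report of
    a given (event, road segment) pair within `window` seconds is
    forwarded on the vehicular network."""
    def __init__(self, window=5.0):
        self.window = window
        self.last_sent = {}                  # (event, segment) -> timestamp

    def forward(self, event, segment, now):
        key = (event, segment)
        if key in self.last_sent and now - self.last_sent[key] < self.window:
            return False                     # redundant: suppress
        self.last_sent[key] = now
        return True                          # first (or stale) report: forward
```

In a dense platoon, where dozens of vehicles observe the same hazard, this kind of filter keeps the channel load proportional to the number of distinct events rather than the number of observers.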
A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation
Software vulnerabilities are increasing; they not only impact the availability of services and processes, as well as the confidentiality, integrity and privacy of information, but also cause changes that interfere with the development process. Security testing could be a solution to reduce vulnerabilities. However, the variety of test techniques, combined with the lack of real case studies of applying tests across the software development life cycle, compromises its effective use. This paper offers an overview of how a Systematic Mapping Study (MS) on security verification, validation and testing (VVT) was performed, and presents general results of this study.
Evaluation of Ensemble Classifiers for Intrusion Detection
One of the major developments in machine learning in the past decade is the ensemble method, which builds a highly accurate classifier by combining many moderately accurate component classifiers. In this work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing; their performance is analyzed in terms of accuracy. A classifier ensemble is designed using Radial Basis Function (RBF) networks and Support Vector Machines (SVM) as base classifiers. The feasibility and benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The proposed approach consists of three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to that of other standard ensemble methods: the homogeneous methods include error-correcting output codes (ECOC) and Dagging, and the heterogeneous methods include majority voting and stacking. The proposed ensemble methods provide a significant improvement in accuracy compared to individual classifiers; the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Heterogeneous models also exhibit better results than homogeneous models on the standard intrusion detection datasets.
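Bagging, the homogeneous technique evaluated above, trains each component classifier on a bootstrap resample and combines them by majority vote. In the sketch below a one-feature decision stump stands in for the RBF/SVM base classifiers of the paper (which would require an ML library); the mechanism is the same:

```python
import random

def stump_train(data):
    """Train a one-feature decision stump on (x, label) pairs: choose
    the threshold and direction minimising training error."""
    best = None
    for t in sorted({x for x, _ in data}):
        for sign in (1, -1):
            err = sum((1 if sign * (x - t) >= 0 else 0) != y for x, y in data)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: 1 if sign * (x - t) >= 0 else 0

def bagging(data, n_models=15, seed=0):
    """Bootstrap-aggregated stumps combined by majority voting."""
    rng = random.Random(seed)
    models = [stump_train([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    return lambda x: int(sum(m(x) for m in models) > n_models / 2)
```

The vote averages out the variance of the individual resampled learners, which is the effect the paper exploits with stronger base classifiers.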
Ubiquitous Life People Informatics Engine (U-Life PIE): Wearable Health Promotion System
Since Google launched Google Glass in 2012, a number of commercial wearable devices have been released, such as smart belts, smart bands, smart shoes and smart clothes. However, most of these devices act merely as sensors that show measurement readings, and few of them provide interactive feedback to the user. Furthermore, these devices are single-task devices that are not able to communicate with each other. In this paper, a new health promotion system, the Ubiquitous Life People Informatics Engine (U-Life PIE), is presented. This system consists of the People Informatics Engine (PIE) and an interactive user interface. PIE collects all the data from compatible devices, analyzes the data comprehensively, and communicates between devices via various application programming interfaces. All data and information are stored on the PIE unit, so the user is able to view instant and historical data on their mobile devices at any time. It also provides real-time hands-free feedback and instructions through the user interface, visually, acoustically and tactilely. This feedback guides the user to adjust their posture or habits in order to avoid physical injury and prevent illness.
3D Mesh Coarsening via Uniform Clustering
In this paper, we present a fast and efficient mesh coarsening algorithm for 3D triangular meshes. This approach can be applied to very complex 3D meshes of arbitrary topology with millions of vertices. The algorithm is based on clustering the input mesh elements: it divides the faces of an input mesh into a given number of clusters by approximating the Centroidal Voronoi Tessellation of the input mesh. Once a clustering is achieved, it provides an efficient way to construct uniform tessellations, and therefore leads to good coarsening of polygonal meshes. With the proliferation of 3D scanners, this coarsening algorithm is particularly useful for reverse engineering applications on 3D models, which in many cases are dense, non-uniform, irregular and of arbitrary topology. Examples demonstrating the effectiveness of the new algorithm are also included in the paper.
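The Centroidal Voronoi Tessellation at the heart of the clustering step is commonly approximated by Lloyd iterations: assign each element to its nearest seed, then move each seed to its cluster's centroid. A sketch on plain 2D points (a mesh implementation would cluster faces, typically with area weighting and adjacency constraints not shown here):

```python
import random

def lloyd_clustering(points, k, iterations=20, seed=0):
    """Lloyd's algorithm, the standard way to approximate a Centroidal
    Voronoi Tessellation: alternate nearest-seed assignment with
    centroid updates until the seeds settle."""
    rng = random.Random(seed)
    seeds = rng.sample(points, k)
    clusters = []
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: sum(
                (a - b) ** 2 for a, b in zip(p, seeds[j])))
            clusters[nearest].append(p)
        for j, cl in enumerate(clusters):
            if cl:                       # move each seed to its centroid
                seeds[j] = tuple(sum(c) / len(cl) for c in zip(*cl))
    return seeds, clusters
```

The resulting clusters are nearly uniform in size and shape, which is exactly the property the coarsening algorithm relies on to produce uniform tessellations.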
An Integrated Cloud Service of Application Delivery in Virtualized Environments
Virtualization technologies are experiencing renewed interest as a way to improve system reliability and availability, reduce costs, and provide flexibility. This paper presents a development that leverages existing cloud infrastructure and virtualization tools. We adopted virtualization technologies that improve the portability, manageability and compatibility of applications by encapsulating them from the underlying operating system on which they are executed. Application virtualization allows shifting the user's applications from the traditional PC environment to a virtualized environment, stored on a remote virtual machine rather than locally. The proposed effort has the potential to provide an efficient, resilient and elastic environment for online cloud services. Users no longer need to bear the burden of platform maintenance, which drastically reduces the overall cost of hardware and software licenses. Moreover, this flexible, web-based application virtualization service represents the next significant step toward the mobile workplace, letting users execute their applications from virtually anywhere.
An Analysis of Classification of Imbalanced Datasets by Using Synthetic Minority Over-Sampling Technique
Analysing unbalanced datasets is one of the challenges that practitioners in the machine learning field face. Much research has been carried out to determine the effectiveness of the synthetic minority over-sampling technique (SMOTE) in addressing this issue. The aim of this study was therefore to compare the effectiveness of SMOTE across different models on unbalanced datasets. Three classification models (logistic regression, support vector machine and nearest neighbour) were tested with multiple datasets; the same datasets were then oversampled using SMOTE and applied again to the three models to compare the differences in performance. The experimental results show that using a larger number of nearest neighbours yields lower error rates.
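SMOTE's core idea is to synthesize minority samples by interpolating between a minority point and one of its k nearest minority neighbours, rather than simply duplicating points. A self-contained sketch of that idea (list-based, no ML library; Chawla et al.'s original algorithm adds details not shown):

```python
import random

def smote(minority, n_new, k=3, seed=0):
    """Generate `n_new` synthetic minority samples: pick a minority
    point, pick one of its k nearest minority neighbours, and place a
    new sample at a random position on the segment between them."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()                      # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic
```

Because each synthetic point lies between two real minority points, the oversampled class occupies a region rather than a set of repeated points, which is what helps the downstream classifiers.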
From Research to Teaching: Integrating Social Robotics in Engineering Degrees
When industrial robotics is taught in a robotics degree, social and humanoid robotics concepts are rarely mentioned, because this field of robotics is not used in industry. In this paper, an educational project related to industrial robotics that includes social and humanoid robotics is presented. The main motivations for this research are: i) humanoid robots will soon appear in industry, and experience from research projects indicates their deployment sooner than expected; ii) its educational interest, since the technology is shared with industrial robotics; iii) it is very attractive: students are interested in this part of the subject and thus become interested in the whole subject. As a pedagogical methodology, problem-based learning is used. These concepts are introduced in a seminar during the last part of the subject and developed as a set of practical sessions in the laboratory.
A Method for Modeling Flexible Manipulators: Transfer Matrix Method with Finite Segments
This paper presents a computationally efficient method for modeling robot manipulators with flexible links and joints. The approach combines the Discrete Time Transfer Matrix Method with the Finite Segment Method, in which the flexible links are discretized into a number of rigid segments connected by torsion springs, and the flexibility of the joints is also modeled by torsion springs. The proposed method avoids assembling the global dynamics and has the advantage of modeling non-uniform manipulators. Experiments and simulations of a single-link flexible manipulator are conducted to verify the proposed methodology. Simulations of a three-link robot arm with link and joint flexibility are also performed.
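The finite-segment idea, rigid segments joined by torsion springs of stiffness k = EI/h, can be sanity-checked on a static case: the accumulated tip deflection of a cantilever under a tip load should converge to the Euler-Bernoulli value FL³/(3EI) as the segment count grows. A small sketch (static, small-angle; the paper's method handles the full dynamics):

```python
def tip_deflection(F, L, EI, n_segments):
    """Finite-segment model of a cantilever: split the link into n rigid
    segments of length h joined by torsion springs of stiffness EI/h,
    and accumulate the tip deflection joint by joint under a tip load F."""
    h = L / n_segments
    k = EI / h                       # torsion-spring stiffness per joint
    deflection = 0.0
    for i in range(n_segments):
        x = i * h                    # joint position along the link
        moment = F * (L - x)         # bending moment at the joint
        angle = moment / k           # small-angle spring rotation
        deflection += angle * (L - x)
    return deflection
```

With 200 segments the result is within one percent of the closed-form beam solution, illustrating why a modest number of rigid segments suffices.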
Performance Analysis of Bluetooth Low Energy Mesh Routing Algorithm in Case of Disaster Prediction
The ubiquity of natural disasters during the last few decades has raised serious questions about the prediction of such events and human safety. Every disaster, regardless of its magnitude, has a precursor manifested as a disruption of some environmental parameter, such as temperature, humidity, pressure or vibration. In order to anticipate and monitor those changes, in this paper we propose an overall system for disaster prediction and monitoring based on a wireless sensor network (WSN). Furthermore, we introduce a modified and simplified WSN routing protocol built on top of the Trickle routing algorithm. The routing algorithm was deployed using the Bluetooth Low Energy protocol in order to achieve low power consumption. The performance of the WSN was analyzed using a real-life system implementation, and estimates of WSN parameters such as battery lifetime, network size and packet delay were determined. Based on its performance, the proposed system can be utilized for disaster monitoring and prediction thanks to its low power profile and mesh routing feature.
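The Trickle algorithm the routing protocol builds on can be summarized in a few lines: the broadcast interval doubles while the network is consistent, resets on inconsistency, and a transmission is suppressed when enough consistent messages were overheard in the current interval. A simplified sketch (RFC 6206 specifies more detail than shown here):

```python
import random

class Trickle:
    """Simplified Trickle timer: interval doubles up to i_max while the
    network is consistent, resets to i_min on inconsistency, and the
    node's own transmission is suppressed once k consistent messages
    have been overheard in the interval."""
    def __init__(self, i_min=1.0, i_max=64.0, k=2, seed=0):
        self.i_min, self.i_max, self.k = i_min, i_max, k
        self.rng = random.Random(seed)
        self.reset()

    def reset(self):
        self.interval = self.i_min
        self.counter = 0

    def fire_time(self):
        """Random transmit time in the second half of the interval."""
        return self.rng.uniform(self.interval / 2, self.interval)

    def end_of_interval(self):
        """Return True if the node should transmit, then double up."""
        transmit = self.counter < self.k
        self.interval = min(2 * self.interval, self.i_max)
        self.counter = 0
        return transmit

    def heard_consistent(self):
        self.counter += 1

    def heard_inconsistent(self):
        self.reset()
```

The exponential back-off is what keeps radio duty cycle, and hence battery drain, low while the monitored environment is quiet, which matches the low-power goal of the proposed system.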
MIOM: A Mixed-Initiative Operational Model for Robots in Urban Search and Rescue
In this paper, we describe a Mixed-Initiative Operational Model (MIOM) which directly intervenes on the state of the functionalities embedded into a robot for Urban Search & Rescue (USAR) domain applications. MIOM extends the reasoning capabilities of the vehicle, i.e. mapping, path planning, visual perception and trajectory tracking, with operator knowledge. Especially in USAR scenarios, this coupled initiative has the main advantage of enhancing the overall performance of a rescue mission. In-field experiments with rescue responders have been carried out to evaluate the effectiveness of this operational model.
Learning the Dynamics of Articulated Tracked Vehicles
In this work, we present a Bayesian non-parametric approach to model the motion control of articulated tracked vehicles (ATVs). The motion control model is based on a Dirichlet Process-Gaussian Process (DP-GP) mixture model, which provides a flexible representation of patterns of control manoeuvres along trajectories of different lengths and discretizations. The model also estimates the number of patterns sufficient for modeling the dynamics of the ATV.
A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System
Due to the rapid growth of modern communication systems, fault tolerance and data security are two important issues in secure transactions. During transmission between sender and receiver, errors may occur frequently, forcing the sender to re-transmit the data, which makes the system fragile. To improve the scalability of such schemes, we present a secure proxy signature scheme with fault tolerance built over an efficient and secure authenticated key agreement protocol based on the RSA cryptosystem. Authenticated key agreement protocols play an important role in building secure communication between two parties.
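For orientation, the RSA signing and verification primitive the scheme builds on looks as follows. This is a toy with tiny primes and plain hash-and-sign, not the paper's proxy scheme and not a secure construction; the delegation step (signing a warrant that authorizes the proxy) is sketched only in spirit:

```python
import hashlib

# Toy parameters only -- real RSA keys are >= 2048 bits.
P, Q = 61, 53
N = P * Q                          # modulus: 3233
PHI = (P - 1) * (Q - 1)            # 3120
E = 17                             # public exponent
D = pow(E, -1, PHI)                # private exponent: 2753

def sign(message: bytes, d=D, n=N) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int, e=E, n=N) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

def proxy_sign(message: bytes, warrant: bytes):
    """Illustrative delegation: the warrant (stating the delegation) is
    signed, and the message is bound to the warrant before signing."""
    return sign(message + warrant), sign(warrant)
```

A verifier checks both the warrant signature (the delegation is genuine) and the message signature (the proxy signed within that delegation); the paper adds fault tolerance and key agreement on top of this kind of primitive.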
Sensor and Actuator Fault Detection in Connected Vehicles under a Packet Dropping Network
Connected vehicles are one of the promising technologies for future Intelligent Transportation Systems (ITS). A connected vehicle system is essentially a set of vehicles communicating through a network to exchange their information with each other and the infrastructure. Although this interconnection of vehicles can be potentially beneficial in creating an efficient, sustainable, and green transportation system, a set of safety and reliability challenges accompany this technology. The first challenge arises from information loss due to an unreliable communication network, which affects the control/management system of the individual vehicles and the overall system. Such a scenario may lead to degraded or even unsafe operation, which could be potentially catastrophic. Secondly, faulty sensors and actuators can affect an individual vehicle's safe operation and in turn create a potentially unsafe node in the vehicular network. Further, forwarding faulty sensor information to other vehicles, together with actuator failures, may significantly affect the safe operation of the overall vehicular network. Therefore, it is of utmost importance to take these issues into consideration while designing the control/management algorithms of individual vehicles as part of a connected vehicle system. In this paper, we consider a connected vehicle system under Co-operative Adaptive Cruise Control (CACC) and propose a fault diagnosis scheme that deals with these challenges. Specifically, the conventional CACC algorithm is modified by adding a Kalman filter-based estimation algorithm to suppress the effect of lost information under an unreliable network. Further, a sliding mode observer-based algorithm is used to improve sensor reliability under faults. The effectiveness of the overall diagnostic scheme is verified via simulation studies.
Maintaining User-Level Security in Short Message Service
The mobile phone has become an essential part of our lives, so security is a prime consideration in mobile communication. The short message service (SMS) is the cheapest way of communicating via mobile phones, and its security is therefore equally important. This paper presents a method to maintain security at the user level. Different types of encryption methods are used to implement user-level security on mobile phones: the Caesar cipher, Rail Fence, the Vigenère cipher and RSA. The Caesar cipher and Rail Fence methods are enhanced and implemented. A key feature of this work is that the user can select both the encryption method and the key; by changing them from time to time, the user can ensure the security of messages. With this work, users can safely send and receive messages and protect their information from unauthorised and unwanted parties on their own mobile phones.
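Two of the user-selectable ciphers, Caesar and Rail Fence, are classical and easy to sketch; composing them gives a simple layered encryption of an SMS payload. The paper's enhancements to these ciphers are not reproduced here, and neither cipher is secure on its own:

```python
def caesar(text: str, key: int) -> str:
    """Shift letters by `key` positions; decrypt with the negated key."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

def _zigzag_rows(length: int, rails: int):
    """Row index of each position when written in a zig-zag pattern."""
    rows, row, step = [], 0, 1
    for _ in range(length):
        rows.append(row)
        if row == 0:
            step = 1
        elif row == rails - 1:
            step = -1
        row += step
    return rows

def rail_fence_encrypt(text: str, rails: int) -> str:
    pattern = _zigzag_rows(len(text), rails)
    return "".join(text[i] for r in range(rails)
                   for i in range(len(text)) if pattern[i] == r)

def rail_fence_decrypt(cipher: str, rails: int) -> str:
    pattern = _zigzag_rows(len(cipher), rails)
    order = sorted(range(len(cipher)), key=lambda i: pattern[i])
    plain = [""] * len(cipher)
    for pos, ch in zip(order, cipher):
        plain[pos] = ch
    return "".join(plain)
```

A user-level pipeline could then be `caesar(rail_fence_encrypt(msg, 3), 7)` on the sender's phone and the inverse composition on the receiver's, with both the method choice and the keys agreed between the users.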
LiDAR Based Real Time Multiple Vehicle Detection and Tracking
Self-driving vehicles require a high level of situational awareness in order to maneuver safely when driving in real-world conditions. This paper presents a LiDAR-based real-time perception system that processes raw sensor data for multiple-target detection and tracking in dynamic environments. The proposed algorithm is nonparametric and deterministic: no assumptions or prior knowledge about the input data are needed, and no initialization is required. Additionally, the proposed method works directly on the three-dimensional data generated by the LiDAR, without sacrificing the rich information contained in the 3D domain. Moreover, a fast and efficient real-time clustering algorithm based on a radially bounded nearest neighbor (RBNN) search is applied. The Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking. The proposed algorithm runs in real time with an average run time of 70 ms.
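The RBNN clustering step can be sketched as a flood-fill over points whose neighbor chains stay within a fixed radius. The brute-force neighbor search below is for clarity only; a real-time system would use a grid or k-d tree, and the sample points are invented.

```python
# Sketch of radially bounded nearest neighbor (RBNN) clustering on 3D
# points. Brute-force O(n^2) neighbor search, for illustration only.
import math

def rbnn(points, radius):
    """Label points connected by neighbor chains within `radius`."""
    labels = [-1] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        stack, labels[i] = [i], cluster
        while stack:                         # flood-fill over near neighbors
            j = stack.pop()
            for k in range(len(points)):
                if labels[k] == -1 and math.dist(points[j], points[k]) <= radius:
                    labels[k] = cluster
                    stack.append(k)
        cluster += 1
    return labels

pts = [(0, 0, 0), (0.5, 0, 0), (10, 10, 0), (10.3, 10, 0)]
print(rbnn(pts, 1.0))   # two clusters: [0, 0, 1, 1]
```

Each cluster then becomes one tracked object, handed to the Hungarian-algorithm association step.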
Efficient Broadcasting in Wireless Sensor Networks
In this paper, we study the Minimum Latency Broadcast Scheduling (MLBS) problem in wireless sensor networks (WSNs). The core of the MLBS problem is to compute schedules with the minimum number of timeslots such that a base station can broadcast data to all other sensor nodes without collisions. Unlike existing works that target traditional omni-directional WSNs, we address directional WSNs, where nodes can collaboratively determine and orient their antenna directions. We first develop a 7-approximation algorithm for directional WSNs; to the best of our knowledge, this ratio is currently the best available. We then validate the performance of the proposed algorithm through simulation.
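A useful reference point for any MLBS schedule is the BFS depth from the base station, since a node h hops away cannot receive the broadcast in fewer than h timeslots. The toy sketch below computes that lower bound; it is not the paper's 7-approximation, and the sample graph is invented.

```python
# Sketch: BFS hop distances from the base station, a lower bound on
# broadcast latency (a node h hops away needs at least h timeslots).
from collections import deque

def bfs_layers(adj, source):
    """Return each node's hop distance from the broadcast source."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# tiny sensor graph; node 0 is the base station
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
layers = bfs_layers(adj, 0)
```

An approximation algorithm then schedules transmissions layer by layer while resolving collisions, which is where the directional-antenna orientation helps.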
Approximately Similarity Measurement of Web Sites Using Genetic Algorithms and Binary Trees
In this paper, we determine the similarity of two HTML web applications. A genetic algorithm is used to select the most significant web pages of each application, rather than using every page of a site. Based on these significant pages, we compute the similarity value between the two applications. The algorithm is efficient because it compares only a reduced number of web pages, at the cost of returning an approximate similarity value. Binary trees are used to store the tags from the significant pages. The algorithm was implemented in Java.
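One simple way to score two pages by their HTML structure, in the spirit of the tag comparison above, is to extract each page's tag sequence and compare the sequences. This sketch uses a flat sequence and Python's `difflib` for brevity, where the paper stores the tags in binary trees (and was implemented in Java).

```python
# Sketch: structural similarity of two HTML pages via their start-tag
# sequences. Flat-sequence stand-in for the paper's binary-tree storage.
from difflib import SequenceMatcher
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Collect the sequence of opening tags in document order."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def tag_similarity(html_a, html_b):
    a, b = TagCollector(), TagCollector()
    a.feed(html_a)
    b.feed(html_b)
    return SequenceMatcher(None, a.tags, b.tags).ratio()

s = tag_similarity("<html><body><p>x</p></body></html>",
                   "<html><body><div>y</div></body></html>")
```

Here the two pages share 2 of 3 tags in sequence, so the ratio is 2/3; averaging such scores over the GA-selected significant pages gives an application-level similarity.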
Urban Corridor Management Strategy Based on Intelligent Transportation System
Intelligent Transportation System (ITS) is the application of technology to developing a user-friendly transportation system for urban areas in developing countries. The goal of urban corridor management using ITS in road transport is to achieve improvements in mobility, safety, and the productivity of the transportation system within the available facilities, through the integrated application of advanced monitoring, communications, computer, display, and control process technologies, both in the vehicle and on the road. This paper reviews past studies of ITS deployments that have been successfully implemented in urban corridors in India and abroad, and surveys the current scenario and the methodology used for planning, designing, and operating Traffic Management Systems. The paper also presents an effort to interpret and assess the performance of a 27.4 km long study corridor with eight intersections and four flyovers; the corridor consists of six-lane as well as eight-lane divided road sections. Two categories of data were collected in February 2016: traffic data (traffic volume, spot speed, delay) and road characteristics data (number of lanes, lane width, bus stops, mid-block sections, intersections, flyovers). The instruments used for data collection were a video camera, a radar gun, mobile GPS, and a stopwatch. From the analysis, the performance interpretations included identification of peak and off-peak hours, congestion and level of service (LOS) at mid-blocks, and delay, followed by plotting speed contours and recommending urban corridor management strategies. The analysis indicates that ITS-based urban corridor management strategies can reduce congestion, fuel consumption, and pollution, providing comfort and efficiency to users. The paper presents urban corridor management strategies based on sensors incorporated both in vehicles and on the roads.
Proxisch: An Optimization Approach of Large-Scale Unstable Proxy Servers Scheduling
Nowadays, big companies such as Google and Microsoft, which have adequate proxy servers, have perfected parallel web crawling of a given website. But for lack of expensive proxy servers, it remains a puzzle for researchers to crawl large amounts of information from a single website in parallel. In this case, a good choice for researchers is to use free public proxy servers harvested from the Internet. To improve the efficiency of such a web crawler, two issues must be considered first: (1) tasks may fail owing to the instability of free proxy servers; and (2) a proxy server will be blocked if it visits a single website too frequently. In this paper, we propose Proxisch, an optimization approach to scheduling large-scale unstable proxy servers, which allows anyone to run a web crawler efficiently at extremely low cost. Proxisch works efficiently by making maximum use of reliable proxy servers. To solve the second problem, it establishes a frequency control mechanism that keeps the visiting frequency of any chosen proxy server below the website's limit. The results show that our approach outperforms other scheduling algorithms.
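The frequency-control idea can be sketched as a cooldown scheduler: a proxy is never handed out again until its per-site minimum interval has elapsed. The class name, API, and interval value below are illustrative assumptions, not Proxisch's actual design.

```python
# Sketch: cooldown-based proxy scheduling so no proxy exceeds a
# target site's visiting-frequency limit. Names are illustrative.
import heapq

class ProxyScheduler:
    def __init__(self, proxies, min_interval):
        self.min_interval = min_interval           # seconds between uses
        # min-heap of (next_allowed_time, proxy)
        self.heap = [(0.0, p) for p in proxies]
        heapq.heapify(self.heap)

    def acquire(self, now):
        """Return a proxy usable at time `now`, or None if all cooling down."""
        ready, proxy = self.heap[0]
        if ready > now:
            return None
        # push the proxy back with its next allowed time
        heapq.heapreplace(self.heap, (now + self.min_interval, proxy))
        return proxy

sched = ProxyScheduler(["p1", "p2"], min_interval=10.0)
print(sched.acquire(0.0), sched.acquire(0.0), sched.acquire(0.0))
# hands out both proxies, then None until t >= 10
```

A crawler would combine this with a reliability score per proxy so that stable proxies (issue 1) are preferred among those off cooldown.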
Satellite Imagery Classification Based on Deep Convolution Network
Satellite imagery classification is a challenging problem with many practical applications. In this paper, we design a deep convolutional neural network (DCNN) to classify satellite imagery. The contributions of this paper are twofold. First, to cope with the large-scale variance in satellite images, we introduce the inception module, which applies multiple filters of different sizes at the same level, as the building block of our DCNN model. Second, we propose a genetic-algorithm-based method to efficiently search for the best hyper-parameters of the DCNN in a large search space. The proposed method is evaluated on a benchmark database. The results of the hyper-parameter search show that it guides the search towards better regions of the parameter space. Based on the hyper-parameters found, we built our DCNN models and evaluated their performance on satellite imagery classification; the results show that the classification accuracy of the proposed models outperforms the state-of-the-art method.
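A genetic hyper-parameter search of this kind can be sketched with a toy fitness function standing in for validation accuracy. The hyper-parameters chosen (learning rate, filter count), the surrogate fitness, and the mutation scheme below are all illustrative assumptions.

```python
# Sketch: genetic search over (learning_rate, filter_count) pairs.
# fitness() is a toy surrogate; in practice it would be the trained
# DCNN's validation accuracy.
import random

random.seed(0)

def fitness(lr, filters):
    # toy surrogate that peaks near lr=0.01, filters=64
    return -((lr - 0.01) ** 2) * 1e4 - ((filters - 64) / 64) ** 2

def evolve(pop, generations=20):
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(*c), reverse=True)
        parents = pop[: len(pop) // 2]               # truncation selection
        children = []
        for lr, f in parents:                        # mutate each parent
            children.append((abs(lr + random.gauss(0, 0.002)),
                             max(8, f + random.choice([-8, 0, 8]))))
        pop = parents + children
    return max(pop, key=lambda c: fitness(*c))

pop = [(random.uniform(0.001, 0.1), random.choice([16, 32, 128]))
       for _ in range(10)]
best_lr, best_f = evolve(pop)
```

Each real evaluation is a full training run, so the GA's value is in spending those expensive evaluations on promising regions of the space.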
Reasons for Non-Applicability of Software Entropy Metrics for Bug Prediction in Android
Software entropy metrics for bug prediction have been validated on various software systems by different researchers. In our previous research, we validated that software entropy metrics calculated for Mozilla subsystems predict future bugs reasonably well. In this study, the software entropy metrics are calculated for a subsystem of Android, and we observe that these metrics are not suitable for bug prediction there. The results are compared with a subsystem of Mozilla, and a comparison is made between the two software systems to determine why software entropy metrics are not applicable to Android.
Product Features Extraction from Opinions According to Time
Nowadays, e-commerce shopping websites have experienced noticeable growth and have gained consumers' trust. After purchasing a product, many consumers share comments in which opinions about the product are usually embedded. Research on the automatic management of opinions, which gives suggestions to potential consumers and portrays an image of the product to manufacturers, has been growing recently. Just after a product is launched in the market, the reviews generated around it usually contain only generic opinions with little helpful information (e.g., "telephone: great phone..."), since the product is still in its launch phase. With time, the product matures, consumers perceive the advantages and disadvantages of each specific product feature, and they generate comments containing their sentiments about these features. In this paper, we present an unsupervised method to extract the different product features, hidden in the opinions, that influence a purchase; it combines Time Weighting (TW), which depends on the time the opinions were expressed, with Term Frequency-Inverse Document Frequency (TF-IDF). We conduct several experiments using two different datasets, about cell phones and hotels. The results show the effectiveness of our automatic feature extraction, as well as its domain-independent character.
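Combining a time weight with TF-IDF can be sketched as below, where older reviews contribute less to a term's score. The exponential decay and half-life are illustrative choices, not the paper's exact TW formula.

```python
# Sketch: TF-IDF term scoring with an exponential time weight that
# favors recent reviews. Decay form and half-life are illustrative.
import math
from collections import Counter

def tw_tfidf(reviews, half_life=30.0):
    """reviews: list of (age_in_days, [tokens]); returns term scores."""
    n = len(reviews)
    df = Counter(t for _, toks in reviews for t in set(toks))
    scores = Counter()
    for age, toks in reviews:
        tw = 0.5 ** (age / half_life)        # exponential time decay
        tf = Counter(toks)
        for t, c in tf.items():
            idf = math.log(n / df[t])
            scores[t] += tw * (c / len(toks)) * idf
    return scores

reviews = [(2,  ["battery", "lasts", "long"]),   # recent, feature-specific
           (90, ["great", "phone"])]             # old, generic
scores = tw_tfidf(reviews)
```

With the time weight, feature terms from recent reviews ("battery") outrank generic terms from launch-phase reviews ("great"), matching the paper's motivation.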
Case-Based Reasoning: A Hybrid Classification Model Improved with an Expert's Knowledge for High-Dimensional Problems
Data mining and the classification of objects constitute a process of data analysis, using various machine learning techniques, that is applied today in many fields of research. This paper presents the concept of a hybrid classification model improved with expert knowledge. The hybrid model integrates several machine learning techniques (Information Gain, K-means, and Case-Based Reasoning) and the expert's knowledge into one algorithm. The knowledge of experts is used to determine the importance of features. The paper presents the model's algorithm and the results of a case study in which the emphasis was on achieving the maximum classification accuracy without reducing the number of features.
Radio Frequency Identification Encryption via Modified Two Dimensional Logistic Map
A modified two-dimensional (2D) logistic map based on cross feedback control is proposed. In a statistical characteristics analysis, this 2D map exhibits more random chaotic dynamical properties than the classic one-dimensional (1D) logistic map. It is therefore utilized as a pseudo-random (PN) sequence generator: the obtained real-valued PN sequence is first quantized and then applied to a radio frequency identification (RFID) communication system. The system is experimentally validated on a Cortex-M0 development board, which demonstrates its effectiveness in key generation, the size of the key space, and security. Finally, further cryptanalysis is carried out using the test suite of the National Institute of Standards and Technology (NIST).
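The generator-plus-quantization pipeline can be sketched with a cross-coupled pair of logistic maps, where each state feeds back into the other dimension's update and the real-valued output is quantized to one bit. The coupling form and parameters below are illustrative; the paper's exact modified map is not reproduced here.

```python
# Sketch: cross-coupled 2D logistic map as a PN bit-stream generator.
# Coupling form and parameters (r, eps) are illustrative only.

def pn_bits(x, y, n, r=3.99, eps=0.1):
    """Iterate the coupled maps n times, emitting one quantized bit each."""
    bits = []
    for _ in range(n):
        # each state feeds into the other dimension's update (cross feedback)
        x, y = (r * x * (1 - x) + eps * y) % 1.0, \
               (r * y * (1 - y) + eps * x) % 1.0
        bits.append(1 if x >= 0.5 else 0)   # one-bit quantization
    return bits

key_stream = pn_bits(0.123, 0.456, 16)
```

The initial pair (x, y) acts as the key: the stream is fully reproducible from it, and NIST-style randomness tests would be run on long streams of such bits.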
Implementing Fault Tolerance with Proxy Signature on the Improvement of RSA System
Fault tolerance and data security are two important issues in modern communication systems. During the transmission of data between sender and receiver, errors may occur frequently, forcing the sender to re-transmit the data to correct them, which makes the system fragile. To improve the scalability of the scheme, we present a proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the improved RSA system. Authenticated key agreement protocols play an important role in building a secure communications network between two parties.
The Internet of Things Ecosystem: Survey of the Current Landscape, Identity Relationship Management, Multifactor Authentication Mechanisms, and Underlying Protocols
A critical component in the Internet of Things (IoT) ecosystem is the need for secure and appropriate transmission, processing, and storage of the data. Our current forms of authentication, and identity and access management do not suffice because they are not designed to service cohesive, integrated, interconnected devices, and service applications. The seemingly endless opportunities of IoT are in fact circumscribed on multiple levels by concerns such as trust, privacy, security, loss of control, and related issues. This paper considers multi-factor authentication (MFA) mechanisms and cohesive identity relationship management (IRM) standards. It also surveys messaging protocols that are appropriate for the IoT ecosystem.
TBOR: Tree Based Opportunistic Routing for Mobile Ad Hoc Networks
A mobile ad hoc network (MANET) is a wireless communication network in which nodes that are not within direct transmission range establish their communication with the help of other nodes that forward data. Routing protocols in MANETs are usually categorized as proactive, reactive, or hybrid. Tree Based Opportunistic Routing (TBOR) finds a multipath link based on the maximum probability of throughput. The simulation results show that the presented method performs very well compared to existing methods in terms of throughput, delay, and routing overhead.
Tree Based Data Fusion Clustering Routing Algorithm for Illimitable Network Administration in Wireless Sensor Network
In wireless sensor networks (WSNs), locality and positioning information can be captured using the Global Positioning System (GPS); this information can initially be gathered site by site to characterize the network. Users can retrieve information of interest from a WSN by injecting queries and gathering results from the mobile sink nodes. Routing is the process of choosing an optimal path in a mobile network. An intermediate node partitions the sensor nodes into groups and generates cluster heads, which gather the data from each cluster's nodes and forward the aggregated data to the base station. WSNs are widely used for gathering data, and since sensors are power-constrained devices, it is vital for them to reduce power consumption. A tree-based data fusion clustering routing algorithm (TBDFC) is used to reduce energy consumption in wireless sensor networks. Here, the nodes in a tree use cluster formation, and the height of the tree is decided based on the distance of the member nodes to the cluster head. Network simulation shows that this scheme improves the power utilization of the nodes and thus considerably improves the network lifetime.
A Case-Based Reasoning-Decision Tree Hybrid System for Stock Selection
Stock selection is an important decision-making problem. Many machine learning and data mining technologies have been employed to build automatic stock-selection systems. A profitable stock-selection system should consider both a stock's investment value and the market timing. In this paper, we present a hybrid system for stock selection that engages both. The system uses a case-based reasoning (CBR) model to perform stock classification and a decision-tree model to help with market timing and stock selection. The experiments show that the performance of this hybrid system is better than that of other techniques with respect to classification accuracy, average return, and the Sharpe ratio.
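The CBR leg of such a system boils down to case retrieval: classify a new stock by the labels of its nearest stored cases. The sketch below uses a plain k-nearest-neighbor vote; the two-dimensional features, case values, and labels are invented for illustration.

```python
# Sketch: CBR case retrieval as a k-nearest-neighbor majority vote.
# Features, cases, and labels are invented for illustration.
import math

def retrieve(cases, query, k=3):
    """cases: list of (feature_vector, label); return the majority label."""
    ranked = sorted(cases, key=lambda c: math.dist(c[0], query))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

cases = [((0.1, 0.2),  "buy"),   ((0.15, 0.25), "buy"),
         ((0.9, 0.8),  "avoid"), ((0.85, 0.9),  "avoid")]
print(retrieve(cases, (0.12, 0.22)))   # "buy"
```

In the hybrid system, the decision-tree component would then gate this "buy" signal on market-timing conditions before any trade.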
Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping
The artificial neural network is one of the interesting techniques that have been used advantageously to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping, which relies on increasing the number of neuron units in the last layer. To show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the neuron units in the last layer allows finding the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation process, which avoids the use of computers with high memory usage.
Experience Report about the Inclusion of People with Disabilities in the Process of Testing an Accessible System for Learning Management
This article discusses the inclusion of people with disabilities in the process of testing an accessible system solution for distance education. The accessible system, the team profile, and the methodologies and techniques covered in the testing process are presented. The testing process shown in this paper was designed from experience with users: it emerged from lessons learned in past projects, and the end user is present at all stages of the tests. Lessons learned are also reported, along with how the team and its methods were able to mature, resulting in a simple, productive, and effective process.
Indoor Mobile Robot Positioning Based on Wireless Fingerprint Matching
This paper discusses the design of an indoor mobile robot positioning system. The problem of indoor positioning is solved through Wi-Fi fingerprint positioning to achieve a low-cost deployment. A wireless fingerprint matching algorithm based on the similarity of unequal-length sequences is presented. Candidate sequence selection is defined as a set of mappings, and detection errors caused by wireless hotspot instability and changes in the interior layout can be corrected by transforming the unequal-length sequences into equal-length sequences. The presented scheme was verified experimentally to achieve the accuracy requirements of an indoor positioning system at low deployment cost.
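The core matching step can be sketched by comparing a live scan against stored fingerprints over the access points both share, one simple way around the unequal-length problem (the paper's approach instead transforms the sequences to equal length). The MAC names, RSSI values, and locations below are invented.

```python
# Sketch: match a live Wi-Fi scan to stored fingerprints by mean RSSI
# difference over shared access points. Data values are invented.

def fingerprint_distance(scan, reference):
    """scan/reference: dicts mapping AP id -> RSSI (dBm)."""
    common = scan.keys() & reference.keys()
    if not common:
        return float("inf")       # no shared APs: incomparable
    return sum(abs(scan[ap] - reference[ap]) for ap in common) / len(common)

db = {"lobby":   {"ap1": -40, "ap2": -70},
      "hallway": {"ap1": -65, "ap3": -50}}
scan = {"ap1": -42, "ap2": -68, "ap4": -80}     # ap4 is a new hotspot
best = min(db, key=lambda loc: fingerprint_distance(scan, db[loc]))
print(best)   # "lobby"
```

Restricting to shared APs also makes the match robust to hotspots appearing or disappearing, the instability the paper corrects for.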
An Optimal Steganalysis Based Approach for Embedding Information in Image Cover Media with Security
This paper deals with the fields of steganography and steganalysis. Steganography involves hiding information in a cover medium to obtain the stego medium, in such a way that the cover medium is not perceived by unintended recipients to carry any embedded message. Steganalysis is the mechanism of detecting the presence of hidden information in the stego medium, and it can lead to the prevention of disastrous security incidents. In this paper, we provide a critical review of the steganalysis algorithms available for analyzing the characteristics of an image stego medium against the corresponding cover medium, and for understanding the process of embedding information and its detection. We anticipate that this paper can also give a clear picture of the current trends in steganography, so that appropriate steganalysis techniques can be developed and improved.
Hybrid Authentication System Using QR Code with OTP
As is well known, the number of Internet users is increasing drastically. People now use a variety of online services provided by banks, colleges and schools, hospitals, online utility and bill payment services, and online shopping sites. To access these online services, text-based authentication systems are in use; however, text-based authentication has usability and security drawbacks that cause trouble for users. The core element of computational trust is identity. The aim of this paper is to make the system harder for impostors and more reliable for legitimate users by adopting a graphical authentication approach. In this paper, the options are encoded in graphical QR format, and an acknowledgment is sent to the user's mobile device for final verification. The methodology depends on the encryption options and on final verification by confirming a set of passphrases with the legitimate users; the process yields its result only once, when it completes successfully. All processes are cross-linked serially, so the output of the first process is the input of the second, and so on. The system is a combination of recognition-based and pure recall-based techniques. The presented scheme is useful for devices like PDAs, iPods, and phones, which are handier and more convenient to use than traditional desktop computer systems.
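The one-time-password leg of such a scheme can be sketched with an HMAC-based OTP in the style of RFC 4226; the server would embed the resulting code (or a challenge) in the QR image for the user to confirm. QR encoding itself needs a third-party library and is omitted; only the OTP derivation is shown.

```python
# Sketch: RFC 4226-style HMAC-based one-time password (HOTP).
# QR encoding of the code/challenge is out of scope here.
import hashlib
import hmac
import struct

def hotp(secret, counter, digits=6):
    """Derive a `digits`-digit OTP from a shared secret and counter."""
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector: secret "12345678901234567890", counter 0
otp = hotp(b"12345678901234567890", 0)   # "755224"
```

Both sides increment the counter after each successful verification, so a captured code is useless for replay, which suits the "result only once" property described above.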
A Hypercube Social Feature Extraction and Multipath Routing in Delay Tolerant Networks
Delay Tolerant Networks (DTNs) that have sufficient state information, including trajectory and contact information, can maintain routing efficiency. However, such state information is dynamic and hard to obtain without a global and/or long-term collection process. To deal with these problems, the internal social features of each node are introduced into the network to perform the routing process. This type of approach is motivated by human contact networks, in which people contact each other more frequently when they have more social features in common. Two processes were developed: social feature extraction and multipath routing. The routing method then becomes a hypercube-based feature-matching process. Furthermore, the effectiveness of multipath routing is evaluated and compared to that of single-path routing.
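Hypercube feature matching of this kind can be sketched by encoding each node's social features as a bit vector (a hypercube corner) and forwarding toward the neighbor whose features are closest, in Hamming distance, to the destination's. The three-feature encoding below is an invented illustration, not the paper's feature set.

```python
# Sketch: hypercube-style forwarding by Hamming distance between
# social-feature bit vectors. The feature encoding is invented.

def hamming(a, b):
    """Number of differing bits between two feature vectors."""
    return bin(a ^ b).count("1")

def next_hop(neighbors, dest):
    """Pick the neighbor whose features best match the destination's."""
    return min(neighbors, key=lambda n: hamming(n, dest))

# features: (student, musician, local) as 3 bits
dest = 0b110                                   # student + musician
hop = next_hop([0b001, 0b100, 0b111], dest)    # picks 0b100
```

Each hop reduces the feature distance to the destination, mirroring how a message moves along hypercube edges toward the matching corner; multipath routing explores several such edge-disjoint descents at once.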