A Hybrid Approach for Thread Recommendation in MOOC Forums
Recommender systems provide content and services matched to users' behaviour and interests. Given the information overload in online discussion forums and the diversity of user interests, recommending relevant topics and threads can make forums considerably easier to use. In educational forums, where learners must locate relevant information quickly, such recommendations are even more valuable. We present a hybrid thread recommender system for MOOC forums that combines social network analysis and association rule mining techniques. Initial results indicate that the proposed recommender system performs comparatively well even with the limited data available from users' previous posts in the forum.
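As a minimal illustration of the association-rule side of such a hybrid recommender (not the authors' system; the data, thresholds, and function names below are invented for the sketch), pairwise rules can be mined from per-user thread participation sets and used to rank unseen threads:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical user-to-thread participation data (all names illustrative).
participation = {
    "u1": {"t1", "t2", "t3"},
    "u2": {"t1", "t2"},
    "u3": {"t2", "t3"},
    "u4": {"t1", "t3"},
}

def mine_pair_rules(baskets, min_support=0.25, min_confidence=0.5):
    """Mine pairwise association rules (A -> B) from per-user thread sets."""
    n = len(baskets)
    item_count = defaultdict(int)
    pair_count = defaultdict(int)
    for items in baskets.values():
        for item in items:
            item_count[item] += 1
        for a, b in combinations(sorted(items), 2):
            pair_count[(a, b)] += 1
    rules = {}
    for (a, b), c in pair_count.items():
        if c / n < min_support:          # prune infrequent thread pairs
            continue
        for ante, cons in ((a, b), (b, a)):
            conf = c / item_count[ante]  # confidence of ante -> cons
            if conf >= min_confidence:
                rules[(ante, cons)] = conf
    return rules

def recommend(rules, seen):
    """Rank unseen threads by the best rule confidence from any seen thread."""
    scores = {}
    for (ante, cons), conf in rules.items():
        if ante in seen and cons not in seen:
            scores[cons] = max(scores.get(cons, 0.0), conf)
    return sorted(scores, key=scores.get, reverse=True)

rules = mine_pair_rules(participation)
recs = recommend(rules, {"t1"})
```

In a full hybrid system, these confidence scores would be combined with social-network-analysis features rather than used on their own.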
Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines
Land-cover maps are important for many scientific, ecological and land-management purposes, and over the last decades a rapid decrease in soil fertility has been observed and attributed to land-use practices such as rice cultivation. High-precision land-cover maps, which are important for economic management, are not yet available for the area. Remote sensing is a very suitable tool for accurate land-cover mapping and for automatic land-use and land-cover detection. The study not only produced high-precision land-cover maps but also estimated the rice production area that had undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper feature detection. After generating the land-cover map of the area of intensive rice cultivation, a soil-fertility degradation assessment of the rice production area was carried out to assess the impact on soils used in agricultural production. Using simple spatial analysis functions in ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid on the fertility decline maps from the Land Degradation Assessment Philippines - Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the rice-crop areas where nitrogen, phosphorus, zinc and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results show that 80.00% of the fallow area and 99.81% of the rice production area exhibit high soil fertility decline.
Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment
Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can serve as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can overcome these restrictions. This study was conducted to determine the feasibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters as a non-destructive method for assessing the forest status of San Manuel, Pangasinan. Forest resources extraction was carried out using LAStools, GIS, ENVI and .bat scripts on the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) via .bat scripts, followed by the generation of 17 composite bands used to extract forest cover classes with ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above-Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, well above the 10% canopy-cover requirement. For the extracted canopy height, 80% of tree heights range from 12 m to 17 m. The CS of the three forest covers, based on AGB, was 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percentage of forest cover and high CS.
A Transform Domain Function Controlled VSSLMS Algorithm for Sparse System Identification
The convergence rate of the least-mean-square (LMS)
algorithm deteriorates if the input signal to the filter is correlated.
In a system identification problem, this convergence rate can be
improved if the signal is white and/or if the system is sparse. We
recently proposed a sparse transform domain LMS-type algorithm
that uses a variable step-size for sparse system identification.
The proposed algorithm provided high performance even if the
input signal is highly correlated. In this work, we investigate the
performance of the proposed TD-LMS algorithm for a large number
of filter taps, which is also a critical issue for the standard LMS algorithm.
Additionally, the optimum value of the most important parameter is
calculated for all experiments. Moreover, the convergence analysis
of the proposed algorithm is provided. The performance of the
proposed algorithm has been compared to that of different algorithms in
sparse system identification settings with different sparsity levels and
numbers of filter taps. Simulations have shown that the proposed
algorithm achieves prominent performance compared to the other algorithms.
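A minimal sketch of a variable step-size LMS identifying a sparse system follows. This is a generic Kwong-Johnston-style VSS-LMS, not the authors' transform-domain algorithm, and every size and constant below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

n_taps = 32
w_true = np.zeros(n_taps)
w_true[[3, 9, 17, 25]] = [1.0, -0.5, 0.8, 0.3]   # sparse unknown system

n_samples = 4000
x = rng.standard_normal(n_samples)                # white input signal
d = np.convolve(x, w_true)[:n_samples] + 1e-3 * rng.standard_normal(n_samples)

w = np.zeros(n_taps)
mu, mu_min, mu_max = 0.01, 1e-4, 0.02
alpha, gamma = 0.97, 4e-4
for n in range(n_taps, n_samples):
    u = x[n - n_taps + 1:n + 1][::-1]             # most recent inputs first
    e = d[n] - w @ u                              # a priori estimation error
    # step size grows with error power and decays as the filter converges
    mu = float(np.clip(alpha * mu + gamma * e * e, mu_min, mu_max))
    w += mu * e * u                               # LMS update with variable mu

mse = float(np.mean((w - w_true) ** 2))
```

Because the step size shrinks as the error power drops, the filter trades fast initial convergence for low steady-state misadjustment.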
IOT Based Process Model for Heart Monitoring Process
Connecting health services with technology is in high demand as people's health conditions worsen day by day. Engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. In particular, patients suffering from chronic conditions such as cardiac disease need special care and monitoring. Some efforts have previously been made to automate and improve patient monitoring systems. However, these efforts have limitations and lack the real-time capability needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements for improving the cardiac patient monitoring system. The Business Process Model and Notation (BPMN) language was used to model the proposed process. The proposed system uses IoT technology to help doctors remotely monitor and follow up with their heart patients in real time. To validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool. The analysis results show performance improvements in the heart monitoring process. In future work, the authors suggest extending the proposed system to cover all chronic diseases.
On the Construction of Lightweight Circulant Maximum Distance Separable Matrices
MDS matrices are of great significance in the design
of block ciphers and hash functions. In the present paper, we
investigate the problem of constructing MDS matrices which are
both lightweight and low-latency. We propose a new method of
constructing lightweight MDS matrices using circulant matrices
which can be implemented efficiently in hardware. Furthermore, we
provide circulant MDS matrices with as few bit XOR operations as
possible for the classical dimensions 4 × 4 and 8 × 8 over the space
of linear transformations over the finite field F_2^4. In contrast to
previous constructions of MDS matrices, our constructions achieve a
lower bit-XOR cost.
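The bit-XOR cost metric mentioned above can be illustrated in a simplified bit-level view (this is not the paper's construction over F_2^4, where entries are themselves linear maps; the first row below is invented): a row of a binary matrix with k ones costs k - 1 two-input XOR gates.

```python
import numpy as np

def circulant(first_row):
    """Build a circulant matrix: row i is the first row rotated right by i."""
    n = len(first_row)
    return np.array([np.roll(first_row, i) for i in range(n)])

def xor_count(m):
    """Naive hardware cost of a binary matrix over F_2:
    a row with k ones needs k - 1 two-input XOR gates."""
    m = np.asarray(m) % 2
    return int(m.sum() - m.shape[0])

row = np.array([1, 0, 1, 1])   # illustrative first row, not from the paper
c = circulant(row)
cost = xor_count(c)            # 3 ones per row -> 2 XORs per row -> 8 total
```

A circulant matrix is attractive in hardware precisely because all rows reuse the same XOR network, shifted.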
Evolving Knowledge Extraction from Online Resources
In this paper, we present an evolving knowledge
extraction system named AKEOS (Automatic Knowledge Extraction
from Online Sources). AKEOS consists of two modules, including
a one-time learning module and an evolving learning module.
The one-time learning module takes in a user input query and
automatically harvests knowledge from online unstructured resources
in an unsupervised way. The output of the one-time learning is a
structured vector representing the harvested knowledge. The evolving
learning module automatically schedules and performs repeated
one-time learning to extract the newest information and track the
development of an event. In addition, the evolving learning module
summarizes the knowledge learned at different time points to produce
a final knowledge vector about the event. With the evolving learning,
we are able to visualize the key information of the event, discover
the trends, and track the development of an event.
Research on Urban Point of Interest Generalization Method Based on Mapping Presentation
Existing point-generalization algorithms focus only on the overall information of point groups, without taking into account the attribute richness of POI (point of interest) data or a spatial distribution constrained by roads. Against this background, a POI generalization method considering both attribute information and spatial distribution is proposed. The hierarchical characteristics of urban POI information expression are first analyzed to identify the measurement features of each hierarchy. On this basis, an urban POI generalization strategy is put forward: POIs are divided by the urban road network into three distribution patterns, and corresponding generalization methods are proposed according to the characteristics of the POI data in each pattern. Experimental results showed that the method, by accounting for both the attribute information and the spatial distribution characteristics of POIs, can better implement urban POI generalization in mapping presentation.
Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency
Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithms. In this context, two standard algorithms exist: k-means for clustering numeric datasets and k-modes for categorical datasets. A frequently encountered problem in data mining applications is clustering the categorical data so prevalent in real datasets. One way to achieve clustering on categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of k-modes. In this paper, we propose to experiment with an approach based on this idea, transforming the categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
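The relative-frequency idea can be sketched as follows; this is a minimal illustration under our own assumptions (toy data, deterministic initialization), not the authors' implementation:

```python
import numpy as np

def frequency_encode(records):
    """Replace each categorical value with its relative frequency
    within its own attribute (column)."""
    data = np.array(records, dtype=object)
    encoded = np.zeros(data.shape)
    n = data.shape[0]
    for j in range(data.shape[1]):
        values, counts = np.unique(data[:, j], return_counts=True)
        freq = dict(zip(values, counts / n))
        encoded[:, j] = [freq[v] for v in data[:, j]]
    return encoded

def kmeans(points, k, iters=50):
    """Plain Lloyd's k-means; initialized from distinct points so the
    toy example is deterministic."""
    centers = np.unique(points, axis=0)[:k]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = points[labels == c].mean(axis=0)
    return labels

# Hypothetical categorical dataset (attribute values are illustrative).
records = [("red", "small"), ("red", "small"), ("blue", "large"),
           ("blue", "large"), ("red", "small"), ("blue", "small")]
X = frequency_encode(records)
labels = kmeans(X, k=2)
```

Once every modality has been mapped to a number in [0, 1], the ordinary squared-Euclidean k-means machinery applies unchanged.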
A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and evaluated. The new algorithm aims to reduce the computational load without significantly affecting clustering quality. The algorithm uses the City Block distance and a new stop criterion to guarantee convergence. Experiments conducted on a real dataset show its high performance compared with the original k-means version.
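A Manhattan-metric k-means variant can be sketched as follows (a generic k-medians illustration with invented data and a label-stability stop criterion of our own; the paper's exact stop criterion is not reproduced here):

```python
import numpy as np

def kmedians(points, k, iters=50):
    """k-means variant with the Manhattan (City Block) metric:
    assign by L1 distance; update each center as the coordinate-wise
    median, which minimizes the total L1 cost within a cluster."""
    centers = points[:k].astype(float)       # simple deterministic init
    prev = None
    for _ in range(iters):
        d = np.abs(points[:, None, :] - centers[None, :, :]).sum(-1)
        labels = d.argmin(1)
        if prev is not None and (labels == prev).all():
            break                            # stop criterion: stable labels
        prev = labels
        for c in range(k):
            if (labels == c).any():
                centers[c] = np.median(points[labels == c], axis=0)
    return labels, centers

# Two obvious clusters (illustrative data).
pts = np.array([[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [10, 11]], float)
labels, centers = kmedians(pts, 2)
```

The L1 distance avoids the squaring in the Euclidean metric, which is one way such a variant can reduce the per-iteration computational load.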
Causal Relation Identification Using Convolutional Neural Networks and Knowledge Based Features
Causal relation identification is a crucial task in information extraction and knowledge discovery. In this work, we present two approaches to causal relation identification. The first is a classification model trained on a set of knowledge-based features. The second is a deep learning-based approach that trains a model using convolutional neural networks to classify causal relations. We experiment with several different convolutional neural network (CNN) models based on previous work on relation extraction as well as our own research. Our models are able to identify both explicit and implicit causal relations as well as the direction of the causal relation. The results of our experiments show a higher accuracy than previously achieved for causal relation identification tasks.
Sentiment Analysis: Comparative Analysis of Multilingual Sentiment and Opinion Classification Techniques
Sentiment analysis and opinion mining have become
emerging topics of research in recent years but most of the work
is focused on data in the English language. Comprehensive
research and analysis considering multiple languages, machine
translation techniques, and different classifiers are essential.
This paper presents a comparative analysis of different approaches
for multilingual sentiment analysis. These approaches are divided
into two parts: one using classification of text without language
translation and second using the translation of testing data to a
target language, such as English, before classification. The presented
research and results are useful for understanding whether machine
translation should be used for multilingual sentiment analysis or
building language specific sentiment classification systems is a better
approach. The effects of language translation techniques, features,
and the accuracy of various classifiers for multilingual sentiment analysis
are also discussed in this study.
Self-Tuning Fuzzy Control of Seat Vibrations of Active Quarter Car Model
An active quarter-car model with three degrees of freedom is presented for reducing passenger-seat vibration. A Fuzzy Logic Controller (FLC) and a Self-Tuning Fuzzy Logic Controller (STFLC) are designed and applied to the seat suspension. The vibration control performance of the active and passive quarter-car systems is determined through simulation. Simulation results in terms of passenger-seat acceleration and displacement responses are compared for the controlled and uncontrolled cases. Both the FLC and STFLC controllers improve passenger ride comfort compared to the uncontrolled case. Furthermore, the best performance is achieved by the STFLC-controlled suspension system, outperforming both the FLC-controlled and uncontrolled cases.
Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm
The present paper addresses research in the area of regression testing, with emphasis on automated tools and the prioritization of test cases. The uniqueness of regression testing and its cyclic nature are pointed out. The difference in approach between industry, with a business model as its basis, and academia, with a focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization, and a case study is discussed to illustrate this methodology. An industrial case study is also described, in which the number of test cases is so large that they have to be grouped into test suites. In such situations, the genetic algorithm we propose can be used to reconfigure these test suites in each cycle of regression testing. A comparison is made between a proprietary tool and an open-source tool using the above-mentioned metrics. Our approach is clarified through several tables.
A Review on Factors Influencing Implementation of Secure Software Development Practices
More and more businesses and services depend on software to run their daily operations and business services. At the same time, cyber-attacks are becoming more covert and sophisticated, posing threats to software. Vulnerabilities exist in software due to the lack of security practices during the phases of software development. Implementing secure software development practices can improve resistance to attacks. Many methods, models and standards for secure software development have been developed. However, despite these efforts, they still encounter difficulties in deployment, and the processes are not institutionalized. A set of factors influences the successful deployment of secure software development processes. In this study, the methodology and results of a systematic literature review of factors influencing the implementation of secure software development practices are described. A total of 44 primary studies were analysed in the systematic review, and a list of twenty factors was identified. Some of the factors that affect the implementation of secure software development practices are: involvement of security experts, integration between the security and development teams, developers' skills and expertise, development time, and communication between stakeholders. The factors were further classified into four categories: institutional context, people and action, project content, and system development process. The results show that it is important to take organizational, technical and people issues into account in order to implement secure software development initiatives.
The Framework of System Safety for Multi Human-in-The-Loop System
In a Cyber-Physical System (CPS), if a large number of persons are involved in the process, the role of a person in the CPS may differ from that in a one-man system. It is also necessary to consider how Human-in-The-Loop Cyber Physical Systems (HiTLCPS) ensure the safety of each person in the loop. In this paper, the authors discuss a system safety framework with an illustrative example using the STAMP model to clarify which points should be considered for safety and what role the person in the loop should have.
CyberSecurity Malaysia: Towards Becoming a National Certification Body for Information Security Management Systems Internal Auditors
Internal auditing is one of the most important activities for organizations that implement information security management systems (ISMS). The purpose of internal audits is to ensure the ISMS implementation is in accordance to the ISO/IEC 27001 standard and the organization’s own requirements for its ISMS. Competent internal auditors are the main element that contributes to the effectiveness of internal auditing activities. To realize this need, CyberSecurity Malaysia is now in the process of becoming a certification body that certifies ISMS internal auditors. The certification scheme will assess the competence of internal auditors in generic knowledge and skills in management systems, and also in ISMS-specific knowledge and skills. The certification assessment is based on the ISO/IEC 19011 Guidelines for auditing management systems, ISO/IEC 27007 Guidelines for information security management systems auditing and ISO/IEC 27001 Information security management systems requirements. The certification scheme complies with the ISO/IEC 17024 General requirements for bodies operating certification systems of persons. Candidates who pass the exam will be certified as an ISMS Internal Auditor, whose competency will be evaluated every three years.
Road Accidents Bigdata Mining and Visualization Using Support Vector Machines
Useful information has been extracted from the
road accident data in the United Kingdom (UK), using data analytics
methods, to help avoid possible accidents in rural and urban areas.
This analysis makes use of several methodologies such as data
integration, support vector machines (SVM), correlation machines
and multinomial goodness. The entire datasets have been imported
from the traffic department of UK with due permission. The
information extracted from these huge datasets forms a basis for
several predictions, which in turn avoid unnecessary memory
lapses. Since data is expected to grow continuously over a period
of time, this work primarily proposes a new framework model
which can be trained on, and adapt itself to, new data to make
accurate predictions. This work also sheds light on the use of
the SVM methodology for text classification on the obtained traffic
data. Finally, it emphasizes the uniqueness and adaptability of
the SVM methodology as appropriate for this kind of research work.
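The SVM classification step can be sketched as a linear SVM trained by stochastic subgradient descent on the hinge loss. This is a generic textbook formulation with toy data standing in for accident feature vectors, not the authors' framework:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200, seed=0):
    """Linear SVM via stochastic subgradient descent on the regularized
    hinge loss; labels y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w + b) < 1:      # inside margin: hinge active
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:                               # outside margin: only shrink w
                w = (1 - lr * lam) * w
    return w, b

# Toy separable data standing in for accident feature vectors (illustrative).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.0],
              [-2.0, -2.0], [-3.0, -1.0], [-2.5, -3.0]])
y = np.array([1, 1, 1, -1, -1, -1])
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
```

For text features, as mentioned in the abstract, the same trainer would run over sparse term-frequency vectors instead of dense coordinates.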
Identity Management in Virtual Worlds Based on Biometrics Watermarking
With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive for cybercriminals hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of this to gain unauthorized access and practice cyber criminality. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to establish a strong security mechanism to protect virtual identities represented by avatars. Thus, only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. During registration, a user fingerprint is enrolled and then encrypted into a watermark using a cancelable, non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have registered promising performance results in terms of authentication accuracy and acceptance and rejection rates.
A Reasoning Method of Cyber-Attack Attribution Based on Threat Intelligence
With the increasing complexity of cyberspace security, cyber-attack attribution has become an important challenge for security protection systems. The difficulties of cyber-attack attribution centre on handling huge volumes of data and on missing key data. Addressing this situation, this paper presents a reasoning method for cyber-attack attribution based on threat intelligence. The method uses the intrusion kill chain model and a Bayesian network to build the attack chain and evidence chain of a cyber-attack on a threat intelligence platform through data calculation, analysis and reasoning. We then used a number of cyber-attack events that we had observed and analyzed to test the reasoning method and a demo system; the test results indicate that the reasoning method can provide real help in cyber-attack attribution.
Malware Beaconing Detection by Mining Large-scale DNS Logs for Targeted Attack Identification
One of the leading problems in Cyber Security today
is the emergence of targeted attacks conducted by adversaries with
access to sophisticated tools. These attacks usually steal senior level
employee system privileges, in order to gain unauthorized access to
confidential knowledge and valuable intellectual property. The malware
used for the initial compromise of the systems is sophisticated and
may target zero-day vulnerabilities. In this work we utilize a common
behaviour of malware called "beaconing", which implies that infected
hosts communicate to Command and Control servers at regular
intervals that have relatively small time variations. By analysing
such beacon activity through passive network monitoring, it is
possible to detect potential malware infections. So, we focus on
time gaps as indicators of possible C2 activity in targeted enterprise
networks. We represent DNS log files as a graph, whose vertices
are destination domains and edges are timestamps. Then by using
four periodicity detection algorithms for each pair of internal-external
communications, we check timestamp sequences to identify the
beacon activities. Finally, based on the graph structure, we infer the
existence of other infected hosts and malicious domains enrolled in
the attack activities.
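The core time-gap test described above can be sketched as follows. This is a simplified illustration with invented thresholds and data; the actual system applies four periodicity detection algorithms over a host-domain graph rather than a single sequence:

```python
from statistics import mean, stdev

def is_beacon(timestamps, max_cv=0.1, min_events=5):
    """Flag a DNS timestamp sequence for one (host, domain) pair as
    beacon-like when the inter-arrival gaps are nearly constant,
    i.e. their coefficient of variation is small."""
    if len(timestamps) < min_events:
        return False
    ts = sorted(timestamps)
    gaps = [b - a for a, b in zip(ts, ts[1:])]   # inter-arrival time gaps
    m = mean(gaps)
    if m == 0:
        return False
    return stdev(gaps) / m <= max_cv

# Illustrative DNS log slices: seconds since the start of capture.
beacon_like = [0, 60, 121, 180, 241, 300]   # ~60 s check-ins, small jitter
human_like = [0, 5, 47, 300, 302, 1900]     # bursty, irregular browsing
```

Regular check-ins with small jitter produce a near-zero coefficient of variation, while human browsing is bursty and irregular.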
Cybersecurity Awareness through Laboratories and Cyber Competitions in the Education System: Practices to Promote Student Success
Cybersecurity is one of the greatest challenges society faces in an age revolving around technological development. With cyber-attacks on the continuous rise, the nation needs to understand and learn ways that can prevent such attacks. A major contribution that can change the education system is to implement laboratories and competitions into academia. This method can improve and educate students with more hands-on exercises in a highly motivating setting. Considering the fact that students are the next generation of the nation’s workforce, it is important for students to understand concepts not only through books, but also through actual hands-on experiences in order for them to be prepared for the workforce. An effective cybersecurity education system is critical for creating a strong cyber secure workforce today and for the future. This paper emphasizes the need for awareness and the need for competitions and cybersecurity laboratories to be implemented into the education system.
Long Term Examination of the Profitability Estimation Focused on Benefits
Strategic investment decisions are characterized by
high innovation potential and long-term effects on the
competitiveness of enterprises. Due to the uncertainty and risks
involved in this complex decision making process, the need arises for
well-structured support activities. A method that considers cost and
the long-term added value is the cost-benefit effectiveness estimation.
One of those methods is the “profitability estimation focused on
benefits – PEFB”-method developed at the Institute of Management
Cybernetics at RWTH Aachen University. The method copes with
the challenges associated with strategic investment decisions by
integrating long-term non-monetary aspects whilst also mapping the
chronological sequence of an investment within the organization’s
target system. Thus, this method is characterized as a holistic
approach for the evaluation of costs and benefits of an investment.
This participation-oriented method was applied to business
environments in many workshops. The results of the workshops are a
library of more than 96 cost aspects, as well as 122 benefit aspects.
These aspects are preprocessed and comparatively analyzed with
regards to their alignment to a series of risk levels. For the first time,
an accumulation and a distribution of cost and benefit aspects
regarding their impact and probability of occurrence are given. The
results give evidence that the PEFB-method combines precise
measures of financial accounting with the incorporation of benefits.
Finally, the results constitute the basis for applying information
technology and data science to decision support within the
PEFB-method.
Suggestion for Malware Detection Agent Considering Network Environment
The number of smartphone users is increasing rapidly. Accordingly, many companies are adopting BYOD (Bring Your Own Device: policies allowing private smartphones into the company) to increase work efficiency. However, smartphones are always under the threat of malware, so a company network to which smartphones connect is exposed to serious risks. Most smartphone malware detection techniques perform independent detection (detection of a single target application). In this paper, we analyze a variety of intrusion detection techniques and, based on the results of the analysis, propose an agent that uses a network IDS.
Towards an Understanding of Social Capital in an Online Community of Filipino Music Artists
Cyberspace has become a more viable arena for
budding artists to share musical acts through digital forms. The
increasing relevance of online communities has attracted scholars
from various fields demonstrating its influence on social capital. This
paper extends this understanding of social capital among Filipino
music artists belonging to the SoundCloud Philippines Facebook group.
The study makes use of various qualitative data obtained from
key-informant interviews and participant observation of online and
physical encounters, analyzed using the case study approach.
Soundcloud Philippines has over seven-hundred members and is
composed of Filipino singers, instrumentalists, composers, arrangers,
producers, multimedia artists and event managers. Group interactions
are a mix of online encounters based on Facebook and SoundCloud
and physical encounters through meet-ups and events. Benefits
reaped from the community are informational, technical,
instrumental, promotional, motivational and social support. Under the
guidance of online group administrators, collaborative activities such
as music productions, concerts and events transpire. Most conflicts
and problems arising are resolved peacefully. Social capital in
SoundCloud Philippines is mobilized through recognition and respect.
An Investigation on Organisation Cyber Resilience
Cyber exercises are used to assess the preparedness of a
community against cyber crises, technology failures and Critical
Information Infrastructure (CII) incidents. Cyber exercises, also
called cyber crisis exercises or cyber drills, involve partnerships or
collaboration of public and private agencies from several sectors.
This study investigates the Organisation Cyber Resilience (OCR) of
the sectors participating in a cyber exercise called X Maya in Malaysia.
The study used a principle-based cyber resilience survey, the C-Suite
Executive checklist, developed by the World Economic Forum in
2012. To ensure the suitability of the survey for investigating OCR, a
reliability test was conducted on the C-Suite Executive checklist items.
The research further investigates the differences in OCR across the ten
Critical National Information Infrastructure (CNII) sectors that
participated in the cyber exercise. A one-way ANOVA test
showed a statistically significant difference in OCR among the ten CNII
sectors that participated in the cyber exercise.
Leadership in Future Operational Environment
Rapidly changing factors that affect daily life also affect the operational environment and the way military leaders fulfill their missions. With the help of technological developments, the traditional linearity of conflict and war has started to fade away. Furthermore, the mission domain has broadened to include traditional threats, hybrid threats and the new challenges of cyber and space. Considering the future operational environment, future military leaders need to adapt themselves to the new challenges of the future battlefield. But how should one decide what features of leadership are required to operate and accomplish missions on this new, complex battlefield? The main aim of this article is to answer this question. To find the right answers, leadership and its components are first defined, and then the characteristics of the future operational environment are analyzed. Finally, the leadership features required to be successful on the redefined battlefield are explained.
Cyber Security in Nigeria: A Collaboration between Communities and Professionals
Security can be defined as the degree of resistance to, or protection from, harm. It applies to any vulnerable and valuable asset, such as persons, dwellings, communities, nations or organizations. Cybercrime is any crime committed or facilitated via the Internet: any criminal activity involving computers and networks. It can range from fraud to unsolicited emails (spam), and includes the distant theft of government or corporate secrets through criminal trespass into remote systems around the globe. Nigeria, like other nations of the world, is currently having its own share of this menace, which has even been used as a tool by terrorists. This paper is an attempt at presenting cyber security as an issue that requires a coordinated national response. It also acknowledges and advocates the key roles to be played by stakeholders and the importance of forging strong partnerships to prevent and tackle cybercrime in Nigeria.
Challenges in Anti-Counterfeiting of Cyber-Physical Systems
This paper examines system protection for cyber-physical
systems (CPS). CPS are particularly characterized by the
networking of their system components, which means they are able to
adapt to the needs of their users and their environment. With this
ability, CPS have new, specific requirements for protection against
counterfeiting, know-how loss and manipulation. They increase the
requirements on system protection because piracy attacks can be
more diverse, for example because of an increasing number of
interfaces or through the networking abilities. The new requirements
were identified and, in a next step, matched with existing protective
measures. Because of the gap found, the development of new protection
measures must be pushed forward to close this gap. Moreover, a comparison
of the effectiveness of selected measures was carried out, and the
first results are presented in this paper.
Distributed Manufacturing (DM) - Smart Units and Collaborative Processes
Applications of the Hausdorff space and its mappings
into tangent spaces are outlined, including their fractal dimensions
and self-similarities. The paper details this theoretical setup and further
describes virtualizations and atomization of manufacturing processes.
It demonstrates novel concurrency principles that will guide
manufacturing processes and resources configurations. Moreover,
varying levels of details may be produced by up folding and breaking
down of newly introduced generic models. This choice of layered
generic models for units and systems aspects along specific aspects
allows research work in parallel to other disciplines with the same
focus on all levels of detail. More credit and easier access are granted
to outside disciplines for enriching manufacturing grounds. Specific
mappings and the layers give hints for chances for interdisciplinary
outcomes and may highlight more details for interoperability
standards, as already worked on at the international level. The new rules
are described, which require additional properties concerning all
involved entities for defining distributed decision cycles, again on the
base of self-similarity. All properties are further detailed and assigned
to a maturity scale, eventually displaying the smartness maturity of a
total shopfloor or a factory. The paper contributes to the intensive
ongoing discussion in the field of intelligent distributed
manufacturing and promotes solid concepts for implementing
Cyber-Physical Systems and the Internet of Things in the
manufacturing industry, like Industry 4.0, as discussed in German-speaking
countries.