Building and evaluating P2P systems using the Kompics component framework

We present a framework for building and evaluating P2P systems in simulation, local execution, and distributed deployment. Such uniform system evaluations increase confidence in the obtained results. We briefly introduce the Kompics component model and its P2P framework. We describe the component architecture of a Kompics P2P system and show how to define experiment scenarios for large dynamic systems. The same experiments are conducted in reproducible simulation, in real-time execution on a single machine, and distributed over a local cluster or a wide area network.

This demonstration shows the component-oriented design and the evaluation of two P2P systems implemented in Kompics: Chord and Cyclon. We simulate the systems and then execute them in real time. During real-time execution we monitor the dynamic behavior of the systems and interact with them through their Web-based interfaces. We demonstrate how component-oriented design enables seamless switching between alternative protocols.
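To convey the design idea behind running the same protocol code in simulation and deployment, the following sketch writes protocol logic against an abstract network "port" so that a simulated network or a real transport can be substituted without touching the protocol. This is only a Python analogy of the component/port principle, not the Kompics Java API; the class and method names are ours.

class NetworkPort:
    def send(self, dest, msg):
        raise NotImplementedError

class SimulatedNetwork(NetworkPort):
    def __init__(self):
        self.queue = []                       # messages delivered later by the simulation scheduler
    def send(self, dest, msg):
        self.queue.append((dest, msg))

class UdpNetwork(NetworkPort):
    def send(self, dest, msg):
        print(f"UDP send to {dest}: {msg}")   # placeholder for a real socket send

def run_protocol(network: NetworkPort):
    # identical protocol logic in reproducible simulation and in real deployment
    network.send("peer-42", "JOIN")

run_protocol(SimulatedNetwork())
run_protocol(UdpNetwork())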

A routing Ad Hoc network for disaster scenarios

In this work we study wireless networks without infrastructure, especially in emergency situations where groups of rescuers must be on site to accomplish rescue tasks and therefore need real-time wireless communication between individuals or groups. The nature of a MANET (Mobile Ad Hoc Network) makes it suitable for emergencies, when the existing infrastructure is down or severely overloaded. In such cases, ad hoc networks can be used to quickly deploy small spontaneous networks. Since nodes are mobile, the network topology may change rapidly and randomly.

The increasing mobility of terminals makes them progressively more dependent on their limited power sources; we illustrate this by introducing several mobility models and several emergency mobility scenarios. Energy efficiency in the emergency scenario is the main objective of this paper, achieved by combining a low-power mode algorithm with a power-aware routing strategy. A selected set of simulation studies indicates a reduction in energy consumption and a significant increase in node lifetime, while network performance is not affected significantly. This is the main interest of our work in emergency situations: by increasing node lifetime, individuals can communicate longer, giving rescuers a better chance of finding them.
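The abstract does not spell out the routing metric, so the sketch below assumes one common power-aware strategy, a max-min residual-energy metric, purely as an illustration: among candidate routes, pick the one whose weakest node has the most battery left, so no single relay is drained prematurely.

def bottleneck_energy(route, residual_energy):
    """Return the smallest residual battery level along a route."""
    return min(residual_energy[node] for node in route)

def select_route(candidate_routes, residual_energy):
    """Pick the route that maximises the energy of its weakest node."""
    return max(candidate_routes, key=lambda r: bottleneck_energy(r, residual_energy))

# Hypothetical residual energy levels (fraction of full battery) per node.
residual_energy = {"A": 0.9, "B": 0.4, "C": 0.7, "D": 0.8, "E": 0.6}
routes = [["A", "B", "E"], ["A", "C", "D", "E"]]
print(select_route(routes, residual_energy))  # -> ['A', 'C', 'D', 'E']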

Value Is in the Eye of the Beholder: Early Visual Cortex Codes Monetary Value of Objects during a Diverted Attention Task

A central concern in the study of learning and decision-making is the identification of neural signals associated with the values of choice alternatives. An important factor in understanding the neural correlates of value is the representation of the object itself, separate from the act of choosing. Is it the case that the representation of an object within visual areas will change if it is associated with a particular value? We used fMRI adaptation to measure the neural similarity of a set of novel objects before and after participants learned to associate monetary values with the objects. We used a range of both positive and negative values to allow us to distinguish effects of behavioral salience (i.e., large vs. small values) from effects of valence (i.e., positive vs. negative values).

During the scanning session, participants made a perceptual judgment unrelated to value. Crucially, the similarity of the visual features of any pair of objects did not predict the similarity of their value, so we could distinguish adaptation effects due to each dimension of similarity. Within early visual areas, we found that value similarity modulated the neural response to the objects after training. These results show that an abstract dimension, in this case, monetary value, modulates neural response to an object in visual areas of the brain even when attention is diverted.

Energy-efficient privacy homomorphic encryption scheme for multi-sensor data in WSNs

Recent advancements in wireless sensor hardware allow a single hardware unit to sense multiple quantities such as temperature, pressure, and humidity, giving rise to multi-sensor data communication in wireless sensor networks (WSNs). The in-network processing technique of data aggregation is crucial for energy-efficient WSNs; however, combined with the requirement of end-to-end data confidentiality, it can prove to be a challenge. End-to-end data confidentiality together with data aggregation is possible with a special type of encryption called a privacy homomorphic (PH) encryption scheme. This paper proposes an optimized PH encryption scheme for WSN-integrated networks handling multi-sensor data. The proposed scheme ensures lightweight payloads, significant energy and bandwidth savings, and lower latencies.

The performance of the proposed scheme is analyzed in this paper against an existing scheme. The working principle of the multi-sensor data framework is also presented, along with the corresponding packet structures and processes. The scheme decreases the payload size by 56.86% and spends an average of 8-18 mJ at the aggregator node as the number of sensor nodes varies from 10 to 50, thereby ensuring scalability of the WSN, unlike the existing scheme.
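For readers unfamiliar with privacy homomorphic aggregation, the sketch below shows a minimal additively homomorphic construction in the style of the well-known Castelluccia-Mykletun-Tsudik scheme; it only illustrates the principle that ciphertexts can be summed without decryption, and is not the optimized scheme proposed in the paper.

M = 2 ** 16  # modulus, chosen larger than the maximum possible aggregate

def encrypt(reading, key):
    # c = (m + k) mod M: each node masks its reading with a key shared with the sink
    return (reading + key) % M

def aggregate(ciphertexts):
    # the aggregator adds ciphertexts without ever seeing plaintext readings
    return sum(ciphertexts) % M

def decrypt_sum(agg_ciphertext, keys):
    # the sink removes the sum of the keys to recover the aggregate reading
    return (agg_ciphertext - sum(keys)) % M

keys = {1: 1234, 2: 56789, 3: 424}        # hypothetical per-node keys shared with the sink
readings = {1: 21, 2: 23, 3: 25}          # e.g. temperature samples
cts = [encrypt(readings[i], keys[i]) for i in keys]
assert decrypt_sum(aggregate(cts), keys.values()) == sum(readings.values())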

Enhancing Optical-CDMA Confidentiality With Multicode-Keying Encryption

Optical codes with large cardinality and tree structures of multiple subsets of codewords for adjustable code performance and cardinality have recently been proposed. As studied in this paper, these characteristics support multicode-keying encryption for enhancing physical-layer confidentiality in optical code-division multiple-access systems and networks.

The concept of the multicode-keying encryption technique is introduced. The associated all-optical hardware is designed and validated with OptiSystem™ simulation. The theoretical analyses of confidentiality improvement by means of rapid codeword switching and multicode-keying encryption are formulated.
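The following is a purely conceptual, software-level illustration of the multicode-keying idea, with placeholder codeword names and an assumed key-driven shuffle; the actual scheme operates on optical codewords in all-optical hardware. The point is that each group of data bits selects one of M codewords, and a shared key re-shuffles the bit-to-codeword mapping every symbol, so an eavesdropper cannot bind a fixed codeword to a fixed bit pattern.

import random

CODEWORDS = ["CW0", "CW1", "CW2", "CW3"]       # placeholder names for M = 4 optical codes

def keyed_mapping(key, symbol_index):
    # both ends derive the same per-symbol shuffle from the shared key
    rng = random.Random(key * 100003 + symbol_index)
    order = CODEWORDS[:]
    rng.shuffle(order)
    return order

def transmit(bits, key):
    out = []
    for i in range(0, len(bits), 2):            # 2 data bits per symbol for M = 4
        symbol = int(bits[i:i + 2], 2)
        out.append(keyed_mapping(key, i // 2)[symbol])
    return out

print(transmit("01100011", key=42))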

Dynamic chaining of Virtual Network Functions in cloud-based edge networks

This manuscript investigates the issue of implementing chains of network functions in a “softwarized” environment where edge network middle-boxes are replaced by software appliances running in virtual machines within a data center. The primary goal is to show that this approach allows space and time diversity in service chaining, with a higher degree of dynamism and flexibility with respect to conventional hardware-based architectures.

The manuscript describes implementation alternatives for virtual function chaining in an SDN scenario, showing that both layer 2 and layer 3 approaches are functionally viable. A proof-of-concept implementation with the Mininet emulation platform is then presented to give a practical example of the feasibility and degree of complexity of such approaches.
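A Mininet-based proof of concept of this kind is typically scripted in Python. The sketch below is our own assumed illustration (host names, the single-switch layout, and the flow rule are ours, not the authors' code) of steering traffic through a host that stands in for a virtualised network function; it requires root privileges and Open vSwitch, and the port numbers depend on link creation order.

from mininet.net import Mininet
from mininet.topo import Topo
from mininet.cli import CLI

class ChainTopo(Topo):
    def build(self):
        client = self.addHost('h1')
        vnf = self.addHost('vnf1')       # host standing in for a VM running the network function
        server = self.addHost('h2')
        s1 = self.addSwitch('s1')
        for h in (client, vnf, server):
            self.addLink(h, s1)

net = Mininet(topo=ChainTopo())
net.start()
# Illustrative flow rule forcing h1's traffic towards the VNF port first:
net['s1'].cmd('ovs-ofctl add-flow s1 "in_port=1,actions=output:2"')
CLI(net)
net.stop()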

Secure communication scheme for wireless sensor networks to maintain anonymity

In wireless sensor networks it is becoming increasingly important, for security reasons, for sensor nodes to maintain anonymity while communicating data. Anonymous communication among sensor nodes matters because nodes want to conceal their identities, whether they act as a base station or as a source node. Anonymous communication in wireless sensor networks covers several important aspects, for instance base station anonymity, communication association anonymity, and source node anonymity. From the literature we observe that existing anonymity schemes for wireless sensor networks either cannot realize complete anonymity or suffer from various overheads such as large memory usage, complex computation, and long communications.

This paper presents an efficient secure anonymity communication protocol (SACP) for wireless sensor networks that realizes complete anonymity with minimal overheads in storage, computation, and communication. The proposed protocol is compared with various existing anonymity protocols, and the performance analysis shows that it accomplishes all three anonymities, sender node anonymity, base station anonymity, and communication association anonymity, while using little memory, low communication cost, and small computation cost.

Design & development of daughter board for Raspberry Pi to support Bluetooth communication using UART

Reliable and secure communication between two or more devices requires a wired connection, whereas wireless technologies such as Bluetooth, Wi-Fi, and ZigBee provide flexible and inexpensive solutions for remote applications. A large number of low-cost hardware platforms such as Raspberry Pi, Arduino, and mbed boards do not provide any built-in wireless module but are equipped with UART and I2C ports for the design and development of Internet of Things (IoT) and embedded applications. Designers therefore face difficulty in using such low-cost hardware for wireless communication when implementing these applications.

In this paper, the design and development of a daughter board for the Raspberry Pi to support Bluetooth communication using UART is proposed as an integrated solution. Results for various QoS parameters such as transmission rate, file format (text, PDF, image, and audio), baud rate, and range for Bluetooth communication between two devices are presented. The results show that the Bluetooth module is capable of transmitting files of the same size in different formats in approximately the same time. The proposed solution can further be extended to other protocols such as Wi-Fi and ZigBee.
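On the host side, communication with a UART-attached Bluetooth module can be scripted, for example, with pyserial. The sketch below is an assumed illustration: the device path, baud rate, and file name are placeholders, not the configuration used in the paper.

import serial

# Stream a file over the Pi's UART to the Bluetooth module in fixed-size chunks.
with serial.Serial('/dev/serial0', baudrate=9600, timeout=1) as bt:
    with open('report.pdf', 'rb') as f:        # example file; any format (text, PDF, image, audio) works
        while True:
            chunk = f.read(512)
            if not chunk:
                break
            bt.write(chunk)                     # forwarded to the paired device by the module
    bt.flush()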

IEEE P1906.1/D2.0, Oct 2014 – IEEE Draft Recommended Practice for Nanoscale and Molecular Communication Framework

This recommended practice provides a definition, terminology, conceptual model, and standard metrics for ad hoc network communication at the nanoscale. The physical properties of nanoscale communication extend human-engineered networking beyond what is defined in existing communication standards, including in vivo, sub-cellular medical communication, smart materials, sensing at the molecular level, and the ability to operate in environments that would be too harsh for macroscale communication mechanisms. Nanoscale communication also requires collaboration among a highly diverse set of disciplines with differing definitions and connotations for some terms; a common terminology is therefore required to aid inter-discipline collaboration.

A common framework for thinking abstractly about nanoscale communication aids in defining and relating research and development effort. Components of the framework are independent enough to allow them to be developed in relative isolation, yet the components are also interoperable. Example mappings between specific nanoscale communication use-cases and the common framework serve to illustrate the recommended practice. Simulation code implementing the common framework for both wireless and molecular nanoscale communication provides an embodiment of the common framework demonstrating precisely how the framework is applied.

Test case analytics: Mining test case traces to improve risk-driven testing

In risk-driven testing, test cases are generated and/or prioritized based on different risk measures. For example, the most basic risk measure analyzes the history of the software and assigns higher risk to test cases that detected bugs in the past. In practice, however, a test case may not be exactly the same as a previously failed test, but quite similar. In this study, we define a new risk measure that assigns a risk factor to a test case if it is similar to a failing test case from history. Similarity is defined over the execution traces of the test cases, where each test case is represented as a sequence of method calls.
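The abstract does not give the exact similarity function, so the sketch below assumes a simple sequence-alignment ratio over method-call traces purely to convey the idea: a test case inherits risk from the historical failing trace it most resembles.

from difflib import SequenceMatcher

def trace_similarity(trace_a, trace_b):
    """Similarity in [0, 1] between two method-call sequences."""
    return SequenceMatcher(None, trace_a, trace_b).ratio()

def risk(test_trace, failing_traces):
    """Risk = similarity to the closest previously failing trace."""
    return max((trace_similarity(test_trace, f) for f in failing_traces), default=0.0)

failing = [["init", "openConn", "sendRequest", "parseReply"]]
candidate = ["init", "openConn", "sendRequest", "parseReply", "close"]
print(round(risk(candidate, failing), 2))  # -> 0.89, high risk: nearly the same trace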

We evaluated our new risk measure by comparing it to a traditional risk measure (where the risk is increased only if the very same test case, not a similar one, failed in the past). The results of our study, in the context of test case prioritization on two open source projects, show that our new risk measure is far more effective in identifying failing test cases than the traditional risk measure.