Design of Sensor Nodes based on Principles of LoRaPhy and LoRaWAN

Author(s):  
Maruti Muthu ◽  
Sanket Sunil Gore ◽  
Sarvesh Sandesh Sawant

This research is based on long-distance, low-power wireless communication. The primary objective is to prolong the battery life of the communication device connected to the sensors, which is achieved by providing a single gateway for multiple nodes. Conventionally, in IoT (Internet of Things) applications, sensors are interfaced with 3G/4G modules that push data directly to the server, so each device requires more power to transmit. This degrades the battery backup of the system and increases power dissipation. With a LoRa node-gateway network, a group of sensors sends data to a single gateway, which requires less power than the conventional approach. This article provides insight into the LoRa sensor node design and enclosure setup. Under this scheme, Chirp Spread Spectrum (CSS) or Frequency Shift Keying (FSK) modulation is used to encode the data to be sent to the server. The data acquired at the gateway is formatted and forwarded to the server.
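
As a rough illustration of the duty-cycled node-gateway pattern described above, the sketch below shows a sensor node packing a compact binary payload and handing it to a LoRa radio driver. The `radio` and `sensor` objects, node address, and send interval are hypothetical placeholders, not the authors' design.

```python
# Minimal sketch of a LoRa sensor node's duty cycle, assuming a
# hypothetical radio driver; the real API depends on the LoRa module
# (e.g. an SX127x breakout) and its firmware.
import struct
import time

SEND_INTERVAL_S = 60          # wake once a minute (assumed interval)
NODE_ID = 0x17                # hypothetical node address

def pack_reading(node_id: int, temp_c: float, humidity: float) -> bytes:
    """Pack a compact payload: 1-byte id plus two signed 16-bit scaled values."""
    return struct.pack(">Bhh", node_id, int(temp_c * 100), int(humidity * 100))

def main_loop(radio, sensor):
    while True:
        payload = pack_reading(NODE_ID, sensor.temperature(), sensor.humidity())
        radio.send(payload)   # single short uplink; the gateway forwards to the server
        radio.sleep()         # radio off between transmissions to save power
        time.sleep(SEND_INTERVAL_S)
```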

Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7613
Author(s):  
Dominik Widhalm ◽  
Karl M. Goeschka ◽  
Wolfgang Kastner

In wireless sensor networks, the quality of the provided data is influenced by the properties of the sensor nodes. Often deployed in large numbers, they usually consist of low-cost components where failures are the norm, even more so in harsh outdoor environments. Current fault detection techniques, however, consider the sensor data alone and neglect vital information from the nodes' hardware and software. As a consequence, they cannot distinguish between rare data anomalies caused by proper events in the sensed data on the one side and fault-induced data distortion on the other. In this paper, we contribute a novel, open-source sensor node platform for monitoring applications such as environmental monitoring. For long battery life, it comprises mainly low-power components. In contrast to other sensor nodes, our platform provides self-diagnostic measures to enable active node-level reliability. The entire sensor node platform, including the hardware and software components, has been implemented and is publicly available and free to use. Based on an extensive, long-running practical experiment, we show that the detectability of node faults is improved and that the distinction between rare but proper events and fault-induced data distortion is indeed possible. We also show that these measures have a negligible overhead on the node's energy efficiency and hardware costs. This improves the overall reliability of wireless sensor networks, delivering both long battery life and high-quality data.
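
The following sketch illustrates, under assumed thresholds, how node-level self-diagnostics could be fused with a plain data check to separate rare-but-proper events from fault-induced distortion. It is an illustrative reading of the idea, not the platform's actual implementation.

```python
# Illustrative sketch (not the authors' implementation): a reading is
# classified as a fault only if it is anomalous AND the node's own
# health indicators look suspicious; otherwise an anomaly is an event.
from dataclasses import dataclass

@dataclass
class NodeHealth:
    supply_voltage: float   # V, e.g. read via the MCU's ADC
    mcu_temp_c: float       # on-chip temperature sensor
    resets_since_boot: int  # watchdog/brown-out reset counter

def health_suspicious(h: NodeHealth) -> bool:
    # Assumed thresholds for illustration only.
    return h.supply_voltage < 2.7 or h.mcu_temp_c > 70.0 or h.resets_since_boot > 0

def classify(reading: float, mean: float, std: float, health: NodeHealth) -> str:
    anomalous = abs(reading - mean) > 3 * std   # simple 3-sigma data check
    if anomalous and health_suspicious(health):
        return "fault"        # distortion likely caused by the node itself
    if anomalous:
        return "event"        # rare but proper event in the monitored quantity
    return "normal"
```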


2011 ◽  
Vol 63-64 ◽  
pp. 978-982 ◽  
Author(s):  
Wen Si Wang ◽  
Ning Ning Wang ◽  
Michael Hayes ◽  
Brendan O'Flynn ◽  
Cian O'Mathuna

In recent years, wireless sensor networks have been widely used to monitor temperature and other manufacturing parameters. However, limited battery life poses a constraint for large sensor networks. In this work, a thermoelectric energy harvester is designed to effectively convert heat into electrical energy to power a wireless sensor node. Bismuth telluride thermoelectric modules are optimized for low-temperature conditions. A power management module based on a charge pump and a switching regulator is designed to efficiently step up the 500 mV thermoelectric voltage to the 3.0 V level required by wireless sensor nodes. The design employs energy storage based on electric double-layer capacitors, with consideration for practical wireless sensor node operation. The implemented energy harvester prototype is proposed for the Tyndall wireless sensor system to monitor temperature and relative humidity in a manufacturing process. The prototype was tested under various conditions to uncover the issues in this practical design. The proposed prototype can be expected to operate for 15 years, instead of the 3-6 month lifetime of a battery-powered node.
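
A back-of-the-envelope energy budget shows why such a harvester can outlast a battery: the node runs indefinitely whenever average harvested power covers the average load. All numbers below are assumptions for illustration, not measurements from the paper.

```python
# Toy energy-budget check under assumed figures: a duty-cycled node is
# energy-neutral if average harvested power exceeds its average draw.
HARVESTED_POWER_MW = 2.0    # assumed TEG output before conversion losses
BOOST_EFFICIENCY   = 0.6    # assumed charge-pump/regulator efficiency
ACTIVE_POWER_MW    = 60.0   # assumed draw while sensing/transmitting
SLEEP_POWER_MW     = 0.01   # assumed deep-sleep draw
DUTY_CYCLE         = 0.001  # assumed fraction of time the node is active

avg_load_mw = DUTY_CYCLE * ACTIVE_POWER_MW + (1 - DUTY_CYCLE) * SLEEP_POWER_MW
avg_harvest_mw = HARVESTED_POWER_MW * BOOST_EFFICIENCY

print(f"average load:    {avg_load_mw:.3f} mW")
print(f"average harvest: {avg_harvest_mw:.3f} mW")
print("energy-neutral" if avg_harvest_mw >= avg_load_mw else "battery drains")
```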


Information ◽  
2020 ◽  
Vol 11 (11) ◽  
pp. 524
Author(s):  
Joseph Habiyaremye ◽  
Marco Zennaro ◽  
Chomora Mikeka ◽  
Emmanuel Masabo ◽  
Santhi Kumaran ◽  
...  

Nowadays, with the evolution of the Internet of Things (IoT), building a network of sensors that measures data from remote locations requires careful planning around many parameters, including power consumption. Many communication technologies, such as Wi-Fi, Bluetooth, Zigbee, LoRa, Sigfox, and GSM/GPRS, are used depending on the application, and each application imposes requirements such as communication range, power consumption, and the nature of the data to be transmitted. In some places, especially hilly areas like Rwanda where GSM connectivity is already available, GSM/GPRS may be the best choice for IoT applications. Energy consumption is a major challenge for battery-powered sensor nodes, as the lifetime of the node and the network depends on the battery's state of charge. In this paper, we focus on static sensor nodes communicating over the GPRS protocol. We measured the current consumption of a sensor node at different locations, together with the corresponding received signal quality, and experimentally derived a mathematical data-driven model for estimating the GSM/GPRS sensor node battery lifetime from the received signal strength indicator (RSSI). This outcome will help predict GPRS sensor node lifetime, replacement intervals, and dynamic handover, which in turn provides uninterrupted data service. The model can be deployed in various remote WSN and IoT applications, such as forest or volcano monitoring. Our results are convincing: for example, a 30 dB drop in RSSI doubles the current consumption of the node's radio unit.
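
The sketch below turns the paper's headline observation (roughly a doubling of radio current per 30 dB drop in RSSI) into a toy lifetime estimator. The battery capacity, base current, and reference RSSI are assumed values, not the fitted coefficients from the paper.

```python
# Illustrative RSSI-driven lifetime model in the spirit of the paper's
# finding; all constants below are assumptions for demonstration.
BATTERY_MAH  = 2000.0   # assumed battery capacity
I_BASE_MA    = 20.0     # assumed average radio current at the reference RSSI
RSSI_REF_DBM = -70.0    # assumed reference signal level

def radio_current_ma(rssi_dbm: float) -> float:
    """Current doubles for every 30 dB the RSSI falls below the reference."""
    return I_BASE_MA * 2 ** ((RSSI_REF_DBM - rssi_dbm) / 30.0)

def lifetime_hours(rssi_dbm: float) -> float:
    return BATTERY_MAH / radio_current_ma(rssi_dbm)

for rssi in (-70, -85, -100):
    print(f"RSSI {rssi:4d} dBm -> {lifetime_hours(rssi):6.1f} h")
```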


Author(s):  
Yang Wang ◽  
Feifan Wang ◽  
Yujun Zhu ◽  
Yiyang Liu ◽  
Chuanxin Zhao

In wireless rechargeable sensor networks, the deployment of charger nodes directly affects the overall charging utility of the network. To address this problem, this paper abstracts charger deployment as a multi-objective optimization problem that maximizes the received power of the sensor nodes while minimizing the number of charger nodes. First, a network model capturing these two objectives is constructed. Second, an improved cuckoo search (ICS) algorithm is proposed. The algorithm builds on the traditional cuckoo search (CS) algorithm by redefining its step factor and using a mutation factor to change the host bird's nesting position when updating nest positions; ICS is then used to find the optimal solution that maximizes the received power of the sensor nodes and minimizes the number of charger nodes. Compared with the traditional cuckoo search algorithm and multi-objective particle swarm optimization, the simulation results show that the algorithm effectively increases the received power of the sensor nodes, reduces the number of charger nodes, and finds an optimal solution satisfying the constraints, thereby maximizing the network's charging utility.
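
A minimal sketch of the ICS idea follows: a cuckoo-search loop with a step factor that shrinks over iterations, plus a mutation of nest positions. The objective, bounds, and parameters are illustrative stand-ins for the paper's charging-utility formulation.

```python
# Sketch of an improved cuckoo search: adaptive step factor + mutation.
# The fitness function is a placeholder, not the paper's objective.
import random

DIM, NESTS, ITERS = 2, 15, 200
PA, MUTATION = 0.25, 0.1           # abandon probability, mutation factor

def fitness(pos):                  # placeholder for the charging-utility objective
    return -sum((x - 0.5) ** 2 for x in pos)

def clip(x):
    return min(1.0, max(0.0, x))   # keep positions inside the unit box

nests = [[random.random() for _ in range(DIM)] for _ in range(NESTS)]
for t in range(ITERS):
    step = 0.5 * (1 - t / ITERS)   # redefined step factor shrinks over time
    best = max(nests, key=fitness)
    for i, nest in enumerate(nests):
        # random walk toward the best nest, scaled by the adaptive step
        cand = [clip(x + step * random.gauss(0, 1) * (b - x))
                for x, b in zip(nest, best)]
        # mutation factor perturbs the host bird's nesting position
        cand = [clip(x + MUTATION * random.uniform(-1, 1)) for x in cand]
        if fitness(cand) > fitness(nest):
            nests[i] = cand
        elif random.random() < PA:                 # abandon poor nests
            nests[i] = [random.random() for _ in range(DIM)]

print("best position:", max(nests, key=fitness))
```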


Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 953 ◽  
Author(s):  
Yuanchu Yin ◽  
Jiefan Qiu ◽  
Zhiqiang Li ◽  
Mingsheng Cao

When a wireless sensor node's wireless communication fails after deployment in an inaccessible area, the lost node cannot be repaired through a debugging interaction that relies on that communication. Visible light communication (VLC), as a supplement to radio communication, can improve transmission security at the physical layer thanks to its unidirectional propagation. We therefore implemented a VLC-based hybrid communication debugging system (HCDS) between a smartphone and a sensor node. For the system's downlink, the smartphone acts as the VLC gateway and sends debugging code to the sensor node via its flashlight. To improve the transmission efficiency of the downlink, we also propose new methods for source coding and channel coding. For the source coding, we analyze the binary instructions and compress the operands using bitmask techniques; the average compression rate of the binary structure reaches 84.11%. For the channel coding, we optimize dual-header pulse interval modulation (DH-PIM) into overlapped DH-PIM (ODH-PIM) by introducing a flashlight half-on state, which improves the representation capability of individual symbols. For the uplink of the HCDS, we use the onboard LED of the sensor node to transmit debugging feedback to the smartphone, and we design a novel DH-PIM encoding format to optimize uplink transmission. Experimental results show that the optimized uplink transmission time and BER are reduced by 10.71% and 22%, respectively, compared with the original DH-PIM.
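
As a toy illustration of the pulse-interval channel coding being optimized here, the sketch below encodes M-bit symbols into on/off flashlight slots with two header types, which keeps frames short by complementing symbols in the upper half of the range. It follows the general DH-PIM idea only loosely and omits the half-on state that ODH-PIM adds.

```python
# Toy DH-PIM-style encoder: each M-bit symbol becomes a header pulse
# followed by empty slots. Slot values: 1 = flashlight on, 0 = off.
# Header lengths and framing are simplified for illustration.
M = 3                               # bits per symbol

def encode(symbol: int) -> list[int]:
    half = 2 ** (M - 1)
    if symbol < half:
        header, slots = [1, 0], symbol               # short header H1
    else:
        header, slots = [1, 1, 0], 2 ** M - 1 - symbol   # long header H2, complemented
    return header + [0] * slots

for s in range(2 ** M):
    print(s, encode(s))
```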


Author(s):  
Songzuo Liu ◽  
Habib Hussain Zuberi ◽  
Yi Lou ◽  
Muhmmad Bilal Farooq ◽  
Shahabuddin Shaikh ◽  
...  

The linear chirp spread spectrum technique is widely used in underwater acoustic communication because of its resilience to severe multipath and Doppler shift. However, linear frequency-modulated signals require a high spreading factor to approach orthogonality between pairs of signals, whereas nonlinear chirp spread spectrum signals can achieve orthogonality at a low spreading factor. As a result, the nonlinear variant improves spectral efficiency and is less sensitive to Doppler spread than its linear counterpart. To achieve a higher data rate, we propose two variants (half-cycle sine and full-cycle sine) of an M-ary nonlinear sine chirp spread spectrum technique based on a virtual time-reversal mirror (VTRM). The proposed scheme transmits chirps over different frequency bands, and the VTRM is used to improve the bit error rate under severe multipath. Its strong Doppler tolerance makes it suitable for underwater acoustic communication. Furthermore, the proposed method uses a simple, low-power bank of matched filters, which reduces overall system complexity. Simulations in different underwater acoustic channels verify the robustness of the proposed scheme.
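
One plausible reading of the two frequency laws is sketched below: the instantaneous frequency traces half or a full cycle of a sine around the carrier, versus a linear sweep. The band, duration, and sample rate are illustrative, not the paper's simulation settings.

```python
# Sketch of sine-chirp waveform generation: integrate an assumed
# instantaneous-frequency law to phase, then take the cosine.
import numpy as np

fs, T = 48_000, 0.05                    # sample rate (Hz), symbol duration (s)
fc, B = 10_000, 4_000                   # assumed center frequency, bandwidth (Hz)
t = np.arange(int(fs * T)) / fs

def phase(inst_freq):
    """Discrete approximation of integrating frequency to phase."""
    return 2 * np.pi * np.cumsum(inst_freq) / fs

lin  = fc - B / 2 + B * t / T                      # linear FM reference
half = fc + (B / 2) * np.sin(np.pi * t / T)        # half-cycle sine frequency law
full = fc + (B / 2) * np.sin(2 * np.pi * t / T)    # full-cycle sine frequency law

s_lin, s_half, s_full = (np.cos(phase(f)) for f in (lin, half, full))
```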


2019 ◽  
Vol 11 (21) ◽  
pp. 6171 ◽  
Author(s):  
Jangsik Bae ◽  
Meonghun Lee ◽  
Changsun Shin

With the expansion of smart agriculture, wireless sensor networks are being applied increasingly widely. These networks collect environmental information such as temperature, humidity, and CO2 levels. However, if a faulty sensor node keeps operating in the network, its unnecessary data transmissions adversely impact the network. Accordingly, a data-based fault-detection algorithm was implemented in this study to analyze sensor node data, identify faults, and prevent the corresponding nodes from transmitting, thereby minimizing damage to the network. As an example, a cloud-based "farm as a service" optimized for smart farms was implemented, with resource management of sensors and actuators provided through the oneM2M common platform. The effectiveness of the proposed fault-detection model was verified on an Internet of Things integrated management platform by collecting and analyzing data. The results confirm that when a faulty sensor node is not separated from the network, its continuous transmission of abnormal data triggers unnecessary transmissions from other sensor nodes, increasing energy consumption and reducing the network lifetime.
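
A minimal sketch of a data-based fault check in this spirit: a node is quarantined when its readings are stuck at a constant value or persistently outside a plausible range, so it stops injecting useless traffic. The thresholds and window size are assumptions, not the paper's algorithm.

```python
# Illustrative data-based fault detector for farm sensor streams;
# plausible ranges and window length are assumed for demonstration.
from collections import deque

WINDOW = 20
PLAUSIBLE = {"temp": (-10.0, 50.0), "humidity": (0.0, 100.0)}

class FaultDetector:
    def __init__(self, kind: str):
        self.lo, self.hi = PLAUSIBLE[kind]
        self.history = deque(maxlen=WINDOW)

    def update(self, value: float) -> bool:
        """Return True if the node should be cut off from the network."""
        self.history.append(value)
        if len(self.history) < WINDOW:
            return False                      # not enough evidence yet
        stuck = len(set(self.history)) == 1   # constant (stuck-at) output
        out_of_range = all(not (self.lo <= v <= self.hi)
                           for v in self.history)   # persistent outliers
        return stuck or out_of_range
```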

