Cloud data logger with temperature sensor
Cloud data loggers store data in the cloud through secure channels. Standalone data loggers, by contrast, store the collected data on an SD (secure digital) card, from which it can later be recovered for analysis. The main disadvantage of independent data loggers is that their data cannot be analyzed in real time. Independent data loggers are equipped with microcontrollers, which manage processing and resource allocation in the device. Several benefits are associated with this approach, including a short lead time for the data-gathering process, a high level of abstraction in production monitoring, and straightforward user interface development.

Introduction

Temperature regulation is a crucial aspect of everyday life, ensuring human survival and safety. Temperature measurement is essential in every operation that uses heaters, and it is carried out primarily in manufacturing industries, hospitals, and greenhouses, among other settings.
Temperature sensors capture temperature levels and are installed in various settings where heaters are used to control room temperature according to the desired configuration. Room temperature can be monitored in real time over the internet as the system runs. Temperature monitoring is done on a large scale in processes such as air conditioning, industry, power plants, and automotive applications, among other crucial areas. The temperature data from these areas needs to be stored and analyzed to ensure the smooth running of processes in the respective field. The primary purpose of the cloud data logger with a temperature sensor is to make it easy for the user to view current temperature levels.
Cloud data is data gathered from the field and stored in cloud storage. According to advocates, companies can cut or even avoid IT-associated costs when they rely on cloud computing (Zhou et al., pg. 109). Proponents claim that companies using cloud computing technology can implement their systems faster while ensuring a high level of service and lower maintenance costs. This enables the IT department to respond quickly to unpredictable and fluctuating demand. To avoid unexpected operating expenses from cloud storage services, which offer pay-as-you-go terms of payment, administrators ought to be familiar with the cloud pricing model. A data logger with a built-in sensor is a standard measurement instrument that is programmed to record electrical parameters over a period of time.
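Under pay-as-you-go billing, an administrator can estimate the monthly bill from expected usage before committing to a provider. A minimal sketch follows; the rate values are hypothetical placeholders, not any real provider's pricing.

```python
# Minimal sketch of a pay-as-you-go cost estimate for cloud storage.
# The rates below are hypothetical placeholders, not a provider's real prices.

def monthly_storage_cost(gb_stored, gb_egress,
                         storage_rate=0.023,   # $/GB-month (assumed)
                         egress_rate=0.09):    # $/GB transferred out (assumed)
    """Estimate one month's bill under a simple pay-as-you-go model."""
    return gb_stored * storage_rate + gb_egress * egress_rate

# A temperature logger writing one small sample per minute stores well under
# 1 GB per month, so for small deployments the egress term tends to dominate.
print(round(monthly_storage_cost(50, 10), 2))
```

Familiarity with even a rough model like this lets the administrator see which term of the bill their logging workload will actually stress.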
Microcontrollers in data loggers convert electrical signals from the process instrument into digital data. The digital data is recorded and then stored in the storage system for further analysis. Various storage devices are used (Yuriyama and Kushida), including floppy disks, memory cards, and non-volatile memory. A data logger whose resources can all function for an extended period offers the user a long data-collection window, which statistically provides more accurate results: the larger the sample drawn from the population, the more certain the estimated parameters (Zhou et al., pg. 109). Finally, the display should be able to present all of the data logger's settings in a form the user can read.
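The conversion step the microcontroller performs can be sketched as follows. The sketch assumes a 10-bit ADC with a 5 V reference and an LM35-style sensor that outputs 10 mV per degree Celsius; real hardware will differ in resolution, reference voltage, and sensor scaling.

```python
# Sketch of the ADC step a data-logger microcontroller performs, assuming a
# 10-bit ADC, a 5 V reference, and an LM35-style sensor (10 mV per deg C).

ADC_BITS = 10
V_REF = 5.0
MV_PER_DEG_C = 10.0

def adc_to_celsius(raw):
    """Convert a raw ADC count to a temperature in degrees Celsius."""
    if not 0 <= raw < 2 ** ADC_BITS:
        raise ValueError("raw reading out of ADC range")
    # Scale the count to millivolts, then apply the sensor's mV-per-degree slope.
    millivolts = raw * (V_REF * 1000.0) / (2 ** ADC_BITS - 1)
    return millivolts / MV_PER_DEG_C

print(round(adc_to_celsius(51), 1))   # a count of 51 is roughly 0.25 V
```

The same arithmetic runs on the microcontroller itself in fixed point; the floating-point form here is only for clarity.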
Data logging operations are mainly found in hospitals, weather stations, laboratories, and manufacturing industries. This has prompted the need for a system that is portable and can function in any environment. A cloud data logger can capture and transmit temperature data simultaneously, which is a major shortcoming of manual data loggers. The benefit appears once the research data is recorded automatically and the deployed sensors monitor the environment: the research can then focus on the results rather than on recording the values correctly. Further value comes later in a study, once all the gathered data has been analyzed in depth and the useful phenomena are described (Liang et al.). Collecting temperature data is crucial, as it helps prevent the many uncertainties that could arise if a room were to overheat.
The objective of the project

The project aims to design a remotely based microcontroller data logger that can capture the temperature of a given room, say a server room, and record the temperature readings in real time. The model will be a data logger equipped with a temperature sensor to sense the room temperature. The captured data will then be logged remotely from the system to the cloud for later analysis. Each chapter is elaborated below. The thesis takes a theoretical look at cloud computing, cloud computing technology providers, dashboard frameworks from the field of interface design, and IoT for exchanging data between the system's various parts.
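The logger's core behavior, sampling at a fixed interval and pushing each timestamped reading to the cloud, can be sketched in a few lines. Here `read_sensor` and `upload` are illustrative stand-ins for the hardware driver and the cloud API, not actual interfaces from the project.

```python
# Hypothetical sketch of the logger's main loop: sample the room temperature
# at a fixed interval, timestamp each reading, and push it to the cloud.
# read_sensor() and upload() are stand-ins for hardware and cloud APIs.
import time

def run_logger(read_sensor, upload, iterations=3, interval_s=60, clock=time.time):
    """Collect `iterations` readings and push each one to the cloud backend."""
    for _ in range(iterations):
        sample = {"ts": clock(), "temp_c": read_sensor()}
        upload(sample)
        # time.sleep(interval_s)  # enabled on real hardware between samples

# Dry run with a fixed 22.5 C reading and a list standing in for the cloud.
sent = []
run_logger(read_sensor=lambda: 22.5, upload=sent.append, iterations=3)
print(len(sent))  # 3 samples "uploaded"
```

Injecting the sensor, uploader, and clock as parameters keeps the loop testable without hardware, which matters for a device meant to run unattended.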
The dissertation will similarly examine the features of the cloud providers and clarify the differences between the technologies. In this way, the work highlights possible implementation prospects for small and medium-sized manufacturing organizations and for academic research (Vuppala). Furthermore, a study of the Additive Manufacturing technique of Direct Energy Deposition is conducted; understanding this method is essential because the implementation is intended for this specific production process. This chapter will include the conclusion on the project outcome. Finally, section six will close the paper and present the recommendations of the project research.

Assumptions and limitations

At the preliminary phase of developing the thesis, some limitations of and assumptions about the application-stage equipment became clear (Soghoian, 2018).
The environment providing the platform for implementing the intended solution is described in detail in the subsequent chapters, and the technology studied within that context is also covered comprehensively. The solution is constrained as follows: data collection and visualization must be handled in the public cloud; the cloud services must have a small learning curve; and the chosen cloud service platforms must be ones that can be depended on for the foreseeable future.

Literature review

The objectives can be attained with a new cloud-based solution in which the application is separated into two distinct components, backend and frontend. The backend functions as a server, gathering the information and presenting it to the frontend, where the data is visualized for the operator. The backend can be built on cloud services, and the frontend can be implemented with IoT dashboard frameworks.
Figure: Internet of Things model connecting the user and the data loggers to the cloud.

When the solution is designed in this way, it helps investigators adjust the data gathering as the research progresses. The Internet of Things has made research easier, since every step of a study can be automated; for instance, data analysis can be automated to produce a report as well as data visualizations, which are vital for understanding the data in detail and discovering hidden features and patterns that would not be drawn from the raw data. Gone is the tedious work of reading values from data loggers that might be located in several places, where finer details important for making informed decisions could be missed or discovered late, so that decisions based on the gathered information lead to a fatal failure.
The cloud data logger with a temperature sensor is developed to remove human effort as well as human error: all captured readings are automatically sent to the cloud server in real time. In this way, the personnel deployed to monitor the temperature will quickly notice any fluctuations and take the necessary action. Temperature readings are crucial, and failure to take them can lead to a fatal disaster, even death (Zhou et al.). The data in the cloud server is easy to review, since the cloud server is independent of any particular operating system. Some cloud services support email alarms and ensure that the defined threshold parameters are not surpassed. Several methods have been introduced in the past for assembling web-based applications.
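The alarm behavior mentioned above amounts to comparing each reading against user-defined limits and firing a notification when a limit is crossed. A minimal sketch follows; the limits and the `notify` callback are illustrative, not a specific cloud provider's API.

```python
# Sketch of a threshold alarm: compare each reading against user-defined
# limits and emit an alert when a limit is surpassed. The notify callback
# stands in for an email/SMS hook a cloud service might provide.

def check_thresholds(temp_c, low=15.0, high=30.0, notify=print):
    """Return True and fire a notification if the reading is out of range."""
    if temp_c < low or temp_c > high:
        notify(f"ALERT: temperature {temp_c:.1f} C outside [{low}, {high}]")
        return True
    return False

alerts = []
check_thresholds(34.2, notify=alerts.append)   # out of range -> alert queued
check_thresholds(21.0, notify=alerts.append)   # in range -> no alert
print(len(alerts))  # 1
```

Running this check server-side, rather than on the logger, means the limits can be changed without touching the deployed hardware.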
Service-oriented architecture (SOA) is one of these methods. One of SOA's main features is an architecture in which service consumers interact with a service provider that offers the requested services (Liang et al.). The term cloud computing was also influenced by the fact that the data server storing the data was moved far from where the data was collected, yet the storage process could still proceed without a hitch. The three categories of cloud computing are each constructed on top of one another. However, it is worth noting that each service can be operated as an individual service by the providers, satisfying customers' needs based on their wants.
SaaS (Software as a Service) follows a service-oriented architecture in which customers are offered a platform through which they can access software over the web. The software is maintained by the providers and is centralized, which is a major weakness: if the software fails, nothing will be running. However, the advantages outweigh the disadvantages, as the software can be accessed at any time and from any location, which is why it is still adopted by major companies (Liang et al.). For this reason, numerous companies work tirelessly to advance the services they offer and to lower costs as much as possible, so as to differentiate themselves positively from their competitors and gain more customers.
The Internet of Things can, less technically, be defined as an internet of smart sensors that are interconnected and ultimately connected to the internet. A device-to-device method ensures that the sensors can communicate fluently with each other; through this method, different readings can be logged remotely to the cloud server for additional scrutiny. The technology behind the Internet of Things has catalyzed its own development, which has grown exponentially since its inception, as adopters exploit the advantages associated with cloud computing. According to Liang (pg. 19), cloud computing will be a significant factor influencing the development of factories and companies in the coming years.

Configurable (channel) data logger for quality assurance and reliability testing
In Dan Pitica and Jano Rajmond's study, an eight-channel configurable data acquisition system was constructed whose input channels can be configured to measure either current or voltage. The RS-232 element serves as a serial interface between the Atmega128 and the PC. A master-slave arrangement was used in this project, with a BIM-418-F RF component to wirelessly transmit and receive the signal. The master circuit contains the critical devices, the RS-232 element and the RF component, while the slave circuit contains the RF component, a liquid crystal display, and a power unit. Two sensor nodes contain an air pollutant sensor (TGS 2600) and a volatile organic solvent sensor (TGS 2620) to measure gas concentrations in the atmosphere. A motor controller node engages or disengages the fan to regulate indoor air quality.
When the system starts, both sensors begin gauging air quality according to their tasks. Analog data from the sensors is conveyed to the AT89C51CC03 microcontroller, whose embedded ADC translates it from analog to digital form. The microcontroller then transmits the digital output to a CAN transceiver, and CAN is used to interconnect the two sensor nodes and the motor controller node (Zhou et al., pg. 109). At the same point, RS-232 conveys the data to the PC for checking and scrutiny. In the course of the process, the investigators faced some challenges with the system: they established that if the ZigBee unit is positioned at a long range between source and receiver, an error will occur. To address the problem and make the system more fault-free, the researchers suggested that the ZigBee node be backed by a memory bank using on-chip memory and linked remotely to the PC via a wireless connection (Zhou et al., pg. 109). A critical feature of the monitoring system is the need to store the outcomes in memory for future graphical study (Soghoian, 2018). Most schemes have opted to store data on a PC because of its effectively limitless storage space. Nevertheless, the researchers Cheng, Wong, and Ying preferred SD card storage in their system. The central aim of their system is to create a monitoring environment. They use an 8-bit PIC18F4620 microcontroller with an external 10 MHz crystal, which functions as the fundamental controller, followed by components such as sensors, an LCD, and an SD card. If the system uses the SPI module for SD card storage, it is difficult to use I2C concurrently (Vuppala).
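A software workaround for this kind of bus conflict can be illustrated with the core of bit-banging: driving a serial protocol by toggling pins from software instead of using the hardware peripheral. The sketch below is a minimal illustration in Python, not the authors' actual firmware; `set_clk` and `set_mosi` stand in for GPIO register writes on the microcontroller.

```python
# Minimal illustration of bit-banging: shifting a byte out over a serial bus
# by toggling pins in software rather than using a hardware SPI peripheral.
# set_clk() and set_mosi() stand in for real GPIO writes on the MCU.

def bitbang_spi_byte(byte, set_clk, set_mosi):
    """Shift one byte out MSB-first, pulsing the clock once per bit."""
    for i in range(7, -1, -1):
        set_mosi((byte >> i) & 1)  # present the data bit on MOSI
        set_clk(1)                 # rising edge: receiver samples MOSI
        set_clk(0)                 # falling edge: prepare the next bit

# Simulated "wire": record the MOSI level at every rising clock edge.
captured, mosi_level = [], [0]
set_mosi = lambda v: mosi_level.__setitem__(0, v)
set_clk = lambda v: captured.append(mosi_level[0]) if v else None
bitbang_spi_byte(0xA5, set_clk, set_mosi)
print(captured)  # 0xA5 = 0b10100101, shifted out MSB-first
```

Because the protocol lives entirely in software, any spare pins can carry it, which is what frees the hardware SPI and I2C modules from contending for the same peripheral.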
To address this problem, the researchers recommended that the system be improved by using a bit-banging technique implemented in software.

Conclusion

In conclusion, with automatic data loggers in the temperature control system, the system is made more efficient and can carry out its functions simultaneously. Unlike manual data loggers, this system can work in any environment and transmit the data for analysis so that a response can be generated immediately. Sensor networks and cloud computing are also becoming dominant in the market today.

References

Luo, A. Terzis, and F. Zhao, "RACNet: a high fidelity data center sensing network," in Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems. ACM, 2009.
Hu, Shu-Chiung, et al.
Qian, and A. Zhou, "Security and privacy in cloud computing: a survey," in Semantics Knowledge and Grid (SKG), 2010 Sixth International Conference on.