The number of objects connected to the Internet of Things (IoT) is expected to reach 50 billion by 2020, giving rise to an enormous amount of valuable data. As IoT solutions emerge, the amount of available sensor data is growing, but developing insight into that data can be difficult. Analyzing historical data is often the first step in understanding data you intend to use in real time. You may want to compute some basic statistics on your data to find anomalies, and you may also need to clean up your data by removing bad data points or filtering out noise.
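As a minimal sketch of that first step, the snippet below cleans a batch of historical sensor readings: it drops anomalous data points using a z-score test, then filters noise with a simple trailing moving average. The readings, the z-score threshold of 2, and the window size of 3 are all illustrative assumptions, not part of the original text.

```python
from statistics import mean, stdev

def remove_outliers(samples, z_threshold=2.0):
    """Drop readings more than z_threshold standard deviations from the mean."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return list(samples)
    return [x for x in samples if abs(x - mu) / sigma <= z_threshold]

def moving_average(samples, window=3):
    """Smooth noise with a simple trailing moving average."""
    return [mean(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]

readings = [20.1, 20.3, 19.9, 87.5, 20.2, 20.0, 20.4]  # 87.5 is a bad data point
cleaned = remove_outliers(readings)   # 87.5 is dropped
smoothed = moving_average(cleaned)    # noise is averaged out
```

A z-score test like this works for roughly normal data; for heavily skewed sensor streams, a median-based filter is usually more robust.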
So how do algorithms fit in? An algorithm is a procedure or formula for solving a problem, based on a sequence of specified actions. In mathematics and computer science, an algorithm usually means a small procedure that solves a recurring problem. In simple terms, it is a sequence of simple steps that accomplishes a task.
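To make the definition concrete, here is a minimal example of an algorithm as a fixed sequence of steps solving a recurring problem: finding the highest reading in a list. The function name and data are illustrative.

```python
def highest_reading(readings):
    # Step 1: start with the first reading as the current maximum.
    highest = readings[0]
    # Step 2: compare each remaining reading against the current maximum.
    for value in readings[1:]:
        # Step 3: keep whichever value is larger.
        if value > highest:
            highest = value
    # Step 4: return the result.
    return highest

highest_reading([20.1, 23.7, 19.9, 21.4])  # the steps always yield 23.7
```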
With an algorithm, you know your steps, and those steps operate on data. As you become more familiar with your data, you may want to predict future data points. To apply algorithms in the Internet of Things, you can use machine learning techniques. Machine learning algorithms use computational methods to “learn” information directly from data without assuming a predetermined equation as a model, and they adaptively improve their performance as the number of samples available for learning increases. On that foundation, we can perform classification, regression, and clustering analyses, which are common tasks in IoT.
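Of those tasks, regression is the most direct route to predicting future data points. The sketch below fits a straight line to historical readings with ordinary least squares and extrapolates one step ahead; the hourly temperature values and the choice of a linear model are illustrative assumptions.

```python
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error (least squares)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

hours = [0, 1, 2, 3, 4]
temps = [20.0, 20.5, 21.0, 21.5, 22.0]   # steadily rising readings
slope, intercept = fit_line(hours, temps)
next_temp = slope * 5 + intercept        # predict the reading at hour 5
```

The model "learns" the trend directly from the samples rather than from a predetermined equation, and refitting with more data adaptively updates the prediction.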
Source: MathWorks