Analysing Millions of records in milliseconds — Thanks to ClickHouse
- Amazon keeps updating the record of your purchase from the moment you place the order until it reaches you.
- Your university keeps updating your grades and activities in its database for the three or four years of your bachelor's degree.
Eventually, your present activities turn into your grand past. Accumulated across old database stores for millions of people, this past becomes a huge warehouse of information of strategic importance to organisations: it reveals your academic performance, your purchasing power, the patterns of your thinking, and the habit loops that you and your circumstances unconsciously form over a lifetime.
Specialised neural networks such as RNNs and LSTMs, and state-of-the-art concepts such as attention models and Transformers, are dedicated to extracting meaningful knowledge from such time-stamped, sequential datasets. But before applying any algorithm to this data, it is of utmost importance to manage such a large past, collected from multiple sources, and to query it properly so that meaningful information can be obtained in real or near real time.
Most database management systems are built with transactions in mind: they guarantee the ACID (Atomicity, Consistency, Isolation, Durability) properties of a database and work with a SQL…