First of all, if you are reading this, please share it. It is amazing how many engineers have adopted wrong concepts as if they were correct.
Even Wikipedia gives a completely wrong explanation of the MapReduce pattern, although I'm not sure whether the error comes from the original pattern or the issue simply has never been addressed.
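As a point of reference for that discussion, here is a minimal word-count sketch of the pattern as it is usually presented: map emits key/value pairs, a shuffle step groups them by key, and reduce aggregates each group. The function names and the in-memory shuffle are illustrative only; real frameworks distribute these phases across many machines.

```python
# Minimal sketch of the MapReduce pattern (word count), single process.
from collections import defaultdict

def map_phase(document):
    # map: emit (key, value) pairs independently for each input record
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    # shuffle: group all emitted values by key (a framework does this for you)
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # reduce: aggregate the values of one key into a single result
    return key, sum(values)

documents = ["the quick brown fox", "the lazy dog"]
pairs = (pair for doc in documents for pair in map_phase(doc))
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'the': 2, 'quick': 1, ...}
```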
We are at the beginning of what is already a productive, labor, and social revolution, one with very positive aspects, but also aspects that might not be so positive.
It is quite important that Big Data specialists not only know all the technical aspects but are also aware of the social implications. That is what politicizing Big Data algorithms means: it has nothing to do with politics, but with discussing social implications and risks.
In these discussions, we should ask who wins and who loses with a given algorithm implementation. For example, let's suppose that we have an…
As you may already know, Big Data has been described by several characteristics, known as the V's of Big Data. Some of them are the Volume of data, the Velocity at which data is acquired, the Variety of data, the Veracity of data, and the Value data can bring.
Experiments in scientific fields such as physics, biology, astrophysics, astronomy, and geology generate huge amounts of data. These volumes are already pushing the limits of technology, as is the case at the particle accelerator laboratory CERN in Geneva.
Over the decades, scientists have been improving their experiments, and they accumulate…
The more you give, the more you get. Nothing could be truer than when we share what we know.
One studying technique I loved to use at university to fully understand a topic was simply to explain to someone else what I was trying to learn. This technique not only helps others, but mainly helps the one giving the explanation. It leads you to answer questions about the topic that you hadn't thought of before. …
In many articles we can find a definition of big data, but now let's put big data in perspective. Before big data, what we used to do was implement business intelligence models. In short, business intelligence allows us to create conceptual models that help us analyze historical and current data, and it may also create statistical models to predict the future. Big data, on the other hand, does not stop at conceptual models: it focuses on the creation of mathematical models, and it is able to predict future behavior better, answer known and unknown questions, and make decisions in real time.
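To make the contrast concrete, here is a toy sketch (with made-up numbers) of the difference: a business-intelligence-style report summarizes historical data, while even the simplest mathematical model extrapolates future behavior from that same data.

```python
# Toy contrast between describing the past and modeling the future.
import numpy as np

monthly_sales = np.array([100, 110, 125, 140, 150, 165])  # invented history

# business-intelligence style: aggregate and describe historical data
print("average monthly sales:", monthly_sales.mean())

# big-data style (in miniature): fit a mathematical model and predict ahead
months = np.arange(len(monthly_sales))
slope, intercept = np.polyfit(months, monthly_sales, 1)
print("predicted next month:", slope * len(monthly_sales) + intercept)
```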
The digitization of currency and payments is inevitable, and Central Banks will play an important role in driving that innovation. This post is not about the Great Reset, which everyone knows is coming; it is about the incorporation of cryptocurrencies into the world economy.
Many things happened in 2020, and one of the most important in the financial world was the normalization of cryptocurrencies such as Bitcoin and other blockchain-based assets. As a result, 2020 ended with rapidly accelerating cryptocurrency adoption and record prices.
Everywhere we look there are Apache Kafka tutorials about reading and writing topics, stream data processing, and so on. Many of these tutorials seem to be based on the Object-Oriented Paradigm (OOP). Before going further, we should ask: what is Kafka for? Put flatly, Kafka is used for real-time data streaming solutions. The question, then, is whether it is optimal to model a real-time data streaming solution using OOP. I don't think so.
In data processing, a stream is a flow of data. Video and audio playback and stock market data are typical examples of streams…
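To illustrate the point, here is a minimal sketch of a stream modeled as a flow of data and processed by composing functions rather than class hierarchies. It uses plain Python generators as a stand-in; it is not tied to any Kafka API, and the quote source is invented.

```python
# A stream as an unbounded flow of events, shaped by composed functions.
import itertools
import random

def quotes():
    # unbounded source: an infinite flow of (symbol, price) events
    while True:
        yield ("ACME", round(random.uniform(90, 110), 2))

def over_threshold(stream, threshold):
    # stateless transformation: let events through as they flow past
    return (event for event in stream if event[1] > threshold)

def alerts(stream):
    # another transformation, composed on top of the previous one
    return (f"ALERT {symbol} at {price}" for symbol, price in stream)

# compose the pipeline, then pull a few events from the infinite stream
pipeline = alerts(over_threshold(quotes(), 105.0))
for line in itertools.islice(pipeline, 3):
    print(line)
```

Notice that no object holds the data; each function only describes how events are transformed as they flow through, which is the mindset the post argues for.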
One of the great breakthroughs of blockchain technology is that it allows digital items to take on certain physical properties, such as irreplicability or the ability to exist in only one place at a time. This means we can now create virtual objects that can be transacted digitally while preserving the value of their physical form. Such objects are known as tokens.
For users, a token is simply a container of data whose chain of ownership is tracked by a distributed ledger. Tokens have existed in traditional finance for some time: credit card issuers, for example, use tokens…
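As a rough illustration of that idea, here is a minimal sketch of a token as a container of data whose chain of ownership is recorded in a ledger. The Token class and the in-memory ledger list are hypothetical; a real distributed ledger replicates these records across many nodes.

```python
# Hypothetical token: data plus a tracked chain of ownership.
from dataclasses import dataclass, field

@dataclass
class Token:
    token_id: str
    payload: dict                                  # the data the token carries
    owner: str
    history: list = field(default_factory=list)    # chain of ownership

def transfer(token, new_owner, ledger):
    # record the change of ownership on the token and on the ledger
    token.history.append((token.owner, new_owner))
    ledger.append({"token": token.token_id, "from": token.owner, "to": new_owner})
    token.owner = new_owner

ledger = []  # in-memory stand-in for a distributed ledger
t = Token("T-1", {"asset": "concert ticket"}, owner="alice")
transfer(t, "bob", ledger)
print(t.owner, t.history, ledger)
```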
Distributed Ledger Technology (DLT) holds the promise of redefining existing industries, inventing new business models, and enabling new markets through disruptive innovation.
There is a great deal of confusion about what DLT entails. Blockchain is merely one manifestation of distributed ledger technology. Many people equate blockchain technology with Bitcoin; however, blockchain has many use cases beyond digital money. Read on to learn what DLT is, how it works, why it is important, and how financial services can use it.
No universal definition of DLT has yet gained widespread acceptance within the tech community. To keep it simple, we can…
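Without preempting any formal definition, here is a minimal sketch of the blockchain manifestation of DLT: a ledger whose records are linked by hashes, so that tampering with any record breaks the chain. It is a single-process illustration; a real DLT replicates and validates the chain across a network of nodes.

```python
# Hash-linked ledger records: the blockchain flavor of DLT, in miniature.
import hashlib
import json

def make_block(data, prev_hash):
    # each block commits to its data and to the hash of the previous block
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    # the ledger is consistent only if every link matches its predecessor
    for prev, block in zip(chain, chain[1:]):
        if block["prev"] != prev["hash"]:
            return False
    return True

genesis = make_block("genesis", prev_hash="0" * 64)
chain = [genesis, make_block("alice pays bob 10", genesis["hash"])]
print(verify(chain))   # True
chain[0]["hash"] = "tampered"
print(verify(chain))   # False: the broken link is detected
```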