What is the definition of real time here?
The engineering definition of real time is, roughly, fast enough to be
interactive. I would put a stronger definition on it: in a real-time
application, there is no such thing as an answer that is late but still
correct. Timeliness is part of the application; if we get the right answer
too slowly, it becomes useless or simply wrong. We also need to be aware
that latency trades off against throughput, so it all depends on what you
want to do with these artifacts for your needs.
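To make the latency/throughput trade-off concrete, here is a minimal
back-of-the-envelope sketch. All numbers are hypothetical (a fixed per-batch
overhead and per-message cost, chosen only for illustration): larger batches
amortise the overhead and raise throughput, but messages sit longer waiting
for the batch to fill.

```python
# Illustrative arithmetic only -- hypothetical costs, not benchmarks.
# A consumer pays a fixed per-batch overhead, so bigger batches raise
# throughput but add queueing delay while each batch fills.

def batch_stats(batch_size, arrival_rate_per_s=10_000,
                per_batch_overhead_s=0.005, per_msg_cost_s=0.0001):
    """Return (throughput in msgs/s, average added latency in seconds)."""
    service_time = per_batch_overhead_s + batch_size * per_msg_cost_s
    throughput = batch_size / service_time
    # On average a message waits half the batch-fill interval,
    # then the whole batch is processed.
    fill_wait = batch_size / arrival_rate_per_s / 2
    return throughput, fill_wait + service_time

for b in (1, 100, 1000):
    tput, lat = batch_stats(b)
    print(f"batch={b:5d}  throughput={tput:10.0f} msg/s  "
          f"latency={lat * 1000:7.2f} ms")
```

Running it shows throughput and latency climbing together as the batch
grows, which is exactly the trade you make when you tune batching in any
queue or stream processor.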
Also, within a larger architecture, latency is often dictated by the lowest
common denominator, which frequently does not meet our definition of low
latency. For example, Kafka as widely deployed in Big Data architectures
today is micro-batch. A moderate-latency message queue (Kafka) plus a
low-latency processor still equals a moderate-latency architecture. Hence,
a low-latency architecture must be treated within that context.
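The point above can be shown with simple addition. The stage figures below
are made up for illustration (a micro-batch interval of 500 ms is assumed,
not measured): end-to-end latency is bounded below by the sum of the
stages, so one moderate-latency stage dominates no matter how fast the
processor is.

```python
# Sketch with hypothetical stage latencies: the slowest stage sets the
# floor for the whole pipeline.

stages_ms = {
    "ingest (micro-batch queue)": 500.0,  # assumed micro-batch interval
    "stream processor": 5.0,
    "sink / write": 10.0,
}

end_to_end_ms = sum(stages_ms.values())
bottleneck = max(stages_ms, key=stages_ms.get)

print(f"end-to-end latency >= {end_to_end_ms} ms")
print(f"bottleneck: {bottleneck} "
      f"({stages_ms[bottleneck] / end_to_end_ms:.0%} of the total)")
```

With these numbers the queue accounts for nearly all of the end-to-end
latency, so shaving milliseconds off the processor changes nothing.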
Have a look at this article of mine: https://www.linkedin.com/pulse/real-time-processing-trade-data-kafka-flume-spark-talebzadeh-ph-d-/
Dr Mich Talebzadeh
LinkedIn * https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
On Wed, 21 Aug 2019 at 09:53, Eliza <[EMAIL PROTECTED]> wrote: