On Thursday, February 2, 2023, the tenth Bug Future Show will take place. For the first time it will feature a third parallel track called “.debug Future Show”, aimed primarily at IT professionals, where, among other things, you can catch my talk “Real-time Deep Learning”, starting at 11:15.


Here is a short description of the presentation, which you can also find at the link below:

https://bfs.bug.hr/


Machine learning, and its subset deep learning, shows its real value only when it is triggered automatically by some event (in a so-called event-driven architecture) and executed in real time.
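As a minimal sketch of what “triggered by an event” means in practice: each incoming event is scored the moment it arrives, rather than in a nightly batch. The event shape and the stand-in linear scorer below are illustrative assumptions, not part of the talk.

```python
import queue

def score(features):
    # Hypothetical stand-in for a trained model: a fixed linear scorer.
    weights = [0.4, 0.6]
    return sum(w * f for w, f in zip(weights, features))

def run_inference_loop(events, threshold=0.5):
    """Consume events from a queue and score each one as it arrives."""
    alerts = []
    while not events.empty():
        event = events.get()
        if score(event["features"]) > threshold:
            alerts.append(event["id"])
    return alerts

events = queue.Queue()
events.put({"id": "tx-1", "features": [0.9, 0.8]})  # score 0.84
events.put({"id": "tx-2", "features": [0.1, 0.2]})  # score 0.16
print(run_inference_loop(events))  # ['tx-1']
```

In a real system the queue would be a message broker (Kafka, RabbitMQ, and similar) and the scorer a deployed neural network, but the control flow is the same: no scheduler, no batch window, just a reaction to each event.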

To get the maximum benefit from applying ML this way, two more important prerequisites must be met.

The first is that machine-learning algorithms are applied to current data, not to data warehouses and data lakes, where data can lag by several days. In practice this means achieving real-time data integration.
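One common way to achieve this is to apply a stream of change events to an in-memory, always-current view of the data, instead of waiting for a periodic batch load. The event tuples below are an illustrative assumption; real change-data-capture feeds (e.g. Debezium) carry richer payloads, but the idea is the same.

```python
def apply_change(state, event):
    """Apply one change event to the current-state view, in arrival order."""
    op, key, value = event
    if op == "upsert":
        state[key] = value      # newest value is visible immediately
    elif op == "delete":
        state.pop(key, None)
    return state

changes = [
    ("upsert", "cust-7", {"balance": 120}),
    ("upsert", "cust-7", {"balance": 95}),   # later event overrides earlier one
    ("delete", "cust-3", None),
]

state = {"cust-3": {"balance": 10}}
for ev in changes:
    apply_change(state, ev)

print(state)  # {'cust-7': {'balance': 95}}
```

The point of the sketch: the model always sees data that is seconds old at most, because every change flows through as it happens rather than sitting in a staging area until the next load.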

The second is fast retrieval of the necessary data (“features” in ML terminology) from external systems, which comes down to the solution architecture, the chosen software, and performance tuning.
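A simple illustration of why retrieval speed dominates: if every scoring call pays a remote round trip to fetch features, the network, not the model, sets the ceiling. The sketch below, with an assumed remote lookup simulated by a sleep, shows the usual first remedy of caching hot keys in process memory.

```python
import functools
import time

def fetch_features_remote(customer_id):
    # Hypothetical call to an external system; the sleep simulates
    # a ~10 ms network round trip.
    time.sleep(0.01)
    return (42.0, 300)  # (avg_spend, account_age_days) -- illustrative values

@functools.lru_cache(maxsize=10_000)
def fetch_features(customer_id):
    # Repeated scoring of the same entity is served from memory.
    return fetch_features_remote(customer_id)

t0 = time.perf_counter()
fetch_features("cust-7")          # cold: pays the round trip
cold = time.perf_counter() - t0

t0 = time.perf_counter()
fetch_features("cust-7")          # warm: served from the in-process cache
warm = time.perf_counter() - t0

print(warm < cold)  # True
```

Caching is only one lever; the talk's framing is broader, covering architecture and tuning choices that keep feature lookups off the critical path.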


In the presentation, we will identify the main obstacles you will encounter when building a solution that must apply a trained neural network to new data hundreds of times per second, starting with enterprise architecture and its impact, through real-time data integration, to the various ways of accelerating the data pipeline.


