Channel: Active questions tagged ruby - Stack Overflow

Ruby + Sidekiq - best way to execute and handle a large volume of API calls


Imagine we have 10k entities-x. For each entity-x we need to make an async API call, and each call returns 100 entities-y, so in total we have 10k * 100 = 1_000_000 entities-y. For each entity-y we then need to make another async API call and collect the result. Question: what is the best way to do this?

For context: my CPU has 16 threads.

My first thought was to split the 10k entities-x across threads, so each thread would handle its own share of entities-x and their entities-y: divide 10k by 16 (the number of threads) and hand each thread a slice. But then I learned that MRI Ruby can execute only one thread at a time (the GVL), although Sidekiq does run jobs concurrently on separate threads.
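One nuance worth noting: because these are network calls, MRI releases the GVL while a thread is blocked on I/O, so a plain thread pool can still overlap the API calls even on one CPU core. A minimal sketch of that idea, where `fetch_entities_y` is a hypothetical stand-in for the real API call:

```ruby
# Stand-in for the first API call (assumption: returns the entities-y
# for one entity-x). A real call would do HTTP here, during which MRI
# releases the GVL and other threads keep running.
def fetch_entities_y(entity_x)
  Array.new(3) { |i| "#{entity_x}-y#{i}" }
end

# Fan a list of entities-x out over a fixed pool of threads.
def process_in_threads(entities_x, thread_count: 16)
  work    = Queue.new
  results = Queue.new
  entities_x.each { |e| work << e }

  workers = Array.new(thread_count) do
    Thread.new do
      loop do
        entity = begin
          work.pop(true)       # non-blocking pop ...
        rescue ThreadError
          break                # ... raises when the queue is empty, so stop
        end
        fetch_entities_y(entity).each { |y| results << y }
      end
    end
  end
  workers.each(&:join)

  Array.new(results.size) { results.pop }
end
```

With 3 entities-x and 3 entities-y each, `process_in_threads(%w[a b c], thread_count: 2)` yields 9 results. The pool size is a tuning knob: for I/O-bound work it can safely exceed the number of CPU threads.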

My next thought was to split the entities-x across Sidekiq jobs instead: again divide 10k by 16 (the number of threads) and give each job a slice. But I'm not sure about that. In theory we can enqueue many more jobs than there are threads, and I don't know whether that would be more efficient or not. What do you think?
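For reference, enqueueing far more jobs than threads is the normal Sidekiq pattern: jobs wait in Redis and the worker threads drain the queue, so small batches keep retries cheap and the load evenly spread. A sketch of that fan-out, where the job names and the batch size of 100 are illustrative, and a tiny inline stub replaces Sidekiq so the sketch runs standalone (in a real app you would `require "sidekiq"` and delete the stub):

```ruby
# --- stub so this sketch runs without the sidekiq gem installed ---
unless defined?(Sidekiq)
  module Sidekiq
    module Job
      def self.included(base)
        base.extend(ClassMethods)
      end
      module ClassMethods
        # Stub: run inline; real Sidekiq enqueues to Redis instead.
        def perform_async(*args)
          new.perform(*args)
        end
      end
    end
  end
end
# ------------------------------------------------------------------

PROCESSED_Y = [] # stand-in for wherever the real results would go

# Second-level job: one API call per entity-y.
class EntityYJob
  include Sidekiq::Job
  def perform(entity_y_id)
    # Placeholder for the second API call.
    PROCESSED_Y << entity_y_id
  end
end

# First-level job: handles a batch of entities-x and fans out entity-y jobs.
class EntityXBatchJob
  include Sidekiq::Job
  BATCH = 100 # illustrative; tune against queue latency and retry cost

  def perform(entity_x_ids)
    entity_x_ids.each do |x_id|
      # Placeholder: the first API call returns the entity-y ids (2 here,
      # 100 in the question's scenario).
      entity_y_ids = Array.new(2) { |i| "#{x_id}-y#{i}" }
      entity_y_ids.each { |y_id| EntityYJob.perform_async(y_id) }
    end
  end
end

# 10k entity-x ids in slices of 100 => 100 batch jobs for the threads to drain.
def enqueue_all(entity_x_ids)
  entity_x_ids.each_slice(EntityXBatchJob::BATCH) do |slice|
    EntityXBatchJob.perform_async(slice)
  end
end
```

One caveat with the real gem: `perform_async` arguments must be simple, JSON-serializable values (ids, not objects), which is why the sketch passes string ids around.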
