Hi,
I was reading about this gem: https://github.com/brandonhilkert/sucker_punch#usage. To understand its synchronous way of processing jobs, I wrote some sample code.
###code########
require 'sucker_punch'

class SpeakerJob
  include SuckerPunch::Job

  def perform(person)
    Person.new(person).prepare
  end
end

class Person
  def initialize(n)
    @n = n
  end

  def prepare
    task1
    task2
    task3
    task4
  end

  def task1; sleep(10); p "#{@n}-T1"; end
  def task2; sleep(15); p "#{@n}-T2"; end
  def task3; p "#{@n}-T3"; end
  def task4; sleep(5); p "#{@n}-T4"; end
end

# 50.times { |n| SpeakerJob.new.perform n }
threads = []
10.times do |i|
  threads << Thread.new(i) { |t|
    SpeakerJob.new.perform t
  }
end
threads.each(&:join)
######################
Output is:
[arup@Ruby]$ ruby a.rb
"1-T1"
"4-T1"
"2-T1"
"8-T1"
"7-T1"
"3-T1"
"5-T1"
"0-T1"
"6-T1"
"9-T1"
"1-T2"
"1-T3"
"4-T2"
"4-T3"
2-T2""8-T2"
7-T2"
"2-T3"
"7-T3"
"8-T3"
"3-T2"
"3-T3"
"5-T2"
"5-T3"
"0-T2"
"0-T3"
"6-T2"
"6-T3"
"9-T2"
"9-T3"
"1-T4"
"4-T4"
"8-T4"
"2-T4"
"3-T4"
"0-T4"
6-T4""9-T4""7-T4""5-T4"
And after looking at the output, it seems sucker_punch doesn't work synchronously with multiple threads. But is it possible to make the tasks synchronous using this gem? I have no problem with which Person object a worker picks up. Suppose I have jobs o1 to oN in the queue and I want to process each job synchronously, not like the output I have shown: once any job oi has started, no other job should start until it has finished; the others should wait.
--
Regards,
Arup Rakshit
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.
--Brian Kernighan
You can synchronize job processing with a Mutex, but that breaks down
with multiple processes or multiple computers. You could use a locking
service like Zookeeper too.
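A minimal sketch of that Mutex idea within one process, reusing the Person class from the original post (SerialSpeakerJob and LOCK are just illustrative names, not sucker_punch API):

require 'sucker_punch'

class SerialSpeakerJob
  include SuckerPunch::Job

  # One lock shared by every worker in this process.
  LOCK = Mutex.new

  def perform(person)
    # Workers still run concurrently, but only one job body at a time
    # gets past the lock.
    LOCK.synchronize do
      Person.new(person).prepare
    end
  end
end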
Why, though? The point of background workers is to reduce user-facing latency through background processing. Slowing them down means longer queues: while apparent latency goes down, the latency between job initiation and execution will be much, much worse, and harder to fix.
On May 24, 2015, at 16:08, Arup Rakshit <aruprakshit@rocketmail.com> wrote:
And after looking at the output, it seems sucker_punch doesn't work synchronously with multiple threads. But is it possible to make the tasks synchronous using this gem? I have no problem with which Person object a worker picks up. Suppose I have jobs o1 to oN in the queue and I want to process each job synchronously, not like the output I have shown: once any job oi has started, no other job should start until it has finished; the others should wait.
Your test code is a bit incorrect. You are calling the `perform` method, which processes the job immediately; that is equivalent to simply calling an ordinary method on a class. All you are doing here is running some code concurrently in manually created threads.
Just use `async.perform` every time; sucker_punch will then execute your code concurrently in its own worker pool. `perform` is really only useful in unit test code.
If you want 'synchronous' processing, simply set the workers count to 1. But it's not scalable, as you will not get the benefit of multithreaded/multi-process processing. You should rethink your system so that it does not depend on the order of processing.
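For reference, a minimal sketch of that usage against the Celluloid-based sucker_punch 1.x API this thread appears to be using:

require 'sucker_punch'

class SpeakerJob
  include SuckerPunch::Job

  def perform(person)
    Person.new(person).prepare
  end
end

# Hand the jobs to sucker_punch's own worker pool instead of creating
# threads by hand; each async.perform call returns immediately.
10.times { |n| SpeakerJob.new.async.perform(n) }

# Note: in a standalone script the process can exit before the pool has
# drained, so you would need to keep it alive (e.g. with a sleep); in a
# long-running app that is not an issue.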
24.05.2015, 23:09, “Arup Rakshit” aruprakshit@rocketmail.com:
# 50.times { |n| SpeakerJob.new.perform n }
threads = []
10.times do |i|
  threads << Thread.new(i) { |t|
    SpeakerJob.new.perform t
  }
end
'Synchronously' here means synchronous relative to the code calling `perform`. It means that the job is processed right now, not in the background, so all of your code after a `perform` call can assume that the job has been processed.
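A tiny sketch of the difference, reusing the SpeakerJob class from the original post:

job = SpeakerJob.new

job.perform(1)        # runs inline in the calling thread and blocks
                      # until Person#prepare has finished
# code here can safely assume job 1 has been processed

job.async.perform(2)  # hands the job to the worker pool and returns
                      # immediately; job 2 runs in the background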
24.05.2015, 23:09, “Arup Rakshit” aruprakshit@rocketmail.com:
And after looking at the output, it seems sucker_punch doesn't work synchronously with multiple threads. But is it possible to make the tasks synchronous using this gem? I have no problem with which Person object a worker picks up. Suppose I have jobs o1 to oN in the queue and I want to process each job synchronously, not like the output I have shown: once any job oi has started, no other job should start until it has finished; the others should wait.
I tried that before, and it was throwing an error:
On Monday, May 25, 2015 01:18:19 PM Vladimir Kochnev wrote:
24.05.2015, 23:09, "Arup Rakshit" <aruprakshit@rocketmail.com>:
# 50.times { |n| SpeakerJob.new.perform n }
threads = []
10.times do |i|
  threads << Thread.new(i) { |t|
    SpeakerJob.new.perform t
  }
end
Your test code is a bit incorrect. You are calling the `perform` method, which processes the job immediately; that is equivalent to simply calling an ordinary method on a class. All you are doing here is running some code concurrently in manually created threads.
Just use `async.perform` every time; sucker_punch will then execute your code concurrently in its own worker pool. `perform` is really only useful in unit test code.
If you want 'synchronous' processing, simply set the workers count to 1.
########------------------
require 'sucker_punch'

class SpeakerJob
  include SuckerPunch::Job
  workers 1

  def perform(person)
    Person.new(person).prepare
  end
end

class Person
  def initialize(n)
    @n = n
  end

  def prepare
    task1
    task2
    task3
    task4
  end

  def task1; sleep(10); p "#{@n}-T1"; end
  def task2; sleep(15); p "#{@n}-T2"; end
  def task3; p "#{@n}-T3"; end
  def task4; sleep(5); p "#{@n}-T4"; end
end

# 50.times { |n| SpeakerJob.new.perform n }
threads = []
10.times do |i|
  threads << Thread.new(i) { |t|
    SpeakerJob.new.perform t
  }
end
threads.each(&:join)
#########--------
[arup@Ruby]$ ruby a.rb
E, [2015-05-25T21:39:40.092508 #4269] ERROR -- : Actor crashed!
ArgumentError: minimum pool size is 2
/home/arup/.rvm/gems/ruby-2.2.0/gems/celluloid-0.16.0/lib/celluloid/pool_manager.rb:14:in `initialize'
----------
I checked the code: https://github.com/brandonhilkert/sucker_punch/blob/master/lib/sucker_punch/queue.rb#L38 ... but that code doesn't show where the error is coming from.
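The trace does give a hint, though: the pool is a Celluloid pool, and Celluloid's pool manager raises that ArgumentError for any pool smaller than 2, so `workers 1` apparently cannot work with this version of the gem. If one-job-at-a-time processing is still the goal, one possible (untested) sketch is to keep the pool at the minimum of 2 and serialize with a shared Mutex, as suggested earlier in the thread:

require 'sucker_punch'

class SpeakerJob
  include SuckerPunch::Job
  workers 2            # Celluloid's minimum pool size

  # Shared lock: even with two workers, only one job body runs at a time.
  LOCK = Mutex.new

  def perform(person)
    LOCK.synchronize { Person.new(person).prepare }
  end
end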
But it's not scalable, as you will not get the benefit of multithreaded/multi-process processing. You should rethink your system so that it does not depend on the order of processing.
--
Regards,
Arup Rakshit
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.
--Brian Kernighan