Just wondering if there is something simple already built in the std
library to remove duplicates from an array (or an enumerable). I've
seen and used various approaches, like:
module Enumerable
  def dups
    inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
  end
end
a = [1,2,3,4,5,4,2,2]
p a.inject([[], a[1..-1]]) {|r,e| r[1].include?(e) ? [r[0] << e, r[1][1..-1]] : [r[0], r[1][1..-1]]}[0].uniq
# => [2, 4]
b = %w(a b c c)
p b.inject([[], b[1..-1]]) {|r,e| r[1].include?(e) ? [r[0] << e, r[1][1..-1]] : [r[0], r[1][1..-1]]}[0].uniq
# => ["c"]
Score one for me :-))
~ Ari
English is like a pseudo-random number generator - there are a bajillion rules to it, but nobody cares.
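For anyone decoding that one-liner: the accumulator is a pair of [duplicates found so far, elements still ahead of the current one]. Here is a minimal rewrite with named variables; the method name scan_dups is purely illustrative and not from the post above.

def scan_dups(ary)
  dups = []
  rest = ary[1..-1] || []           # everything after the current element
  ary.each do |e|
    dups << e if rest.include?(e)   # e is a duplicate if it appears again later
    rest = rest[1..-1] || []        # slide the window forward
  end
  dups.uniq
end

p scan_dups([1, 2, 3, 4, 5, 4, 2, 2])  # => [2, 4]
p scan_dups(%w(a b c c))               # => ["c"]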
···
On Aug 19, 2007, at 6:39 AM, Thibaut Barrère wrote:
Hi!
Just wondering if there is something simple already built in the std
library to remove duplicates from an array (or an enumerable). I've
seen and used various approaches, like:
module Enumerable
  def dups
    inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
  end
end
Here's a modification of a technique used by
Simon Kroger:
class Array
  def dups
    values_at(*((0...size).to_a - uniq.map{|x| index(x)}))
  end
end
==>nil
%w(a b a c c d).dups
==>["a", "c"]
···
On Aug 19, 5:38 am, Thibaut Barrère <thibaut.barr...@gmail.com> wrote:
Hi!
Just wondering if there is something simple already built in the std
library to remove duplicates from an array (or an enumerable). I've
seen and used various approaches, like:
module Enumerable
  def dups
    inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
  end
end
Actually, you are not deleting duplicates as far as I can see. Here's another one:
irb(main):012:0> a.inject(Hash.new(0)) {|h,x| h[x] += 1; h}.inject([]) {|h,(k,v)| h << k if v > 1; h}
=> ["c"]
You could even change that to need just one iteration through the original array, but it's too late and I'm too lazy.
Kind regards
robert
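A minimal sketch of the single-pass variant hinted at above, working for any Enumerable; the method name dups_one_pass and the hashes used as sets are just an illustration, not Robert's actual one-liner.

def dups_one_pass(enum)
  seen = {}
  dups = {}
  enum.each do |e|
    dups[e] = true if seen.key?(e)  # second and later sightings mark e as a duplicate
    seen[e] = true
  end
  dups.keys
end

p dups_one_pass(%w(a b c c))                    # => ["c"]
p dups_one_pass([1, 2, 3, 4, 5, 4, 2, 2]).sort  # => [2, 4]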
···
On 19.08.2007 12:38, Thibaut Barrère wrote:
Hi!
Just wondering if there is something simple already built in the std
library to remove duplicates from an array (or an enumerable). I've
seen and used various approaches, like:
module Enumerable
  def dups
    inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
  end
end
I just thought I would put in my 2 cents. I actually had to write a script that would run through a file, find all the duplicate account numbers along with the number of times each was duplicated, and write that to a new file.
a = [1,2,3,4,5,4,2,2]
p a.inject([[], a[1..-1]]) {|r,e| r[1].include?(e) ? [r[0] << e, r[1][1..-1]] : [r[0], r[1][1..-1]]}[0].uniq
# => [2, 4]
b = %w(a b c c)
p b.inject([[], b[1..-1]]) {|r,e| r[1].include?(e) ? [r[0] << e, r[1][1..-1]] : [r[0], r[1][1..-1]]}[0].uniq
# => ["c"]
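For the file use case described above, a rough sketch of the counting approach; the file names and the one-account-number-per-line format are assumptions for illustration, not details from the original script.

# Count how often each account number occurs, then report only the repeats.
counts = Hash.new(0)
File.foreach("accounts.txt") { |line| counts[line.strip] += 1 }

File.open("duplicate_accounts.txt", "w") do |out|
  counts.each do |account, count|
    out.puts "#{account}\t#{count}" if count > 1
  end
end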
Just wondering if there is something simple already built in the std
library to remove duplicates from an array (or an enumerable). I've
seen and used various approaches, like:
module Enumerable
  def dups
    inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
  end
end
which will give:
%w(a b c c).dups
=> ["c"]
Anything more elegant ?
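On the "anything more elegant" question: if Enumerable#group_by is available (Ruby 1.8.7+/1.9, or via a library such as Facets - so this is not a claim about the 1.8.6 standard library), a fairly compact sketch is:

module Enumerable
  def dups
    # Group equal elements, keep one representative of every group larger than one.
    group_by {|e| e }.select {|_value, group| group.size > 1 }.map {|value, _group| value }
  end
end

p %w(a b c c).dups  # => ["c"]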
Couldn't you also just do a union with itself?
a = %w(a b c b a)
b = a & a #=> ["a", "b", "c"]
Score one for me :-))
I think that just reinvents uniq (see my previous reinvention).
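A quick check of that observation - intersecting an array with itself just collapses repeats, the same as uniq, and does not report which elements were duplicated:

a = %w(a b c b a)
p a & a    # => ["a", "b", "c"]
p a.uniq   # => ["a", "b", "c"]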
For what it's worth, here's a nice-looking but probably very
inefficient version:
module ArrayStuff
  def count(e)
    select {|f| f == e }.size
  end

  def dups
    select {|e| count(e) > 1 }.uniq
  end
end
a = [1,2,3,3,4,5,2].extend(ArrayStuff)
p a.dups # [2,3]
David
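One way to put a number on "probably very inefficient" - a rough benchmark sketch comparing the select/count version with a hash-counting one. The sizes, the module name, and the rename of count to count_of (to avoid clashing with the Array#count that later Rubies ship) are all just for illustration.

require 'benchmark'

module DupFinders
  def count_of(e)
    select { |f| f == e }.size
  end

  def dups_quadratic
    select { |e| count_of(e) > 1 }.uniq   # one full scan per element: roughly O(n^2)
  end

  def dups_hashed
    counts = Hash.new(0)
    each { |e| counts[e] += 1 }
    select { |e| counts[e] > 1 }.uniq     # two linear passes: roughly O(n)
  end
end

a = Array.new(5_000) { rand(1_000) }.extend(DupFinders)

Benchmark.bm(15) do |bm|
  bm.report("select/count") { a.dups_quadratic }
  bm.report("hash count")   { a.dups_hashed }
end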
···
On Sun, 19 Aug 2007, Ari Brown wrote:
On Aug 19, 2007, at 6:39 AM, Thibaut Barrère wrote:
Does everyone agree that #dups is the best name for this? I recently
added this to Facets as #duplicates to avoid proximity to #dup. Is
that reasonable?
(Facets already had #nonuniq, btw.)
T.
···
On Aug 19, 12:34 pm, William James <w_a_x_...@yahoo.com> wrote:
On Aug 19, 5:38 am, Thibaut Barrère <thibaut.barr...@gmail.com> wrote:
> Hi!
> Just wondering if there is something simple already built in the std
> library to remove duplicates from an array (or an enumerable). I've
> seen and used various approaches, like:
> module Enumerable
>   def dups
>     inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
>   end
> end
> which will give:
> > %w(a b c c).dups
> => ["c"]
> Anything more elegant ?
> cheers
> Thibaut
Here's a modification of a technique used by
Simon Kroger:
class Array
  def dups
    values_at(*((0...size).to_a - uniq.map{|x| index(x)}))
  end
end
==>nil
%w(a b a c c d).dups
==>["a", "c"]
Just wondering if there is something simple already built in the std
library to remove duplicates from an array (or an enumerable). I've
seen and used various approaches, like:
module Enumerable
  def dups
    inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
  end
end
which will give:
%w(a b c c).dups
=> ["c"]
Actually, you are not deleting duplicates as far as I can see.
Did I say it's too late? Man, I should've worn my glasses...
Here's another one:
irb(main):012:0> a.inject(Hash.new(0)) {|h,x| h[x] += 1; h}.inject([]) {|h,(k,v)| h << k if v > 1; h}
=> ["c"]
You could even change that to need just one iteration through the original array, but it's too late and I'm too lazy.
...would work. My solution is not recommended at all - it's Sunday, just after lunch, and I had to choose between doing the dishes and doing something nicer first...
Wolfgang Nádasi-Donner
···
On Sun, 19 Aug 2007, Wolfgang Nádasi-Donner wrote:
> > Just wondering if there is something simple already built in the std
> > library to remove duplicates from an array (or an enumerable). I've
> > seen and used various approaches, like:
On Aug 19, 5:16 pm, Robert Klemme <shortcut...@googlemail.com> wrote:
On 19.08.2007 23:15, Robert Klemme wrote:
> On 19.08.2007 12:38, Thibaut Barrère wrote:
>> Hi!
>> Just wondering if there is something simple already built in the std
>> library to remove duplicates from an array (or an enumerable). I've
>> seen and used various approaches, like:
>> module Enumerable
>>   def dups
>>     inject({}) {|h,v| h[v]=h[v].to_i+1; h}.reject{|k,v| v==1}.keys
>>   end
>> end
>> which will give:
>>> %w(a b c c).dups
>> => ["c"]
> Actually you are not deleting duplicates as far as I can see.
Did I say it's too late? Man, I should've worn my glasses...