I am trying to build a script to convert all .csv files in a folder to
.json.
There are several ways to do this. You could simply use a bash script:
#!/bin/bash
for csv in *.csv; do
  ruby csv_to_json "$csv"
done
in which case, you'd need to modify your script to pick up the file from ARGV:
csvfilename = ARGV.shift
jsonfilename = csvfilename.sub(/\.csv$/i, '.json')
jsonFile = File.open(jsonfilename, 'w')
…
FasterCSV.foreach(csvfilename,…
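Fleshed out, that single-file version might look something like this (untested sketch; I'm assuming the same column layout as in your script, with name/length/lat/long in columns 2 through 5):
#!/usr/local/bin/ruby
require "rubygems"
require "fastercsv"
require "json"

# Take one CSV filename from the command line and derive the JSON name from it.
csvfilename  = ARGV.shift
jsonfilename = csvfilename.sub(/\.csv$/i, '.json')

jsonData = {}
FasterCSV.foreach(csvfilename, :headers => true, :header_converters => :symbol) do |row|
  jsonData[row[2]] = {
    :name   => row[2],
    :length => row[3],
    :lat    => row[4],
    :long   => row[5],
  }
end

# Write the whole hash once, after the loop.
File.open(jsonfilename, 'w') { |f| f.write(JSON.pretty_generate(jsonData)) }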
You could also skip the bash loop and pass all the .csv files to the script on the command line:
ruby csv_to_json *.csv
in which case, your script will become:
while (csvfilename = ARGV.shift) do
jsonfilename = csvfilename.sub(/\.csv$/i,'.json')
…. # and so on, inside the while loop
end
Or pull everything inside the Ruby script:
csvfiles = Dir['*.csv']
while (csvfilename = csvfiles.shift) do
jsonfilename … # like above
end
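Putting that together, the whole-folder version could be as simple as this (again untested, same column assumptions as above):
#!/usr/local/bin/ruby
require "rubygems"
require "fastercsv"
require "json"

# Convert every .csv in the current directory to a matching .json file.
Dir['*.csv'].each do |csvfilename|
  jsonfilename = csvfilename.sub(/\.csv$/i, '.json')
  jsonData = {}
  FasterCSV.foreach(csvfilename, :headers => true, :header_converters => :symbol) do |row|
    jsonData[row[2]] = {
      :name   => row[2],
      :length => row[3],
      :lat    => row[4],
      :long   => row[5],
    }
  end
  File.open(jsonfilename, 'w') { |f| f.write(JSON.pretty_generate(jsonData)) }
end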
I have a script where I use FasterCSV to convert
individual CSVs to JSON, i.e. like this:
Code:
#!/usr/local/bin/ruby
require "rubygems"
require "fastercsv"
require "json"
jsonFile = File.open("some.json", 'w')
jsonData = {}
FasterCSV.foreach("some.csv",
    :headers => true, :header_converters => :symbol) do |row|
  # puts "Row is #{row}"
  jsonData[somename] = {   # somename stands in for whatever key I use per row
    :name => row[2],
    :length => row[3],
    :lat => row[4],
    :long => row[5],
  }
  # puts "data is #{jsonData[jsonData.length-1]}"
  jsonFile.write(JSON.pretty_generate(jsonData))
end
jsonFile.close
But how can I change this to search all .csv files in a folder and
generate the corresponding JSON files?
2. One other thing I am stuck with is how to generate nested JSON, i.e.
currently my JSON looks like this:
Code:
{
"NAME1": {
"LAT": "37.847048",
"Name": "SDFSDG",
"length": "0.03",
"LON": "-123.3433"
}
}{
"NAME1": {
"LAT": "37.847048",
"Name": "SDFSDFSDAFDG",
"length": "0.03",
"LON": "-123.32334433"
}
}
One issue with the above is that NAME1 is redundant. So I want to add it to
the top and then have 2 separate fields with other data like "lat",
"lon", "length".
Why not just make them elements in a JSON array:
[
  {
    "LAT": "37.847048",
    "Name": "SDFSDG",
    "length": "0.03",
    "LON": "-123.3433"
  },
  {
    "LAT": "37.847048",
    "Name": "SDFSDFSDAFDG",
    "length": "0.03",
    "LON": "-123.32334433"
  },
  …
]
code for that:
jsonData = [] # Array instead of hash
FasterCSV.foreach("some.csv", :headers => true, :header_converters => :symbol) do |row|
  jsonData << {
    :name => row[2],
    :length => row[3],
    :lat => row[4],
    :long => row[5],
  }
end
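Either way, only write the file once, after the loop finishes - calling jsonFile.write inside the FasterCSV block is what produces the back-to-back { … }{ … } objects in the output you posted. Something like:
# After the loop above has filled jsonData:
File.open("some.json", 'w') do |jsonFile|
  jsonFile.write(JSON.pretty_generate(jsonData))
end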
Alternatively, you can keep the hash of hashes and do this:
jsonData = {}
FasterCSV.foreach("some.csv", :headers => true, :header_converters => :symbol) do |row|
  jsonData[row[2]] = {
    :name => row[2],
    :length => row[3],
    :lat => row[4],
    :long => row[5],
  }
end
This assumes the "name" column in some.csv is unique.
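With the sample rows from your post, that version would come out roughly like this (one top-level key per name, the other fields nested under it):
{
  "SDFSDG": {
    "name": "SDFSDG",
    "length": "0.03",
    "lat": "37.847048",
    "long": "-123.3433"
  },
  "SDFSDFSDAFDG": {
    "name": "SDFSDFSDAFDG",
    "length": "0.03",
    "lat": "37.847048",
    "long": "-123.32334433"
  }
}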
···
On Sep 16, 2013, at 8:14 PM, Varun Joshi <lists@ruby-forum.com> wrote:
I would really appreciate the help.
--
Posted via http://www.ruby-forum.com/.