Hi lugbz friends,
I have started looking for ways to convert JSON data into CSV format.
For example, to convert the JSON response of the Open Data Hub API listing accommodations:
http://tourism.opendatahub.bz.it/swagger/ui/index#/Accommodation/Accommodati...
Looking for simple shell tools, I came across jq, which is a powerful tool for filtering JSON streams.
http://manpages.ubuntu.com/manpages/bionic/man1/jq.1.html
It is normally used from the shell, but for experimenting there is even a handy website that provides an online version of the tool:
Even though I haven't managed to convert the JSON into CSV yet, I think it is a cool tool and worth sharing with you.
Has anyone on the list already used it?
Best regards, Patrick
On 25 July 2019 19:02:36 CEST, Patrick Ohnewein patrick.ohnewein@lugbz.org wrote:
Hi!
My advice when working with jq: aim to build an array and then pass it to the @csv filter (a builtin since a few versions, AFAIK). You might have to be creative if you want field names in the first row, so I advise doing that as the last step.
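For example, a minimal sketch of that approach (the sample data here is made up, not the real Accommodation response):

```shell
# Made-up input: an array of flat objects.
echo '[{"Id":"1","Name":"Hotel A","Beds":10},
       {"Id":"2","Name":"Hotel B","Beds":25}]' |
jq -r '(.[0] | keys_unsorted) as $cols   # field names taken from the first object
       | $cols, (.[] | [.[$cols[]]])     # header row, then one row per object
       | @csv'
```

@csv quotes strings and leaves numbers bare; the -r flag is needed so the CSV lines come out raw instead of as JSON strings.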
Ciao, Daniele
--
Hi Patrick,
I worked with Python+Pandas about a month ago; one line of code should almost do this job.
Official website: https://pandas.pydata.org/
An idea of the code:
import pandas as pd
# read_json accepts a file path, URL, or file-like object; to_csv writes the CSV
# (index=False drops pandas' row-number column from the output)
pd.read_json(your_file_or_string).to_csv(export_file, index=False)
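Since the thread started from shell tools: the same one-liner can be run straight from the shell (assuming python3 with pandas installed; the file names are placeholders):

```shell
# Convert a JSON file of records into a CSV file in one shell command.
python3 -c 'import pandas as pd, sys; pd.read_json(sys.argv[1]).to_csv(sys.argv[2], index=False)' input.json output.csv
```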
See you,
Marco
On 2019-07-25 19:02, Patrick Ohnewein wrote:
On July 25, 2019 7:31:22 PM GMT+02:00, Marco Marinello mmarinello@fuss.bz.it wrote:
Thank you Marco, will check it out!
Patrick
Hi,
I'm late to chime in...
I think it is important to understand that the *hard* part is extracting/remapping the info you need into something flat. From there, it's easy to store it in rows and columns.
How you do that depends on what you already know.
*If* whoever writes these scripts already knows JavaScript, just coding the script in JavaScript might be the easiest way:
- you can use Node to run it as a traditional script
- I see your API uses bit masks - you can handle those with JavaScript's bitwise operators (I doubt jq allows this - too lazy to look it up)
- if you use .filter(), .map(), .reduce(), etc. with arrow (=>) functions, the code is going to be pretty compact
I don't think JavaScript is a good language for all purposes, but for *this* purpose in particular it looks pretty fit, if not ideal, to me.
If you post what you want to extract from a sample output, I can provide some code as an example. If somebody else does the same with jq we can compare the solutions.
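As a sketch of that flattening step in jq (the nested input below is made up, not the real API shape):

```shell
# Made-up nested record; leaf_paths walks to every scalar leaf.
echo '{"Id":"A1","Detail":{"de":{"Name":"Hotel"}}}' |
jq -c '[leaf_paths as $p
        | {key: ($p | map(tostring) | join(".")),   # dotted path as column name
           value: getpath($p)}]
       | from_entries'
```

This turns a nested object into one flat object with dotted keys, which can then be fed to @csv.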
Bye, Chris.
I have no specific output in mind. Just playing and investigating a little bit.
The dream would be a general converter: a function that takes any JSON as input and produces a flat CSV as output.
Non-technical data consumers of Open Data Hub often ask for a CSV version, because using the JSON API is too complicated for them. In these cases a generic converter would be very helpful.
Patrick
OK,
I understand that people want to see data in spreadsheets.
But you still need to provide some logic (be it code or a jq expression or whatever) to map your *particular* schema into a CSV...
Something general like anyjsondata.toCSV() can be done, but it will likely do some tree traversal that doesn't really give practical output for non-trivial nested data structures such as the response of /api/Accommodation.
Of course, if the JSON is just an array of hashmaps with all scalar values, then anything will work fine :)
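For that easy case, a generic jq one-liner along these lines would do (sample input made up; the objects don't even need identical keys):

```shell
# Made-up input: array of flat objects with not-quite-identical keys.
echo '[{"a":1,"b":2},{"a":3,"c":4}]' |
jq -r '(map(keys) | add | unique) as $cols   # union of all keys, sorted
       | $cols, (.[] | [.[$cols[]]])         # header, then rows (missing -> empty)
       | @csv'
```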
Bye, Chris.
My 2 cents: an automatic conversion tool in the hands of end users, who may have little or no knowledge of the structure, meaning, and trustworthiness of the queried data, looks unsafe to me. CSV output should be arranged and provided by the data owner in parallel with any other data format such as JSON. diego
On Sun, 28 Jul 2019 at 14:47, Chris Mair chris@1006.org wrote: