I would like to write ten (or a billion) events to JSON and save them as files.
I am working in a Databricks notebook in Scala. I want each JSON string to have randomly generated values for fields like "Carbs":
{"Username": "patient1", "Carbs": 92, "Bolus": 24, "Basal": 1.33, "Date": "2017-06-28", "Timestamp": "2017-06-28 21:59:..."}
I successfully used the following to write the date to an Array() and then save it as a JSON file.
import org.apache.spark.sql.functions.current_date

val dateDF = spark.range(10)
  .withColumn("today", current_date())
But what is the best way to write random values to an Array and then save the Array as a JSON file?
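For what it's worth, one way this could be sketched (not necessarily the best): instead of collecting to an Array, build the random columns directly on the DataFrame with Spark's `rand()` and write it out with `DataFrameWriter.json`, which scales to a billion rows. The column names and value ranges below are assumptions based on the sample record above, and the output path is hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder
  .appName("random-events")
  .master("local[*]") // for a standalone sketch; a Databricks notebook already provides `spark`
  .getOrCreate()

val n = 10L // or 1000000000L

// One row per event; rand() gives a uniform double in [0, 1),
// scaled here to plausible ranges for each field (assumed, not from the question).
val eventsDF = spark.range(n)
  .withColumn("Username", concat(lit("patient"), col("id") + 1))
  .withColumn("Carbs", (rand() * 100).cast("int"))
  .withColumn("Bolus", (rand() * 30).cast("int"))
  .withColumn("Basal", round(rand() * 2, 2))
  .withColumn("Date", current_date())
  .withColumn("Timestamp", current_timestamp())
  .drop("id")

// Writes one JSON object per line, in parallel, under the given directory (hypothetical path).
eventsDF.write.mode("overwrite").json("/tmp/random-events")
```

Each output part-file then contains newline-delimited JSON records in the shape shown above.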