Convert CSV and JSON files to SQL is a tool I created that takes any CSV file or JSON list of objects and converts it to a SQL file that can be imported into any SQL server.

It is useful for taking a big CSV file and querying it in SQL, or for exporting JSON documents from a MongoDB database and importing them into a SQL server.

Check it out here.

Type-safe serialization and deserialization of Go structs in JSON

Being able to encode our data and decode it back again intact is an important part of transmitting data between different services. Go provides encoders for XML, JSON, and even a fast binary format (encoding/gob).

JSON is an obvious choice for several reasons: it has a very simple syntax that almost all languages can understand. The only downside is that it is a bit slow compared to binary encoders like encoding/gob.

But there’s one thing encoding/gob doesn’t give you: portability. What if you want to send your data through RabbitMQ to a Node.js or Elixir backend? Good luck, because it won’t know how to decode your data.

Enter encoding/json

Encoding a struct into JSON is fairly straightforward:

package main

import (
	"encoding/json"
	"fmt"
)

type User struct {
	Name string
	Age  int
}

func main() {
	result, _ := json.Marshal(User{"John", 34})
	fmt.Println(string(result))
	// Output: {"Name":"John","Age":34}
}

And if we want to read it back:

user := User{}
json.Unmarshal([]byte("{\"Name\":\"John\",\"Age\":34}"), &user)
fmt.Println(user.Name, user.Age)
// Output: John 34

What if I don’t know the type at compile time?

Now let’s say you’re building a library that does some kind of message passing and needs to encode and decode arbitrary types.

If you’re retrieving a JSON document and need to decode data without knowing the type beforehand, you can use reflection to decode it into your data type in a type-safe manner.

The first thing we need is a way to get a reflect.Type from its string representation. A simple way to do that is to maintain a map[string]reflect.Type with all the types you need to decode.

Every time you encode a type, or otherwise have access to it, it can be registered in the map:

var PayloadTypes map[string]reflect.Type

type Message struct {
	Type    string
	Payload interface{}
}

func sendMessage(payload interface{}) {
	payloadType := reflect.TypeOf(payload)
	_, exists := PayloadTypes[payloadType.Name()]
	if !exists {
		PayloadTypes[payloadType.Name()] = payloadType
	}
	result, _ := json.Marshal(Message{payloadType.Name(), payload})
	fmt.Println(string(result))
	fmt.Println(PayloadTypes)
}

Here we’re storing the types we encode in the PayloadTypes map so that we can reference them later when we decode.

Now the function can be used like this:

func main() {
	PayloadTypes = make(map[string]reflect.Type)
	sendMessage(User{"John", 34})
}
// Output:
// {"Type":"User","Payload":{"Name":"John","Age":34}}
// map[User:main.User]

Then we can create another function to decode the message:

type RawMessage struct {
	Type    string
	Payload json.RawMessage
}

func receiveMessage(msg []byte) {
	var message RawMessage
	json.Unmarshal(msg, &message)
	messageType := PayloadTypes[message.Type]
	messageValue := reflect.New(messageType)
	_ = json.Unmarshal(message.Payload, messageValue.Interface())
	fmt.Println(messageValue.Elem().Interface())
}

func main() {
	PayloadTypes = make(map[string]reflect.Type)
	sendMessage(User{"John", 34})
	receiveMessage([]byte(`{"Type":"User","Payload":{"Name":"John","Age":34}}`))
}
// Output:
// {"Type":"User","Payload":{"Name":"John","Age":34}}
// {John 34}

Here we define another type, identical to Message but with Payload declared as a json.RawMessage. This delays decoding the payload until we do it ourselves.

This lets us first read the type name, declare a value of the right type, and then unmarshal into it.

messageValue.Elem().Interface() will be an interface{} holding a User value, so it can be recovered with a type assertion or passed to a function that expects that type.