• Nathan McCallum • Programming
I keep a journal using a little Rails app I built. Nothing super fancy. It’s just so I can write what I’m thinking and feeling throughout the day. Every so often I look back and think about whether I’m okay with how things are going.
Some issues came up and I ended up merging my journalling app functionality into another personal project. The application code was really simple but of course the old entries needed to get moved across as well. So the objective was to get the entries out of the old Postgres database and into a new one. Along the way one column needed to be renamed as well.
How hard can it be?
I first tried to use Rails Admin to export the entries. Soon afterward, I discovered that it doesn’t have an import function. So it was possible to download the entries but not import them into the new database.
Next I tried to use pgAdmin to export the entries as a CSV file and then import them into the new database.
Was I able to get this to work? Nope!
The import always failed because of something wrong with the date format in the source file.
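Date-format mismatches like this can sometimes be worked around by normalizing the timestamps before importing. Here's a rough, hypothetical sketch in plain Ruby — the column name and the exact format pgAdmin emitted are assumptions, not what my export actually contained:

```ruby
require 'csv'
require 'time'

# Hypothetical: rewrite a CSV export so its timestamp column is ISO 8601,
# which Postgres accepts unambiguously on import.
def normalize_timestamps(csv_text, column: 'created_at')
  rows = CSV.parse(csv_text, headers: true)
  rows.each do |row|
    # Time.parse copes with many common formats; iso8601 emits a canonical form
    row[column] = Time.parse(row[column]).iso8601 if row[column]
  end
  CSV.generate(headers: rows.headers, write_headers: true) do |out|
    rows.each { |row| out << row }
  end
end
```

Whether this would have saved the pgAdmin attempt depends on what exactly the source file contained, which I never fully pinned down.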
(This stunt was made particularly difficult by Heroku hobby datastores, which appear to host hundreds of databases belonging to different Heroku users’ applications. Obviously your user can only access your own database, but the performance implications were fascinating: it looked like there were 50–100 reads occurring every second. This is why it’s free, and also why it’s sometimes slow!)
So what worked?
Okay, here we go:
1. Export a JSON file of the entries using Rails Admin (I uploaded it to a GitHub Gist so the destination app could fetch it).
2. Run the following in the rails console of the destination app:
require 'json'
require 'net/http'

# Fetch the exported entries from the Gist
url = URI('<Github Gist url here>')
entries = JSON.parse(Net::HTTP.get(url))

entries.each do |entry|
  e = PersonalLog::Entry.new
  e.content = entry['content']
  # Preserve the original timestamps instead of letting Rails set them
  e.created_at = Time.zone.parse(entry['created_at'])
  e.updated_at = Time.zone.parse(entry['updated_at'])
  e.save!
end
This worked, but I’m convinced there was a better way eluding me.
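For what it’s worth, one candidate for that better way is a single bulk INSERT instead of one `save!` per entry — on Rails 6+, `Entry.insert_all` does roughly this. A standalone sketch of the idea, with the table and column names assumed from the loop above (the quoting here is deliberately naive; real code should use bind parameters or `insert_all`):

```ruby
require 'json'

# Hypothetical sketch: turn the exported JSON into one multi-row INSERT,
# so the import is a single round trip instead of one per entry.
def bulk_insert_sql(json, table: 'entries')
  entries = JSON.parse(json)
  values = entries.map do |e|
    content = e['content'].gsub("'", "''") # naive SQL quoting, for illustration only
    "('#{content}', '#{e['created_at']}', '#{e['updated_at']}')"
  end
  "INSERT INTO #{table} (content, created_at, updated_at) VALUES\n" +
    values.join(",\n") + ";"
end
```

This skips ActiveRecord validations and callbacks, which for a plain journal-entry import is probably fine, but it’s a trade-off worth knowing about.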
‘Till next time, I guess.