
Backup of Lazy8Web data

Lazy8Web is a free web application for basic bookkeeping tasks. I use it myself for the community where I live, and I can really recommend it.

However, I needed a way to do backups. There is a function to export all data to XML, but you must be logged in to use it, and the authentication system doesn't allow simply fetching the page with wget and a username/password.

To automate the process I wrote the following little snippet, which I run as a cron job. It logs in to the site, downloads the data and saves it to a file whose name includes the current date and time.

#!/usr/local/bin/ruby
#
# Author: Martin Bergek (http://www.spotwise.com)
# Dependencies: Ruby, Mechanize (http://mechanize.rubyforge.org)
#

require 'rubygems'
require 'mechanize'

# Make changes here
base_url = 'http://www.example.com'
username = 'user'
password = 'password'
log_folder = '/home/user/'
# No changes below this line

# Login
agent = Mechanize.new
page = agent.get(base_url + '/index.php?r=site/login')
form = page.forms.first
form['LoginForm[username]'] = username
form['LoginForm[password]'] = password
page = agent.submit(form, form.buttons.first)

# Get backup for all companies
xml = agent.get_file(base_url + '/index.php?r=company/exportAll')

# Save backup
t = Time.now
logfile = log_folder + t.strftime("%Y%m%d-%H%M%S.xml")
File.open(logfile, 'w') {|f| f.write(xml) }
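To run the backup automatically, the script can be scheduled with cron. A minimal example crontab entry, assuming the script has been saved as /home/user/lazy8-backup.rb and made executable (both the path and the schedule are just illustrations):

```shell
# Run the Lazy8Web backup every night at 02:30
# (path to the script is an assumption; adjust to where you saved it)
30 2 * * * /home/user/lazy8-backup.rb
```

Edit your crontab with `crontab -e` to add the line, and make sure the log folder configured in the script is writable by the user the job runs as.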