Elixir: Log data changes


So far we’ve built our authentication features, with sign up, sign in and sign out based on Guardian JWT tokens and Ueberauth for OAuth 2. We now want to add a feature that logs every modification made to a given model. This part is a little more tricky because there is no standard way to do it, so in this post I’ll show you my way of implementing it. First, let’s be clear about what we are working on: we need to store all changes made to a model that can also have files attached to it, and we want to track every creation, edition, deletion and media management action done by the user.

In order to handle all kinds of logs we’ll use a PostgreSQL JSONB column (an Ecto :map field), which lets our logs have no predefined structure. Here is how we defined our log table: it is attached to a data entry and carries a comment map field.

defmodule MyApp.Log do
  use Ecto.Schema

  import Ecto.Changeset

  alias MyApp.Log

  @primary_key {:id, Ecto.UUID, autogenerate: true}
  @derive {Phoenix.Param, key: :id}

  schema "logs" do
    field(:comment, {:map, :string})

    belongs_to(
      :data,
      MyApp.Data,
      on_replace: :update,
      foreign_key: :data_id,
      type: Ecto.UUID
    )

    timestamps()
  end

  @doc false
  def changeset(%Log{} = log, attrs) do
    log
    |> cast(attrs, [:comment])
    |> validate_required([:comment])
    |> foreign_key_constraint(:data_id)
  end
end


There is just a single map field in the schema, which will hold all the logged data we need, plus the foreign key to the record being tracked. Keep this simple structure in mind, as we’ll be building on it below.
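For reference, the migration behind this schema could look roughly like the following. This is only a sketch: the table and column names follow the schema above, and your exact names and types may differ. On PostgreSQL, an Ecto :map column is stored as jsonb.

defmodule MyApp.Repo.Migrations.CreateLogs do
  use Ecto.Migration

  def change do
    create table(:logs, primary_key: false) do
      # UUID primary key, matching @primary_key in the schema
      add(:id, :uuid, primary_key: true)
      # Stored as jsonb on PostgreSQL, so the comment can stay schemaless
      add(:comment, :map)
      # Foreign key to the tracked data record
      add(:data_id, references(:data, type: :uuid))

      timestamps()
    end

    create(index(:logs, [:data_id]))
  end
end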

As a reminder, here is how we insert a model into the datastore, and how we get the changeset back so we can log it.

def create_data(%User{} = user, attrs \\ %{}) do
  # create the data changeset that validate changes

  g =
    %Data{medias: []}
    |> Data.changeset(attrs)
    |> Ecto.Changeset.put_assoc(:user, user)

  # create the log changeset by getting data changes

  bc_attrs = %{
    comment:
      Map.merge(g.changes, %{
        user: Ecto.Changeset.get_field(g, :user, %User{}).id
      })
  }

  bc_log = Log.changeset(%Log{}, bc_attrs)

  # insert data and store log in a transaction to prevent unknown states

  Multi.new()
  |> Multi.insert(:data, g)
  |> Multi.run(:log, &create_log_data_relation(&1.data, bc_log))
  |> Repo.transaction()
end

defp create_log_data_relation(%Data{} = data, %Ecto.Changeset{} = bc_log) do
  Ecto.Changeset.put_assoc(bc_log, :data, data) |> Repo.insert()
end

So to be able to log, we first need a changeset for the data we’re monitoring, which is why we build that changeset first. Then we build a log changeset to be run in the same transaction, because we do not want the log to be saved separately from the original data. We store everything held in the changeset’s changes field (g.changes) in the comment map and add the associated user by using Ecto.Changeset.get_field/3, which reads the value from the changes or falls back to the original data if nothing was modified; this way we can track the associated user too.

Then we use Ecto.Multi to run everything in a single transaction. The interesting part is Ecto.Multi.run/3, which lets us (through partial application) receive the result of the previous Ecto.Multi.insert/3 step together with our log changeset, so we can associate the freshly inserted data before saving it in our logging table.
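Note that the arity-1 callback shown above is the Ecto 2 form; since Ecto 3, the Ecto.Multi.run/3 callback receives the repo and the accumulated changes, so the same step would be written as:

|> Multi.run(:log, fn _repo, %{data: data} -> create_log_data_relation(data, bc_log) end)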

As you can see, we’ve built a customizable change-recording mechanism in roughly fifteen lines of code. It is dense, but the code stays quite readable once you know the context. As an exercise, you could try to make this code more generic so that it also supports editing the data; a rough sketch follows.
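Here is that sketch, under the assumption that Data exposes a user_id field through its belongs_to :user association (update_data and the attrs handling are my own illustration, not code from this project):

def update_data(%Data{} = data, attrs) do
  # Changeset for the edit; g.changes only contains the fields that changed
  g = Data.changeset(data, attrs)

  # Same comment structure as on creation: the diff plus the acting user
  # (assuming Data belongs_to :user and therefore exposes user_id)
  bc_log =
    Log.changeset(%Log{}, %{
      comment: Map.merge(g.changes, %{user: data.user_id})
    })

  Multi.new()
  |> Multi.update(:data, g)
  |> Multi.run(:log, &create_log_data_relation(&1.data, bc_log))
  |> Repo.transaction()
end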

Deletion logging is trickier: since Ecto marks your record for deletion, you will not be able to associate it with your log entry afterwards. My solution was simply to reverse the order of operations in the transaction, so Ecto lets me store the log first and only then delete the entry.


Multi.new()
|> Multi.run(:log, fn _ -> create_log_data_relation(data, bc_log) end)
|> Multi.delete(:data, data)
|> Repo.transaction()

Just be aware that on a hard deletion there are no changes to record, since no attribute is modified. You’ll need to add the relevant log information yourself, such as the user association.
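A minimal sketch of such a hand-built comment, assuming you pass the current user along (the action and user keys are just a convention I picked here):

# No changeset diff on a hard delete, so build the comment by hand
bc_log =
  Log.changeset(%Log{}, %{
    comment: %{"action" => "delete", "user" => user.id}
  })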

I hope this gives you a better understanding of what changesets are and how you can use them to manipulate your application models.

Next we'll look at a few more Elixir libraries for cryptography and documentation.

Elixir: Media Management


Let's imagine that our Product Owner has decided it is absolutely essential for our customers to be able to attach images to their stored data. It needs to be secure and done ASAP (last week). This is when you should take an interest in the Arc library which, as described on GitHub, is a flexible file upload and attachment library for Elixir.

It lets you easily process and store images on various providers such as Amazon S3, running the uploads asynchronously in separate processes. And it does not only handle images: it manages any binary file upload.

In my case, I needed to upload images and Word/PDF documents attached to my data model, store them on S3 and keep a reference to them in my datastore.
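If you follow the same path, the setup mostly consists of adding the dependencies and the S3 configuration, roughly like this (versions, bucket name and region are placeholders to adapt to your project):

# mix.exs
defp deps do
  [
    {:arc, "~> 0.11"},
    {:arc_ecto, "~> 0.11"},
    # ex_aws + ex_aws_s3 (with hackney and sweet_xml) for the S3 storage backend
    {:ex_aws, "~> 2.0"},
    {:ex_aws_s3, "~> 2.0"},
    {:hackney, "~> 1.9"},
    {:sweet_xml, "~> 0.6"}
  ]
end

# config/config.exs
config :arc,
  storage: Arc.Storage.S3,
  bucket: "my-app-uploads"

config :ex_aws,
  access_key_id: System.get_env("AWS_ACCESS_KEY_ID"),
  secret_access_key: System.get_env("AWS_SECRET_ACCESS_KEY"),
  region: "eu-west-1"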

I’ll let you check the official Arc documentation for all the details, but in my case I chose to store files under the path /uploads/data/{data.id}/medias/{version_filename}, where data.id is the data UUID, version is the thumbnail or original version, and filename is the original file name.

defmodule MyApp.MediaUploader do
  use Arc.Definition
  use Arc.Ecto.Definition

  alias MyApp.Repo

  # To add a thumbnail version:
  @versions [:original, :thumb]

  # Whitelist file extensions:
  def validate({file, _}) do
    ~w(.jpg .jpeg .gif .png .pdf) |> Enum.member?(Path.extname(file.file_name))
  end

  # Define a thumbnail transformation:
  def transform(:thumb, {file, _scope}) do
    if Enum.member?(~w(.jpg .jpeg .gif .png), Path.extname(file.file_name)) do
      {:convert, "-strip -thumbnail 250x250^ -format png", :png}
    else
      :noaction
    end
  end

  # Override the persisted filenames:
  def filename(version, {file, _scope}) do
    file_name = Path.basename(file.file_name, Path.extname(file.file_name))
    "#{version}_#{file_name}"
  end

  def filename(version, _) do
    version
  end

  # Override the storage directory:
  def storage_dir(_version, {_file, media}) do
    "uploads/data/#{media.data.id}/medias/"
  end
end


Now let’s look at the model we defined using the arc_ecto library, which provides a way to store the media reference in the datastore.


defmodule MyApp.Media do
  use Ecto.Schema
  use Arc.Ecto.Schema

  import Ecto.Changeset

  alias MyApp.Media

  @primary_key {:id, Ecto.UUID, autogenerate: true}
  @derive {Phoenix.Param, key: :id}

  schema "medias" do
    field(:name, MyApp.MediaUploader.Type)

    belongs_to(
      :data,
      MyApp.Data,
      on_replace: :update,
      foreign_key: :data_id,
      type: Ecto.UUID
    )

    timestamps()
  end

  @doc false
  def changeset(%Media{} = model, attrs \\ %{}) do
    model
    |> cast_attachments(attrs, [:name])
    |> validate_required([:name])
    |> foreign_key_constraint(:data_id)
    |> cast_assoc(:data)
  end
end

Thanks to Arc.Ecto.Schema, the name field uses the type generated by the MediaUploader shown above, and arc_ecto automatically stores the reference to the media in every version you need. To retrieve a media you simply call MediaUploader.url/1 or MediaUploader.url/2 to get the URL of a given version.
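For example, fetching the URLs for a stored media could look like this (assuming media is a %MyApp.Media{} whose :data association is preloaded, since storage_dir/2 above relies on media.data.id):

original_url = MyApp.MediaUploader.url({media.name, media})
thumb_url = MyApp.MediaUploader.url({media.name, media}, :thumb)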

Back to the MediaUploader, where all the logic lives: the validate/1 function only allows certain file extensions, the transform/2 function creates a thumbnail for images only, and storage_dir/2 and filename/2 let us customize the directory and name under which each file is stored.

Note that when requesting a media URL you can ask Arc for a signed URL that is only valid for a short time. This lets you protect access to your resources while still handing out direct links.
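With the S3 backend, asking for a signed URL is just an extra option (a sketch, shown here on the :original version):

signed_url = MyApp.MediaUploader.url({media.name, media}, :original, signed: true)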


I strongly encourage you to read the Arc documentation on Hex to get a full picture of what this awesome piece of software can do.

I hope you enjoyed this blog post series and that you'll share it. Thanks!
