Upgrading Angular-Symfony in 10 Steps

Introduction


In 2013 I had some free time and put together a bootstrap project for people who wanted to create a website with a REST API based on Symfony 2 and AngularJS. Back in the day, the project attracted some attention and some contributors began to work on it too. It was forked more and more. But, for lack of time, I couldn’t maintain the project or upgrade it to newer software versions, so it became very outdated.

The good thing is that I recently took a moment to upgrade the project to Symfony 4 and Angular 8. Yes, that is a very big version gap, but because of the very limited size of the project it wasn’t that painful. In the process I learned some tricks that I wanted to share with you.

To make things clear, keep in mind that the project uses the Web Services Security (WS-Security) standard for API authentication and exposes a REST API.

You might want to have the project open in another window in order to fully follow the upgrade steps: https://github.com/FlyersWeb/angular-symfony.

Preamble


The project uses a technique called WS-Security UsernameToken to authenticate the connected user. You can find a detailed explanation in the OASIS specification: https://www.oasis-open.org/committees/download.php/13392/wss-v1.1-spec-pr-UsernameTokenProfile-01.htm.

The simplified process is as follows:
  • the client obtains the shared secret by authenticating with the server
  • the client generates a user token from a nonce (random string), a created timestamp (date time) and the secret (shared with the server)
  • the client sends a token with a different nonce for each subsequent request

The server knows at which date time the token was created, it can detect replay attacks if the same nonce is sent twice, and it can authenticate the client through the shared secret.
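As a rough illustration, here is a minimal TypeScript sketch of the client-side token generation, assuming the digest rule from the UsernameToken profile (Base64(SHA-1(nonce + created + secret))). The function name and header layout are illustrative, not the project’s exact API:

```typescript
import { createHash, randomBytes } from "crypto";

// Hypothetical sketch of WSSE UsernameToken generation; names are illustrative.
// PasswordDigest = Base64(SHA-1(nonce + created + secret))
function buildWsseHeader(username: string, secret: string): string {
  const nonceBytes = randomBytes(16);           // random nonce, unique per request
  const nonce = nonceBytes.toString("base64");  // sent Base64-encoded in the header
  const created = new Date().toISOString();     // creation date time
  const digest = createHash("sha1")
    .update(nonceBytes)                         // digest uses the raw nonce bytes
    .update(created + secret)
    .digest("base64");
  return (
    `UsernameToken Username="${username}", ` +
    `PasswordDigest="${digest}", Nonce="${nonce}", Created="${created}"`
  );
}
```

The server can then recompute the same digest from the decoded nonce, the created timestamp and the shared secret, and reject any nonce it has already seen.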

Now that we’ve had a quick refresher, let’s dive into the upgrade process.

Upgrade Node


The project was using NodeJS 5 and we upgraded it to NodeJS 12. Because that is such a big gap, it was better to start a new Angular 8 project from scratch, based on webpack.

Upgrade PHP


The project was using PHP 5 and we upgraded it to PHP 7. There were a lot of breaking changes and some real improvements between these two versions, so we also started a new project from scratch using Symfony 4.

Upgrade Angular


Between AngularJS and Angular 8 there is an abyss. The frameworks are so different that it was easier to start a new project and copy/paste the important parts of the algorithm. It was also a good exercise to add some improvements to the existing code.

For example, we removed a useless custom Base64 encoding function and preferred the CryptoJS version.

I searched online and didn’t find any direct example of Base64 encoding using CryptoJS, so this is how I did it:

CryptoJS.enc.Utf8.parse(nonce).toString(CryptoJS.enc.Base64);

You first need to parse your string using the correct encoding, then call the toString function specifying the Base64 output encoder.
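Outside the browser (for instance in a quick Node script or a unit test), the same result can be obtained with Node’s Buffer API, which is a handy way to sanity-check the CryptoJS output:

```typescript
// Base64-encode a UTF-8 string; equivalent to the CryptoJS one-liner above.
function toBase64(input: string): string {
  return Buffer.from(input, "utf8").toString("base64");
}

// The reverse operation, for completeness.
function fromBase64(encoded: string): string {
  return Buffer.from(encoded, "base64").toString("utf8");
}
```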

We also removed the custom random string function and added the ‘random-string’ dependency. FYI, there is also a ‘randomstring’ dependency, but it didn’t work on my laptop, complaining that there is no global defined; maybe the NodeJS version breaks something for that dependency.

The Angular structure changed completely, but you can find all the interesting parts in the ‘token.service.ts’ file.

Upgrade Symfony


Same as before: the gap between Symfony 2 and Symfony 4 was so big that we just started a new project from scratch. The implementation of the WS-Security UsernameToken is available in the official Symfony documentation: https://symfony.com/doc/4.4/security/custom_authentication_provider.html.

The implementation we used is exactly the same. FYI, the regular expression parsing the X-WSSE header wasn’t working because of escaped double quotes coming from the Symfony request headers. Besides that, it’s all the same.

It was also a good excuse for some improvements, so we preferred Nelmio/CORS over a custom request listener for Cross-Origin Requests. We added FOS/FOSRestBundle to manage the REST API part more easily and upgraded the FOS/FOSUserBundle dependency.

Thing is, this last dependency is not fully compatible with Symfony 4, so we had to use some tricks to make it work. While installing the dependency by following its instructions, you might have to do the following.

First, move the configuration to ‘config/packages/fos_user.yaml’. Second, add the FOSUserBundle routes definitions to ‘config/routes/fos_user.yaml’.
Finally, we generated a migration and created the User entity in the database using the following command:
bin/console doctrine:migrations:diff && bin/console doctrine:schema:update --force

Additionally, we decided to send logs to stderr because the project is dockerized. This way you get logs in real time from your Docker daemon. To do that it was necessary to update the ‘monolog.yaml’ configuration to use ‘php://stderr’.
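For reference, the relevant part of ‘config/packages/monolog.yaml’ can look like the following sketch (handler names and levels may differ in your project):

```yaml
# config/packages/monolog.yaml — sketch only; handler names and levels may vary
monolog:
    handlers:
        main:
            type: stream
            path: 'php://stderr'
            level: debug
```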

Upgrade database


While working on the project I decided to migrate the existing MySQL database to PostgreSQL, because PostgreSQL is such an amazing project, powerful and well maintained. I do think MySQL might have reached this quality without all the commercial fuss around it, but that is another story.

Moving from MySQL to PostgreSQL was actually really easy on this project: we just had to install the correct database, add pdo_pgsql to PHP and update the connection URL in ‘.env’. I really love the new way of configuring Symfony 4; it makes this a breeze.

We also decided to improve the project a little by adding a Doctrine fixture for the demo’s sample user.

Staying connected


To stay connected after a page refresh, we had to use localStorage to store the token generation data. This way we can come back to our client and still be connected. Tokens have a lifetime of 5 minutes.
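A minimal TypeScript sketch of that persistence logic could look like this; the storage key and the saveToken/loadToken helpers are hypothetical names and do not necessarily match the project’s ‘token.service.ts’:

```typescript
// Sketch of persisting token data across page refreshes; names are illustrative.
const TOKEN_LIFETIME_MS = 5 * 60 * 1000; // 5-minute token lifetime

interface StoredToken {
  secret: string;    // shared secret obtained at login
  createdAt: number; // epoch milliseconds
}

// Minimal storage interface so the logic also works with a test double;
// in the browser, window.localStorage satisfies it.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function isTokenFresh(token: StoredToken, nowMs: number): boolean {
  return nowMs - token.createdAt < TOKEN_LIFETIME_MS;
}

function saveToken(storage: KVStore, token: StoredToken): void {
  storage.setItem("wsse-token", JSON.stringify(token));
}

// Returns the stored token if still fresh, otherwise null (forcing a re-login).
function loadToken(storage: KVStore, nowMs: number): StoredToken | null {
  const raw = storage.getItem("wsse-token");
  if (raw === null) return null;
  const token: StoredToken = JSON.parse(raw);
  return isTokenFresh(token, nowMs) ? token : null;
}
```

Keeping the expiry rule in a pure helper like isTokenFresh makes it trivial to unit test without a browser.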

You can change this lifetime in the ‘WsseProvider.php’ file if necessary.

Update docker configuration


By upgrading so much of the project, we could actually make some significant improvements to the Docker configuration. On the Angular side, the build and watch-mode process is now natively supported. On the Symfony side, the NGINX configuration through PHP-FPM was also much easier, with no need for custom configuration anymore. You can have a look at it in the dockerify folder, which contains the ‘docker-compose.yml’ infrastructure and the NGINX configuration files.

Please be aware that the project runs in development mode with watch mode activated; it is not suited for production deployment as is. The idea is to offer a bootstrap project for developers to start working on their own project with authentication out of the box. You might have to configure your own Continuous Delivery system for deployments.

Update License


To finish, the license was also changed to the MIT License, so you're free to use, modify and/or redistribute this software. The README was also updated with the latest installation instructions.

Conclusion


Thanks for reading, I hope the project will be useful to you. Have a look at the README at https://github.com/FlyersWeb/angular-symfony for more details.

Elixir: Upload zip to S3

How to upload a zip to S3


Today we'll see how to upload a bunch of files stored in a zip file using Elixir and Phoenix and a little bit of Erlang too.

First of all, let's see the process: the user sends us a zip file through a multipart form; we receive and store it. Then we need to unzip the file (in our case we chose to do it in /tmp). Then we use the Elixir Arc library to store each file in S3. Actually, we're using Arc Ecto, as we want to store the file references in our database to access them later.

We want to make it transactional, as we don't want just part of the zip stored in our database, but all entries. We'll introduce Ecto.Multi, which allows us to do this.

First of all, let's have a look at our Media model, which stores the media in our PostgreSQL database.

defmodule MyApp.Media do
  use Ecto.Schema
  use Arc.Ecto.Schema
  import Ecto.Changeset

  @primary_key {:id, Ecto.UUID, autogenerate: true}
  @derive {Phoenix.Param, key: :id}

  schema "model_medias" do
    field(:name, MyApp.Uploader.Media.Type)

    timestamps()
  end

  @doc false
  def changeset(media, attrs) do
    media
    |> cast(attrs, [:name])
    |> cast_attachments(attrs, [:name])
    |> validate_required([:name])
  end
end

At the beginning we've defined a UUID primary key, although that is not mandatory, and a name field typed as MyApp.Uploader.Media.Type. This field is just a simple string with the file name, stored in the database along with timestamps. Now let's have a look at our media uploader module.

defmodule MyApp.Uploader.Media do
  use Arc.Definition
  use Arc.Ecto.Definition

  # To add a thumbnail version:
  @versions [:original, :thumb]

  # Whitelist file extensions:
  def validate({file, _}) do
    ~w(.jpg .jpeg .gif .png .pdf) |> Enum.member?(Path.extname(file.file_name))
  end

  # Define a thumbnail transformation:
  def transform(:thumb, {file, _scope}) do
    {:convert, "-strip -thumbnail 250x250^ -format png", :png}
  end

  # Override the persisted filenames:
  def filename(version, {file, _scope}) do
    file_name = Path.basename(file.file_name, Path.extname(file.file_name))
    "#{version}_#{file_name}"
  end

  def filename(version, _) do
    version
  end

  # Override the storage directory:
  def storage_dir(_version, {_file, _scope}) do
    "uploads/medias/"
  end
end

You can see that this is an Arc definition; to grab all the details I strongly recommend having a look at the official documentation. This module defines that we want to store files in the "uploads/medias/" folder and create a 250x250 thumbnail when uploading.

Now that we have our model, we need to load the zip file and unzip it. To do that we'll use the :zip Erlang library. You first need to open the archive (extraction can be done on disk or in memory), then extract the files to store them, and finally close the handle by calling zip_close. Here is how you do it:

def import(%Plug.Upload{} = zipfile) do
  path = to_charlist(zipfile.path)
  path_name = to_charlist("/tmp")

  with {:ok, handle} <- :zip.zip_open(path, [{:cwd, path_name}]),
       {:ok, file_names} <- :zip.zip_get(handle) do
    try do
      # remove files beginning with '.'
      filter_hidden_files(file_names)
      # transform each file into a %Plug.Upload{}
      |> to_plug_upload()
      # build a multi with all the inserts
      |> to_multi(Multi.new())
      # run the transaction
      |> Repo.transaction()
    after
      :zip.zip_close(handle)
    end
  end
end

As you can see, we chose to unzip into the "/tmp" folder. After getting the list of file paths, we remove the hidden ones (beginning with '.') before generating the transaction for data storage. I'll skip the filter_hidden_files function, which is trivial to write, and show you the to_plug_upload function, which is a little trickier.

defp to_plug_upload(_, uploads \\ [])

defp to_plug_upload([], uploads) do
  uploads
end

defp to_plug_upload([file | tail], uploads) do
  upload = %Plug.Upload{
    content_type: MIME.from_path(file),
    filename: Path.basename(file),
    path: to_string(file)
  }

  to_plug_upload(tail, uploads ++ [upload])
end


You can see that the recursion is done over each file contained in the zip; we use it to build a Plug.Upload structure by specifying the MIME type (thanks to the MIME module), the filename and the file path. Remember that to use the :zip library we had to transform our string into a charlist (for Erlang compatibility); we need to do the same in the other direction by converting each file path back to a string.

Then there is the transaction creation through the to_multi implementation.


defp to_multi([], multi) do
  multi
end

defp to_multi([file | tail], multi) do
  attrs = %{name: file}
  m = %Media{} |> Media.changeset(attrs)
  to_multi(tail, multi |> Multi.insert(file.filename, m))
end

It's straightforward too: as you can see, we recurse over the file list and add an insert to our transaction for each file.

I hope this quick example will make your work easier if you're coding in Elixir, or make you want to try it out.

Oh, and if you find the possible bug in this very basic implementation, you can post it as a comment.

Elixir: Document your API


For someone to actually use your API, you need to provide some documentation and usage examples for it. Developer experience is also very important when making an API. Building a back-end is difficult, but a great front-end matters too; that is why using your API should be a breeze.

I really like to have my documentation as near to my code as possible, but I want it to be decoupled too. There are different libraries offering such a feature: some use controller annotations like ExDoc, others generate the documentation from test cases, like Bureaucrat. Personally, for separation of concerns, I prefer to add my documentation explicitly using PhoenixSwagger. This way I can choose not to provide external documentation for some endpoints, and have more test cases. It's a matter of taste.

For the sake of standardisation I prefer to use the OpenAPI Specification (aka Swagger) to document my API; it provides some great tooling and is widely supported. That is why I used the phoenix_swagger library and added the documentation in my controllers. You can have a look at the OpenAPI 2.0 specs to learn more about supported formats, parameters, headers and authentication. In Phoenix we have a DSL allowing us to generate the swagger.json file based on our definitions.

To make it work, just follow the phoenix_swagger installation guide: you’ll mainly need to provide your router and endpoint module names in config.exs, and a definition of swagger_info/0 in your router.


def swagger_info do
  %{
    info: %{
      version: "1.0",
      title: "MY API",
      consumes: [
        "multipart/form-data",
        "application/json",
        "application/vnd.api+json"
      ]
    },
    securityDefinitions: %{
      bearerAuth: %{
        type: "apiKey",
        name: "Authorization",
        in: "header"
      }
    },
    security: [
      %{bearerAuth: []}
    ]
  }
end


This is mine; you can see that you need to define all of your API’s supported content types. I’ve also specified that my API asks for a bearer authorization token.

I’ve also added a route to access the generated API documentation at /api/swagger. It allows developers to easily access the latest available version of the documentation:


scope "/api/swagger" do
  pipe_through([:api_doc_auth])

  forward("/", PhoenixSwagger.Plug.SwaggerUI, otp_app: :MY_API, swagger_file: "swagger.json")
end


You can see that I have a pipeline applied to this route; this is because we want to add basic authentication to it.


pipeline :api_doc_auth do
  plug(BasicAuth, use_config: {:gi_api, :api_doc_auth})
end


We used the basic_auth Phoenix plug to make this possible and easy. Here is how you can configure it to use environment variables as the login/password for your API documentation route.


config :gi_api,
  api_doc_auth: [
    username: System.get_env("BASIC_AUTH_API_USERNAME"),
    password: System.get_env("BASIC_AUTH_API_PASSWORD"),
    realm: "API Doc Area"
  ]


Now that you’ve configured how your documentation will be available, you just need to declare it :) This is done in your controllers: you’ll need to call swagger_path/2 to define each endpoint; endpoints can then refer to more complex schemas that need to be loaded by calling swagger_definitions/0.


swagger_path :index do
  PhoenixSwagger.Path.get("/api/datas")
  consumes("application/vnd.api+json")
  produces("application/vnd.api+json")

  operation_id("index")

  tag("Data")

  paging(size: "page[page_size]", number: "page[page]")

  description("List data")

  response(200, "Success", Schema.ref(:Data))
  response(401, "Not Authenticated")
end


This is my definition for the paginated data schema as a JSON-API resource.


%{
  DataResource:
    JsonApi.resource do
      description("A data.")

      relationship(:user)

      relationship(:media, type: :has_many)

      attributes do
        some_attribute(:string, "Data attribute")
      end
    end,
  Data: JsonApi.single(:DataResource),
  Datas: JsonApi.page(:DataResource)
}


On our project we use JSON-API through JaSerializer, and pagination thanks to Scrivener. As it is a common stack, PhoenixSwagger provides helpers around JSON-API resources. Very useful when you’re using it as your data transfer format. You can define a single resource with JsonApi.single/1 or a paginated resource (based on the paging parameter) with JsonApi.page/1.

Another example, with a POST request receiving a file from a multipart/form-data form, could be:


swagger_path :create do
  PhoenixSwagger.Path.post("/api/datas/{data_id}/medias")
  consumes("multipart/form-data")
  produces("application/json")

  operation_id("create")

  tag("Medias")

  description("Create a media")

  parameters do
    data_id(:path, :string, "Data UUID", required: true)
    kind(:formData, :string, "Should be [image|document]", required: true)
    file(:formData, :file, "Attached media", required: true)
  end

  response(200, "Success", Schema.ref(:Media))
  response(404, "Not Found")
  response(422, "Unprocessable Entity")
  response(401, "Not Authenticated")
end


The :formData location tells Swagger that the API accepts a form-encoded field, and the :file type means we expect a binary field and a multipart/form-data content type.

With all this you’ll have a smooth API documentation available at /api/swagger, protected by a login/password, and supporting the JWT token authorization header.



You can go further by looking at the PhoenixSwagger documentation. It is not as up to date as I had hoped, but you can read the @doc annotations directly in the source code for more examples. I also had to dig through past issues for some of my needs. Besides, the library also offers easier controller testing through schema validation, but that is another story :)

I hope this will help you provide a great developer experience to your front-end developers, and that they will let you waste more time on 9gag now :)

If you like this Elixir/Phoenix blog post series, please share it or drop a comment.
