Endpoints

I discuss the benefits and drawbacks of our auto-generated datatypes for backend communication. Ideally, you want your encoders and decoders generated for you, as that leaves no room for misinterpretation. Having generated type aliases means you can also generate setters. Json.Decode.Pipeline is great, but limited to concrete types. I present a different way of decoding, by folding over a record, which supports the extensible records our encoders require.

So this is very exciting: recently we've taken our domain-specific datatypes from good to great! But to explain how that works, I need to start at the beginning. Our backend is SQL + stored procs + C#. We're very lucky in that our backend guy loves to structure his code, create interfaces, write meta-code everywhere and hack Elm on the side.

We have over 100 datatypes right now, and they are all generated by the backend. So our src/Types/Api folder contains over 100 files and 16k LoC (about a third of our app). These datatypes are used for two things:

  1. Every endpoint we hit communicates via one of these, so we never get errors about using the wrong field or having the wrong type. The decoder is generated by the backend and resolves to type aliases or primitive types. Soon, union types too.
  2. These types are used on the client as view models for forms. Since their types are known, they have setters, encoders and init all auto-generated too. This saves a lot of repetitive code mapping from request to view model and back.

Domain specific datatypes

Here's what one of them looks like:

module Types.Api.Person exposing (..)

{-| Auto generated from ... C# library...
-}

import Helpers as HP -- internal helper module (name assumed) providing encodeMaybe
import Json.Decode as JD
import Json.Decode.Pipeline as JDP
import Json.Encode as JE
import Types.Api.PersonDetail as PersonDetail exposing (PersonDetail)


type Field
    = Id (Maybe Int)
    | PersonDetail PersonDetail
    | ...


update : Field -> Person -> Person
update f o =
    case f of
        Id v ->
            { o | id = v }

        PersonDetail v ->
            { o | personDetail = v }

type alias Person =
    { id : Maybe Int
    , personDetail : PersonDetail
    , ...
    }

init : Person
init =
    { id = Nothing
    , personDetail = PersonDetail.init
    , ...
    }

decoder : JD.Decoder Person
decoder =
    JDP.decode Person
        |> JDP.optional "id" JD.int 0
        |> JDP.optional "personDetail" PersonDetail.decoder PersonDetail.init

encoder : Person -> JE.Value
encoder o =
    JE.object
        [ ( "id", HP.encodeMaybe JE.int o.id )
        , ( "personDetail", PersonDetail.encoder o.personDetail )
        ]

So as you can imagine, the Person object has a lot of data. It can have nested types, and these are also auto-generated by the backend. This means the frontend doesn't have the opportunity to screw up encoding/decoding: we get a strongly typed record with each call to the backend.

The Field type is exposed, so the record comes with auto-generated setter functions.
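
For example (a hypothetical call site, assuming the module is imported qualified as Person), setting fields is just a matter of applying Field values through update:

withDefaults : Person -> Person
withDefaults person =
    person
        |> Person.update (Person.Id (Just 42))
        |> Person.update (Person.PersonDetail PersonDetail.init)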

All our forms use at least one of these. For example, when creating a person, we use the CreatePersonRequest type, so the page looks like this:

-- NewPerson.elm

type alias NewPerson =
    { request : CreatePersonRequest
    , step : Step
    , errors : List String
    }


init : Lookup -> ( NewPerson, Job Msg )
init lookup =
    let
        initRequest =
            CreatePersonRequest.init
    in
    ( { request = initRequest, step = ..., errors = [] }
    , Job.init
    )

update : Msg -> NewPerson -> Lookup -> ( NewPerson, Job Msg )
update msg panel lookup =
    case msg of
        CreatePersonRequestMsg requestMsg ->
            let
                request =
                    CreatePersonRequest.update requestMsg panel.request
            in
            ( { panel | request = request }, Job.init )

-- returns labelled rows for a shared form renderer
view : Lookup -> NewPerson -> List ( String, Html Msg )
view lookup panel =
    [ ( "Title", Components.titleDropdown lookup panel.request.title (CreatePersonRequest.Title >> CreatePersonRequestMsg) )
    , ( "Surname", Components.input panel.request.surname (CreatePersonRequest.Surname >> CreatePersonRequestMsg) )
    ]

So the page owns the data in NewPerson.request: it uses the auto-generated CreatePersonRequest.init, routes updates to CreatePersonRequest.update, and finally maps the msgs in the view to the exposed Field type via CreatePersonRequest.Surname >> CreatePersonRequestMsg.

This translates to our views having two kinds of state: a UI state, and the endpoint (domain) state which is used to send data back to the backend. All forms contain a domain state, and most of them also contain some UI state which lives as long as the form is open.

Generic domain specific datatypes

The above pattern works very well for us, but as our components became more generic (Hybrid components) we encountered a problem. When a component wishes to encode to JSON (e.g. for an HTTP request), it must pass the concrete type to encode, because the encoder is fixed to a particular type: encoder : Person -> JE.Value. However, our generic components never hold concrete domain models. So logically we had to make the encoder generic: encoder : Person a -> JE.Value. But then we would have to change the alias to type alias Person a = { a | ... }, and that means changing every single reference to Person, and to all the other 100+ datatypes spread throughout the code. It would have been a massive refactor.

The solution was in defining both a generic and a concrete record:

-- we introduce an interface that covers all of Person

type alias IPerson a =
    { a
        | id : Maybe Int
        , personDetail : PersonDetail
        , ...
    }

-- we make Person the same for all intents and purposes

type alias Person =
    IPerson {}

-- no one's the wiser
init : Person
decoder : JD.Decoder Person

-- but now we can auto-generate a generic encoder
encoder : IPerson a -> JE.Value


-- This means the caller, our hybrid component, isn't forced to use a concrete type to encode.
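
To illustrate what this buys us (a minimal sketch; the component and its extra UI field are hypothetical), a hybrid component can now extend IPerson with its own state and still feed the record straight into the generic encoder:

-- a component-local model: IPerson plus some UI state
type alias CardModel =
    Person.IPerson { expanded : Bool }

-- the generic encoder accepts it as-is; the extra field is simply ignored
encodeCard : CardModel -> JE.Value
encodeCard model =
    Person.encoder model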

One stumbling block: I realised that Json.Decode.Pipeline couldn't decode into an extensible record. The way it works is to take the curried record constructor and fill in one field with each pipe, which gives it quite a nice usage; but extensible record aliases don't come with a constructor function.

decoder : JD.Decoder Person
decoder =
    -- JDP.decode just returns a JD.succeed, which looks like
    -- a JD.Decoder (a -> b -> c -> d -> Person), assuming Person has 4 fields
    JDP.decode Person
        -- gives the curried function one field
        |> JDP.optional "id" JD.int Nothing
        -- gives the curried function a second field
        |> JDP.optional "personDetail" PersonDetail.decoder PersonDetail.init
        -- etc...
        -- last field, now I have a JD.Decoder Person, yaya!!
        |> JDP.optional ....
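
The catch is that only concrete record aliases give you that constructor function. A sketch of the difference (trimming Person down to one field):

type alias Person =
    { id : Maybe Int }

-- the alias also defines a constructor: Person : Maybe Int -> Person

type alias IPerson a =
    { a | id : Maybe Int }

-- no constructor here: we can't know what fields `a` will hold,
-- so JDP.decode IPerson doesn't even compile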

So rather than use the constructor, and since the update function for each Field is conveniently given to us, we called upon Alfred.

-- Maybe.Extra.unwrap comes from elm-community/maybe-extra:
-- unwrap : b -> (a -> b) -> Maybe a -> b

optional : String -> JD.Decoder a -> a -> (a -> field) -> (field -> model -> model) -> model -> JD.Decoder model
optional jsonField decoder default toField update model =
    -- decode the field if present, fall back to the default otherwise,
    -- then set it on the model via the generated update
    JD.field jsonField decoder
        |> JD.maybe
        |> JD.map (Maybe.Extra.unwrap (toField default) toField)
        |> JD.map (\field -> update field model)

required : String -> JD.Decoder a -> (a -> field) -> (field -> model -> model) -> model -> JD.Decoder model
required jsonField decoder toField update model =
    -- the field must decode; then set it on the model
    JD.field jsonField decoder
        |> JD.map toField
        |> JD.map (\field -> update field model)

The usage is very similar to Json.Decode.Pipeline, but now we start from an initial record and fold over it, setting any number of fields to the decoded values.

The gist of it is that we use JD.andThen to fold over the initial state JD.succeed init, much like Json.Decode.Pipeline, but rather than feeding arguments to a curried function, we fold over an actual record and update each field as we decode it.

decoder : JD.Decoder Person
decoder =
    [ optional "id" JD.int 0 Id update
    , optional "personDetail" PersonDetail.decoder PersonDetail update
    , ...
    ]
    |> List.foldl JD.andThen (JD.succeed init)
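
Expanding the first fold step by hand (purely for intuition, nothing new here):

-- List.foldl feeds each partially applied helper to JD.andThen
firstStep : JD.Decoder Person
firstStep =
    JD.succeed init
        |> JD.andThen (optional "id" (JD.nullable JD.int) Nothing Id update)

-- each step decodes one field, sets it on the record via update,
-- and threads the updated record into the next step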

One drawback of this method is that it needs a fully initialised record before any fields have been decoded, thus opening up impossible states in the records. For us that is not a problem, as all our records were already initialised into a default invalid state! Perhaps I should say: it didn't make the problem worse.

So there you have it: auto-generated domain datatypes which decode from all endpoints and encode to them as well, flexible enough to handle extensible records, with setters thrown in. Thanks, backend C# guy (who actually really hates M$, EF and .Net because he's a Java guy)!

Drawbacks to auto-generated endpoints

So our auto-generated endpoints are great! They leave no ambiguity between what the backend returns and what it expects to receive on requests.

However, there are two main issues we have in practice.

  1. The backend is in C#, and it doesn't care much about nullable types, preferring to use try/catch blocks to roll back failed transactions when something goes wrong. This lack of focus on the types of the data flows through to the generated endpoints, and very commonly the frontend has to handle stuff like id : Maybe Int when it really should be id : Int.

  2. This one is more about how we use these endpoints. Since the types are fairly loosely defined, the records are all initialised with default values of 0, "", DateTime.min and so on, and contain a lot of impossible state (see the sketch below). This means we just have to be more careful around client-side validation, making the data correct before sending it back.
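
As a sketch of what that impossible state looks like (the fields and values here are hypothetical), a freshly initialised request type-checks but is not yet valid data:

init : CreatePersonRequest
init =
    { title = 0                           -- not a real lookup id
    , surname = ""                        -- "" stands in for "not entered"
    , dateOfBirth = "0001-01-01T00:00:00" -- DateTime.MinValue, serialised
    }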

Both issues have been the cause of a few minor regression bugs.
