Record Video and Upload with Phoenix LiveView Hooks

Written by Gus Workman
Published on 2023-05-31

While developing Candiq, a video cover letter business, I ran into the problem of letting users record their asynchronous interview videos in the browser and upload them to the server. LiveView is really nice in this regard - it has built-in support for live uploads. However, my use case of uploading a file created in the browser via the MediaRecorder API felt non-standard and wasn’t covered clearly in the examples in the docs. It took me a while to get the uploads working properly, so I wanted to share a guide on how to implement video recording + upload in Phoenix LiveView.

Setup

First of all, I am using the latest versions of Phoenix and LiveView at the time of writing:

{:phoenix, "~> 1.7.2"},
{:phoenix_live_view, "~> 0.18.16"},

To record video in the browser, we are going to need a hook. This hook needs to take care of several key tasks:

  • Loading the user’s camera and microphone streams
  • Starting the recording
  • Stopping the recording
  • Uploading the file to the server

I’m going to start with the following HEEx markup and barebones hook, and then build from there:

<div id="video_recorder" phx-hook="VideoRecorder" class="...">
  <video autoplay muted playsinline id="live_video" class="..." />
  <div class="...">
    <button type="button" phx-click={handle_record(@recording)} class="...">
      <.icon name="hero-video-camera-solid" class="..." />
    </button>
  </div>
</div>

hooks.VideoRecorder = {
  mounted() {},
  handleRecord() {},
  handleStop() {},
  handleUpload() {},
  async loadMedia() {},
  destroyed() {},
};
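
For the hook to actually run, it also has to be registered with the LiveSocket in app.js. This is standard LiveView wiring rather than anything specific to recording, but for completeness, here is a sketch based on the Phoenix 1.7 generator defaults:

// app.js
import { Socket } from "phoenix";
import { LiveSocket } from "phoenix_live_view";

let hooks = {};
hooks.VideoRecorder = {
    // ...the hook defined above
};

let csrfToken = document
    .querySelector("meta[name='csrf-token']")
    .getAttribute("content");

let liveSocket = new LiveSocket("/live", Socket, {
    params: { _csrf_token: csrfToken },
    hooks: hooks,
});
liveSocket.connect();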

Accessing User Media Streams

First things first, let’s initialize the hook to its starting state. In the mounted function:

mounted() {
    this.stream = null;
    this.mediaRecorder = null;
    this.loadMedia();
}

This sets some initial variables which can be used in other functions. Now, to load the media:

async loadMedia() {
    let stream = await navigator.mediaDevices.getUserMedia({
        video: true,
        audio: true,
    });
    this.el.dispatchEvent(new CustomEvent("medialoaded", { detail: stream }));
}
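
One thing to be aware of: getUserMedia rejects if the user denies the permission prompt or no camera/microphone is available, so in a real app you will probably want to wrap the call. A minimal sketch (how you surface the error is up to you):

async loadMedia() {
    try {
        let stream = await navigator.mediaDevices.getUserMedia({
            video: true,
            audio: true,
        });
        this.el.dispatchEvent(new CustomEvent("medialoaded", { detail: stream }));
    } catch (error) {
        // Permission was denied or no capture device is available
        console.error("could not start media devices", error);
    }
}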

We dispatch an event to the hook element once the media is loaded. This is because the function runs asynchronously (the browser must first ask the user for permission to record), and we need to sync the stream into the state of the hook. We can then handle the event with a listener in the mounted callback:

mounted() {
    this.stream = null;
    this.mediaRecorder = null;
    this.loadMedia();

    this.el.addEventListener("medialoaded", (event) => {
        this.stream = event.detail;
        this.el.children[0].srcObject = this.stream;
    });
}

In this case, the first child of the hook element must be the HTML <video> element where we want to show the camera preview.
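
If you would rather not rely on element order, the same listener could look the preview element up explicitly instead (a small variation, nothing else about the approach changes):

this.el.addEventListener("medialoaded", (event) => {
    this.stream = event.detail;
    // Find the <video> element by tag instead of assuming it is the first child
    this.el.querySelector("video").srcObject = this.stream;
});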

Now the camera preview shows up when we load the application. However, the browser doesn’t release the camera when I navigate away from the modal where I show the preview. Strange. So let’s make sure the hook cleans up its resources on exit, using the destroyed callback:

destroyed() {
    if (this.stream) this.stream.getTracks().forEach((t) => t.stop());
},

This simply stops each track (audio and video) in the media stream, which releases the camera and microphone.

Okay, great! Now that we have a working example of how to initialize the media for recording, we want to record the stream to a file and upload it to the server.

Recording the Stream

Let’s start with the recording. We want to start recording when the user presses the video camera icon button, and stop when it is clicked again, preferably with some visual feedback to indicate the current state.

To do this, we need to jump into the LiveView code to assign some variables to the socket and handle the button press. We can’t use a normal phx-click and handle_event/3 here because we want to pass the event to the hook, not to the server. Here’s how to work around that:

@impl true
def update(_assigns, socket) do
    socket = assign(socket, recording: false, recorded: false)
    {:ok, socket}
end

In the above snippet (which is in the LiveComponent update callback, but could also be in the mount callback of a normal LiveView), I assign the variables recording and recorded to the socket to allow us to track the state - whether we are currently recording, and whether or not we have already recorded a video.

To send the start and stop events to the hook when the button is pressed, we’re going to implement the handle_record/1 private function that is called in the phx-click attribute of the button. We’ll use some JS commands to make it work:

defp handle_record(false) do
    JS.dispatch("start_record", to: "#video_recorder")
    |> JS.remove_class("text-white")
    |> JS.add_class("text-red-500")
end

defp handle_record(true) do
    JS.dispatch("stop_record", to: "#video_recorder")
    |> JS.add_class("text-white")
    |> JS.remove_class("text-red-500")
end

This function sends the start_record event to the hook (which is attached to the element with ID video_recorder) when the @recording assign is false. However, we don’t update the assign here - so how will that work? We have to handle the event in the hook, then send an event to the server to update the recording assign. So, for the JS event handlers:

mounted() {
    this.stream = null;
    this.mediaRecorder = null;
    this.loadMedia();

    // we must first bind the hook to event handlers, or else
    // they have the wrong context
    this.handleRecord = this.handleRecord.bind(this);
    this.handleStop = this.handleStop.bind(this);

    this.el.addEventListener("start_record", this.handleRecord);
    this.el.addEventListener("stop_record", this.handleStop);

    this.el.addEventListener("medialoaded", (event) => {
        this.stream = event.detail;
        this.el.children[0].srcObject = this.stream;
    });
}

handleRecord() {
    if (this.stream) {
        this.mediaRecorder = new MediaRecorder(this.stream, {
            mimeType: "video/mp4",
        });

        this.mediaRecorder.ondataavailable = this.handleUpload;
        this.mediaRecorder.start();
        this.pushEventTo(this.el, "start_record");
    } else {
        console.log("error, stream not ready");
    }
},

handleStop() {
    if (this.mediaRecorder) {
        this.mediaRecorder.stop();
        this.pushEventTo(this.el, "stop_record");
    }
},

When the hook receives the start_record event, it first checks to make sure the stream has started (and isn’t null). It creates a MediaRecorder using the stream as the source media. I’ve specified that I want to save an .mp4 file, but you could also specify other video formats, such as video/webm.
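
One caveat worth noting: browser support for recording containers varies - at the time of writing, Safari can record video/mp4, while Chrome and Firefox generally only support video/webm - so a hard-coded mime type may throw a NotSupportedError. Here is a minimal sketch that picks a supported type at runtime with MediaRecorder.isTypeSupported:

// Prefer mp4, fall back to webm, and finally let the browser choose
const candidates = ["video/mp4", "video/webm;codecs=vp9,opus", "video/webm"];
const mimeType = candidates.find((type) => MediaRecorder.isTypeSupported(type));

this.mediaRecorder = new MediaRecorder(
    this.stream,
    mimeType ? { mimeType } : {}
);

If you go this route, remember that the file name and type set in handleUpload below, and the accept list passed to allow_upload, need to match whichever format actually gets recorded.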

Then, the recorder starts saving the stream, and the hook pushes the start_record event to the server. Specifically, because this hook is used in a LiveComponent and not a plain LiveView, I use the pushEventTo function, which ensures that the event is delivered to the LiveComponent rather than to the parent LiveView. (In a plain LiveView, this.pushEvent("start_record") would be enough.)
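
Since a recording can now be in progress when the user navigates away, it may also be worth extending the destroyed callback from earlier so it stops the recorder before releasing the tracks. A sketch on top of the cleanup we already have:

destroyed() {
    // Stop an in-progress recording before releasing the camera and microphone
    if (this.mediaRecorder && this.mediaRecorder.state === "recording") {
        this.mediaRecorder.stop();
    }
    if (this.stream) this.stream.getTracks().forEach((t) => t.stop());
},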

Now, on the server, we have to handle the events and update the assigns:

def handle_event("start_record", _params, socket) do
    socket = assign(socket, recording: true, recorded: false)
    {:noreply, socket}
end

def handle_event("stop_record", _params, socket) do
    socket = assign(socket, recording: false, recorded: true)
    {:noreply, socket}
end

From this point we can start and stop the recording. Now how do we upload it?

Uploading the Recording

First, we need to let the LiveView know that we want the user to be able to upload a file. We can do that with the allow_upload/3 function in the mount (for LiveView) or update (for LiveComponent):

@impl true
def update(assigns, socket) do
    socket =
        socket
        |> assign(assigns)
        |> allow_upload(:video,
            accept: ~w(.mp4),
            auto_upload: true,
            max_file_size: 104_857_600
        )
        |> assign(recording: false, recorded: false)

    {:ok, socket}
end

This code allows the user to upload .mp4 files to the :video upload configuration (available in the template as @uploads.video), with a maximum size of 100 MB.

Now, I want to render a <form> with a <.live_file_input> and automatically set the input’s file to the saved recording when the user presses the stop button. Let’s write the markup for the form and input first. (I’m assuming the usual @form assign and a "validate" change handler already exist in the LiveComponent; only the save handler is covered in detail below.)

<.simple_form
  for={@form}
  id="video-form"
  phx-target={@myself}
  phx-change="validate"
  phx-submit="save"
>
    <div id="video_recorder" phx-hook="VideoRecorder" class="...">
        <!-- This is the hook defined above -->
    </div>

    <.live_file_input upload={@uploads.video} hidden />

    <:actions>
        <.button
            phx-disable-with="Saving..."
            class="..."
            disabled={!@recorded}
        >
            Save Video
        </.button>
    </:actions>
</.simple_form>

Now for the handleUpload function - it’s quite simple, actually. We already set the this.mediaRecorder.ondataavailable callback to this.handleUpload to receive the Blob of data when the recording is finished. So we just need to implement the function to parse the data and upload it to the server:

mounted() {
    // ...

    // bind this
    this.handleUpload = this.handleUpload.bind(this);

    // ...
}

handleUpload(event) {
    const file = new File([event.data], "file_to_upload.mp4", {
        type: "video/mp4",
    });

    this.upload("video", [file]);
}

That’s all! We just wrap the event.data (which is a Blob) in a File, and then pass it as the file we wish to upload using the this.upload function provided by the hook. The first argument is the upload name we configured (from the allow_upload/3 call earlier), and the second is the list of files we wish to upload.

Caution: you need to make sure that the second argument is a list of files! Even though we are only uploading one, the API asks for a list, and there were not many helpful error messages pointing me in the right direction when I made this mistake.

Consuming the Uploaded File

The final step in this process is to consume the uploaded file. We can do so in the phx-submit handler, which is as follows:

@impl true
def handle_event("save", _params, socket) do
    [video_url] =
        consume_uploaded_entries(socket, :video, fn %{path: path}, _entry ->
            dest = Path.join("priv/static/uploads", Path.basename(path))
            File.cp!(path, dest)
            {:ok, ~p"/uploads/#{Path.basename(dest)}"}
        end)

    # do stuff with the uploaded video here

    {:noreply, socket}
end

In this handler, we use the LiveView consume_uploaded_entries/3 function to get the temporary file location, then copy that file to the path of our choice. In this case, I decided to put my uploads in the priv/static/uploads directory.

If I wanted to serve the video back to the user, then I would need to make sure that the uploads folder is served with my static assets. In my_app_web.ex, we need to make the following change:

def static_paths, do: ~w(assets fonts images uploads favicon.ico robots.txt)

Note that this list now includes the uploads folder.

Demo

That’s it! So without further ado, this is the final result, which allows me to preview the media, record the video on button press, and upload to the server for further processing.

Thanks for reading!