Proof of Play System Needed

Hi all,

Is there a way to generate a monthly proof of play for assets that are displayed on info-beamer driven screens?

If we need to build one, where should we start?

Nothing like this is built-in at the moment. Right now the best approach would be to modify the package you’re currently using to record “play” events. You could then send those to some external analytics server.

I would imagine something like that would work for analytics.

What kind of information are you looking for? I guess time/date, device id, asset name and duration at least?

Thanks :slight_smile:

Exactly! Looking to automate this info extraction. Looking for that little hint of how to modify the initial packages and where to start.

Don't know if you are still looking for a solution to this problem, but maybe I could offer you some help. I wrote two versions of a playout logger: one working and in production using MongoDB, and one currently being tested with Elasticsearch. We use it to prove to customers that they got what they paid for. It also has an export to .xls.

What Lua code did you use to get the currently running asset? I'm trying to do this now.

The basic idea

Logging the playback

Within the Lua code you can tell exactly when an asset is playing. The usual packages all work like this:

  • Preload asset
  • Wait for correct moment to start
  • Play asset
  • Unload asset

For the HD Image/Video player, the code handling video playback follows exactly this pattern.

Basically you would then need to generate a “proof of play event” at the moment the video starts playing.

So what’s a “proof of play event”? Basically you’d probably want to generate something like:

save_proof_of_play{
    asset_id = asset_id, -- What are we playing
    duration = duration, -- .. for how long
    start = os.time(), -- unix timestamp of when playback started
    device = sys.getenv "SERIAL", -- the serial number of the device
    uuid = generate_some_uuid(), -- see below
}

You can get the asset_id and duration beforehand. They will be in config.json so you can just fetch them from there. You can fetch the serial number of the device using sys.getenv "SERIAL". If you prefer the device_id, you can fetch that from the __metadata entry in the generated config.json file.
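For illustration, here is a minimal Python sketch of pulling those same values out of a generated config.json. The `playlist`/`file`/`duration` layout shown below matches the HD Image/Video player's config; other packages may structure their config differently, so treat it as an assumption:

```python
import json

def read_play_info(config_path="config.json"):
    # Read the generated config.json. The device_id lives in the
    # __metadata entry; the "playlist" layout below is assumed to
    # match the HD Image/Video player and may differ per package.
    with open(config_path) as f:
        config = json.load(f)
    device_id = config["__metadata"]["device_id"]
    assets = [
        (item["file"]["asset_id"], item["duration"])
        for item in config.get("playlist", [])
    ]
    return device_id, assets
```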

The uuid is optional, but probably a good idea, as it allows you to ensure an “at-most-once” delivery by simply throwing away duplicate play events at your data store backend.
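On the backend side, that deduplication can be a few lines. Here's a hedged Python sketch; the class name is made up and the in-memory storage is for illustration only (a real backend would persist both the events and the seen uuids):

```python
import json

class PlayEventStore:
    # Minimal at-most-once event store: duplicate deliveries of the
    # same uuid are thrown away. In-memory only for illustration.
    def __init__(self):
        self.seen = set()
        self.events = []

    def add(self, raw_event):
        event = json.loads(raw_event)
        if event["uuid"] in self.seen:
            return False        # duplicate delivery: drop it
        self.seen.add(event["uuid"])
        self.events.append(event)
        return True
```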

Delivering the playback event to your backend

The save_proof_of_play above is a function. You’ll have to implement that somehow. The idea is to send out play events to an external process that then forwards these events to your backend system. The way to get data out of your Lua code into an external process is to use the TCP feature of info-beamer: An external program (more on this later) will connect to the info-beamer process using TCP. Your Lua code can then send data to this program. Here’s the simplest code that does just that:

local json = require "json"
local clients = {}

-- If a new TCP client connects, see if it tries to connect to the
-- "proof-of-play" path and if so, save a reference to it in the
-- clients table.
node.event("connect", function(client, path)
    if path == "proof-of-play" then
        clients[client] = true
    end
end)

-- Client disconnected? Then remove our reference
node.event("disconnect", function(client)
    clients[client] = nil
end)

-- This is the function used above which sends events to a locally
-- running program on your Pi.
local function save_proof_of_play(event)
    -- encode event to JSON
    local data = json.encode(event)
    -- send it to all connected clients
    for client, _ in pairs(clients) do
        node.client_write(client, data)
    end
end
Now everything is ready. An external package service can now connect to the info-beamer process using TCP and start receiving proof-of-play events. Here's how this works from the command line. If you're connected to your Pi using SSH, the following example shows how to use the included telnet client to connect to info-beamer:

info-beamer-4183873228 /space/root # telnet localhost 4444
Info Beamer PI 0.9.9-...... [pid 381/uptime 1672]. *help to get help.
*raw/root/proof-of-play
{"asset_id":1234,"duration":10,"start":1234567,"device":"4890", "uuid": "foobar"}

So just run telnet localhost 4444 to connect to info-beamer. You will be greeted by the second line in the above example. Then send *raw/root/proof-of-play and your telnet client will be connected to your Lua code and will receive proof-of-play events sent by the Lua code above.
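If you'd rather script this handshake than use telnet, the same steps work with a plain TCP socket. A Python sketch (the default host/port assume a local info-beamer process listening on 4444, as in the telnet example):

```python
import socket

def proof_of_play_stream(host="127.0.0.1", port=4444):
    # Connect to the info-beamer TCP port, skip the greeting line,
    # then switch the connection into raw mode for the proof-of-play
    # path. Every line read from the returned file object afterwards
    # is one JSON-encoded play event sent by the Lua code.
    sock = socket.create_connection((host, port))
    f = sock.makefile("rw")
    f.readline()                       # "Info Beamer PI ..." greeting
    f.write("*raw/root/proof-of-play\n")
    f.flush()
    return f
```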

Using the ibquery Python library makes this easy. Here's code that connects to info-beamer and directly forwards the received proof-of-play JSON to an external backend using a POST request:

import ibquery, requests

# connect to the locally running info-beamer process
ib = ibquery.InfoBeamerQuery("127.0.0.1")
stream = ib.node('root/proof-of-play').io(raw=True)

for event in stream:
    # forward each JSON event; replace the URL with your backend endpoint
    requests.post("http://example.com/proof-of-play", data={"event": event.strip()})

That’s basically it. Events generated and sent using the save_proof_of_play function end up being forwarded to the above Python code, which then sends them to some external service. Of course how exactly you send those events depends on how your backend works. Right now this is the minimal code you’ll need. There is no real error handling, so events are lost if the network connection is down. For a more sophisticated system, you’d probably want to spool events in the package service and handle network errors by retrying the POST request. You could implement that by saving events to file(s) in the scratch directory.
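The spooling idea could look roughly like this in Python. A sketch only: the class and the one-file-per-event layout are made up, and `send` stands in for whatever actually does the POST to your backend:

```python
import os, uuid

class EventSpool:
    # Spool proof-of-play events to files so nothing is lost while the
    # network is down. `send` is any callable that raises on delivery
    # failure (e.g. a wrapper around requests.post).
    def __init__(self, directory, send):
        self.directory = directory
        self.send = send
        os.makedirs(directory, exist_ok=True)

    def add(self, raw_event):
        # one file per event, with a random name to avoid collisions
        path = os.path.join(self.directory, uuid.uuid4().hex)
        with open(path, "w") as f:
            f.write(raw_event)

    def flush(self):
        # Try to deliver every spooled event; keep the files that fail
        # so the next flush() retries them.
        for name in sorted(os.listdir(self.directory)):
            path = os.path.join(self.directory, name)
            with open(path) as f:
                raw = f.read()
            try:
                self.send(raw)
            except Exception:
                continue
            os.remove(path)
```

Calling `flush()` periodically (e.g. from the package service's main loop) retries delivery until the backend accepts each event.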

I'm using a custom Go binary which connects locally to info-beamer and then sends out the data. That way I can change the transport mechanism. Basically what Florian stated above, but with ibquery switched for our custom logger binary. We used HTTP first, switched to gRPC, then Twirp (like gRPC but easier) and finally moved to something using a client library with a gnatsd server.

I guess in the future a simple system doing what I described above could be part of what info-beamer hosted offers by default. Most of what’s required for that is already there, so it wouldn’t be too difficult to add that. It’s mostly reasoning about how to best store these events and build a UI and the supporting code on the backend and OS.


That would be useful, and once integrated just a tick of a box away. Figuring out the presentation layer is still a thing that makes it more complicated. The sales department loves Excel files. Admins can live with Kibana/Grafana charts and graphs, or JSON responses.

Well, I tried the solution you gave, but I just don't know Lua well enough (actually I don't know it at all) to implement it. To make my problem easier I decided to just detect people walking by with a motion sensor; asset running time isn't as important, people traffic is more important.

I made this Python service that sends data to an endpoint and it works, but how can I get the device_id in Python without using Lua?

import requests
from hosted import device, node, config

for (pin, state) in device.gpio.poll_forever():
    deviceId = ?????????
    counter = 0
    while state == True:
        counter += 1
        r = requests.get(''+ str(counter) + str(deviceId) )

Just need to get a proof of concept out to the clients so I can sell them on info-beamer; the backend is already created :grin:

The device_id should be accessible on config.metadata['device_id']

(It’s read from the __metadata value in config.json in the current directory)

Yes it worked!!! Going to add this service and GPIO / API settings to HD Image/Video player options and figure out the best cadence to send data. Thanks so much!

Nice. Looking forward to seeing what you built in action some day :slight_smile:

I thought I’d resurrect this thread, as we too would benefit from having a proof of play system.

Being able to easily view/download logs, or a play history for a device is critical to be able to provide a customer with proof that an asset was played.

I did look at the documentation, and see that the last 1mb of log data is stored in RAM, if there was a way to view/export this from the Device Configuration page that would help.

Also, we are looking for a method of checking that assets have been added to playlists. Apart from keeping a list of files that should be added to particular setups, then taking screenshots of the setups/playlists and manually checking them off the list, I don’t know if there’s an easier method?

The 1mb log file is completely useless for any proof of play purposes: it’s unstructured and very noisy. And 1mb scrolls by fast, depending on how much output the installed setup generates. It’s really only meant for debugging.

I’m not sure I understand the second part of your question. It sounds like you have a “source of truth” problem, as you keep two redundant lists of files: One in your “list of files” (an Excel file?) and the other one in the configuration of a particular setup. The best approach to solve this is:

  • Avoid two different lists entirely: if you need to add a file, add it to info-beamer and don’t keep a secondary list of files somewhere else.
  • Or use the API to automatically sync in one or both directions, so both lists are guaranteed to be the same.
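The comparison step of such a sync is plain set arithmetic. Here's a hedged Python sketch: the function name is made up, and fetching the two lists (from your CRM export and from the info-beamer hosted API) is deliberately left out, since that part depends on your systems:

```python
def compare_playlists(crm_assets, setup_assets):
    # crm_assets: asset names the CRM says must be playing on a setup.
    # setup_assets: asset names actually configured in info-beamer.
    crm, actual = set(crm_assets), set(setup_assets)
    return {
        "missing": sorted(crm - actual),  # sold, but not scheduled
        "stale": sorted(actual - crm),    # scheduled, but should be removed
    }
```

Running this per setup gives you the double-check list automatically instead of ticking off screenshots by hand.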

I’m conflicted on adding a native proof of play feature for the following reasons:

  • Proof of play is essentially analytics and info-beamer’s expertise is in building a digital signage platform. I’m not sure how useful the bare minimum of just storing structured information is.
  • Due to how info-beamer hosted works, this can’t be a feature that works for every package available. Some don’t even play anything. It might make it difficult to understand from a user’s perspective. As an example: in the schedule player package, you can create a page that plays a video that is then fully covered by an image. A proof of play feature for that package cannot really detect issues like that.
  • Customers’ needs vary greatly, and a built-in proof of play might be a mostly unused feature, as it’s really simple for anyone integrating with an existing ad booking system to create their own proof of play system. The code in one of the above answers is basically all that’s required for a minimal integration.

Feedback welcome.

Yes, the setups that allow content to be ‘layered’ would be very difficult/impossible to report on, as you say, it could have been played but been ‘hidden’ by other content.

What I meant by the lists is that if a customer has paid to have their content added/played on a particular device, then we need to transfer that from the CRM system, where we have a list of locations (devices) that the content must be played at.

We already have dates in the system for when the content is actually ‘live’ and when it needs to be removed.
Even using an API, at some point the details of where the content must be added need to be entered by a human, and so there is always a point in the process where something can be missed.

We are considering exporting a list, from the CRM, which indicates what setups/devices the content needs to be added to, and then comparing this to the actual setups/devices in info-beamer, to see if the actual asset is in the relevant setup/playlist/s.

This is basically a checklist, to double check that all content has indeed been added, and likewise, any content that is in a setup that needs to be removed will be visible.

As we have many setups/devices, this process could be quite time consuming though.

I’ll look into the code in this thread regarding the proof of play reporting.

Oh, and we know from using info-beamer, that anything we add/save to the playlist will sync and play, I have never known any issues that have been related to info-beamer, as it’s a rock solid platform :+1:

However, a customer wouldn’t just take our word that their content was played on a device.

In the ideal world, that’s not the case. Some process could automatically take the information from your system and update all info-beamer setups accordingly. That way you only have a single source of truth and problems like mismatching configuration cannot happen unless of course there’s a bug in the synchronization process.

info-beamer hosted was built to support this kind of automation from the very start. Here’s a smallish CMS that allows users to upload their content, set start/end dates, and have automation make sure all content ends up on all devices. It’s probably not a good example, as it was hacked together in a few days, but it shows one way of implementing this idea.

Interesting, and yes I agree the more automatic syncing the better.

In your example the starting point of the process is actually uploading the content, whereas this would be the final part of the process for us, as we go through quite a complex contract, Invoicing, design, and proofing process first.

Yes, it would reduce the chances of errors, but ultimately there has to be someone who enters the information. As we use contracts that specify which locations the asset will be displayed at, and the duration of the asset, even if we were to use a digital form that integrated with the CRM and then with info-beamer, it’s possible that an error could be made during the completion of the original contract.

Obviously this shouldn’t happen, as this contract forms the very basis of the whole process, and it’s critical that it’s completed correctly, but again, it’s possible that a mistake could be made, which would then propagate through the entire process.

This is where having a manual check in place, at some point in the process, is really the only way to verify that the location(s) the asset needs to be displayed on, and its duration, are correct.

info-beamer - Digital Signage for the Raspberry Pi community forum